Welcome to TechStuff, a production of iHeartRadio's How Stuff Works. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And we are continuing our look back at the big tech stories of twenty nineteen. Now, the last episode was all about bummers. This one is only mostly about bummers. But don't blame me. I didn't make the news. Also, I should point out that, of course, I'm just giving kind of a high-level overview of stuff that happened in twenty nineteen, because to cover everything that happened in tech would be beyond even my impressive capabilities. And I should also add that I've got some updates to stuff that happened in the last episode, because, as it turns out, between when I recorded part one and when I'm recording part two, time passed and stuff continued to develop. That's how news works, and I resent it.
But one story I do need to follow up on broke in between then and now, and that was that Dennis Muilenburg, the CEO of Boeing, resigned from Boeing as the company was dealing with the consequences of the seven thirty-seven Max fleet being grounded, among other problems at the company. Generally speaking, analysts said that his stepping down was sort of a necessary part of Boeing regaining confidence among customers and shareholders. Not that all blame should be put on the CEO's shoulders, but that this was one of those steps a company has to take in order to convince people, hey, we are really taking this seriously and we need to make some changes. Now, the next two stories that I want to talk about are related, and they are both extremely dark and upsetting, but I also feel they are important to acknowledge and consider. And some of you might feel as though I'm going to get really preachy about this. I am not going to apologize for that in this case. Now, the first dark story is about March fifteenth, two thousand nineteen.
That was when a gunman carried out attacks on two different mosques in Christchurch, New Zealand, killing fifty-one people and injuring another forty-nine. Now, the reason the story gets included in tech news is that the gunman streamed the attack on the first mosque over on Facebook Live. Other people grabbed the video feed, and then they began to post it elsewhere, which ensured that even as platforms were removing it, others were hosting it. The New Zealand government classified the video as objectionable, which is a legal classification in New Zealand. It meant that distributing, copying, or exhibiting the video would be against the law, but the video was already out on the internet. Now most platforms have created digital fingerprints of the video in order to detect future uploads, blocking them automatically and removing them very quickly.
The gunman, identified as Brenton Tarrant, had been active in far-right organizations and white supremacy groups online and off, and there's been a rise in activity in online communities of such radical groups, raising warnings of extremists using the internet to recruit others and reinforce some truly awful beliefs. And this brings me to story number two, which is that these groups have made use of some notable online communities to encourage one another and create a space for extremism. One of those online communities, and perhaps the most infamous, is 8kun, formerly known as 8chan. The history of 8chan dates back to when Fredrick Brennan created it as an alternative to an earlier online message board community called 4chan. Now, Brennan felt 4chan was becoming too restrictive, which is a sentence that's hard to even believe if you're at all familiar with 4chan. The only rule on 8chan was that you weren't supposed to post or link to any content that would be illegal in the United States. Brennan ended his association with the site in two thousand eighteen.
In two thousand nineteen, in the wake of shootings in Christchurch, New Zealand; in Poway, California; and in El Paso, Texas, and of links back to 8chan showing how the perpetrators of those three different shootings had used the site to publish their manifestos, Brennan, the founder of the site, was one of the voices calling for the site to get shut down. That actually did happen in August two thousand nineteen, but the site has since returned as 8kun as of November two thousand nineteen. And there is a pretty complicated situation going on here. On the one hand, the founders of the internet and of the World Wide Web envisioned a platform that would support freedom of speech and the exchange of ideas. On the other hand, many people, particularly those from already vulnerable communities, are put in danger as extremism is on the rise. The safe haven for those who espouse these extremist, radical, racist, misogynist, and violent beliefs is contributing to an increasingly toxic subculture. In addition, several tech companies have enabled this subculture.
It hasn't necessarily been a conscious decision, but the principle of running a business in which your goal is to return value to shareholders isn't always in alignment with doing what's actually best for the general population. In fact, those two things can often come into conflict with one another. In some cases, like 8kun, this is far more apparent, but it's also the case with more public platforms such as Twitter and Facebook. Those companies struggle with how to deal with a particularly thorny subject, with varying degrees of success, most of which satisfy very few people. I wish I had a solution to this very large problem, but I believe such a solution has to go much deeper than taking a website offline or removing an option for people to voice these hateful philosophies. That's part of it, but it doesn't address the deeper underlying problems that feed into that toxicity to begin with. So all I can really do is appeal to you guys to exercise compassion and critical thinking. Those two things are absolutely necessary, in my view.
All right, the darkest of the dark stuff in this episode is over, so let's move on. One thing that happened in twenty nineteen might set us on a path for widespread use of drones to deliver packages. In the spring of twenty nineteen, the Federal Aviation Administration, or FAA, certified Wing, the drone delivery startup owned by Google's parent company, Alphabet, to operate as an air carrier. This regulatory step allows Wing to make commercial deliveries in the United States. The company had already been conducting tests in Australia in anticipation of receiving government approval in the United States. There are still many questions that need to be answered, and we're likely to see a very limited rollout of drone delivery services in specific regions as companies and local governments kind of hash out the best way to move forward. Now, personally, I'm curious to see if drone delivery will prove to be a more efficient means of delivering packages on a large enough scale to make sense.
I mean, I can see how it could be incredibly useful in scenarios where getting to a location is challenging and the need to deliver something important like medication is really urgent, but I'm not entirely convinced yet that it would make sense from a more general-use standpoint. However, I also haven't run the figures, nor do I know how much it costs to operate delivery services as they stand right now. So it's entirely possible that this is a viable alternative to more traditional delivery services. I just don't know enough to comment on it firmly. But it's hard for me to believe, on the face of it, that it would be more cost-effective and efficient unless you just had truly enormous fleets, in which case you have the technological and administrative difficulties that come with managing that large a fleet. So I just don't know. Sticking with government approval, because there are a lot of stories that fall into that category this year: the Federal Communications Commission in the United States, or the FCC, approved the merger of telecommunications companies T-Mobile and Sprint. Now.
According to analysts, the chief purpose of this merger is to enhance T-Mobile's five G technology rollout to give it a stronger position in the United States as, you know, five G networks are starting to come online. Just a few years ago, according to reports from a consulting firm called McKinsey, T-Mobile was eyeing a merger with Sprint, but for a different reason. It was in an effort to become more competitive against AT&T and Verizon, which are the other two major cellular phone carriers in the United States. While the FCC has given its approval, that's just one regulatory hurdle that telecommunications companies have to overcome before they can merge. Regulatory agencies at both the state and federal levels are still considering this plan, and they may place restrictions or limitations on any merger, or they might deny it outright.
T-Mobile has reportedly been renegotiating the deal in the meantime, and the old reports from two thousand fifteen, the ones that stated T-Mobile was first looking at Sprint for a possible merger, said that T-Mobile had also entertained the notion of allowing Comcast, the mega cable corporation, to acquire T-Mobile. There may well be some serious offers for acquisitions like that in the near future, of either T-Mobile or Sprint, or a merged version of the two, from such a cable company, whether it's Comcast or a different one. Speaking of corporate maneuvers, one drama that finally finished playing out in twenty nineteen, and really kind of fizzled out and sputtered a bit, was the tale of Amazon's HQ2 in New York City. So let's backtrack a bit. The company initially announced it was looking into expanding its corporate headquarters, which are based out of Seattle, Washington, into a different city. In two thousand eighteen, they famously held a request for proposals, asking cities that were eager to host this new headquarters to present their proposals, their deals.
That in turn prompted a series of stories about incredibly generous tax breaks and other incentives, as well as some fairly absurd publicity stunts, that stretched through most of the year, until in November Amazon announced it had settled on two locations that would share the duty of being HQ2. One is in Arlington, Virginia, and the other was in New York City, New York. Now, there was some pretty hefty criticism early on from various sources that alleged Amazon had chosen these two locations from the beginning, and had these in mind when it asked for the proposals in the first place. One of the pieces of supposed evidence used to support this claim is that Amazon CEO Jeff Bezos apparently had homes near those two proposed locations, and that the whole selection process was therefore nothing more than an effort to create a competitive environment so that both New York and Arlington would continuously improve their deals so Amazon would get the sweetest tax break, but that presumably the plan all along was to move into those two locations.
Whether that's true or not, we get to two thousand nineteen, and early in twenty nineteen, New York City residents voiced some rather critical opinions about their proposed new neighbor. Journalists reported that the proposed HQ2 site in New York City would take up land that had previously been intended for the use of six thousand homes, including a significant number of low-income homes. Alexandria Ocasio-Cortez, a US representative from New York, voiced concern that the incentives offered to Amazon would hurt the city both in the near and the long term, that it would undermine efforts to fund government improvements to critical infrastructure in the city because of these enormous tax breaks. You know, if Amazon is not paying taxes, that revenue is not coming from them; the financial burden falls on everyone else in New York, and frequently that means that programs have to get reduced or cut so that you can, you know, make your money stretch out further. In February two thousand nineteen, Amazon announced it was canceling its plan to build out its location in New York City.
Amazon does lease some office space, a significant amount of office space, in New York, but it no longer plans to have a second corporate headquarters there. And since we're talking about Amazon, let's move on to one of the properties that Amazon owns, and that would be the Ring company. That's the company that produces surveillance cameras and surveillance doorbells, you know, doorbells that have cameras and communication systems. Well, in twenty nineteen, there were a few stories of hackers who had gained access to users' Ring equipment, whether it was the surveillance cameras or the Ring doorbells. Some hackers did this in an effort to expose vulnerabilities, so they were doing it to say, hey, we need to fix this because it's a problem. But others did it specifically to harass or exploit people, and those stories were alarming and continue to be alarming. Some of them involved kids, and it's incredibly disturbing, and they've led to at least one class action lawsuit against Amazon.
228 00:14:14,920 --> 00:14:18,319 Speaker 1: The allegation is that Ring isn't doing enough to ensure 229 00:14:18,440 --> 00:14:22,680 Speaker 1: customers privacy and security are maintained, which is particularly a 230 00:14:22,800 --> 00:14:25,680 Speaker 1: problem for a company that markets equipment that's meant to 231 00:14:26,040 --> 00:14:31,680 Speaker 1: enhance security, not exploit vulnerabilities. Now I haven't seen all 232 00:14:31,800 --> 00:14:35,360 Speaker 1: the details about how the Ring systems were actually hacked. 233 00:14:35,840 --> 00:14:39,000 Speaker 1: There are different ways to gain access to connected systems 234 00:14:39,040 --> 00:14:42,360 Speaker 1: on a network. Sometimes you can find a vulnerability in 235 00:14:42,440 --> 00:14:45,880 Speaker 1: an endpoint, such as an actual device connected to the network. 236 00:14:46,200 --> 00:14:47,760 Speaker 1: So in those cases you would say, all right, the 237 00:14:47,800 --> 00:14:53,640 Speaker 1: hacker managed to hack into the network via this RING device. 238 00:14:54,400 --> 00:14:56,720 Speaker 1: That very well maybe the case maybe they were able 239 00:14:56,760 --> 00:15:00,120 Speaker 1: to brute force a password through that and the got 240 00:15:00,200 --> 00:15:02,840 Speaker 1: access that way. But other times hackers might find a 241 00:15:02,880 --> 00:15:05,440 Speaker 1: way to compromise the network itself and then they can 242 00:15:05,520 --> 00:15:08,960 Speaker 1: access the various components connected to that network as if 243 00:15:09,160 --> 00:15:12,680 Speaker 1: they were, in fact the legitimate administrator of the network. 244 00:15:13,360 --> 00:15:15,360 Speaker 1: In the case of Ring, it sounds to me as 245 00:15:15,400 --> 00:15:20,080 Speaker 1: though they found it through password vulnerabilities. 
The lawsuit states that Ring should have required users to create more robust passwords and should have required two-factor authentication to prevent abuse. And just in case you're not familiar with the concept, two-factor authentication is a subset of what is called multi-factor authentication, which just means that you're using two or more factors, which really just means two to three factors, to authenticate your identity. And those factors are categories of stuff, right? The first category is what you know. This would be something like a password or a PIN, so it would be something that you have knowledge of and you provide when you're accessing a system. The second factor is what you have, like what you physically have on you. That could be a mobile device. So it could be that you provide your password or PIN and then it sends a code to your mobile device, which you also have to enter, or you might have a token that you have to use in some way to access the system. And then the third factor is what you are, and this would refer to things like biometric data.
Maybe it's a retinal scan or fingerprint scan or voice scan. Multi-factor authentication requires you to present at least two of those three factors, possibly all three. It all depends on the implementation. So you might enter a password, then you receive your code, you enter the code, and then you get access. That proves you both know the password and also have possession of an authorized mobile device, which limits the possibility that an unauthorized person is going to gain access to that system. Now, this touches on an issue that I think is really important and is growing more important as the Internet of Things gets bigger. And I'm sure you've heard the saying that a chain is only as strong as its weakest link. In network security, there are many potential weak links. You could have a badly designed piece of hardware or software that has vulnerabilities in it, and that offers an inroad for an intrusion into a network. You can also have users who practice really poor security habits, like choosing common passwords, like a common dictionary word, as a password.
That's a terrible, terrible habit, and no one should do it. Or they're using the same password for multiple services, also a terrible idea. But this raises a question: who should be accountable for data security? After all, users should be employing strong, unique passwords as a matter of habit. And if you heard about someone's house being robbed because they forgot to lock the door, I don't think your first instinct would be to sue the lock company for letting it happen. I think end users are at least partly accountable for good data security. However, that being said, I also think that companies have a responsibility. They need to create rules that require strong passwords and multi-factor authentication by default. They need to essentially force users to be more careful. They need to enable users to practice good security, and by enable users, I really mean limit the options that users have that result in poor security. I think it's the user's responsibility to be more secure and the company's responsibility to enable it. But that's just me.
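To make the "what you know" and "what you have" factors concrete, here's a minimal Python sketch of what requiring strong passwords and a second factor by default could look like. This is illustrative only: the password rules and thresholds are example policy choices, not Ring's actual rules, while the one-time-code function follows the standard TOTP algorithm (RFC 6238, built on RFC 4226) that most authenticator apps implement.

```python
import hashlib
import hmac
import re
import struct
import time

# Hypothetical deny-list -- a real service would use a much larger one.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def password_is_acceptable(password: str) -> bool:
    """'What you know': reject short, common, or low-variety passwords at signup."""
    if len(password) < 12 or password.lower() in COMMON_PASSWORDS:
        return False
    # Require characters from at least three of four classes.
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return sum(bool(re.search(c, password)) for c in classes) >= 3

def totp(secret: bytes, for_time: float, step: int = 30, digits: int = 6) -> str:
    """'What you have': time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    counter = struct.pack(">Q", int(for_time) // step)
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation from RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

def second_factor_ok(secret: bytes, submitted: str) -> bool:
    """Constant-time comparison of the code the user typed in."""
    return hmac.compare_digest(submitted, totp(secret, time.time()))
```

In a real login flow, the password would be checked against a stored hash (strength rules apply at enrollment, not at login), and only then would the server compare the submitted code against the one generated from the user's enrolled secret.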
Now, when we come back, we'll look at some more stories from twenty nineteen. Here's a story that started making the news just before twenty nineteen wrapped up. I mean, I learned about it shortly before coming into the studio. So Devin Wilson, or @atomicthumbs on Twitter, criticized the company Sonos for what he saw as a particularly egregious example of trying to control the aftermarket on electronics. Now, Sonos is primarily known for making speaker systems, particularly smart speakers, and like a lot of tech companies, it depends heavily on creating incentives for established Sonos customers to upgrade and update their equipment. You know, if everyone just went out and bought the latest Sonos speaker and thought, oh, this works great, and had no reason to upgrade, the company would have a very rough year. So they have to create incentives for people to keep buying their stuff. In that way, Sonos's strategy is really similar to that of smartphone handset manufacturers. It's the Apple iPhone model.
In other words, each subsequent generation of devices 320 00:19:49,040 --> 00:19:53,440 Speaker 1: incorporates features that older devices cannot support, whether that lack 321 00:19:53,520 --> 00:19:57,000 Speaker 1: of support comes from technical limitations of the hardware or 322 00:19:57,280 --> 00:20:00,320 Speaker 1: managerial decisions is a moot point, I suppose. I 323 00:20:00,359 --> 00:20:03,200 Speaker 1: mean it could come down to executives saying, just don't 324 00:20:03,280 --> 00:20:06,680 Speaker 1: let this run on older stuff, not that the older 325 00:20:06,760 --> 00:20:10,879 Speaker 1: stuff is inherently incapable of running it. It doesn't matter. The 326 00:20:11,000 --> 00:20:14,040 Speaker 1: end result is the same. Sonos has a trade 327 00:20:14,160 --> 00:20:17,879 Speaker 1: up program that gives existing Sonos users a thirty 328 00:20:18,359 --> 00:20:22,280 Speaker 1: percent credit toward a new Sonos device if those users 329 00:20:22,359 --> 00:20:25,879 Speaker 1: activate what is called recycle mode on their older 330 00:20:26,080 --> 00:20:31,480 Speaker 1: Sonos device. Now, recycle mode starts a countdown clock. It's 331 00:20:31,520 --> 00:20:34,680 Speaker 1: a twenty one day countdown, and at the end of 332 00:20:34,760 --> 00:20:38,080 Speaker 1: that Sonos puts the device on a blacklist so that 333 00:20:38,359 --> 00:20:41,720 Speaker 1: it is bricked, meaning you can't use it at all. 334 00:20:42,119 --> 00:20:46,359 Speaker 1: It will not work, it is ineffective, and it also 335 00:20:46,440 --> 00:20:48,399 Speaker 1: means that you can't give it away or sell it, 336 00:20:48,600 --> 00:20:51,160 Speaker 1: or at least you can't do so ethically because you'd 337 00:20:51,160 --> 00:20:53,800 Speaker 1: just be handing over what amounts to being a giant 338 00:20:53,880 --> 00:20:56,840 Speaker 1: paperweight with a lot of circuit boards and wires in it.
339 00:20:57,400 --> 00:21:00,800 Speaker 1: So really the only options are to try and hack 340 00:21:00,920 --> 00:21:04,200 Speaker 1: the speakers, which isn't really an option most people would 341 00:21:04,240 --> 00:21:06,919 Speaker 1: feel comfortable trying to tackle and would probably have limited 342 00:21:07,040 --> 00:21:10,000 Speaker 1: use anyway, or you could send the speaker to an 343 00:21:10,040 --> 00:21:13,360 Speaker 1: e waste recycling facility, or you just throw the darn 344 00:21:13,440 --> 00:21:17,200 Speaker 1: thing away. That's not a great option. It adds e waste. 345 00:21:17,240 --> 00:21:20,479 Speaker 1: E waste is bad stuff, and recycling, while better than 346 00:21:20,560 --> 00:21:25,080 Speaker 1: throwing stuff out, isn't as environmentally friendly as reusing stuff. 347 00:21:25,400 --> 00:21:28,840 Speaker 1: If you've ever heard reduce, reuse, recycle, well it's in 348 00:21:28,920 --> 00:21:31,840 Speaker 1: that order of preference. You want to reduce the amount 349 00:21:31,840 --> 00:21:35,080 Speaker 1: of waste you generate. You want to reuse stuff as 350 00:21:35,160 --> 00:21:38,200 Speaker 1: much as you can. The stuff you can't reuse, you recycle. 351 00:21:38,680 --> 00:21:42,040 Speaker 1: The stuff you can't reuse or recycle, then you can 352 00:21:42,119 --> 00:21:45,960 Speaker 1: throw away. But even that is, you know, not great. 353 00:21:46,560 --> 00:21:51,680 Speaker 1: So Wilson's point was that Sonos's program actually incentivizes 354 00:21:52,000 --> 00:21:55,520 Speaker 1: creating e waste. It encourages people to break their old 355 00:21:55,600 --> 00:21:59,040 Speaker 1: device in order to get this thirty percent credit towards 356 00:21:59,040 --> 00:22:03,240 Speaker 1: their next purchase, and it makes those old devices useless 357 00:22:03,280 --> 00:22:07,080 Speaker 1: to anyone.
And sure, these people might go and 358 00:22:07,200 --> 00:22:11,399 Speaker 1: recycle their old Sonos speakers, but it's not as good 359 00:22:11,440 --> 00:22:14,120 Speaker 1: an option as keeping the equipment in working order 360 00:22:14,440 --> 00:22:16,680 Speaker 1: so that someone else can actually make use of it, 361 00:22:16,920 --> 00:22:19,800 Speaker 1: rather than for it to just go to waste. Critics 362 00:22:19,840 --> 00:22:24,040 Speaker 1: have said this really isn't about reducing waste, it's about 363 00:22:24,119 --> 00:22:27,680 Speaker 1: Sonos limiting the viability of a secondary market. Because 364 00:22:28,200 --> 00:22:31,560 Speaker 1: Sonos doesn't make money off of someone selling off an 365 00:22:31,600 --> 00:22:34,879 Speaker 1: old pair of speakers or anything like that, the company 366 00:22:34,960 --> 00:22:39,040 Speaker 1: has a financial incentive to discourage aftermarket resales and to 367 00:22:39,119 --> 00:22:42,159 Speaker 1: create pathways for people to buy directly from Sonos 368 00:22:42,320 --> 00:22:46,159 Speaker 1: or from retailers who are carrying Sonos speakers. So 369 00:22:46,440 --> 00:22:49,280 Speaker 1: the criticism states that this is a devious way for 370 00:22:49,400 --> 00:22:53,000 Speaker 1: Sonos to play the environmentally conscious card, you know, to 371 00:22:53,080 --> 00:22:56,640 Speaker 1: make it look like they're being eco friendly, while actually 372 00:22:56,720 --> 00:22:59,800 Speaker 1: they're taking aim at a market that can undercut their 373 00:22:59,800 --> 00:23:03,119 Speaker 1: own revenues, that being the resale market. You know, 374 00:23:03,200 --> 00:23:05,960 Speaker 1: we've seen this with other properties as well, other types 375 00:23:06,000 --> 00:23:09,000 Speaker 1: of gadgets and electronics as well as video games.
This 376 00:23:09,160 --> 00:23:12,120 Speaker 1: idea of getting rid of that resale market in order 377 00:23:12,200 --> 00:23:14,600 Speaker 1: to create the incentive for people to go out and 378 00:23:14,720 --> 00:23:18,280 Speaker 1: buy new copies as opposed to used copies or used 379 00:23:19,160 --> 00:23:22,240 Speaker 1: devices in this case. All right, well, how about we 380 00:23:22,359 --> 00:23:25,800 Speaker 1: do some Apple news. A big piece of news regarding 381 00:23:25,800 --> 00:23:29,320 Speaker 1: Apple broke in late June two thousand nineteen, when Jonathan Ive, 382 00:23:29,560 --> 00:23:33,119 Speaker 1: better known as Sir Jony Ive, announced he was leaving 383 00:23:33,320 --> 00:23:36,800 Speaker 1: the company. He had been with Apple for nearly three decades, 384 00:23:37,160 --> 00:23:40,000 Speaker 1: working primarily in design. He joined the company back in 385 00:23:40,160 --> 00:23:42,560 Speaker 1: nineteen ninety six when it was in a bit of a pickle. 386 00:23:42,760 --> 00:23:45,480 Speaker 1: That was a year before the ousted Steve Jobs would 387 00:23:45,520 --> 00:23:48,440 Speaker 1: return to the company. He was one of the influential 388 00:23:48,520 --> 00:23:52,200 Speaker 1: designers who defined Apple's design approach, setting the stage for 389 00:23:52,280 --> 00:23:56,800 Speaker 1: the company's meteoric rise and success. He announced he would 390 00:23:56,800 --> 00:23:59,800 Speaker 1: be heading up a new design company called LoveFrom 391 00:24:00,359 --> 00:24:03,240 Speaker 1: and that he would still work with Apple on projects, 392 00:24:03,400 --> 00:24:06,840 Speaker 1: just from an independent business owner standpoint as opposed to 393 00:24:06,960 --> 00:24:11,119 Speaker 1: an Apple executive.
Ive played an intrinsic role in designing 394 00:24:11,200 --> 00:24:13,960 Speaker 1: some of Apple's defining products over the last twenty years, 395 00:24:14,200 --> 00:24:16,000 Speaker 1: so it will be interesting to see how the company 396 00:24:16,080 --> 00:24:21,160 Speaker 1: moves forward. Ive and Jobs together were often cited as 397 00:24:21,320 --> 00:24:24,479 Speaker 1: the visionaries who kind of set Apple on its course, 398 00:24:25,240 --> 00:24:28,560 Speaker 1: and people have been asking what Apple's up to ever 399 00:24:28,680 --> 00:24:32,159 Speaker 1: since Jobs passed away. So with Ive's departure, I'm wondering 400 00:24:32,200 --> 00:24:35,080 Speaker 1: how that's going to affect the company as well. Apple 401 00:24:35,200 --> 00:24:37,600 Speaker 1: wasn't just dealing with the departure of one of its 402 00:24:37,680 --> 00:24:41,000 Speaker 1: more famous employees. However, the company also had some other 403 00:24:41,119 --> 00:24:44,399 Speaker 1: snags in twenty nineteen due to product issues. At the 404 00:24:44,480 --> 00:24:47,640 Speaker 1: beginning of the year, Benjamin Mayo of the Apple news 405 00:24:47,760 --> 00:24:51,720 Speaker 1: and rumors site nine to five Mac broke a big 406 00:24:51,880 --> 00:24:55,600 Speaker 1: story about a vulnerability in the company's FaceTime app. So, 407 00:24:55,720 --> 00:24:59,040 Speaker 1: for those who aren't familiar, FaceTime is a video chat app. 408 00:24:59,160 --> 00:25:03,119 Speaker 1: It's so you can make video calls on iOS devices, 409 00:25:03,200 --> 00:25:05,760 Speaker 1: though I should add that at least one Apple user 410 00:25:05,880 --> 00:25:08,240 Speaker 1: had tried to warn the company more than a week 411 00:25:08,400 --> 00:25:12,879 Speaker 1: before the story broke.
They had found this independently, so 412 00:25:13,280 --> 00:25:16,280 Speaker 1: it was a known issue, arguably not just to the 413 00:25:16,320 --> 00:25:19,920 Speaker 1: Apple community but to Apple itself before the story broke 414 00:25:20,440 --> 00:25:23,359 Speaker 1: in nine to five Mac. Anyway, Mayo found that if 415 00:25:23,440 --> 00:25:26,960 Speaker 1: you used FaceTime to make a call between any devices 416 00:25:27,160 --> 00:25:30,119 Speaker 1: running iOS version twelve point one or later, and then 417 00:25:30,200 --> 00:25:32,720 Speaker 1: you added your own number into the call, as if 418 00:25:32,760 --> 00:25:37,040 Speaker 1: you were conferencing in yourself, then you, the person making the call, 419 00:25:37,240 --> 00:25:39,840 Speaker 1: would be able to hear the audio from the 420 00:25:39,920 --> 00:25:43,800 Speaker 1: receiver's phone before the receiver had chosen to accept the 421 00:25:43,840 --> 00:25:46,399 Speaker 1: call in the first place. So all you would have 422 00:25:46,480 --> 00:25:49,320 Speaker 1: to do is put in someone's number, have it start 423 00:25:49,359 --> 00:25:53,119 Speaker 1: to dial a FaceTime call, conference in yourself, and you 424 00:25:53,200 --> 00:25:57,880 Speaker 1: could listen in on the other phone's microphone. You could 425 00:25:57,960 --> 00:26:01,440 Speaker 1: even use some options to activate the camera on 426 00:26:01,560 --> 00:26:04,240 Speaker 1: the phone. And it wasn't just phones, it was also, 427 00:26:04,640 --> 00:26:07,280 Speaker 1: you know, other mobile devices, but also Macs, because 428 00:26:07,359 --> 00:26:10,960 Speaker 1: macOS supported FaceTime as well, so if you did this, 429 00:26:11,280 --> 00:26:15,280 Speaker 1: you could presumably use someone's Mac computer to spy on 430 00:26:15,440 --> 00:26:19,280 Speaker 1: their home.
And because this would only work as 431 00:26:19,320 --> 00:26:22,240 Speaker 1: long as the other device was ringing, and FaceTime times 432 00:26:22,240 --> 00:26:26,000 Speaker 1: out after a certain amount of ringing, that ringing actually 433 00:26:26,119 --> 00:26:30,080 Speaker 1: lasts a lot longer on macOS. A call can 434 00:26:30,800 --> 00:26:34,040 Speaker 1: end up going much longer in the calling 435 00:26:34,160 --> 00:26:37,080 Speaker 1: phase for macOS because the thought is not everyone 436 00:26:37,240 --> 00:26:39,840 Speaker 1: is at their computer all the time, so you might 437 00:26:39,920 --> 00:26:42,359 Speaker 1: be across the house when the call comes in, you 438 00:26:42,440 --> 00:26:44,680 Speaker 1: might not hear it at first, so they have a 439 00:26:44,800 --> 00:26:50,120 Speaker 1: longer calling session that will last until there's an automatic 440 00:26:50,160 --> 00:26:55,280 Speaker 1: cut off, which means you could presumably spy longer until 441 00:26:55,359 --> 00:26:58,399 Speaker 1: someone noticed that there was a FaceTime call coming in. 442 00:26:58,760 --> 00:27:01,280 Speaker 1: So this was a huge flaw in the software, and 443 00:27:01,359 --> 00:27:06,159 Speaker 1: the vulnerability would be patched. But initially Apple's response was 444 00:27:06,240 --> 00:27:09,240 Speaker 1: just to suspend the group FaceTime feature so that you 445 00:27:09,280 --> 00:27:11,560 Speaker 1: couldn't conference anyone in at all. You could only do 446 00:27:11,680 --> 00:27:14,600 Speaker 1: person to person calls, you couldn't do conference calls. And 447 00:27:14,680 --> 00:27:17,840 Speaker 1: then in February twenty nineteen, the company pushed out the 448 00:27:17,840 --> 00:27:20,800 Speaker 1: patch that sealed up those vulnerabilities and re enabled group 449 00:27:20,920 --> 00:27:25,000 Speaker 1: FaceTime features.
Another problem Apple faced was the release of 450 00:27:25,119 --> 00:27:29,320 Speaker 1: iOS version thirteen and the release of macOS ten 451 00:27:29,480 --> 00:27:33,640 Speaker 1: point one five, a k a Catalina. Critics found problems 452 00:27:33,680 --> 00:27:37,920 Speaker 1: with both, identifying numerous bugs that prompted some tech reporters 453 00:27:38,200 --> 00:27:41,600 Speaker 1: to advise people, especially people looking at buying a 454 00:27:41,640 --> 00:27:45,320 Speaker 1: new iPhone, to wait for patches before updating to the 455 00:27:45,400 --> 00:27:48,880 Speaker 1: latest OS version or buying a new phone. Even Apple 456 00:27:49,240 --> 00:27:54,320 Speaker 1: announced iOS thirteen point one, a patch to thirteen point 457 00:27:54,359 --> 00:27:58,920 Speaker 1: oh, before thirteen point oh had even shipped, which indicated 458 00:27:59,119 --> 00:28:02,160 Speaker 1: that the initial release wasn't really ready for implementation. 459 00:28:02,600 --> 00:28:06,159 Speaker 1: So why was iOS thirteen so buggy? Or maybe I 460 00:28:06,160 --> 00:28:08,520 Speaker 1: should say, why is it? Because not all those bugs 461 00:28:08,560 --> 00:28:11,600 Speaker 1: have been fixed. Well, some people have suggested that Apple 462 00:28:11,760 --> 00:28:15,119 Speaker 1: was being overly aggressive when adding in new features to 463 00:28:15,200 --> 00:28:18,520 Speaker 1: the operating system, and that feature creep might have been 464 00:28:18,600 --> 00:28:23,080 Speaker 1: an issue.
David Shayer, an Apple software engineer, theorized that 465 00:28:23,160 --> 00:28:27,280 Speaker 1: perhaps teams working on certain features were reluctant to admit 466 00:28:27,560 --> 00:28:31,240 Speaker 1: when they were falling behind on deadlines, and that rather 467 00:28:31,320 --> 00:28:34,280 Speaker 1: than cutting back on features, rather than saying let's not 468 00:28:34,480 --> 00:28:36,639 Speaker 1: do this because it's taking too much time and we 469 00:28:36,760 --> 00:28:40,000 Speaker 1: need to ship, things were kept in the mix far 470 00:28:40,200 --> 00:28:43,760 Speaker 1: longer than they needed to be, and Shayer also listed 471 00:28:43,800 --> 00:28:47,080 Speaker 1: several other possible contributing factors. It's all in the post 472 00:28:47,240 --> 00:28:50,680 Speaker 1: on TidBITS dot com. I recommend checking that out if 473 00:28:50,720 --> 00:28:53,320 Speaker 1: you want to learn more. The piece is titled Six 474 00:28:53,520 --> 00:28:58,160 Speaker 1: Reasons Why iOS Thirteen and Catalina Are So Buggy, and 475 00:28:58,360 --> 00:29:02,000 Speaker 1: he goes into much greater detail there. Now, in late December, 476 00:29:02,320 --> 00:29:05,959 Speaker 1: Apple pushed out an update for its mobile operating system, 477 00:29:06,200 --> 00:29:09,680 Speaker 1: and this one's called iOS thirteen point three, which might 478 00:29:09,800 --> 00:29:14,160 Speaker 1: make you think it's the third update to 479 00:29:14,320 --> 00:29:17,360 Speaker 1: this version of iOS, but it's not the third update. It's actually 480 00:29:17,480 --> 00:29:22,880 Speaker 1: the eighth update since iOS thirteen was first announced. Many 481 00:29:22,960 --> 00:29:26,160 Speaker 1: of the updates were in the iOS thirteen point one 482 00:29:26,320 --> 00:29:30,080 Speaker 1: and iOS thirteen point two designations.
Oh and by the 483 00:29:30,120 --> 00:29:33,600 Speaker 1: time you hear this, iOS thirteen point three point one 484 00:29:33,800 --> 00:29:36,760 Speaker 1: might be available. It's currently in beta as I record 485 00:29:36,840 --> 00:29:41,040 Speaker 1: this episode. Gordon Kelly of Forbes suggests that if you 486 00:29:41,160 --> 00:29:43,880 Speaker 1: have a device running an earlier version of iOS thirteen, 487 00:29:43,920 --> 00:29:47,719 Speaker 1: you should absolutely update to thirteen point three, but if 488 00:29:47,760 --> 00:29:51,959 Speaker 1: you're still running iOS twelve, you might actually still want 489 00:29:52,000 --> 00:29:55,000 Speaker 1: to wait a little bit longer before you upgrade. He 490 00:29:55,120 --> 00:29:58,360 Speaker 1: does say that things are starting to look promising, but that 491 00:29:58,640 --> 00:30:02,040 Speaker 1: the initial months following the release of iOS thirteen were 492 00:30:02,080 --> 00:30:05,840 Speaker 1: pretty bad, when he would give a categorical skip this update 493 00:30:05,920 --> 00:30:10,320 Speaker 1: until it's fixed recommendation. That recommendation is slowly starting to 494 00:30:10,480 --> 00:30:14,920 Speaker 1: soften as these numerous patches are addressing some of the 495 00:30:15,000 --> 00:30:18,240 Speaker 1: more serious bugs and vulnerabilities that people have found in 496 00:30:18,320 --> 00:30:22,120 Speaker 1: iOS thirteen. But this has not been an illustrious launch 497 00:30:22,480 --> 00:30:26,600 Speaker 1: for Apple. Bugs in operating system updates are really nothing new. 498 00:30:26,800 --> 00:30:29,400 Speaker 1: I mean, it happens all the time. No one's perfect, 499 00:30:29,440 --> 00:30:32,760 Speaker 1: and operating systems are large and complicated pieces of software.
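Kelly's advice amounts to a simple rule over version numbers. Here's a hypothetical sketch of that rule in Python; the cutoff values just illustrate his advice at the time, and are not anything official from Apple or Forbes:

```python
def parse_version(version: str) -> tuple:
    """Turn a dotted version string like '13.3' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def should_update(current: str, stable: str = "13.3") -> bool:
    """Illustrative rule: devices already on the new major version should
    take every patch, while devices a full major version behind can
    defensibly keep waiting for the release to settle."""
    cur, target = parse_version(current), parse_version(stable)
    if cur[0] < target[0]:
        return False  # still on iOS 12: waiting it out is a reasonable choice
    return cur < target  # on 13.x but behind the latest patch: update
```

Tuple comparison does the heavy lifting here: (13, 1) sorts before (13, 3), so a device on thirteen point one gets told to update, while a device on twelve point anything is told it can keep waiting.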
500 00:30:32,840 --> 00:30:35,760 Speaker 1: But it does create a bit of an image problem, 501 00:30:35,840 --> 00:30:39,000 Speaker 1: particularly if you're a company like Apple that has built 502 00:30:39,040 --> 00:30:42,720 Speaker 1: itself on a reputation that its devices just work, and 503 00:30:42,800 --> 00:30:46,400 Speaker 1: it also complicates a discussion that relates back to data security. 504 00:30:47,000 --> 00:30:49,880 Speaker 1: Generally speaking, it's a good idea to keep as up 505 00:30:49,960 --> 00:30:53,320 Speaker 1: to date with operating system and security patches as you 506 00:30:53,400 --> 00:30:57,760 Speaker 1: possibly can. So if there's an update, generally speaking, it's 507 00:30:57,800 --> 00:31:01,840 Speaker 1: good to install right away. Now, eventually you might find 508 00:31:01,960 --> 00:31:05,640 Speaker 1: that your particular device can't support whatever the latest and 509 00:31:05,680 --> 00:31:09,600 Speaker 1: greatest version of the operating system is. That does happen 510 00:31:09,800 --> 00:31:13,520 Speaker 1: where the hardware itself cannot physically support the software, but 511 00:31:13,680 --> 00:31:16,720 Speaker 1: keeping up to date reduces the opportunities that hackers can 512 00:31:16,800 --> 00:31:20,920 Speaker 1: take to exploit vulnerabilities. However, when the operating system itself 513 00:31:21,200 --> 00:31:24,160 Speaker 1: is a buggy mess and the updates aren't much better, 514 00:31:24,680 --> 00:31:28,880 Speaker 1: it's not as clear cut a case that updating is 515 00:31:28,960 --> 00:31:31,920 Speaker 1: your best option. It may be that, yeah, you can update, 516 00:31:32,040 --> 00:31:35,440 Speaker 1: and that will technically patch some things, but could open 517 00:31:35,560 --> 00:31:38,480 Speaker 1: up either brand new vulnerabilities or might just make stuff 518 00:31:38,520 --> 00:31:42,000 Speaker 1: not work anymore. That's not great either. 
All right, let's 519 00:31:42,040 --> 00:31:46,400 Speaker 1: pop on over to Microsoft, Apple's old rival and sometimes savior. 520 00:31:46,560 --> 00:31:48,360 Speaker 1: If you've listened to old episodes of tech Stuff, you 521 00:31:48,400 --> 00:31:50,360 Speaker 1: know what I'm referring to. Now, I don't have a 522 00:31:50,440 --> 00:31:53,200 Speaker 1: whole lot to say about Microsoft in twenty nineteen. The 523 00:31:53,240 --> 00:31:56,680 Speaker 1: company has moved much of its operations into cloud based services, 524 00:31:57,080 --> 00:31:59,920 Speaker 1: but it did launch a product in late twenty nineteen 525 00:32:00,200 --> 00:32:03,200 Speaker 1: that has folks like me a little excited: the HoloLens 2. 526 00:32:03,280 --> 00:32:08,000 Speaker 1: Now, the HoloLens is an augmented reality platform, 527 00:32:08,160 --> 00:32:12,400 Speaker 1: and augmented reality involves overlaying digital information on top of 528 00:32:12,680 --> 00:32:16,120 Speaker 1: the real world around us in some way. Now, you 529 00:32:16,200 --> 00:32:19,760 Speaker 1: typically can achieve this through one of several approaches. You 530 00:32:19,840 --> 00:32:23,560 Speaker 1: could have special glasses that act as a projection screen 531 00:32:23,680 --> 00:32:26,360 Speaker 1: to display information in front of you as you look around, 532 00:32:26,440 --> 00:32:28,880 Speaker 1: so that you have digital information that you're looking at, 533 00:32:29,320 --> 00:32:31,640 Speaker 1: but you can also look through that and see the 534 00:32:31,720 --> 00:32:35,040 Speaker 1: real world beyond it. You could even have headphones that 535 00:32:35,160 --> 00:32:38,280 Speaker 1: feed you information by audio to enhance your experience of 536 00:32:38,360 --> 00:32:41,240 Speaker 1: moving through a physical environment. That's another type of augmented reality.
537 00:32:41,440 --> 00:32:44,400 Speaker 1: It's maybe not as flashy as the first type, but 538 00:32:44,520 --> 00:32:47,120 Speaker 1: it's still very legitimate. You could have an app on 539 00:32:47,200 --> 00:32:50,000 Speaker 1: a smartphone that can recognize certain images and display data 540 00:32:50,120 --> 00:32:52,240 Speaker 1: on top of a video view of the world. So 541 00:32:52,320 --> 00:32:54,680 Speaker 1: in this case, you're looking at the world through your 542 00:32:54,760 --> 00:32:58,560 Speaker 1: smartphone screen, which then can overlay digital information on top 543 00:32:58,600 --> 00:33:01,360 Speaker 1: of that video view. But it's as if you're looking 544 00:33:01,480 --> 00:33:03,479 Speaker 1: at the real world around you if you just kind 545 00:33:03,520 --> 00:33:07,640 Speaker 1: of ignore the fact that you're really looking at a monitor. Alright, 546 00:33:07,760 --> 00:33:11,680 Speaker 1: so the HoloLens and its sequel, HoloLens 2: 547 00:33:12,000 --> 00:33:15,120 Speaker 1: The Augmentening, or if you prefer, HoloLens 548 00:33:15,160 --> 00:33:19,240 Speaker 1: 2: Electric Boogaloo. It's a head mounted display, and the 549 00:33:19,320 --> 00:33:22,240 Speaker 1: first generation of the HoloLens received a very limited 550 00:33:22,320 --> 00:33:27,080 Speaker 1: release because it wasn't intended as a consumer electronics product. 551 00:33:27,160 --> 00:33:29,560 Speaker 1: It wasn't meant to go to the average person. It 552 00:33:29,680 --> 00:33:31,960 Speaker 1: was more of a first step into a new market 553 00:33:32,040 --> 00:33:35,280 Speaker 1: for Microsoft. The company launched the HoloLens 2 in 554 00:33:35,400 --> 00:33:38,960 Speaker 1: November twenty nineteen with a price tag of three thousand, 555 00:33:39,240 --> 00:33:42,760 Speaker 1: five hundred dollars. So it's still far from being priced 556 00:33:42,880 --> 00:33:46,360 Speaker 1: as a basic component of home computing. Right.
No one, 557 00:33:47,240 --> 00:33:49,160 Speaker 1: not your average person, is going to go 558 00:33:49,240 --> 00:33:53,720 Speaker 1: out and adopt the HoloLens 2. I will 559 00:33:53,760 --> 00:33:56,280 Speaker 1: not be buying a HoloLens 2. I just can't 560 00:33:56,360 --> 00:34:00,640 Speaker 1: justify that expense for something that, you know, would be interesting 561 00:34:00,880 --> 00:34:04,640 Speaker 1: but have limited utility in my life. However, you could 562 00:34:04,720 --> 00:34:08,440 Speaker 1: say that this is slowly moving this technology into the 563 00:34:08,520 --> 00:34:11,839 Speaker 1: consumer space. Now, the new version of the HoloLens 564 00:34:11,880 --> 00:34:14,400 Speaker 1: has an improved field of view, so users will have 565 00:34:14,560 --> 00:34:17,800 Speaker 1: a less restricted view of the world around them. To me, 566 00:34:17,960 --> 00:34:20,360 Speaker 1: the headset looks kind of like, imagine you've got a 567 00:34:20,400 --> 00:34:24,480 Speaker 1: pair of safety goggles. And then above the safety goggles 568 00:34:24,520 --> 00:34:26,480 Speaker 1: that you look through, so those are clear, you're looking 569 00:34:26,520 --> 00:34:29,120 Speaker 1: at the world around you. Above that, you've got a 570 00:34:29,239 --> 00:34:32,840 Speaker 1: device that has a camera system mounted inside of it, 571 00:34:33,480 --> 00:34:36,640 Speaker 1: and that part is actually attached to like a headband, 572 00:34:36,960 --> 00:34:39,800 Speaker 1: so you're wearing that on your forehead. And then below 573 00:34:39,960 --> 00:34:42,279 Speaker 1: that is where the safety goggles are. That's kind of 574 00:34:42,320 --> 00:34:45,320 Speaker 1: what it looks like. I'm doing a poor job describing it, 575 00:34:45,400 --> 00:34:48,000 Speaker 1: but it's hard to do in an audio format.
But 576 00:34:48,120 --> 00:34:52,320 Speaker 1: the cameras that face out from this device capture the 577 00:34:52,440 --> 00:34:55,560 Speaker 1: scene around you, right. It takes in that information and 578 00:34:55,680 --> 00:34:59,759 Speaker 1: interprets it through the computational system inside the device, which 579 00:34:59,800 --> 00:35:05,200 Speaker 1: then determines what data to display on your lenses 580 00:35:05,640 --> 00:35:09,400 Speaker 1: when you're looking at something in particular. So as an example, 581 00:35:09,800 --> 00:35:12,800 Speaker 1: let's say you're looking at an electrical panel, then the 582 00:35:12,960 --> 00:35:16,120 Speaker 1: data that pops up might tell you what each element 583 00:35:16,160 --> 00:35:19,040 Speaker 1: on that electrical panel relates back to, so it's kind 584 00:35:19,040 --> 00:35:21,000 Speaker 1: of like a labeling system in that case. That's just 585 00:35:21,160 --> 00:35:25,040 Speaker 1: one use case for this kind of technology. The company 586 00:35:25,160 --> 00:35:30,279 Speaker 1: also tweaked the gesture control interface that the HoloLens uses. 587 00:35:30,480 --> 00:35:33,120 Speaker 1: This was to improve on responsiveness and to cut down 588 00:35:33,160 --> 00:35:37,200 Speaker 1: on false positives. So gestures obviously would be an important 589 00:35:37,320 --> 00:35:39,520 Speaker 1: way to control this kind of technology. You might use 590 00:35:39,680 --> 00:35:42,000 Speaker 1: voice control as well. Google Glass did that, but it 591 00:35:42,080 --> 00:35:45,000 Speaker 1: seems weird because that sounds like you're talking to yourself. 592 00:35:45,440 --> 00:35:47,239 Speaker 1: I can say that from personal experience, because I got 593 00:35:47,280 --> 00:35:49,960 Speaker 1: to play with Google Glass for a while. But the 594 00:35:50,080 --> 00:35:52,279 Speaker 1: gesture controls, they did something that I thought was pretty clever.
595 00:35:53,040 --> 00:35:56,000 Speaker 1: So for example, they built in a system where you 596 00:35:56,040 --> 00:35:59,279 Speaker 1: would hold out your hand and you would look down 597 00:35:59,360 --> 00:36:01,239 Speaker 1: at your hand, and by moving your head 598 00:36:01,280 --> 00:36:04,360 Speaker 1: a little bit, you could position an icon so that 599 00:36:04,560 --> 00:36:07,560 Speaker 1: it appears to be projected onto the palm of your hand. 600 00:36:08,120 --> 00:36:10,880 Speaker 1: Then you could touch that icon with the fingers of 601 00:36:10,960 --> 00:36:14,320 Speaker 1: your other hand to activate it and launch whatever the 602 00:36:14,360 --> 00:36:16,719 Speaker 1: app is. And that's kind of neat. It adds a 603 00:36:16,760 --> 00:36:20,400 Speaker 1: sort of tactile response to the gesture control that otherwise 604 00:36:20,520 --> 00:36:22,759 Speaker 1: was lacking. Now I haven't been able to try a 605 00:36:22,840 --> 00:36:26,520 Speaker 1: HoloLens of either generation yet, but I hope I 606 00:36:26,680 --> 00:36:30,280 Speaker 1: can at some point. I love the potential of augmented reality, 607 00:36:30,640 --> 00:36:34,719 Speaker 1: and I think really clever implementations have enormous possibilities in 608 00:36:34,760 --> 00:36:37,040 Speaker 1: the future. But I think it's probably gonna be 609 00:36:37,120 --> 00:36:39,960 Speaker 1: several more years before we see this as a common 610 00:36:40,040 --> 00:36:44,040 Speaker 1: technology for the everyday person. However, for certain industries I 611 00:36:44,120 --> 00:36:47,000 Speaker 1: suspect it will play a much larger role moving forward.
612 00:36:47,200 --> 00:36:49,720 Speaker 1: We've already seen it being used in the medical field 613 00:36:49,920 --> 00:36:53,640 Speaker 1: and engineering. We'll probably see it move beyond that slowly, 614 00:36:54,080 --> 00:36:57,360 Speaker 1: and then gradually we'll see it possibly enter into the 615 00:36:57,440 --> 00:37:01,040 Speaker 1: mainstream market if there's a compelling enough use case. If 616 00:37:01,080 --> 00:37:03,800 Speaker 1: it's more of a curiosity, and I would argue Google Glass 617 00:37:03,880 --> 00:37:06,440 Speaker 1: kind of fell into that category, then it probably won't 618 00:37:06,440 --> 00:37:09,880 Speaker 1: gain much traction, kind of like how virtual reality has 619 00:37:09,920 --> 00:37:13,759 Speaker 1: been struggling. All right, when we come back, we'll 620 00:37:13,800 --> 00:37:16,680 Speaker 1: talk about some more stories, including touching on what a 621 00:37:16,719 --> 00:37:28,160 Speaker 1: little company that rhymes with Schmoogle has been up to. Okay, 622 00:37:28,280 --> 00:37:33,000 Speaker 1: let's talk Google. So, like Facebook, Google came under intense 623 00:37:33,200 --> 00:37:37,080 Speaker 1: scrutiny throughout two thousand nineteen, whether it was about user 624 00:37:37,200 --> 00:37:42,240 Speaker 1: privacy or allegations that the company was covering up really 625 00:37:42,719 --> 00:37:47,360 Speaker 1: terrible behavior and turning a blind eye despite employee protests, 626 00:37:47,600 --> 00:37:51,480 Speaker 1: or allegations that the company's search results were purposefully promoting 627 00:37:51,800 --> 00:37:57,000 Speaker 1: certain material, specifically sites that were in alignment with Google's 628 00:37:57,040 --> 00:38:00,600 Speaker 1: own agenda, as perceived by the public I should say, 629 00:38:01,239 --> 00:38:03,640 Speaker 1: at the expense of other materials.
So, in other words, 630 00:38:04,000 --> 00:38:08,360 Speaker 1: that Google was promoting things that fell in line with 631 00:38:08,520 --> 00:38:12,400 Speaker 1: what Google wanted and suppressing anything that Google didn't like. 632 00:38:12,760 --> 00:38:15,200 Speaker 1: That was the charge. The company had to weather a 633 00:38:15,280 --> 00:38:17,319 Speaker 1: lot of strife in twenty nineteen, and to be clear, 634 00:38:17,400 --> 00:38:19,719 Speaker 1: at least some of that strife was brought on by 635 00:38:19,880 --> 00:38:23,280 Speaker 1: the company itself. One big change for the company actually 636 00:38:23,320 --> 00:38:25,719 Speaker 1: requires us to take a step back and look up 637 00:38:26,000 --> 00:38:29,720 Speaker 1: a level higher than Google itself, at Google's parent company, 638 00:38:29,960 --> 00:38:34,680 Speaker 1: which, as I mentioned earlier, is Alphabet. Now, in early December twenty nineteen, 639 00:38:35,160 --> 00:38:38,440 Speaker 1: Larry Page, a co founder of Google, announced that he 640 00:38:38,600 --> 00:38:43,000 Speaker 1: was stepping down as the CEO of Alphabet. Sundar Pichai, the 641 00:38:43,080 --> 00:38:47,320 Speaker 1: CEO of Google itself, would become the new CEO of Alphabet, 642 00:38:47,520 --> 00:38:51,880 Speaker 1: while simultaneously remaining CEO of Google. Sergey Brin, another 643 00:38:52,080 --> 00:38:55,959 Speaker 1: co founder, also stepped down as Alphabet's president, but both 644 00:38:56,280 --> 00:38:58,480 Speaker 1: Page and Brin said they were going to remain on 645 00:38:58,560 --> 00:39:01,880 Speaker 1: the board of directors for the company.
Page and 646 00:39:01,960 --> 00:39:04,760 Speaker 1: Brin said that their decisions reflected a need for Google's 647 00:39:04,800 --> 00:39:08,239 Speaker 1: management structure to streamline, but it also came at a time 648 00:39:08,280 --> 00:39:11,120 Speaker 1: when the company was dealing with big problems from within 649 00:39:11,360 --> 00:39:14,680 Speaker 1: and without. There was the scrutiny that I mentioned earlier, 650 00:39:14,760 --> 00:39:18,480 Speaker 1: in which government agencies and advocacy groups were criticizing Google's 651 00:39:18,560 --> 00:39:22,719 Speaker 1: policies and talking about its power in the marketplace, and 652 00:39:22,880 --> 00:39:25,880 Speaker 1: there were also allegations that the company had engaged in 653 00:39:25,960 --> 00:39:30,160 Speaker 1: some retaliation against a few employees. Now to understand that 654 00:39:30,280 --> 00:39:32,560 Speaker 1: last bit, we have to look back a little further 655 00:39:32,680 --> 00:39:35,680 Speaker 1: than the beginning of twenty nineteen. So in twenty eighteen, 656 00:39:35,840 --> 00:39:39,520 Speaker 1: internal issues within Google became big news as thousands of 657 00:39:39,600 --> 00:39:43,800 Speaker 1: employees protested issues ranging from sexual harassment problems in the 658 00:39:43,840 --> 00:39:48,520 Speaker 1: workplace to quote, unethical business decisions that create a workplace 659 00:39:48,560 --> 00:39:52,319 Speaker 1: that is harmful to us and our colleagues end quote. Now, 660 00:39:52,400 --> 00:39:55,840 Speaker 1: that last quote actually comes from four former Google employees 661 00:39:56,040 --> 00:39:59,360 Speaker 1: who posted a piece on Medium after they were fired. 662 00:40:00,000 --> 00:40:04,520 Speaker 1: This was around Thanksgiving twenty nineteen.
The four employees had 663 00:40:04,600 --> 00:40:09,200 Speaker 1: been leading efforts to unionize at Google to organize employees, 664 00:40:09,760 --> 00:40:12,120 Speaker 1: and it was a move that did not look good 665 00:40:12,200 --> 00:40:15,320 Speaker 1: on Google's part, and it certainly appears at a casual 666 00:40:15,440 --> 00:40:19,400 Speaker 1: glance at least, that Google executives were trying to stop 667 00:40:19,600 --> 00:40:22,680 Speaker 1: employees from being able to organize in a union or 668 00:40:22,800 --> 00:40:28,520 Speaker 1: other organizational structure. The company's official response to inquiries about 669 00:40:28,680 --> 00:40:32,840 Speaker 1: the firings was that the employees had allegedly violated Google's 670 00:40:32,920 --> 00:40:37,400 Speaker 1: security policies, an allegation that the four former employees deny. 671 00:40:37,920 --> 00:41:41,120 Speaker 1: The story is still playing out as I record this episode, 672 00:40:41,520 --> 00:40:44,920 Speaker 1: and it does tap into another upsetting trend in business 673 00:40:45,000 --> 00:40:48,040 Speaker 1: in general and the tech sphere in particular. It has 674 00:40:48,080 --> 00:40:51,680 Speaker 1: become pretty common practice for a lot of companies to 675 00:40:51,840 --> 00:40:55,760 Speaker 1: require employees to sign an agreement that limits the rights 676 00:40:56,080 --> 00:40:59,080 Speaker 1: an employee has when they want to address problems in 677 00:40:59,120 --> 00:41:02,480 Speaker 1: the workplace. Companies enact these policies so that they can 678 00:41:02,560 --> 00:41:05,719 Speaker 1: limit their own liability and limit the impact those types 679 00:41:05,760 --> 00:41:09,080 Speaker 1: of problems can have on business.
The agreements typically force 680 00:41:09,200 --> 00:41:13,480 Speaker 1: employees to try and work through issues through internal systems 681 00:41:13,520 --> 00:41:17,360 Speaker 1: at the company, like going through human resources, and it 682 00:41:17,480 --> 00:41:21,400 Speaker 1: really places restrictions on other options, such as pursuing a 683 00:41:21,560 --> 00:41:24,680 Speaker 1: legal case against the company, like you could get severely 684 00:41:25,120 --> 00:41:30,160 Speaker 1: punished for going outside the company and seeking outside help. Now, 685 00:41:30,280 --> 00:41:33,680 Speaker 1: assuming the HR department is on the side of the employees, 686 00:41:34,520 --> 00:41:38,080 Speaker 1: you could maybe argue that this policy isn't too restrictive. 687 00:41:38,160 --> 00:41:40,120 Speaker 1: It might be. You might not feel great about it, 688 00:41:40,239 --> 00:41:42,120 Speaker 1: but if you think, oh, well, HR is gonna be 689 00:41:42,239 --> 00:41:45,600 Speaker 1: on the employees' side, maybe you're going to say, well, 690 00:41:45,640 --> 00:41:48,680 Speaker 1: I'm willing to endure it. But at least in some cases, 691 00:41:48,760 --> 00:41:52,640 Speaker 1: particularly with Google, it's appeared that the HR department was 692 00:41:52,719 --> 00:41:55,480 Speaker 1: really more on the side of the corporation than the employees, 693 00:41:55,960 --> 00:41:59,520 Speaker 1: that they had a tendency to shut down complaints or 694 00:41:59,600 --> 00:42:02,080 Speaker 1: to try to mitigate the fallout of complaints by kind 695 00:42:02,120 --> 00:42:06,239 Speaker 1: of shifting people around without removing or punishing anyone who 696 00:42:06,400 --> 00:42:10,239 Speaker 1: was the focal point of an allegation, and so they 697 00:42:10,280 --> 00:42:14,320 Speaker 1: weren't really addressing the underlying issues, which left the affected 698 00:42:14,360 --> 00:42:17,920 Speaker 1: employees with very few options.
And it's pretty ugly stuff. 699 00:42:18,239 --> 00:42:19,880 Speaker 1: So that's one of the things that I think a 700 00:42:19,960 --> 00:42:23,920 Speaker 1: lot of employees around the world really, but particularly in 701 00:42:24,000 --> 00:42:28,280 Speaker 1: the tech space, have started to kind of push back against. 702 00:42:28,960 --> 00:42:31,280 Speaker 1: And we're not done with Google yet. The company launched 703 00:42:31,320 --> 00:42:35,640 Speaker 1: its gaming service, Google Stadia, in two thousand nineteen. That 704 00:42:35,800 --> 00:42:39,960 Speaker 1: service allows users to access games via streaming, so you're 705 00:42:40,200 --> 00:42:44,040 Speaker 1: actually running the game on one of Google's servers and 706 00:42:44,680 --> 00:42:48,920 Speaker 1: you're playing it via your local connection. According to Google, 707 00:42:49,160 --> 00:42:51,840 Speaker 1: you can stream games up to four K resolution and 708 00:42:51,960 --> 00:42:56,040 Speaker 1: sixty frames per second, assuming that your Internet connection and 709 00:42:56,160 --> 00:42:59,520 Speaker 1: your hardware can support that. The service launched with some 710 00:42:59,640 --> 00:43:03,480 Speaker 1: stability problems and with a pretty limited library of games, 711 00:43:03,520 --> 00:43:07,160 Speaker 1: and so far it hasn't really taken off, despite the 712 00:43:07,200 --> 00:43:10,120 Speaker 1: fact that it removes the need for buying a high 713 00:43:10,239 --> 00:43:13,800 Speaker 1: end gaming rig or even a gaming console to access 714 00:43:13,960 --> 00:43:17,080 Speaker 1: current generation games. Now, to be fair, Google is not 715 00:43:17,239 --> 00:43:19,719 Speaker 1: the only company that has tried this model with only 716 00:43:19,880 --> 00:43:23,120 Speaker 1: limited success.
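To get a feel for why that Internet-connection caveat matters so much for game streaming, here is a back-of-envelope sketch. The bits-per-pixel and compression figures are illustrative assumptions on my part, not Google's published Stadia numbers; the point is just that 4K at sixty frames per second is a lot of data even after heavy compression.

```python
# Back-of-envelope estimate of the bandwidth a game stream needs.
# bits_per_pixel=12 approximates 4:2:0 chroma-subsampled video, and
# compression_ratio=100 is a rough stand-in for a modern codec; both
# are illustrative assumptions, not Stadia's real parameters.

def streaming_bitrate_mbps(width, height, fps,
                           bits_per_pixel=12, compression_ratio=100):
    """Rough compressed video bitrate in megabits per second."""
    raw_bits_per_second = width * height * bits_per_pixel * fps
    return raw_bits_per_second / compression_ratio / 1_000_000

# 4K at 60 frames per second under these assumptions.
print(round(streaming_bitrate_mbps(3840, 2160, 60), 1))  # prints 59.7
```

Even with a generous 100:1 compression assumption, that works out to tens of megabits per second sustained, which is why the experience degrades quickly on an ordinary home connection.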
There are lots of companies that have tried 717 00:43:23,200 --> 00:43:26,720 Speaker 1: a similar approach and also have had some issues getting 718 00:43:26,760 --> 00:43:30,960 Speaker 1: anywhere with it. Meanwhile, over at YouTube, which again is 719 00:43:31,000 --> 00:43:33,839 Speaker 1: part of Google, it's part of that Alphabet company, other 720 00:43:33,960 --> 00:43:38,640 Speaker 1: problems were plaguing Googlers. So in September of twenty nineteen, 721 00:43:38,719 --> 00:43:44,040 Speaker 1: YouTube changed its policy for verified creators. Verified creators are 722 00:43:44,080 --> 00:43:46,680 Speaker 1: an interesting thing. So these are creators who had earned 723 00:43:46,760 --> 00:43:51,680 Speaker 1: a verified check mark, and that indicated, essentially, up 724 00:43:51,760 --> 00:43:55,600 Speaker 1: until recently, that they had attracted at least one hundred 725 00:43:55,680 --> 00:43:59,759 Speaker 1: thousand subscribers. I happen to be one of these verified 726 00:44:00,760 --> 00:44:04,080 Speaker 1: creators, but more on that in a minute, because 727 00:44:04,200 --> 00:44:08,399 Speaker 1: it's an interesting case. So in the past, YouTube had 728 00:44:08,440 --> 00:44:11,279 Speaker 1: issued verified check marks to accounts that had reached that 729 00:44:11,360 --> 00:44:14,480 Speaker 1: one hundred thousand subscriber mark or more.
That's all you 730 00:44:14,560 --> 00:44:16,040 Speaker 1: really had to do in order to get the verified 731 00:44:16,120 --> 00:44:22,080 Speaker 1: check. Now they decided to change that, so that not 732 00:44:22,280 --> 00:44:24,840 Speaker 1: only would you need the one hundred thousand subscribers, you 733 00:44:24,960 --> 00:44:28,120 Speaker 1: also need to have your account be active, meaning you 734 00:44:28,200 --> 00:44:31,080 Speaker 1: have to be uploading content on a semi regular basis, 735 00:44:31,160 --> 00:44:34,000 Speaker 1: and it can't, you know, have 736 00:44:34,080 --> 00:44:37,600 Speaker 1: been quiet without an upload for ages. Also, the accounts 737 00:44:37,640 --> 00:44:40,520 Speaker 1: needed to be authentic. In other words, the account needed 738 00:44:40,600 --> 00:44:43,200 Speaker 1: to be linked back to a real creator, brand, or 739 00:44:43,320 --> 00:44:46,720 Speaker 1: entity that YouTube could verify was in charge of creating 740 00:44:46,760 --> 00:44:51,280 Speaker 1: that content. This meant that a lot of the people 741 00:44:51,400 --> 00:44:55,719 Speaker 1: who had the little check mark didn't necessarily meet those requirements. 742 00:44:55,880 --> 00:45:01,480 Speaker 1: So then YouTube revoked the verified badge for thousands of creators, 743 00:45:02,000 --> 00:45:04,560 Speaker 1: and that created an uproar. I happened to be one 744 00:45:04,560 --> 00:45:06,800 Speaker 1: of the creators who lost my badge. I 745 00:45:06,840 --> 00:45:10,359 Speaker 1: got an email that said your verified badge is going away. Now, 746 00:45:10,400 --> 00:45:12,680 Speaker 1: in my case, I wasn't fussed about it. I'll explain 747 00:45:12,760 --> 00:45:16,200 Speaker 1: that again in just a minute.
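The policy change described above boils down to adding two new conditions on top of the old subscriber threshold. Here is a minimal sketch of that logic; the subscriber number comes from the episode, but the function and field names are hypothetical, not YouTube's actual API.

```python
# A sketch of the verification rules as described in the episode:
# the old policy needed only the subscriber threshold, while the 2019
# policy added "active" and "authentic" checks on top of it.
# Function and parameter names here are made up for illustration.

def eligible_for_badge(subscribers, is_active, is_authentic,
                       threshold=100_000):
    """Return (old-policy eligibility, new-policy eligibility)."""
    old_policy = subscribers >= threshold
    new_policy = old_policy and is_active and is_authentic
    return old_policy, new_policy

# A channel with 250,000 subscribers but no recent uploads:
# verified under the old rules, not under the new ones.
print(eligible_for_badge(250_000, is_active=False, is_authentic=True))
# prints (True, False)
```

That gap between the two tuples is exactly the group that got the "your badge is going away" email: channels that cleared the old bar but not the new one.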
So YouTube's motivation was 748 00:45:16,320 --> 00:45:21,560 Speaker 1: to clarify what a verification check mark actually meant, because 749 00:45:22,400 --> 00:45:24,799 Speaker 1: one confusion was that people thought that a check mark 750 00:45:24,920 --> 00:45:29,320 Speaker 1: meant that YouTube was endorsing the content of that creator, 751 00:45:29,560 --> 00:45:31,840 Speaker 1: that somehow YouTube was saying, yes, we approve of this. 752 00:45:32,040 --> 00:45:35,120 Speaker 1: This is what that check mark means. But it wasn't 753 00:45:35,120 --> 00:45:36,480 Speaker 1: meant to be that. It was meant to be an 754 00:45:36,520 --> 00:45:40,520 Speaker 1: indicator that the associated account was authentic and not some 755 00:45:40,680 --> 00:45:44,480 Speaker 1: sort of impersonation account. And this is a legit issue 756 00:45:44,880 --> 00:45:48,520 Speaker 1: over on YouTube, because a lot of creators, really popular creators, 757 00:45:49,080 --> 00:45:54,320 Speaker 1: see their work get lifted and reposted under other accounts. 758 00:45:54,800 --> 00:45:58,200 Speaker 1: So you might create a really awesome video and maybe 759 00:45:58,239 --> 00:46:00,959 Speaker 1: it gets a little bit of notice, so somebody else 760 00:46:01,239 --> 00:46:04,279 Speaker 1: captures that video, they use a program to download it, 761 00:46:04,320 --> 00:46:06,840 Speaker 1: then they re upload it under their own account, and 762 00:46:06,880 --> 00:46:09,120 Speaker 1: they try to get that one to take off. It's 763 00:46:09,160 --> 00:46:14,560 Speaker 1: even possible for a copy to outperform the original, and 764 00:46:14,719 --> 00:46:16,600 Speaker 1: that means that the person who originally put in the 765 00:46:16,640 --> 00:46:20,360 Speaker 1: work to make that thing what it is doesn't 766 00:46:20,400 --> 00:46:22,920 Speaker 1: get the benefit of it.
They aren't able to monetize 767 00:46:23,480 --> 00:46:27,480 Speaker 1: the appearance on the other channel. You can file 768 00:46:27,840 --> 00:46:29,839 Speaker 1: takedown requests and stuff like that, but it means having 769 00:46:29,880 --> 00:46:34,160 Speaker 1: to constantly, you know, search the internet, search YouTube for 770 00:46:34,360 --> 00:46:39,520 Speaker 1: copies of your work. So YouTube wanted to create a 771 00:46:39,680 --> 00:46:44,759 Speaker 1: system that would take a more straightforward approach to verification, 772 00:46:45,080 --> 00:46:46,759 Speaker 1: in the sense that if you saw the check mark, 773 00:46:47,360 --> 00:46:51,080 Speaker 1: you knew this person was legit, that the content 774 00:46:51,239 --> 00:46:54,600 Speaker 1: coming from that person's channel was in fact coming from 775 00:46:54,719 --> 00:47:00,040 Speaker 1: that creator. And so they put that change in place and 776 00:47:00,320 --> 00:47:05,000 Speaker 1: they revoked all those check marks. People went nuts, and 777 00:47:05,160 --> 00:47:07,560 Speaker 1: so YouTube walked it back, and a 778 00:47:07,600 --> 00:47:10,399 Speaker 1: week later it gave everybody their verified check marks back, 779 00:47:10,600 --> 00:47:14,160 Speaker 1: including me, which again, in my case, I don't think 780 00:47:14,200 --> 00:47:18,520 Speaker 1: it was necessary. So let me explain about my check mark. 781 00:47:18,920 --> 00:47:22,000 Speaker 1: So a few years ago, I was hosting a video 782 00:47:22,160 --> 00:47:26,320 Speaker 1: series called Forward Thinking, which was for HowStuffWorks, and the 783 00:47:26,400 --> 00:47:30,120 Speaker 1: channel for Forward Thinking was linked to my personal YouTube 784 00:47:30,280 --> 00:47:33,040 Speaker 1: channel for reasons I don't remember at this point.
I 785 00:47:33,080 --> 00:47:35,200 Speaker 1: think it was so that I could go in and 786 00:47:35,320 --> 00:47:38,719 Speaker 1: make changes if I needed to, even though typically we 787 00:47:38,719 --> 00:47:41,399 Speaker 1: would have other people handle all of that. For some reason, 788 00:47:41,440 --> 00:47:44,200 Speaker 1: they trusted me and they linked the channel to my 789 00:47:44,440 --> 00:47:48,560 Speaker 1: personal YouTube account. The result was I got a check 790 00:47:48,640 --> 00:47:52,799 Speaker 1: mark because the Forward Thinking series had a pretty good 791 00:47:52,920 --> 00:47:57,600 Speaker 1: subscriber base, like two hundred fifty thousand subscribers, so it met 792 00:47:57,680 --> 00:48:01,400 Speaker 1: the criteria and the check mark got linked to me. Now, 793 00:48:01,480 --> 00:48:04,680 Speaker 1: this is despite the fact that on my channel, I 794 00:48:04,960 --> 00:48:08,320 Speaker 1: very rarely post anything. I've only got a few videos 795 00:48:08,400 --> 00:48:10,960 Speaker 1: up on my personal channel, and when I do put 796 00:48:11,000 --> 00:48:13,880 Speaker 1: a video up, I get only a few views. You know, 797 00:48:13,920 --> 00:48:16,640 Speaker 1: it's typically like, hey, my mom watches my stuff, which 798 00:48:16,719 --> 00:48:19,000 Speaker 1: is totally fine. I was doing it for fun. 799 00:48:19,160 --> 00:48:21,920 Speaker 1: I wasn't trying to do it as a YouTuber, right, 800 00:48:22,520 --> 00:48:24,680 Speaker 1: So in my case, when my check mark went away, 801 00:48:24,760 --> 00:48:27,400 Speaker 1: I thought, you know, that's totally fair. I'm an outlier 802 00:48:27,600 --> 00:48:32,240 Speaker 1: and I don't meet this requirement. I definitely don't deserve 803 00:48:32,320 --> 00:48:34,920 Speaker 1: the check mark.
It's okay that it's gone. But there 804 00:48:34,960 --> 00:48:37,880 Speaker 1: are lots of creators out there who did deserve the 805 00:48:38,000 --> 00:48:41,120 Speaker 1: check mark and they saw it go away. So for them, 806 00:48:41,160 --> 00:48:44,520 Speaker 1: I'm glad that it came back, because that thing can 807 00:48:44,600 --> 00:48:48,160 Speaker 1: really help you. That check mark means that you have 808 00:48:48,239 --> 00:48:50,480 Speaker 1: a little bit more clout when it comes to stuff 809 00:48:50,560 --> 00:48:54,640 Speaker 1: like looking for sponsorships, maybe getting advertisers to support your channel, 810 00:48:54,760 --> 00:48:58,080 Speaker 1: monetizing your work so that you can get compensated for it. 811 00:48:58,960 --> 00:49:01,840 Speaker 1: That's important. Now in my case, again, I was doing it 812 00:49:01,920 --> 00:49:04,640 Speaker 1: for fun. I never expected it to be anything beyond 813 00:49:04,719 --> 00:49:07,480 Speaker 1: that for my own personal channel, so I didn't worry 814 00:49:07,520 --> 00:49:09,520 Speaker 1: about it. Forward Thinking was a different story, but that 815 00:49:09,680 --> 00:49:13,360 Speaker 1: was also a project that had a company backing it, 816 00:49:13,480 --> 00:49:16,560 Speaker 1: so that was a totally different case. So yeah, I'm 817 00:49:16,600 --> 00:49:20,600 Speaker 1: glad that it got fixed, and moving forward, YouTube is 818 00:49:20,680 --> 00:49:24,959 Speaker 1: being much more picky about who gets a verification check mark, 819 00:49:25,040 --> 00:49:30,280 Speaker 1: but they're not wholesale eliminating all the previously awarded check marks.
820 00:49:30,719 --> 00:49:33,960 Speaker 1: But another controversy that's also playing out on YouTube is 821 00:49:34,000 --> 00:49:36,920 Speaker 1: one that's going to roll into twenty twenty and beyond, and it 822 00:49:36,960 --> 00:49:39,240 Speaker 1: all has to do with a law from the nineteen 823 00:49:39,360 --> 00:49:43,200 Speaker 1: nineties intended to protect children, and that law is a 824 00:49:43,360 --> 00:49:47,480 Speaker 1: US law. It's called the Children's Online Privacy Protection Act, or 825 00:49:47,680 --> 00:49:51,600 Speaker 1: COPPA, C O P P A. It was, again, first established 826 00:49:51,680 --> 00:49:56,360 Speaker 1: as law back in the nineties, and in twenty nineteen, the Federal Trade Commission, 827 00:49:56,600 --> 00:50:01,239 Speaker 1: also known as the FTC, brought a suit against YouTube 828 00:50:01,520 --> 00:50:04,880 Speaker 1: and alleged that YouTube had been illegally collecting the personal 829 00:50:04,960 --> 00:50:09,440 Speaker 1: information of children without their parents' consent, that kids were 830 00:50:09,520 --> 00:50:13,160 Speaker 1: watching videos on YouTube, that what they were watching was 831 00:50:13,239 --> 00:50:15,960 Speaker 1: being tracked by YouTube, and that this was creating a 832 00:50:16,040 --> 00:50:21,080 Speaker 1: digital fingerprint that advertisers were using to target advertising towards 833 00:50:21,160 --> 00:50:24,240 Speaker 1: those children, and the children being too young to consent 834 00:50:24,320 --> 00:50:28,840 Speaker 1: to this meant that this whole practice under COPPA was illegal, 835 00:50:29,440 --> 00:50:33,600 Speaker 1: and specifically that the company was using this quote in 836 00:50:33,680 --> 00:50:36,799 Speaker 1: the form of persistent identifiers that are used to track 837 00:50:36,960 --> 00:50:40,600 Speaker 1: users across the Internet end quote.
So in other words, 838 00:50:40,680 --> 00:50:42,440 Speaker 1: this would be the sort of thing where if you 839 00:50:42,480 --> 00:50:45,560 Speaker 1: were watching a bunch of videos about elephants and then 840 00:50:45,719 --> 00:50:49,440 Speaker 1: you happened to navigate over to, say, Amazon, you might 841 00:50:49,560 --> 00:50:52,960 Speaker 1: see a bunch of suggestions that relate in some way 842 00:50:53,040 --> 00:50:56,839 Speaker 1: to elephants. And the concern was that this was going 843 00:50:56,880 --> 00:50:59,279 Speaker 1: to be targeting kids, and there was no way for 844 00:50:59,400 --> 00:51:01,880 Speaker 1: kids to give legal consent to allow that to happen, 845 00:51:02,200 --> 00:51:06,400 Speaker 1: and that data has value in it, and children's privacy 846 00:51:06,440 --> 00:51:10,279 Speaker 1: and security also has value to them, so that was 847 00:51:10,360 --> 00:51:15,000 Speaker 1: the problem. Well, YouTube would settle this lawsuit out of court. 848 00:51:15,120 --> 00:51:18,239 Speaker 1: They paid a hundred seventy million dollars in fines, which 849 00:51:18,280 --> 00:51:20,960 Speaker 1: really sounds like a lot, but for YouTube, it's nothing. 850 00:51:21,600 --> 00:51:23,520 Speaker 1: And if that's where it all ended, we would just 851 00:51:23,800 --> 00:51:26,080 Speaker 1: wrap up the story and be done with it. But in 852 00:51:26,160 --> 00:51:28,880 Speaker 1: addition to the fine, the company had to agree to 853 00:51:28,960 --> 00:51:32,759 Speaker 1: create a system that is compliant with COPPA. So this 854 00:51:32,840 --> 00:51:37,400 Speaker 1: would mean that any creator who was making child directed content, 855 00:51:37,640 --> 00:51:41,719 Speaker 1: meaning content meant to be viewed by children, would be 856 00:51:41,800 --> 00:51:44,360 Speaker 1: affected by this. They would have to be COPPA compliant.
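That "persistent identifier" mechanism, the elephants-to-Amazon example above, can be sketched as a toy program. This is entirely illustrative; it is not how YouTube's or any real ad system is actually implemented, and every name in it is made up.

```python
# Toy sketch of a persistent identifier: one tracking ID lives in the
# browser, so interests recorded on one site can drive suggestions on
# another. Purely illustrative, not any real ad system's design.

import uuid

profiles = {}  # persistent_id -> set of inferred interests

def get_persistent_id(browser_cookies):
    """Reuse the tracking ID if the browser already carries one."""
    if "tracker_id" not in browser_cookies:
        browser_cookies["tracker_id"] = str(uuid.uuid4())
    return browser_cookies["tracker_id"]

def record_view(browser_cookies, topic):
    """A video site logs what this browser watched."""
    pid = get_persistent_id(browser_cookies)
    profiles.setdefault(pid, set()).add(topic)

def targeted_suggestions(browser_cookies):
    """A different site reads the same ID and targets accordingly."""
    pid = get_persistent_id(browser_cookies)
    return sorted(profiles.get(pid, set()))

cookies = {}                       # the same browser visits both "sites"
record_view(cookies, "elephants")  # watching elephant videos on one site
print(targeted_suggestions(cookies))  # prints ['elephants']
```

The COPPA problem in a nutshell: nothing in that flow ever asks whether the person behind `cookies` is a child, or whether a parent consented to the profile being built.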
857 00:51:44,800 --> 00:51:48,040 Speaker 1: They would have to make sure that they were running 858 00:51:48,040 --> 00:51:51,839 Speaker 1: a channel that was not gathering information about the 859 00:51:51,960 --> 00:51:54,400 Speaker 1: children watching it, that they were not building in 860 00:51:54,480 --> 00:51:59,200 Speaker 1: targeted advertising, that they had to self identify as being 861 00:51:59,320 --> 00:52:03,359 Speaker 1: a creator that was creating child directed content. You had 862 00:52:03,440 --> 00:52:06,040 Speaker 1: to actually go into your little profile and click and 863 00:52:06,120 --> 00:52:08,799 Speaker 1: say whether or not your channel was meant for kids 864 00:52:08,880 --> 00:52:12,080 Speaker 1: or not. But this raises questions like what exactly is 865 00:52:12,280 --> 00:52:15,280 Speaker 1: child directed? And it has a lot of creators nervous 866 00:52:15,440 --> 00:52:18,320 Speaker 1: right now, because there are creators who do, for example, 867 00:52:18,440 --> 00:52:22,200 Speaker 1: unboxing videos, and some of them are clearly meant for kids. 868 00:52:22,480 --> 00:52:24,680 Speaker 1: Some of them are hosted by kids and clearly meant 869 00:52:24,840 --> 00:52:28,000 Speaker 1: for kids, but there are others where it may not 870 00:52:28,160 --> 00:52:30,440 Speaker 1: be for kids. It may be for people who really are 871 00:52:30,520 --> 00:52:35,480 Speaker 1: into collecting toys that are from their favorite, you know, franchises, 872 00:52:35,560 --> 00:52:40,600 Speaker 1: for example, so toy unboxing would likely be in the spotlight. 873 00:52:41,160 --> 00:52:44,840 Speaker 1: Creators who use video games are likewise concerned.
There are 874 00:52:44,880 --> 00:52:46,920 Speaker 1: people who are using video games to tell stories, people 875 00:52:47,000 --> 00:52:50,480 Speaker 1: who do let's plays or play throughs, but they're 876 00:52:50,520 --> 00:52:53,160 Speaker 1: not necessarily meant for kids. There's also people who are 877 00:52:53,200 --> 00:52:56,239 Speaker 1: working in animation, and that animation may not be meant 878 00:52:56,400 --> 00:52:59,840 Speaker 1: for kids, but the general perception is that cartoons are for 879 00:53:00,160 --> 00:53:04,400 Speaker 1: children, and they're concerned that they will be interpreted as 880 00:53:04,440 --> 00:53:07,080 Speaker 1: being child directed when they don't intend to be, and 881 00:53:07,200 --> 00:53:09,839 Speaker 1: that they will be affected by this. There's a lot 882 00:53:09,920 --> 00:53:11,760 Speaker 1: of fear that this is going to have an effect 883 00:53:11,880 --> 00:53:14,640 Speaker 1: on monetization, so that people might not be able to 884 00:53:14,680 --> 00:53:17,480 Speaker 1: get paid for what they're making, which means they'll probably 885 00:53:17,520 --> 00:53:21,040 Speaker 1: stop making it. I mean, you've got to make your living. 886 00:53:21,120 --> 00:53:23,720 Speaker 1: They may move on to a different platform than YouTube, 887 00:53:24,040 --> 00:53:27,800 Speaker 1: or they may just stop entirely. Every single violation of 888 00:53:27,960 --> 00:53:31,280 Speaker 1: COPPA can be fined up to a maximum of forty 889 00:53:31,360 --> 00:53:36,000 Speaker 1: two thousand dollars.
Now, keep in mind some of these channels 890 00:53:36,320 --> 00:53:40,440 Speaker 1: have hundreds or thousands of videos up online, so if 891 00:53:40,520 --> 00:53:44,120 Speaker 1: they were identified as being child directed and that their 892 00:53:44,200 --> 00:53:48,719 Speaker 1: material wasn't COPPA compliant, they could get that maximum fine 893 00:53:48,840 --> 00:53:52,640 Speaker 1: for every single video that was 894 00:53:52,719 --> 00:53:55,880 Speaker 1: on their channel. So the cost could be staggering. So 895 00:53:55,960 --> 00:53:59,920 Speaker 1: it's possible we'll see entire channels go dark with past 896 00:54:00,239 --> 00:54:03,440 Speaker 1: videos hidden away or deleted, all out of fear that 897 00:54:03,560 --> 00:54:07,360 Speaker 1: a mislabeling situation could result in massive fines, and 898 00:54:07,400 --> 00:54:09,719 Speaker 1: there's still a lot of uncertainty around this issue, and 899 00:54:09,800 --> 00:54:11,960 Speaker 1: we're not entirely sure how it's all going to play 900 00:54:12,000 --> 00:54:15,120 Speaker 1: out. Now, as for me, well, I'm in favor of 901 00:54:15,239 --> 00:54:18,560 Speaker 1: rules that protect kids from having their data harvested without consent. 902 00:54:18,760 --> 00:54:20,680 Speaker 1: I mean, I don't like that idea at all about 903 00:54:20,760 --> 00:54:24,600 Speaker 1: kids getting tracked and targeted in advertising, you know, 904 00:54:24,880 --> 00:54:28,080 Speaker 1: before they're able to even work with the idea of 905 00:54:28,160 --> 00:54:32,279 Speaker 1: what that means. They're particularly, you know, vulnerable to it. 906 00:54:32,960 --> 00:54:35,840 Speaker 1: It's one thing to be an adult and to understand, 907 00:54:35,880 --> 00:54:37,880 Speaker 1: at least on a basic level, what is going on 908 00:54:38,080 --> 00:54:42,480 Speaker 1: when we use the internet. It's another matter entirely for children.
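The exposure math behind that "staggering" worry is simple multiplication: a per-violation maximum fine times every video on a channel. In this sketch the per-video figure is the rough forty-two-thousand-dollar maximum mentioned in the episode, and the channel sizes are purely hypothetical.

```python
# Worst-case fine exposure as described in the episode: the per-violation
# maximum multiplied across every video on a channel. The fine amount is
# the rough figure cited in the episode; video counts are hypothetical.

MAX_FINE_PER_VIOLATION = 42_000  # approximate per-violation maximum, in dollars

def worst_case_exposure(video_count, fine=MAX_FINE_PER_VIOLATION):
    """Theoretical maximum if every video counted as a violation."""
    return video_count * fine

for videos in (100, 1_000, 5_000):
    print(f"{videos:>5} videos -> up to ${worst_case_exposure(videos):,}")
```

Even at the low end, a hundred mislabeled videos already puts the theoretical exposure in the millions, which is why some creators would rather hide or delete their back catalog than risk the label.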
However, 909 00:54:42,840 --> 00:54:47,160 Speaker 1: the application of those rules can be pretty chaotic and disruptive, 910 00:54:47,560 --> 00:54:50,600 Speaker 1: particularly to people who are well intentioned. They are not 911 00:54:50,840 --> 00:54:54,560 Speaker 1: trying to create child directed content, but they're worried about 912 00:54:54,719 --> 00:54:59,200 Speaker 1: their material being misrepresented or misunderstood as child directed, and 913 00:54:59,280 --> 00:55:03,200 Speaker 1: therefore everything is put in danger. That's not great either, 914 00:55:03,920 --> 00:55:06,160 Speaker 1: and channels that are clearly not meant for kids could 915 00:55:06,200 --> 00:55:08,920 Speaker 1: get caught in the crosshairs through no fault of 916 00:55:09,000 --> 00:55:12,120 Speaker 1: their own. So this is a situation worthy of attention, 917 00:55:12,239 --> 00:55:16,920 Speaker 1: because it stands to affect hundreds of creators on YouTube 918 00:55:17,840 --> 00:55:20,799 Speaker 1: who are not trying to make stuff for kids. Then 919 00:55:20,880 --> 00:55:25,960 Speaker 1: you've got people whose main audience is kids, 920 00:55:26,320 --> 00:55:28,920 Speaker 1: even though they're not making stuff for kids. It just so happens 921 00:55:28,960 --> 00:55:33,080 Speaker 1: that their audience is mostly kids. That's an issue all 922 00:55:33,160 --> 00:55:35,399 Speaker 1: on its own, and one that I don't have any 923 00:55:35,480 --> 00:55:38,680 Speaker 1: solutions for. If you're making stuff that you know you 924 00:55:38,800 --> 00:55:44,319 Speaker 1: didn't intend to appeal to children, but children think it's fantastic, 925 00:55:45,360 --> 00:55:47,759 Speaker 1: where does that put you? Because you weren't targeting them, 926 00:55:48,200 --> 00:55:51,800 Speaker 1: but that's your audience. That's tough. Now,
there are a 927 00:55:51,840 --> 00:55:54,880 Speaker 1: lot of other stories I didn't get to, like, for example, 928 00:55:54,960 --> 00:55:58,520 Speaker 1: the Tesla Cybertruck debut and how awkward it was when 929 00:55:59,040 --> 00:56:01,480 Speaker 1: they had the debut and they hit it with a 930 00:56:01,600 --> 00:56:04,920 Speaker 1: sledgehammer and then they threw some stuff at the windows 931 00:56:05,000 --> 00:56:08,480 Speaker 1: and the windows started cracking. That was a pretty 932 00:56:08,560 --> 00:56:12,200 Speaker 1: rough showing. And the Cybertruck itself is really funky. 933 00:56:12,719 --> 00:56:14,800 Speaker 1: It's a very odd design, kind of reminds me of 934 00:56:14,880 --> 00:56:18,200 Speaker 1: a Lamborghini Countach or an old DeLorean in a way. 935 00:56:19,000 --> 00:56:22,240 Speaker 1: Or I didn't talk about how the Samsung Galaxy Fold 936 00:56:22,480 --> 00:56:27,880 Speaker 1: mobile device, the foldable smartphone, how that launch didn't go 937 00:56:28,000 --> 00:56:31,080 Speaker 1: so well. You could say that the Fold cracked under pressure. 938 00:56:32,200 --> 00:56:34,600 Speaker 1: I didn't talk about the launch of Star Wars Galaxy's 939 00:56:34,680 --> 00:56:37,120 Speaker 1: Edge at Disneyland and Disney World. That was a big deal, 940 00:56:37,200 --> 00:56:39,920 Speaker 1: not just in tech obviously, but in theme parks. 941 00:56:40,000 --> 00:56:43,320 Speaker 1: There was the seemingly endless supply of movie and television 942 00:56:43,440 --> 00:56:47,120 Speaker 1: streaming services that either launched in twenty nineteen or were 943 00:56:47,120 --> 00:56:50,400 Speaker 1: announced in twenty nineteen, stuff like Disney Plus and the 944 00:56:50,520 --> 00:56:55,440 Speaker 1: upcoming HBO streaming service, Apple TV Plus launched, just tons of them. 945 00:56:55,480 --> 00:56:59,239 Speaker 1: Then there was Baby Yoda.
But I think it's a 946 00:56:59,280 --> 00:57:02,960 Speaker 1: good time to wrap up this episode. Let's set our 947 00:57:03,040 --> 00:57:07,680 Speaker 1: sights on twenty twenty. Yeah, you know what, let's all get 948 00:57:07,800 --> 00:57:11,200 Speaker 1: twenty twenty vision and end the year with a pun. I 949 00:57:11,239 --> 00:57:13,080 Speaker 1: guess technically I'm starting the year with a pun, because 950 00:57:13,120 --> 00:57:17,680 Speaker 1: I think this episode goes live on January one. Anyway, 951 00:57:17,760 --> 00:57:21,000 Speaker 1: that was twenty nineteen in a nutshell. I've got a 952 00:57:21,120 --> 00:57:25,200 Speaker 1: lot of plans for twenty twenty. I'm looking forward to sharing with 953 00:57:25,320 --> 00:57:28,840 Speaker 1: you more wonderful stories about technology, interesting stuff about how 954 00:57:28,960 --> 00:57:31,200 Speaker 1: tech works, how it affects us, how we affect it, 955 00:57:32,120 --> 00:57:35,000 Speaker 1: how things change over time, and how that change can 956 00:57:35,040 --> 00:57:39,320 Speaker 1: be messy. But sometimes once you get through the messy parts, 957 00:57:39,520 --> 00:57:42,360 Speaker 1: you can get something really incredible. So we're gonna look 958 00:57:42,360 --> 00:57:43,960 Speaker 1: at those stories, as well as the times 959 00:57:44,000 --> 00:57:47,000 Speaker 1: where things just didn't go right. We'll be covering more 960 00:57:47,040 --> 00:57:49,120 Speaker 1: of those as well. If you guys have suggestions for 961 00:57:49,240 --> 00:57:52,440 Speaker 1: future topics I should cover in technology, let me know. 962 00:57:52,920 --> 00:57:57,000 Speaker 1: The Facebook and Twitter handles are both TechStuff HSW. 963 00:57:57,480 --> 00:57:59,440 Speaker 1: It's best to reach out to me there, and 964 00:57:59,520 --> 00:58:06,840 Speaker 1: I'll talk to you again really soon. Tech Stuff is 965 00:58:06,840 --> 00:58:09,320 Speaker 1: a production of I Heart Radio's How Stuff Works.
For 966 00:58:09,440 --> 00:58:12,400 Speaker 1: more podcasts from I heart Radio, visit the I heart 967 00:58:12,480 --> 00:58:15,640 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to your 968 00:58:15,720 --> 00:58:16,400 Speaker 1: favorite shows.