Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech, and this is the tech news episode for Tuesday, January 12, 2021. Before I jump into it, I should also point out that when I record these episodes, I record them the day before they're published. That gives my super producer, Tari, enough time to edit and publish these episodes, which explains why, when I recorded last Thursday's episode, I didn't have any mention of the assault on the United States Capitol at all, because as I was recording that episode, that had not yet happened. In fact, it was right after I had finished recording, and I was starting to go back and listen to that recording, that the news was starting to break about that situation. And so today's episode has some of the fallout from that, from the tech perspective. So since our last news episode, there's been a ton of developments in the United States involving social media, the President of the United States, and a platform called Parler.
Speaker 1: Twitter, Twitch, Facebook, and Instagram have all banned Trump from posting to their platforms. YouTube has been removing videos. Reddit shut down a subreddit called r/DonaldTrump for violating policies that involve the restrictions against promoting hate speech or encouraging violence. And TikTok has been removing videos that relate to the riots on Capitol Hill, leading to some people joking that TikTok banned Donald Trump before Trump could ban TikTok. Each of these platforms has done this because of the content and tone of the President's messaging leading up to and including the day of the riots, saying that he is continuing to encourage seditious acts. There are critics of the platforms on both sides of the ideological spectrum. You've got people arguing that the platforms are trampling on the First Amendment, which seems pretty simple to dismiss, as these platforms are all extensions of the private sector, and the First Amendment protects citizens from government censorship.
Speaker 1: There have been multiple references to Orwell and Orwellian practices, which is particularly rich, and suggests to me that a lot of people invoking the name Orwell have never actually read Orwell. There have also been many criticisms about the power that these tech companies have, which, interestingly, is also a talking point on the opposite end of the ideological spectrum, though that point extends to corporations as a whole, and how over the last couple of decades the United States government has granted corporations more power year by year, enabling them to have enormous political influence in the process.
Speaker 1: And then on the more liberal side of things, you've got critics saying that these platforms are moving to ban Trump, but they've proven that they could have done that, if they had wanted to, at any point before now, which leads to questions about why it took so long after there were numerous examples of violations of policies over the last few years. In other words, the president had clearly violated policies of these different platforms, policies that, if someone who was not the president had done the same, would have seen them facing consequences. So essentially, these critics are saying that now that we're two weeks out from Biden's inauguration (sooner than that, really, one week out now), that's when the platforms have finally deemed it possible to remove Trump from them. So no one's really happy about how this was done, but some people view it as a necessity to reduce the chance for further chaos, and other people see it as an attack on freedoms and a sign of too much political power. And these two lines of thought don't necessarily fall equally into the camps of Democrat and Republican.
Speaker 1: It is a complicated and messy situation. Meanwhile, let's talk about Parler, which bills itself as a free speech social network. Now, this social network receives a lot of funding from conservative hedge fund manager Robert Mercer and his daughter Rebekah. Mercer was a big investor in Cambridge Analytica back in the day and is a notable financial supporter of various right-wing organizations. Parler itself is associated with conservatives in general, and while the company says it is neutral, the fact is that the vast majority of activity on Parler falls on the conservative, and frequently the far-right, side of the political spectrum. Parler is also where a lot of the people involved in the riots on Capitol Hill have accounts, and it's a place that has been used to organize activities like that. As such, people concerned with Parler's role in facilitating violence and assaults on the foundations of democracy began to pressure Google and Apple to remove Parler from their respective app stores.
Speaker 1: The two companies ultimately complied, removing access to those apps, and then Amazon announced it was booting Parler from their web services platform by 11:59 PM Pacific time on Sunday, January 10, which, as I record this, was yesterday. So it is no longer on those servers. A company representative wrote to Parler, stating, quote, "Recently, we've seen a steady increase in this violent content on your website, all of which violates our terms. It's clear that Parler does not have an effective process to comply with the AWS terms of service," end quote. That's Amazon Web Services; that's what AWS means. So this means that the actual servers that Parler used to host their network are no longer available to Parler, effectively deplatforming the platform. The admins of Parler initially said that the plan was to relaunch Parler on another web service, but that the platform might be down for about a week as a result. They also said that the plan was to rebuild Parler from scratch, but that seems unrealistic to me. Rebuilding would take a lot longer than just a week from a coding perspective.
Speaker 1: Moreover, subsequent reports indicate that the founders aren't having much luck when it comes to finding a new home for the platform, and even the company's own legal team quit. Again, this brought out First Amendment defenders, but opponents pointed out that speech that incites violence or encourages sedition or treason is not protected by the First Amendment. As the saying goes, the right to swing your fist ends where the other man's nose begins. These recent developments have elevated conversations that have been going on for years about the role of social networks and their responsibility to moderate. In the United States, Section 230 gives platforms an effective immunity against legal action for the content that users post to those platforms. Some have said that it prevents those platforms from moderating discussions, which is actually the opposite of the intent of that piece of legislation. It was meant to encourage platforms to moderate the content on them, also without fear of legal action for doing so.
Speaker 1: So, if a platform were to, say, remove all posts that referenced violence, the platform would not be held legally liable if someone felt that the platform was overstepping its bounds. However, rather than seeing these platforms take a more active role in moderating content, the opposite has largely been the case, and in fact, platforms have found it profitable to let things slide, because inflammatory content drives a lot of engagement, and that ultimately boils down to people spending more time on these platforms, and thus earning those platforms more money as a result, thanks to advertising. And so there has been an incentive for platforms to kind of let things be until the situation boils over, because it was more profitable that way. Once things escalate beyond a certain point, the risks of hosting the material end up being greater than the reward of hosting it, and that's when we see platforms leap to action. Now, that's not to say that all social network leaders feel no responsibility for what happens on the platforms.
Speaker 1: It's more of a commentary on how the whole business is amoral in nature, not immoral necessarily, but amoral. And moving forward, with Democrats in control of both houses of Congress plus the White House after January 20, there's a growing expectation that the US government will be taking a harder look at big tech with regulations and reform in mind, which could include a renewed push to break up some of those big companies that the government deems to be either a monopoly or, at the very least, to practice anti-competitive strategies. That's one extreme we could see. It's also possible that we'll see the government create incentives or find ways to pressure social networking companies to take a more active and transparent approach to moderating content on the networks themselves. This is something that the platforms have said might be a necessity, including Mark Zuckerberg. There's also the possibility that Section 230 will come under fire again, which is interesting because it's a piece of legislation that gets targeted both by conservatives and by certain Democrats, but for very different reasons.
Speaker 1: Representatives from organizations like TechFreedom argue that removing or changing Section 230 will not solve the problem, and that any approach will require a great deal of thought and care behind it, or else it will be doomed to fail. Removing Section 230 would then be nothing more than a symbolic gesture, and further, it could hurt smaller platforms and thus make the anti-competitive problem in tech even worse. So the message here seems to be: think carefully about the solution, and don't make it something that makes the problem even worse than it already is, or that just doesn't address the problem in the first place. While there are ongoing investigations into the security failures that allowed dozens of rioters to roam the halls of the US Capitol, there's a parallel effort to address the information security problems that arose as a result of unauthorized people having access to numerous computers and systems in the Capitol during the incident.
Speaker 1: The physical access to facilities and equipment means there's a possibility one or more people not only accessed information they should not have, but potentially they could have bugged offices, copied data, or stolen computers. In fact, we know that some computers have been stolen, or at least have been reported as stolen, and maybe other equipment as well. They could have injected malware into computer systems, or more. With this sort of intrusion, you have to move forward assuming the worst, that is, that everything has been compromised. Otherwise, you leave yourself vulnerable to the worst. It's pretty much the worst-case scenario for computer security. That being said, effectively each member of Congress has their own information system, which means there is some decentralization at play here. They're not all just automatically linked together. They're all kind of their own little silos, though they are connected, too. So, in other words, if someone does gain access to one specific account, they do not necessarily have access to all other accounts across Congress.
Speaker 1: That's a small comfort, but since we don't yet know the extent to which systems were compromised, it's also a bit of a moot point. And depending on the level of access people were able to get, the damage they could do could still be considerable. And there's the possibility that foreign agents could have played a part in the riots on January 6 as well. As has been widely reported, though far too late, the planning that led to these riots was largely done on platforms like Parler and Reddit, and while there appears to have been no centralized leadership, the various smaller groups that made up the whole each shared plans extensively within their respective groups. So any foreign agency that was paying more attention than the media was could have followed along and played a part as well, and taken the opportunity of access to compromise cybersecurity in the nation's capital. It will likely take many days to assess the full level of damage, and even then we likely won't know the whole story, and maybe we never will.
Speaker 1: Well, that's enough doom and gloom from the rioting at the nation's capital. I think when we come back, we're going to take a look at a totally different topic: what CES 2021 is shaping up to be this week. But first, let's take a quick break.

Speaker 1: So before the break, I mentioned it is CES week. Now, I talked about this last week. It is a very different show than what we would normally see. CES is the Consumer Electronics Show. It's an enormous industry trade show that has had at least one big event in a physical location every year since 1967, until now. In fact, for years now it has been held exclusively in Las Vegas in January. Well, COVID has forced CES to go virtual, like lots of other events in the world, and the shift in venue is just one of the changes that we're seeing this week. Last year, CES had more than 4,000 exhibitors officially at the event, and that doesn't count the number of companies that set up shop in hotel suites but are not an official part of the show.
Speaker 1: I didn't realize that was a thing until I started going to CES, and now we get invites to go to different hotel suites to check out technology. And I realized, oh, the reason you do this is because renting out a hotel suite is cheaper than establishing show floor space at CES. Plus, you're not competing against everybody else in the immediate area for attention. However, in 2021, the list of exhibitors is less than half of what it was last year. It is 1,967. Some big companies, including companies like Ford and Toyota, aren't participating this year. And while CES isn't exactly a car trade show, we usually see a pretty big representation of car company technology on display during the event. And it shouldn't come as a big surprise that a lot of the technology we expect to see this week, and beyond into the rest of 2021, will play into how COVID has affected our lives. Companies will be introducing technologies designed to promote safety and wellness in office spaces.
Speaker 1: So we're anticipating a time when people will be able to work together in the same physical location in relative safety across more of the world. And there's likely going to be a lot more technology that supports those who will be working from home as kind of an ongoing arrangement. We've heard from a lot of companies over the course of 2020 that, moving forward, they are going to allow more of their employees to work from home, you know, just as a matter of course, even after we reach something like herd immunity with COVID. One of the things that happens at CES is that the Consumer Technology Association brings together a panel of judges to pore over thousands of products and select some that are standouts in their respective categories. The judges come from various sectors and include executives from companies like Intel, Spotify, Comcast, Amazon, Apple, Google, and more. Oh, and that "more" includes some really interesting outliers. Like, this year, one of the judges is a representative from Cirque du Soleil Entertainment Group, which famously had a financial collapse last year.
Speaker 1: Somehow, my invitation to be part of this must have been lost in the mail. But I'm just joking, because I'm neither qualified to really judge this sort of stuff, nor do I relish the thought of having to examine countless examples of technology and then start assigning scores to them. Anyway, we already know the winners for this year's CES Best of Innovation. Now, the honorees in total include a whole lot of products from a whole lot of companies. Obviously, I am not going to list them all. That would be ludicrous. So I'm just going to give you a couple that stood out to me. A Canadian company called e2ip ("E-twip"? I don't know), anyway, they submitted a product called Engineered Electromagnetic Surfaces that I think is really cool. Now, these are sheets of plastic that are very, very thin, and they're semi-transparent, and the sheets can reflect, redirect, or block specific radio frequency waves. So you can engineer this stuff to allow certain radio frequency waves to pass through and to block other ones, and so on.
Speaker 1: The applications for such material are numerous. So you could use it to help focus signals, like 5G radio frequency signals, to a particular location. So let's say that you wanted to have a 5G wireless connection to your home. You want to set up an antenna to pick up 5G transmission signals and use that for your home network. You've got a router, and the 5G signals come in to the router, and then you can connect your local network that way. Well, we know that the ultra-fast version of 5G has a very limited range, so you need to be close to these antennas, and it also has a limited ability to penetrate walls. So even if you are close, you wouldn't be able to pick this up with a router antenna inside your home, because the signals themselves wouldn't get into your home in the first place. So you might need something akin to a satellite dish that uses this engineered material to help direct signals to the antenna.
Speaker 1: So, in other words, it's capturing and redirecting more of those 5G signals to get a nice strong connection, and that way you could end up having 5G connectivity to the outside world, and then your local network connects by way of that 5G. That'd be super cool. Or imagine using this material to help shield a location from outside signals, to make sure that external signals can't get into a facility if you need to create a cyber-secure location. Let's say you've got an air gap, so you don't want to connect to the internet at large, but you still want to have an internal network, complete with wireless components. So inside the building you're able to connect to WiFi, but it's only for the internal network. You would want to have something like this to block external signals from potentially interfering with or snooping on you. There are a lot of other potential uses for this material as well. I just thought it was super cool. Another product that won an Innovation Award is the BioButton. It is a wearable device about the size of a half-dollar coin (a US half dollar).
It monitors various bio 319 00:20:43,200 --> 00:20:47,280 Speaker 1: markers like respiratory rate, heart rate, and body temperature, with 320 00:20:47,359 --> 00:20:51,639 Speaker 1: a specific purpose for screening for COVID nineteen. I don't 321 00:20:51,680 --> 00:20:55,400 Speaker 1: know how you wear it. I looked at pictures of this, 322 00:20:55,720 --> 00:20:58,720 Speaker 1: but I didn't see any description of how it's actually worn. 323 00:20:58,720 --> 00:21:01,280 Speaker 1: I imagine it has to be worn against the skin 324 00:21:01,920 --> 00:21:05,679 Speaker 1: if it's monitoring your temperature. Otherwise, how would it know? 325 00:21:06,480 --> 00:21:09,240 Speaker 1: But you pair it with a smartphone app and you 326 00:21:09,280 --> 00:21:13,400 Speaker 1: receive notifications based on those bio markers, such as a 327 00:21:13,480 --> 00:21:16,919 Speaker 1: clear or not clear notification before you start your day. 328 00:21:16,960 --> 00:21:18,760 Speaker 1: Maybe this is just something that you use to do 329 00:21:18,800 --> 00:21:22,879 Speaker 1: a quick scan as opposed to consistent monitoring, but that 330 00:21:23,000 --> 00:21:25,160 Speaker 1: wasn't the implication I got when I was reading over 331 00:21:25,200 --> 00:21:29,200 Speaker 1: the material. Anyway, this sort of approach could be critical 332 00:21:29,240 --> 00:21:33,120 Speaker 1: to preventing further spread of the illness, with some big 333 00:21:33,200 --> 00:21:38,760 Speaker 1: qualifiers there, because this approach is dependent upon detecting symptomatic 334 00:21:38,880 --> 00:21:42,240 Speaker 1: cases of COVID nineteen. But we also know it's possible 335 00:21:42,240 --> 00:21:46,080 Speaker 1: to be asymptomatic as well as contagious, and this device 336 00:21:46,080 --> 00:21:48,720 Speaker 1: wouldn't be much help for those cases. I mean, it 337 00:21:48,880 --> 00:21:53,080 Speaker 1: is looking for symptoms of COVID.
If you're asymptomatic by definition, 338 00:21:53,080 --> 00:21:55,720 Speaker 1: you don't have those, so it's not like it's a 339 00:21:55,960 --> 00:22:02,680 Speaker 1: failsafe device that is completely dependable. It's one more measure 340 00:22:02,720 --> 00:22:06,359 Speaker 1: to use in the efforts to prevent the transmission of 341 00:22:06,400 --> 00:22:10,560 Speaker 1: COVID nineteen, but it shouldn't be seen as the big solution. 342 00:22:11,359 --> 00:22:14,480 Speaker 1: And the last one I want to highlight for this 343 00:22:14,600 --> 00:22:18,240 Speaker 1: episode comes out of IBM. They received a Best of 344 00:22:18,280 --> 00:22:23,240 Speaker 1: Innovation award for the Mayflower Autonomous Ship or MAS design. 345 00:22:24,119 --> 00:22:27,400 Speaker 1: This ship will be fully autonomous. It will be guided 346 00:22:27,480 --> 00:22:31,560 Speaker 1: by artificial intelligence from start to finish whenever it takes 347 00:22:31,600 --> 00:22:35,080 Speaker 1: its trip. It's supposed to go on its maiden voyage 348 00:22:35,200 --> 00:22:38,160 Speaker 1: in the spring of this year, where it will set 349 00:22:38,200 --> 00:22:43,240 Speaker 1: forth from Plymouth, England and travel to Plymouth, Massachusetts, under 350 00:22:43,240 --> 00:22:46,920 Speaker 1: its own power, under its own guidance, and thus echoing 351 00:22:46,960 --> 00:22:50,879 Speaker 1: the famous Pilgrims' journey. It is meant to self navigate 352 00:22:50,920 --> 00:22:54,480 Speaker 1: and self operate, and it could be a very valuable 353 00:22:54,480 --> 00:22:57,760 Speaker 1: technology in the future to allow for stuff like cargo 354 00:22:57,840 --> 00:23:02,160 Speaker 1: transportation without putting crews at risk of various hazards like 355 00:23:02,600 --> 00:23:06,800 Speaker 1: storms or even pirates. Really.
In addition, scientists will be 356 00:23:06,880 --> 00:23:09,439 Speaker 1: able to learn more about the oceans by looking at 357 00:23:09,560 --> 00:23:12,720 Speaker 1: data that the ship gathers as it navigates through environments 358 00:23:13,040 --> 00:23:16,160 Speaker 1: that would be too dangerous for humans. The MAS 359 00:23:16,240 --> 00:23:20,040 Speaker 1: itself is more of a prototype ship. It looks fairly 360 00:23:20,080 --> 00:23:22,840 Speaker 1: futuristic and nifty, with lots of solar panels on it 361 00:23:22,920 --> 00:23:26,560 Speaker 1: and stuff. It does not look like a cargo vessel 362 00:23:26,560 --> 00:23:29,200 Speaker 1: in other words, and it's not meant to be. Should 363 00:23:29,320 --> 00:23:33,160 Speaker 1: this technology prove viable, I expect we would see more 364 00:23:33,240 --> 00:23:36,800 Speaker 1: industrial versions that would look a lot less science fiction 365 00:23:36,880 --> 00:23:39,560 Speaker 1: and a lot more functional. But I will be following 366 00:23:39,600 --> 00:23:43,760 Speaker 1: this story with interest to see how it develops. I 367 00:23:43,800 --> 00:23:49,040 Speaker 1: think that autonomous shipping is definitely a big area of 368 00:23:49,480 --> 00:23:53,359 Speaker 1: development that we'll see over the next few years. 369 00:23:53,560 --> 00:23:56,200 Speaker 1: Kind of along the same lines, we've heard about 370 00:23:56,200 --> 00:23:59,040 Speaker 1: how autonomous driving is really going to make an enormous 371 00:23:59,040 --> 00:24:01,879 Speaker 1: impact on the trucking industry. There are a lot of 372 00:24:01,880 --> 00:24:04,480 Speaker 1: other products that received awards. These were just a few 373 00:24:04,560 --> 00:24:07,199 Speaker 1: I found interesting. Let's close out with a couple of 374 00:24:07,200 --> 00:24:10,520 Speaker 1: other brief CES notes, but we'll cover more 375 00:24:10,560 --> 00:24:13,879 Speaker 1: of CES in Thursday's episode.
So in other CES news, 376 00:24:14,000 --> 00:24:17,120 Speaker 1: LG showed off its TV lineup 377 00:24:17,240 --> 00:24:21,400 Speaker 1: at its press conference. The smart TV line uses the 378 00:24:21,520 --> 00:24:25,680 Speaker 1: webOS operating system, and the new smart TVs will 379 00:24:25,720 --> 00:24:29,679 Speaker 1: have Google Stadia and Nvidia GeForce Now built 380 00:24:29,760 --> 00:24:34,120 Speaker 1: into them. Both of these are cloud gaming services. They're 381 00:24:34,240 --> 00:24:37,320 Speaker 1: very different. Nvidia GeForce Now is all about 382 00:24:37,400 --> 00:24:41,720 Speaker 1: being able to access games that exist in your libraries 383 00:24:41,760 --> 00:24:45,840 Speaker 1: with other services like Steam, whereas Google Stadia is more 384 00:24:46,000 --> 00:24:49,439 Speaker 1: of a walled garden, where you own the access to 385 00:24:49,520 --> 00:24:53,080 Speaker 1: games within Stadia itself. To learn more about Google Stadia, 386 00:24:53,240 --> 00:24:56,200 Speaker 1: make sure you check out tomorrow's episode of Tech Stuff 387 00:24:56,400 --> 00:24:58,960 Speaker 1: because that's when I'll do a deep dive on that 388 00:24:59,040 --> 00:25:04,440 Speaker 1: cloud gaming service. Also, Samsung showed off some new robots 389 00:25:04,480 --> 00:25:08,680 Speaker 1: designed to help around the house. The Bot Handy is 390 00:25:08,720 --> 00:25:11,080 Speaker 1: a robot that looks like a pedestal on top of 391 00:25:11,119 --> 00:25:15,880 Speaker 1: a robotic vacuum and has an articulated arm attached to that pedestal, 392 00:25:16,359 --> 00:25:18,399 Speaker 1: and Samsung showed off a video of it doing stuff 393 00:25:18,440 --> 00:25:21,520 Speaker 1: like pouring a glass of wine from a bottle and 394 00:25:21,640 --> 00:25:25,399 Speaker 1: loading a dishwasher, which, come on, I mean a dishwasher 395 00:25:25,480 --> 00:25:29,680 Speaker 1: is already a labor saving device, but loading a dishwasher? Anyway.
396 00:25:29,920 --> 00:25:33,640 Speaker 1: Samsung showed all these off, along with three other 397 00:25:33,680 --> 00:25:36,720 Speaker 1: types of robots at its press conference, so once again we 398 00:25:36,760 --> 00:25:40,919 Speaker 1: see another push for consumer robots. I haven't seen a 399 00:25:40,960 --> 00:25:46,399 Speaker 1: lot of adoption of consumer robots outside of robotic vacuums 400 00:25:46,440 --> 00:25:50,640 Speaker 1: and then a few other specialized robots like gutter cleaners 401 00:25:50,680 --> 00:25:54,159 Speaker 1: and pool cleaning robots. Beyond that, I haven't seen a 402 00:25:54,200 --> 00:25:57,560 Speaker 1: whole lot of movement in the industry to actually go 403 00:25:57,800 --> 00:26:03,000 Speaker 1: that consumer route. But maybe, maybe we'll see a change. 404 00:26:03,119 --> 00:26:06,200 Speaker 1: I don't know. And to close out this section, Philips 405 00:26:06,240 --> 00:26:09,440 Speaker 1: showed off a smart toothbrush. Now, we've seen a lot 406 00:26:09,440 --> 00:26:13,760 Speaker 1: of different advanced toothbrushes over the last few years, including 407 00:26:14,119 --> 00:26:17,160 Speaker 1: some that track how you're brushing and give you tips 408 00:26:17,480 --> 00:26:19,560 Speaker 1: on what you should do in order to get 409 00:26:19,600 --> 00:26:22,040 Speaker 1: better coverage of your mouth. This one is supposed to 410 00:26:22,119 --> 00:26:26,240 Speaker 1: use artificial intelligence to monitor your oral hygiene and automatically 411 00:26:26,240 --> 00:26:29,800 Speaker 1: adjust its vibration frequency based on how you brush your teeth.
412 00:26:29,840 --> 00:26:32,639 Speaker 1: So you know, if you're brushing your teeth extra hard 413 00:26:32,880 --> 00:26:37,560 Speaker 1: or very gently, then the vibration should kind of, you know, 414 00:26:38,520 --> 00:26:41,720 Speaker 1: pick up where you left off or adjust based on 415 00:26:41,800 --> 00:26:44,359 Speaker 1: how you are brushing your teeth. So the idea is 416 00:26:44,359 --> 00:26:46,920 Speaker 1: that your teeth are always getting the cleaning action that 417 00:26:47,000 --> 00:26:50,000 Speaker 1: they need. I don't know how much it will cost. 418 00:26:50,040 --> 00:26:52,840 Speaker 1: It's part of the Sonicare line of toothbrushes. 419 00:26:53,440 --> 00:26:56,800 Speaker 1: Full disclosure, I have a Sonicare toothbrush. Bought it myself. 420 00:26:57,200 --> 00:26:59,879 Speaker 1: This is not an ad. I wasn't paid to do it. 421 00:27:00,160 --> 00:27:03,400 Speaker 1: They didn't send me one. So I've definitely used 422 00:27:03,400 --> 00:27:07,360 Speaker 1: their products before. I do not know. Like, the standard 423 00:27:07,400 --> 00:27:11,240 Speaker 1: Sonicare was already fairly expensive. I don't know how 424 00:27:11,280 --> 00:27:13,800 Speaker 1: expensive this is gonna be or whether that will be 425 00:27:13,840 --> 00:27:18,439 Speaker 1: a barrier to entry. We'll see. But some people just 426 00:27:18,520 --> 00:27:23,880 Speaker 1: love having smart everything. I don't think that's necessarily smart, ironically, 427 00:27:24,440 --> 00:27:27,000 Speaker 1: but you know, for some people, that's just the way 428 00:27:27,040 --> 00:27:30,399 Speaker 1: they go. We have some more stories to wrap up 429 00:27:30,400 --> 00:27:32,719 Speaker 1: this episode, but before we get to that, we're going 430 00:27:32,760 --> 00:27:42,800 Speaker 1: to take another break. Like I said in the last section, 431 00:27:42,840 --> 00:27:45,680 Speaker 1: I'll talk more about CES on Thursday's episode.
432 00:27:45,680 --> 00:27:48,280 Speaker 1: I'm sure we'll have more things that have been 433 00:27:48,320 --> 00:27:51,720 Speaker 1: released by then. As I mentioned before, I'm recording this 434 00:27:51,920 --> 00:27:55,960 Speaker 1: on Monday, January eleven, the day before it publishes, so 435 00:27:56,240 --> 00:27:58,400 Speaker 1: I only have a limited amount of information I can 436 00:27:58,440 --> 00:28:01,679 Speaker 1: cover at the moment. Last week, however, I talked about 437 00:28:01,680 --> 00:28:06,119 Speaker 1: how employees at Google are moving to unionize. Well, according 438 00:28:06,200 --> 00:28:10,399 Speaker 1: to the Telegraph, some staffers at Google have been told 439 00:28:10,400 --> 00:28:13,600 Speaker 1: by executives that they need to take training in how 440 00:28:13,680 --> 00:28:19,040 Speaker 1: to moderate internal email discussions, including flagging those discussions whenever 441 00:28:19,080 --> 00:28:23,800 Speaker 1: they might contain sensitive material. Now there's no smoking gun here, 442 00:28:24,160 --> 00:28:26,960 Speaker 1: but the fear is that Google is attempting to head 443 00:28:27,000 --> 00:28:31,400 Speaker 1: off the efforts to unionize, that they're keeping an eye 444 00:28:31,400 --> 00:28:38,800 Speaker 1: on things with a goal of counteracting the unionization of Google, 445 00:28:39,520 --> 00:28:43,680 Speaker 1: which, you know, is kind of a bad look. I 446 00:28:43,720 --> 00:28:47,040 Speaker 1: also mentioned last week that Facebook changed the terms of 447 00:28:47,080 --> 00:28:51,000 Speaker 1: service of WhatsApp, the messaging service that Facebook owns, and 448 00:28:51,080 --> 00:28:54,680 Speaker 1: that is going to facilitate the sharing of information from 449 00:28:54,880 --> 00:28:58,640 Speaker 1: the messaging platform to Facebook as a whole.
In other words, 450 00:28:58,960 --> 00:29:03,640 Speaker 1: Facebook would be able to mine WhatsApp for data that 451 00:29:03,680 --> 00:29:08,240 Speaker 1: could be used mostly in advertising. Well, now an antitrust 452 00:29:08,320 --> 00:29:12,479 Speaker 1: board in Turkey has launched an investigation into Facebook because 453 00:29:12,560 --> 00:29:15,800 Speaker 1: of those new terms, and just a reminder, anyone who 454 00:29:15,880 --> 00:29:18,920 Speaker 1: uses WhatsApp has no choice but to agree to those 455 00:29:19,040 --> 00:29:23,680 Speaker 1: terms or stop using the service. There's no opt in feature. 456 00:29:24,240 --> 00:29:27,520 Speaker 1: The antitrust board also said it was ordering that Facebook 457 00:29:27,560 --> 00:29:31,479 Speaker 1: stop implementing those terms. This is a complicated issue 458 00:29:31,680 --> 00:29:35,520 Speaker 1: because a lot of privacy advocates have expressed concerns about 459 00:29:35,560 --> 00:29:40,040 Speaker 1: Facebook's changes. However, Turkey also has a history of suppressing 460 00:29:40,160 --> 00:29:43,440 Speaker 1: social networks as a way of stifling voices that are 461 00:29:43,480 --> 00:29:48,160 Speaker 1: not in alignment with the Turkish government. So it's not 462 00:29:48,320 --> 00:29:52,440 Speaker 1: a simple black and white matter here. It's pretty complicated.
463 00:29:53,480 --> 00:29:56,960 Speaker 1: Tim Berners-Lee, who developed the first web page and 464 00:29:57,040 --> 00:29:59,680 Speaker 1: thus is seen as the father of the World Wide Web, 465 00:30:00,080 --> 00:30:03,920 Speaker 1: advocates for a fundamental change in how the online world 466 00:30:04,040 --> 00:30:08,920 Speaker 1: handles personal information. Rather than personal data accumulating in what 467 00:30:09,000 --> 00:30:12,920 Speaker 1: he calls silos, which are essentially, you know, databases owned 468 00:30:12,920 --> 00:30:18,480 Speaker 1: by some of the tech world's largest companies like Google, Amazon, Facebook, etcetera, 469 00:30:18,720 --> 00:30:22,960 Speaker 1: he advocates for pods, P O D S, which stands for 470 00:30:23,320 --> 00:30:28,800 Speaker 1: Personal Online Data Stores. These would be individualized stores of 471 00:30:28,880 --> 00:30:32,680 Speaker 1: data unique to each person and owned by that person. 472 00:30:32,760 --> 00:30:35,880 Speaker 1: So you would own the collection of data about you, 473 00:30:36,480 --> 00:30:39,920 Speaker 1: and that would include information about stuff like your online purchases, 474 00:30:40,400 --> 00:30:44,240 Speaker 1: which websites you visit, the social networking platforms you belong to. 475 00:30:44,440 --> 00:30:46,640 Speaker 1: All of that kind of stuff, the stuff that gets 476 00:30:46,680 --> 00:30:49,280 Speaker 1: bought and sold today, all of that would actually belong 477 00:30:49,360 --> 00:30:53,240 Speaker 1: to you. And if a company wanted to access that information, 478 00:30:53,680 --> 00:30:57,000 Speaker 1: the company would first need to get your consent to 479 00:30:57,440 --> 00:31:00,480 Speaker 1: get that particular bit of data, and they would also 480 00:31:00,560 --> 00:31:04,120 Speaker 1: be restricted to access only the data relevant to whatever 481 00:31:04,160 --> 00:31:07,200 Speaker 1: the request is.
Kind of similar to the way that 482 00:31:07,400 --> 00:31:10,360 Speaker 1: app permissions are supposed to work on smartphones. You know, 483 00:31:10,440 --> 00:31:14,120 Speaker 1: on a smartphone, when you activate an app, if it needs 484 00:31:14,120 --> 00:31:16,720 Speaker 1: to access something like your camera or microphone, it asks 485 00:31:16,760 --> 00:31:20,000 Speaker 1: you for permission for that, but it can't also then 486 00:31:20,040 --> 00:31:23,280 Speaker 1: get permission to access other stuff that it does not 487 00:31:23,520 --> 00:31:27,320 Speaker 1: expressly tell you about. It's restricted to just the things 488 00:31:27,320 --> 00:31:29,600 Speaker 1: it asks you for. That's sort of the same kind 489 00:31:29,600 --> 00:31:33,320 Speaker 1: of concept. Berners-Lee has co founded a company called 490 00:31:33,520 --> 00:31:37,560 Speaker 1: Inrupt, I N R U P T, meant to promote 491 00:31:37,560 --> 00:31:41,360 Speaker 1: adoption of this idea, but it faces an enormous challenge 492 00:31:41,440 --> 00:31:44,800 Speaker 1: in that huge companies have already developed these silos and 493 00:31:44,840 --> 00:31:47,680 Speaker 1: they depend heavily on them. I may need to do 494 00:31:47,760 --> 00:31:50,640 Speaker 1: a full episode on this idea and the technology behind 495 00:31:50,680 --> 00:31:53,440 Speaker 1: it. If you want to learn more, however, I recommend 496 00:31:53,440 --> 00:31:56,720 Speaker 1: the New York Times article titled He Created the Web, 497 00:31:57,120 --> 00:32:00,320 Speaker 1: Now He's Out to Remake the Digital World. It published 498 00:32:00,360 --> 00:32:06,360 Speaker 1: on January ten, twenty one. And remember Quibi, the short form 499 00:32:06,480 --> 00:32:10,920 Speaker 1: video platform optimized for mobile devices that launched in April 500 00:32:10,920 --> 00:32:16,400 Speaker 1: of twenty twenty and shut down just last December.
Well, Roku, the 501 00:32:16,440 --> 00:32:20,080 Speaker 1: company that's behind various streaming set top devices and smart 502 00:32:20,120 --> 00:32:26,200 Speaker 1: TV interfaces, has arranged to acquire all of Quibi's content library. 503 00:32:26,680 --> 00:32:31,200 Speaker 1: Roku will host the content, which includes more than seventy shows, 504 00:32:31,240 --> 00:32:36,320 Speaker 1: on its own ad supported channel, which is free. Personally, 505 00:32:36,560 --> 00:32:39,400 Speaker 1: I'm thankful that the content is going to live on 506 00:32:39,640 --> 00:32:42,040 Speaker 1: and have a chance to find an audience even after 507 00:32:42,160 --> 00:32:46,040 Speaker 1: Quibi's collapse. While I never thought Quibi really had a 508 00:32:46,120 --> 00:32:50,160 Speaker 1: strong business case even before the pandemic, I didn't want 509 00:32:50,200 --> 00:32:52,960 Speaker 1: to see content creators have the rug pulled out from 510 00:32:53,040 --> 00:32:55,640 Speaker 1: under them.
So it will be interesting to see if 511 00:32:55,760 --> 00:32:59,480 Speaker 1: Roku not just supports these shows but then extends them, 512 00:32:59,520 --> 00:33:01,560 Speaker 1: you know, whether or not some of these shows get 513 00:33:01,640 --> 00:33:04,720 Speaker 1: further seasons in the future, picking up where Quibi left off, 514 00:33:05,080 --> 00:33:08,120 Speaker 1: perhaps with some restructuring to bring the content closer in 515 00:33:08,160 --> 00:33:11,000 Speaker 1: alignment to the running time that people are more used to, 516 00:33:11,320 --> 00:33:14,240 Speaker 1: because the average Quibi video was between eight and ten 517 00:33:14,280 --> 00:33:17,920 Speaker 1: minutes long, and longer form stuff was divided up into 518 00:33:18,080 --> 00:33:22,000 Speaker 1: lots of eight to ten minute chapters, you could say. 519 00:33:22,280 --> 00:33:25,240 Speaker 1: So, glad to see that that content is back in 520 00:33:25,360 --> 00:33:29,000 Speaker 1: some form. A lot of very talented people were working 521 00:33:29,080 --> 00:33:33,320 Speaker 1: on Quibi content, and it wasn't their fault that 522 00:33:33,480 --> 00:33:36,160 Speaker 1: the platform failed. In fact, I would say that really 523 00:33:36,680 --> 00:33:40,880 Speaker 1: the selling point of the platform itself was a 524 00:33:40,920 --> 00:33:45,120 Speaker 1: little questionable. Again, you understand it. The idea was 525 00:33:45,160 --> 00:33:48,280 Speaker 1: that people are spending a lot more time looking at, 526 00:33:48,880 --> 00:33:51,920 Speaker 1: you know, entertainment on their phones, at least when the 527 00:33:51,920 --> 00:33:55,640 Speaker 1: world doesn't have them all locked at home, and thus 528 00:33:56,160 --> 00:34:00,560 Speaker 1: creating a short form, high production value service kind of 529 00:34:00,600 --> 00:34:03,720 Speaker 1: made sense.
But in a pandemic world where nobody leaves 530 00:34:03,760 --> 00:34:06,960 Speaker 1: their house and everyone's able to watch stuff on 531 00:34:07,560 --> 00:34:11,160 Speaker 1: entertainment systems, it's a lot harder to sell Quibi. So, 532 00:34:11,440 --> 00:34:13,920 Speaker 1: glad that the content's sticking around, and curious to see if 533 00:34:13,920 --> 00:34:20,560 Speaker 1: it will have a life beyond that initial outburst of creativity. 534 00:34:20,800 --> 00:34:23,960 Speaker 1: But yeah, that's where Quibi is now. It's found a 535 00:34:24,160 --> 00:34:29,160 Speaker 1: home on Roku. And that wraps up this news episode 536 00:34:29,360 --> 00:34:34,560 Speaker 1: of Tech Stuff. We will continue covering the stories on Thursday. 537 00:34:34,600 --> 00:34:38,080 Speaker 1: That will include an update on what's going on at CES, 538 00:34:38,520 --> 00:34:41,840 Speaker 1: as well as, I'm sure, updates on the unfolding political 539 00:34:41,880 --> 00:34:47,360 Speaker 1: situation with technology's role. There are already ongoing stories that 540 00:34:47,400 --> 00:34:50,600 Speaker 1: I didn't include here that were breaking as I was recording, 541 00:34:50,640 --> 00:34:56,280 Speaker 1: about various big tech companies deciding to halt political spending 542 00:34:56,360 --> 00:35:01,080 Speaker 1: at least in the short term, and specifically toward politicians 543 00:35:01,120 --> 00:35:07,120 Speaker 1: who were seen to undermine the democratic process. So that's 544 00:35:07,239 --> 00:35:09,759 Speaker 1: something that is unfolding as I record this, and of 545 00:35:09,760 --> 00:35:13,160 Speaker 1: course by the time you hear it on Tuesday, that 546 00:35:13,200 --> 00:35:16,160 Speaker 1: will have progressed further, so I'll give an update on 547 00:35:16,200 --> 00:35:20,520 Speaker 1: Thursday about those stories.
But we'll also cover other stuff, 548 00:35:20,600 --> 00:35:24,000 Speaker 1: so I look forward to chatting with you. Tomorrow's episode 549 00:35:24,000 --> 00:35:26,480 Speaker 1: of Tech Stuff is all about Google Stadia, so I 550 00:35:26,480 --> 00:35:28,680 Speaker 1: hope you enjoy that. If you want to reach out 551 00:35:28,719 --> 00:35:30,960 Speaker 1: to me, the best place to do it is on Twitter. 552 00:35:31,239 --> 00:35:33,920 Speaker 1: The handle for the show is TechStuff H S 553 00:35:34,120 --> 00:35:42,600 Speaker 1: W, and I'll talk to you again really soon. Tech 554 00:35:42,600 --> 00:35:46,080 Speaker 1: Stuff is an I Heart Radio production. For more podcasts 555 00:35:46,080 --> 00:35:48,840 Speaker 1: from I Heart Radio, visit the I Heart Radio app, 556 00:35:49,000 --> 00:35:52,160 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.