Sleepwalkers is a production of iHeartRadio and Unusual Productions.

I'm Oz, and I'm Kara Price. Welcome to a special bonus episode of Sleepwalkers from the Consumer Electronics Show. So, Kara, I'd never been to Las Vegas before, which is the difference between us. I've been to Vegas too many times. Well, I could tell, and it did feel good to be in good hands with an old Vegas hand like you. One of the new things for me, though, was slots, which I don't normally play. I think, subconsciously, I was thinking about what Tristan Harris talked about in the first season of Sleepwalkers. You know, he was that former Googler who told us that Instagram is actually supposed to feel a lot like slot machines. Well, that's right. Tristan studied at the Stanford Persuasion Lab and told us about how casino architecture has influenced the development of highly addictive tech products like Instagram. So it was interesting for me to actually see Vegas, the bright lights and the impossibility of escape, firsthand, not to mention the replicas of the Empire State Building, the canals of Venice, the Colosseum of Rome, and, you know, I was lucky enough to see the Seattle Space Needle for the first time. I just didn't know that it was in Las Vegas.

But that's not why we were there. We were there for CES, the Consumer Electronics Show, and this episode we're actually going to talk about some of the coolest things we saw there, but we're going to focus more on the innovations that are at the intersection of technology and humanity rather than talk about, you know, infamous toilet paper dispensers. One of the big reasons we went is that we were invited by Wavemaker, which is an agency that's part of WPP, to do an interview on stage, a live podcast so to speak, with Matt Monahan, who is head of product at ARC Publishing, and ARC Publishing is part of the Washington Post.
Yeah, and ARC is also an interesting case of AI in action, because they're forward thinking in terms of increasing the visibility of content through personalization and optimizing everything from headlines to photo selection, all using machine learning, and those are things that really matter for journalists and readers. Yeah, and this use of AI stands out to me because it provides a solution to a real problem: how do you get eyeballs on the right content when there's just so much? That said, the issue of personalization does also raise questions about what happens when machines start to know us better than we know ourselves, not to mention, what are the appropriate limits on how companies use AI and data about us?

Yeah, AI can definitely streamline processes by detecting patterns that, you know, human beings cannot see, or it can allow you to do things at a scale, like tagging hundreds of thousands of articles, that, again, human beings just cannot manage. So greater efficiency is on one side of the spectrum, and extremely attractive to people, but on the other side you have issues of taking humans out of the loop, like the black box problem and authenticity in a world of deepfakes. So a question for businesses and users of technology is, sort of, when does AI add to our experience, and when does it maybe hold us back or take advantage of us, for example by keeping us from seeing news stories that we should see, because the algorithm doesn't think we want to see them or that we won't click on them? Right, in the old days, when everyone received a print newspaper on their doorstep, everyone had the same front page and the same headlines. Nowadays, when you log onto a news website or onto social media, everybody has a different version of the world, and that is obviously positive for driving engagement, but may not be so positive in terms of having conversations with the same facts about the same stories.
Equally, we have to ask, do we want articles where the headline has been written by an algorithm, or do we prefer headlines written by a person? And that's something we talked about with Matt, because ARC actually tested headline writing technology. Let's talk to Matt.

So, seriously, let's cut to the chase. ARC really came out of a collaboration, trying to better understand what actual journalists needed. Can you talk a little bit more about that?

Yeah. At the very beginning, you know, we were just trying to solve problems for ourselves, seven or eight years ago. You know, we knew we had to make some pretty fundamental transformations at the Post to really prepare ourselves for the digital future. We didn't have the right tools to do it, and we couldn't really find the right tools on the market either. What we did was spend a lot of time with the journalists and the editors, trying to figure out what it was that would make their lives easier: trying to figure out how do you make journalists work better, how can they publish faster, what are the little things you can do inside of a product to make it easier for them to write stories or publish. From there, about four years ago is when we started evolving it into a commercial offering. Today we're running hundreds of websites around the world. We're in about twenty different countries. We're running companies like BP, their internal communications as well as some of their marketing. We're running large broadcasters and all their live video and VOD. And of course we're still running a lot of newspapers and news publishers like the Post and many others around the world.

Now, in publishing, you know that AI and artificial intelligence make headlines, and there was a story in the Financial Times last year that said forty percent of AI startups use no AI whatsoever, and I bet it's probably higher. But when we talk about using AI, or when you talk about using AI, what do we actually mean?
So it can span a range of technologies, from something like machine learning, which is basically a way to use algorithms to take large sets of data and either uncover patterns in it or try to model a way to predict a certain outcome, to technologies like computer vision, which you can use to look at images or video and extract information about them by recognizing patterns and trying to identify objects inside of them. And so when you put a lot of those technologies together, you can form some really interesting workflows: things that, you know, in the past you might have had to use humans to do, you can actually do much more simply, automatically.

Was there a particular business challenge, or challenge at the Washington Post, that you couldn't have solved if you hadn't been using AI?

Any story that we write at the Washington Post, we're mapping to a set of two or three hundred topics. An example of one of those might be, like, congressional policy or narcotics crime. What you're trying to do is say, if I look at all this content, I'm not just pulling specific words out of it; I'm actually trying to figure out what is this content about, what is the fundamental concept of this content. So you pick a set of articles, let's say a hundred thousand news articles in the case of this example for the Post, and at first you use humans, it's called micro labor, to build this training set. And the goal is, you're building an algorithm based on a set of real data. And so the humans are going in there and saying, this article, yeah, this is about congressional policy. Why? Because I know it is, I read it, that's what it's about. This one's about narcotics crime, and this one's about soccer.
And so you train all these articles against that algorithm until finally the algorithm is basically sufficiently advanced that you can put a new article into it and it will determine an outcome with the same high probability of success that you were able to get with humans training it. Now, every time a journalist saves or publishes a story, we're able to parse all the content inside that story, and then we can predict the strength at which it's likely to belong to each topic.

How do you create a better user experience, in your case a news experience, for an individual consumer?

With that metadata you can do a lot of interesting things. We can figure out that, hey, this is something that they're interested in reading, perhaps they'd like to read more, and it actually serves as a signal into our recommendation algorithms.

From your perspective, where can businesses sort of harness the power of machine learning to really hone in on who their customer is and what that customer wants?

We want to deliver more content to our readers; we want to help them find more content that we've created. We have about nine hundred journalists at the Washington Post. We write something like, you know, three or four hundred original stories a day, so there's a lot of content there. Getting readers to all that different content, and having them continue moving through content you spend a lot of money to produce, is really challenging, and so that's a great use case for personalization. But where you can make it really come alive is by having more sophisticated metadata, more sophisticated information about that content. It's more likely to bring readers to it, and so that's where these machine learning models really come in handy.
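To make the workflow Matt describes a little more concrete, here is a minimal sketch of training a multi-label topic classifier on human-labeled articles and then scoring a new story against every topic. It assumes scikit-learn, and the article snippets, topic names, and tiny training set are invented for illustration; this is a sketch of the general technique, not ARC's actual pipeline.

```python
# Minimal sketch (not ARC's system): train a multi-label topic classifier
# on human-labeled articles, then score a new story against every topic.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

# Training set built by human labelers ("micro labor"): each article is
# tagged with one or more of the site's few hundred topics.
articles = [
    "The House passed a spending bill after weeks of negotiation...",
    "Police seized a large shipment of narcotics at the border...",
    "The striker scored twice as the team won the cup final...",
]
labels = [["congressional policy"], ["narcotics crime"], ["soccer"]]

binarizer = MultiLabelBinarizer()
y = binarizer.fit_transform(labels)

model = make_pipeline(
    TfidfVectorizer(stop_words="english"),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(articles, y)

# When a story is saved or published, score it against every topic and
# keep the strongest predictions as metadata on the story.
new_story = "Senators introduced a bipartisan bill on farm subsidies..."
scores = model.predict_proba([new_story])[0]
top_topics = sorted(zip(binarizer.classes_, scores), key=lambda t: -t[1])
print(top_topics[:3])
```

In a real system, those top topic scores would be written back as metadata each time a story is saved or published, and then fed, as Matt describes, into recommendation and personalization systems.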
I think part of what's fun about this conversation is, there's a lot of cases out there where average users, you know, imagine they see something like that. You see the boots on Instagram and you think, oh my god, these companies must be doing something crazy, you know, indistinguishable from magic, right? There must be some crazy model out there doing this, and perhaps there is. But in a lot of ways, you know, your users aren't necessarily as aware of the advertising ecosystem, the data ecosystem, and how these things tie together between platforms and sites, and I think, as industry professionals, we always kind of underestimate that fact. And so the net effect is that users are completely surprised by this. They think you must be doing something completely unheard of to achieve it, when in fact, you know, it could be really simple data sharing. And so the reason I think that's important is, when you do build technologies that actually utilize some of these more sophisticated methods to build data sets, you have to be aware that your users, first of all, aren't necessarily going to anticipate the outcomes that you can create, and if you don't do a good job on the product side of making sure that you really think through the use case and how you're leveraging technology to solve it, you can generate unexpected outcomes. You know, there was the example of a retailer who produced advertising flyers that were able to predict which folks were pregnant, right, even if some of those folks didn't necessarily know that themselves yet, or hadn't shared it with their family or their spouses.

And so that was a case really of both the company and the consumer being shocked by the outcomes that were being generated.

Exactly right. I mean, you know, the algorithm doesn't do anything magic, but that's a case of, you know, putting together, in that case, a marketing program where you don't really think through what's the possible data that this could produce, and what are my users, what do they already know about this data?
You know, you need to think really hard about your users and what they want and what they're trying to achieve, and what the dangers are in leveraging this technology. It's no different in that way than any previous technology solution you might have used to build a great product for people, and it can be misused just as easily.

Funny enough, the first episode of Sleepwalkers season one opened with the story of Washington Post employee Gillian Brockell, who, to your point about pregnancy and data, suffered a miscarriage but continued to receive targeted ads for pregnancy goods afterwards, and she wrote this open letter to the technology companies saying, please stop targeting me. But that raised a big question for us, which is, what happens when the algorithms go wrong?

Yeah, I'd almost be more specific with the way that you say that, in that the algorithm didn't go wrong, right, but the implementation of it and the product that they built around it did, because it wasn't really correctly conceived. And we have to make sure that what you're trying to do automatically fits really well with what your users are trying to accomplish and doesn't happen in a way that's not expected. That's a well designed product, you know. So in that specific case, yeah, I mean, it always starts and ends with kind of good product design. If you're not doing that, just like any other tool, you can misuse it.

One of the other things we did on the show was we used a language generator to come up with pickup lines based on a data set of existing ones. None of them were actually usable; it came up with things like, you look like a thing and I love you, which is now the name of the book by the woman who... Yeah, Janelle Shane. She wrote a book about it, and then she also did these things like AI recipes, like one was for chocolate chocolate chocolate chicken cake. So there are funny things in there, and Shakespeare it isn't, but it revealed two things.
One is, when you turn these deep learning algorithms onto big data sets, they reveal passions you might not necessarily be aware of, like a lot of chicken and quite a lot of chocolate. On the other hand, these were clearly not something a human would ever make. So how do you think about the line between doing fun things in AI and doing stuff which is valuable for business, and also not getting lost in the uncanny valley?

So, a good example of this, for instance: we spent some time at the Post trying to build a headline generation algorithm, where we could automatically create headlines for stories. And you know, the idea at the beginning wasn't necessarily that journalists would never write headlines again, but that we'd be able to create some alternative headlines and different ways to think about a story. Our intention was, let's see if we can come up with something so that we can create several different variants of a headline. As part of our software platform we include a content testing framework. So one of the things that we can do is say, for a given story, let's have three different headlines for it. Let's run a test as soon as it publishes, a third of the audience is going to get each variant, and then as people start to click one more than the other, we're going to shift the burden of traffic to the most successful variant. And that by itself works really well. If, you know, folks in the audience here were to look at the homepage of our site right now, there's probably two or three stories that are running those types of tests, where different people would see different headlines or different images, or in fact maybe actually just completely different stories, and those tests will resolve in like fifteen or twenty minutes. So that works well enough by itself.
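To give a feel for the mechanics behind the test Matt describes, here is a minimal sketch of serving several headline variants and shifting traffic toward whichever one readers actually click. It uses Thompson sampling over per-variant click counts; the headlines and click rates are invented, and this is an illustration of the general technique, not the Post's actual testing framework.

```python
# Minimal sketch of a headline test: show variants, record clicks, and
# shift traffic toward the best performer (Thompson sampling).
import random

class HeadlineTest:
    def __init__(self, variants):
        self.stats = {v: {"impressions": 0, "clicks": 0} for v in variants}

    def choose(self):
        # Sample a plausible click-through rate for each headline from a
        # Beta(clicks + 1, misses + 1) posterior and show the highest draw;
        # better variants win more of the traffic as evidence accumulates.
        def draw(s):
            return random.betavariate(s["clicks"] + 1,
                                      s["impressions"] - s["clicks"] + 1)
        return max(self.stats, key=lambda v: draw(self.stats[v]))

    def record(self, variant, clicked):
        self.stats[variant]["impressions"] += 1
        self.stats[variant]["clicks"] += int(clicked)

# Hypothetical usage: simulate readers with made-up true click rates.
test = HeadlineTest(["Headline A", "Headline B", "Headline C"])
true_ctr = {"Headline A": 0.03, "Headline B": 0.06, "Headline C": 0.02}
for _ in range(5000):
    shown = test.choose()
    test.record(shown, random.random() < true_ctr[shown])
print({v: s["impressions"] for v, s in test.stats.items()})
```

Run for a few thousand impressions, almost all of the traffic ends up on the variant readers actually click, which is roughly why a test like this can resolve within minutes on a busy homepage.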
But then we realized, well, we could probably create more of these tests if only we could automatically create headlines for them; we could just be running these tests all the time for every single story. But what we found was, you know, not exactly that. If the idea was to save journalists time in doing that, then in the end, I mean, you'd have to come up with something that's fairly solid and ready to publish. We were able to create something that allowed journalists to basically have different formulations that they could play with, and maybe gave them some ideas of what to create, but it still required people to look at it in the end.

How can businesses work better with their engineers, with their tech teams, to sort of create and not stay siloed, in a way where somebody who works in marketing feels like, well, you know, there's actually this need that I have, but I don't know who to talk to about it, and they don't really know what to do?

It's an awesome question. To me, one of the best things that you can do as a business is to put those people together, sometimes even physically. So when we started this project, you know, we literally co-located engineers and product people directly inside the newsroom, to sit with the folks who were doing this work. Now, when it comes to AI and ML, remember, these are just tools. These are tools to make work easier. They're tools, in a lot of cases, for automation and efficiency. There are some problems that can't be solved without them. In the end, though, you know, you're still trying to solve some business problem, and most of those involve some sort of users that you need to get to know. So, you know, even at the Post we had data scientists who were on those teams, embedded in the newsroom as well. You know, they weren't kind of seated somewhere else, thinking of problems on their own. There's a time and a place for creating room for prototyping, sometimes that has to happen too, especially with really advanced technologies. But beyond prototyping, putting those teams together is super crucial.
So how do you make sure, speaking metaphorically, that you write a good brief to your AI team, well, your engineering team?

I still think, you know, start with the problem that you're trying to solve. If you're going in thinking, let's use AI to solve something, I think you're probably starting the problem the wrong way. Start by framing up the problem in business terms, for people. You'd be surprised, I think, how much engineers and product folks actually prefer to get that first, before they start diving into what's the technology that I'm going to use to solve this problem. With the buzz around AI especially right now, you know, people tend to go into it, I think, thinking this is something that's pretty close to magic: we just use AI, and it's going to solve these problems that we haven't been able to solve some other way. And that's not really the right way to approach it. But AI will be transformative, I think, for organizations that apply it the right way, with a product mindset, with a good knowledge of the problem that they're trying to solve, and with empathy for their users.

When we were doing our research for this panel, there was an article in Bloomberg News saying that Jeff Bezos is personally very invested in the product, and then you called him Jeff in conversation, which I found very impressive, and I can already see people sending me emails about it. So, without obviously telling us the content of your meetings, how does his vision imbue what you do?

So certainly for us it's a boon to have him owning the company. I think that's one of the greatest things. You know, obviously at the Washington Post we're known for an amazing newsroom, but we've also spent a lot of time investing in our engineering team. That started, to some extent, before we were purchased by Jeff, but certainly continued after he purchased us.
It opened a lot of new doors for us, and it gets people excited to come and work for us on some of the problems that we're trying to solve. It really inspires people. To be able to build a platform like we built, you know, within a newspaper company, would have been hard to fathom probably ten years ago, but today we really can say that, you know, we're a content company and we're a technology company. And I think part of that starts with him and the leadership that he provides.

More Sleepwalkers after the break.

So, Kara, that was our conversation on stage with Matt Monahan at CES in early January. It was interesting because we hear so much about tech companies becoming publishers, whether it's Facebook, YouTube, or Twitter, but we hear less about publishers becoming tech companies. I guess that's where Jeff Bezos as an owner is what we might call a differentiator. I was personally struck by Matt's experiments with the headline generator. You know, for the time being, it doesn't work well enough to be a commercial product, but I think it will soon. You know, look at autocomplete when you send a Gmail, like "Sincerely, comma." I get those all the time, and it works. You know, in an apocalyptic reading, that means that machines will take over our lives and there will be no work left for humans; we won't have to come up with smart headlines. But I think, in a more optimistic reading, using algorithms to generate writing suggestions could actually enable originality. It reminds me of that Chinese science fiction writer who you and I have talked about, named Chen Qiufan, who actually used an algorithm to create ideas for his own work, and he used it when he had writer's block. He wasn't using it to replace his creative skill; he was using it as an enhancement tool. And I think that's really interesting.
Yeah. And in season one of Sleepwalkers, we spoke to a filmmaker called Oscar Sharp who actually shot a whole film written by AI, called Sunspring. Oscar and Chen turned the technology into a tool that actually serves their purposes. You know, you can develop all kinds of technology in a vacuum, but the technology that really serves people and fills a need is the technology that sticks around.

You know, speaking of technology that really sticks around, and some technology that might not stick around, there's so much stuff on the floor of CES. You and I had never been to CES before. I think we were very overwhelmed by what we saw, and excited. It was kind of an Inspector Gadget's paradise, and obviously, as someone who's obsessed with technology and consumer technology, I would have bought every single thing. I thought you tried to buy... I did try to buy that, I mean, that keyboard with the mouse built into the keyboard. How much did this cost? I almost bought a cool laser patch for my back, which, placebo or not, made me look very hot. But no, in all seriousness, you know, there are things that were on the floor that are kind of amazing when you think about it, like from this company called Pillo Health. They've developed this device called Pria that looks like a little face, a cute little face, as they always do, and it's basically a pill dispenser that is voice and face activated, so anybody could have one. I could have one, you could have one. But I think they've developed it mostly for elderly people who have many pills that they have to take throughout the day, and whose children or health aides want to be able to control when their medicine is dispensed. And I think for someone who might have memory impairment or physical impairment, the idea that someone who isn't in the room with that person could control when they're getting, you know, vital medicine is really amazing. And you know, you say what you will about privacy.
I think being able to do something like take care of your elderly parent with a device is, you know, the perfect intersection of technology and humanity, right. I spoke to the founder about exactly that. And you know, we have a lot of concerns about facial recognition that we've discussed at length on Sleepwalkers and will continue to discuss in season two. But in a narrow use case like this, in a voluntary use case where it can help somebody remember something very important, like what pills to take and when, it may well be that that's a sacrifice which is very much worth making.

There was another startup on the floor that really caught my eye, which was called Inupathy, and according to the card which I have in front of me, it's the first device in the world which is equipped with technology to visualize your dog's status from his or her heart rate information. This is basically a harness that you put on your dog, and it records your dog's heart rate, and in particular the variability in your dog's heart rate, to tell you if your dog is happy or sad, or anxious or excited or curious. And you know, people struggle to know what their dogs are thinking, and if you can use data from historical doggie feelings to model what a current dog is feeling, and use that to have better interactions with your dog, more power to you. I think it's cool.
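As an aside, the underlying signal is easy to picture. Here is a toy sketch of turning beat-to-beat intervals into a heart-rate-variability number and mapping it to a coarse state; the thresholds and labels are invented for illustration and say nothing about how Inupathy's actual model works.

```python
# Toy sketch: heart-rate variability (RMSSD) from beat-to-beat intervals,
# mapped to a coarse state. Cutoffs and labels are hypothetical.
from statistics import mean

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences, a common HRV measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return mean(d * d for d in diffs) ** 0.5

def coarse_state(rr_intervals_ms):
    hrv = rmssd(rr_intervals_ms)
    if hrv > 80:          # high variability: relaxed (made-up cutoff)
        return "relaxed"
    if hrv > 40:
        return "interested"
    return "stressed or excited"

print(coarse_state([520, 540, 610, 575, 630, 590]))  # sample gaps in ms
```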
There was this other piece of technology, from a company called We Labs. They were Japanese, right? And it kind of blew my mind in the same way that thinking about the automation of drive-throughs blew my mind. You know, it was this wood beam that looked like a beam in a house, and it had a computer inside of it. And the woman who was showcasing it basically asked Oz to stand up against it, like you would when you're charting a child's height, and she took a pen or stylus and marked Oz's height, and then immediately that marking was uploaded into the cloud and displayed on a device next to this wood beam. And it just made me think, this thing that millions of families do as their children are growing up is now being digitized. And again, going back to the intersection of technology and human behavior, imagine if someone moves from the house: those height markings that were such a part of your child's growing up can be taken with you in the cloud. It's just, I mean, that's the kind of stuff where I'm like, do I need it? Does someone need it? Who cares? But the idea that it's replicating this very, very personal feeling and, you know, activity that we do in our childhood, I don't know, it kind of blew my mind.

All three of the things we ended up talking about, you know, Pillo Health, the doggy heart rate monitor, and this Japanese wood device, they go back to the most human things: are our parents okay, is our dog okay, our children growing up and what it makes us feel as they grow up. And so technology that addresses those questions in a sensitive and humanistic way will always be interesting to us, because it really allows us to think about and tell stories about ourselves, the oldest stories we tell, the stories that are parts of novels and films and all other kinds of art. So that's, to me, where technology is most interesting, and the types of stories that we'll continue to tell on Sleepwalkers.

So, everything we just talked about is consumer focused and very interesting, but AI can also help address problems at scale, you know, issues ranging from climate change to pain management, and those are all things that we're going to talk about in our very exciting season two.
Thank you for listening, and we're looking forward to seeing you for season two of Sleepwalkers very soon.