Thank you to Jessica Lessin and The Information for hosting this episode's conversation. And because we did this at the Yale Club, you may hear some audio that sounds a little different than the podcast we do inside the studio.

Hi Katie. Hello Brian. So, Marty Baron has become kind of this generation's Ben Bradlee. I was going to say he's kind of the dean of American journalism right now. Yeah, and not just because he was portrayed in the movie Spotlight by Liev Schreiber, yes, but also because he's probably the most skillful, experienced, courageous newspaper editor in America. And at The Washington Post, he's really made a name for himself by having the courage to take on the new administration and to take on Donald Trump. Brian, you and I were completely geeking out because we got to talk to Marty at a conference that was organized by The Information, which was founded by a friend of yours, Jessica Lessin. Jess and I went to college together, and she created— Harvard. Yeah, your safety school. I preempted your favorite joke.
But The Information is kind of a new kind of tech newsletter where people pay for content, there are no ads, and it's popular among a lot of Silicon Valley insiders. And so we talked with Marty and Jess about the future of the media industry, about D.C. and the Trump era, and of course about how technology is changing all of this. That's right. And just a few months after Marty became the top editor at The Washington Post, the paper was purchased by Jeff Bezos from its longtime owners, the Graham family. Now, you all, I'm sure, know that Jeff Bezos is Amazon.com's founder and CEO. And ever since the sale went through in 2013, I've really wondered what this transition was like and how The Washington Post is merging technology and some of the greatest journalism that's being produced today. So that was the first question I asked Marty: how has a tech owner and a tech mentality changed his job and the mission of The Washington Post?

Sure. Well, it changed it dramatically and immediately.
Our mission previously had been described as "for and about Washington" — certainly recognizing that Washington was the location of the nation's capital, but we had pretty much a regional focus. And when Jeff came in, he said that that wasn't the right strategy for us, and he immediately changed our strategy to be focused on becoming national and even international. And, you know, he said that the Internet had taken a lot of things away from us — obviously the protection that we had against competition of all sorts — but it had also given us some gifts. And the primary gift that it had given us was essentially free distribution, and it would be crazy for us not to take advantage of that, particularly given the Washington Post brand — it was known nationally, it was known internationally, and had the opportunity to become a national news organization. And that was the fundamental change that took place for us.

You know, I've always thought that the media outlet that figures out the secret sauce between technology and content was going to win the day.
And I have to say, I think The Washington Post is the closest I've seen to really making that combination. And I imagine, Marty, when you started out in this business, you didn't have to worry about A/B testing, algorithms, programmatic advertising, sponsored content, and all that jazz. How do you balance sort of technology and content, and are you worried that all these things are going to have a negative impact on quality journalism?

Right. Well, it's true that I didn't have to worry about those things, because those things didn't exist when I got into the business forty years ago. And the people who were in the newsroom really weren't involved at all on the business side — people talked about the wall between them. There are certain things we don't share with them, but there are other things where we actually have to collaborate. So, you know, fundamentally, I think technology has just changed the way that we tell stories. And I think of it as: the Internet and all that it has brought about.
All digital platforms now represent a different medium — or actually different mediums — for us, in the same way that when radio came into existence, there was a different way of communicating with the audience: you didn't just get up there and read a newspaper story. And when television came along, you didn't read a radio script and you didn't read a newspaper; there was a different way of communicating with the audience. So with the Internet, the newspaper industry responded by just putting newspaper stories up on the Internet, and we expected to succeed that way because we just viewed it as a distribution platform. And that didn't really work very well. And then we said, well, let's do that faster — and that didn't work so well either. And I think we're coming to the recognition that this is a different medium, or different mediums, and we have to tell stories in completely different ways.

But you also have to attract the attention. And I'm just curious — I mean, Marty Baron and clickbait are two things that I would not put in the same sentence. So how do you— How do you—

Thank you for that.
I appreciate that. Well, I don't think we do clickbait. We do A/B testing, but that's not clickbait. I mean — you know, I was asked a few weeks ago — I was actually in Spain for a speaking engagement, and the students there asked me, how do you get people to read beyond the headlines? I said, write a good headline, and then they'll read more. And that's not clickbait. That is: we can write about very substantive matters, very serious matters, but we don't have to write about them in a stuffy way. And we can certainly write about them using a more conversational, accessible style, which I think is suitable for the web. And that's what we are trying to do at The Washington Post — not just doing it in a different style, but using all the tools that are now available to us: displaying social media, using audio, using video, using original documents, annotating those original documents, incorporating interactivity with graphics, and all sorts of other things. Those are tools that we can use to better tell stories.
And there's no reason that we shouldn't do that.

And I guess clickbait— Sorry, Brian, just one more follow-up. I think clickbait, I guess, connotes enticing someone to click something and then not delivering. But I did have an interesting conversation, maybe a year or so ago, Marty, with your boss Fred Ryan, who told me that when there was a headline about succession in Saudi Arabia, nobody clicked, but then when you changed it to "Game of Thrones in Saudi Arabia," suddenly everybody clicked on that story.

Thrones actually applies in that country, by the way.

But would you say it's really pushed you to be more creative in terms of how you attract consumers to the stories you're telling?

I think it has. It's pushed us to be more creative. It's pushed us to be more colloquial. It's pushed us to talk in the way that other people talk, that ordinary people talk. And look, I mean, newspapers have a very structured style. All of us who got into the business sort of learned that style. It was appropriate to that particular medium.
And by the way, the headlines for newspapers were designed to fit into the space that was permitted. So you would have short words, like somebody "eyed" something and somebody "mulled" something. Nobody talks that way — nobody talks about eyeing something and mulling something and things like that. Those were words that were designed to fit. They didn't use articles like "the" and "a" and "an"; those were thrown out. Well, now we can write headlines in the way that people ordinarily speak. People can identify with that. The style of writing, I think, sounds more authentic. I believe it is more authentic. It's more reflective of the voice of the author. You have a better sense of the personality of the author. And I think that that's all a good thing.

I mean, this question is for both of you. One change that I think technology has wrought is that, particularly on social media, reporters feel a lot more comfortable using the first person singular and expressing opinions.
When and where does that cross the line into bias, as opposed to just having a distinctive voice?

Well, I think that's very hard to define, and it is risky territory. I think people want to use social media — Twitter, Facebook, Reddit, wherever it might be — to reflect who they are, to give readers a sense of who they are as persons, that it's not a nameless, faceless institution there. But they have to be careful that they not go out and express opinions that they would not be allowed to express in a conventional news story. And that is tricky. I don't know exactly where the line is, but I know that we're being monitored all the time by political factions out there. And we hear from people when they feel that we've crossed that line. We're constantly evaluating that, and from time to time we have to remind people that they need to restrain themselves.

But it seems to me that people do want a point of view increasingly, and that reporters are acting generally more like columnists these days.
And I think that must be tricky, but the market is demanding it in some ways, don't you think, Marty?

Yeah, to some degree. And as I said, I think people want more authenticity. They want to know who you are. They don't want just some anodyne presence on the web or on other digital platforms.

Well, maybe I don't want to take it out on the reporters — they do quite a good job. But I think there's a difference between, let's say, providing analysis — being honest with your readers about what your conclusions are, based on actual factual reporting — as opposed to just going out and expressing an opinion and taking sides. I mean, today we launched — or, this will be last week by the time you hear it — Briefing, which is The Information's first commentary product. And for years our subscribers wanted not just our unique articles, but the opinions of the reporter who's covered Google for ten years on what Google just announced.
And we waited a long— we waited three years to offer it, because I wanted to be able to offer a product where we had enough reporters that we could get some breadth and expertise. So it's live today — last week for you — at briefing.theinformation.com, our first stab at this. It's, you know, our reporters' take on the day's news. And I was leery, because I think there's so much just punditry and opinion, or a sense that to be a great journalist you have to have a brand. I'd like the brand for journalism to be trustworthiness, knowledge, sort of influence based on your experience as a reporter. So I'm interested to see where it goes for us.

Let's talk about fake news, can we? I mean, is there anything that can be done about all these fake news stories?
I know I interviewed the guy who started Craigslist, Craig Newmark — who is here, actually; hi, Craig, wherever you are — and he gave a million dollars to the Poynter Institute to try to figure out, like, should there be a Good Housekeeping seal of approval or something along those lines, so a consumer can understand that certain journalistic practices were followed in the creation of a story. Where do you see that going, Marty? And is there anything that can be done about it? Because some of them are so artfully done, it really is hard to tell the difference.

Well, you know, it's hard to answer. I think it's a difficult challenge — probably the greatest challenge that we face in the industry at the moment — that there are media outlets that are propagating falsehoods, that are propagating dangerous conspiracy theories, all of that. Look, I mean, from our standpoint, I think we just have to do our job and root it in the reporting. We also do fact checks constantly.
And we have an inordinate number of fact checks on the President at the moment — and that will probably continue for quite some time — but also on what other people are saying. Now, the problem is that people who are aligned with a certain political point of view are not moved by these fact checks. In fact, they view these fact checks as part of the conspiracy, as part of an effort to suppress what they're hearing from other outlets. It is a huge challenge. I do think that the social media companies have a responsibility here, which I think they are beginning to recognize and still beginning to grapple with. And that would include Google and Facebook and Twitter and some of the others.

Is there anything they ought to be doing that they're not doing today?

Well, I'm not entirely sure. I mean, I think they've already taken some steps to reduce the economic incentives for people to spread falsehoods. I think that's a good first step. Facebook has incorporated fact checks onto its site.
241 00:13:06,120 --> 00:13:08,360 Speaker 1: I think that's at a very nascent stage and a 242 00:13:08,480 --> 00:13:11,000 Speaker 1: very urgent, uncertain stage. You know. I think they're very 243 00:13:11,000 --> 00:13:13,600 Speaker 1: sensitive to the free speech issues, and I'm sensitive to 244 00:13:13,600 --> 00:13:17,520 Speaker 1: the free speech issues, and I'm sure you are as well. Um, 245 00:13:17,559 --> 00:13:20,360 Speaker 1: but it's also dangerous not to do anything about it. 246 00:13:20,480 --> 00:13:22,920 Speaker 1: And so I think we're at a very early stage 247 00:13:22,920 --> 00:13:24,600 Speaker 1: where people are trying to figure out how do we 248 00:13:24,679 --> 00:13:26,520 Speaker 1: how do we grapple with this. You know, there's another 249 00:13:26,520 --> 00:13:30,200 Speaker 1: definition of fake news, which is not stories that are inaccurate. 250 00:13:30,240 --> 00:13:33,400 Speaker 1: It's stories that the administration doesn't agree with, even if 251 00:13:33,440 --> 00:13:36,360 Speaker 1: they are. That's how they that's that's how they define 252 00:13:36,520 --> 00:13:39,480 Speaker 1: fake news. And so how do you combat that or 253 00:13:39,480 --> 00:13:42,719 Speaker 1: at least the perception among you of the country that 254 00:13:42,760 --> 00:13:45,800 Speaker 1: if you write something critical of the president, you're propagating 255 00:13:45,800 --> 00:13:50,000 Speaker 1: fake information. Yeah, it's a tough one. Um. I think 256 00:13:50,080 --> 00:13:51,920 Speaker 1: that we have to be more transparent, and I think 257 00:13:51,960 --> 00:13:54,760 Speaker 1: that's where the industry is going right now. Certainly what 258 00:13:54,880 --> 00:13:57,760 Speaker 1: Google is pushing with its trust prom with it's not 259 00:13:57,800 --> 00:14:00,480 Speaker 1: really Google's project, but there's a trust project Google is 260 00:14:00,520 --> 00:14:04,280 Speaker 1: helping to support. 
I think it's what David Fahrenthold, who just won the Pulitzer Prize at the Post, did with his investigation of Trump's charitable activities — or lack thereof, as you noted. He actually opened up his investigation to the public. He said: who else should I call? What else should I look at? Here are my notes. He actually took a picture of his handwritten notes, put them up on Twitter, and said, here's my list — who else? Where else should I go? And he enlisted the public in his investigation, and that was a very transparent process. And I think we have to do more of that. I think that we have to make sure that we include original documents. I think that we have to include audio of our interviews. I think that we have to, you know, do all those kinds of things — talk a bit more about how we went about our work. I'm not saying that's a total answer; it's a possible answer.

I wonder how effective— you know, Donald Trump was so effective during the course of the campaign—
—sort of repeating these mantras, like Crooked Hillary, or — what was Marco Rubio? He was— Marco, you know, blah blah blah. But I think he was actually incredibly smart about kind of the repetitive nature of these monikers. And when he does that about the news media — fake news — I mean, I think there's a method to the madness. And I'm curious if you feel his delegitimization of the news media is going to start actually almost seeping in subconsciously into the minds of consumers, so that it's going to increase the mistrust that already exists.

Well, it's evident that that's already happened. I think he's actually achieved some of his goals in that regard. There was a Quinnipiac poll that showed that something like [inaudible] of Republicans now believe that certain media outlets are the enemy of the people. There was just a poll that I saw this morning that has over a third of Republicans now believing that a free press can actually be dangerous. I think that's very concerning. Now, thankfully, Democrats don't see it that way.
Independents don't see it that way. But it's having a seriously corrosive effect on our credibility, and I think ultimately on the democratic system in this country. I think that, you know, day in and day out, we just have to do our jobs. We have to present actual evidence, we have to be transparent. I like to remind people that during Watergate — I was in high school at the time — the popularity, the approval of the press was at a very low state. The Washington Post was doing its investigation, other media outlets were doing their investigations, and the press was held in extremely low regard. The press was under constant attack from the Nixon administration. Spiro Agnew, his first vice president, was constantly going after the press. You know, "nattering nabobs of negativism" was one phrase, but there were worse. It was very alliterative, but not very elegant. And he had other, harsher words.
And of course there was an enemies list and all that, you know — leak investigations and all that sort of thing — that seems very familiar today. But after it was over, when Nixon had to resign, when the public knew that their president really was a crook, the popularity of the press really rose, because it showed that the press had actually been doing its job: doing it courageously, doing it accurately, and doing it in spite of this demonization by the White House.

We're going to take a quick break. We'll be back, though, with more from Marty Baron and Jessica Lessin right after Brian reads these ads.

So, as a reminder, in our next couple of episodes, you'll hear from Sheila Nevins, who is the president of HBO Documentary Films, and also Christine Todd Whitman, who was the first female governor of New Jersey and the former head of the EPA under President George W. Bush. And of course, as always, we want to hear from you guys, so please call us and leave a message with your questions for Sheila Nevins and Christie Whitman. Can I do the phone number, please?
336 00:18:22,440 --> 00:18:25,560 Speaker 1: Of course. That's nine two nine, two two four, four 337 00:18:25,640 --> 00:18:29,480 Speaker 1: six three seven. Again, that number is nine two nine, 338 00:18:29,840 --> 00:18:32,639 Speaker 1: two two four, four six three seven. And if you 339 00:18:32,680 --> 00:18:34,680 Speaker 1: call in the next thirty minutes you get a set 340 00:18:34,720 --> 00:18:38,960 Speaker 1: of Ginsu knives. I sounded a little like one of 341 00:18:38,960 --> 00:18:42,840 Speaker 1: those ladies, you know, on an infomercial. No, not an infomercial, 342 00:18:43,240 --> 00:18:47,280 Speaker 1: like on a sex line. Yeah, we won't tell. We 343 00:18:47,320 --> 00:18:49,679 Speaker 1: won't tell people about that part of your career. What 344 00:18:49,760 --> 00:18:55,199 Speaker 1: are you wearing? Okay? All right, and now back to 345 00:18:55,200 --> 00:18:59,240 Speaker 1: our interview with Marty Baron and Jessica Lessin. You know, 346 00:18:59,280 --> 00:19:01,720 Speaker 1: it's interesting, Silicon Valley is home to some of the 347 00:19:01,760 --> 00:19:05,719 Speaker 1: only institutions that people still seem to respect, even as, 348 00:19:06,080 --> 00:19:10,040 Speaker 1: you know, the media goes down in public estimation, corporations, 349 00:19:10,040 --> 00:19:13,159 Speaker 1: Washington, D.C. Do you think there's gonna be a 350 00:19:13,200 --> 00:19:16,320 Speaker 1: point at which sort of tech gazillionaires are going to 351 00:19:16,400 --> 00:19:20,120 Speaker 1: become the new Wall Street bankers, who are reviled rather 352 00:19:20,200 --> 00:19:22,440 Speaker 1: than celebrated? Well, I think it's happening. I mean, look 353 00:19:22,480 --> 00:19:28,600 Speaker 1: at Uber, right? Um, a company that's a Silicon Valley darling. Um.
354 00:19:28,640 --> 00:19:32,760 Speaker 1: But, you know, really in a spate of bad press, 355 00:19:32,880 --> 00:19:35,840 Speaker 1: much of it of their own making, I believe. But 356 00:19:36,240 --> 00:19:39,560 Speaker 1: I do think that there is a, I do think 357 00:19:39,640 --> 00:19:42,720 Speaker 1: tech CEOs are the new bankers. And if you 358 00:19:42,800 --> 00:19:46,600 Speaker 1: look at how, um, you know, some of these really young, 359 00:19:46,720 --> 00:19:50,800 Speaker 1: wealthy billionaires are treated, um, and the scrutiny they get, 360 00:19:50,880 --> 00:19:53,840 Speaker 1: much of it deserved. Um. You know, it's 361 00:19:53,880 --> 00:19:56,600 Speaker 1: it's nice being out on the East Coast and 362 00:19:56,680 --> 00:19:59,480 Speaker 1: out of the Silicon Valley bubble. And I've landed here 363 00:19:59,480 --> 00:20:03,840 Speaker 1: in the heartland of Manhattan, you know, and then 364 00:20:03,880 --> 00:20:06,040 Speaker 1: I'm going to D.C. next week, or later this week, 365 00:20:06,080 --> 00:20:09,240 Speaker 1: I'll be at another bubble.
But um, look, I think 366 00:20:09,320 --> 00:20:13,080 Speaker 1: it's fascinating to me how, outside of 367 00:20:13,160 --> 00:20:16,400 Speaker 1: Silicon Valley, the valley is viewed. And, um, there's no 368 00:20:16,480 --> 00:20:19,560 Speaker 1: doubt that right now, I think, there's a big gap, 369 00:20:19,680 --> 00:20:23,160 Speaker 1: in the sense that the valley is viewed 370 00:20:23,160 --> 00:20:28,719 Speaker 1: as, people are curious, maybe a bit worried about it 371 00:20:28,720 --> 00:20:31,960 Speaker 1: coming to encroach on their turf, um, and tech companies, 372 00:20:32,119 --> 00:20:33,920 Speaker 1: but also a lot of sense that, you know, we're 373 00:20:33,960 --> 00:20:36,919 Speaker 1: living on another planet with self-driving cars and disconnected 374 00:20:36,920 --> 00:20:38,920 Speaker 1: from reality, and I think there's a lot of truth 375 00:20:39,000 --> 00:20:42,159 Speaker 1: to that. My advice to the entrepreneurs in Silicon 376 00:20:42,240 --> 00:20:44,679 Speaker 1: Valley is, like, wake up and change your tone a 377 00:20:44,680 --> 00:20:47,560 Speaker 1: little bit about how you're talking about things like jobs. 378 00:20:48,160 --> 00:20:52,320 Speaker 1: You know, techies gloat about reducing jobs, right? And 379 00:20:52,359 --> 00:20:55,760 Speaker 1: that's obviously not, I don't think, the right narrative. So I 380 00:20:55,800 --> 00:20:58,560 Speaker 1: think we're probably even only at the beginning of the backlash. 381 00:20:59,119 --> 00:21:01,440 Speaker 1: Let's take a brief interlude and talk to you, Marty, 382 00:21:01,480 --> 00:21:04,600 Speaker 1: about your background. Um, you grew up in Florida, you're 383 00:21:04,640 --> 00:21:07,080 Speaker 1: the son of immigrants.
I know that you worked on 384 00:21:07,119 --> 00:21:11,160 Speaker 1: your high school newspaper, your college newspaper at Lehigh, right? So 385 00:21:11,560 --> 00:21:14,320 Speaker 1: what drew you to this business? And what's one of the 386 00:21:14,359 --> 00:21:20,119 Speaker 1: biggest changes you've witnessed since you got into the business? Uh, well, 387 00:21:20,160 --> 00:21:22,080 Speaker 1: you know, what drew me to it was, I was in 388 00:21:22,080 --> 00:21:23,960 Speaker 1: a family that had come to the United States. They 389 00:21:23,960 --> 00:21:26,600 Speaker 1: were keenly interested in what was happening in this country, 390 00:21:26,600 --> 00:21:29,080 Speaker 1: where they had arrived. They were keenly interested in what 391 00:21:29,119 --> 00:21:31,680 Speaker 1: was happening around the world. We had a news 392 00:21:31,760 --> 00:21:35,399 Speaker 1: habit in the household: the newspaper every day, the local newspaper, 393 00:21:35,880 --> 00:21:37,440 Speaker 1: which was the only one that you could get at 394 00:21:37,440 --> 00:21:41,080 Speaker 1: that time. Uh, the national news with the Huntley-Brinkley 395 00:21:41,119 --> 00:21:44,800 Speaker 1: Report at the time, and then local news, and then 396 00:21:44,800 --> 00:21:46,919 Speaker 1: Time magazine every week. And that was just part of it. 397 00:21:46,960 --> 00:21:49,280 Speaker 1: And we would talk about that, and they were 398 00:21:49,320 --> 00:21:52,080 Speaker 1: keenly interested. So I became interested in all of that. 399 00:21:52,600 --> 00:21:55,760 Speaker 1: You know, the changes, there are so many changes. I mean, 400 00:21:55,920 --> 00:21:58,359 Speaker 1: obviously the Internet has been the most dramatic change. I mean, 401 00:21:58,400 --> 00:22:02,000 Speaker 1: it's just changed everything about our business. Uh, we had 402 00:22:02,040 --> 00:22:05,040 Speaker 1: all sorts of protections in this business.
Before, it was 403 00:22:05,160 --> 00:22:07,320 Speaker 1: very difficult to get into the business. We didn't have 404 00:22:07,320 --> 00:22:10,360 Speaker 1: as much competition, or all that much competition, for advertising. 405 00:22:10,520 --> 00:22:13,239 Speaker 1: We didn't have as much competition for readers. The 406 00:22:13,240 --> 00:22:15,960 Speaker 1: newspapers were the only place people could get some information. 407 00:22:16,280 --> 00:22:19,480 Speaker 1: Those were huge advantages. Turns out none of those advantages 408 00:22:19,560 --> 00:22:22,400 Speaker 1: were actually earned. They were just gifted to us. And 409 00:22:22,520 --> 00:22:26,080 Speaker 1: so, uh, when those advantages disappeared, we had 410 00:22:26,080 --> 00:22:28,840 Speaker 1: to adjust. You know, for me, it was very difficult 411 00:22:28,880 --> 00:22:30,520 Speaker 1: at the beginning, I have to admit. I mean, I, 412 00:22:30,800 --> 00:22:33,080 Speaker 1: like many people in my field, sort of went 413 00:22:33,160 --> 00:22:35,400 Speaker 1: through this period of mourning, because you could just see 414 00:22:35,840 --> 00:22:38,760 Speaker 1: staffs being cut and things like that. But at some point, 415 00:22:38,880 --> 00:22:41,240 Speaker 1: you know, you have to stop mourning, in the same 416 00:22:41,280 --> 00:22:43,800 Speaker 1: way as if you were to lose a relative, a close relative, 417 00:22:43,920 --> 00:22:46,080 Speaker 1: or a friend. At some point you just go on 418 00:22:46,119 --> 00:22:48,679 Speaker 1: and live your life. And I came to that conclusion, 419 00:22:48,920 --> 00:22:52,160 Speaker 1: and then you start to see, well, what are the opportunities? 420 00:22:52,200 --> 00:22:54,840 Speaker 1: And there are incredible opportunities in this field right now.
421 00:22:54,880 --> 00:22:57,359 Speaker 1: We reach more people, we can tell stories in different ways, 422 00:22:57,800 --> 00:23:00,240 Speaker 1: we can tell these stories more effectively, we can show 423 00:23:00,280 --> 00:23:03,800 Speaker 1: more of our work. All of these things are really fantastic, 424 00:23:03,880 --> 00:23:06,760 Speaker 1: and for someone like me, it's an exciting time 425 00:23:06,760 --> 00:23:08,560 Speaker 1: to be in the business. Jess, you were at the 426 00:23:08,560 --> 00:23:13,120 Speaker 1: Wall Street Journal for years before founding The Information. Curious about 427 00:23:13,720 --> 00:23:16,360 Speaker 1: your reaction, as an alum of the Murdoch 428 00:23:16,520 --> 00:23:20,080 Speaker 1: empire, to what's going on at Fox News. And I'd 429 00:23:20,119 --> 00:23:23,720 Speaker 1: love to hear Marty's thoughts about that as well. You know, 430 00:23:23,760 --> 00:23:27,000 Speaker 1: I think earlier today we had Jeff Zucker on stage, 431 00:23:27,000 --> 00:23:28,919 Speaker 1: and he was talking about Fox News as sort of 432 00:23:28,920 --> 00:23:33,760 Speaker 1: the administration's, um, propaganda wing. And it's just, sort of, 433 00:23:34,080 --> 00:23:36,760 Speaker 1: when I look at it, it's just part of the 434 00:23:37,520 --> 00:23:42,760 Speaker 1: continued polarization of our news landscape. And I think that 435 00:23:44,000 --> 00:23:46,600 Speaker 1: it's problematic, and there really aren't solutions. I mean, people 436 00:23:46,600 --> 00:23:48,199 Speaker 1: like to say, well, we'll just sort of play it in 437 00:23:48,200 --> 00:23:52,120 Speaker 1: the middle, or we'll be more transparent, um, put our 438 00:23:52,160 --> 00:23:54,520 Speaker 1: audio online, and then people will trust us more. I 439 00:23:54,520 --> 00:23:57,280 Speaker 1: think people hear what they want to hear.
And so 440 00:23:58,520 --> 00:24:01,639 Speaker 1: one thing that I just stay laser focused on is 441 00:24:01,920 --> 00:24:05,920 Speaker 1: hiring more journalists who are doing original reporting, and whose 442 00:24:06,000 --> 00:24:08,240 Speaker 1: job it is to go out there and get stories 443 00:24:08,240 --> 00:24:10,640 Speaker 1: that no one else is writing. And Marty's absolutely right, 444 00:24:10,680 --> 00:24:13,760 Speaker 1: you know, the newsroom head count is shrinking. At The Information, 445 00:24:13,840 --> 00:24:16,760 Speaker 1: we have the second largest technology reporting team in Silicon 446 00:24:16,840 --> 00:24:19,640 Speaker 1: Valley behind Bloomberg, and we're three years old. And it's 447 00:24:19,680 --> 00:24:22,040 Speaker 1: because we have a business model that allows us 448 00:24:22,080 --> 00:24:25,560 Speaker 1: to scale and hire great journalists. So, um, I 449 00:24:25,600 --> 00:24:28,399 Speaker 1: think when I look at one outlet or another outlet, 450 00:24:28,480 --> 00:24:30,040 Speaker 1: or what they're doing or not, I'm just saying, are 451 00:24:30,080 --> 00:24:33,680 Speaker 1: they hiring reporters who are going to write great stories? 452 00:24:33,880 --> 00:24:36,120 Speaker 1: And I try to convince them those stories will drive 453 00:24:36,160 --> 00:24:40,080 Speaker 1: their business. There's a sense that just great journalism isn't 454 00:24:40,080 --> 00:24:43,199 Speaker 1: a good business, you have to have a fancy events business, 455 00:24:43,280 --> 00:24:44,720 Speaker 1: or you have to have a B2B business, 456 00:24:44,720 --> 00:24:47,000 Speaker 1: you have to have something else. But our experience has been 457 00:24:47,080 --> 00:24:50,479 Speaker 1: it's a great business. We're cash-flow positive, we're growing fast. 458 00:24:50,760 --> 00:24:54,960 Speaker 1: So, um, that's, sort of zooming out, how I see the landscape.
459 00:24:55,000 --> 00:24:57,840 Speaker 1: And, um, I don't pay close attention to the day 460 00:24:57,880 --> 00:25:02,119 Speaker 1: to day of Fox News. The latest surveys about 461 00:25:02,240 --> 00:25:05,359 Speaker 1: people's trust in the media: how concerned are you about 462 00:25:05,400 --> 00:25:08,840 Speaker 1: how siloed it is? And, as Jess said, people 463 00:25:08,920 --> 00:25:10,920 Speaker 1: hear what they want to hear. A friend of mine said, 464 00:25:10,960 --> 00:25:14,320 Speaker 1: people are looking for affirmation, not information. I mean, just 465 00:25:14,640 --> 00:25:17,440 Speaker 1: taking a look at the big picture, how worried are 466 00:25:17,480 --> 00:25:21,040 Speaker 1: you for sort of the state of democracy? You have 467 00:25:21,280 --> 00:25:24,439 Speaker 1: so much division in this country. And really, you know, 468 00:25:24,520 --> 00:25:27,760 Speaker 1: the more you think about it, in some ways, I think, uh, 469 00:25:28,200 --> 00:25:33,320 Speaker 1: Kellyanne Conway, this notion of alternative facts, um, it was, 470 00:25:33,480 --> 00:25:37,159 Speaker 1: it was kind of ridiculed initially, but in some ways 471 00:25:37,280 --> 00:25:41,000 Speaker 1: I think it's weirdly true. Yeah, well, I call alternative 472 00:25:41,040 --> 00:25:43,760 Speaker 1: facts fiction. But, um, you know what I mean, I 473 00:25:43,760 --> 00:25:46,119 Speaker 1: mean different points of view. You know, I 474 00:25:46,200 --> 00:25:49,080 Speaker 1: understand, and I'm extremely worried about it, and I've talked 475 00:25:49,119 --> 00:25:51,000 Speaker 1: about this a lot. As I said, I think it's 476 00:25:51,000 --> 00:25:53,440 Speaker 1: the greatest challenge that our industry faces, and I think 477 00:25:53,440 --> 00:25:57,560 Speaker 1: it's a challenge to civil society and democracy.
I think 478 00:25:57,600 --> 00:26:01,399 Speaker 1: that people are drawn to sites that affirm their pre 479 00:26:01,440 --> 00:26:04,320 Speaker 1: existing point of view, and that's a concern. But when 480 00:26:04,320 --> 00:26:07,439 Speaker 1: you're drawn to sites that not only affirm 481 00:26:07,480 --> 00:26:10,320 Speaker 1: your pre existing point of view, but present you with 482 00:26:10,440 --> 00:26:14,200 Speaker 1: so-called information that is in fact fiction, that's completely 483 00:26:14,280 --> 00:26:18,520 Speaker 1: made up, that's full of bizarre conspiracy theories, uh, that 484 00:26:18,680 --> 00:26:23,000 Speaker 1: is incredibly corrosive to civil society. In order 485 00:26:23,040 --> 00:26:25,159 Speaker 1: to have a democracy, you have to agree on a 486 00:26:25,240 --> 00:26:28,520 Speaker 1: base set of facts. You can disagree on the analysis 487 00:26:28,560 --> 00:26:32,720 Speaker 1: of those facts, you can disagree on the 488 00:26:32,760 --> 00:26:37,280 Speaker 1: prescriptions for solving the problems of society. But fundamentally, 489 00:26:37,359 --> 00:26:39,080 Speaker 1: you have to agree on a base set of facts 490 00:26:39,119 --> 00:26:41,480 Speaker 1: and then work from there. Uh, and right now we 491 00:26:41,520 --> 00:26:44,159 Speaker 1: can't even agree on what happened yesterday, and that is 492 00:26:44,440 --> 00:26:47,919 Speaker 1: a huge challenge to civil society. Let's talk about not 493 00:26:48,040 --> 00:26:50,639 Speaker 1: the facts, but the Factor. What did you make of 494 00:26:50,720 --> 00:26:54,760 Speaker 1: the whole Bill O'Reilly incident? Well, I'm not quite sure 495 00:26:54,800 --> 00:26:58,080 Speaker 1: how to answer that. I mean, you know, obviously, yeah, 496 00:26:58,160 --> 00:26:59,880 Speaker 1: well, I'm not sure I want to try too hard.
497 00:27:00,520 --> 00:27:03,440 Speaker 1: But look, I mean, I think there was good reporting 498 00:27:03,560 --> 00:27:06,280 Speaker 1: that took place. Uh, The New York Times did 499 00:27:06,520 --> 00:27:12,199 Speaker 1: a fine job. Yes, I was. Uh, you know, I 500 00:27:12,200 --> 00:27:15,040 Speaker 1: guess I have to be honest. Um. So, you know, 501 00:27:15,119 --> 00:27:17,199 Speaker 1: we had done a lot of the work on Bill Cosby, 502 00:27:17,560 --> 00:27:19,959 Speaker 1: uh, and they did a lot of work on 503 00:27:19,960 --> 00:27:23,320 Speaker 1: Bill O'Reilly. So, um, but I 504 00:27:23,359 --> 00:27:25,280 Speaker 1: am jealous. I mean, look, I think 505 00:27:25,320 --> 00:27:27,720 Speaker 1: that they appeared to have done good work, it appears to 506 00:27:27,720 --> 00:27:33,240 Speaker 1: have been well documented, it had impact, and at least 507 00:27:33,280 --> 00:27:36,359 Speaker 1: there was an independent, supposedly independent, investigation 508 00:27:36,440 --> 00:27:40,280 Speaker 1: at Fox of the actual underlying facts. And for whatever reason, 509 00:27:40,359 --> 00:27:45,040 Speaker 1: whether it was commercial or because they found the information credible, uh, 510 00:27:45,240 --> 00:27:50,000 Speaker 1: they discontinued their association with Bill O'Reilly. Uh, and, 511 00:27:50,280 --> 00:27:52,280 Speaker 1: you know, but there's still a public out there that 512 00:27:52,600 --> 00:27:55,880 Speaker 1: may continue to be drawn to him, and we'll 513 00:27:55,920 --> 00:27:57,840 Speaker 1: have to see how that goes. I know we 514 00:27:57,880 --> 00:27:59,520 Speaker 1: have to wrap up soon, but before we go, I 515 00:27:59,560 --> 00:28:02,080 Speaker 1: do want to ask you about Peter Thiel and his 516 00:28:02,560 --> 00:28:05,879 Speaker 1: sort of niche as the President's point man in Silicon Valley.
517 00:28:05,920 --> 00:28:08,760 Speaker 1: I mean, he's famously contrarian. It's a very contrary position 518 00:28:08,800 --> 00:28:12,960 Speaker 1: he's taken relative to others there. What's been the 519 00:28:13,000 --> 00:28:16,400 Speaker 1: reaction to his role? So in Silicon Valley, I mean, 520 00:28:16,480 --> 00:28:19,440 Speaker 1: people didn't quite know what to make of it. Initially 521 00:28:19,640 --> 00:28:23,520 Speaker 1: they thought, you know, Peter is an investor, and so, um, 522 00:28:23,600 --> 00:28:26,280 Speaker 1: if you bet on an undervalued asset and that 523 00:28:26,359 --> 00:28:30,239 Speaker 1: asset skyrockets, that value accrues to you. And so I 524 00:28:30,280 --> 00:28:32,560 Speaker 1: think in the Valley a lot of people saw his 525 00:28:32,680 --> 00:28:35,760 Speaker 1: endorsement of Trump, and then working for Trump, first as 526 00:28:36,280 --> 00:28:41,040 Speaker 1: a crazy Peter position, and then, um, honestly, like a little 527 00:28:41,040 --> 00:28:44,520 Speaker 1: bit of good for him for betting on the right horse. Um, 528 00:28:44,560 --> 00:28:47,719 Speaker 1: I think it's notable, right now we're not hearing as 529 00:28:47,800 --> 00:28:50,960 Speaker 1: much about him, and I think that is probably reflective 530 00:28:50,960 --> 00:28:55,960 Speaker 1: of the fact that he wants to and will remain involved, 531 00:28:56,080 --> 00:28:59,760 Speaker 1: but is pulling back, probably because he got a little 532 00:29:00,160 --> 00:29:03,920 Speaker 1: more negative reaction as well from the Trump haters. 533 00:29:04,000 --> 00:29:06,640 Speaker 1: And I mean, it's just a whole mix.
But besides that, 534 00:29:06,680 --> 00:29:08,959 Speaker 1: there's, you know, a little bit of entrepreneurs who had 535 00:29:09,000 --> 00:29:12,440 Speaker 1: taken money from him, uh, feeling the need to reaffirm 536 00:29:13,000 --> 00:29:17,840 Speaker 1: to their companies, Facebook did this too, that, uh, you know, 537 00:29:17,920 --> 00:29:21,560 Speaker 1: they believe in a lot of perspectives in their organization 538 00:29:21,600 --> 00:29:23,880 Speaker 1: and don't want to start blacklisting investors based on their 539 00:29:23,920 --> 00:29:26,600 Speaker 1: political points of view. But, um, I think it's died 540 00:29:26,600 --> 00:29:30,880 Speaker 1: down a little and will probably continue. Um, but we 541 00:29:30,920 --> 00:29:33,440 Speaker 1: shall see. I mean, I think Peter is a fascinating guy. 542 00:29:34,120 --> 00:29:36,640 Speaker 1: He has very strong points of view, and when 543 00:29:36,640 --> 00:29:41,000 Speaker 1: he sees an opportunity to have influence, he'll step 544 00:29:41,000 --> 00:29:43,640 Speaker 1: into it. So we'll see what happens next. And Marty, 545 00:29:43,960 --> 00:29:47,520 Speaker 1: in closing, uh, what are you all doing about access? 546 00:29:47,520 --> 00:29:50,960 Speaker 1: And how challenging has that been? Because I know the 547 00:29:51,040 --> 00:29:55,080 Speaker 1: Trump administration has invited more sympathetic news outlets to be 548 00:29:55,160 --> 00:29:57,080 Speaker 1: a part of the White House press briefing, to be 549 00:29:57,160 --> 00:30:01,920 Speaker 1: sort of part of informal gaggles, and so on. 550 00:30:02,840 --> 00:30:07,400 Speaker 1: Have your reporters said, hey, we're just not getting access? 551 00:30:07,640 --> 00:30:10,640 Speaker 1: Or does everybody and their brother want to talk about 552 00:30:10,640 --> 00:30:13,840 Speaker 1: the infighting that's going on in the Trump administration?
553 00:30:13,880 --> 00:30:16,360 Speaker 1: And how are you handling all that? Well, you know, 554 00:30:16,440 --> 00:30:18,280 Speaker 1: I mean, I think our reporters have a pretty good 555 00:30:18,280 --> 00:30:21,680 Speaker 1: relationship with people at the White House. It's very professional, 556 00:30:21,880 --> 00:30:24,560 Speaker 1: as it should be. I think, at the 557 00:30:24,560 --> 00:30:26,360 Speaker 1: White House at least, they're able to reach the people 558 00:30:26,400 --> 00:30:29,400 Speaker 1: they need. At the agencies, I think it's much more difficult. 559 00:30:29,520 --> 00:30:33,160 Speaker 1: I think the cabinet secretaries are very nervous about speaking 560 00:30:33,160 --> 00:30:37,719 Speaker 1: with the press. Their underlings, the bureaucrats who 561 00:30:37,760 --> 00:30:39,880 Speaker 1: have been there, the government workers, I should say, who 562 00:30:39,880 --> 00:30:41,320 Speaker 1: have been there for a long time, who are expert 563 00:30:41,360 --> 00:30:45,240 Speaker 1: in their field, are terrified that they will be 564 00:30:45,360 --> 00:30:48,680 Speaker 1: fired for actually giving you background to 565 00:30:49,320 --> 00:30:51,840 Speaker 1: actually help you understand an issue, or that they might 566 00:30:51,880 --> 00:30:56,040 Speaker 1: say something that doesn't conform to the administration's position. 567 00:30:56,360 --> 00:31:00,760 Speaker 1: I think that's very, very concerning, and it's more important 568 00:31:00,760 --> 00:31:03,560 Speaker 1: that we speak to those kinds of people than necessarily 569 00:31:03,560 --> 00:31:06,400 Speaker 1: that we get the infighting at 570 00:31:06,400 --> 00:31:08,720 Speaker 1: the White House. We also have to make sure that 571 00:31:08,800 --> 00:31:11,120 Speaker 1: what we're doing is not based entirely on access, and 572 00:31:11,160 --> 00:31:14,360 Speaker 1: it is not.
Some journalism requires some level of access, 573 00:31:14,600 --> 00:31:19,960 Speaker 1: but not all journalism requires access. It requires access to documents, 574 00:31:19,960 --> 00:31:22,959 Speaker 1: it requires access to people outside of the administration. It 575 00:31:23,000 --> 00:31:27,320 Speaker 1: requires our being energetic and aggressive in our reporting, and 576 00:31:27,320 --> 00:31:30,080 Speaker 1: we're doing that work as well. We're not 577 00:31:30,160 --> 00:31:32,920 Speaker 1: dependent on access to people at the White House or 578 00:31:33,000 --> 00:31:35,960 Speaker 1: even in other branches of the administration. Well, thank you 579 00:31:36,040 --> 00:31:38,680 Speaker 1: so much for taking the time. We are totally 580 00:31:38,720 --> 00:31:43,720 Speaker 1: geeking out here. I feel like you guys have other 581 00:31:43,760 --> 00:31:46,000 Speaker 1: things to do. Thank you so much, Jessica, for having 582 00:31:46,080 --> 00:31:57,040 Speaker 1: us, and Marty, for being here. Thank you. Thank you. Thanks, 583 00:31:57,080 --> 00:32:00,600 Speaker 1: as always, to our intrepid producer Gianna Palmer for putting 584 00:32:00,600 --> 00:32:03,840 Speaker 1: together the show, to Jared O'Connell for getting to the 585 00:32:03,920 --> 00:32:07,440 Speaker 1: venue super early to mix and engineer this show, and 586 00:32:07,480 --> 00:32:11,360 Speaker 1: also to Nora Richie for additional production assistance. Last time 587 00:32:11,440 --> 00:32:14,280 Speaker 1: we did a podcast, everybody spent the night at my house, 588 00:32:14,440 --> 00:32:16,920 Speaker 1: so we'll have to do that again. We had a 589 00:32:16,960 --> 00:32:19,760 Speaker 1: slumber party, it's true, out at the beach, which was 590 00:32:19,800 --> 00:32:22,520 Speaker 1: really fun. But I don't have room for you guys 591 00:32:22,560 --> 00:32:25,720 Speaker 1: in my New York apartment.
So thanks to our social 592 00:32:25,760 --> 00:32:28,960 Speaker 1: media maven Alison Bresnik, and to Emily Beena for her 593 00:32:29,040 --> 00:32:31,880 Speaker 1: part in producing the show. And Mark Phillips, thank you, 594 00:32:31,920 --> 00:32:34,720 Speaker 1: as always, for our catchy theme music. Katie Couric and 595 00:32:34,760 --> 00:32:37,959 Speaker 1: I are executive producers, and remember, you can email 596 00:32:38,040 --> 00:32:42,160 Speaker 1: us at comments at Couric podcast dot com. Find Katie 597 00:32:42,200 --> 00:32:45,800 Speaker 1: on social media. She's at Katie Couric on Twitter, on Instagram, 598 00:32:45,880 --> 00:32:49,320 Speaker 1: Katie dot Couric on Snapchat, and I'm at goldsmith b 599 00:32:49,680 --> 00:32:52,800 Speaker 1: on Twitter. Best of all, you can rate and review us, 600 00:32:53,280 --> 00:32:55,760 Speaker 1: but only, yes, but only if you have nice things 601 00:32:55,800 --> 00:32:59,680 Speaker 1: to say. I'm very sensitive. And don't forget to subscribe 602 00:32:59,720 --> 00:33:02,840 Speaker 1: as well. Thank you so much for listening. John said 603 00:33:02,880 --> 00:33:04,840 Speaker 1: that on my tombstone, you know what it's going to say? 604 00:33:05,280 --> 00:33:08,920 Speaker 1: "Thank you so much," because that's apparently what I say 605 00:33:08,960 --> 00:33:11,160 Speaker 1: all the time. So, by the way, I'm getting cremated, 606 00:33:11,320 --> 00:33:15,040 Speaker 1: and he said he was going to spread my ashes, no, 607 00:33:15,360 --> 00:33:17,960 Speaker 1: hopefully not soon, he was going to spread my ashes 608 00:33:18,160 --> 00:33:21,000 Speaker 1: all over the country, because in death, as in life, 609 00:33:21,560 --> 00:33:25,760 Speaker 1: I was spread too thin. Is that touching? And he 610 00:33:25,800 --> 00:33:27,640 Speaker 1: also assumes he's going to be alive when you die. 611 00:33:29,360 --> 00:33:32,680 Speaker 1: There's so much wrong with this. No, he's not.
He's 612 00:33:32,760 --> 00:33:36,280 Speaker 1: six years younger. I was trying to help you, cougar. 613 00:33:36,360 --> 00:33:39,360 Speaker 1: All right, we digress. The bottom line is we really 614 00:33:39,360 --> 00:33:42,720 Speaker 1: appreciate your listening. Thank you so much, and we'll talk 615 00:33:42,760 --> 00:33:43,440 Speaker 1: to you next time.