Tracy: Welcome to Odd Lots. I'm Tracy Alloway, executive editor at Bloomberg Markets.

Joe: And I'm Joe Weisenthal, managing editor at Bloomberg Markets.

Tracy: All right, so Joe, this week we have something a little bit different for our listeners. It's not entirely markets related, but I think it has a lot to do with what is going on in markets right now, if that makes any sense.

Joe: Keep going.

Tracy: Okay, all right. So this week's episode is actually inspired by a conversation I had when I was in London earlier this month. It was shortly after the Brexit referendum, which shocked a lot of people. The decision by the UK populace to leave the EU came as a shock to a lot of people, including myself, if I'm honest. So what happened is, I was in London and I was talking to someone that you actually know, but who shall remain unnamed, someone I respect a lot. And this person said they had actually voted to leave the EU, and they were the last person I ever would have thought would vote to leave. But then, as we started talking, this person listed all the reasons they thought leaving was a good decision, including a long, drawn-out argument about the death of neoliberalism. And I realized, you know, my instinctive reaction to hearing someone say that they voted to leave is incredulity, I guess. And maybe it shouldn't be. Everyone has valid reasons, right?

Joe: Absolutely. And before we go any further, I'm just hoping that as soon as the recording stops on this podcast, you tell me who this person is. But I think this is an incredibly important topic, because when it comes to the Brexit referendum, and when it comes to politics in the US and elsewhere in the world, it's pretty clear that people separate themselves into these ideological bubbles.
And not only do they not know or can't imagine how someone could see things the opposite way, they seem to make almost no attempt to understand the other side's point of view. And I think this has pretty long-term ramifications for the media, for democracy, for the economy, and probably also for markets.

Tracy: Given how many people in the Brexit referendum were completely stunned by the results.

Joe: Exactly. That's the perfect segue.

Tracy: All right, so here with us today to talk about why the other side of the argument is not necessarily dumb is Sean Blanda. He is editor-in-chief of 99U, which is a publication dedicated to providing the missing curriculum for creative careers. But he is also the author of a really good Medium post on this topic that enunciates a lot of the ideas we're discussing right now. Sean, welcome to the program.

Sean: Very glad to be here. Thank you for having me.

Joe: Well, Sean, you wrote a post on Medium about the other side not always being wrong, and of course that other side can mean a lot of different things. But for many people working in finance in the major cities, or in technology, the other side either means the Brexit referendum, or it means voting for Trump, or something else. So what did you mean in this post, and why did you write it?

Sean: I'm a very active Twitter user, and a news event would happen, I would see a coalescing of viewpoints, and then, you know, the opposition viewpoint would start to filter into that bubble, and I would see it immediately mocked, like it was an antibody. And I thought, things cannot be this simple. And this was right before Brexit, when this happened, and the notion that the other side had something to say was not even crossing people's minds. So one thing I did is I followed people that I disagreed with on Twitter, and I found that things would come up and it would change my mind. As an example, the recent debate around gun control.
You know, as has been well documented, there were four gun control bills put on the floor of the House. People were arguing about it, and initially I was on one side of that, which is: yes, let's heavily regulate guns through this no-fly list. And then someone posited: how is that no-fly list mandated? By whom? With what judicial oversight? And I was like, oh, I would not have thought of that if I was not following the other side. This thing is more complex than I thought. The other side is not just a bunch of buffoons who don't agree with me; they have nuanced political views as well. And I've seen that borne out again and again, and Brexit is the latest example of that.

Tracy: So when we talk about opinions sort of coalescing on either side, why do you think that happens? Is it just human nature? Like, it's in our nature to cluster around people who are similar to us in, you know, maybe socioeconomic status and also in their opinions?

Sean: Yes, and I think in the past it was harder to do that, because we didn't pick and choose the media we consumed, and less so the people we hung out with. I mean, especially for the kind of young, media-savvy person who's very transient: you can pick your friends and pick the news you view, and you can summarily reject whole outlets and just stick to things that appeal to you. And at first that doesn't have such a bad impact, but as years and years go on, you are totally divorced from some people's realities. And you saw this in Brexit, where people felt, you know, left behind by some of the economic prosperity that other people have enjoyed. And if you were surrounded only by people who have enjoyed that, you could not even fathom why someone would want to change the situation as it is.

Joe: So I've certainly noticed the same phenomenon. I mean, you know, I'm on Facebook like everyone else, and I watched the instant reactions to things.
And I watched the sort of widespread dismay among my social circles at the idea that the UK would vote to leave the EU. And it worries me, because I think, like, long term about democracy. And for what it's worth, I don't think it's just the elites, the sort of cosmopolitan elites, because you witness the same sort of snide caricatures of the other side by the other side.

Sean: The fact that we use the word "elites" and can laugh it off is one example of that.

Joe: Exactly. But ultimately, what I worry about is: what does it do for democracy if people can wall themselves into these worlds in which no competing ideas can enter the debate?

Sean: I think you lose empathy for the people around you, in a way that you can't even fathom. And I think we all have issues that are controversial, where, when you see them affect you or someone around you, it can change your mind. And that can be something like: your dad doesn't have healthcare, and what does that mean, what is the struggle you go through? Or one of your friends is unfairly arrested and jailed, and what does that mean, and you explore the judicial process for the first time and you go: oh my god, this group of people had a point, or, I see it through their lens now. And I think the only way that happens to us is when it directly happens to us. We've stopped taking people's word for it, and we assume that their intention is malice or, at worst, stupidity. So then you only endorse policy positions and follow news that reinforce your worldview, because you just can't even imagine what it's like not to have your worldview. And what's worse is, you don't even know there is another worldview.
You feel like you're in the supermajority, and there are these fringe nut wackos in the minority, and who are they? They're ridiculous. I'm the smart one.

Tracy: So we've sort of touched on this, but how much does social media actually play into this dynamic? Because, you know, we all use Twitter, we all use Facebook, we all have our sort of personally curated selection of news nowadays, and a lot of it comes to us through our friends and our contacts. Does that end up hardening the lines between different opinions, different groups?

Sean: I would say: if you let it. Because one thing is, you know, we live in New York City. I don't know anyone who I'm not related to that lives in a rural area. So social media is great, because I can seek out people who live in areas not like New York City and see the things that they care about, the things they're struggling with. So it's good. But I can very easily say, this person is a nut job, I'm going to block them. Like, we literally have the words "block" and "mute."

Joe: I think one thing worth pointing out is that this phenomenon is not as simple as: there's the cosmopolitan elites, and there's everyone else. I used to live in Vermont, and lots of my high school friends were huge Bernie Sanders supporters. And obviously they were vociferously pro-Sanders during the campaign, and many of them very anti-Hillary. But what I noticed, not just that, was they had basically convinced themselves that he was winning for much of the race. There was a lot of Bernie math and things like that. And so it wasn't just that they only saw the ideological arguments; the actual reality of what was happening, with the votes that were counted, with the numbers, they had a completely different perspective on than what was getting through the mainstream media.
And I think a lot of reporters who cover politics will say they got a lot of anger from the Bernie Sanders supporters, because the supporters thought they were just lying or making up the various numbers.

Sean: Yeah, right. You build your own reality. And this is a side effect of the skepticism, where you think: the media is against people like me, everyone's against people like me, because everyone I know like me feels this way, so there must be something here. You saw something similar in the run-up to the 2012 election, where there were a lot of people convinced Mitt Romney was going to win in a landslide, because everyone they knew was...

Joe: Yeah, right.

Sean: And they picked and chose the news they read, and that's obviously not what happened. So I agree. I used the word "cosmopolitan" because I'm simply acknowledging how I came to it, which is through my own filter bubble. I've lived in New York for years now; previously I lived in Philadelphia for seven years. So for my entire adult life I've been part of a large city. But I agree with you entirely.

Tracy: Well, we saw an example of this alternate-reality dynamic earlier this month, when Andrea Leadsom, the former Conservative Party leadership candidate, basically said that sterling had performed absolutely fine in the aftermath of Brexit, and anyone with access to a Bloomberg terminal or basic finance data could see that that really wasn't true. What do we actually do when this happens? Is there any way to fact-check these things? Do people care about fact-checking anymore?

Sean: I mean, I think the answer is no. But I think that was the most common response to the piece: what happens when they're actually wrong?
Like, what happens when they're trying to tell me it's raining outside and it really isn't? I can empirically say that. And to that, I would say, you know, I think it's helpful to figure out why a person is motivated to get to that conclusion, and then address that motivation rather than the mistaken fact. Why does that particular person, what reason do they have? What reason do Bernie Sanders fans have to believe that he's winning? I mean, the answer is pretty obvious, but I think there's a sound gripe in there about how he might have been treated by the larger media outlets, and the belief is just the manifestation of that. And I think too often we attack the manifestation without looking at the core reason.

Joe: You know, something I've wondered about is: is this just going to get worse and worse? Because it doesn't seem like there's an obvious mechanism out there to solve it. And you think about the speed at which technology is advancing, like virtual reality, or payment systems, or communication systems that can connect the ideologically like-minded across the world even faster. It seems like there are just more and more opportunities proliferating for people to create their own worlds and inhabit spaces, whether physical or technological, in which you can ever more finely tune what gets in and who the other people in them are.

Sean: I think we're going to develop, and this is me being extremely optimistic, and people listening to this might roll their eyes, but I think we're going to develop the muscles, as a media-consuming public, to start to recognize that this is happening, and when this is happening. And I think something like Brexit is a good wake-up call for that, where I know that in my media consumption I thought it was a guarantee that the answer was Remain, right? I thought it was guaranteed. So when that happens, I'm like: oh, what am I consuming about this?
What media am I consuming, and am I actively searching out other views? Now, I know that's kind of altruistic, and it puts a lot of responsibility on the individual, a busy individual who may have other things going on. But I think we're going to develop a muscle, similar to the way we do with, I don't know, bad content on television. Like, we know that if our kids are up at ten p.m. they might see bad content, and we should police that, change the channel. I think we're going to find that process as a society, I hope.

Tracy: So is this the awkward teenage phase of our new media consumption? We just have to sort of grow into it?

Sean: Yeah. I mean, you can see it in things we thought were established. I just walked here and I saw, you know, twenty people walking around with Pokemon Go on their phones, right? And your inclination is to be like: that's ridiculous, that's crazy, that's a fad. But that kind of shows how nascent the medium of cameras on phones still is. Like, we just figured out this super great game that allows people to talk to each other.

Joe: You know, you mentioned walking outside and seeing that, and I've had this thought before. If you're online all day, you see people just raging at each other and fighting caricatures.
And then you sort of step outside and you see people going about their business, and you're like: oh, maybe this is not really how everybody is.

Sean: Yeah, people are polite.

Joe: And I often feel like there's this dichotomy: people get upset at things online that I've never once seen someone get upset about in real life. And offline, just when I'm going about my day-to-day business, I see a level of civility and empathy, and I'm like: actually, maybe people aren't that bad.

Sean: And God, I'd like to think that online we're just going to start to realize that. I think there's a group of people who, whenever they see something that disagrees with them online, feel like: I have to answer that, I have to set them right. And maybe the mechanism should be: if someone was going down the street yelling something you didn't agree with, you'd think, they're probably having a rough day, I've just got to let that go, I'll talk to them when cooler heads prevail. I think those manners and social graces are only now starting to come together online. And I think it does occasionally take an event to make an individual person realize where their blind spots are.

Tracy: But the downside while we figure this out is that we do have potentially dangerous populism sort of bubbling to the surface, and it seems like people are getting more and more angry with each other. And it's happening online, but it's translating into real life and our actual political situation. So there's some urgency here.

Sean: I mean, with the events of the past week, you've seen unity on a larger scale, you know, calls for togetherness. Now, how that togetherness happens is obviously a whole other can of worms. But I think it goes back to finding the root causes of why people are divisive. What is the thing?
What is the reason? You know, a lot has been made about people economically hurting and then taking it out in certain ways. I just feel that if we recognized the people across from us as human beings with nuanced opinions, who have read things you haven't read, who have watched movies you haven't watched, who have had tragedies happen to them that haven't happened to you, and took a second to go, maybe there's some summation of their life experiences that has led them there, and were empathetic to that. And that's hard to do in a hundred and forty characters, I will grant you. But just knowing that that's possible, that that's even a possibility, is something I don't feel like a lot of people are doing. They just summarily reject the thing that happens.

Joe: I had a couple of people share the piece you wrote, and I think it resonated with a lot of people. What kind of reactions have you had? Because I could see it going two ways. One is: yeah, this is a real issue. And another reaction is: I'm all for empathy, but why should I try to see the perspective of racist, xenophobic knuckle-draggers? So I'm curious what kind of reaction you got.

Sean: So there's a line, right? And I think racism is that line; when someone is just being outright racist, me reasoning with you might not go anywhere. But I think that line is further back than people think. People advocate for certain policy positions, or certain ways the country should work, and they're rejected as being out of touch or dumb or naive. And all I'm asking is: take the step past that. I think patience is what's needed.
And it's easy to say that when someone's not being racist towards you, and I grant that a million times out of a million. But before dismissing someone as stupid or bigoted or not having all the facts, just take a second, I guess, is all I would say.

Joe: So what was the reaction like?

Sean: So there were two bits of reaction, well, three bits of reaction, which were interesting. Overwhelmingly, and this is very gratifying when you write stuff, because it doesn't normally happen, people said: this was great, this was helpful, thank you. And some people said: I notice I do this in my own life. When this was happening, Gamergate was going on in full force, so a lot of people reached out to me about that, like: oh, now I understand the other part of it. Secondly, there was what I said before: what happens when they're just wrong, factually wrong? And what I would say is: get to the point of why they believe what they believe. And then third, and I think this is the most troubling aspect, people who were closed-minded used the article as a weapon, to say why people who weren't them were closed-minded.

Joe: Which is like an infinite loop.

Sean: Yeah. And I purposely tried, as best I can, to use the language of the other side, because the other side is different things to different people. But then people used that to justify the idea that more people should listen to them: you see, I'm part of the other side, you should take me seriously.

Joe: But everyone thinks they're the other side, and everyone thinks that they're the marginalized one.

Sean: Exactly. And some people are right. But I've had people say: thank you for this article, I used it on, you know, all those damn feminazis online, I said this to all of them to shut them up. And I'm like: did you listen to what they were saying? Take a minute. Why did they feel that way?
Just don't use this as another weapon.

Tracy: All right, a plea for some civility on the internet. Why don't we leave it there? Sean, thank you so much for joining us.

Sean: Thanks for having me.

Tracy: Thank you. Joe, I know that wasn't as market-y as a lot of our other episodes, but what did you think about that?

Joe: I think this is just such an important topic because, as I was saying before, it's really hard to be that optimistic about democracy if these trends continue, the trends towards walling yourself off to only like-minded argument and only facts that support your side. A certain level of valuing debate, and some degree of a shared set of facts, seem to be crucial ingredients.

Tracy: Yeah, and I think this idea of the Balkanization of opinions is incredibly relevant now, obviously in politics and for democracy, as you pointed out, but also when it comes to investing. And I know there have been reams written about confirmation bias in investing, cognitive biases, that sort of thing. But really, it seems more important than ever to stop and think about what the other side is saying. And if you manage to do that right, if you manage to gauge increasingly divided public opinion correctly, you could make a lot of money. And maybe you could come to understand your fellow man a little bit more as well.

Joe: Yeah. The Brexit vote was obviously a huge opportunity for people to make money, had they had a different perspective than the mainstream. You know, even within investing and economics, it's fascinating how people form certain camps, like the gold camp, or the anti-Fed camp, or the tech camp, or whatever. And so I think there is almost always an opportunity for profit if you force yourself out of how you think. And I know I've changed my mind on many issues relating to both politics and economics.
Big mind changes over the years, and I think I'm better for it.

Tracy: All right, we'll do a Joe Weisenthal Mind Changes episode sometime later this year, maybe.

Joe: All right. Or I'll just write a post: twenty big things I've changed my mind on.

Tracy: Well, I would read that.

Joe: Yeah. Well, it was great talking with you, Tracy. This has been another episode of the Odd Lots podcast. I'm Joe Weisenthal. You can find me on Twitter at @TheStalwart.

Tracy: And I'm Tracy Alloway. I'm on Twitter at @tracyalloway. You can find Sean on Twitter at @SeanBlanda. Thanks for listening.