Speaker 1: Hello, champions. I hope you're well. It's dawned on me lately that I'm becoming more of a fence sitter. In fact, I was talking with somebody yesterday. I'm recording this at eleven twenty-nine a.m. on a Sunday, Sunday the second of March. And yesterday somebody asked me something about Donald Trump. I'm not a political person, so, just saying. Isn't it funny, when you say two words, Donald Trump, almost everybody has a reaction, anywhere between some kind of opinion and some physiological, visceral reaction? Right? Everybody's... or not everybody, but most people have a side. And they asked me what I thought about some particular things. It doesn't matter what, and this is not do I like him, do I hate him, what do I think of his policies, blah blah blah. It was just about a particular thing. I said, look, this is what I think, but I could be wrong. I really don't know. And they're like, get off the fence. Get off the fence. And I said, why? In other words, pick a side. I'm like, nah. Pick a side. No, I don't want to pick a side. I don't know. I don't have all the data. I don't have all the evidence. I don't know, and what I do know is not nearly enough. And by the way, what I know is what I've learned through the media. How much of the media is accurate? How much of what I'm seeing, or think I'm seeing, is evidence or data? How much of it is manufactured bullshit? How much of it is rhetoric? How much of it is a form of marketing or branding or, you know, virtual manipulation or whatever? I don't know. I don't know. But it just dawned on me, this idea that we very much inhabit a cognitive and emotional and cultural and sociological landscape, most of us, where we are encouraged... no, we're coerced. We're coerced to pick a team, to pick a side, to pick a group. And I was thinking, you know what, sometimes I really like being on the fence. Not for the sake of being a fence sitter, but because, for me, it's the best fit.
Speaker 1: If I don't know, if I don't know, or I don't have absolute clarity or certainty, why do I need to choose? You know, when we talk about choosing a side, we're literally talking about picking a group of thought, like a thought bubble, or a way of being or thinking or behaving, a culture, a religion. Pick a side, get off the fence. You need to belong. You need to belong to a team or a side or a group. Here's what I think. No. We don't know. We don't. Why do I need to climb down from a vantage point that gives me a perspective and a level of awareness, an understanding, and perhaps vision, up on my fence where I can see both sides? I don't have to choose. I don't have to align. And maybe sometimes I will climb down from my fence and I will join a particular ideology or philosophy, because I truly believe it, and I think it, and I have what I think is good evidence. But maybe, you know, maybe we don't have to absolutely know. Maybe we can say, you know what, I'm not sure. And I just wrote down, what is it, one, two, three, four, five things, I typed them up a little earlier before we went live here, that I've spoken about in the last three weeks, where I've essentially said to people, I don't know. And it bothers people when I don't pick a side. But I feel that in twenty twenty five, experts, and I use that term very fucking loosely, and I don't even know if I'm an expert, I think I have a bit of knowledge about a few things, but I wouldn't really call myself an expert... but experts, or people who call themselves experts, feel compelled or coerced to have an absolute, categoric opinion, idea, perspective on everything, and that's their stance, despite the fact that they really never have all the evidence or data or knowledge. Everyone feels compelled to pick a side, because if you're not on this side, well, then you're clearly on the other side.
Speaker 1: By the way, if you don't hate Trump, then you must love Trump. How about this? How about, I don't give a fuck. How about, how about, I don't know if intermittent fasting is brilliant or not brilliant. You know, these are the things that I've had similar conversations about in the last five weeks. So: Donald Trump, fasting, God, supplements, as in, you know, bodybuilding and nutritional supplements, and AI. And in the context of each of those situations and conversations, I said to people, I don't know. I don't know. I don't have a strong opinion on that. Or, you know, there might be some things, some supplements... like, do I think creatine monohydrate is good? Well, I can tell you from my experience that it seems to work for me. Now, that's not very hard science, is it? It seems to work for me. So I'd rather not say anything categoric about creatine monohydrate or protein powders or trimethylglycine (TMG) or fucking any of them, right? Because I don't, I don't, I don't know. I don't have absolute knowledge. So I don't want to go... I don't want to be in the "yes, this is good" or "no, this is bad" camp. So I'm going to go, look, I don't think there's an unequivocal, categoric answer to that for all people. So I'm going to tell you what I think, which is not to be confused with what I globally know. I can tell you what I've experienced personally, which, again, which is why I'm always saying this: this is not a recommendation for the majority. This is an experience of a very small minority, i.e., me. Same with AI. Like, there are some things about AI that I think are brilliant. Are there some things that, for me, are potentially, down the track, maybe problematic or fucking terrifying? Yes. Yes. But again, that's just my thinking. That's not data, that's not, that's not fact, that's not universal truth.
Speaker 1: So this whole idea that sitting on a fence, that not having a categoric, absolute stance or opinion, is some kind of flaw or problem or weakness... I think fence sitting could actually be a strength. I think it could actually be wisdom. And in a world that's increasingly divided into polarized camps, I think the metaphorical fence sitter often faces criticism from both sides. So there's a profound question worth exploring, and that is: is sitting on the fence truly a position of weakness? Or maybe, is it sometimes a vantage point, or a perspective, of wisdom? From the fence, we can gain a unique perspective unavailable to those who aren't sitting up on the fence with us. Let's not say everyone needs to be up on the fence, but those who are firmly planted on one side of the fence or the other, they can't see what we can see. Now, having said that, am I ever on a side? Of course I am. Of course I am. And I don't think being on a side is bad, of course. But neither do I think that being on neither side is necessarily bad either. Kind of like an observer on a hill overlooking a battlefield, a fence sitter can see the patterns and similarities and blind spots that combatants miss in the heat of engagement. So this elevated perspective of being in a place that the majority are not, which is, I'm not in group A or group B, I'm not in a group, or I might be in a third group, which is the "I don't know" group, and that's okay. Like, this allows me, from my fence, it allows me to recognize patterns across supposedly opposing viewpoints, to identify common ground that opposing sides or partisans might not see, and to see also how both sides use similar tactics and similar tricks and similar manipulation. And it also allows me to see how tribal psychology shapes perception on both sides.
Speaker 1: We're living in that echo chamber of thought, that echo. And each side has their own echo chamber, their own confirmation bias, their own way of thinking and belonging and being and interacting. And by the way, if you want to be in our group, in our thought cult, on our side of the fence, then you need to agree with everything. But this is problematic, because when we're on one side of the fence, we automatically believe that what we think, and our ideology, our philosophy, our practices, our habits, our rituals, our rules... we automatically think that we're right, which keeps us shut down from the possibility that the group on the other side of the fence might actually be right about some things. So the moment that I think I'm unequivocally right, or I unequivocally know, or my truth is the objective, universal, there-is-no-argument truth, then I also think, and I've said this many times before, that everyone who doesn't agree with my group or my thinking, or my rituals, or my perspective or my religion or my ideas, I automatically think that they're wrong. And that's a really precarious position. Think about, you know, think about how many times you've been wrong about stuff in your life. How many times you got it... and I'm not talking about "oh, I trusted Diane and she fucked me over." No, it's not about Diane. It's about you. It's about how many times you did something and you got it wrong. You thought a certain thing, you took a certain path, and it was wrong. And this is not, this is not that you're bad or flawed, or I'm bad or flawed. This is that you and me are humans. We do shit that is dumb. We do shit that is flawed. We get things wrong.
Speaker 1: Now, when we join a group, when we get down from the fence and then we go into our new group, we pick a side. That is cool, as long as there's a door between the fences, so that I can maybe put my left knee through the fence and go, well, on this particular topic, on this idea, regarding this issue, I'm not in. The problem with being in a group is that you need to conform to the group. You need to think like the group. You need to be fully committed to be able to belong, which means you are fully discouraged from disagreeing, from not aligning, from not conforming, from not being in the thought cult that is pervasive on each side of the fence. And it happens on both sides of the fence, doesn't it? Both sides of the fence. So the problem with this is package-deal thinking. Choosing a side means accepting an entire ideology wholesale. You're expected to embrace not just one position but all the positions, an entire constellation of beliefs, many of which may or may not logically connect. And this bundling of beliefs, this grouping of thoughts and ideas and ideologies, restricts your own intellectual freedom, and it stops us from honest self-inquiry and reflection, because we're more interested in belonging, because belonging meets an emotional need. We want to belong. And when I belong, I feel safe, and I feel included, and I feel seen, and I feel valued. And all of these internal kind of awarenesses that I have, about some of these things that actually I believe are bullshit... I'll just bite my tongue, because I'm more interested in belonging than in thinking critically and letting them know what I think. Once we choose a side, that choice often becomes intertwined with our identity, and our position is no longer just what we think. It literally becomes who we are. I am. It's one of those "I am" statements. I am a, fill in the blank. I do this. I am in this group.
Speaker 1: And this makes it psychologically costly to change our minds, and it makes it socially costly, and it makes it emotionally difficult. And once we're really intertwined with a side or a group, once our sense of self and self-worth and identity, our "who I am", becomes intertwined with being on a particular side of the fence, then we, the people who say we want to be open-minded and objective, we lose all of that capacity regarding that particular ideology anyway. Like, I've never walked into a room, and I've done this many times, and said, put up your hand if you don't want to be open-minded or objective, and had the hands go up. Because everybody in the room wants to be open-minded and objective, except when it comes to their beliefs. Except when it comes to the things that they think they know for sure. So choosing a side, which I'll say again, is not a bad thing. It's not a bad thing. But side-taking can activate and amplify numerous cognitive biases. So: confirmation bias, which we've spoken about many times on the show, where we seek out information or ideas or behavior that confirms what we already believe. Disconfirmation bias, which we don't talk about much, is the opposite. It's when we hear anything, or read anything, or are exposed to any idea or thought or ideology or philosophy, or theology for that matter, that doesn't align with ours, and we either scrutinize it way more harshly or we don't even consider it. And I think, like, this is a really... I'm thinking about this in relation to me, right? This is a hard lesson. This is a hard... you know, I'm all about personal development. I'm all about helping you develop, you know, self and self-awareness, and physically, mentally, emotionally, spiritually, whatever that means.
Speaker 1: But this is a hard thing to do: to recognize your own bias, and to acknowledge your bias, and to acknowledge that when it comes to this or that, I am not open-minded. I am closed-minded, because I know, I know. And it's just, it's just not true. Sometimes the truth, the uncomfortable truth, is that we don't always know. Sometimes we absolutely know. Sometimes we absolutely know. But there are times where what we think is the objective, overwhelming, categoric truth is just a story that we fully believe. All right, let's talk about why fence sitting gets such a bad rap, because it is culturally and socially very maligned. We want to know where people stand, and the shoulder-shrugging "I'm not sure", or "I don't have a strong opinion", or "I still need to think", or "I need more evidence or more information or more data", that often doesn't go down well. So I think sometimes fence sitting gets misinterpreted as apathy. It gets conflated with not caring, when in reality, I think many fence sitters care deeply but refuse simplistic solutions to complex problems. It's not always A or B. And sometimes we're literally presented with two schools of thought, and maybe we're not even thinking about other possibilities. So we get emotionally and socially and intellectually corralled into this place where people want us to choose this option or that option. But when you say, is there a third option? They haven't even thought about that. And the idea that, look, I'm not going to choose either option... like I said, some people see that as not caring. They see that as apathy. They see that as ignorance. For me, I actually think sometimes, I think it's... I think it's an act of courage. Because it's much easier to just agree with someone, isn't it? And I don't like disagreeing for the sake of disagreeing. But, you know, it's so easy to just keep that social lubricant that is agreement flowing. It's so easy to do that.
Speaker 1: But what happens when we just say yes to things that we actually don't really believe, or we just align, or we just conform? I think in the middle of all of that agreement and all of that choosing a team, I think we lose ourselves. Like, I am so passionate about... I'm happy to belong to... well, I don't really want to belong to a group. I just think how I think. And if there are other people that think like I do, or belong, or, what's the word, behave, or have similar values or ideas, that's cool. But I would really hope those people don't align with me on everything, because then I feel like, well, now we've created our own cult. By trying not to be in a thought cult, now we've created one. Of course we're going to diverge. Of course we're not going to agree on everything. Of course there's going to be, you know, there's going to be disagreement. But I think this idea that we all need to somehow live in echo chambers or groups where we essentially are the same, and think the same, and behave the same, and periodically vote the same or worship the same or eat the same or whatever it is... I don't see that as being you or me being the best version of ourselves. I think it's also true, and completely understandable, that we have an action bias. Like, we really like people to take action, be decisive, don't sit on the fence. We have a real psychological preference for action over inaction. And the fence sitter's deliberation or thoughtfulness or consideration can be taken for analysis paralysis, rather than being recognized as something of social, emotional, psychological, and cognitive value, where I'm actually really thinking deeply. And I know that it doesn't suit you, and I know that it's not convenient for your group. I know that it's not convenient for you right now in this conversation. But the truth is, I don't have an opinion just yet.
Speaker 1: I don't. And maybe you'll see me in a week or a month or a year and I still won't know. And I don't... It's like, like, I have this really interesting thing, because you guys know that I grew up very much in a God paradigm, in a very churchy, very, very theologically strong environment, with certain beliefs and certain values and certain unequivocal rules. And this is how it is. This is how, this is how religion works, Craig. This is how Catholicism works. And then later on, I was in another church, which was a fundamentalist Christian church, and, this is how it works here. And this is the Bible, and this is true, and God said this, and so you need to do these things. You need to stop doing those things. More of this, less of that. And if you don't do this, then this is the price. And if you do do this, this is the reward. And so, are you on board? And by the way, in order to have this mystical, magical, beautiful outcome, you need to basically sign on this metaphoric dotted line so you can join our team. And if you join our team, you're in the right team. All the other four thousand or so religions are wrong, but we're right, and we've got the one true hotline to God. And thank God you found us, literally, because here you are, and now you're in the group. Now, the problem with that is... one of the problems, one of the intellectual problems with that... Now, I'm not saying there is or isn't a God, so don't judge me. I'm saying I don't know. I don't know. I don't know. I don't have absolute knowledge. And that bothers lots of people. Because, like, we very much live in a culture where people... not everyone. There's agnosticism, where people are kind of not sure, you know, and there's atheism, and there's right down the old theological rabbit hole, fully committed.
Speaker 1: And again, this is not judgment or criticism of any of those particular operating systems that people choose. But when people say to me things like, and I get asked this pretty regularly, what do you believe? I'm not even sure what I believe anymore. Do I hope that there's a loving, compassionate God that cares about you and me, and blah blah blah, all that kind of thing? I maybe believe it less than I used to, and I feel guilty about that. And then I'm like, well, do I feel guilty because that's God? Or do I feel guilty because I have this historical, intertwined relationship with this culture and this theology and this church and this sociology that operates around this certain way of being? Like, this is difficult. And this is where thinking critically can be scary: thinking for yourself, and being okay with saying, look, maybe there's a God, maybe there isn't. I don't know. And I think if we're all truthful, and again, I could be wrong, we'd all have to say, I don't know. Saying "I don't know" is not the same as saying "I don't believe." Knock yourself out with your belief. I believe lots of things that I can't prove. So if you said to me, do you know? I would say, no, I don't know. But what I do is I believe, or perhaps I have faith. Faith is literally believing something that I don't have data or evidence or science for. And so... but people don't want you to be on the fence when it comes to things like this, these big-ticket items. Of course they don't. But maybe, maybe for some of us, there are certain things I believe and I want to be true. But when I'm being brave, when I'm being the most authentic, or I think I'm being the most authentic version of me, I have to say, look, this is what I want to be true, this is what I think, but I could be wrong.
Speaker 1: And the reason that I have to say that is, one, I don't actually know. I don't actually know. I don't have absolute, unequivocal knowledge. And two, I have to say it because I have been so wrong so many times. In fact, choosing a side has probably been more disadvantageous than helpful over the years, where I feel, oh fuck, I've got to, I've got to join one of these teams, because you can't not be on a team. I'm telling you, sometimes, fuck the team. Sometimes, do a deep dive yourself. Listen to that inner wisdom. Listen to that subconscious mind of yours. What do you know? What do you believe? What do you think? And there will still be things that you have conflict and confusion around, and that's okay. How about this: we just live with some of the conflict and confusion and the inner turmoil. Why do we need to know everything? Why do we need to have everything sorted? Why do we need to know? Why do we need to have a strong opinion on fucking everything? Why do we criticize people who aren't in a group? Why do we do that? Why do we go, get off the fence, you're getting splinters in your ass? Why do we do that? We do that because we're in a group and we want them to be in our group. That's why we do it. Fuck that. That is the opposite of independent critical thinking. To me, that is the height of being a fucking social sheep. You might do a full kind of audit and inventory and deep dive into what you think and where that came from, and what you believe and how and why you believe it, and they're all good things. And you might eventually find that, oh wow, all of that stuff actually just naturally puts me on this side of the fence. And if that's the case, I think that's great. But being coerced and cajoled, is that a word? I think it is, to pick a side? I don't see that in any way as being helpful to you or me as individuals.
Speaker 1: Oh, I don't know if any of this is making sense. It makes sense to me, anyway. So maybe... what's the middle path? What's the middle path? Is it being the active fence sitter? Perhaps what we need isn't to glorify fence sitting or side-taking per se, but maybe it's to... maybe it's to reimagine what principled neutrality looks like. Oh, I like that: principled neutrality. So I've written down a couple of things that I think are maybe a stepping-off point. And number one is to commit to principles over tribes, over groups, over echo chambers. For me, the wise fence sitter commits to consistent principles rather than groupthink, or consistent principles rather than consistent tribes. They apply the same standards of evidence and ethics regardless of which side an idea comes from. I'm going to say that again, because I think it's really powerful and worth remembering and worth operationalizing. So the idea is to commit to principles over tribes. The wise fence sitter commits to consistent principles rather than consistent tribes. They apply the same standards of evidence and ethics regardless of where that idea came from. In other words, which side that idea or construct or theory came from. Number two, on maybe choosing the middle path of being an active fence sitter, is intellectual humility as strength: recognizing that the limitations of our knowledge are not weakness but perhaps wisdom. It's not weakness to recognize that you don't know, or that you don't have an opinion, or that you don't fit into a group. That's not weakness. That's wisdom. That's courage. The fence sitter's "I don't know yet" is actually a position of strength and independence and authenticity, not indecision. Two more. Number three: strategic neutrality. Some issues genuinely require taking a clear position. The wise fence sitter knows when to descend from the fence and when to stay up there, when to maintain perspective. Like, the interesting thing is, like, in reality... think about this.
Speaker 1: When I was thinking about this before I started recording, I was literally picturing sitting up on a fence. Let's say I'm up on a big fence, like a two-meter-high fence, if you're in the States, six and a half feet. I'm up on a two-meter fence, and there's a thousand people to the left of me and a thousand people to the right of me, and they can't see each other. They can all see me, and I can see all of them, but none of them can see each other. And if you can't talk to anyone on the other side, because you're on a side, you can't talk to, or listen to, or consider, or perhaps gain wisdom or insight or understanding from anyone on the other side, how the fuck can you possibly be open-minded? Because you already believe that everyone, every one of those thousand on the other side, is wrong, and every one of the thousand on your side is right. And this is problematic. The wise fence sitter knows when to get down from the fence and when to maintain their perspective. I just think that sometimes, sometimes being up in that position where you don't need to align with a group... you know, it's like, and this will probably shatter some of you: sometimes I think Donald Trump's a fucking idiot. Sometimes I think, oh my goodness, like, some of the shit, some of the... and sometimes, every now and then, I'm like, oh, that seems to be a good idea. Now, do I think he's categorically this or that? No, I don't. Well, you know, am I on a particular side of the fence? No, I'm not, and I actually don't want to be. That's the other thing. I think when you identify, I am pro this person or anti this person, or I think, I think, you know, eating this way is good and eating that way is bad, I think this kind of workout is the best kind of workout and that kind of workout is dangerous, blah blah blah.
Speaker 1: Now, unless all of these thoughts are really based in some kind of objective science or wisdom, then I don't see any real advantage in having to jump into either one of those psychological, sociological, cognitive melting pots. So I think, I think sometimes, from their vantage point, fence sitters can serve as translators between opposing camps, cognitive camps, helping each side, if those sides will allow them, to understand the legitimate concerns of the other. And this is one of the challenges, because around all of these things, all of these ideologies, all of these philosophies, all of these issues... just have a look at the world. I don't say this often, and I'm not lying awake sleepless every night, but I'm pretty concerned. I'm pretty concerned about the world. And I think, while humans are allegedly the smartest species... I would say, I don't know. It depends how we measure smart, right? What are the metrics that we're using for intelligence? If we're talking about, you know, being able to build roads and cars, and fly to space, and build computers, yeah, of course, of course. If they're the measurements, we win. But when you also look at the absolutely devastating, mindless stupidity that happens, the hate, the division, the pick-a-side-of-the-fence-ness, the divergence, not the convergence, the separation, the "if you don't think like me, you're my enemy" thinking... I just, I worry. I worry. So maybe sitting on the fence isn't a failure of courage, but actually an act of intellectual integrity and bravery. In a culture that rewards rapid judgment and tribal loyalty, the decision to maintain perspective, to sit on the fence when it's appropriate, might be exactly the counterbalance we need right now.