WEBVTT - Why We Have Too Much Free Will || Ken Sheldon

0:00:00.440 --> 0:00:02.960
<v Speaker 1>Hi everyone, I'm excited to announce that the eight week

0:00:03.040 --> 0:00:07.320
<v Speaker 1>online Transcend Course is back. Become certified in learning the

0:00:07.360 --> 0:00:09.840
<v Speaker 1>latest science of human potential and learn how to live

0:00:09.880 --> 0:00:13.680
<v Speaker 1>a more fulfilling, meaningful, creative, and self actualized life. The

0:00:13.720 --> 0:00:16.480
<v Speaker 1>new cohort starts on June twenty fifth and will include

0:00:16.480 --> 0:00:19.400
<v Speaker 1>more than ten hours of recorded lectures, live group Q

0:00:19.520 --> 0:00:22.200
<v Speaker 1>and A sessions with me, small group sessions with our

0:00:22.239 --> 0:00:25.560
<v Speaker 1>world class faculty, a plethora of resources and articles to

0:00:25.560 --> 0:00:29.000
<v Speaker 1>support your learning, and an exclusive workbook of growth challenges

0:00:29.160 --> 0:00:31.760
<v Speaker 1>that will help you overcome your deepest fears and grow

0:00:31.880 --> 0:00:35.040
<v Speaker 1>as a whole person. There are even some personalized self

0:00:35.080 --> 0:00:38.159
<v Speaker 1>actualization coaching spots available with our world class faculty. As

0:00:38.200 --> 0:00:40.760
<v Speaker 1>an add on, I'm happy to announce that just for

0:00:40.840 --> 0:00:43.960
<v Speaker 1>Psychology podcast listeners, you can get twenty percent off the

0:00:44.000 --> 0:00:47.880
<v Speaker 1>course by going to transcendcourse dot com and entering psych

0:00:48.080 --> 0:00:52.120
<v Speaker 1>Podcast in the coupon box. We will be closing registration soon,

0:00:52.360 --> 0:00:54.880
<v Speaker 1>so I suggest signing up as soon as possible. We

0:00:54.960 --> 0:00:56.959
<v Speaker 1>have so much fun in this course and you will

0:00:57.000 --> 0:00:59.920
<v Speaker 1>receive a lot of support along your self actualization journey.

0:01:00.320 --> 0:01:03.720
<v Speaker 1>Just go to transcendcourse dot com to register and enter

0:01:04.080 --> 0:01:10.560
<v Speaker 1>psych Podcast in the coupon box. I look

0:01:10.600 --> 0:01:14.319
<v Speaker 1>forward to welcoming you to the transcender community.

0:01:14.440 --> 0:01:16.880
<v Speaker 2>There ain't nobody else making our choices but us. We

0:01:17.000 --> 0:01:21.880
<v Speaker 2>might like to think so, but we're responsible and that's scary,

0:01:22.640 --> 0:01:26.440
<v Speaker 2>but it's also tremendously empowering if you fully grasp it.

0:01:31.280 --> 0:01:34.600
<v Speaker 1>Hello, and welcome to the Psychology Podcast. Today we welcome

0:01:34.680 --> 0:01:37.800
<v Speaker 1>back Ken Sheldon to the show. Ken is a Curators'

0:01:37.840 --> 0:01:42.440
<v Speaker 1>Distinguished Professor of Psychological Science at the University of Missouri, Columbia.

0:01:42.720 --> 0:01:45.560
<v Speaker 1>He has written and edited over two hundred academic books,

0:01:45.760 --> 0:01:49.400
<v Speaker 1>scholarly articles, and book chapters. Among these, some of his

0:01:49.480 --> 0:01:53.600
<v Speaker 1>most notable works include Optimal Human Being and Self-Determination

0:01:53.760 --> 0:01:57.440
<v Speaker 1>Theory in the Clinic. His latest book is called Freely Determined.

0:01:57.720 --> 0:02:00.760
<v Speaker 1>What the New Psychology of the Self Teaches Us About

0:02:00.800 --> 0:02:03.720
<v Speaker 1>how to live. In this episode, I talked to Ken

0:02:03.760 --> 0:02:07.680
<v Speaker 1>Sheldon about free will. Instead of questioning its existence, Ken

0:02:07.760 --> 0:02:09.680
<v Speaker 1>is concerned with how we might use free will to

0:02:09.680 --> 0:02:12.480
<v Speaker 1>help us reach our goals. Each person has the capacity

0:02:12.520 --> 0:02:15.079
<v Speaker 1>to make good and bad choices and to learn from

0:02:15.080 --> 0:02:18.440
<v Speaker 1>the past. Although we are unable to know everything about ourselves,

0:02:18.600 --> 0:02:21.400
<v Speaker 1>we can still make informed decisions. Believing that we have

0:02:21.440 --> 0:02:24.959
<v Speaker 1>the ability to choose directly affects our well-being and values.

0:02:25.440 --> 0:02:28.600
<v Speaker 1>We also touch on the topics of neuroscience, self determination,

0:02:29.040 --> 0:02:32.720
<v Speaker 1>and responsibility. I really enjoyed this conversation with Ken. I

0:02:32.760 --> 0:02:35.640
<v Speaker 1>find him really insightful, especially when it comes to

0:02:35.680 --> 0:02:39.480
<v Speaker 1>this quite heated topic within academia, this topic of do

0:02:39.480 --> 0:02:42.000
<v Speaker 1>we have free will or not? I like his perspective.

0:02:42.040 --> 0:02:44.880
<v Speaker 1>I think it's fresh, and I think it really draws

0:02:44.919 --> 0:02:49.239
<v Speaker 1>a lot on psychological science as opposed to maybe more

0:02:49.240 --> 0:02:52.840
<v Speaker 1>metaphysical ideas. Wherever you stand in the free will debate,

0:02:52.880 --> 0:02:55.480
<v Speaker 1>we'd love to hear from you. Please come into the comments,

0:02:55.600 --> 0:02:58.399
<v Speaker 1>and at the very least, I hope this conversation stimulated

0:02:58.720 --> 0:03:02.760
<v Speaker 1>some deep thought and reflection on this very important and

0:03:02.960 --> 0:03:06.480
<v Speaker 1>fascinating topic. So, without further ado, I bring you Ken

0:03:06.840 --> 0:03:09.760
<v Speaker 1>Sheldon again. It's so great to have you on the

0:03:09.800 --> 0:03:11.040
<v Speaker 1>Psychology Podcast.

0:03:12.320 --> 0:03:13.880
<v Speaker 2>Yeah, it's great to be here. I think I was

0:03:14.080 --> 0:03:16.360
<v Speaker 2>with you a few years ago, but things have kind

0:03:16.400 --> 0:03:18.160
<v Speaker 2>of changed a little bit since then, so it'll be

0:03:18.160 --> 0:03:19.840
<v Speaker 2>good to reconnect. Yeah.

0:03:19.840 --> 0:03:22.040
<v Speaker 1>Absolutely, a lot has changed in the world. And the

0:03:22.120 --> 0:03:24.600
<v Speaker 1>last time we talked about your brilliant work on the

0:03:25.120 --> 0:03:29.480
<v Speaker 1>Optimal Human. So you wrote this fascinating new book called

0:03:29.639 --> 0:03:32.919
<v Speaker 1>Freely Determined. I had to take the cover off because

0:03:33.160 --> 0:03:35.520
<v Speaker 1>it's hard to read it with the cover fallen off.

0:03:35.560 --> 0:03:39.600
<v Speaker 1>But anyway, Freely Determined, congratulations on the publication of this book,

0:03:39.760 --> 0:03:43.080
<v Speaker 1>and it adds some new ideas into this space, this

0:03:43.560 --> 0:03:47.000
<v Speaker 1>free will debate space. I noticed you don't get too

0:03:47.040 --> 0:03:52.880
<v Speaker 1>mired down in the rigorous, nuanced philosophy-of-mind debate,

0:03:53.080 --> 0:03:55.320
<v Speaker 1>you know, and you know you don't even stake a position.

0:03:55.520 --> 0:03:57.120
<v Speaker 1>I don't think anywhere in the book did you

0:03:57.160 --> 0:04:00.360
<v Speaker 1>say I'm a compatibilist. You know. I feel like you

0:04:00.400 --> 0:04:02.480
<v Speaker 1>focus a lot on the psychology of it, But what

0:04:02.640 --> 0:04:05.760
<v Speaker 1>are you within that whole debate? Like, are you a compatibilist?

0:04:05.840 --> 0:04:06.600
<v Speaker 1>Are you something else?

0:04:07.400 --> 0:04:10.480
<v Speaker 2>Probably you would call me a compatibilist. What I'm trying

0:04:10.480 --> 0:04:13.960
<v Speaker 2>to say is that you can have determinism and free

0:04:13.960 --> 0:04:17.039
<v Speaker 2>will at the same time. You just have to recognize

0:04:17.080 --> 0:04:21.320
<v Speaker 2>that human higher mental processes are part of the determining formula.

0:04:21.640 --> 0:04:23.680
<v Speaker 2>You know, that what we do way up here in

0:04:23.760 --> 0:04:27.160
<v Speaker 2>this abstract mental world that we live in, is part

0:04:27.200 --> 0:04:30.400
<v Speaker 2>of what determines the behaviors that we take. And I

0:04:30.400 --> 0:04:33.040
<v Speaker 2>don't think that it's reducible to any of the lower

0:04:33.120 --> 0:04:37.560
<v Speaker 2>level machinery. The machinery gives us the capacity to do

0:04:37.600 --> 0:04:41.359
<v Speaker 2>what we do, but it doesn't determine what we do. Instead,

0:04:42.040 --> 0:04:46.800
<v Speaker 2>we, selves attempting to operate our own minds, taking advantage

0:04:46.800 --> 0:04:49.440
<v Speaker 2>of all this machinery we have, are really kind of running

0:04:49.440 --> 0:04:50.960
<v Speaker 2>the show, for better or worse.

0:04:51.839 --> 0:04:53.760
<v Speaker 1>This sounds like a debate that you have with your father.

0:04:53.920 --> 0:04:56.760
<v Speaker 1>Is that right? He has a very deterministic view of the situation.

0:04:57.240 --> 0:05:00.280
<v Speaker 2>Yeah. I start off the book talking about these debates

0:05:00.320 --> 0:05:03.200
<v Speaker 2>going you know, way back into my teens with my dad,

0:05:03.240 --> 0:05:06.920
<v Speaker 2>who was a law professor but also a determinist,

0:05:07.120 --> 0:05:09.000
<v Speaker 2>which was kind of a funny thing. He didn't think people

0:05:09.040 --> 0:05:11.440
<v Speaker 2>should be punished for their crimes because they couldn't help

0:05:11.480 --> 0:05:13.400
<v Speaker 2>doing them. But I, you know, I had the hardest

0:05:13.400 --> 0:05:18.200
<v Speaker 2>time because his arguments were so undefeatable in some ways.

0:05:18.920 --> 0:05:21.200
<v Speaker 2>And so you could say that my whole career as

0:05:21.240 --> 0:05:25.679
<v Speaker 2>a personality psychologist has been about trying to find ways

0:05:25.720 --> 0:05:28.840
<v Speaker 2>around those arguments or ways to think about them differently

0:05:29.720 --> 0:05:34.000
<v Speaker 2>that give us, you know, a viable position as mental

0:05:34.160 --> 0:05:37.640
<v Speaker 2>entities for taking control of our own lives and for

0:05:37.839 --> 0:05:40.440
<v Speaker 2>leading the kind of lives that we think we would like.

0:05:40.960 --> 0:05:41.480
<v Speaker 2>That's just it.

0:05:41.600 --> 0:05:43.720
<v Speaker 1>The leading the lives that we think well, the leading

0:05:43.760 --> 0:05:46.719
<v Speaker 1>the lives that we hopefully would like, because you go

0:05:46.760 --> 0:05:48.920
<v Speaker 1>through a lot sorting out the difference, how to find

0:05:48.920 --> 0:05:50.560
<v Speaker 1>out the difference between what we think we might like

0:05:50.600 --> 0:05:52.080
<v Speaker 1>and what we do like. So I'm going to give

0:05:52.080 --> 0:05:53.400
<v Speaker 1>you a little bit more credit than you just gave

0:05:53.440 --> 0:05:56.800
<v Speaker 1>yourself there. It's about, you know, the kind of life, right,

0:05:56.960 --> 0:05:59.440
<v Speaker 1>you know, because you go to great pains just for

0:05:59.560 --> 0:06:03.279
<v Speaker 1>us to really look deep inside and figure that out.

0:06:03.560 --> 0:06:05.000
<v Speaker 1>You know, like, there are a lot of things that

0:06:05.040 --> 0:06:08.840
<v Speaker 1>we think we should like. There are even things we might

0:06:08.839 --> 0:06:10.159
<v Speaker 1>think we like and we don't really like.

0:06:11.320 --> 0:06:14.960
<v Speaker 2>Yeah, yeah, it's not easy. No, we're in this funny position.

0:06:15.560 --> 0:06:19.360
<v Speaker 2>Having free will means we are also free to make mistakes.

0:06:19.520 --> 0:06:21.960
<v Speaker 2>We're free to be clueless, We're free to have no

0:06:22.080 --> 0:06:26.040
<v Speaker 2>idea what we really want. We're free to make terrible choices,

0:06:26.320 --> 0:06:28.599
<v Speaker 2>but we're also free to learn from those choices and

0:06:28.640 --> 0:06:31.599
<v Speaker 2>hopefully make better ones in the future. So, you know,

0:06:31.839 --> 0:06:34.800
<v Speaker 2>the book kind of talks about the free will debate

0:06:34.920 --> 0:06:38.000
<v Speaker 2>early on, and it doesn't really take a position, as

0:06:38.000 --> 0:06:41.679
<v Speaker 2>you said. It just concludes that there's reasonable doubt, pretty

0:06:41.720 --> 0:06:45.320
<v Speaker 2>good reason for doubting the determinist's perspective. And then it

0:06:45.360 --> 0:06:47.080
<v Speaker 2>moves on to what I think is a more important

0:06:47.160 --> 0:06:50.000
<v Speaker 2>question of Okay, if we do have free will, as

0:06:50.040 --> 0:06:53.480
<v Speaker 2>I'm claiming, or saying is probable, how well are we

0:06:53.520 --> 0:06:55.719
<v Speaker 2>able to use it? And that I think is the

0:06:55.880 --> 0:07:00.680
<v Speaker 2>really important question, because we're kind of cut off, you know,

0:07:00.720 --> 0:07:04.760
<v Speaker 2>we're living in this mental world system. Daniel Kahneman

0:07:04.920 --> 0:07:09.120
<v Speaker 2>called it the tyranny of the prefrontal cortex. We're sort

0:07:09.120 --> 0:07:14.560
<v Speaker 2>of suspended above the machinery. We have the ability

0:07:14.560 --> 0:07:17.880
<v Speaker 2>to be out of touch with ourselves, and so the critical

0:07:17.960 --> 0:07:21.040
<v Speaker 2>question is how to get the information, the good information, that

0:07:21.080 --> 0:07:24.200
<v Speaker 2>we need to make choices that are going to cause

0:07:24.200 --> 0:07:28.720
<v Speaker 2>our entire organism to thrive, even though we're kind of

0:07:28.760 --> 0:07:32.600
<v Speaker 2>stuck as the conscious representatives of that organism moment to moment,

0:07:32.680 --> 0:07:36.440
<v Speaker 2>and we might be pretty clueless about what's really happening

0:07:36.480 --> 0:07:37.240
<v Speaker 2>inside of us.

0:07:37.840 --> 0:07:41.200
<v Speaker 1>It just seems like you somehow make a distinction between

0:07:42.160 --> 0:07:47.400
<v Speaker 1>system one being, you know, our unconscious mind so to speak,

0:07:47.440 --> 0:07:49.679
<v Speaker 1>that all the things beneath the surface of our conscious

0:07:49.680 --> 0:07:53.080
<v Speaker 1>awareness that are happening automatically kind of keeping the system

0:07:53.160 --> 0:07:58.320
<v Speaker 1>running without much forethought. It seems like that system

0:07:58.400 --> 0:08:01.480
<v Speaker 1>one you ascribe a lot less free will to

0:08:01.760 --> 0:08:05.160
<v Speaker 1>than system two, which you talk a lot about the

0:08:05.200 --> 0:08:07.560
<v Speaker 1>symbolic mind, which we'll get to in a second. Yeah,

0:08:07.760 --> 0:08:11.480
<v Speaker 1>and system two being our ability to plan, to error-correct,

0:08:11.920 --> 0:08:15.240
<v Speaker 1>to change course, to maybe influence our system one. You

0:08:15.280 --> 0:08:17.600
<v Speaker 1>know that we can influence our system one by changing habits,

0:08:17.640 --> 0:08:19.760
<v Speaker 1>can't we? You know, it's not like a system one

0:08:19.760 --> 0:08:20.960
<v Speaker 1>is completely impenetrable.

0:08:21.400 --> 0:08:22.560
<v Speaker 2>That's right, that's right.

0:08:22.960 --> 0:08:25.520
<v Speaker 1>So you actually like weigh the two systems differently in

0:08:25.600 --> 0:08:27.600
<v Speaker 1>terms of their free will implications.

0:08:27.800 --> 0:08:30.440
<v Speaker 2>You asked a very excellent question. I haven't thought about

0:08:30.480 --> 0:08:32.839
<v Speaker 2>it quite that way, but I would agree with that

0:08:33.000 --> 0:08:37.920
<v Speaker 2>assessment. In the book I draw on a philosopher, Christian List.

0:08:38.320 --> 0:08:41.480
<v Speaker 2>He wrote a book in twenty nineteen called Why Free

0:08:41.480 --> 0:08:45.000
<v Speaker 2>Will Is Real, and he proposed a definition of free

0:08:45.000 --> 0:08:49.160
<v Speaker 2>will that put the question squarely into my territory as

0:08:49.200 --> 0:08:52.000
<v Speaker 2>a goal researcher. And he said, free will is nothing

0:08:52.040 --> 0:08:56.000
<v Speaker 2>more than the ability to ask yourself what you want,

0:08:56.800 --> 0:09:00.880
<v Speaker 2>call up from your non conscious mind some possibilities for action,

0:09:01.679 --> 0:09:04.720
<v Speaker 2>choose one of them, and then get moving. And so

0:09:04.760 --> 0:09:08.880
<v Speaker 2>that's really a pretty straightforward and kind of intuitive perspective.

0:09:08.880 --> 0:09:10.640
<v Speaker 2>If you think about it. What does it mean for

0:09:10.679 --> 0:09:12.679
<v Speaker 2>an agent to have free will? It can think about

0:09:12.679 --> 0:09:15.360
<v Speaker 2>what it wants, and then it can decide, and then

0:09:15.400 --> 0:09:17.720
<v Speaker 2>it can start trying to get it. And so that's

0:09:17.760 --> 0:09:20.600
<v Speaker 2>what I've been studying my whole career is how people

0:09:20.679 --> 0:09:24.080
<v Speaker 2>set goals and how well do they pursue goals, and

0:09:24.120 --> 0:09:28.440
<v Speaker 2>then how do the goals influence their happiness after they're

0:09:28.520 --> 0:09:32.480
<v Speaker 2>done pursuing them and perhaps achieving them. So I really

0:09:32.559 --> 0:09:36.959
<v Speaker 2>love that definition that Christian List provided because it

0:09:37.120 --> 0:09:40.920
<v Speaker 2>sort of helped me sidestep these centuries of philosophical debate,

0:09:41.600 --> 0:09:45.840
<v Speaker 2>which I see as pretty impenetrable and very much bogged down.

0:09:45.920 --> 0:09:50.200
<v Speaker 2>Everybody's got their own position, and I love the ability

0:09:50.240 --> 0:09:53.320
<v Speaker 2>to sidestep all that and say, well, simple free will

0:09:53.400 --> 0:09:56.160
<v Speaker 2>is the ability to think about what you want and to

0:09:56.240 --> 0:09:59.720
<v Speaker 2>make a choice, and that is a system two process.

0:10:00.000 --> 0:10:03.839
<v Speaker 2>There's a lot of conscious stuff going on there

0:10:03.840 --> 0:10:05.720
<v Speaker 2>where we say, well, wait a minute, what do I

0:10:05.760 --> 0:10:08.000
<v Speaker 2>do here? Well, I could do this, that or that?

0:10:08.160 --> 0:10:10.280
<v Speaker 2>What do I want to do? We think about the future,

0:10:10.360 --> 0:10:12.400
<v Speaker 2>what will happen in each case? I think, I like

0:10:12.880 --> 0:10:15.400
<v Speaker 2>option B, I'm going to go for that. And so

0:10:15.480 --> 0:10:18.800
<v Speaker 2>that's not something that's happening in system one, or at

0:10:18.840 --> 0:10:21.760
<v Speaker 2>least not in that same way, because System one is

0:10:21.800 --> 0:10:25.240
<v Speaker 2>more sort of automatic and habitual. So system two

0:10:25.440 --> 0:10:28.680
<v Speaker 2>is the part of us that is kind of in

0:10:28.720 --> 0:10:32.640
<v Speaker 2>a representational world, living in a symbolic self. We can

0:10:32.679 --> 0:10:36.320
<v Speaker 2>talk about that, making choices between alternatives.

0:10:36.760 --> 0:10:39.439
<v Speaker 1>Yeah, I gotta say, I love your perspective because it's

0:10:39.679 --> 0:10:43.160
<v Speaker 1>very much in line with my perspective, and I

0:10:43.200 --> 0:10:45.720
<v Speaker 1>have framed it a little bit differently. I've called it

0:10:45.800 --> 0:10:48.280
<v Speaker 1>cybernetic free will, but I think we're talking about the

0:10:48.280 --> 0:10:51.880
<v Speaker 1>same thing, you know, where, as cybernetic systems that can

0:10:51.880 --> 0:10:55.000
<v Speaker 1>consciously represent the distance between our starting state and our

0:10:55.000 --> 0:10:58.560
<v Speaker 1>goal state, we can try to consciously develop strategies to

0:10:58.840 --> 0:11:01.160
<v Speaker 1>reduce that gap so we can get to where we

0:11:01.200 --> 0:11:03.360
<v Speaker 1>want to go. I mean, I studied, I studied with

0:11:03.400 --> 0:11:06.400
<v Speaker 1>Herbert Simon, so I came from that whole, that whole

0:11:06.440 --> 0:11:08.280
<v Speaker 1>school of you know, goal stuff.

0:11:08.720 --> 0:11:12.360
<v Speaker 2>I love the cybernetic perspective because it underlies the Carver

0:11:12.800 --> 0:11:16.720
<v Speaker 2>and Scheier approach, which is the main theoretical perspective that I

0:11:16.840 --> 0:11:18.160
<v Speaker 2>use in my goal research.

0:11:19.120 --> 0:11:22.200
<v Speaker 1>Beautiful, Yeah, beautiful. And I don't know if you follow

0:11:22.559 --> 0:11:25.600
<v Speaker 1>my colleague Colin DeYoung's work at all on cybernetic Big

0:11:25.640 --> 0:11:26.520
<v Speaker 1>five theory.

0:11:26.640 --> 0:11:28.040
<v Speaker 2>Yeah, yeah, I do know his work.

0:11:28.240 --> 0:11:35.320
<v Speaker 1>But cool, yeah, very interesting. So I believe in that perspective,

0:11:35.360 --> 0:11:37.720
<v Speaker 1>and I would just call it cybernetic free will, as

0:11:37.720 --> 0:11:41.560
<v Speaker 1>distinguished from magic free will. But both of us agree,

0:11:41.600 --> 0:11:44.120
<v Speaker 1>I mean, anyone with a sane

0:11:44.160 --> 0:11:48.120
<v Speaker 1>mind agrees that we're not outside the

0:11:48.120 --> 0:11:54.000
<v Speaker 1>causal structure of nature, that even our system two processes

0:11:54.440 --> 0:11:58.160
<v Speaker 1>are caused by a long chain of things

0:11:58.200 --> 0:12:01.520
<v Speaker 1>that obviously lie way outside of our understanding,

0:12:01.559 --> 0:12:05.000
<v Speaker 1>comprehension knowledge, going very very far back, maybe even to

0:12:05.040 --> 0:12:05.600
<v Speaker 1>the Big Bang.

0:12:05.679 --> 0:12:09.679
<v Speaker 2>Right, I would say that the system two processes are

0:12:09.880 --> 0:12:13.920
<v Speaker 2>enabled by lots of stuff, and they're influenced by lots

0:12:13.920 --> 0:12:16.360
<v Speaker 2>of stuff, a lot of it that we don't know about,

0:12:17.200 --> 0:12:21.760
<v Speaker 2>but that the moment for choice comes, and that the

0:12:21.800 --> 0:12:25.720
<v Speaker 2>conscious experience of that's the one I want, flips the

0:12:25.760 --> 0:12:29.920
<v Speaker 2>brain into an implemental state. It crosses the Rubicon, to

0:12:30.120 --> 0:12:34.920
<v Speaker 2>use Peter Gollwitzer's terminology. And so I don't think that

0:12:35.040 --> 0:12:40.000
<v Speaker 2>you can predict what the decision of that self of

0:12:40.000 --> 0:12:42.559
<v Speaker 2>the moment is going to be, because it depends on

0:12:42.600 --> 0:12:47.719
<v Speaker 2>what it wants, and it's a sort of not independent

0:12:47.880 --> 0:12:52.880
<v Speaker 2>but somewhat independent process going on in the universe that,

0:12:53.120 --> 0:12:56.520
<v Speaker 2>again is enabled by all the machinery. But I don't

0:12:56.559 --> 0:13:00.080
<v Speaker 2>think it makes any sense to say me agreeing to

0:13:00.120 --> 0:13:03.079
<v Speaker 2>come on your show was predetermined since the Big Bang,

0:13:03.320 --> 0:13:05.680
<v Speaker 2>you know, or anything else that you or I did today.

0:13:06.240 --> 0:13:08.640
<v Speaker 2>It also doesn't make sense to say that we're magically

0:13:08.679 --> 0:13:13.839
<v Speaker 2>outside of all that physical stuff. Again, we're influenced by

0:13:13.880 --> 0:13:17.640
<v Speaker 2>it and enabled by it. But you know, I think

0:13:17.679 --> 0:13:20.640
<v Speaker 2>we have a certain degree of independence from it, and

0:13:20.679 --> 0:13:24.119
<v Speaker 2>that's the independence that allows us to make terrible mistakes.

0:13:24.920 --> 0:13:28.120
<v Speaker 2>We can even commit suicide. No other animal does that,

0:13:28.440 --> 0:13:32.600
<v Speaker 2>you know. I mean, that's the ultimate insult to personal survival.

0:13:32.640 --> 0:13:35.960
<v Speaker 2>We can get so confused that we could take our

0:13:36.000 --> 0:13:39.559
<v Speaker 2>own life, you know. And that might be, actually I

0:13:39.600 --> 0:13:41.959
<v Speaker 2>don't think it is, but the ultimate act of free

0:13:41.960 --> 0:13:46.440
<v Speaker 2>will is just to deny yourself existence. But more

0:13:46.520 --> 0:13:49.720
<v Speaker 2>probably it's people who have a lot of emotional

0:13:49.800 --> 0:13:55.240
<v Speaker 2>problems that are controlling their decision making to an unhealthy extent.

0:13:55.679 --> 0:14:00.120
<v Speaker 1>Well, I mean, couldn't someone argue that? Probabilistically speaking, Once

0:14:00.160 --> 0:14:05.680
<v Speaker 1>you start adding up enough factors external, environmental and internal factors,

0:14:05.960 --> 0:14:09.040
<v Speaker 1>suicide ideation starts to get more and more and more

0:14:09.080 --> 0:14:13.040
<v Speaker 1>and more likely. You know, certainly there are still causes,

0:14:13.160 --> 0:14:14.839
<v Speaker 1>you know, that allow us to do something. I mean,

0:14:14.880 --> 0:14:17.840
<v Speaker 1>we're in the social sciences. We don't think we

0:14:17.840 --> 0:14:20.080
<v Speaker 1>can't predict things at all, and I just think there

0:14:20.080 --> 0:14:24.640
<v Speaker 1>are some things that have very high correlations, like so strong.

0:14:24.960 --> 0:14:27.760
<v Speaker 1>For instance, if you find someone, I just thought of

0:14:27.760 --> 0:14:30.240
<v Speaker 1>this example on the spot, but I was thinking, you

0:14:30.240 --> 0:14:34.000
<v Speaker 1>find someone who's a devout vegetarian or a devout vegan

0:14:34.120 --> 0:14:37.960
<v Speaker 1>for thirty years, you can predict if you go to

0:14:38.000 --> 0:14:40.560
<v Speaker 1>a restaurant, you know they're not going to order that cheeseburger,

0:14:40.760 --> 0:14:43.200
<v Speaker 1>like I guess. The question is if you can predict

0:14:43.200 --> 0:14:45.440
<v Speaker 1>that with pretty much one hundred percent accuracy, are you

0:14:45.480 --> 0:14:47.440
<v Speaker 1>saying that vegan person had no free will in that

0:14:47.480 --> 0:14:50.040
<v Speaker 1>matter to choose the burger? I guess they could have.

0:14:50.520 --> 0:14:53.240
<v Speaker 1>I guess they could have, but they could have. We

0:14:53.280 --> 0:14:55.960
<v Speaker 1>could predict they wouldn't unless they thought of unless they

0:14:56.040 --> 0:14:58.040
<v Speaker 1>knew that we were trying to predict them, and then

0:14:58.080 --> 0:15:00.520
<v Speaker 1>they wanted to consciously defy us, and then maybe that

0:15:00.640 --> 0:15:01.760
<v Speaker 1>was their active free will.

0:15:02.360 --> 0:15:05.000
<v Speaker 2>I don't know, well, I don't know if it's fair

0:15:05.200 --> 0:15:08.360
<v Speaker 2>to say that when we are able to know what

0:15:08.400 --> 0:15:12.440
<v Speaker 2>we want and consistently do it, that we are determined

0:15:12.640 --> 0:15:15.960
<v Speaker 2>to do that thing. It's more that you're able to

0:15:16.040 --> 0:15:19.400
<v Speaker 2>detect and follow your own preferences. And so that person

0:15:19.560 --> 0:15:22.440
<v Speaker 2>knows they're not going to eat meat. It's not like

0:15:22.520 --> 0:15:25.520
<v Speaker 2>they have a compulsion not to eat meat, or maybe

0:15:25.520 --> 0:15:27.640
<v Speaker 2>they sort of do it, maybe it's kind of gross

0:15:27.680 --> 0:15:30.440
<v Speaker 2>to them, But I would put it that they have

0:15:30.520 --> 0:15:34.520
<v Speaker 2>a very strongly internalized value of not eating meat, and

0:15:34.560 --> 0:15:39.520
<v Speaker 2>that they express that value in their choices. And you know,

0:15:39.560 --> 0:15:42.240
<v Speaker 2>if suddenly they were starving and meat was all there was,

0:15:42.320 --> 0:15:47.560
<v Speaker 2>I'm sure they could go against that value or overcome

0:15:47.720 --> 0:15:51.520
<v Speaker 2>that determinism in favor of staying alive.

0:15:52.280 --> 0:15:55.040
<v Speaker 1>Wow. Yeah, Okay, this is a good point. You use

0:15:55.080 --> 0:15:57.360
<v Speaker 1>the word compulsion. You know, a lot of philosophers of

0:15:57.360 --> 0:16:01.840
<v Speaker 1>free will will argue from a compatibilist point of view

0:16:01.840 --> 0:16:04.480
<v Speaker 1>that we have freedom as long as we're not

0:16:05.280 --> 0:16:08.280
<v Speaker 1>compulsively you know, like there's a number of conditions that

0:16:08.320 --> 0:16:10.120
<v Speaker 1>have to be met, and one of them is, you know,

0:16:10.160 --> 0:16:14.320
<v Speaker 1>not internal distractions that cause us to have compulsions or

0:16:15.040 --> 0:16:18.440
<v Speaker 1>being externally controlled, like being tied down, right, and

0:16:18.440 --> 0:16:20.680
<v Speaker 1>being told you must eat meat right or something like

0:16:20.680 --> 0:16:25.400
<v Speaker 1>that. That was a weird example, very weird, very weird.

0:16:25.440 --> 0:16:30.040
<v Speaker 2>But anyway, actually, my grandfather, my father

0:16:30.120 --> 0:16:32.720
<v Speaker 2>in law did that to my daughter one time at

0:16:32.840 --> 0:16:38.520
<v Speaker 2>Thanksgiving dinner. "You must eat." She was a vegetarian. Quite obnoxious,

0:16:38.600 --> 0:16:40.480
<v Speaker 2>let me tell you. Wow.

0:16:40.640 --> 0:16:42.560
<v Speaker 1>Wow. I hope the young lady did not grow up

0:16:42.560 --> 0:16:44.360
<v Speaker 1>with PTSD from that.

0:16:44.840 --> 0:16:47.520
<v Speaker 2>Well. I would answer this this line of questioning by

0:16:47.760 --> 0:16:52.520
<v Speaker 2>referring to self-determination theory, which underlies most of

0:16:52.520 --> 0:16:56.600
<v Speaker 2>my work. And it makes a distinction between feeling a

0:16:56.680 --> 0:17:01.720
<v Speaker 2>compulsion to do something, it calls that introjected motivation, feeling

0:17:01.720 --> 0:17:06.280
<v Speaker 2>guilty, making yourself do it, and internal motivation or

0:17:06.320 --> 0:17:11.280
<v Speaker 2>internalized, where you're fully standing behind it. And so

0:17:11.760 --> 0:17:15.239
<v Speaker 2>if you are acting with a compulsion, it's true you

0:17:15.320 --> 0:17:18.159
<v Speaker 2>do not feel as free. And that is, you know,

0:17:18.160 --> 0:17:22.200
<v Speaker 2>a long term finding of self determination theory. And when

0:17:22.200 --> 0:17:25.520
<v Speaker 2>you act with that sense of not being free, it

0:17:25.680 --> 0:17:29.000
<v Speaker 2>negatively affects your well being. So it's important to have

0:17:29.040 --> 0:17:32.960
<v Speaker 2>a sense of acting freely. And I take a little

0:17:32.960 --> 0:17:35.040
<v Speaker 2>bit of a stronger position in the book, and I

0:17:35.119 --> 0:17:39.960
<v Speaker 2>say that it's always us making the choices, whether we

0:17:40.080 --> 0:17:43.520
<v Speaker 2>feel compelled to do them or not. We're

0:17:43.560 --> 0:17:45.520
<v Speaker 2>signing off on them. And you know, it's sort of

0:17:45.520 --> 0:17:50.000
<v Speaker 2>that existential perspective, you know, that of Jean-Paul Sartre,

0:17:50.119 --> 0:17:53.439
<v Speaker 2>where you're responsible no matter what. You reveal who you

0:17:53.480 --> 0:17:57.920
<v Speaker 2>are by your choices. Choice is scary. You know, people might

0:17:57.960 --> 0:18:01.880
<v Speaker 2>try to avoid choice or escape from freedom,

0:18:02.040 --> 0:18:08.199
<v Speaker 2>in Erich Fromm's terms. Nevertheless, we're always responsible. And so

0:18:08.760 --> 0:18:10.760
<v Speaker 2>I don't think we're off the hook by saying I

0:18:10.800 --> 0:18:13.800
<v Speaker 2>felt compelled to do that. I think the answer to

0:18:13.840 --> 0:18:17.639
<v Speaker 2>that is, well, that's sort of an immature position to

0:18:17.720 --> 0:18:22.840
<v Speaker 2>be living your life from, and you should

0:18:22.840 --> 0:18:25.359
<v Speaker 2>try to get over that, and with some work, maybe

0:18:25.400 --> 0:18:27.240
<v Speaker 2>some therapy, hopefully you can.

0:18:28.040 --> 0:18:30.920
<v Speaker 1>Yeah, this tying it to self determination theory is fascinating.

0:18:31.080 --> 0:18:33.040
<v Speaker 1>You know, you can hear the skeptics. I can hear

0:18:33.040 --> 0:18:37.400
<v Speaker 1>their voice in my head saying, like, Okay, well, feeling

0:18:37.440 --> 0:18:39.400
<v Speaker 1>like you have free will doesn't mean you have free will,

0:18:39.800 --> 0:18:43.040
<v Speaker 1>like just feeling like you're autonomous. You know, it doesn't

0:18:43.040 --> 0:18:45.280
<v Speaker 1>take away the fact that we don't. I mean, it's

0:18:45.280 --> 0:18:47.719
<v Speaker 1>just a mirage, it's just an illusion. But then you know,

0:18:47.800 --> 0:18:49.360
<v Speaker 1>you come back and you say, I mean, I think

0:18:49.359 --> 0:18:52.000
<v Speaker 1>I could do like the whole artificial intelligence chat between

0:18:52.040 --> 0:18:55.320
<v Speaker 1>you and the other person. Even so, let

0:18:55.359 --> 0:18:58.840
<v Speaker 1>me allow you to actually say it. I'm trying to

0:18:58.840 --> 0:19:00.960
<v Speaker 1>predict what you would say right now. Is there anything there,

0:19:01.240 --> 0:19:03.160
<v Speaker 1>from a meta level of this discussion?

0:19:03.520 --> 0:19:05.960
<v Speaker 2>Just rephrase the question real quick for me. No.

0:19:06.080 --> 0:19:08.520
<v Speaker 1>I mean, I'm just seeing a conversation unfold between you

0:19:08.600 --> 0:19:11.359
<v Speaker 1>and a skeptic, or not a skeptic, but a determinist,

0:19:11.400 --> 0:19:14.840
<v Speaker 1>a hard determinist, or

0:19:14.880 --> 0:19:17.679
<v Speaker 1>even a soft determinist, you know, within the field, who's like, well,

0:19:17.680 --> 0:19:21.159
<v Speaker 1>you know, just feeling like you're free doesn't make you free.

0:19:21.480 --> 0:19:23.720
<v Speaker 1>It's you didn't have any control, You didn't have any

0:19:23.800 --> 0:19:25.320
<v Speaker 1>choice over feeling free anyway.

0:19:25.640 --> 0:19:28.960
<v Speaker 2>Yeah. Yeah, Well, that's a funny thing, that self determination

0:19:29.119 --> 0:19:32.240
<v Speaker 2>theory has not directly addressed the free will question. It

0:19:32.359 --> 0:19:34.920
<v Speaker 2>has just said that the feeling of freedom is important.

0:19:35.560 --> 0:19:39.280
<v Speaker 2>And so I am taking that additional step, and I'm

0:19:39.320 --> 0:19:44.040
<v Speaker 2>even making the quite radical argument, and I'm not fully

0:19:44.119 --> 0:19:47.000
<v Speaker 2>standing behind it, but I think it's plausible, the radical

0:19:47.080 --> 0:19:49.879
<v Speaker 2>argument that free will is an evolved capacity of the

0:19:49.960 --> 0:19:54.480
<v Speaker 2>human mind, again defining this as the capacity to ask ourselves what

0:19:54.560 --> 0:19:57.720
<v Speaker 2>we want and make a choice and I think that,

0:19:57.880 --> 0:20:02.639
<v Speaker 2>you know, given the complexity of you know, the social

0:20:02.720 --> 0:20:05.480
<v Speaker 2>environments that we face, and you know, we have to

0:20:05.520 --> 0:20:07.119
<v Speaker 2>be able to make it up on the fly and

0:20:07.200 --> 0:20:10.480
<v Speaker 2>figure out what's going on and make decisions constantly. And

0:20:10.520 --> 0:20:13.320
<v Speaker 2>so I think that's the capacity that we evolved, or

0:20:13.320 --> 0:20:16.320
<v Speaker 2>at least to ask ourselves, what do I want? Okay,

0:20:16.359 --> 0:20:20.479
<v Speaker 2>I'll do this. That doesn't mean that there were no

0:20:20.560 --> 0:20:23.320
<v Speaker 2>constraints on what we were doing, you know, of course,

0:20:23.640 --> 0:20:26.000
<v Speaker 2>you know, if I was hungry, I might choose

0:20:26.040 --> 0:20:28.199
<v Speaker 2>to eat instead of you know, going to the library

0:20:28.280 --> 0:20:31.720
<v Speaker 2>and checking out the book. But this point you raised about,

0:20:32.320 --> 0:20:35.439
<v Speaker 2>that we don't know why we want something. We

0:20:35.480 --> 0:20:38.280
<v Speaker 2>don't know why we made that choice, and so we

0:20:38.320 --> 0:20:42.080
<v Speaker 2>can't be free. I do take exception to that, because

0:20:42.920 --> 0:20:45.840
<v Speaker 2>it makes it so the only free being is an

0:20:45.920 --> 0:20:51.920
<v Speaker 2>omniscient entity that knows everything about itself, and of course

0:20:52.000 --> 0:20:54.440
<v Speaker 2>we are not that. I don't think anything is that.

0:20:55.119 --> 0:20:58.399
<v Speaker 2>If you insist that that's what free will means, then

0:20:58.480 --> 0:21:00.480
<v Speaker 2>I have to agree with you. It doesn't exist,

0:21:00.640 --> 0:21:05.480
<v Speaker 2>because we don't know everything about ourselves, but we know enough,

0:21:05.560 --> 0:21:09.840
<v Speaker 2>hopefully to keep sailing the ship, you know, through the

0:21:09.960 --> 0:21:14.439
<v Speaker 2>seas of our lives, hopefully in some coherent direction towards

0:21:14.600 --> 0:21:18.479
<v Speaker 2>meaningful goals that we've chosen. Do we know

0:21:18.560 --> 0:21:21.439
<v Speaker 2>why we've chosen those goals? No, we don't know the

0:21:21.480 --> 0:21:25.719
<v Speaker 2>brain processes, you know, the neurons that summed and

0:21:25.840 --> 0:21:29.200
<v Speaker 2>created that choice. But on the other hand,

0:21:29.240 --> 0:21:31.919
<v Speaker 2>we do know why we made that choice. That's the

0:21:31.960 --> 0:21:34.840
<v Speaker 2>one we thought we wanted, and only we and our

0:21:34.960 --> 0:21:38.119
<v Speaker 2>conscious experience at that moment were in a position to

0:21:38.280 --> 0:21:41.960
<v Speaker 2>make that determination. So it's sort of tricky, you know.

0:21:42.240 --> 0:21:45.520
<v Speaker 2>This is the kind of compatibilism that I'm

0:21:45.520 --> 0:21:46.159
<v Speaker 2>talking about.

0:21:46.920 --> 0:21:48.800
<v Speaker 1>Well, did you get any chance to watch my

0:21:48.800 --> 0:21:51.400
<v Speaker 1>two part series with Sam Harris on free Will?

0:21:51.920 --> 0:21:53.560
<v Speaker 2>I did not get a chance to, but it's on

0:21:53.600 --> 0:21:55.080
<v Speaker 2>my list for the Christmas break.

0:21:55.640 --> 0:21:58.760
<v Speaker 1>Amazing, because we kind of get at it a

0:21:58.760 --> 0:22:01.240
<v Speaker 1>little bit. We go at it a little bit.

0:22:01.480 --> 0:22:03.119
<v Speaker 1>You know, he's a friend, but you know we have

0:22:03.160 --> 0:22:06.680
<v Speaker 1>different opinions on this matter, and I can well imagine, Yeah,

0:22:06.720 --> 0:22:09.280
<v Speaker 1>there was this sticking point where you know, he was

0:22:09.280 --> 0:22:11.760
<v Speaker 1>talking about cello, and how I love cello and how

0:22:11.760 --> 0:22:13.240
<v Speaker 1>I'd love to teach him how to play cello, and

0:22:13.280 --> 0:22:15.000
<v Speaker 1>he said, I have no choice in the matter, I

0:22:15.040 --> 0:22:17.399
<v Speaker 1>do not want to play cello. He's like, I do

0:22:17.520 --> 0:22:20.280
<v Speaker 1>not have any motivation to play cello, so therefore I

0:22:20.320 --> 0:22:22.240
<v Speaker 1>have no free will in playing cello. And I said,

0:22:22.600 --> 0:22:25.720
<v Speaker 1>that's ridiculous. Like we can do things we're not motivated

0:22:25.760 --> 0:22:27.760
<v Speaker 1>to want to do, right, we can. That's the will,

0:22:28.119 --> 0:22:30.520
<v Speaker 1>you know. That's like, you know, we can. You know,

0:22:30.640 --> 0:22:32.919
<v Speaker 1>just because our system one is telling us we're not

0:22:32.960 --> 0:22:34.959
<v Speaker 1>interested in something doesn't mean we can't use system two

0:22:35.040 --> 0:22:37.560
<v Speaker 1>to push through something. It's not like I want to

0:22:37.640 --> 0:22:39.639
<v Speaker 1>motivate to go to the gym every time. But he

0:22:39.720 --> 0:22:41.480
<v Speaker 1>just didn't see it that way, you know, and it

0:22:41.520 --> 0:22:44.520
<v Speaker 1>was a little frustrating, probably for both of us in

0:22:44.560 --> 0:22:45.560
<v Speaker 1>that conversation.

0:22:46.000 --> 0:22:49.439
<v Speaker 2>I can imagine the way I could see Sam coming

0:22:49.480 --> 0:22:53.080
<v Speaker 2>to want to play the cello is if he's got

0:22:53.080 --> 0:22:56.879
<v Speaker 2>this self concept of I'm not a musical guy. Yet

0:22:57.040 --> 0:23:00.720
<v Speaker 2>at some level of him there are talent circuits that

0:23:00.760 --> 0:23:05.040
<v Speaker 2>have been undeveloped. And after his conversation with you, perhaps

0:23:05.040 --> 0:23:08.440
<v Speaker 2>he would have started to think a little bit about, well,

0:23:08.480 --> 0:23:11.040
<v Speaker 2>you know what, that's not such a bad idea. I've

0:23:11.240 --> 0:23:13.439
<v Speaker 2>always kind of half thought it might be fun to

0:23:14.000 --> 0:23:16.800
<v Speaker 2>learn a musical instrument. And you know, the cello

0:23:16.880 --> 0:23:20.240
<v Speaker 2>is really pretty nice. He might start listening to cello pieces,

0:23:20.760 --> 0:23:22.840
<v Speaker 2>and you could come around to wanting to play the cello,

0:23:23.520 --> 0:23:27.920
<v Speaker 2>and it wouldn't be that he was overcoming his system

0:23:27.960 --> 0:23:31.679
<v Speaker 2>one resistance. It would be that he was overcoming his

0:23:32.800 --> 0:23:37.119
<v Speaker 2>inaccurate self concept. So that's one way to sort of

0:23:37.119 --> 0:23:37.879
<v Speaker 2>flip it around.

0:23:38.680 --> 0:23:42.040
<v Speaker 1>Yeah, that's a very interesting reframing. You know, there's a

0:23:42.119 --> 0:23:44.320
<v Speaker 1>quote that I love from William James where he said,

0:23:44.359 --> 0:23:46.679
<v Speaker 1>my first act of free will shall be to believe

0:23:46.680 --> 0:23:48.600
<v Speaker 1>in free will. I'm sure you're familiar.

0:23:48.280 --> 0:23:49.800
<v Speaker 2>Yeah, yeah, I've heard it.

0:23:50.160 --> 0:23:52.200
<v Speaker 1>I'm sure you have. I'm sure you have. The context

0:23:52.240 --> 0:23:55.840
<v Speaker 1>surrounding that quote is really even more interesting because he

0:23:55.920 --> 0:23:59.399
<v Speaker 1>suffered with depression and even some suicidal thoughts in his life,

0:23:59.720 --> 0:24:03.760
<v Speaker 1>deep deep depression. And apparently after he said that and

0:24:03.840 --> 0:24:09.119
<v Speaker 1>made that deliberation, his whole life changed. And it's fascinating

0:24:09.119 --> 0:24:13.200
<v Speaker 1>to think the extent to which those beliefs matter within

0:24:13.240 --> 0:24:15.680
<v Speaker 1>our own head. I mean, Sam, by the way, Sam

0:24:15.720 --> 0:24:18.320
<v Speaker 1>Harris would be the first one to argue that belief matters.

0:24:18.320 --> 0:24:22.360
<v Speaker 1>I mean, that's why he rails against religions, fundamentalist religions

0:24:22.359 --> 0:24:25.760
<v Speaker 1>that maybe have some dangerous ideologies. You know, obviously the

0:24:25.800 --> 0:24:29.760
<v Speaker 1>contents of our mind can influence greatly how we act

0:24:29.760 --> 0:24:32.879
<v Speaker 1>in the world, in our agency. So it just seems

0:24:32.920 --> 0:24:35.439
<v Speaker 1>like you're arguing in a lot of ways that the

0:24:35.480 --> 0:24:39.200
<v Speaker 1>stories that we tell ourselves with the symbolic mind, that

0:24:39.200 --> 0:24:41.199
<v Speaker 1>that matters for our free will. It sounds like that's

0:24:41.200 --> 0:24:42.680
<v Speaker 1>a big part of what you're saying. Is that right?

0:24:42.960 --> 0:24:47.480
<v Speaker 2>It is because in order to make choices between the

0:24:47.560 --> 0:24:51.360
<v Speaker 2>alternatives that we have called into our minds, there has

0:24:51.440 --> 0:24:53.760
<v Speaker 2>to be a sense of being a person who cares.

0:24:54.680 --> 0:24:58.120
<v Speaker 2>And I call this the symbolic self. I draw from

0:24:58.119 --> 0:25:02.640
<v Speaker 2>some work by Constantine Sedikides and John Skowronski on

0:25:02.720 --> 0:25:07.119
<v Speaker 2>the evolution of the symbolic self, a fascinating nineteen ninety

0:25:07.119 --> 0:25:10.280
<v Speaker 2>seven article, and they said that in order for us

0:25:10.359 --> 0:25:14.520
<v Speaker 2>to interact in this complex social world with language, all

0:25:14.560 --> 0:25:18.880
<v Speaker 2>these other humans, we needed to evolve a sense of

0:25:18.920 --> 0:25:23.040
<v Speaker 2>being a person with a history and preferences. And we

0:25:23.240 --> 0:25:27.520
<v Speaker 2>feel ourselves being that person, even though in a sense it's

0:25:27.760 --> 0:25:30.719
<v Speaker 2>just a hallucination. You know, it's a product of our minds.

0:25:31.600 --> 0:25:35.080
<v Speaker 2>But I'm making the argument that it has top down

0:25:35.240 --> 0:25:39.520
<v Speaker 2>control, or some degree of influence at least, over the

0:25:39.600 --> 0:25:45.600
<v Speaker 2>choices that get made according to its own preferences. So

0:25:45.640 --> 0:25:48.399
<v Speaker 2>it comes down to this sort of hierarchical perspective on

0:25:48.520 --> 0:25:51.840
<v Speaker 2>the organization of matter that I talk about in one

0:25:51.880 --> 0:25:52.960
<v Speaker 2>of the chapters.

0:25:52.560 --> 0:25:54.560
<v Speaker 3>And I don't know if you want to

0:25:54.560 --> 0:25:58.800
<v Speaker 3>get into that, but I'll just say briefly that I

0:25:58.840 --> 0:26:03.480
<v Speaker 3>think that, in the process of life evolving,

0:26:04.280 --> 0:26:08.640
<v Speaker 3>more and more complex levels of control have showed up,

0:26:09.119 --> 0:26:13.480
<v Speaker 3>and each one has been selected for because it regulates

0:26:13.520 --> 0:26:16.920
<v Speaker 3>the level below. Right, So when the first cell had

0:26:16.960 --> 0:26:20.240
<v Speaker 3>to have the capacity to regulate the chemicals coming in

0:26:20.280 --> 0:26:23.719
<v Speaker 3>and out of the cell, the first multicellular organism had

0:26:23.760 --> 0:26:27.320
<v Speaker 3>to have a capacity to regulate how the different cells

0:26:27.359 --> 0:26:28.680
<v Speaker 3>were interacting with each other.

0:26:29.240 --> 0:26:33.040
<v Speaker 2>Eventually we've got a nervous system that takes control of that.

0:26:33.600 --> 0:26:35.520
<v Speaker 2>And so I follow that logic all the way up,

0:26:35.560 --> 0:26:38.000
<v Speaker 2>and I say, well, you've got a brain, and then

0:26:38.080 --> 0:26:42.640
<v Speaker 2>you've got cognitive processes controlling that brain, and then you've

0:26:42.640 --> 0:26:48.080
<v Speaker 2>got self processes controlling the cognitive processes. So that's where

0:26:48.119 --> 0:26:52.120
<v Speaker 2>our own sort of agency originates. But then it continues

0:26:52.160 --> 0:26:55.560
<v Speaker 2>up from there. We are nested within social relationships and

0:26:56.440 --> 0:27:01.000
<v Speaker 2>social spheres and those can try to control us. And

0:27:01.040 --> 0:27:03.679
<v Speaker 2>that's a lot of what self determination theory is about,

0:27:03.880 --> 0:27:08.240
<v Speaker 2>is how we deal with feeling controlled by other people.

0:27:08.920 --> 0:27:12.320
<v Speaker 2>And it says people with power over us should support

0:27:12.359 --> 0:27:15.720
<v Speaker 2>our autonomy so that we can maximize our sense of

0:27:15.760 --> 0:27:20.600
<v Speaker 2>self determination and get the benefits of being embedded in

0:27:20.640 --> 0:27:24.880
<v Speaker 2>these social relationships instead of just being oppressed by them.

0:27:25.200 --> 0:27:29.440
<v Speaker 2>So it's a really unique, I think, way of framing

0:27:29.560 --> 0:27:32.320
<v Speaker 2>kind of an old idea that goes back to the

0:27:32.440 --> 0:27:37.359
<v Speaker 2>Comte, the French theorist in the nineteenth century: the hierarchy of

0:27:37.359 --> 0:27:41.520
<v Speaker 2>the sciences, And this is something I believe pretty strongly

0:27:41.680 --> 0:27:46.280
<v Speaker 2>that each new level of organization builds upon the rest,

0:27:46.440 --> 0:27:51.159
<v Speaker 2>but it is somewhat independent of what's below, given its

0:27:51.240 --> 0:27:55.359
<v Speaker 2>ability to regulate what happens down below. So if I

0:27:55.400 --> 0:27:57.639
<v Speaker 2>decide to go off for a run this morning, I

0:27:57.640 --> 0:28:00.920
<v Speaker 2>don't usually do that, but suppose I did. Suddenly

0:28:01.000 --> 0:28:03.399
<v Speaker 2>all the physical stuff in my body is having to

0:28:03.480 --> 0:28:07.359
<v Speaker 2>deal with the stress of running, the physiology, and that

0:28:07.440 --> 0:28:11.720
<v Speaker 2>was all determined by a conscious decision I made that morning.

0:28:12.119 --> 0:28:14.560
<v Speaker 2>And so the hard headed skeptic is going to say, well,

0:28:14.560 --> 0:28:16.880
<v Speaker 2>you don't know why you chose to run and I

0:28:16.920 --> 0:28:20.240
<v Speaker 2>could say, well, I don't know where that idea came from,

0:28:20.320 --> 0:28:23.640
<v Speaker 2>but I liked it and I picked it. That's why

0:28:23.800 --> 0:28:24.400
<v Speaker 2>I ran.

0:28:29.280 --> 0:28:31.280
<v Speaker 1>In kind of a nutshell, you're just saying you don't

0:28:31.359 --> 0:28:36.080
<v Speaker 1>like reductionism. You don't like taking the mind

0:28:36.240 --> 0:28:37.359
<v Speaker 1>and reducing it to the brain.

0:28:37.400 --> 0:28:41.800
<v Speaker 2>Yeah, I think reductionism is hugely important, and

0:28:41.840 --> 0:28:45.440
<v Speaker 2>it's taught us so much. But I think it's mainly

0:28:45.640 --> 0:28:49.320
<v Speaker 2>useful when the system is breaking down. You say, oh,

0:28:48.800 --> 0:28:52.360
<v Speaker 2>they have sickle cell anemia. What's wrong? You know, the

0:28:52.360 --> 0:28:55.360
<v Speaker 2>blood system isn't working. Well, the cells that compose that

0:28:55.400 --> 0:29:00.320
<v Speaker 2>system are misshapen. Or a person. I talk about the

0:29:00.320 --> 0:29:04.320
<v Speaker 2>example of Charles Whitman in the sixties who shot a

0:29:04.360 --> 0:29:07.360
<v Speaker 2>bunch of people from a bell tower in Texas and

0:29:07.400 --> 0:29:10.160
<v Speaker 2>it turned out he had a tumor in his brain. Well,

0:29:10.440 --> 0:29:14.360
<v Speaker 2>he wasn't able to make a good free choice because

0:29:14.360 --> 0:29:18.440
<v Speaker 2>his brain didn't support it. And so I would say

0:29:18.440 --> 0:29:22.840
<v Speaker 2>that reductionism mainly works when you're trying to explain why

0:29:22.880 --> 0:29:26.000
<v Speaker 2>the system has failed. But then if you're going to

0:29:26.040 --> 0:29:29.640
<v Speaker 2>talk about what the system does, you know, then you

0:29:29.720 --> 0:29:31.960
<v Speaker 2>need to know what's going on at the higher level.

0:29:32.280 --> 0:29:34.360
<v Speaker 2>When I drive out to the interstate, I either go

0:29:34.480 --> 0:29:38.480
<v Speaker 2>left or right. None of my machinery determines which way

0:29:38.480 --> 0:29:42.400
<v Speaker 2>it is. It's my intention to go to either Saint

0:29:42.440 --> 0:29:47.680
<v Speaker 2>Louis or Kansas City. That's controlling my machinery in that case.

0:29:48.400 --> 0:29:52.040
<v Speaker 1>Yeah, I assume you've read the book Freedom Evolves by

0:29:52.120 --> 0:29:52.880
<v Speaker 1>Daniel Dennett.

0:29:53.440 --> 0:29:59.000
<v Speaker 2>I have leafed through it. It's pretty dense stuff, but a

0:29:59.000 --> 0:30:00.000
<v Speaker 2>lot of good ideas.

0:30:00.920 --> 0:30:06.040
<v Speaker 1>It seems very compatible with your argument,

0:30:06.080 --> 0:30:08.479
<v Speaker 1>except you put up more of a

0:30:08.480 --> 0:30:12.200
<v Speaker 1>psychological machinery around it. I mean, he just argues that

0:30:12.240 --> 0:30:14.760
<v Speaker 1>we have a lot of these abilities, abilities that

0:30:14.760 --> 0:30:17.040
<v Speaker 1>we evolved that give us a free will worth wanting.

0:30:17.600 --> 0:30:20.520
<v Speaker 1>And that's essentially what you're saying. But you are able

0:30:20.520 --> 0:30:22.239
<v Speaker 1>to flesh that out in a lot of ways with

0:30:22.600 --> 0:30:28.040
<v Speaker 1>neuroscience and with self determination theory. Your groundbreaking research

0:30:28.080 --> 0:30:31.400
<v Speaker 1>on goals. Yeah, your work has really been

0:30:31.840 --> 0:30:34.800
<v Speaker 1>very impactful in the field of positive psychology, field of

0:30:34.840 --> 0:30:38.240
<v Speaker 1>psychology overall. It's it's certainly influenced my work as well,

0:30:38.280 --> 0:30:41.360
<v Speaker 1>I should say, and the work you've done, the breadth

0:30:41.360 --> 0:30:44.680
<v Speaker 1>of it, on goals as well as well being,

0:30:45.800 --> 0:30:48.440
<v Speaker 1>and particularly the link between goals and well being is

0:30:48.480 --> 0:30:50.040
<v Speaker 1>kind of your niche, that's.

0:30:50.120 --> 0:30:51.800
<v Speaker 2>That's your thing.

0:30:51.880 --> 0:30:54.400
<v Speaker 1>Yeah, I know, I see you, I see you, I

0:30:54.440 --> 0:30:57.000
<v Speaker 1>see you, I get you. And it's really made a

0:30:57.040 --> 0:30:59.840
<v Speaker 1>substantial impact. So to be in the field of psychology

0:30:59.880 --> 0:31:01.840
<v Speaker 1>to be able to link that to the free will debate,

0:31:01.880 --> 0:31:05.320
<v Speaker 1>it just was super fascinating to me to read. I

0:31:05.360 --> 0:31:07.240
<v Speaker 1>will say one of the things that was super fascinating

0:31:07.280 --> 0:31:09.560
<v Speaker 1>for me to read as well was your discussion of

0:31:09.600 --> 0:31:14.000
<v Speaker 1>the default mode network, because I've been studying the default

0:31:14.000 --> 0:31:18.080
<v Speaker 1>mode network with my colleagues and linking it to creative thinking,

0:31:18.800 --> 0:31:23.320
<v Speaker 1>and you don't make that direct link, but you have

0:31:23.400 --> 0:31:25.240
<v Speaker 1>a chapter on the default mode network, and you do

0:31:25.280 --> 0:31:28.600
<v Speaker 1>have a separate chapter on creativity, and so maybe we

0:31:28.640 --> 0:31:31.160
<v Speaker 1>can make that link together right now more explicit.

0:31:31.680 --> 0:31:34.480
<v Speaker 2>Well, I'm actually working on a review chapter right now

0:31:34.520 --> 0:31:40.320
<v Speaker 2>with a self determination theory neuroscientist named Woogul Lee, where

0:31:40.360 --> 0:31:43.320
<v Speaker 2>we're making that connection. We're asking the question, how do

0:31:43.360 --> 0:31:46.920
<v Speaker 2>we decide what to do next? And we're outlining that

0:31:47.000 --> 0:31:50.840
<v Speaker 2>question using control theory and the cybernetic perspective, but we're

0:31:50.880 --> 0:31:55.520
<v Speaker 2>also looking at the brain processes and we're concluding that

0:31:55.880 --> 0:32:01.560
<v Speaker 2>there's interplay between the central executive network, salience network, and the

0:32:01.560 --> 0:32:05.920
<v Speaker 2>default mode network. There's back and forth between the

0:32:06.560 --> 0:32:11.880
<v Speaker 2>prefrontal cortex and consciousness. A question arises, say,

0:32:11.880 --> 0:32:14.040
<v Speaker 2>in a creator's mind when they're trying to solve a

0:32:14.120 --> 0:32:16.560
<v Speaker 2>problem, and we don't know the answer to that question,

0:32:16.680 --> 0:32:20.600
<v Speaker 2>but the fact that it was a conscious experience broadcasts

0:32:20.640 --> 0:32:24.680
<v Speaker 2>it to the broader or lower mind and starts to

0:32:24.720 --> 0:32:27.680
<v Speaker 2>work on it at a non conscious level. And the

0:32:28.480 --> 0:32:31.640
<v Speaker 2>default mode network processes are a big part of that. And

0:32:31.680 --> 0:32:35.640
<v Speaker 2>then the salience network at one point sort of notices WHOA,

0:32:35.800 --> 0:32:39.440
<v Speaker 2>some thought that's just sort of wandered across the default

0:32:39.480 --> 0:32:44.320
<v Speaker 2>mode view screen. It says that's important, and then we say, whoa,

0:32:44.320 --> 0:32:48.200
<v Speaker 2>And then we can make a choice and maybe realize

0:32:48.240 --> 0:32:50.720
<v Speaker 2>that we need to completely change what we've been doing

0:32:50.800 --> 0:32:53.320
<v Speaker 2>in our life. So in that last chapter in the book,

0:32:53.360 --> 0:32:57.720
<v Speaker 2>I make a connection between the process of changing your

0:32:57.720 --> 0:33:00.760
<v Speaker 2>life adopting a whole new set of goals, and the

0:33:00.840 --> 0:33:11.120
<v Speaker 2>creative process, using that four stage model of Wallas: preparation, incubation, illumination, elaboration.

0:33:11.640 --> 0:33:14.200
<v Speaker 2>And so yes, I am kind of trying to link

0:33:14.240 --> 0:33:17.520
<v Speaker 2>those two chapters right now in this review article that

0:33:17.560 --> 0:33:20.640
<v Speaker 2>we're writing. It is pretty exciting because I think it's

0:33:20.680 --> 0:33:22.680
<v Speaker 2>the original stuff that we're pulling together.

0:33:23.280 --> 0:33:25.120
<v Speaker 1>That's great. Let me know if I can help it all.

0:33:25.400 --> 0:33:26.640
<v Speaker 1>You know, if you just want to even talk to me

0:33:26.640 --> 0:33:29.200
<v Speaker 1>about the research we've conducted on the topic, it might

0:33:29.240 --> 0:33:30.000
<v Speaker 1>be beneficial.

0:33:30.240 --> 0:33:33.520
<v Speaker 2>You know, I would love to send it to you, just

0:33:33.560 --> 0:33:35.920
<v Speaker 2>if you wouldn't mind taking a quick read of this

0:33:36.040 --> 0:33:37.240
<v Speaker 2>draft when we finish.

0:33:37.080 --> 0:33:39.080
<v Speaker 1>it. Yeah, because I'd love to.

0:33:39.360 --> 0:33:41.760
<v Speaker 2>It could have a lot of holes, and your expertise

0:33:41.800 --> 0:33:44.040
<v Speaker 2>would be great to check them.

0:33:44.120 --> 0:33:46.920
<v Speaker 1>I'd love to. I'd love to. Yeah, I'm so excited

0:33:46.920 --> 0:33:49.240
<v Speaker 1>that you're working on that. You know, we have listeners

0:33:49.280 --> 0:33:51.480
<v Speaker 1>who might not be familiar with anything we're really talking

0:33:51.480 --> 0:33:54.080
<v Speaker 1>about right now. So let's talk a little about

0:33:54.200 --> 0:33:57.320
<v Speaker 1>the default mode network. What in the world are we talking about?

0:33:57.440 --> 0:34:00.920
<v Speaker 1>There seems to be a particular brain network that cognitive

0:34:00.920 --> 0:34:04.960
<v Speaker 1>neuroscientists discovered by accident because they didn't really seem to

0:34:05.040 --> 0:34:07.360
<v Speaker 1>care much about what people were doing in the fMRI

0:34:07.560 --> 0:34:09.760
<v Speaker 1>machine when they weren't doing the task that was assigned

0:34:09.760 --> 0:34:11.759
<v Speaker 1>to them, and it seemed to be like, oh wow,

0:34:11.760 --> 0:34:14.520
<v Speaker 1>maybe there's some value to the inner stream of consciousness,

0:34:14.560 --> 0:34:17.480
<v Speaker 1>as William James would put it. How would you characterize

0:34:17.480 --> 0:34:19.680
<v Speaker 1>the major functions of the default mode network, because there's

0:34:19.719 --> 0:34:21.160
<v Speaker 1>a lot of like kind of debate in the field

0:34:21.200 --> 0:34:23.919
<v Speaker 1>about exactly what it does.

0:34:23.920 --> 0:34:26.640
<v Speaker 2>It does a bunch of different things, that's true. I

0:34:26.640 --> 0:34:30.600
<v Speaker 2>think it's involved both in executive control and in mind wandering,

0:34:31.400 --> 0:34:36.080
<v Speaker 2>which makes it hard to disentangle all the different stuff.

0:34:36.640 --> 0:34:39.560
<v Speaker 2>But I think of it as when we're just sort

0:34:39.560 --> 0:34:42.960
<v Speaker 2>of not doing anything in particular. We don't have a

0:34:43.000 --> 0:34:46.680
<v Speaker 2>cybernetic TOTE loop, we don't have a goal that's active

0:34:46.680 --> 0:34:49.560
<v Speaker 2>in our mind. We're not doing something. We're just hanging

0:34:49.560 --> 0:34:52.960
<v Speaker 2>out in our mind in sort of a free flowing state.

0:34:53.920 --> 0:34:56.280
<v Speaker 2>And so, you know, we used to think that didn't

0:34:56.360 --> 0:34:59.480
<v Speaker 2>mean anything, it was just mind wandering, But no, we

0:34:59.560 --> 0:35:05.520
<v Speaker 2>think default activity is critical for sort of processing what's

0:35:05.560 --> 0:35:08.920
<v Speaker 2>going on and connecting things up with our goals and

0:35:09.000 --> 0:35:14.000
<v Speaker 2>motives and linking memories in spontaneous ways that can arise

0:35:14.080 --> 0:35:19.200
<v Speaker 2>to consciousness and sometimes can present us with the Aha

0:35:19.360 --> 0:35:22.520
<v Speaker 2>moment that this is the solution to the question or

0:35:22.560 --> 0:35:26.160
<v Speaker 2>the problem I've been trying to solve. So the default

0:35:26.200 --> 0:35:30.120
<v Speaker 2>mode network, I think, arguably is us, you know, when

0:35:30.160 --> 0:35:35.040
<v Speaker 2>we're not doing anything, as compared to the focused mode

0:35:35.080 --> 0:35:38.520
<v Speaker 2>when we're trying to solve a problem, but a lot

0:35:38.520 --> 0:35:40.680
<v Speaker 2>of times we're not doing that. We're just kind of

0:35:41.000 --> 0:35:44.200
<v Speaker 2>zoning along, but that doesn't mean we're not doing anything.

0:35:45.040 --> 0:35:47.960
<v Speaker 2>It's kind of like dreaming, but at a more conscious level,

0:35:47.960 --> 0:35:53.440
<v Speaker 2>we're kind of processing, making connections, wondering stuff and remembering

0:35:53.560 --> 0:35:57.439
<v Speaker 2>things and reflecting on what somebody said the other day

0:35:57.480 --> 0:36:02.319
<v Speaker 2>and maybe feeling guilty because we haven't connected with our

0:36:02.360 --> 0:36:05.440
<v Speaker 2>sick friend that we heard about, and the friend wonders

0:36:05.480 --> 0:36:08.359
<v Speaker 2>about us, and that bugs us a little bit, and then

0:36:08.640 --> 0:36:12.360
<v Speaker 2>we realized, Hey, I saw this great cake baking recipe,

0:36:12.400 --> 0:36:14.560
<v Speaker 2>and I know I could bake that cake for my

0:36:14.640 --> 0:36:18.399
<v Speaker 2>friend and take it to him or her. So the

0:36:18.480 --> 0:36:24.600
<v Speaker 2>default mode network is the part between actions, but

0:36:24.680 --> 0:36:28.480
<v Speaker 2>it's critical because it provides a lot of the spark

0:36:28.600 --> 0:36:33.720
<v Speaker 2>to return to the next action, to adopt the next goal,

0:36:34.200 --> 0:36:37.120
<v Speaker 2>and we don't know a lot about that process, to

0:36:37.120 --> 0:36:42.600
<v Speaker 2>be honest, it's surprising how little personality and cybernetic models

0:36:42.600 --> 0:36:45.480
<v Speaker 2>have looked at that. Like even the Carver and Scheier

0:36:45.600 --> 0:36:49.560
<v Speaker 2>model of goals doesn't say where the goals come from.

0:36:49.600 --> 0:36:53.000
<v Speaker 2>How do we get that goal? It just says we

0:36:53.080 --> 0:36:57.880
<v Speaker 2>pick the action that's determined by the higher

0:36:57.960 --> 0:37:00.400
<v Speaker 2>level goals in our system. Right, I want to be

0:37:00.480 --> 0:37:03.480
<v Speaker 2>a compassionate person, so I will pick an action that,

0:37:04.080 --> 0:37:06.960
<v Speaker 2>you know, expresses my compassion. But you know, a lot

0:37:06.960 --> 0:37:10.160
<v Speaker 2>of times we're doing something new, we're winging it and

0:37:10.200 --> 0:37:14.120
<v Speaker 2>we don't have any big pre existing idea, but we're

0:37:14.160 --> 0:37:17.000
<v Speaker 2>still figuring out what to do next. I think the

0:37:17.000 --> 0:37:19.040
<v Speaker 2>default mode has a lot to do with that. But

0:37:19.400 --> 0:37:21.600
<v Speaker 2>there's a lot to be figured out with that as well.

0:37:21.640 --> 0:37:24.680
<v Speaker 2>And I'm just starting to learn about the neuroscience of

0:37:24.719 --> 0:37:25.040
<v Speaker 2>all this.

0:37:25.920 --> 0:37:28.400
<v Speaker 1>Yeah, it's tricky stuff. One thing I do want to

0:37:28.400 --> 0:37:29.840
<v Speaker 1>double click on is I do think a lot of

0:37:29.840 --> 0:37:32.480
<v Speaker 1>the functions of the default mode network are to form

0:37:32.520 --> 0:37:35.239
<v Speaker 1>the core of human existence. That's how I phrased it,

0:37:35.280 --> 0:37:37.920
<v Speaker 1>and I do think that's true. I think that the

0:37:37.920 --> 0:37:41.640
<v Speaker 1>default mode network can couple and decouple with the executive

0:37:41.640 --> 0:37:45.320
<v Speaker 1>attention network to various degrees depending on the different stages

0:37:45.360 --> 0:37:48.680
<v Speaker 1>of the creative process. Yeah, for different you know, sometimes

0:37:48.960 --> 0:37:51.640
<v Speaker 1>we can focus intensely on our daydreams so we can

0:37:51.680 --> 0:37:56.520
<v Speaker 1>have what my mentor Jerome Singer called positive constructive daydreaming.

0:37:56.920 --> 0:37:59.840
<v Speaker 1>That's probably when the default mode is linked and coupled

0:38:00.200 --> 0:38:05.200
<v Speaker 1>with our deep attention resources or working memory resources. So yeah,

0:38:05.200 --> 0:38:07.280
<v Speaker 1>these could be all these different kinds of brain networks.

0:38:07.320 --> 0:38:09.000
<v Speaker 1>You also mentioned the salience network, which I think is

0:38:09.040 --> 0:38:11.040
<v Speaker 1>one of the most underrated brain networks. I'm glad that

0:38:11.080 --> 0:38:13.520
<v Speaker 1>you mentioned the salience network. That seems to be really

0:38:13.560 --> 0:38:17.399
<v Speaker 1>important for creativity to be able to see what may

0:38:17.480 --> 0:38:21.799
<v Speaker 1>count as a creative solution in the first place. So

0:38:21.840 --> 0:38:23.640
<v Speaker 1>I think that's wonderful that you brought that in. But

0:38:23.719 --> 0:38:27.680
<v Speaker 1>all these kinds of brain networks couple and decouple constantly.

0:38:27.760 --> 0:38:29.920
<v Speaker 1>You know, it flows back and forth,

0:38:30.000 --> 0:38:31.840
<v Speaker 1>you know, and so you know, the brain is

0:38:31.880 --> 0:38:32.720
<v Speaker 1>constantly dynamic.

0:38:32.840 --> 0:38:33.040
<v Speaker 2>You know.

0:38:33.360 --> 0:38:34.920
<v Speaker 1>You know all this, But I think we need to

0:38:34.920 --> 0:38:37.680
<v Speaker 1>explain this to to our listeners. You know, But how

0:38:37.719 --> 0:38:40.440
<v Speaker 1>it all impinges on the free will debate is still

0:38:40.480 --> 0:38:44.600
<v Speaker 1>not completely clear to me because someone can say that,

0:38:45.320 --> 0:38:47.840
<v Speaker 1>you know, all this is is maybe you're talking about

0:38:48.000 --> 0:38:52.520
<v Speaker 1>human will, but it's not free will, because a lot

0:38:52.600 --> 0:38:55.680
<v Speaker 1>of people narrowly view free will as could you have

0:38:55.880 --> 0:38:59.720
<v Speaker 1>done differently in that specific if you rewinded the tape

0:39:00.280 --> 0:39:03.440
<v Speaker 1>at that specific precise moment, would you ever have acted

0:39:03.480 --> 0:39:07.799
<v Speaker 1>differently? And you're talking a lot about the things

0:39:07.840 --> 0:39:11.120
<v Speaker 1>we do after the moment to change the course of

0:39:11.160 --> 0:39:13.760
<v Speaker 1>our future, you know, because I've called it the imagination network.

0:39:13.800 --> 0:39:15.880
<v Speaker 1>I've called the default mode the imagination network because it

0:39:16.560 --> 0:39:19.680
<v Speaker 1>can impact our future. But does it really impact the

0:39:19.719 --> 0:39:20.520
<v Speaker 1>notion of free will?

0:39:20.520 --> 0:39:23.840
<v Speaker 2>Though? Yeah, well again I'm not arguing for magic free

0:39:23.840 --> 0:39:27.160
<v Speaker 2>will independent of all the machinery. I think that we

0:39:27.280 --> 0:39:32.440
<v Speaker 2>have this mental ability to take stock of who we are,

0:39:32.560 --> 0:39:35.480
<v Speaker 2>where we are, and what we want and make some decisions,

0:39:36.080 --> 0:39:39.720
<v Speaker 2>and could we have done differently? I think that almost

0:39:39.800 --> 0:39:44.040
<v Speaker 2>might be a nonsensical question, because you, first of all,

0:39:44.080 --> 0:39:46.360
<v Speaker 2>you can't go back to that moment and say, okay,

0:39:46.400 --> 0:39:50.120
<v Speaker 2>do something different or not. It's already happened, and so

0:39:50.200 --> 0:39:53.080
<v Speaker 2>everything that's already happened has already happened. So, I don't know,

0:39:53.560 --> 0:39:55.680
<v Speaker 2>to ask, could you have done differently, is to ask,

0:39:56.080 --> 0:39:59.560
<v Speaker 2>could the universe have turned out any differently at this moment?

0:40:00.400 --> 0:40:03.400
<v Speaker 2>Since it began fourteen billion years ago? It's kind of

0:40:03.440 --> 0:40:07.080
<v Speaker 2>the same question. I think it's more useful to ask

0:40:07.520 --> 0:40:10.160
<v Speaker 2>could I have done differently? And if you think about it,

0:40:10.200 --> 0:40:13.440
<v Speaker 2>a lot of times you were poised between two or

0:40:13.440 --> 0:40:18.160
<v Speaker 2>three almost equally attractive options, and you know, you picked one,

0:40:18.480 --> 0:40:21.759
<v Speaker 2>and maybe you later realize you wished you hadn't done that,

0:40:21.840 --> 0:40:24.680
<v Speaker 2>and so you would pick differently the next time. But

0:40:24.960 --> 0:40:28.160
<v Speaker 2>at the time, you picked the thing that looked most appealing,

0:40:28.760 --> 0:40:30.799
<v Speaker 2>And so why would you have picked differently from that?

0:40:30.840 --> 0:40:34.319
<v Speaker 2>You picked the thing that you wanted? And so why

0:40:34.480 --> 0:40:37.120
<v Speaker 2>would it be free will to think that you have

0:40:37.200 --> 0:40:40.760
<v Speaker 2>the capacity to pick what you don't want? And actually,

0:40:41.440 --> 0:40:43.319
<v Speaker 2>now that I think about it, that's part of what

0:40:43.360 --> 0:40:46.040
<v Speaker 2>I'm saying free will is we have the capacity to

0:40:46.040 --> 0:40:52.960
<v Speaker 2>pick what we don't want. Yeah, "free won't." Yeah,

0:40:53.480 --> 0:40:56.000
<v Speaker 2>I mean we can make bad decisions. We can look.

0:40:56.040 --> 0:40:58.560
<v Speaker 2>The last chapter of my book talks about a woman

0:40:58.600 --> 0:41:01.920
<v Speaker 2>who made a bad, weird decision decades ago and she's

0:41:01.920 --> 0:41:06.600
<v Speaker 2>been miserable ever since, until finally her misery inspired

0:41:06.640 --> 0:41:10.479
<v Speaker 2>her to start asking the questions what do I really want?

0:41:10.680 --> 0:41:14.400
<v Speaker 2>where her nonconscious mind started to provide her with

0:41:14.440 --> 0:41:19.520
<v Speaker 2>some alternatives until she was finally able to pick something different.

0:41:20.360 --> 0:41:22.560
<v Speaker 2>And I think that's all you can really ask for,

0:41:23.239 --> 0:41:27.520
<v Speaker 2>you know, of an agent regarding free will: the ability to

0:41:27.640 --> 0:41:31.200
<v Speaker 2>pick what we think we want and learn from our mistakes.

0:41:32.000 --> 0:41:33.759
<v Speaker 1>From mistakes, well, that takes us out of the

0:41:33.840 --> 0:41:37.520
<v Speaker 1>realm of how a lot of philosophers narrowly define it.

0:41:38.080 --> 0:41:40.120
<v Speaker 1>It's like, it's like they can't think of free will

0:41:40.160 --> 0:41:42.800
<v Speaker 1>outside of the rewind the tape thing. Do you know

0:41:42.840 --> 0:41:43.239
<v Speaker 1>what I mean?

0:41:43.760 --> 0:41:47.080
<v Speaker 2>Yeah, the rewind the tape thing... it's

0:41:47.120 --> 0:41:49.600
<v Speaker 2>a thought experiment that I don't think is very coherent.

0:41:52.040 --> 0:41:54.480
<v Speaker 1>Ken can just drop the mic right there.

0:41:54.600 --> 0:41:56.680
<v Speaker 2>I mean, I'm probably just not getting you.

0:41:56.640 --> 0:41:58.640
<v Speaker 1>Should have snapped, you should have snapped with that and

0:41:58.640 --> 0:42:01.439
<v Speaker 1>get a little sassy, get a little sassy over there.

0:42:03.520 --> 0:42:05.839
<v Speaker 1>Well you know, no, I hear you, brother, I hear you, brother.

0:42:06.719 --> 0:42:08.760
<v Speaker 1>I'm a big fan of a lot of Daniel

0:42:08.760 --> 0:42:11.440
<v Speaker 1>Dennett's writings on the abilities that give us

0:42:11.480 --> 0:42:13.680
<v Speaker 1>a free will worth wanting. I think a lot

0:42:13.719 --> 0:42:17.200
<v Speaker 1>of his thinking lines up with yours. Yeah, so okay,

0:42:17.320 --> 0:42:21.120
<v Speaker 1>well, you'll find a great friend, a great friend there

0:42:21.160 --> 0:42:24.200
<v Speaker 1>of ideas, you know. So yeah, I'm I'm

0:42:24.239 --> 0:42:25.960
<v Speaker 1>a big I'm a big fan of those ideas. I'm

0:42:25.960 --> 0:42:29.720
<v Speaker 1>a big fan of you making the case that

0:42:29.800 --> 0:42:32.120
<v Speaker 1>a lot of the kinds of system two functions we're

0:42:32.120 --> 0:42:34.400
<v Speaker 1>talking about are a "free will

0:42:34.400 --> 0:42:37.239
<v Speaker 1>worth wanting," in Daniel Dennett's terms. But in your terms,

0:42:37.280 --> 0:42:39.479
<v Speaker 1>you're saying... you're straight up saying that's free will.

0:42:39.520 --> 0:42:42.000
<v Speaker 1>And so that's fascinating because you you're just like straight

0:42:42.080 --> 0:42:44.040
<v Speaker 1>up You're like no, You're like, no, we got more

0:42:44.040 --> 0:42:45.120
<v Speaker 1>free will than you realize.

0:42:46.200 --> 0:42:50.080
<v Speaker 2>We're doomed to have free will. Yeah, that's a funny

0:42:50.080 --> 0:42:53.000
<v Speaker 2>way to flip it around. But that's the existentialist

0:42:53.040 --> 0:42:57.160
<v Speaker 2>in me. You know, that's saying, yeah, there ain't nobody

0:42:57.160 --> 0:42:59.560
<v Speaker 2>else making our choices but us. We might like to

0:42:59.600 --> 0:43:05.000
<v Speaker 2>think so, but we're responsible. And that's scary, but it's

0:43:05.040 --> 0:43:08.360
<v Speaker 2>also tremendously empowering if you fully grasp it.

0:43:09.440 --> 0:43:12.640
<v Speaker 1>Hmm. Yeah, And I know that's another major theme of

0:43:12.640 --> 0:43:16.879
<v Speaker 1>your work, is the link between freedom and responsibility. I

0:43:16.920 --> 0:43:22.120
<v Speaker 1>assume you've read Rollo May's stuff. Yeah, yeah, in the sixties.

0:43:22.640 --> 0:43:26.520
<v Speaker 2>I'm an old humanist from way back, mostly a Rogerian,

0:43:26.680 --> 0:43:30.399
<v Speaker 2>but yeah, some of the other folks as well, I've read.

0:43:31.040 --> 0:43:35.080
<v Speaker 1>Yeah, Rollo May wrote beautifully about that interplay, and

0:43:35.120 --> 0:43:37.200
<v Speaker 1>I think I love that you brought that in,

0:43:37.360 --> 0:43:39.680
<v Speaker 1>you know, to this whole debate. And it's very

0:43:39.680 --> 0:43:42.120
<v Speaker 1>interesting because you have a whole chapter, "The Problem

0:43:42.200 --> 0:43:46.480
<v Speaker 1>of Too Much Freedom." That's the chapter you have.

0:43:47.000 --> 0:43:49.560
<v Speaker 1>So it's like, it's interesting because there's kind of

0:43:49.640 --> 0:43:52.399
<v Speaker 1>different continuums that one can make in this argument. One

0:43:52.400 --> 0:43:55.000
<v Speaker 1>could say, a lot of people, a lot of philosophers

0:43:55.000 --> 0:43:57.560
<v Speaker 1>of mind, especially determinists, say, look, we have

0:43:57.640 --> 0:43:59.360
<v Speaker 1>a heck of a lot less free will than we

0:43:59.440 --> 0:44:02.000
<v Speaker 1>realize we have. Then there are some who say, actually, we

0:44:02.080 --> 0:44:04.720
<v Speaker 1>have more than we realize. But then I think you're

0:44:04.880 --> 0:44:07.840
<v Speaker 1>even to the right of that continuum and saying we

0:44:07.960 --> 0:44:11.680
<v Speaker 1>have so much free will that we don't know what

0:44:11.719 --> 0:44:14.919
<v Speaker 1>to do with it, in a way. Do you think,

0:44:15.000 --> 0:44:16.480
<v Speaker 1>and this might be controversial, but do you

0:44:16.480 --> 0:44:18.879
<v Speaker 1>think like it's easier being a determinist? Like if you're

0:44:18.880 --> 0:44:21.640
<v Speaker 1>a hard determinist, do you feel like you're evading your responsibility,

0:44:21.760 --> 0:44:22.400
<v Speaker 1>your destiny?

0:44:23.239 --> 0:44:26.319
<v Speaker 2>You know, I think that hard determinism is seductive.

0:44:26.480 --> 0:44:27.320
<v Speaker 1>It's a deep question.

0:44:28.280 --> 0:44:33.040
<v Speaker 2>It's seductive because it seems like the most scientifically rigorous

0:44:33.080 --> 0:44:35.800
<v Speaker 2>position to take, and it's the position that goes along

0:44:35.840 --> 0:44:41.040
<v Speaker 2>with the scientific ideal of putting our subjective wants and

0:44:41.160 --> 0:44:44.759
<v Speaker 2>values aside and being guided by the facts. Right, And

0:44:44.840 --> 0:44:47.680
<v Speaker 2>so we have this wishful thinking of wanting free will,

0:44:47.760 --> 0:44:52.120
<v Speaker 2>but as true scientists, we have to put that aside.

0:44:52.880 --> 0:44:55.799
<v Speaker 2>I disagree with that. I think when we do that,

0:44:55.840 --> 0:44:59.239
<v Speaker 2>it's bad for us because we're accepting an idea that

0:44:59.360 --> 0:45:03.120
<v Speaker 2>isn't true and we're going to negatively impact our own functioning.

0:45:03.680 --> 0:45:07.160
<v Speaker 2>There's quite a bit of research, experimental research, on what happens

0:45:07.640 --> 0:45:11.799
<v Speaker 2>when you convince people that determinism is true, and they

0:45:11.840 --> 0:45:18.680
<v Speaker 2>become less moral, less able to self regulate, less pro social,

0:45:19.440 --> 0:45:22.920
<v Speaker 2>they become number, numb to pain. So if you convince

0:45:23.040 --> 0:45:26.840
<v Speaker 2>people... yeah, I'm serious.

0:45:26.880 --> 0:45:28.400
<v Speaker 1>Want to ask you a follow up question to that,

0:45:28.480 --> 0:45:32.000
<v Speaker 1>because has anyone ever done a study on different

0:45:32.040 --> 0:45:35.239
<v Speaker 1>philosophers and psychologists, what they believe, and their actual morality?

0:45:35.480 --> 0:45:38.040
<v Speaker 1>Like that? From that level, wouldn't that be interesting? Wouldn't

0:45:38.040 --> 0:45:41.400
<v Speaker 1>that be an interesting study like are hard determinists more immoral?

0:45:42.560 --> 0:45:42.799
<v Speaker 1>You know?

0:45:43.040 --> 0:45:46.000
<v Speaker 2>It would be a fascinating study. And a version that

0:45:46.040 --> 0:45:49.640
<v Speaker 2>I've thought about is, does an evolutionary psychologist who

0:45:49.680 --> 0:45:53.719
<v Speaker 2>thinks that men evolved to be philanderers have less ability

0:45:53.800 --> 0:46:01.440
<v Speaker 2>to control their right unfortunate choices with their female graduate students?

0:46:02.000 --> 0:46:06.000
<v Speaker 2>You know, I think determinism can, you know,

0:46:06.160 --> 0:46:09.720
<v Speaker 2>make you think you're not responsible: I had to do that. Yeah,

0:46:09.960 --> 0:46:13.839
<v Speaker 2>so that's another one, you're right, besides the well being hit

0:46:14.040 --> 0:46:16.920
<v Speaker 2>that you take with those beliefs. Yeah, I think it's

0:46:16.960 --> 0:46:18.040
<v Speaker 2>a dangerous belief.

0:46:19.000 --> 0:46:21.280
<v Speaker 1>Wow, you even go so far as to say

0:46:21.280 --> 0:46:22.640
<v Speaker 1>that it's a dangerous belief.

0:46:23.440 --> 0:46:24.960
<v Speaker 2>You know, I don't care. I'm getting to the end

0:46:25.000 --> 0:46:26.560
<v Speaker 2>of my career. I'll just say what I think.

0:46:26.640 --> 0:46:31.680
<v Speaker 1>And you have no fucks to give. Good for you,

0:46:31.760 --> 0:46:35.480
<v Speaker 1>good for you, Ken. Ken's like, I got zero to

0:46:35.560 --> 0:46:38.359
<v Speaker 1>give in this debate. No, I

0:46:38.400 --> 0:46:40.600
<v Speaker 1>hear you. The idea, whether or not it's a dangerous

0:46:40.600 --> 0:46:42.880
<v Speaker 1>idea is... I can, again, I can hear the

0:46:43.640 --> 0:46:46.399
<v Speaker 1>voice in my head of the hard determinists saying, because

0:46:46.400 --> 0:46:48.200
<v Speaker 1>I've debated so much of them, so I can kind

0:46:48.200 --> 0:46:51.040
<v Speaker 1>of like hear what their counter to everything you say is.

0:46:51.520 --> 0:46:53.640
<v Speaker 1>And I think they would say that it makes them

0:46:54.040 --> 0:46:59.360
<v Speaker 1>more compassionate because you realize that people, even in their

0:46:59.400 --> 0:47:03.160
<v Speaker 1>poorest choices, really didn't have a choice, ultimately,

0:47:03.920 --> 0:47:06.640
<v Speaker 1>and that should give us all more compassion

0:47:06.680 --> 0:47:09.160
<v Speaker 1>for each other, not less. I can hear them saying

0:47:09.160 --> 0:47:11.120
<v Speaker 1>that as a rebuttal to what you're saying.

0:47:11.080 --> 0:47:12.840
<v Speaker 2>Yeah, and that's kind of what my dad was saying.

0:47:12.920 --> 0:47:16.239
<v Speaker 2>It's a bit of a liberal perspective, you know, to

0:47:16.520 --> 0:47:19.320
<v Speaker 2>forgive people of their crimes if they came from a terrible,

0:47:19.600 --> 0:47:23.319
<v Speaker 2>you know, background. And I don't think

0:47:23.360 --> 0:47:27.040
<v Speaker 2>we can really afford to say that because the question

0:47:27.080 --> 0:47:30.719
<v Speaker 2>of who is really off the hook versus who isn't is

0:47:30.880 --> 0:47:34.479
<v Speaker 2>very tricky. I think you're only really off the hook

0:47:34.520 --> 0:47:38.280
<v Speaker 2>if you've got some organic problems. And that's once again,

0:47:38.360 --> 0:47:41.320
<v Speaker 2>determinism, or reductionism, only makes sense when there's a problem down in

0:47:41.360 --> 0:47:45.400
<v Speaker 2>the machinery. So if your

0:47:45.440 --> 0:47:48.040
<v Speaker 2>machinery is off, then okay, you're off the hook. But

0:47:48.120 --> 0:47:51.680
<v Speaker 2>if you're just making bad choices and then you're saying, well,

0:47:51.680 --> 0:47:55.440
<v Speaker 2>I couldn't help it, I think that's again a dangerous

0:47:55.480 --> 0:47:58.440
<v Speaker 2>thing to say, a dangerous road to go down. Wow.

0:47:58.520 --> 0:48:02.080
<v Speaker 1>Okay, okay, let's end this interview talking a little about

0:48:02.120 --> 0:48:04.600
<v Speaker 1>the importance of brains working together. You have this

0:48:04.680 --> 0:48:09.120
<v Speaker 1>chapter called "Living Well Together." Man, are you familiar with

0:48:09.120 --> 0:48:14.000
<v Speaker 1>the extended mind hypothesis that, uh, David Chalmers and Andy

0:48:14.040 --> 0:48:16.600
<v Speaker 1>Clark put forward and that Annie Murphy Paul wrote about

0:48:16.640 --> 0:48:17.680
<v Speaker 1>in a recent book.

0:48:17.840 --> 0:48:19.400
<v Speaker 2>Nope, and that sounds like another one I've got to

0:48:19.480 --> 0:48:21.560
<v Speaker 2>check it out. I think I'm going to be familiar

0:48:21.600 --> 0:48:24.440
<v Speaker 2>with the arguments once you explain it to me quickly.

0:48:25.120 --> 0:48:27.960
<v Speaker 1>Oh yeah, you'll, you'll definitely. I mean, well, just that

0:48:28.160 --> 0:48:33.200
<v Speaker 1>brains don't stop and start, enclosed

0:48:33.200 --> 0:48:38.120
<v Speaker 1>in this head, that we are deeply, deeply embodied and

0:48:38.239 --> 0:48:42.319
<v Speaker 1>connected to others, and our brain is extended beyond just

0:48:42.400 --> 0:48:44.200
<v Speaker 1>this skin here.

0:48:44.520 --> 0:48:44.960
<v Speaker 2>It's just it.

0:48:44.920 --> 0:48:47.439
<v Speaker 1>Seems very much in line with your Living Well

0:48:47.480 --> 0:48:50.799
<v Speaker 1>Together chapter, where you talk about how we really influence

0:48:50.840 --> 0:48:55.560
<v Speaker 1>each other's free will in profound ways. Right,

0:48:55.600 --> 0:48:59.560
<v Speaker 1>it's not all strictly an individual pursuit, the pursuit

0:48:59.600 --> 0:49:00.680
<v Speaker 1>of free will, all right.

0:49:00.719 --> 0:49:03.719
<v Speaker 2>Yeah. And that sounds consistent with what

0:49:03.760 --> 0:49:07.000
<v Speaker 2>I say in the book about this multi level hierarchy

0:49:07.080 --> 0:49:12.319
<v Speaker 2>of organization where we are also nested within groups of

0:49:12.360 --> 0:49:16.040
<v Speaker 2>other minds and we're influenced by them, but we also

0:49:16.680 --> 0:49:20.640
<v Speaker 2>influence them, and it's critical for our well being and

0:49:20.680 --> 0:49:26.360
<v Speaker 2>our adaptation, I would argue, to be able to form

0:49:27.120 --> 0:49:33.239
<v Speaker 2>coherent groups, alliances, coalitions, in evolutionary terminology, and so we're

0:49:33.280 --> 0:49:39.120
<v Speaker 2>not just these abstract entities stuck in our heads.

0:49:39.160 --> 0:49:42.440
<v Speaker 2>We're also connected to other minds out there in the world,

0:49:43.520 --> 0:49:46.920
<v Speaker 2>hopefully in a way that benefits everybody. So it sounds

0:49:46.920 --> 0:49:48.600
<v Speaker 2>like a really cool idea and another book I have

0:49:48.640 --> 0:49:49.759
<v Speaker 2>to go look up, Scott.

0:49:51.360 --> 0:49:53.160
<v Speaker 1>Yeah, yeah, look it up.

0:49:53.280 --> 0:49:54.319
<v Speaker 2>Let me just look up real quick.

0:49:54.360 --> 0:49:56.760
<v Speaker 1>The title of Annie Murphy Paul's new book. She basically

0:49:56.800 --> 0:50:00.000
<v Speaker 1>summarizes the work. Her book is called The Extended Mind

0:50:00.640 --> 0:50:04.440
<v Speaker 1>by Annie Murphy Paul. The subtitle is The Power of

0:50:04.520 --> 0:50:08.120
<v Speaker 1>Thinking Outside the Brain. Annie's a dear friend of mine.

0:50:08.160 --> 0:50:09.480
<v Speaker 1>And yeah, I think it's.

0:50:09.320 --> 0:50:09.960
<v Speaker 2>A cool book.

0:50:10.560 --> 0:50:12.160
<v Speaker 1>Yeah, and I think I do like the idea of

0:50:12.200 --> 0:50:16.160
<v Speaker 1>us thinking about free will as more than just an individualistic pursuit, right,

0:50:16.160 --> 0:50:18.600
<v Speaker 1>I mean, we influence each other in

0:50:18.640 --> 0:50:21.520
<v Speaker 1>such profound ways. We limit each other's freedom, not just

0:50:21.600 --> 0:50:25.080
<v Speaker 1>internally but also externally. The liberals got something right there

0:50:25.440 --> 0:50:29.600
<v Speaker 1>to a certain degree. Your father is right, but it's

0:50:29.640 --> 0:50:32.160
<v Speaker 1>not the whole story, but you know it is part

0:50:32.160 --> 0:50:33.120
<v Speaker 1>of the story for sure.

0:50:33.480 --> 0:50:33.680
<v Speaker 2>Yeah.

0:50:33.719 --> 0:50:37.080
<v Speaker 1>Yeah, well, this is this has been great. Is there

0:50:37.120 --> 0:50:39.520
<v Speaker 1>anything else you want to add that you haven't said?

0:50:39.880 --> 0:50:43.160
<v Speaker 2>I guess I would just reiterate that I can't prove

0:50:43.200 --> 0:50:46.600
<v Speaker 2>and nobody can prove that free will exists. I think

0:50:46.680 --> 0:50:49.239
<v Speaker 2>it's likely if you think about it the right way,

0:50:49.280 --> 0:50:51.760
<v Speaker 2>and I think about it the way Christian List

0:50:51.880 --> 0:50:55.800
<v Speaker 2>laid out his philosophical arguments. But even if it's not true,

0:50:56.480 --> 0:50:58.719
<v Speaker 2>the illusion of freedom, if you want to call it that,

0:50:58.880 --> 0:51:02.239
<v Speaker 2>makes a huge difference, and it might operate like a

0:51:02.320 --> 0:51:05.800
<v Speaker 2>kind of placebo effect, in that even though it's an illusion,

0:51:06.239 --> 0:51:09.799
<v Speaker 2>it makes itself true to a considerable extent in that

0:51:10.000 --> 0:51:12.759
<v Speaker 2>if you believe in it, suddenly you're doing things that

0:51:12.840 --> 0:51:15.640
<v Speaker 2>are better for you and making you happier and making

0:51:15.680 --> 0:51:19.719
<v Speaker 2>you adapt and thrive to a greater extent. So whether

0:51:19.800 --> 0:51:21.759
<v Speaker 2>you want to call that an illusion, I think is

0:51:22.640 --> 0:51:24.680
<v Speaker 2>you know, I think it's going too far to just

0:51:24.760 --> 0:51:28.080
<v Speaker 2>say it's an illusion. I'd say it's an adaptation. But

0:51:29.600 --> 0:51:32.000
<v Speaker 2>you know, you can go around and around with these

0:51:32.040 --> 0:51:34.000
<v Speaker 2>linguistic things.

0:51:34.000 --> 0:51:35.960
<v Speaker 1>You can call it an adaptive illusion, but then you

0:51:35.960 --> 0:51:38.520
<v Speaker 1>don't really like calling it an illusion. Yeah, no, I

0:51:38.560 --> 0:51:41.919
<v Speaker 1>hear you. I mean it's like religion, right, like or.

0:51:42.200 --> 0:51:42.960
<v Speaker 2>Belief in God.

0:51:43.840 --> 0:51:47.359
<v Speaker 1>Yeah, you know, like we don't know for sure. It's

0:51:47.360 --> 0:51:49.080
<v Speaker 1>sort of like a similar argument to what you're saying.

0:51:49.120 --> 0:51:52.000
<v Speaker 1>I mean, we probably won't ever know while we're alive.

0:51:52.600 --> 0:51:55.320
<v Speaker 1>You know, maybe we'll find out someday. Oh hey, God,

0:51:56.040 --> 0:52:00.399
<v Speaker 1>you do exist after all. But it's hard. It's hard

0:52:00.400 --> 0:52:01.319
<v Speaker 1>to know that right now.

0:52:01.640 --> 0:52:04.080
<v Speaker 2>Yeah, right, And I don't know. I don't know the

0:52:04.120 --> 0:52:08.319
<v Speaker 2>truth of this, and neither does anybody else. But it's

0:52:08.360 --> 0:52:10.560
<v Speaker 2>true that it is like a religion to people. You know,

0:52:11.120 --> 0:52:15.399
<v Speaker 2>determinism is like a religion to people. Free will might

0:52:15.440 --> 0:52:19.000
<v Speaker 2>be like a religion to me. I don't know. I

0:52:19.040 --> 0:52:23.200
<v Speaker 2>would agree with Sam Harris. I do think that God

0:52:23.320 --> 0:52:25.680
<v Speaker 2>is a delusion, that religion is a delusion. But I

0:52:25.719 --> 0:52:30.279
<v Speaker 2>can't be sure about that either. So really you just

0:52:30.320 --> 0:52:32.320
<v Speaker 2>have to be sort of agnostic if you're going to

0:52:32.560 --> 0:52:35.080
<v Speaker 2>be humble in the

0:52:35.080 --> 0:52:38.000
<v Speaker 1>end. Ken Sheldon letting it all hang out here at

0:52:38.040 --> 0:52:41.960
<v Speaker 1>the end of his career. God is good. But we

0:52:42.000 --> 0:52:45.040
<v Speaker 1>have free will. That's interesting. Usually you find a

0:52:45.080 --> 0:52:47.640
<v Speaker 1>lot of the really deep religious Christians believe we have

0:52:48.160 --> 0:52:51.200
<v Speaker 1>free will. Isn't that interesting? You often

0:52:51.239 --> 0:52:52.480
<v Speaker 1>find they do believe in free will.

0:52:52.560 --> 0:52:53.279
<v Speaker 2>God gave it to you.

0:52:55.040 --> 0:52:56.839
<v Speaker 1>Well, that's what it is, I guess: that God

0:52:56.880 --> 0:53:02.200
<v Speaker 1>gave it to us. But there you go. Well,

0:53:02.200 --> 0:53:03.280
<v Speaker 1>who gave us evolution?

0:53:04.480 --> 0:53:06.400
<v Speaker 2>This is a chance process.

0:53:08.200 --> 0:53:10.719
<v Speaker 1>Anyway. I won't go down that rabbit hole too much

0:53:10.920 --> 0:53:12.920
<v Speaker 1>at this point, but we can do that over some

0:53:12.960 --> 0:53:16.040
<v Speaker 1>beers someday or some wine. Really great chatting with you,

0:53:16.280 --> 0:53:19.120
<v Speaker 1>and thanks for writing this really provocative book.

0:53:20.000 --> 0:53:22.120
<v Speaker 2>I appreciate it. It was fun reconnecting with you, and

0:53:22.120 --> 0:53:23.200
<v Speaker 2>we should do it more often.

0:53:23.840 --> 0:53:35.279
<v Speaker 1>Likewise, thanks for listening to this episode of The Psychology Podcast.

0:53:35.719 --> 0:53:37.800
<v Speaker 1>If you'd like to react in some way to something

0:53:37.800 --> 0:53:40.360
<v Speaker 1>you heard, I encourage you to join in the discussion

0:53:40.360 --> 0:53:43.800
<v Speaker 1>at the psychology podcast dot com. We're on our YouTube page,

0:53:43.880 --> 0:53:46.840
<v Speaker 1>The Psychology Podcast. We also put up some videos of

0:53:46.880 --> 0:53:49.680
<v Speaker 1>some episodes on our YouTube page as well, so you'll

0:53:49.719 --> 0:53:51.800
<v Speaker 1>want to check that out. Thanks for being such a

0:53:51.800 --> 0:53:54.200
<v Speaker 1>great supporter of the show, and tune in next time

0:53:54.239 --> 0:53:59.759
<v Speaker 1>for more on the mind, brain, behavior, and creativity.