WEBVTT - On Background: Effective Altruism Still Has Friends

0:00:15.316 --> 0:00:23.676
<v Speaker 1>Pushkin. Back in twenty twelve, when Sam Bankman-Fried was

0:00:23.716 --> 0:00:27.156
<v Speaker 1>an undergraduate at MIT, he learned about a movement called

0:00:27.196 --> 0:00:32.436
<v Speaker 1>effective altruism or EA for short. The argument behind EA,

0:00:32.636 --> 0:00:36.676
<v Speaker 1>very roughly, is that philanthropy is broken. People shouldn't be

0:00:36.716 --> 0:00:39.396
<v Speaker 1>giving away their money based on emotional attachments to certain

0:00:39.436 --> 0:00:43.436
<v Speaker 1>causes or institutions. Rather, they should lean more on evidence

0:00:43.436 --> 0:00:45.116
<v Speaker 1>and reason in order to do as much good as

0:00:45.156 --> 0:00:50.516
<v Speaker 1>possible in the world. Sam Bankman-Fried soon became one

0:00:50.516 --> 0:00:54.116
<v Speaker 1>of the most famous proponents of EA. One of the

0:00:54.156 --> 0:00:56.076
<v Speaker 1>things that interested me when I met him the first

0:00:56.076 --> 0:00:59.676
<v Speaker 1>time was his interest in this. I found it curious.

0:00:59.756 --> 0:01:02.236
<v Speaker 1>I had heard bits and pieces about EA, but I

0:01:02.276 --> 0:01:05.036
<v Speaker 1>got the full-throated version from him. And as I

0:01:05.036 --> 0:01:07.356
<v Speaker 1>started to write about his world, I found his company

0:01:07.436 --> 0:01:12.876
<v Speaker 1>FTX was filled with people who also considered themselves effective altruists. Then,

0:01:12.916 --> 0:01:17.796
<v Speaker 1>of course, Sam's cryptocurrency exchange dramatically collapsed last year. He

0:01:17.916 --> 0:01:20.316
<v Speaker 1>and FTX had been possibly the best thing ever to

0:01:20.356 --> 0:01:23.876
<v Speaker 1>happen to the effective altruist movement, but the best thing

0:01:23.956 --> 0:01:32.836
<v Speaker 1>quickly transformed into the worst. This is on background from

0:01:32.876 --> 0:01:45.996
<v Speaker 1>Against the Rules. I'm Michael Lewis. Effective altruism is running

0:01:45.996 --> 0:01:50.356
<v Speaker 1>through the background of my forthcoming book, Going Infinite. The

0:01:50.436 --> 0:01:54.236
<v Speaker 1>movement started just a few years before Sam Bankman-Fried encountered it.

0:01:54.556 --> 0:01:57.436
<v Speaker 1>I need to understand what effective altruists really believe and

0:01:57.676 --> 0:02:00.076
<v Speaker 1>what it is about this movement that's so appealing to

0:02:00.156 --> 0:02:03.876
<v Speaker 1>smart young people, and also how these advocates are trying

0:02:03.876 --> 0:02:07.956
<v Speaker 1>to pick up the pieces after FTX's collapse. So we

0:02:08.036 --> 0:02:10.996
<v Speaker 1>found two college students who are leaders in effective altruism

0:02:11.116 --> 0:02:14.676
<v Speaker 1>organizations to talk to me about this. Al Shin is

0:02:14.716 --> 0:02:18.956
<v Speaker 1>a senior at Harvard studying statistics, and Gabriel Mukobe is

0:02:19.036 --> 0:02:23.236
<v Speaker 1>a senior at Stanford studying computer science. I started off

0:02:23.276 --> 0:02:26.316
<v Speaker 1>asking them how they'd explain effective altruism to someone who

0:02:26.356 --> 0:02:29.316
<v Speaker 1>knows nothing about it. Gabe answers first.

0:02:30.396 --> 0:02:32.596
<v Speaker 2>We are like, at least I am like a person

0:02:32.596 --> 0:02:34.836
<v Speaker 2>with a lot of privilege. I'm born in the US,

0:02:34.996 --> 0:02:38.396
<v Speaker 2>I go to like a pretty good university. I'm a

0:02:38.436 --> 0:02:41.236
<v Speaker 2>man who has grown up in a middle class family.

0:02:41.316 --> 0:02:43.356
<v Speaker 2>Like I have a lot of privilege and resources to

0:02:44.076 --> 0:02:47.516
<v Speaker 2>do change and impact in the world. And maybe you

0:02:47.596 --> 0:02:50.316
<v Speaker 2>want to, like actually try to figure out what's the

0:02:50.316 --> 0:02:52.916
<v Speaker 2>way you can have the best positive impact and to me,

0:02:52.996 --> 0:02:56.196
<v Speaker 2>effective altruism is both like a framework for trying to

0:02:56.196 --> 0:02:59.356
<v Speaker 2>think about how we might do a whole lot of

0:02:59.356 --> 0:03:01.076
<v Speaker 2>good with our careers, with our time, with

0:03:01.116 --> 0:03:03.556
<v Speaker 2>our other resources, and then a community of people who

0:03:03.596 --> 0:03:06.596
<v Speaker 2>are actually trying to put that into practice and actively

0:03:06.676 --> 0:03:10.156
<v Speaker 2>like push their careers and their resources towards doing a

0:03:10.196 --> 0:03:10.556
<v Speaker 2>lot of good.

0:03:11.556 --> 0:03:15.076
<v Speaker 1>What kind of pushback do you get when you're talking

0:03:15.116 --> 0:03:18.676
<v Speaker 1>to people your age about these ideas?

0:03:18.596 --> 0:03:24.996
<v Speaker 3>Hm, I think one possible avenue of pushback is this

0:03:25.516 --> 0:03:31.796
<v Speaker 3>very cynical, almost nihilist view that you can't actually enact

0:03:31.956 --> 0:03:35.916
<v Speaker 3>positive change. The actual level of suffering in the world

0:03:35.996 --> 0:03:38.556
<v Speaker 3>is not going to be reduced. There's just not enough

0:03:38.596 --> 0:03:39.556
<v Speaker 3>people working on this.

0:03:39.716 --> 0:03:42.636
<v Speaker 1>Do you feel like what's under that objection is basic selfishness,

0:03:42.676 --> 0:03:45.756
<v Speaker 1>that, I don't really want to have to think

0:03:45.756 --> 0:03:47.876
<v Speaker 1>about the world this way. So if I can dismiss

0:03:47.916 --> 0:03:50.116
<v Speaker 1>these arguments as being preposterous, I can go

0:03:50.156 --> 0:03:51.196
<v Speaker 1>about my selfish life.

0:03:52.236 --> 0:03:56.116
<v Speaker 3>Certainly there's some people like that, and I would not

0:03:56.196 --> 0:03:58.756
<v Speaker 3>say, though, that everyone is like that. I think there

0:03:58.876 --> 0:04:03.836
<v Speaker 3>are genuinely people who have some level of very cynical,

0:04:03.956 --> 0:04:08.996
<v Speaker 3>nihilistic outlook as to what humanity can accomplish, and that

0:04:09.276 --> 0:04:13.316
<v Speaker 3>bleeds into not taking people seriously when they tend to

0:04:13.356 --> 0:04:17.716
<v Speaker 3>have a more hopeful, optimistic attitude. And even for people

0:04:17.756 --> 0:04:21.236
<v Speaker 3>who are very involved in EA, at some point, these

0:04:21.276 --> 0:04:24.196
<v Speaker 3>also just tend to be people who are philosophically inclined,

0:04:24.236 --> 0:04:27.156
<v Speaker 3>and the philosophically inclined also tend toward nihilism at some

0:04:27.156 --> 0:04:29.156
<v Speaker 3>point in their lives. So maybe they'll come

0:04:28.996 --> 0:04:29.956
<v Speaker 3>back around.

0:04:31.396 --> 0:04:33.356
<v Speaker 2>And like about the point of like people being selfish,

0:04:33.396 --> 0:04:36.036
<v Speaker 2>Like I don't know when I was like first, like

0:04:36.116 --> 0:04:38.876
<v Speaker 2>thinking about the ideas, it can feel overwhelming. Sometimes, thinking

0:04:38.876 --> 0:04:42.356
<v Speaker 2>about like, oh wow, like maybe the default thing

0:04:42.556 --> 0:04:44.196
<v Speaker 2>we've grown up with, as Al said, is the world

0:04:44.236 --> 0:04:45.996
<v Speaker 2>kind of beats into you that you're just a tiny

0:04:46.036 --> 0:04:48.276
<v Speaker 2>little ant in this universe. You can't really do anything.

0:04:48.316 --> 0:04:50.476
<v Speaker 2>They are all these big systems. It's pretty hopeless to

0:04:50.476 --> 0:04:52.276
<v Speaker 2>actually do any change. And then EA is all like

0:04:52.356 --> 0:04:55.076
<v Speaker 2>whoa people can have an impact, You can be like

0:04:55.116 --> 0:04:57.636
<v Speaker 2>disproportionately impactful, you can do all this stuff. That feels

0:04:57.636 --> 0:05:00.836
<v Speaker 2>like a lot of responsibility. It also feels like maybe

0:05:00.876 --> 0:05:02.596
<v Speaker 2>a lot of pressure thinking about like, oh, do I

0:05:02.596 --> 0:05:05.276
<v Speaker 2>actually want to change the trajectory of my career. I'm

0:05:05.276 --> 0:05:08.516
<v Speaker 2>just like a young college student like I thought I

0:05:08.556 --> 0:05:10.876
<v Speaker 2>was going to like do these other like fun interesting things,

0:05:10.916 --> 0:05:13.916
<v Speaker 2>but maybe I actually want to like change totally what

0:05:13.916 --> 0:05:15.476
<v Speaker 2>I would work on, and that can be kind of

0:05:15.516 --> 0:05:17.196
<v Speaker 2>scary for me. I think I had some of that

0:05:17.636 --> 0:05:20.756
<v Speaker 2>maybe cognitive dissonance when I was first getting involved and

0:05:20.796 --> 0:05:22.356
<v Speaker 2>trying to think of like how much I wanted to

0:05:22.356 --> 0:05:25.676
<v Speaker 2>trade off, like what I would otherwise do if I

0:05:25.716 --> 0:05:30.356
<v Speaker 2>was like totally egoistic, versus what I wanted to prioritize

0:05:30.396 --> 0:05:33.796
<v Speaker 2>in terms of altruism and helping others and all this.

0:05:34.076 --> 0:05:39.316
<v Speaker 1>When I first heard the ideas, I had encountered them

0:05:39.356 --> 0:05:41.756
<v Speaker 1>in a very casual way because people had told me

0:05:41.796 --> 0:05:44.396
<v Speaker 1>about young people going to Wall Street to make money

0:05:44.436 --> 0:05:47.396
<v Speaker 1>to give it away, and I thought that that was

0:05:47.476 --> 0:05:51.116
<v Speaker 1>really interesting, what a curious subversion of Wall Street. But

0:05:52.196 --> 0:05:54.876
<v Speaker 1>when I first got the full blast from the people

0:05:54.876 --> 0:05:59.476
<v Speaker 1>at FTX, I remember thinking, this is so alien to

0:05:59.556 --> 0:06:03.716
<v Speaker 1>anything I heard or felt, or any impulse I had

0:06:03.756 --> 0:06:06.276
<v Speaker 1>when I was a college student, And I asked myself

0:06:06.396 --> 0:06:09.756
<v Speaker 1>like, why? And I think the answer, the honest

0:06:09.796 --> 0:06:13.356
<v Speaker 1>answer was I didn't want to change the world. I

0:06:13.476 --> 0:06:15.916
<v Speaker 1>liked the world just the way I found it. There

0:06:15.996 --> 0:06:21.316
<v Speaker 1>was a kind of idiot happiness that I experienced when

0:06:21.316 --> 0:06:23.756
<v Speaker 1>I was a college student that would have inoculated me

0:06:24.436 --> 0:06:28.836
<v Speaker 1>against any kind of intellectual assault from an effective altruist.

0:06:29.396 --> 0:06:32.596
<v Speaker 1>You probably don't really run across that as much, I bet.

0:06:32.796 --> 0:06:38.036
<v Speaker 2>Also at Stanford there's like definitely a like people call

0:06:38.076 --> 0:06:40.116
<v Speaker 2>it the Stanford bubble. It's easy to like feel like

0:06:40.156 --> 0:06:43.436
<v Speaker 2>you're in this little utopia separated from the universe, and

0:06:43.436 --> 0:06:45.596
<v Speaker 2>a lot of people just like, I don't know, life

0:06:45.636 --> 0:06:47.676
<v Speaker 2>is good here and let's just go work in fintech

0:06:47.796 --> 0:06:51.076
<v Speaker 2>or something. But also there are like quite a lot

0:06:51.076 --> 0:06:53.636
<v Speaker 2>of people here who do recognize that the world has

0:06:53.716 --> 0:06:56.316
<v Speaker 2>quite a lot of issues, and a lot of people here,

0:06:56.356 --> 0:06:58.356
<v Speaker 2>I think do want to do things about that. Whether

0:06:58.396 --> 0:07:00.516
<v Speaker 2>that's like I don't know, there's like a lot of

0:07:00.556 --> 0:07:02.996
<v Speaker 2>people interested in climate change or social justice or things

0:07:03.036 --> 0:07:05.956
<v Speaker 2>like this. I think it's very much the case that

0:07:06.036 --> 0:07:09.356
<v Speaker 2>many young people now are at least very broadly on board

0:07:09.436 --> 0:07:10.156
<v Speaker 2>with helping others.

0:07:12.116 --> 0:07:14.076
<v Speaker 1>After a short break, Gabe, Al, and I get into

0:07:14.116 --> 0:07:19.076
<v Speaker 1>what sets effective altruism apart from ordinary altruism. On Background

0:07:19.156 --> 0:07:28.876
<v Speaker 1>will be right back. I'm back with Harvard senior Al

0:07:28.956 --> 0:07:33.676
<v Speaker 1>Shin and Stanford senior Gabriel Mukobe on On Background. I would

0:07:33.716 --> 0:07:36.876
<v Speaker 1>love you both to explain to someone who's never heard

0:07:36.876 --> 0:07:41.316
<v Speaker 1>of these ideas, like, what's the difference between effective altruism

0:07:41.316 --> 0:07:44.076
<v Speaker 1>and just altruism, ordinary philanthropy?

0:07:45.356 --> 0:07:50.276
<v Speaker 2>Yeah. So, Peter Wildeford, who's one of the co-founders

0:07:50.276 --> 0:07:52.036
<v Speaker 2>of Rethink Priorities, has a recent

0:07:51.796 --> 0:07:53.356
<v Speaker 2>post that I quite like.

0:07:53.636 --> 0:07:56.116
<v Speaker 2>It's called 'EA is three radical ideas I want to protect,'

0:07:56.556 --> 0:08:00.916
<v Speaker 2>and the three ideas are radical empathy, trying to think

0:08:00.956 --> 0:08:03.676
<v Speaker 2>about the ways that society might be wrong in not

0:08:03.836 --> 0:08:07.436
<v Speaker 2>extending moral consideration to others. The second is scope sensitivity.

0:08:07.836 --> 0:08:11.036
<v Speaker 2>A lot of people are bad at actually estimating the

0:08:11.156 --> 0:08:13.436
<v Speaker 2>amount of impact certain things will have. If you like

0:08:13.476 --> 0:08:15.796
<v Speaker 2>survey people on like, hey, you could do this one

0:08:15.836 --> 0:08:19.316
<v Speaker 2>intervention and it would like save a thousand chicken lives

0:08:19.396 --> 0:08:21.116
<v Speaker 2>versus this other thing and it would save like one

0:08:21.156 --> 0:08:23.636
<v Speaker 2>hundred thousand chicken lives. And you ask the people like, hey,

0:08:23.636 --> 0:08:25.436
<v Speaker 2>how good is this thing compared to the other, and

0:08:25.436 --> 0:08:28.076
<v Speaker 2>then they, like, rate it on a one to ten scale, the difference is like, oh,

0:08:28.116 --> 0:08:29.556
<v Speaker 2>it, like, goes from like a six to a seven.

0:08:29.796 --> 0:08:32.356
<v Speaker 2>So like, emotionally, a lot of people have a hard time,

0:08:32.836 --> 0:08:38.076
<v Speaker 2>like empirically or objectively evaluating the impact of things.

0:08:38.076 --> 0:08:41.236
<v Speaker 1>Of the things they do, right? Or the money they give.

0:08:41.636 --> 0:08:45.836
<v Speaker 1>And EA is measuring this so that you have scope sensitivity.

0:08:45.516 --> 0:08:47.756
<v Speaker 2>Measuring or like at least just trying to be aware

0:08:47.796 --> 0:08:51.836
<v Speaker 2>of this effect and trying to actually do things that

0:08:52.036 --> 0:08:54.276
<v Speaker 2>seem to have more of an impact on others rather

0:08:54.356 --> 0:08:56.756
<v Speaker 2>than just like more impact on how we feel good

0:08:56.756 --> 0:08:59.796
<v Speaker 2>inside of our heads about us doing things. And then

0:08:59.836 --> 0:09:02.196
<v Speaker 2>the third thing is scout mindset. And this is the

0:09:02.236 --> 0:09:06.636
<v Speaker 2>idea of like thinking of your arguments and the ways

0:09:06.636 --> 0:09:08.956
<v Speaker 2>you approach the world, not as like soldiers. Like a

0:09:09.156 --> 0:09:11.836
<v Speaker 2>lot of people argue with each other as soldiers. Their

0:09:11.916 --> 0:09:14.036
<v Speaker 2>'arguments are soldiers' is the term, I guess, where you

0:09:14.916 --> 0:09:16.956
<v Speaker 2>are on the right side. You're like fighting for your

0:09:16.996 --> 0:09:19.556
<v Speaker 2>kingdom or whatever, and you're going to say the things

0:09:19.556 --> 0:09:21.516
<v Speaker 2>in order to win the battle. This is different from

0:09:21.516 --> 0:09:24.316
<v Speaker 2>how scouts approach the world, where you don't know about

0:09:24.556 --> 0:09:27.756
<v Speaker 2>the territory and the landscape, and you're trying to maintain

0:09:27.836 --> 0:09:31.396
<v Speaker 2>uncertainty over different possibilities and taking in different evidence and trying

0:09:31.436 --> 0:09:33.076
<v Speaker 2>to figure out what the truth is. Maybe this is

0:09:33.116 --> 0:09:35.556
<v Speaker 2>just a better way of approaching thinking about the world

0:09:35.636 --> 0:09:38.316
<v Speaker 2>that can enable us to be closer to the truth,

0:09:38.556 --> 0:09:41.916
<v Speaker 2>more honest and more objective and hopefully mess up less

0:09:41.956 --> 0:09:44.156
<v Speaker 2>when we're trying to do a lot of impactful things.

0:09:44.916 --> 0:09:46.596
<v Speaker 1>Al, can I ask you the same question, just like

0:09:46.676 --> 0:09:50.716
<v Speaker 1>if you're just explaining the difference between general do-goodism

0:09:51.276 --> 0:09:55.076
<v Speaker 1>or philanthropy and effective altruism, how do you go about it?

0:09:55.756 --> 0:10:00.316
<v Speaker 3>I think the 'effective' in effective altruism is not to contrast

0:10:00.316 --> 0:10:04.636
<v Speaker 3>it with ineffective altruism, but to make it seem like

0:10:04.836 --> 0:10:08.316
<v Speaker 3>this is altruism that is really focused on the effects

0:10:08.596 --> 0:10:12.876
<v Speaker 3>of what we're doing, that isn't about how the do

0:10:12.996 --> 0:10:17.276
<v Speaker 3>gooder necessarily feels about the actions that they're doing, but

0:10:17.916 --> 0:10:22.276
<v Speaker 3>what amount of good do those actions put into the world.

0:10:22.556 --> 0:10:24.636
<v Speaker 1>How big a part of your life is it? Is

0:10:24.636 --> 0:10:27.916
<v Speaker 1>there anything you've done differently with your life because this

0:10:27.996 --> 0:10:29.396
<v Speaker 1>movement exists?

0:10:31.156 --> 0:10:35.876
<v Speaker 3>Hmm? I think it has altered the course of what

0:10:36.116 --> 0:10:39.516
<v Speaker 3>I chose to study. I mean, I basically changed from

0:10:39.556 --> 0:10:43.956
<v Speaker 3>some interest in computer science and machine learning and protein

0:10:43.996 --> 0:10:47.796
<v Speaker 3>folding coming into college, into getting a broader statistical

0:10:47.836 --> 0:10:51.436
<v Speaker 3>and probability base that could be generalized to more problems

0:10:51.516 --> 0:10:54.916
<v Speaker 3>depending on what seemed more urgent at the time, which

0:10:55.636 --> 0:10:58.916
<v Speaker 3>is forever changing. So I thought, like a more broad

0:10:59.436 --> 0:11:02.836
<v Speaker 3>probability basis would be good for that. I think there

0:11:02.956 --> 0:11:09.036
<v Speaker 3>is some reasonable limit you have to impose on yourself

0:11:09.196 --> 0:11:13.436
<v Speaker 3>before you find yourself over-optimizing to doing the

0:11:13.476 --> 0:11:17.596
<v Speaker 3>most good and ending up not necessarily operating like a

0:11:17.676 --> 0:11:20.196
<v Speaker 3>human being, but just like a little ball of stress

0:11:20.516 --> 0:11:23.516
<v Speaker 3>that is constantly tormented with the amount of suffering that

0:11:23.516 --> 0:11:27.316
<v Speaker 3>you're putting into the world. This is something that the

0:11:27.356 --> 0:11:31.796
<v Speaker 3>effective Altruism movement is trying to discourage among especially young

0:11:31.836 --> 0:11:35.836
<v Speaker 3>people who get involved, because again the philosophically minded, in

0:11:35.836 --> 0:11:38.596
<v Speaker 3>addition to going through a nihilist phase, also go through

0:11:38.636 --> 0:11:40.476
<v Speaker 3>like several neurotic phases in their life.

0:11:40.516 --> 0:11:45.716
<v Speaker 1>Probably. What would be going too far with effective altruism

0:11:45.716 --> 0:11:46.276
<v Speaker 1>in your mind?

0:11:48.476 --> 0:11:48.756
<v Speaker 4>Hmm?

0:11:49.556 --> 0:11:52.476
<v Speaker 1>What line, when someone crosses it in the movement, do

0:11:52.516 --> 0:11:54.476
<v Speaker 1>you think, I wouldn't do that?

0:11:54.956 --> 0:11:59.556
<v Speaker 3>Something like if you're forcing yourself to do something that

0:11:59.676 --> 0:12:06.436
<v Speaker 3>you hate, like absolutely despise, because of some abstract reasoning

0:12:06.516 --> 0:12:09.676
<v Speaker 3>that this is the most good that you, in particular

0:12:09.756 --> 0:12:14.116
<v Speaker 3>can do. I think sacrificing a lot of personal happiness

0:12:14.156 --> 0:12:17.596
<v Speaker 3>for like very uncertain future outcomes when there are options

0:12:17.596 --> 0:12:19.956
<v Speaker 3>that like you could do that you would be happy

0:12:19.996 --> 0:12:24.036
<v Speaker 3>with that aren't bad. It's like not impossible that someone

0:12:24.356 --> 0:12:30.356
<v Speaker 3>hears that, oh, this particular uh, like biosecurity or pandemic

0:12:30.356 --> 0:12:33.436
<v Speaker 3>preparedness or bio risk is a really important cause area,

0:12:33.716 --> 0:12:35.916
<v Speaker 3>and then they force themselves to go work in the

0:12:35.916 --> 0:12:38.316
<v Speaker 3>wet lab even though they hate lab work. They absolutely

0:12:38.316 --> 0:12:41.196
<v Speaker 3>despise it. They don't like the, like, PPE, they don't

0:12:41.196 --> 0:12:43.916
<v Speaker 3>like the protocols, they don't like working with chemicals or organisms,

0:12:43.956 --> 0:12:47.276
<v Speaker 3>et cetera. But for some reason they have decided based

0:12:47.276 --> 0:12:49.596
<v Speaker 3>on their reasoning, that this is the most important thing

0:12:49.596 --> 0:12:52.356
<v Speaker 3>that they can do. I would say that, like, this

0:12:52.476 --> 0:12:57.556
<v Speaker 3>extreme sacrifice of personal happiness and ability to enjoy the

0:12:57.876 --> 0:13:01.476
<v Speaker 3>work that you do is probably indicative that you, like...

0:13:01.996 --> 0:13:04.956
<v Speaker 3>This is not something that the movement wants to induce

0:13:05.076 --> 0:13:07.276
<v Speaker 3>in people who are exposed to its ideas.

0:13:07.756 --> 0:13:10.196
<v Speaker 1>Right. Gabe, let me ask you this same question, like

0:13:10.676 --> 0:13:13.516
<v Speaker 1>how it's affected your life, your actual decisions you've made

0:13:13.596 --> 0:13:17.036
<v Speaker 1>or things you've done that you would not otherwise have

0:13:17.116 --> 0:13:19.196
<v Speaker 1>done if effective altruism didn't exist.

0:13:19.916 --> 0:13:22.836
<v Speaker 2>Yeah. Like, now, for my career, I'm like primarily

0:13:22.876 --> 0:13:25.996
<v Speaker 2>considering working in AI safety or policy or field building.

0:13:26.316 --> 0:13:31.036
<v Speaker 2>I think, like most directly, probably it's like accelerated my

0:13:31.516 --> 0:13:36.996
<v Speaker 2>like thinking of this and trying to critically analyze the

0:13:36.996 --> 0:13:39.396
<v Speaker 2>trajectory I want to set my personal career on and

0:13:40.356 --> 0:13:43.076
<v Speaker 2>what I should be doing to get there. So maybe

0:13:43.076 --> 0:13:44.836
<v Speaker 2>it's more of like a speed up effect than like

0:13:44.876 --> 0:13:48.596
<v Speaker 2>a drastic turn. I think it's been useful to have

0:13:48.636 --> 0:13:50.996
<v Speaker 2>both like this framework, this intellectual thing and the community.

0:13:51.116 --> 0:13:53.076
<v Speaker 2>And the community has been like really helpful and I've

0:13:53.076 --> 0:13:55.116
<v Speaker 2>made a lot of friends, and I'm like very thankful

0:13:55.156 --> 0:13:56.756
<v Speaker 2>to have a lot of people who are supporting me

0:13:56.796 --> 0:13:59.516
<v Speaker 2>and who I can talk to to discuss these very hard questions.

0:14:00.436 --> 0:14:04.316
<v Speaker 1>But this must come up in the community, Like where

0:14:04.316 --> 0:14:07.636
<v Speaker 1>do you draw the line? Yeah, yeah, definitely. And I'm

0:14:07.676 --> 0:14:11.236
<v Speaker 1>thinking not just like choice of career, but how about

0:14:11.236 --> 0:14:15.036
<v Speaker 1>a question like do you have kids? Is it a

0:14:15.116 --> 0:14:20.316
<v Speaker 1>selfish and ineffective pursuit having kids? I mean you hear

0:14:20.356 --> 0:14:21.156
<v Speaker 1>people talk about that.

0:14:22.396 --> 0:14:25.436
<v Speaker 2>Yeah, definitely. I hang out with people who are like

0:14:25.516 --> 0:14:28.236
<v Speaker 2>mostly too young to be having kids right now.

0:14:29.196 --> 0:14:31.196
<v Speaker 1>So it's not what you're thinking about yet. But it's gonna come

0:14:31.276 --> 0:14:31.716
<v Speaker 1>up at some point.

0:14:31.836 --> 0:14:32.396
<v Speaker 3>Yeah.

0:14:32.476 --> 0:14:35.156
<v Speaker 2>Yeah, there's certainly, like a lot of people writing on

0:14:35.196 --> 0:14:38.196
<v Speaker 2>the forum dot effectivealtruism dot org. There's one post

0:14:38.596 --> 0:14:40.796
<v Speaker 2>where they had decided not to have kids, and they're like, wow,

0:14:40.916 --> 0:14:42.716
<v Speaker 2>this is this is just the rational thing to do,

0:14:43.636 --> 0:14:45.356
<v Speaker 2>like it would be selfish for us to do

0:14:45.356 --> 0:14:47.996
<v Speaker 2>otherwise, and they like committed to this, and they were miserable.

0:14:48.076 --> 0:14:50.196
<v Speaker 1>Yeah. And the argument is, you'll have

0:14:50.276 --> 0:14:52.916
<v Speaker 1>less of an impact on the problems that threaten humanity

0:14:52.956 --> 0:14:56.236
<v Speaker 1>and less of an impact on total human happiness

0:14:56.516 --> 0:14:58.956
<v Speaker 1>if you have to spend so many hours raising a child.

0:14:58.996 --> 0:15:01.476
<v Speaker 2>That was the general idea, or like at least

0:15:01.476 --> 0:15:04.956
<v Speaker 2>the naive utilitarian idea perhaps, But it turns out they

0:15:04.956 --> 0:15:07.716
<v Speaker 2>were miserable. They like really didn't like it. So they

0:15:07.836 --> 0:15:10.236
<v Speaker 2>talked about it and decided like, yeah, it's actually, like

0:15:10.276 --> 0:15:12.156
<v Speaker 2>Al said, this is something we actually care about a lot.

0:15:12.156 --> 0:15:14.636
<v Speaker 2>That's like really integral to our personal happiness and to

0:15:14.676 --> 0:15:16.836
<v Speaker 2>our life satisfaction. We do want to have kids, And

0:15:16.876 --> 0:15:18.556
<v Speaker 2>they decided to have kids. And I think now they

0:15:18.556 --> 0:15:21.436
<v Speaker 2>have kids and they're much happier. They're also like more productive,

0:15:21.476 --> 0:15:24.196
<v Speaker 2>they said, and like this has actually been an increase

0:15:24.236 --> 0:15:27.596
<v Speaker 2>to their productivity and their, like, self-reported ability to

0:15:28.076 --> 0:15:31.036
<v Speaker 2>do a lot of good. So certainly, like people think

0:15:31.076 --> 0:15:32.716
<v Speaker 2>about this a lot and try to talk about it,

0:15:32.796 --> 0:15:36.596
<v Speaker 2>but it seems like there are no clear answers universally.

0:15:36.636 --> 0:15:38.556
<v Speaker 2>A lot of this stuff just depends on the individual.

0:15:38.596 --> 0:15:41.516
<v Speaker 1>Perhaps. Al, do you have anything to say about having kids?

0:15:42.356 --> 0:15:47.236
<v Speaker 3>Yeah, I would say on the flip side of this problem,

0:15:47.436 --> 0:15:50.796
<v Speaker 3>probably a lot of people have kids for the wrong

0:15:50.916 --> 0:15:56.956
<v Speaker 3>reasons too, like social pressure, familial pressure, et cetera. And

0:15:57.596 --> 0:16:00.436
<v Speaker 3>maybe being in effective altruism makes them more likely to

0:16:00.556 --> 0:16:03.836
<v Speaker 3>identify like these pressures that are causing them to make

0:16:03.876 --> 0:16:06.596
<v Speaker 3>a decision that wouldn't be good for them, either in

0:16:06.716 --> 0:16:08.916
<v Speaker 3>terms of personal happiness or even in terms of their

0:16:09.236 --> 0:16:10.196
<v Speaker 3>ability to solve the problem.

0:16:10.356 --> 0:16:13.076
<v Speaker 1>Do you all have these discussions about kids?

0:16:13.116 --> 0:16:13.556
<v Speaker 3>Not really?

0:16:13.996 --> 0:16:16.556
<v Speaker 1>Not really? What about kidneys? What about giving a kidney?

0:16:19.636 --> 0:16:22.756
<v Speaker 1>So at first, my characters had big talks about kids

0:16:22.796 --> 0:16:24.596
<v Speaker 1>and mostly came down on the side you're not supposed

0:16:24.596 --> 0:16:27.956
<v Speaker 1>to have them, they were arguing. That's where they settled, mostly.

0:16:28.316 --> 0:16:31.636
<v Speaker 1>The kidney question was another one, like do you have

0:16:31.676 --> 0:16:34.836
<v Speaker 1>an obligation to give away your kidney? Yeah?

0:16:34.956 --> 0:16:37.796
<v Speaker 2>It seems like there was like a lot of discussion about

0:16:37.836 --> 0:16:40.676
<v Speaker 2>this several years ago in the EA community, at least

0:16:40.676 --> 0:16:43.036
<v Speaker 2>like what I've read online. I was not involved, like,

0:16:43.116 --> 0:16:43.676
<v Speaker 2>back then.

0:16:44.156 --> 0:16:45.916
<v Speaker 1>You were... you're post, you're post...

0:16:45.876 --> 0:16:48.916
<v Speaker 2>Post-kidney, yeah, post. A lot of people were thinking

0:16:48.916 --> 0:16:51.476
<v Speaker 2>about this and it seems like there is like quite

0:16:51.516 --> 0:16:53.836
<v Speaker 2>a compelling case. Maybe like a lot of people just

0:16:53.956 --> 0:16:56.276
<v Speaker 2>die from being on the kidney wait list, and like

0:16:56.316 --> 0:16:59.636
<v Speaker 2>there aren't enough kidneys, and uh, seems like generally it's

0:16:59.796 --> 0:17:02.636
<v Speaker 2>somewhat safe to survive on one kidney for a lot

0:17:02.676 --> 0:17:04.876
<v Speaker 2>of people. Maybe this just like is a very direct

0:17:04.876 --> 0:17:07.876
<v Speaker 2>way to mostly kind of counterfactually save someone's life. So

0:17:07.916 --> 0:17:10.316
<v Speaker 2>I think that's like, I don't know, a pretty strong case.

0:17:10.556 --> 0:17:12.876
<v Speaker 2>I've not thought about this a lot. I've not given

0:17:12.876 --> 0:17:15.276
<v Speaker 2>a kidney. I don't know anyone else of like my

0:17:15.476 --> 0:17:18.916
<v Speaker 2>EA friends who has. It seems like, for various reasons,

0:17:18.916 --> 0:17:21.316
<v Speaker 2>this is like falling out of favor in

0:17:21.356 --> 0:17:24.036
<v Speaker 2>the conversations. It's also just like a very extreme kind

0:17:24.036 --> 0:17:27.156
<v Speaker 2>of commitment. Like, it's a very bad recruiting strategy

0:17:27.196 --> 0:17:29.116
<v Speaker 2>to be like, hey, have you heard of effective altruism?

0:17:29.356 --> 0:17:32.396
<v Speaker 2>By the way, can we have your kidney? This is

0:17:32.436 --> 0:17:34.716
<v Speaker 2>this is like I don't know, it's not the best

0:17:34.716 --> 0:17:38.436
<v Speaker 2>way to, like, approach people, I think. And yeah, maybe it's...

0:17:38.316 --> 0:17:40.676
<v Speaker 1>Really this it's kind of a second date conversation.

0:17:40.876 --> 0:17:44.036
<v Speaker 2>Yeah, you got to give it a little time, perhaps, right.

0:17:44.196 --> 0:17:48.516
<v Speaker 3>I've had some friends bring up the discussion of kidneys.

0:17:48.836 --> 0:17:53.756
<v Speaker 3>It still seems mostly like a throwaway philosophical argument

0:17:53.796 --> 0:17:58.396
<v Speaker 3>rather than necessarily a direct moral imperative. I don't think there's

0:17:58.396 --> 0:18:01.236
<v Speaker 3>ever been an instance where someone was like, I'm going

0:18:01.276 --> 0:18:03.356
<v Speaker 3>to donate my kidney unless someone gives me a good

0:18:03.436 --> 0:18:04.276
<v Speaker 3>argument to stop me.

0:18:04.556 --> 0:18:10.116
<v Speaker 1>Right. Now, I'm curious, are there still debates within EA

0:18:10.156 --> 0:18:17.476
<v Speaker 1>about focusing on immediate suffering versus focusing on big, long

0:18:17.596 --> 0:18:21.156
<v Speaker 1>term existential threats, current lives versus lives in the future?

0:18:21.756 --> 0:18:23.236
<v Speaker 3>Oh yeah, definitely, definitely.

0:18:25.196 --> 0:18:28.516
<v Speaker 1>Is it kind of still an urgent topic of conversation, Al?

0:18:28.956 --> 0:18:33.276
<v Speaker 3>I would say yes, but it's not stopping people from

0:18:33.476 --> 0:18:37.276
<v Speaker 3>doing work in either of those areas. While the debate

0:18:37.356 --> 0:18:41.076
<v Speaker 3>is not settled, certainly people are still pushing forward and

0:18:41.876 --> 0:18:47.316
<v Speaker 3>you know, solving or alleviating tuberculosis burden versus working on these

0:18:47.356 --> 0:18:49.596
<v Speaker 3>like existential threats that might happen in the future.

0:18:50.316 --> 0:18:52.396
<v Speaker 1>We're going to take a break here. When we return,

0:18:52.516 --> 0:18:55.396
<v Speaker 1>Gabe and Al talk about effective altruism and the idea

0:18:55.436 --> 0:19:05.836
<v Speaker 1>of the bank shot. It's interesting to me because

0:19:05.956 --> 0:19:08.356
<v Speaker 1>my main character in the book, Sam Bankman-Fried,

0:19:09.396 --> 0:19:13.956
<v Speaker 1>he does collide with EA in its very early incarnation,

0:19:14.556 --> 0:19:16.916
<v Speaker 1>in its first incarnation, it really

0:19:16.956 --> 0:19:20.116
<v Speaker 1>is about like taking some of your salary and giving

0:19:20.156 --> 0:19:24.756
<v Speaker 1>it to an effective charity, that is, you know, saving

0:19:24.756 --> 0:19:29.636
<v Speaker 1>the most lives in poor countries. And a year into it,

0:19:29.716 --> 0:19:33.116
<v Speaker 1>there's a kind of intellectual revolution inside of

0:19:33.156 --> 0:19:36.236
<v Speaker 1>the minds of the very people who brought him EA,

0:19:36.996 --> 0:19:39.836
<v Speaker 1>and they are starting to argue that actually the more

0:19:39.876 --> 0:19:43.636
<v Speaker 1>impactful thing to do is try to prevent humanity from

0:19:43.636 --> 0:19:46.596
<v Speaker 1>being wiped out by some pathogen or by you know,

0:19:46.716 --> 0:19:51.556
<v Speaker 1>unaligned AI or whatever it is. And I'm just

0:19:51.636 --> 0:19:54.636
<v Speaker 1>kind of wondering how alive that debate still is, or

0:19:54.636 --> 0:19:57.836
<v Speaker 1>if it's just kind of a dead letter, or if

0:19:57.876 --> 0:20:00.396
<v Speaker 1>it's affecting the way you're thinking about what you're doing

0:20:00.436 --> 0:20:01.716
<v Speaker 1>with your lives.

0:20:02.516 --> 0:20:05.956
<v Speaker 3>Yeah, I think a big unsettled question here is being

0:20:05.956 --> 0:20:10.636
<v Speaker 3>able to prioritize these very different moral responsibilities we have

0:20:10.876 --> 0:20:14.876
<v Speaker 3>around harm. So I think this is something Elizabeth Ashford

0:20:14.916 --> 0:20:18.836
<v Speaker 3>introduced a few years ago, this idea of like primary, secondary,

0:20:18.836 --> 0:20:22.796
<v Speaker 3>and tertiary moral responsibilities. Primary is you personally don't do harm,

0:20:23.036 --> 0:20:26.556
<v Speaker 3>Secondary is you prevent other people from doing harm to others,

0:20:26.756 --> 0:20:30.036
<v Speaker 3>and third is, if someone has experienced harm, you work

0:20:30.116 --> 0:20:36.396
<v Speaker 3>to alleviate that harm. So in terms of this, it's

0:20:36.476 --> 0:20:40.556
<v Speaker 3>really a very difficult prioritization scheme when you have all

0:20:40.596 --> 0:20:44.996
<v Speaker 3>these different responsibilities toward harm. And if one of these categories,

0:20:45.036 --> 0:20:48.596
<v Speaker 3>which, for example, would be like alleviating, as I mentioned before,

0:20:48.636 --> 0:20:52.676
<v Speaker 3>tuberculosis burden, which affects millions of people and is still a

0:20:52.756 --> 0:20:57.596
<v Speaker 3>leading infectious disease killer, versus preventing a pandemic from being

0:20:57.676 --> 0:21:01.156
<v Speaker 3>engineered that could kill everybody, it's very difficult to see

0:21:01.196 --> 0:21:05.836
<v Speaker 3>where the moral prioritization lines up in terms of what

0:21:06.116 --> 0:21:09.676
<v Speaker 3>you should personally work on. And it's also something that

0:21:09.876 --> 0:21:13.316
<v Speaker 3>like might not necessarily be solvable with doing like a

0:21:13.316 --> 0:21:18.676
<v Speaker 3>probability calculation, because like moral responsibility is not necessarily something

0:21:18.676 --> 0:21:20.436
<v Speaker 3>that you can like boil down to numbers.

0:21:20.956 --> 0:21:24.436
<v Speaker 1>It's also hard to make probability calculations about a lot

0:21:24.436 --> 0:21:25.676
<v Speaker 1>of the existential threats.

0:21:26.596 --> 0:21:30.076
<v Speaker 2>Yeah, it's like, certainly it's probably, like, harder. You're

0:21:30.116 --> 0:21:32.116
<v Speaker 2>like more likely to be wrong than like a randomized

0:21:32.156 --> 0:21:35.236
<v Speaker 2>controlled trial of malaria interventions. Right? At the same time,

0:21:35.276 --> 0:21:37.796
<v Speaker 2>I think a lot of people can make pretty good

0:21:38.036 --> 0:21:40.796
<v Speaker 2>educated guesses, perhaps based on the evidence we have and

0:21:40.836 --> 0:21:43.236
<v Speaker 2>the reasoning we have about things. Like, a lot of

0:21:43.276 --> 0:21:46.956
<v Speaker 2>people in the EA movement perhaps predicted something like COVID

0:21:47.076 --> 0:21:48.556
<v Speaker 2>before it happened. A lot of people were, like, raising

0:21:48.596 --> 0:21:50.636
<v Speaker 2>the fire alarms of we should have much better pandemic

0:21:50.676 --> 0:21:55.796
<v Speaker 2>preparedness and monitoring and rapid vaccine development programs, and then

0:21:55.996 --> 0:21:58.196
<v Speaker 2>they like, for various reasons, messed up and weren't listened

0:21:58.196 --> 0:22:01.996
<v Speaker 2>to, and COVID happened anyway. Maybe like similar things are

0:22:01.996 --> 0:22:05.036
<v Speaker 2>happening with AI risk now too, where if you

0:22:05.076 --> 0:22:08.076
<v Speaker 2>survey like leading machine learning experts who are like not

0:22:08.156 --> 0:22:10.836
<v Speaker 2>focused on these risks and are just like people publishing

0:22:10.876 --> 0:22:13.756
<v Speaker 2>top papers at some of the top machine learning conferences.

0:22:14.196 --> 0:22:17.636
<v Speaker 2>The result from the most recent AI Impacts survey was,

0:22:17.676 --> 0:22:20.476
<v Speaker 2>like the median respondent gives like a ten percent chance

0:22:20.556 --> 0:22:24.116
<v Speaker 2>that the development of AI will like destroy everyone, like

0:22:24.156 --> 0:22:26.556
<v Speaker 2>result in catastrophic outcomes, including human extinction.

0:22:26.796 --> 0:22:28.796
<v Speaker 1>They're real things. They're possibly real.

0:22:28.876 --> 0:22:30.716
<v Speaker 3>I definitely agree with Gabe on this point that the

0:22:31.516 --> 0:22:35.236
<v Speaker 3>difficulty and maybe the lack of clarity on an exact

0:22:35.916 --> 0:22:38.996
<v Speaker 3>number of people that might be killed by AI in

0:22:38.996 --> 0:22:42.116
<v Speaker 3>the future is not a counter argument to the fact

0:22:42.156 --> 0:22:45.796
<v Speaker 3>that this could be a really big problem. Like, the uncertainty

0:22:45.876 --> 0:22:49.676
<v Speaker 3>isn't that like there's a super super super high probability

0:22:49.716 --> 0:22:52.636
<v Speaker 3>that nothing is going to happen. There's definitely lots of

0:22:53.116 --> 0:22:57.076
<v Speaker 3>fairly fairly good analysis on things that may come to

0:22:57.116 --> 0:23:00.196
<v Speaker 3>pass if we don't do anything to regulate or

0:23:00.476 --> 0:23:01.596
<v Speaker 3>solve certain issues.

0:23:02.036 --> 0:23:04.316
<v Speaker 1>The characters in my book did this bank shot that

0:23:04.916 --> 0:23:08.116
<v Speaker 1>effective altruism kind of dreamed up, the whole earn to

0:23:08.116 --> 0:23:10.796
<v Speaker 1>give idea, that you go do a job that you

0:23:10.876 --> 0:23:13.356
<v Speaker 1>might not necessarily do to maximize the dollars you make

0:23:13.396 --> 0:23:15.636
<v Speaker 1>that you can then give away to people like you

0:23:15.876 --> 0:23:17.036
<v Speaker 1>so you can do what you're doing.

0:23:17.716 --> 0:23:20.556
<v Speaker 2>Yeah, so certainly a lot of early EA stuff was

0:23:20.636 --> 0:23:23.156
<v Speaker 2>kind of focused on, hey, we're just a very small movement,

0:23:23.796 --> 0:23:25.876
<v Speaker 2>maybe we should like figure out ways to more effectively

0:23:25.916 --> 0:23:30.556
<v Speaker 2>direct philanthropic funds towards more impactful organizations. Over time, as

0:23:30.596 --> 0:23:32.956
<v Speaker 2>the community has grown, as people have engaged in these

0:23:32.996 --> 0:23:35.276
<v Speaker 2>ideas further, now people are like, hey, what if we

0:23:35.316 --> 0:23:37.716
<v Speaker 2>are the philanthropic organizations, what if we actually do this

0:23:37.756 --> 0:23:39.836
<v Speaker 2>direct work and try to have more of an impact.

0:23:40.516 --> 0:23:43.676
<v Speaker 2>There's certainly like a lot of current issues, like a

0:23:43.676 --> 0:23:46.636
<v Speaker 2>lot of, perhaps, global health and development and animal welfare,

0:23:46.676 --> 0:23:48.196
<v Speaker 2>things that can just benefit from a lot of money.

0:23:48.196 --> 0:23:50.756
<v Speaker 2>Maybe that's where a lot of this funding still goes to.

0:23:50.956 --> 0:23:53.276
<v Speaker 2>But certainly there are, like, a lot of,

0:23:53.356 --> 0:23:55.916
<v Speaker 2>within those fields and within other causes, there's like still

0:23:55.916 --> 0:23:58.476
<v Speaker 2>a very big need for people doing direct work, and

0:23:58.516 --> 0:24:00.076
<v Speaker 2>so it's maybe more of the current focus.

0:24:02.036 --> 0:24:06.356
<v Speaker 1>Is that true at Harvard too, Al? I would say so.

0:24:06.396 --> 0:24:10.796
<v Speaker 3>Certainly we do have some aspects of even our introductory

0:24:10.796 --> 0:24:15.876
<v Speaker 3>fellowship that is more of this classic, if you have

0:24:16.036 --> 0:24:20.116
<v Speaker 3>like spare money and a lot of people would benefit

0:24:20.156 --> 0:24:24.196
<v Speaker 3>from donating it, then you probably should and you should

0:24:24.236 --> 0:24:26.516
<v Speaker 3>make sure it goes to highly effective charities that can

0:24:26.956 --> 0:24:29.116
<v Speaker 3>make that money go as far as possible and save

0:24:29.116 --> 0:24:33.716
<v Speaker 3>the most people. What we don't really encourage is making

0:24:35.036 --> 0:24:38.036
<v Speaker 3>lots of money and then expecting people to donate it

0:24:38.316 --> 0:24:43.116
<v Speaker 3>for a lot of the same reasons that Gabe has outlined. Generally,

0:24:43.196 --> 0:24:46.916
<v Speaker 3>like the people who are coming into Harvard EA tend

0:24:46.916 --> 0:24:49.716
<v Speaker 3>to have like very specialized skill sets and ability to

0:24:49.796 --> 0:24:55.516
<v Speaker 3>acquire like lots of really, really cool skills, basically,

0:24:55.916 --> 0:24:59.476
<v Speaker 3>and if we didn't, like, direct them to doing

0:24:59.596 --> 0:25:01.916
<v Speaker 3>like the very best things that they could do with

0:25:01.956 --> 0:25:06.116
<v Speaker 3>like these very specific and unique talents, then this would

0:25:06.156 --> 0:25:08.676
<v Speaker 3>also be like quite a waste.

0:25:08.796 --> 0:25:11.396
<v Speaker 1>It's interesting because it's very different from what was

0:25:11.436 --> 0:25:14.636
<v Speaker 1>being pitched by the leaders of effective altruism back when

0:25:14.636 --> 0:25:17.236
<v Speaker 1>Sam Bankman-Fried was at MIT. He listened to a speech where

0:25:17.436 --> 0:25:20.396
<v Speaker 1>it was really encouraging the audience to go to Wall

0:25:20.396 --> 0:25:23.156
<v Speaker 1>Street or go into some high paying job and channel

0:25:23.196 --> 0:25:26.436
<v Speaker 1>the dollars. You know, instead of being a doctor in Africa,

0:25:26.516 --> 0:25:31.796
<v Speaker 1>you could pay for ten doctors in Africa. And

0:25:31.636 --> 0:25:33.716
<v Speaker 1>that idea is not kicking around in the same way.

0:25:34.436 --> 0:25:37.316
<v Speaker 2>I think it's also much harder perhaps now to be

0:25:37.396 --> 0:25:39.756
<v Speaker 2>like really successful at earning to give compared to some

0:25:39.756 --> 0:25:42.196
<v Speaker 2>of this other stuff. Like maybe earning to give is

0:25:42.236 --> 0:25:44.636
<v Speaker 2>much more heavy tailed in the sense that it's just

0:25:44.676 --> 0:25:47.236
<v Speaker 2>the very few people who make like, like the very

0:25:47.236 --> 0:25:49.596
<v Speaker 2>extreme amounts of money who can like pay for a

0:25:49.596 --> 0:25:50.396
<v Speaker 2>lot of this stuff.

0:25:50.476 --> 0:25:53.036
<v Speaker 1>It's funny because the earn-to-give idea originates with

0:25:53.076 --> 0:25:56.316
<v Speaker 1>Toby Ord doing it himself, and he's on

0:25:56.356 --> 0:26:00.596
<v Speaker 1>an Oxford professor's salary and arguing and making the point

0:26:00.596 --> 0:26:04.116
<v Speaker 1>that just by donating an oversized chunk of his salary

0:26:04.156 --> 0:26:07.156
<v Speaker 1>not so much that he couldn't survive, that he would

0:26:07.196 --> 0:26:10.716
<v Speaker 1>save like eighty thousand children from blindness in Africa, and

0:26:11.076 --> 0:26:15.196
<v Speaker 1>that idea was intoxicating to Sam and to the

0:26:15.196 --> 0:26:17.876
<v Speaker 1>people around him, but it doesn't sound like it's really

0:26:18.276 --> 0:26:20.996
<v Speaker 1>as infectious with you all. The reason that's odd

0:26:21.036 --> 0:26:22.716
<v Speaker 1>to me is that you're both sitting in institutions that

0:26:22.756 --> 0:26:25.996
<v Speaker 1>are pipelines to very high paying jobs in tech and finance,

0:26:26.596 --> 0:26:28.476
<v Speaker 1>and it's sort of like

0:26:28.956 --> 0:26:30.316
<v Speaker 1>it would be a natural step.

0:26:30.796 --> 0:26:33.196
<v Speaker 2>So I wouldn't say it's like gone away, but I

0:26:33.196 --> 0:26:35.996
<v Speaker 2>would like distinguish that kind of earning to give from

0:26:36.036 --> 0:26:38.396
<v Speaker 2>like the oh, let's actually try to like get into

0:26:38.436 --> 0:26:40.836
<v Speaker 2>the very tail end of the highest paying jobs, particularly

0:26:40.916 --> 0:26:42.476
<v Speaker 2>not through the thing you're going to do otherwise, but

0:26:42.916 --> 0:26:44.916
<v Speaker 2>do the thing that earns the most money possible

0:26:45.076 --> 0:26:47.236
<v Speaker 2>and donate. And that seems like much much harder to

0:26:47.276 --> 0:26:51.156
<v Speaker 2>me to succeed at compared to like directing people to

0:26:51.276 --> 0:26:53.756
<v Speaker 2>direct work in fields that they might be really competent at.

0:26:55.116 --> 0:26:57.516
<v Speaker 1>For a brief period, Sam Bankman-Fried made it look

0:26:57.596 --> 0:27:01.476
<v Speaker 1>very easy. And I'm wondering, when FTX blew up, if

0:27:01.516 --> 0:27:05.116
<v Speaker 1>it reverberated around your organizations and people talked about it.

0:27:05.796 --> 0:27:10.836
<v Speaker 3>Yeah, there were definitely a lot of shock waves that

0:27:10.876 --> 0:27:17.276
<v Speaker 3>went through every organization affiliated with Effective Altruism in some way,

0:27:18.276 --> 0:27:22.156
<v Speaker 3>some more than others. It really did cause a lot

0:27:22.196 --> 0:27:26.596
<v Speaker 3>of people, including college organizers, to step back and just

0:27:26.716 --> 0:27:31.156
<v Speaker 3>feel like, why did this happen, basically? In terms

0:27:31.156 --> 0:27:35.516
<v Speaker 3>of, like, how responsible should the EA community as

0:27:35.556 --> 0:27:38.916
<v Speaker 3>a whole be for making this happen, as like one

0:27:38.956 --> 0:27:43.116
<v Speaker 3>guy who got affiliated with EA and is like now

0:27:43.156 --> 0:27:47.876
<v Speaker 3>in a position of huge monetary influence, felt empowered to

0:27:47.916 --> 0:27:50.596
<v Speaker 3>make a lot of like very, very poor decisions?

0:27:51.196 --> 0:27:52.916
<v Speaker 1>Gabe, do you have some thoughts on this?

0:27:53.396 --> 0:27:56.596
<v Speaker 2>Yeah. Like, certainly that resonates a lot there if

0:27:56.596 --> 0:28:00.676
<v Speaker 2>you, especially, like, look online. So people in EA really

0:28:00.676 --> 0:28:02.756
<v Speaker 2>like the scout mindset thing, and like a part of

0:28:02.756 --> 0:28:04.916
<v Speaker 2>that is considering where you might be wrong, and so

0:28:04.996 --> 0:28:07.396
<v Speaker 2>maybe effective altruism more than a lot of other communities

0:28:07.436 --> 0:28:09.876
<v Speaker 2>really likes to criticize itself, try to think of ways

0:28:09.876 --> 0:28:12.356
<v Speaker 2>we're wrong in order to improve our, like, our

0:28:12.556 --> 0:28:15.756
<v Speaker 2>strategy and framework and whatever. So there certainly, like has

0:28:15.796 --> 0:28:17.756
<v Speaker 2>been a lot of online discourse about this, a lot

0:28:17.756 --> 0:28:20.916
<v Speaker 2>of people, especially like older people in the community discussing

0:28:20.956 --> 0:28:22.876
<v Speaker 2>like, whoa, is this our fault? Like Al said, did

0:28:23.036 --> 0:28:24.996
<v Speaker 2>did we go wrong? Is this EA's fault? Or is

0:28:25.116 --> 0:28:27.396
<v Speaker 2>Sam just a bad person and we like missed him

0:28:27.476 --> 0:28:31.916
<v Speaker 2>or something. Certainly, like, the online discussion within the

0:28:31.956 --> 0:28:34.676
<v Speaker 2>EA community seems to have been like even larger than

0:28:34.716 --> 0:28:37.676
<v Speaker 2>the media discussion of this. Like I don't know, maybe

0:28:37.836 --> 0:28:39.916
<v Speaker 2>the news cycle just moves fast and people move on

0:28:39.956 --> 0:28:42.876
<v Speaker 2>to new things, but like it feels like still there are,

0:28:42.956 --> 0:28:45.196
<v Speaker 2>like every couple of weeks or something, new community

0:28:45.236 --> 0:28:47.836
<v Speaker 2>posts about like hey, how could we have prevented the

0:28:47.876 --> 0:28:51.436
<v Speaker 2>FTX situation? But among like college students, it seems like

0:28:51.476 --> 0:28:53.796
<v Speaker 2>that's quite different. There are, like, some of our

0:28:53.916 --> 0:28:56.356
<v Speaker 2>like new EA intro fellows who were, like, doing the

0:28:56.356 --> 0:28:59.116
<v Speaker 2>fellowship in the fall when this happened. I was like, oh,

0:28:59.156 --> 0:29:01.196
<v Speaker 2>so, did you hear about the news about FTX

0:29:01.236 --> 0:29:03.236
<v Speaker 2>and SBF? What do you think about that? And they're like, oh,

0:29:03.276 --> 0:29:04.276
<v Speaker 2>who's Sam Bankman-Fried?

0:29:04.516 --> 0:29:07.196
<v Speaker 1>Have you heard anybody in these forums or anybody in

0:29:07.236 --> 0:29:11.116
<v Speaker 1>your groups try to justify Sam's behavior, saying that actually

0:29:11.156 --> 0:29:15.996
<v Speaker 1>he's Robin Hood, that we don't mind him

0:29:16.036 --> 0:29:19.436
<v Speaker 1>having tried to take this risk with depositors' money

0:29:19.476 --> 0:29:21.116
<v Speaker 1>because the money was going to go to these other

0:29:21.196 --> 0:29:25.316
<v Speaker 1>things and it could have worked out, don't blame him?

0:29:25.076 --> 0:29:28.356
<v Speaker 2>People in effective altruism seem to want to always try

0:29:28.356 --> 0:29:30.556
<v Speaker 2>to consider both sides of any kind of situation and

0:29:30.596 --> 0:29:32.516
<v Speaker 2>try to, like, yep, as Al said, like, think about

0:29:32.556 --> 0:29:35.756
<v Speaker 2>a thing and then immediately think about the counter-arguments. Certainly,

0:29:35.796 --> 0:29:38.956
<v Speaker 2>when FTX first happened, I remember seeing some comments

0:29:38.996 --> 0:29:41.036
<v Speaker 2>about like, oh, maybe maybe this was because of X

0:29:41.116 --> 0:29:44.396
<v Speaker 2>reasons and Y, or at least like we should be

0:29:44.476 --> 0:29:46.676
<v Speaker 2>like cutting Sam some slack because he must feel like

0:29:46.716 --> 0:29:50.396
<v Speaker 2>really shitty right now, which is probably true. I think generally,

0:29:50.436 --> 0:29:53.316
<v Speaker 2>now the sentiment has changed a lot, and everyone's like, Wow,

0:29:53.556 --> 0:29:56.116
<v Speaker 2>you really messed up. This seems like very clearly bad.

0:29:58.076 --> 0:30:00.996
<v Speaker 1>This was hugely useful to me, more useful than you know.

0:30:01.516 --> 0:30:04.356
<v Speaker 1>And I really appreciate you giving me the time. Thanks

0:30:04.436 --> 0:30:04.796
<v Speaker 1>so much.

0:30:05.676 --> 0:30:07.676
<v Speaker 2>Yeah, thank you too. Thank you so much for chatting, Michael.

0:30:07.756 --> 0:30:12.836
<v Speaker 1>Yeah, yeah, all right, bye-bye. Al Shin is a

0:30:12.876 --> 0:30:16.516
<v Speaker 1>senior at Harvard, and Gabriel Mukobe is a senior at Stanford.

0:30:18.156 --> 0:30:19.916
<v Speaker 1>Let's end today with a letter we got from a

0:30:19.916 --> 0:30:22.636
<v Speaker 1>listener about a little snippet of conversation I had in

0:30:22.676 --> 0:30:25.836
<v Speaker 1>the previous episode. I was chatting with Molly White, the

0:30:25.916 --> 0:30:29.476
<v Speaker 1>noted crypto critic. Molly got her start in public life

0:30:29.476 --> 0:30:32.276
<v Speaker 1>at the age of thirteen when she edited a Wikipedia

0:30:32.356 --> 0:30:36.156
<v Speaker 1>article about unicycles. At the very end of the episode,

0:30:36.476 --> 0:30:38.956
<v Speaker 1>we tucked in the little exchange that I had with Molly,

0:30:39.516 --> 0:30:42.436
<v Speaker 1>mainly because my producers thought it was funny. We call

0:30:42.476 --> 0:30:45.356
<v Speaker 1>this an easter egg. It ran after the episode's credits,

0:30:45.396 --> 0:30:47.196
<v Speaker 1>so probably a lot of you didn't even hear it.

0:30:49.356 --> 0:30:52.076
<v Speaker 1>What did you edit on the unicycle Wikipedia page?

0:30:52.916 --> 0:30:57.156
<v Speaker 4>I'm pretty sure that I added information about the existence

0:30:57.356 --> 0:31:00.756
<v Speaker 4>of other types of unicycles. You know, so there's like

0:31:00.876 --> 0:31:04.076
<v Speaker 4>the very standard unicycle, but there's also ones that like

0:31:04.476 --> 0:31:08.596
<v Speaker 4>are meant for going off-road, and there are ones that

0:31:08.636 --> 0:31:12.396
<v Speaker 4>have like really big wheels that let you go long distance.

0:31:12.876 --> 0:31:17.316
<v Speaker 1>You know, an off-road unicycle is to

0:31:17.956 --> 0:31:24.036
<v Speaker 1>cycling what bitcoin is to money. It's

0:31:24.076 --> 0:31:26.596
<v Speaker 1>sort of like a solution in search of the problem.

0:31:26.676 --> 0:31:32.796
<v Speaker 1>Absolutely. Well, listener Kris Holm heard that, and she wrote

0:31:32.836 --> 0:31:37.196
<v Speaker 1>in. Her professional title is writer-in-chief at Kris

0:31:37.236 --> 0:31:41.676
<v Speaker 1>Holm Unicycles Limited. Kris writes, I think Michael might need

0:31:41.716 --> 0:31:44.996
<v Speaker 1>a bit of context on mountain unicycling. It is hardly

0:31:45.076 --> 0:31:48.156
<v Speaker 1>the Bitcoin of mountain biking. Even if he doesn't find

0:31:48.156 --> 0:31:51.076
<v Speaker 1>the sport interesting, I suspect he might find it intriguing

0:31:51.196 --> 0:31:53.676
<v Speaker 1>to find the reality of the sport so very different

0:31:53.996 --> 0:31:58.836
<v Speaker 1>from the perception. Well, Kris, I want to apologize, because

0:31:58.836 --> 0:32:02.316
<v Speaker 1>I never want to offend anybody. But actually, when I

0:32:02.356 --> 0:32:06.796
<v Speaker 1>said that, the response I got was even more disturbed

0:32:06.796 --> 0:32:09.716
<v Speaker 1>than yours was from bitcoiners who didn't want bitcoin to

0:32:09.756 --> 0:32:14.156
<v Speaker 1>be seen as off-road unicycling. So I got

0:32:14.156 --> 0:32:16.236
<v Speaker 1>it from both sides. I obviously should not have made

0:32:16.236 --> 0:32:22.276
<v Speaker 1>this connection. Obviously, like, off-road unicycling is to cycling what

0:32:22.476 --> 0:32:24.516
<v Speaker 1>something else is to something else, but not what bitcoin is

0:32:24.516 --> 0:32:29.396
<v Speaker 1>to money. And one day, if I summon the nerve,

0:32:29.956 --> 0:32:32.356
<v Speaker 1>I might get on an actual unicycle. I kind of

0:32:32.396 --> 0:32:34.876
<v Speaker 1>doubt I'm ever going to get to the mountain unicycling,

0:32:35.436 --> 0:32:38.036
<v Speaker 1>and I kind of doubt that I'm ever going to

0:32:38.076 --> 0:32:40.556
<v Speaker 1>have a reason to actually explore the reality of the sport.

0:32:41.556 --> 0:32:45.196
<v Speaker 1>So I apologize. I apologize for being so cavalier about

0:32:45.196 --> 0:32:48.916
<v Speaker 1>something that is obviously important to other people. If you

0:32:48.956 --> 0:32:51.956
<v Speaker 1>have a question, I'd love to hear it. You can

0:32:51.996 --> 0:32:54.436
<v Speaker 1>contact me with it by clicking on our show notes

0:32:54.636 --> 0:33:01.356
<v Speaker 1>or going to ATR podcast dot com. On Background is

0:33:01.396 --> 0:33:04.876
<v Speaker 1>hosted by me Michael Lewis, and produced by Catherine Girardeau

0:33:05.036 --> 0:33:09.596
<v Speaker 1>and Lydia Jean Kott. Our editor is Julia Barton. Our

0:33:09.636 --> 0:33:14.516
<v Speaker 1>engineer is Sarah Bruguier. Thanks to our SVP of production,

0:33:14.796 --> 0:33:18.636
<v Speaker 1>Greta Cohn. Our show is recorded by Topher Ruth at

0:33:18.636 --> 0:33:22.996
<v Speaker 1>Berkeley Advanced Media Studios. Our music was composed by Matthias

0:33:23.036 --> 0:33:27.316
<v Speaker 1>Bossi and John Evans of Stellwagen Symphonette. My old

0:33:27.316 --> 0:33:31.396
<v Speaker 1>friend Nick Britell composed our theme song. On Background is

0:33:31.436 --> 0:33:35.916
<v Speaker 1>a production of Pushkin Industries. To find more Pushkin podcasts,

0:33:36.476 --> 0:33:40.156
<v Speaker 1>listen on the iHeartRadio app, Apple Podcasts or wherever you

0:33:40.196 --> 0:33:43.356
<v Speaker 1>listen to podcasts, and if you'd like to listen ad

0:33:43.356 --> 0:33:47.356
<v Speaker 1>free and learn about other exclusive offerings, don't forget to

0:33:47.396 --> 0:33:51.116
<v Speaker 1>sign up for a Pushkin Plus subscription at Pushkin dot fm

0:33:51.156 --> 0:34:00.636
<v Speaker 1>slash Plus, or on our Apple show page. Al, I'm

0:34:00.636 --> 0:34:04.236
<v Speaker 1>going to be at your graduation. Oh, because you're,

0:34:04.596 --> 0:34:06.796
<v Speaker 1>because you're graduating with Quinn. Have you seen her around?

0:34:07.036 --> 0:34:09.596
<v Speaker 3>I have not used the dance studio recently, which

0:34:09.596 --> 0:34:13.076
<v Speaker 3>is where I would most likely see Quinn. When Quinn

0:34:13.156 --> 0:34:15.796
<v Speaker 3>knew me in the First-Year Arts Program, I had

0:34:15.836 --> 0:34:19.756
<v Speaker 3>not started dancing. I started breakdancing in the fall after

0:34:20.596 --> 0:34:21.356
<v Speaker 3>the program.

0:34:21.436 --> 0:34:23.796
<v Speaker 1>Do you consider yourself an effective breakdancer?

0:34:27.756 --> 0:34:29.916
<v Speaker 3>I think this falls squarely in the realm of things

0:34:29.956 --> 0:34:33.036
<v Speaker 3>that I'm not optimizing in my life.