WEBVTT - Grand Theft Automated: How to Save a Trillion Lives

0:00:15.410 --> 0:00:15.890
<v Speaker 1>Pushkin.

0:00:24.930 --> 0:00:29.450
<v Speaker 2>In a penthouse apartment in the Bahamas, a billionaire is

0:00:29.530 --> 0:00:33.970
<v Speaker 2>hosting a meeting. It's the kind of place you might

0:00:34.050 --> 0:00:38.810
<v Speaker 2>expect to find a billionaire, marble floors, a grand piano,

0:00:39.530 --> 0:00:43.450
<v Speaker 2>a balcony with a hot tub, and views of the marina.

0:00:44.890 --> 0:00:47.930
<v Speaker 2>As the billionaire and his colleagues debate what to do

0:00:47.970 --> 0:00:51.730
<v Speaker 2>with his money, they look to one man in particular

0:00:52.210 --> 0:00:59.090
<v Speaker 2>for wise advice. A business analyst? A financial expert? No,

0:00:59.610 --> 0:01:06.650
<v Speaker 2>a moral philosopher who's devoted his career to thinking about altruism.

0:01:07.810 --> 0:01:12.050
<v Speaker 2>This is the second of two cautionary tales about altruism.

0:01:12.650 --> 0:01:16.530
<v Speaker 2>In the first, we heard about a scientist called George Price,

0:01:17.050 --> 0:01:20.970
<v Speaker 2>who helped to unravel the mystery of how evolution produced

0:01:21.090 --> 0:01:27.170
<v Speaker 2>altruistic behavior, and who then became the most extreme altruist

0:01:27.370 --> 0:01:31.490
<v Speaker 2>you could imagine, giving away his last penny and the

0:01:31.570 --> 0:01:37.450
<v Speaker 2>coat on his back. But wait, perhaps the billionaire in

0:01:37.490 --> 0:01:42.810
<v Speaker 2>the Bahamas is an even more extreme altruist. The only

0:01:42.850 --> 0:01:45.370
<v Speaker 2>reason he ever wanted to make money was to give

0:01:45.410 --> 0:01:49.210
<v Speaker 2>it away. He looks more like a student than a billionaire,

0:01:49.570 --> 0:01:54.130
<v Speaker 2>with his baggy cargo shorts, crumpled T-shirt, and disheveled hair.

0:01:55.050 --> 0:01:58.570
<v Speaker 2>He's turned his penthouse into a dorm room. There are

0:01:58.650 --> 0:02:03.050
<v Speaker 2>bean bags for napping on, monitor wires trailing haphazardly across

0:02:03.090 --> 0:02:06.810
<v Speaker 2>the marble floor, a cheap bookcase full of board games,

0:02:07.250 --> 0:02:12.810
<v Speaker 2>a freezer stuffed with three dollar vegetable biryani from Trader Joe's.

0:02:13.610 --> 0:02:17.490
<v Speaker 2>Over the next year, he wants to give away a

0:02:17.650 --> 0:02:21.050
<v Speaker 2>billion dollars, and he wants to do it as effectively

0:02:21.290 --> 0:02:27.090
<v Speaker 2>as possible, hence the moral philosopher. The year is twenty

0:02:27.370 --> 0:02:33.690
<v Speaker 2>twenty two. The billionaire's name is Sam Bankman-Fried, and

0:02:33.850 --> 0:02:38.250
<v Speaker 2>his altruistic activities are about to be interrupted by arrest

0:02:38.970 --> 0:02:53.050
<v Speaker 2>and imprisonment. I'm Tim Harford, and you're listening to Cautionary Tales.

0:03:11.290 --> 0:03:17.330
<v Speaker 2>It's nineteen seventy two. In London, George Price is spiraling,

0:03:18.290 --> 0:03:21.650
<v Speaker 2>convinced that Jesus wants him to give all his possessions

0:03:21.690 --> 0:03:25.810
<v Speaker 2>to homeless people. Fifty miles up the road, in Oxford,

0:03:26.290 --> 0:03:32.490
<v Speaker 2>a philosopher called Peter Singer publishes an essay titled Famine,

0:03:32.530 --> 0:03:38.050
<v Speaker 2>Affluence and Morality. Singer asks us to imagine that we're

0:03:38.170 --> 0:03:41.810
<v Speaker 2>walking along past a muddy pond, going about our day,

0:03:42.490 --> 0:03:45.730
<v Speaker 2>when we see that in the pond a small child

0:03:45.850 --> 0:03:50.810
<v Speaker 2>is drowning. The pond is shallow. We could easily wade in

0:03:50.890 --> 0:03:54.930
<v Speaker 2>and save the child, but that would ruin the nice

0:03:55.010 --> 0:03:56.170
<v Speaker 2>new clothes we're wearing.

0:03:57.170 --> 0:03:57.890
<v Speaker 1>What do we do?

0:03:58.890 --> 0:04:02.650
<v Speaker 2>Of course, of course we wade in and save the child.

0:04:02.930 --> 0:04:05.890
<v Speaker 1>If we didn't, we'd never be able to live with ourselves.

0:04:07.170 --> 0:04:11.770
<v Speaker 2>But think about this, says Singer. Across the world, a

0:04:11.810 --> 0:04:16.250
<v Speaker 2>small child is dying from hunger. We could save that

0:04:16.490 --> 0:04:21.090
<v Speaker 2>child's life by giving money to charity, less than the

0:04:21.130 --> 0:04:23.690
<v Speaker 2>cost of the nice new clothes we were willing to ruin.

0:04:24.850 --> 0:04:28.290
<v Speaker 2>Surely our obligation to give to the charity is just

0:04:28.370 --> 0:04:31.250
<v Speaker 2>as strong as our obligation to wade into the pond.

0:04:32.330 --> 0:04:35.850
<v Speaker 2>Morally speaking, it doesn't matter if the child we can

0:04:35.970 --> 0:04:38.930
<v Speaker 2>save is right there in front of us or ten

0:04:39.010 --> 0:04:44.970
<v Speaker 2>thousand miles away. If you follow Singer's logic, spending money

0:04:45.010 --> 0:04:48.810
<v Speaker 2>on nice clothes instead of donating it to starving children

0:04:49.370 --> 0:04:53.850
<v Speaker 2>is just as immoral as walking past the drowning child

0:04:53.890 --> 0:04:58.970
<v Speaker 2>in the pond. Follow the logic further, and as long

0:04:59.010 --> 0:05:03.250
<v Speaker 2>as there's one starving child in the world, it's immoral

0:05:03.370 --> 0:05:06.970
<v Speaker 2>to own anything we don't really need. We should keep

0:05:07.050 --> 0:05:10.050
<v Speaker 2>on giving until we're only just better off than the

0:05:10.090 --> 0:05:16.050
<v Speaker 2>starving child ourselves. Nobody lives like this, of course. Well,

0:05:16.850 --> 0:05:22.010
<v Speaker 2>nobody except George Price. Peter Singer's essay became a fixture

0:05:22.210 --> 0:05:28.090
<v Speaker 2>in undergraduate philosophy classes, including the one I took. Students

0:05:28.210 --> 0:05:32.930
<v Speaker 2>tended to have one of two reactions. Either they tried

0:05:32.970 --> 0:05:37.450
<v Speaker 2>to find some flaw in Singer's logic, or they conceded

0:05:37.490 --> 0:05:41.610
<v Speaker 2>that Singer might be right, but shoved that thought firmly

0:05:41.650 --> 0:05:44.090
<v Speaker 2>to the back of their minds so they could resume

0:05:44.210 --> 0:05:48.410
<v Speaker 2>their normal lives without constantly thinking about all the starving

0:05:48.610 --> 0:05:56.810
<v Speaker 2>children they were thereby condemning to death. Will MacAskill was different.

0:05:57.770 --> 0:06:02.690
<v Speaker 2>In two thousand and five, aged eighteen, MacAskill read Peter

0:06:02.850 --> 0:06:09.050
<v Speaker 2>Singer's essay. He thought Singer was clearly right and decided

0:06:09.170 --> 0:06:12.410
<v Speaker 2>he should walk the walk by giving what he could

0:06:13.370 --> 0:06:17.530
<v Speaker 2>as a student. That wasn't easy. Students never have much money,

0:06:17.930 --> 0:06:21.930
<v Speaker 2>and MacAskill also wanted to make friends. He tried

0:06:21.970 --> 0:06:24.970
<v Speaker 2>to compromise. When his friends went to the pub for

0:06:25.050 --> 0:06:30.530
<v Speaker 2>a drink, MacAskill ordered tap water, then quietly refilled the

0:06:30.610 --> 0:06:36.210
<v Speaker 2>glass with cheap lager he'd brought from the store. MacAskill

0:06:36.330 --> 0:06:39.890
<v Speaker 2>got his degree in philosophy and a job in academia.

0:06:41.210 --> 0:06:44.730
<v Speaker 2>He decided that the first twenty six thousand pounds a

0:06:44.810 --> 0:06:47.130
<v Speaker 2>year of his salary would be enough to live on,

0:06:47.850 --> 0:06:52.170
<v Speaker 2>about thirty three thousand dollars. Anything he earned above that,

0:06:53.050 --> 0:06:57.250
<v Speaker 2>he'd give away. He researched the most effective ways to

0:06:57.330 --> 0:07:01.930
<v Speaker 2>donate. Some charitable causes, it turns out, give you far

0:07:02.010 --> 0:07:06.770
<v Speaker 2>more bang for your buck than others. Bednets, for example,

0:07:07.490 --> 0:07:11.370
<v Speaker 2>save lives in countries with malaria by stopping mosquitoes from

0:07:11.450 --> 0:07:15.890
<v Speaker 2>biting you while you sleep. By one estimate, around three

0:07:16.090 --> 0:07:23.170
<v Speaker 2>thousand dollars spent on bednets would save one life. MacAskill

0:07:23.290 --> 0:07:29.330
<v Speaker 2>met others who shared his ideas. A movement emerged. MacAskill

0:07:29.370 --> 0:07:32.410
<v Speaker 2>and his colleagues asked themselves what it should be called,

0:07:32.770 --> 0:07:37.690
<v Speaker 2>and came up with the name effective altruism. They became

0:07:37.970 --> 0:07:42.690
<v Speaker 2>the effective altruists, committed to giving away a significant chunk

0:07:42.730 --> 0:07:46.890
<v Speaker 2>of their income. In a modest basement office in Oxford,

0:07:47.370 --> 0:07:52.330
<v Speaker 2>MacAskill and his colleagues set up the Center for Effective Altruism.

0:07:52.650 --> 0:07:57.850
<v Speaker 2>They ate cheap vegetarian food, supermarket baguettes and hummus, and

0:07:58.090 --> 0:08:04.090
<v Speaker 2>debated the most effective ways to be altruistic. For instance,

0:08:04.450 --> 0:08:08.850
<v Speaker 2>might deworming pills do even more good than bednets per

0:08:08.930 --> 0:08:13.850
<v Speaker 2>dollar spent? The numbers said they might. People with

0:08:13.970 --> 0:08:18.050
<v Speaker 2>money started to ask MacAskill's advice on where to donate.

0:08:19.130 --> 0:08:22.450
<v Speaker 2>That gave him a dilemma, because MacAskill was aware of

0:08:22.490 --> 0:08:27.370
<v Speaker 2>studies that show classically handsome people are more persuasive at

0:08:27.410 --> 0:08:32.130
<v Speaker 2>getting donations for charities, and MacAskill had always been conscious

0:08:32.170 --> 0:08:35.890
<v Speaker 2>of the gap between his two front teeth. Should he

0:08:35.930 --> 0:08:40.010
<v Speaker 2>invest in braces to make himself more handsome? On the

0:08:40.050 --> 0:08:43.410
<v Speaker 2>one hand, the money he spent on braces couldn't then

0:08:43.490 --> 0:08:47.890
<v Speaker 2>be spent on bednets or deworming pills. On the other hand,

0:08:48.290 --> 0:08:51.850
<v Speaker 2>it might make him a more effective advocate for those causes.

0:08:53.410 --> 0:08:58.330
<v Speaker 2>MacAskill asked his old friends about this moral dilemma. Will,

0:08:58.770 --> 0:09:01.370
<v Speaker 2>they said, if you want to get your teeth fixed,

0:09:01.810 --> 0:09:06.690
<v Speaker 2>get your teeth fixed. In a profile of MacAskill for

0:09:06.730 --> 0:09:10.250
<v Speaker 2>The New Yorker, one friend recalls, it felt like it

0:09:10.410 --> 0:09:13.890
<v Speaker 2>subsumed his own humanity to become a vehicle for the

0:09:13.930 --> 0:09:19.170
<v Speaker 2>saving of humanity. MacAskill was getting asked for another kind

0:09:19.170 --> 0:09:24.930
<v Speaker 2>of advice too: career advice. Students at Oxford University wanted

0:09:24.970 --> 0:09:27.330
<v Speaker 2>to know what line of work they should go into

0:09:27.610 --> 0:09:30.690
<v Speaker 2>if they wanted to do the most good. Should they

0:09:30.730 --> 0:09:34.370
<v Speaker 2>become a doctor in a poor country, for example, or

0:09:34.410 --> 0:09:38.850
<v Speaker 2>a medical researcher to try to cure cancer. MacAskill came

0:09:38.930 --> 0:09:42.730
<v Speaker 2>up with a surprising answer: none of the above.

0:09:43.890 --> 0:09:45.410
<v Speaker 1>You are at a top university.

0:09:45.530 --> 0:09:48.050
<v Speaker 2>He told them, you have a chance at careers that

0:09:48.090 --> 0:09:49.410
<v Speaker 2>could make you lots of money.

0:09:50.090 --> 0:09:52.410
<v Speaker 1>Why not make money and give it away?

0:09:53.290 --> 0:09:56.650
<v Speaker 2>If you become a high flying banker, for example, you

0:09:56.650 --> 0:10:00.850
<v Speaker 2>could easily fund a dozen doctors in poor countries, far

0:10:00.890 --> 0:10:04.210
<v Speaker 2>more effective than becoming a doctor in a poor country yourself.

0:10:05.650 --> 0:10:11.410
<v Speaker 2>The logic was impeccable. MacAskill called the idea earning to give.

0:10:13.650 --> 0:10:19.130
<v Speaker 2>In twenty twelve, MacAskill visited Cambridge, Massachusetts, to spread his

0:10:19.250 --> 0:10:23.810
<v Speaker 2>ideas at other top universities. He heard about a student

0:10:23.890 --> 0:10:28.810
<v Speaker 2>at MIT who might be receptive. A physics major in

0:10:28.890 --> 0:10:35.050
<v Speaker 2>his junior year, unkempt, a bit odd, but clearly brilliant.

0:10:36.010 --> 0:10:43.930
<v Speaker 2>MacAskill sent the student an email: let's have lunch. Sam

0:10:43.970 --> 0:10:47.250
<v Speaker 2>Bankman-Fried was surprised to get an email from a

0:10:47.250 --> 0:10:51.570
<v Speaker 2>philosopher at Oxford University. Who is this guy? Why is

0:10:51.610 --> 0:10:55.810
<v Speaker 2>he inviting me to lunch? Sam was bored of his

0:10:55.850 --> 0:10:59.010
<v Speaker 2>physics degree, just as he had been bored at school throughout

0:10:59.050 --> 0:11:02.610
<v Speaker 2>his childhood. He was good at maths and a card

0:11:02.690 --> 0:11:06.970
<v Speaker 2>game called Magic: The Gathering, but bad at social interaction.

0:11:07.890 --> 0:11:11.490
<v Speaker 2>He remembers having to teach himself when it's considered appropriate

0:11:11.570 --> 0:11:16.970
<v Speaker 2>to smile. His classmates, he thought, saw him as smart

0:11:17.090 --> 0:11:21.170
<v Speaker 2>and maybe not all that human. He didn't feel close

0:11:21.210 --> 0:11:24.850
<v Speaker 2>to anyone except for one kid who also liked the

0:11:24.890 --> 0:11:30.050
<v Speaker 2>card game Magic: The Gathering. That kid remembers Sam as a

0:11:30.210 --> 0:11:38.730
<v Speaker 2>rare combination of hyper rational and extremely kind. Sam rationalized

0:11:38.730 --> 0:11:41.850
<v Speaker 2>his way to a belief system. I guess I should

0:11:41.890 --> 0:11:45.530
<v Speaker 2>care the same amount about everyone, which is pretty much

0:11:45.570 --> 0:11:49.570
<v Speaker 2>what Peter Singer said all those years ago. When someone

0:11:49.610 --> 0:11:53.010
<v Speaker 2>made the case to Sam that his beliefs were inconsistent

0:11:53.090 --> 0:11:57.650
<v Speaker 2>with eating meat, Sam thought about it and concluded, this

0:11:57.810 --> 0:12:02.530
<v Speaker 2>sucks because I love fried chicken. But they're right. He

0:12:02.610 --> 0:12:07.690
<v Speaker 2>became a vegan. In his cargo shorts, crumpled T-shirt,

0:12:07.770 --> 0:12:12.970
<v Speaker 2>and battered sneakers, Sam met Will MacAskill for lunch. He

0:12:13.090 --> 0:12:15.570
<v Speaker 2>wasn't really sure what he wanted to do with his life,

0:12:15.930 --> 0:12:20.370
<v Speaker 2>he told Will. Before he came to MIT, he'd thought

0:12:20.970 --> 0:12:24.570
<v Speaker 2>maybe he'd become an academic, but he now realized that

0:12:24.650 --> 0:12:30.810
<v Speaker 2>he'd find academia far too boring. Will pitched Sam on

0:12:30.970 --> 0:12:34.290
<v Speaker 2>his earn to give idea. If you want to make

0:12:34.290 --> 0:12:37.890
<v Speaker 2>the world a better place, he told Sam, you should

0:12:37.930 --> 0:12:43.090
<v Speaker 2>set out to make lots and lots of money. Cautionary

0:12:43.170 --> 0:12:57.010
<v Speaker 2>Tales will be back after the break. Sam Bankman-Fried

0:12:57.250 --> 0:13:00.130
<v Speaker 2>finished his degree at MIT and got a job on

0:13:00.210 --> 0:13:04.690
<v Speaker 2>Wall Street at a trading firm. The job involved spotting

0:13:04.890 --> 0:13:11.370
<v Speaker 2>tiny inefficiencies in financial markets, patterns in data that others had overlooked.

0:13:12.290 --> 0:13:17.090
<v Speaker 2>It was all about making rational calculations and thinking in probabilities.

0:13:17.770 --> 0:13:21.890
<v Speaker 2>It wasn't easy, but if you were good, you could

0:13:21.890 --> 0:13:27.490
<v Speaker 2>make a fortune. Sam was a natural. In his first year,

0:13:27.930 --> 0:13:32.050
<v Speaker 2>he was paid three hundred thousand dollars, in his second,

0:13:32.610 --> 0:13:38.290
<v Speaker 2>six hundred thousand, in his third a million. He gave

0:13:38.330 --> 0:13:42.250
<v Speaker 2>most of it away to good causes, including Will MacAskill's

0:13:42.410 --> 0:13:47.330
<v Speaker 2>Center for Effective Altruism. How much might I be earning

0:13:47.410 --> 0:13:51.290
<v Speaker 2>in ten years, he asked his bosses. If you keep

0:13:51.330 --> 0:13:54.170
<v Speaker 2>doing as well as you are, they said, maybe as

0:13:54.250 --> 0:14:01.130
<v Speaker 2>much as seventy five million dollars a year. But Sam

0:14:01.450 --> 0:14:06.570
<v Speaker 2>wasn't happy. I don't feel anything, he confided to his journal,

0:14:07.130 --> 0:14:10.610
<v Speaker 2>or at least anything good. I feel nothing but the

0:14:10.770 --> 0:14:17.090
<v Speaker 2>aching hole in my brain where happiness should be. Sam

0:14:17.130 --> 0:14:21.810
<v Speaker 2>began to get interested in cryptocurrency in twenty seventeen.

0:14:22.090 --> 0:14:25.490
<v Speaker 2>Crypto was still a very new phenomenon. It was hard

0:14:25.490 --> 0:14:28.450
<v Speaker 2>to know what to make of it, an important emerging

0:14:28.570 --> 0:14:34.210
<v Speaker 2>asset class or just some complicated scam. New coins were

0:14:34.250 --> 0:14:38.530
<v Speaker 2>being launched all the time, but unlike shares in say

0:14:38.810 --> 0:14:43.810
<v Speaker 2>Apple or Amazon, they were often completely unrelated to anything

0:14:43.850 --> 0:14:48.250
<v Speaker 2>in the real world economy. Sam's trading firm wouldn't let

0:14:48.330 --> 0:14:49.330
<v Speaker 2>him touch crypto.

0:14:49.770 --> 0:14:54.250
<v Speaker 1>It was far too risky. To start with,

0:14:54.530 --> 0:14:58.690
<v Speaker 2>crypto was bought and sold on exchanges that aren't regulated

0:14:58.690 --> 0:15:03.210
<v Speaker 2>in the same way as stock exchanges. Crypto was relatively

0:15:03.250 --> 0:15:07.210
<v Speaker 2>easy to steal, or to misplace. If you lose the

0:15:07.290 --> 0:15:10.570
<v Speaker 2>password to your bitcoin wallet, it's not like losing the

0:15:10.650 --> 0:15:14.410
<v Speaker 2>password to your online banking. You can't call a help

0:15:14.450 --> 0:15:19.450
<v Speaker 2>desk and get another one. Still, Sam saw an opportunity.

0:15:20.330 --> 0:15:23.570
<v Speaker 2>The nascent crypto markets were far less efficient than the

0:15:23.610 --> 0:15:27.050
<v Speaker 2>financial markets he was used to operating in. The same

0:15:27.130 --> 0:15:32.210
<v Speaker 2>coins could trade on different exchanges for different prices. Sam

0:15:32.250 --> 0:15:35.330
<v Speaker 2>decided to quit his job and set up his own firm.

0:15:36.170 --> 0:15:38.490
<v Speaker 2>He'd use the techniques he had learned on Wall Street

0:15:38.730 --> 0:15:43.570
<v Speaker 2>to trade in crypto. But what about the risk of theft?

0:15:44.410 --> 0:15:49.770
<v Speaker 2>With a few furtive keystrokes, an employee might divert coins

0:15:49.850 --> 0:15:53.250
<v Speaker 2>into their own personal account in a way that would

0:15:53.250 --> 0:15:58.930
<v Speaker 2>never work for Apple shares. Sam had a genius solution

0:15:59.050 --> 0:16:04.850
<v Speaker 2>to that problem. He would employ only effective altruists. If

0:16:04.970 --> 0:16:07.850
<v Speaker 2>all his employees were just as committed as he was

0:16:07.930 --> 0:16:11.050
<v Speaker 2>to giving their money away, they would feel no temptation

0:16:11.170 --> 0:16:13.650
<v Speaker 2>to enrich themselves by stealing from the firm.

0:16:14.410 --> 0:16:16.050
<v Speaker 1>It was perfect.

0:16:19.370 --> 0:16:24.130
<v Speaker 2>By twenty eighteen, Sam's new company, Alameda Research, had

0:16:24.170 --> 0:16:27.410
<v Speaker 2>employed a couple of dozen effective altruists and raised one

0:16:27.490 --> 0:16:32.250
<v Speaker 2>hundred and seventy million dollars from investors. But things got

0:16:32.290 --> 0:16:36.450
<v Speaker 2>off to a rocky start. The first problem was Sam's

0:16:36.530 --> 0:16:41.730
<v Speaker 2>leadership style. One employee recalls he was expecting everyone to

0:16:41.810 --> 0:16:45.050
<v Speaker 2>work eighteen hour days while he would not show up

0:16:45.050 --> 0:16:48.850
<v Speaker 2>for meetings, not shower for weeks, have a mess all

0:16:48.890 --> 0:16:52.050
<v Speaker 2>around him with old food everywhere, and fall asleep at

0:16:52.090 --> 0:16:58.410
<v Speaker 2>his desk. Then there was Sam's new bot. He wanted

0:16:58.450 --> 0:17:02.970
<v Speaker 2>to automate buying and selling coins on different exchanges. That

0:17:03.130 --> 0:17:06.090
<v Speaker 2>was a tried and tested idea on stock markets, but

0:17:06.690 --> 0:17:11.930
<v Speaker 2>stock markets worked more reliably than crypto exchanges. His management

0:17:11.930 --> 0:17:15.370
<v Speaker 2>team were worried. If this went wrong, it could go

0:17:15.610 --> 0:17:19.970
<v Speaker 2>very wrong, very quickly. When you switch on this bot,

0:17:20.490 --> 0:17:23.330
<v Speaker 2>they told Sam, you have to watch it like a

0:17:23.410 --> 0:17:26.330
<v Speaker 2>hawk and be ready to switch it off straight away

0:17:26.490 --> 0:17:32.370
<v Speaker 2>if it starts losing money. Sam agreed. He switched on

0:17:32.410 --> 0:17:39.090
<v Speaker 2>the bot, then fell asleep. The biggest worry of all

0:17:39.210 --> 0:17:43.450
<v Speaker 2>was that, well, four million dollars' worth of crypto had

0:17:43.610 --> 0:17:44.610
<v Speaker 2>just disappeared.

0:17:45.890 --> 0:17:50.650
<v Speaker 1>Where had it gone? Had somebody stolen it? No one knew.

0:17:51.650 --> 0:17:55.850
<v Speaker 2>Sam's management team wanted to tell their investors. Let's not,

0:17:56.530 --> 0:18:00.050
<v Speaker 2>said Sam. I reckon there's an eighty percent probability that

0:18:00.130 --> 0:18:06.970
<v Speaker 2>it turns up somewhere. In Sam's hyper rational mind, that

0:18:07.170 --> 0:18:10.810
<v Speaker 2>was basically the same as them still having eighty percent

0:18:10.890 --> 0:18:13.810
<v Speaker 2>of the four million dollars, and it would be perfectly

0:18:13.810 --> 0:18:17.770
<v Speaker 2>reasonable to put that in their accounts. We can't do that,

0:18:18.730 --> 0:18:22.730
<v Speaker 2>said Sam's management team. That's not how the world works.

0:18:23.610 --> 0:18:28.490
<v Speaker 2>The management team at Alameda Research lost patience. Sam was

0:18:28.530 --> 0:18:32.770
<v Speaker 2>a brilliant trader, but hilariously ill-suited to running a company.

0:18:33.570 --> 0:18:37.970
<v Speaker 2>They walked out. Half the employees followed. The investors pulled

0:18:37.970 --> 0:18:41.810
<v Speaker 2>out three quarters of the cash they'd put in. Still,

0:18:42.650 --> 0:18:45.970
<v Speaker 2>that left Sam with forty million dollars to play with,

0:18:46.810 --> 0:18:49.370
<v Speaker 2>and now there was no one to complain when he

0:18:49.490 --> 0:18:53.850
<v Speaker 2>did things his way. Sam turned on his bot and

0:18:54.010 --> 0:19:04.690
<v Speaker 2>let it run. In Oxford, Will MacAskill and his philosopher

0:19:04.730 --> 0:19:09.770
<v Speaker 2>colleagues were thinking. Remember what Peter Singer had said years

0:19:09.810 --> 0:19:15.410
<v Speaker 2>ago about how distance wasn't morally important. We should care

0:19:15.450 --> 0:19:19.090
<v Speaker 2>as much about a child starving ten thousand miles away

0:19:19.450 --> 0:19:22.090
<v Speaker 2>as a child drowning in a pond right in front

0:19:22.130 --> 0:19:26.650
<v Speaker 2>of us. MacAskill began to think we should treat time

0:19:27.210 --> 0:19:30.570
<v Speaker 2>the same as distance. We should care as much about

0:19:30.690 --> 0:19:34.170
<v Speaker 2>children who might be born in the future as children

0:19:34.210 --> 0:19:38.930
<v Speaker 2>who exist right now. Following that logic leads to some

0:19:39.050 --> 0:19:44.170
<v Speaker 2>strange conclusions. The future could last a long time. There

0:19:44.250 --> 0:19:48.570
<v Speaker 2>might be trillions upon trillions of future humans, far more

0:19:48.610 --> 0:19:52.730
<v Speaker 2>than the mere few billion alive today. But those future

0:19:52.810 --> 0:19:56.530
<v Speaker 2>humans will never be born if today's humans carelessly go

0:19:56.690 --> 0:20:01.370
<v Speaker 2>extinct in the next few decades. What might cause that?

0:20:02.250 --> 0:20:09.050
<v Speaker 2>A genetically engineered pandemic perhaps, or a rogue superintelligent AI.

0:20:11.010 --> 0:20:15.210
<v Speaker 2>So perhaps the most effective thing altruists can do is

0:20:15.410 --> 0:20:19.850
<v Speaker 2>fund academic research into how best to prevent those risks.

0:20:21.210 --> 0:20:25.410
<v Speaker 2>Of course, most of that research won't lead anywhere, but

0:20:25.490 --> 0:20:29.770
<v Speaker 2>a small probability of a huge payoff can still outweigh

0:20:29.770 --> 0:20:31.450
<v Speaker 2>the certainty of a small payoff.

0:20:35.330 --> 0:20:36.050
<v Speaker 1>Think of it like this.

0:20:37.610 --> 0:20:41.730
<v Speaker 2>If you donate three thousand dollars to buy bednets, you

0:20:41.810 --> 0:20:45.810
<v Speaker 2>can be fairly hopeful of saving one life. But what

0:20:46.010 --> 0:20:51.850
<v Speaker 2>if instead you put your three thousand dollars towards holding

0:20:51.890 --> 0:20:55.810
<v Speaker 2>an AI safety workshop? The chance that it will lead

0:20:55.810 --> 0:21:00.050
<v Speaker 2>to an important breakthrough is minuscule, say one in a billion,

0:21:01.210 --> 0:21:05.250
<v Speaker 2>But if it does, it might save lots of future lives,

0:21:05.610 --> 0:21:09.930
<v Speaker 2>say a trillion. One in a billion times a trillion.

0:21:10.050 --> 0:21:13.050
<v Speaker 2>If you think about it rationally, that is basically the

0:21:13.090 --> 0:21:17.570
<v Speaker 2>same thing as saving a thousand lives. Far more effective,

0:21:17.610 --> 0:21:23.730
<v Speaker 2>then, to fund AI workshops than bednets. This new school

0:21:23.730 --> 0:21:28.730
<v Speaker 2>of thought became known as longtermism. Will MacAskill got

0:21:28.810 --> 0:21:31.970
<v Speaker 2>to work on a book to spread the ideas more widely.

0:21:35.850 --> 0:21:40.650
<v Speaker 2>At Alameda Research, they finally found the missing four million

0:21:40.730 --> 0:21:44.530
<v Speaker 2>dollars' worth of crypto. It hadn't been stolen after all.

0:21:44.930 --> 0:21:47.370
<v Speaker 2>There had been a computer glitch. It had been sent

0:21:47.450 --> 0:21:51.610
<v Speaker 2>to an exchange without an accompanying note about who owned it.

0:21:52.330 --> 0:21:56.570
<v Speaker 2>When Sam finally realized which exchange might have it and

0:21:56.650 --> 0:22:01.370
<v Speaker 2>called them up, they were astonished. How has it taken

0:22:01.450 --> 0:22:07.730
<v Speaker 2>you this long to contact us? Sam's bot, meanwhile, was

0:22:07.770 --> 0:22:14.890
<v Speaker 2>doing well. Alameda Research was making money, but Sam wanted more.

0:22:15.770 --> 0:22:19.810
<v Speaker 2>He'd realized that the real money making potential in crypto

0:22:20.170 --> 0:22:24.890
<v Speaker 2>wasn't in trading on someone else's exchanges. It was running

0:22:25.010 --> 0:22:28.810
<v Speaker 2>an exchange of your own. Sam came up with a

0:22:28.890 --> 0:22:32.930
<v Speaker 2>clever design for a new kind of crypto exchange, one

0:22:33.010 --> 0:22:36.130
<v Speaker 2>that would let its users gamble on the future price

0:22:36.250 --> 0:22:39.930
<v Speaker 2>of various coins. Many of those people would end up losing.

0:22:40.210 --> 0:22:44.970
<v Speaker 2>That's the nature of gambling. Win or lose, Sam would

0:22:45.050 --> 0:22:51.970
<v Speaker 2>take his cut, just like a casino. The exchange Sam

0:22:52.050 --> 0:22:55.250
<v Speaker 2>had in mind wouldn't be legal to run in America,

0:22:55.610 --> 0:22:58.530
<v Speaker 2>so he set it up in the Bahamas. He called

0:22:58.570 --> 0:23:04.650
<v Speaker 2>it FTX. It quickly became a huge success. It ran

0:23:04.730 --> 0:23:08.570
<v Speaker 2>a Super Bowl commercial in which characters played by Larry

0:23:08.690 --> 0:23:14.970
<v Speaker 2>David are shown new inventions through the ages: the wheel, the toilet,

0:23:15.690 --> 0:23:21.730
<v Speaker 2>the light bulb. Larry mocks them all: that's stupid. At

0:23:21.770 --> 0:23:26.530
<v Speaker 2>the end, he's shown FTX and sneers dismissively.

0:23:27.290 --> 0:23:29.530
<v Speaker 1>It's FTX. It's a safe and easy way to get

0:23:29.530 --> 0:23:34.010
<v Speaker 1>into crypto. I don't think so, and I'm never wrong

0:23:34.050 --> 0:23:34.850
<v Speaker 1>about this stuff.

0:23:35.210 --> 0:23:41.010
<v Speaker 2>Never. The tagline: don't be like Larry, don't miss out.

0:23:41.970 --> 0:23:46.730
<v Speaker 2>The ad's message is clear. You might not understand crypto,

0:23:47.170 --> 0:23:50.370
<v Speaker 2>just like Larry David's characters didn't understand the wheel or

0:23:50.410 --> 0:23:53.010
<v Speaker 2>the light bulb, but it is going to be just

0:23:53.290 --> 0:23:56.770
<v Speaker 2>as important. Don't miss out. Who cares if you don't

0:23:56.850 --> 0:24:04.610
<v Speaker 2>understand it? Gamble on it now. Earn to give, Will

0:24:04.690 --> 0:24:09.810
<v Speaker 2>MacAskill had advised Sam Bankman-Fried. Did anyone care how

0:24:09.970 --> 0:24:12.530
<v Speaker 2>Sam was making his money as long as he was

0:24:12.530 --> 0:24:19.170
<v Speaker 2>giving it away? Cautionary Tales will be back after the break.

0:24:28.210 --> 0:24:32.890
<v Speaker 2>In twenty twenty two, Will MacAskill published his book What

0:24:33.090 --> 0:24:37.170
<v Speaker 2>We Owe the Future. The organization he helped set up,

0:24:37.570 --> 0:24:43.210
<v Speaker 2>the Center for Effective Altruism, moved into impressive new premises.

0:24:43.810 --> 0:24:48.570
<v Speaker 2>It bought Wytham Abbey, a grand fifteenth-century estate just

0:24:48.650 --> 0:24:52.970
<v Speaker 2>outside of Oxford, to host workshops on subjects like AI

0:24:53.170 --> 0:24:59.690
<v Speaker 2>safety and pandemic risk. Some Effective Altruists felt queasy. Was

0:24:59.730 --> 0:25:04.450
<v Speaker 2>this really a better use of money than bednets? Sure,

0:25:04.970 --> 0:25:08.410
<v Speaker 2>said others. If we hold our workshops in a centuries-old

0:25:08.410 --> 0:25:11.730
<v Speaker 2>building, that'll help to focus everyone's minds on a

0:25:11.770 --> 0:25:16.490
<v Speaker 2>long-term time frame. Sam Bankman-Fried was fully on

0:25:16.570 --> 0:25:21.130
<v Speaker 2>board with Will MacAskill's new longtermist thinking. One

0:25:21.530 --> 0:25:25.290
<v Speaker 2>rational way to donate his money, Sam decided might be

0:25:25.370 --> 0:25:29.890
<v Speaker 2>to get politicians elected who knew something about AI and pandemics.

0:25:30.650 --> 0:25:36.850
<v Speaker 2>Politicians like Carrick Flynn, an earnest young effective altruist who

0:25:36.850 --> 0:25:40.570
<v Speaker 2>had worked on pandemic prevention, then decided to run for

0:25:40.690 --> 0:25:46.250
<v Speaker 2>Congress in Oregon. Carrick Flynn didn't know that Sam Bankman-Fried

0:25:46.370 --> 0:25:50.250
<v Speaker 2>had decided to throw money at his campaign. Flynn

0:25:50.370 --> 0:25:55.250
<v Speaker 2>was watching YouTube, sipping Diet Mountain Dew, when YouTube cut

0:25:55.330 --> 0:25:56.370
<v Speaker 2>to an ad.

0:25:57.130 --> 0:26:01.170
<v Speaker 3>Carrick Flynn faced poverty and homelessness, but he pushed through

0:26:01.410 --> 0:26:04.610
<v Speaker 3>to college on a scholarship and a career protecting the

0:26:04.690 --> 0:26:07.410
<v Speaker 3>most vulnerable.

0:26:07.450 --> 0:26:11.570
<v Speaker 2>Flynn was so startled he covered himself in Diet Mountain Dew,

0:26:12.370 --> 0:26:15.490
<v Speaker 2>and that was just the start. Soon, the voters of

0:26:15.570 --> 0:26:19.370
<v Speaker 2>Oregon's sixth Congressional district could hardly look at a screen

0:26:19.450 --> 0:26:22.370
<v Speaker 2>without encountering an ad for Carrick Flynn.

0:26:23.410 --> 0:26:26.370
<v Speaker 3>Carrick Flynn, the Democrat will create good jobs.

0:26:27.810 --> 0:26:32.450
<v Speaker 2>Oregonians quickly got sick of hearing the name Carrick Flynn

0:26:33.210 --> 0:26:37.850
<v Speaker 2>and suspicious. These wall-to-wall ads must be costing

0:26:37.890 --> 0:26:43.730
<v Speaker 2>a fortune. Who was paying? Reporters found out that it

0:26:43.810 --> 0:26:47.650
<v Speaker 2>was a crypto billionaire who lived in the Bahamas, and

0:26:47.690 --> 0:26:52.210
<v Speaker 2>demanded of Carrick Flynn, why is Sam Bankman-Fried so

0:26:52.610 --> 0:26:57.210
<v Speaker 2>very keen to get you elected? I don't know, Flynn protested.

0:26:57.410 --> 0:27:01.330
<v Speaker 2>I've never met him. I've never talked to him, Flynn said.

0:27:01.370 --> 0:27:03.890
<v Speaker 2>He assumed it must be because of his interest in

0:27:04.050 --> 0:27:07.650
<v Speaker 2>pandemic risk. Of course, the reporters didn't believe that.

0:27:09.490 --> 0:27:10.770
<v Speaker 1>They said, he

0:27:10.770 --> 0:27:15.490
<v Speaker 2>must want you to do something involving crypto. Flynn became

0:27:15.730 --> 0:27:21.450
<v Speaker 2>increasingly bewildered. I'm not a crypto person, he protested. I

0:27:21.450 --> 0:27:24.410
<v Speaker 2>don't know much about it. I've tried to read about it.

0:27:24.690 --> 0:27:31.410
<v Speaker 2>I didn't really care. Flynn finished a distant second in

0:27:31.490 --> 0:27:36.890
<v Speaker 2>his election. For every vote he received, Sam had spent

0:27:37.090 --> 0:27:41.930
<v Speaker 2>something like one thousand dollars on ads. To put that

0:27:42.090 --> 0:27:47.570
<v Speaker 2>another way, for every three votes Carrick Flynn received, Sam

0:27:47.610 --> 0:27:52.130
<v Speaker 2>could have bought enough bednets to save a life.

0:27:52.290 --> 0:27:54.530
<v Speaker 1>But in the new longtermist

0:27:54.450 --> 0:27:58.650
<v Speaker 2>view of effective altruism, the money spent on Carrick Flynn

0:27:59.130 --> 0:28:03.090
<v Speaker 2>hadn't been wasted. It had always been a long shot

0:28:03.130 --> 0:28:06.770
<v Speaker 2>that Carrick Flynn's political career would end up preventing some

0:28:07.090 --> 0:28:11.410
<v Speaker 2>future pandemic from wiping out humanity. But if it did,

0:28:11.930 --> 0:28:16.010
<v Speaker 2>it could save trillions of future lives. If you thought

0:28:16.130 --> 0:28:22.370
<v Speaker 2>rationally about altruism, funding Carrick Flynn ads instead of bednets

0:28:22.410 --> 0:28:30.730
<v Speaker 2>made perfect sense. In the last two episodes of Cautionary Tales,

0:28:30.970 --> 0:28:35.690
<v Speaker 2>we've heard two wildly different stories about people who took

0:28:35.810 --> 0:28:43.570
<v Speaker 2>altruism very seriously indeed. George Price's altruism was driven by revelation.

0:28:44.410 --> 0:28:47.210
<v Speaker 2>A vision of Jesus told him to give away whatever

0:28:47.250 --> 0:28:51.130
<v Speaker 2>he had to whoever asked him. He ended up as

0:28:51.170 --> 0:28:55.010
<v Speaker 2>thin as a stick with rotting teeth, sleeping on a

0:28:55.050 --> 0:28:59.810
<v Speaker 2>mattress on the floor of a squat. Sam Bankman-Fried's

0:28:59.850 --> 0:29:04.810
<v Speaker 2>altruism was driven by rationality. A moral philosopher told him

0:29:04.850 --> 0:29:09.250
<v Speaker 2>to make lots of money and donate it effectively. He

0:29:09.370 --> 0:29:13.410
<v Speaker 2>ended up encouraging people to gamble on crypto so that

0:29:13.490 --> 0:29:19.010
<v Speaker 2>he could put more money into politics. Taking altruism very

0:29:19.050 --> 0:29:27.690
<v Speaker 2>seriously indeed can take you to some strange places. Will

0:29:27.810 --> 0:29:32.410
<v Speaker 2>MacAskill flew into the Bahamas for a penthouse discussion about

0:29:32.450 --> 0:29:36.730
<v Speaker 2>the best ways to help future people. Sam had been

0:29:36.770 --> 0:29:41.490
<v Speaker 2>trying out a new idea. He identified a hundred experts

0:29:41.490 --> 0:29:44.970
<v Speaker 2>in AI and pandemic risk and sent each of them

0:29:45.210 --> 0:29:49.010
<v Speaker 2>a million dollars out of the blue, no strings attached.

0:29:50.210 --> 0:29:53.850
<v Speaker 2>Use it well and I'll give you more. He planned

0:29:53.890 --> 0:29:56.810
<v Speaker 2>to give away a billion dollars over the next year,

0:29:57.690 --> 0:30:04.570
<v Speaker 2>but how? More political campaigns? More workshops at Wytham Abbey?

0:30:05.730 --> 0:30:09.210
<v Speaker 2>As it turned out, the question was moot, because

0:30:09.290 --> 0:30:14.050
<v Speaker 2>Sam's dark secret was about to be discovered. He hadn't

0:30:14.130 --> 0:30:19.090
<v Speaker 2>just been earning to give, he'd also been defrauding to give.

0:30:20.330 --> 0:30:24.290
<v Speaker 2>When Sam set up FTX, he couldn't get a US

0:30:24.450 --> 0:30:29.050
<v Speaker 2>bank to open an account. Its activities were too legally murky.

0:30:29.690 --> 0:30:33.930
<v Speaker 2>That meant FTX had no way of taking dollar deposits

0:30:33.970 --> 0:30:37.090
<v Speaker 2>from its customers. But you know who did have a

0:30:37.130 --> 0:30:43.490
<v Speaker 2>dollar account, Sam's company, Alameda Research. When customers opened an

0:30:43.490 --> 0:30:49.370
<v Speaker 2>account at FTX, they wired their deposits to Alameda. Alameda

0:30:49.490 --> 0:30:52.650
<v Speaker 2>could and should have kept that money safe for the

0:30:52.770 --> 0:30:57.810
<v Speaker 2>FTX customers, but they didn't. They used it to trade with.

0:30:59.210 --> 0:31:05.850
<v Speaker 2>This wasn't legally murky, this was very illegal indeed. What

0:31:06.090 --> 0:31:10.730
<v Speaker 2>was Sam thinking? Sam was thinking that nobody need ever

0:31:10.850 --> 0:31:15.610
<v Speaker 2>find out. Alameda's trading was making profits, and with this

0:31:15.770 --> 0:31:19.050
<v Speaker 2>extra money to play with, it would make even more profits.

0:31:19.650 --> 0:31:22.690
<v Speaker 2>And Alameda had plenty of assets to fall back on.

0:31:23.250 --> 0:31:27.690
<v Speaker 2>It owned crypto worth many times more than those customer deposits,

0:31:28.490 --> 0:31:32.090
<v Speaker 2>so whenever an FTX customer wanted their deposit back, he

0:31:32.170 --> 0:31:36.370
<v Speaker 2>thought it wouldn't be a problem. As Sam told the

0:31:36.410 --> 0:31:40.890
<v Speaker 2>author Michael Lewis, it felt to us that Alameda had

0:31:41.210 --> 0:31:48.530
<v Speaker 2>infinity dollars. But then crypto prices fell. Alameda now had

0:31:49.010 --> 0:31:54.130
<v Speaker 2>finite dollars. FTX experienced the equivalent of a run on

0:31:54.210 --> 0:31:57.330
<v Speaker 2>the bank when all the customers rushed to withdraw their

0:31:57.330 --> 0:32:02.090
<v Speaker 2>deposits at once. Alameda suddenly had to scramble to find

0:32:02.130 --> 0:32:07.290
<v Speaker 2>the money to pay them back. Remember when Alameda had

0:32:07.290 --> 0:32:11.570
<v Speaker 2>lost sight of four million dollars? It hadn't got

0:32:11.610 --> 0:32:16.170
<v Speaker 2>any better at keeping track of what was where. In

0:32:16.210 --> 0:32:21.210
<v Speaker 2>his book Going Infinite, Michael Lewis describes a comically frantic

0:32:21.330 --> 0:32:22.970
<v Speaker 2>hunt for Alameda's assets.

0:32:24.010 --> 0:32:26.730
<v Speaker 4>Its CEO would come on to the screen and announce

0:32:26.770 --> 0:32:29.210
<v Speaker 4>that she found two hundred million dollars here or four

0:32:29.250 --> 0:32:31.250
<v Speaker 4>hundred million dollars there, as if she just made an

0:32:31.290 --> 0:32:34.970
<v Speaker 4>original scientific discovery. Some guy at Deltec, their bank in

0:32:35.010 --> 0:32:38.410
<v Speaker 4>the Bahamas, messaged Ramnik to say, oh, by the way,

0:32:38.450 --> 0:32:41.410
<v Speaker 4>you have three hundred million dollars with us, and it

0:32:41.450 --> 0:32:45.530
<v Speaker 4>came as a total surprise to all of them.

0:32:45.650 --> 0:32:51.890
<v Speaker 2>Alameda couldn't gather its money in time. FTX was declared bankrupt.

0:32:52.610 --> 0:32:56.770
<v Speaker 2>Sam was arrested and extradited from the Bahamas to the US.

0:32:58.690 --> 0:33:01.770
<v Speaker 2>Michael Lewis makes the case that Sam wasn't so much

0:33:02.010 --> 0:33:09.090
<v Speaker 2>criminal mastermind as overgrown teenager, incredibly reckless, and incredibly disorganized.

0:33:10.010 --> 0:33:14.690
<v Speaker 2>The bankruptcy lawyers eventually located enough assets in Alameda to

0:33:14.730 --> 0:33:19.810
<v Speaker 2>give FTX depositors all their money back with interest. But

0:33:19.970 --> 0:33:24.770
<v Speaker 2>reckless and disorganized is hardly a compelling defense. He took

0:33:24.850 --> 0:33:29.170
<v Speaker 2>money that wasn't his and spent it according to whatever

0:33:29.450 --> 0:33:34.290
<v Speaker 2>logic suited him. Sam was convicted of fraud and sentenced

0:33:34.370 --> 0:33:40.850
<v Speaker 2>to twenty five years in prison. Sam Bankman-Fried is

0:33:40.930 --> 0:33:45.090
<v Speaker 2>now known for his crimes, but it's his altruism that

0:33:45.170 --> 0:33:48.850
<v Speaker 2>interests me. And they have a surprising amount in common.

0:33:49.810 --> 0:33:53.810
<v Speaker 2>Sam purloined his customers' deposits because he made a hyper

0:33:53.930 --> 0:33:57.690
<v Speaker 2>rational calculation that he'd probably get away with it and

0:33:57.770 --> 0:34:00.490
<v Speaker 2>didn't think much about the fallout if it all went wrong.

0:34:01.810 --> 0:34:05.770
<v Speaker 2>Sam gave money to politicians, not the poor, because he

0:34:05.850 --> 0:34:10.730
<v Speaker 2>made a hyper rational calculation to prioritize future people over

0:34:10.850 --> 0:34:16.530
<v Speaker 2>people actually suffering today. Both these calculations remind me how

0:34:16.690 --> 0:34:22.930
<v Speaker 2>Sam's teenage classmates described him: smart, and maybe not all

0:34:22.930 --> 0:34:29.130
<v Speaker 2>that human. George Price was deeply depressed by what his

0:34:29.210 --> 0:34:32.130
<v Speaker 2>own work said about what it means to be human.

0:34:33.410 --> 0:34:38.490
<v Speaker 2>Our altruistic instincts evolved to serve our selfish genes. When

0:34:38.490 --> 0:34:41.610
<v Speaker 2>we feel the urge to do something nice, it tends to

0:34:41.650 --> 0:34:44.850
<v Speaker 2>be the kind of thing that, for our ancestors, might

0:34:44.890 --> 0:34:50.210
<v Speaker 2>have helped their relatives or forged a friendship. Remember the

0:34:50.330 --> 0:34:56.090
<v Speaker 2>contrast drawn by Peter Singer. We wouldn't hesitate to wade

0:34:56.090 --> 0:34:59.730
<v Speaker 2>into a shallow pond to save a drowning child, but

0:34:59.770 --> 0:35:02.570
<v Speaker 2>we don't feel the same urge to give money to

0:35:02.650 --> 0:35:06.810
<v Speaker 2>save a starving child on the other side of the world. Why,

0:35:07.850 --> 0:35:10.130
<v Speaker 2>when you think from the genes' point of view,

0:35:10.250 --> 0:35:14.290
<v Speaker 2>it's not hard to explain. The child who's drowning right

0:35:14.330 --> 0:35:17.530
<v Speaker 2>in front of us might plausibly be a distant cousin

0:35:17.650 --> 0:35:21.490
<v Speaker 2>or have parents who'd feel forever in our debt. The

0:35:21.610 --> 0:35:25.370
<v Speaker 2>child who's starving half a world away, not so much.

0:35:26.690 --> 0:35:30.410
<v Speaker 2>Our selfish genes can help to explain why we are

0:35:30.530 --> 0:35:33.850
<v Speaker 2>the way we are, but they can't tell us what's

0:35:33.890 --> 0:35:37.610
<v Speaker 2>the right thing to do. We need our rational minds

0:35:37.650 --> 0:35:43.530
<v Speaker 2>for that. It is depressing that we ignore the starving child,

0:35:44.170 --> 0:35:47.970
<v Speaker 2>because evolution simply didn't build us to care that much.

0:35:48.890 --> 0:35:51.970
<v Speaker 2>The moral philosophers are right that we can use our

0:35:52.090 --> 0:35:58.850
<v Speaker 2>rational minds to transcend our selfish genes. Then again, I'm

0:35:58.890 --> 0:36:02.290
<v Speaker 2>not sure it's any less depressing to ignore the starving

0:36:02.410 --> 0:36:06.250
<v Speaker 2>child because we've thought long and hard about it and

0:36:06.330 --> 0:36:10.690
<v Speaker 2>decided to fund an AI workshop instead. If we

0:36:10.770 --> 0:36:14.490
<v Speaker 2>follow the logic of altruism far enough, it can take

0:36:14.570 --> 0:36:19.170
<v Speaker 2>us to places that don't feel human at all. So

0:36:19.250 --> 0:36:22.690
<v Speaker 2>perhaps we shouldn't beat ourselves up too much if we

0:36:22.770 --> 0:36:27.370
<v Speaker 2>succeed in transcending our selfish genes only by a little bit,

0:36:28.210 --> 0:36:31.690
<v Speaker 2>If we manage at least to do something good for

0:36:31.770 --> 0:36:35.290
<v Speaker 2>people who aren't family or friends, and we don't give

0:36:35.330 --> 0:36:39.970
<v Speaker 2>everything away like George Price, or feel angst about getting

0:36:40.050 --> 0:36:44.730
<v Speaker 2>braces like Will MacAskill, or donate our cash to long

0:36:44.810 --> 0:36:49.730
<v Speaker 2>shot chances of saving unborn trillions like Sam Bankman-Fried.

0:36:50.930 --> 0:36:54.850
<v Speaker 2>It may not be a rational approach to altruism, but

0:36:55.450 --> 0:36:57.450
<v Speaker 2>it is a human one.

0:36:57.690 --> 0:37:01.170
<v Speaker 1>There are worse things in the world than being human.

0:37:14.810 --> 0:37:19.010
<v Speaker 2>A key source for this episode is Going Infinite, The

0:37:19.130 --> 0:37:22.730
<v Speaker 2>Rise and Fall of a New Tycoon by Michael Lewis.

0:37:22.850 --> 0:37:25.970
<v Speaker 2>I will be speaking to Michael Lewis next week about

0:37:26.010 --> 0:37:28.730
<v Speaker 2>his time with Sam Bankman-Fried, and we're going to

0:37:28.730 --> 0:37:33.570
<v Speaker 2>be answering your questions on altruism and kindness. This episode

0:37:33.570 --> 0:37:37.530
<v Speaker 2>of Cautionary Tales also relied on Gideon Lewis-Kraus's profile

0:37:37.650 --> 0:37:41.090
<v Speaker 2>of Will MacAskill in The New Yorker. For a full list

0:37:41.090 --> 0:37:46.890
<v Speaker 2>of our sources, visit Timharford dot com. Cautionary Tales is

0:37:46.930 --> 0:37:50.330
<v Speaker 2>written by me, Tim Harford, with Andrew Wright, Alice Fiennes,

0:37:50.530 --> 0:37:54.610
<v Speaker 2>and Ryan Dilley. It's produced by Georgia Mills and Marilyn Rust.

0:37:55.210 --> 0:37:57.930
<v Speaker 2>The sound design and original music are the work of

0:37:58.010 --> 0:38:02.130
<v Speaker 2>Pascal Wyse. Additional sound design is by Carlos San Juan

0:38:02.370 --> 0:38:06.410
<v Speaker 2>at Brain Audio. Ben Naddaff-Hafrey edited the scripts.

0:38:07.410 --> 0:38:10.250
<v Speaker 2>The show also wouldn't have been possible without the work

0:38:10.290 --> 0:38:15.410
<v Speaker 2>of Jacob Weisberg, Gretta Cohn, Sarah Nix, Eric Sandler, Christina Sullivan,

0:38:15.730 --> 0:38:20.130
<v Speaker 2>Kira Posey, and Owen Miller. Cautionary Tales is a production

0:38:20.290 --> 0:38:24.090
<v Speaker 2>of Pushkin Industries. If you like the show, please remember

0:38:24.130 --> 0:38:26.970
<v Speaker 2>to share, rate, and review. It really makes a difference

0:38:26.970 --> 0:38:28.530
<v Speaker 2>to us. And if you want to hear the show

0:38:28.770 --> 0:38:31.890
<v Speaker 2>ad-free, sign up to Pushkin Plus on the show

0:38:31.970 --> 0:38:35.930
<v Speaker 2>page on Apple Podcasts or at pushkin dot fm slash

0:38:36.130 --> 0:38:48.970
<v Speaker 2>plus