1
00:00:15,250 --> 00:00:22,890
Speaker 1: Pushkin. In the twenty-first century, powerful technologies have been

2
00:00:22,890 --> 00:00:28,450
Speaker 1: appearing at a breathtaking pace, related to the Internet, artificial intelligence,

3
00:00:28,850 --> 00:00:34,370
Speaker 1: genetic engineering, and more. They could dramatically improve our world, or,

4
00:00:34,850 --> 00:00:37,530
Speaker 1: if we don't make wise choices, could leave us a

5
00:00:37,570 --> 00:00:47,170
Speaker 1: lot worse off. I'm Eric Lander. I'm a scientist who

6
00:00:47,210 --> 00:00:50,330
Speaker 1: works on ways to improve human health. I helped lead

7
00:00:50,370 --> 00:00:53,930
Speaker 1: the Human Genome Project, served as a science advisor to

8
00:00:53,970 --> 00:00:58,010
Speaker 1: the Obama White House, and today direct the Broad Institute

9
00:00:58,010 --> 00:01:03,010
Speaker 1: of MIT and Harvard. This generation's choices will shape the future

10
00:01:03,290 --> 00:01:07,370
Speaker 1: as never before. The decisions ahead aren't just up to

11
00:01:07,530 --> 00:01:13,650
Speaker 1: scientists or politicians. We, all of us, are the stewards

12
00:01:13,690 --> 00:01:21,450
Speaker 1: of a brave new planet. In this podcast, we'll explore

13
00:01:21,530 --> 00:01:25,450
Speaker 1: hard questions. Should we alter the Earth's atmosphere to prevent

14
00:01:25,530 --> 00:01:30,770
Speaker 1: climate change? I can imagine conflicts that could arise when

15
00:01:31,290 --> 00:01:37,250
Speaker 1: nations start tinkering with the composition of the stratosphere. Should

16
00:01:37,250 --> 00:01:40,970
Speaker 1: we deploy a new genetic engineering technology in the wild?

17
00:01:41,210 --> 00:01:44,890
Speaker 1: The possibilities of a shining, marvelous future were just exploding all

18
00:01:44,890 --> 00:01:47,370
Speaker 1: around me like fireworks. And at the same time, it

19
00:01:48,170 --> 00:01:50,530
Speaker 1: was, literally in the same breath, I was also just

20
00:01:50,570 --> 00:01:55,490
Speaker 1: like, Holy crap. If this isn't used properly, this could

21
00:01:55,490 --> 00:01:59,610
Speaker 1: be really damaging to our planet. As machines learn to

22
00:01:59,650 --> 00:02:03,330
Speaker 1: mimic human decision making, can we keep them from learning

23
00:02:03,410 --> 00:02:06,650
Speaker 1: human prejudices? When I learned that there was actually an

24
00:02:06,650 --> 00:02:10,290
Speaker 1: algorithm that judges used to help decide how to sentence people,

25
00:02:10,930 --> 00:02:16,610
Speaker 1: I was stunned. Can truth and democracy survive the impact

26
00:02:16,850 --> 00:02:20,050
Speaker 1: of deep fakes? Redditors would chime in and say, you

27
00:02:20,130 --> 00:02:23,050
Speaker 1: can absolutely make a deep fake sex video of your

28
00:02:23,050 --> 00:02:26,130
Speaker 1: ex with thirty pictures. I've done it with twenty. Here's

29
00:02:26,130 --> 00:02:28,130
Speaker 1: the thing that keeps me up at night: a video

30
00:02:28,210 --> 00:02:31,170
Speaker 1: of Donald Trump saying, I've launched nuclear weapons against Iran.

31
00:02:32,010 --> 00:02:34,130
Speaker 1: And before anybody gets around to figuring out whether this

32
00:02:34,210 --> 00:02:37,010
Speaker 1: is real or not, we have global nuclear countdown. And

33
00:02:37,570 --> 00:02:40,850
Speaker 1: is it time to turn war over to robots? I

34
00:02:40,930 --> 00:02:48,890
Speaker 1: know people who have killed civilians, and in all cases

35
00:02:49,090 --> 00:02:52,530
Speaker 1: where people made mistakes, it was just too much information.

36
00:02:52,770 --> 00:02:56,810
Speaker 1: Things were happening too fast. We'll talk to remarkable people.
37
00:02:57,330 --> 00:03:02,210
Speaker 1: A West African scientist fighting malaria, the co-founder of LinkedIn,

38
00:03:03,010 --> 00:03:07,050
Speaker 1: a recent US Secretary of Defense, a civil rights lawyer

39
00:03:07,170 --> 00:03:11,530
Speaker 1: focused on artificial intelligence, one of the Navy's first female

40
00:03:11,530 --> 00:03:16,050
Speaker 1: fighter pilots, and many, many more. To get the world

41
00:03:16,170 --> 00:03:22,250
Speaker 1: we want, we'll need to make wise choices. So, fellow

42
00:03:22,290 --> 00:03:25,370
Speaker 1: stewards of this Brave New Planet, join us as we

43
00:03:25,410 --> 00:03:29,370
Speaker 1: grapple with opportunities and challenges that are too big to

44
00:03:29,410 --> 00:03:32,450
Speaker 1: fit in a tweet, but that will shape our future.

45
00:03:33,370 --> 00:03:41,170
Speaker 1: Utopia or dystopia? It's up to us. Subscribe to Brave

46
00:03:41,210 --> 00:03:46,010
Speaker 1: New Planet on Apple Podcasts. Our first episodes drop October twelfth.