Sleepwalkers is a production of I Heart Radio and Unusual Productions.

I was already at home, resting with my family, when I suddenly got a phone call. Something had fallen from the sky.

Picture a field in rural Mexico. Surrounding it: yellow tape, a police car flashing lights, soldiers carrying automatic weapons. Inside the cordon is a large transparent globe, a bit like a washed-up jellyfish, but in fact it had fallen from the sky. Here is state official Juan Carlos Castillo: It was important to cordon it off. It fell from the sky in an erratic way, making circles. The artifact itself had some sensors and a light that was flashing, beeping. Juan Manuel Sanchez had the guts to approach the object, look at it closely, take pictures, and share them. Those pictures quickly turned into a social media phenomenon. It really caused panic in the people. Maybe they had never seen something like this. Actually, people came from other settlements to see the artifact, to take selfies with it. I imagined that it could be an espionage artifact that relays images, classified information.

Officer Castillo worried about the origin. When a beeping jellyfish falls out of the sky, sure, it could be spycraft. But what if it's something even scarier, something otherworldly? Officer Castillo noticed that the beast was tagged with a phone number, and if you dialed that number, you'd ultimately get to balloon CSI.

This is where we find out why balloons either last a hundred days or they don't. That's La des Roche, and we're at X, a secretive lab at Alphabet, Google's parent company. Their mission is no less than to invent the future. And it's X who piloted the program to launch balloons into the stratosphere and retrieve them when they come down. So, in order to look at a balloon that's the size of a tennis court, where a failure smaller than a millimeter is a huge problem for us, we do have to build specialized equipment. So behind me is the world's largest flatbed scanner.
We're working on models of what does a certain type of damage look like, what do we think causes that damage? It kind of makes a fingerprint. But why does X send balloons into the sky for months at a time in the first place? I'm Oz Veloshin, and this is Sleepwalkers.

So, Kara, you first told me about this balloon being discovered in rural Mexico, and then we tracked down the first responders. But what grabbed you about the story? First of all, it's a crazy story: something fell out of the sky into a field in rural Mexico, which I love. But it's also one of those moments when, you know, real people come in contact with technology in a way that almost feels like, not to mention them again, but like a Steven Spielberg movie. I mean, just imagine not knowing what this thing is in your backyard. And it sort of reminds me of other stories that we've reported on, whether it's Gillian with targeting or Glen with parole algorithms. You know, the way in which people interact with technology is changing, and this is just the perfect example of that.

This February, we went to Mountain View in California to visit X and learn how they're involved with giant balloons floating in the stratosphere, because, as it turns out, rural Mexico isn't the only place they've made contact. We had internet before Maria; we had so many things that we depended on the Internet for, everything. Five days after Hurricane Maria devastated Puerto Rico, the FCC granted an experimental license to X to restore cell service to the island. The government let X step in to provide a service it couldn't, and Google sent their balloons. So when we saw what was happening in Puerto Rico, you know, it was really hard for us not to help, right? The whole company in some sense really wanted to get behind that effort.
It's kind of a rare opportunity when, um, there's a problem like that, a problem with connectivity, everyone's offline, and you happen to have a fleet of stratospheric Internet balloons. I think we're probably the only ones who can say that. That's Sal Candido, head of engineering at Loon. And if cell towers in the sky sound like the stuff of science fiction, they're already here. When a massive magnitude eight earthquake hit Peru, cell towers and cables were down, but Loon's balloons were able to restore temporary Internet access within just forty-eight hours.

But how do Sal and his team get the balloons to go where they're needed? So the idea in act one was we were going to build this ring of balloons around the world, covering an entire latitude band. Obviously, that's kind of a challenging concept to execute. Most of them will be over the ocean, not connecting anyone. The idea just wasn't feasible. So to make it work, Google had to figure out how to get the balloons to navigate. Astro Teller runs X, and he supervised the original development of Loon. We always hoped that the balloons could be intelligent in which winds they choose to jump onto. As the balloons have gotten better and better at predicting what the winds will be at different altitudes, they can play these more and more sophisticated chess games. If I go up by a kilometer, I think I could catch a wind that's going to the left at ten miles an hour, and I'll hang out there for about three hours. Then I'll go down by two kilometers. And it plays this out and makes this plan for how it's going to get not just kind of to Australia, but right over Perth.
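To make that chess game concrete, here is a minimal sketch of the kind of planning step Astro is describing. Everything in it is hypothetical: made-up wind numbers and a toy brute-force search, not Loon's actual planner, which works on live forecasts with far more sophisticated methods. The only control the balloon has is which altitude layer to ride, each layer carries a different forecast wind, and the planner picks the sequence of layers whose combined drift lands closest to a target.

```python
# Hypothetical altitude-hopping planner, in the spirit of Astro Teller's
# description. All winds and parameters are invented for illustration.
from itertools import product

# Forecast wind (east_kmh, north_kmh) for three altitude layers, assumed
# constant over each three-hour step for simplicity.
WINDS = {
    0: (10.0, -2.0),   # low layer: drifts east
    1: (-8.0, 1.0),    # middle layer: drifts west
    2: (3.0, 12.0),    # high layer: drifts north
}
STEP_HOURS = 3.0

def endpoint(start, layers):
    """Position after riding each chosen layer's wind for one step."""
    x, y = start
    for layer in layers:
        east, north = WINDS[layer]
        x += east * STEP_HOURS
        y += north * STEP_HOURS
    return x, y

def plan(start, target, steps=4):
    """Brute-force the layer sequence whose drift ends nearest the target."""
    best_seq, best_dist = None, float("inf")
    for seq in product(WINDS, repeat=steps):
        x, y = endpoint(start, seq)
        dist = ((x - target[0]) ** 2 + (y - target[1]) ** 2) ** 0.5
        if dist < best_dist:
            best_seq, best_dist = seq, dist
    return best_seq, best_dist

seq, miss = plan(start=(0.0, 0.0), target=(-50.0, 80.0))
print(seq, round(miss, 1))  # which layers to ride, and the residual miss in km
```

Multiply that search across every balloon in a fleet, against forecasts that are uncertain and constantly refreshed, and the appetite for computing power becomes obvious.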
Constantly reading the winds and predicting how they might change in order to steer around the Earth is no small task, and according to Sal, it takes a lot of computing power. I don't think that you would be able to do this, you know, if you had a person navigating each balloon. The information processing capability, and the fact that you have to be constantly watching and making adjustments: it's a job that's really well suited to a computer. It is a huge volume of computation in our data center. I think that is an area where Alphabet has a big advantage. How big is a giant data center? Oh, they're giant, bigger than shopping centers, often powered by renewable energy, so they're often built next to a river just so they can use an entire hydroelectric plant. You know, there are a ton of computers being put to all kinds of problems across Alphabet. Loon is one of them, right?

The reality is that today a company like Alphabet has more computational power, more data, and more engineering expertise than most countries. So it can restore connectivity to Puerto Rico or Peru after a natural disaster, and it can help in daily emergencies as well. As Officer Castillo told us, there are situations in his underconnected area where it's currently impossible to communicate quickly enough to receive help in time, and Loon could change that. So the balloon that fell out of the sky in Mexico, it was a sign of things to come, of a new global infrastructure being built by technology companies, not governments. And that brings real leverage, which is something we should all think about, even if we don't find a giant jellyfish in our backyard.

I'm all for development. There are emergency situations in the Sierra that are impossible to communicate to us in order to act within reasonable time frames. But I don't dismiss the possibility of it being a trick by a company to steal information from us either.

Of course, these balloons are actually very well intentioned, fundamentally: to bring internet to places where it doesn't otherwise exist. That said, Kara, Google aren't the only people trying to do this. Facebook are too, and they had a program for a while attempting to use solar-powered drones to connect the world.
Right. And when Facebook and Google are both trying to do something, it says to me that it's probably not purely a philanthropic endeavor; there's probably some bottom line in getting all those people online. So anyway, we just heard from Astro Teller, who runs X, which is historically a very secretive organization. We were invited inside the building to understand a bit more about how one of the world's most powerful companies, Google, is thinking about inventing the future.

Hi, I'm Astro Teller. I am the captain of moonshots here at X. It's the part of Alphabet where we try to design things that can become, if we're lucky, new businesses, hopefully Google-scale new businesses that can be as good for the world as Google has been. So Astro is charged with bulk-producing Google-scale innovations, as impossible as that sounds. But like any good factory supervisor, he's got some clear criteria. In order for something to be a moonshot, we require that it has three things: a huge problem with the world, a radical proposed solution, and then some kind of breakthrough technology that gives us at least a fighting chance of making that science-fiction-sounding product. If you look around, you'll see a lot of bare concrete, polished cement floors, plywood on the walls. The building's a work in progress because the projects here are a work in progress.

To even be considered for development at X, ideas have to be spectacular, balanced at the edge of the impossible. When someone says, what if we put a band of copper around the North Pole and let the flux of Earth's magnetic core, which goes up and down like a reversing, very slow alternating current, turn it into current in that wire, and then we could pipe all that current down to Norway or something and power the Earth that way? That may not actually work, but that statement, that what-if, definitely took you outside of the normal. And as Astro points out, the lab itself is a moonshot.
He's charged with delivering ten-X impact on the world's most intractable problems. But they take ethical concerns seriously too, which is reportedly why X abandoned work on an invisibility device. We go talk to the public, we go talk to experts and thought leaders, we go talk to regulators, and we put these half-formed ideas and prototypes in front of them and say: this is the problem we're trying to solve, here's how we're currently trying to solve it, what do you think? And we get feedback, and that helps educate us and re-point us in various ways. Internally, we also play a lot of what-if games. So an example is what we call design fiction, where we either make pictures or literally write stories about our technology and how it might play out. But if you can't imagine it working out in really good ways for society, we definitely shouldn't be making it. And if you can imagine it working out, but you can describe some bad things that this moonshot might cause in society, being able to describe those things means maybe we can change how we're working on the moonshot.

It's good to hear that there's a strong ethical framework governing innovation at X, but it's still a private company, and they're making ethical decisions that impact all of us. Astro has been talking about infrastructure, but remember, in our first episode, the program to deter potential terrorists using targeted ads? That was also sponsored by Google. So providing Internet and deterring terrorists are both good ideas, but when the same company does both, and so much more, the concentration of power is concerning. Then again, in a political environment of gridlock and proposed cuts to science funding, maybe we should be grateful that the big technology companies are stepping in to tackle urgent problems.

You've heard that bacteria around the world are becoming more resistant to antibiotics. There's no way for some drug company to chase after all of these increasingly antibiotic-resistant bacteria.
And yet if we don't chase after them, we're going to look back at this time in the world as the golden age of antibiotics, and surely miss it. If you could just simulate a bacterium inside a computer, that is, ask: what would happen to this bacterium if I knocked out this gene, if I changed the pH of the solution that it's sitting in, if I subjected it to a lot of UV light? If you could just ask those kinds of basic questions, you could start to design new antibacterial agents at a thousand times the rate, because you don't have to go around pipetting and waiting for these things to grow or die. If you could simulate life, that is, model any part of biology inside the computer, it would be a do-over for the life sciences. That's an example of something we're in the very early days of exploring.

Alphabet, Google, X, Loon, Jigsaw. This constellation of interlocking companies is working on everything from bringing the Internet to remote parts of the world, to deterring terrorists, to creating new kinds of antibiotics. And if that sounds a lot like the work of government, well, that's because it is. When we come back, we ask what this concentration of power means for us and what we should do about it.

We've been talking a lot about Google, Kara, but of course it's not just Google. We mentioned Facebook earlier, and then there's Amazon. Amazon makes up almost half of all online sales in the US. Just think about everything, you know. We've talked about Alexa gathering intimate data in more than a hundred million American homes. And Jeff Bezos's ambitions also include redefining healthcare in the US and, recently, colonizing space. Nice and humble. Um, but it is getting harder and harder to define what these massive businesses actually are. Donald Trump calls the Washington Post the Amazon Washington Post, which it sort of is, since Bezos bought it. I mean, I guess, not exactly the Amazon Washington Post.
But it does identify a real issue, which is that it's getting harder and harder to say what these multi-industry companies are. Well, yes, because they all do so much. So how do we regulate them? One person who's been out in front of this issue is Lina Khan. While at Yale Law School, she wrote a landmark article called Amazon's Antitrust Paradox, and we're now starting to have conversations as a society about using monopoly law to limit the power of the big tech companies. But Lina really kicked off the conversation, so I was very excited when she agreed to join us on the show.

These companies are complicated. They're oftentimes involved in multiple lines of business, and so many everyday people, including lawmakers, don't actually understand how these firms operate. According to Lina, the big three, Amazon, Facebook, and Google, have one thing in common: they've emerged as gatekeepers of the digital economy. Amazon, Google, and Facebook are in a position where they can really pick winners and losers, especially among the merchants or the content producers or the app developers that are now reliant on their platform to get to market. So if you're a consumer, you're primarily thinking about price, about convenience, about quality. You know, if you're a new parent and you can just order diapers and they'll be reliably at your doorstep the next morning, there's no doubt that Amazon has provided important benefits to consumers.
But if you're thinking about the company as a citizen, and you're thinking about the market power that it has; if you're thinking about the way that it was able to avoid paying sales taxes for the first years that it was in business; if you're looking at the way in which Amazon conducted its search for its second headquarters, where it was pretty ruthlessly pitting city against city, and showing that it was really willing to extort municipalities to try and get the biggest subsidy possible, and then ended up playing a bit of a bait and switch, where it collected all of this information from all of these different cities, which now inevitably will give it a competitive advantage. I mean, there are just so many different dimensions of Amazon's dominance that are troubling.

If there was any doubt about Amazon's power to compel politicians, the search for the second headquarters should have cleared it up. And now Amazon is beginning to build leverage over the federal government. There have been reports from the Institute for Local Self-Reliance about how Amazon is now on the cusp of receiving potentially a big contract with the Pentagon. So it's just entrenching itself deeper and deeper into our daily lives, in ways that, if you just look at in isolation, you will miss the bigger picture. But I think as a citizen, and as somebody who's looking at the social and political implications of this, it's quite troubling.

I'm very struck, Kara, by Lina's distinction between being a consumer and being a citizen, because of course, as consumers, we always want the best price and the most convenience. But it's fascinating to think that that can be directly at odds with our responsibilities, and even our interests, as citizens. Yeah, you know, I think a lot of readers who love independent bookstores have this problem.
You know, do I order on Amazon because I want the book tomorrow and because it's cheap and because I'm trying to save money, or do I buy it at the independent bookstore that, you know, I've gone to and loved for years? So yeah, I always think about that. But you know, I think we also have to think about shareholders and politicians when we talk about this. Amazon shareholders are trying to keep Amazon from selling Amazon's facial recognition technology to the US government. But those same shareholders are profiting off of Amazon's success, so we really can't always rely on shareholders to do that. Yeah, I mean, those shareholders who don't want Amazon to sell facial recognition technology to the government are actually acting against their own interests, and that's an anomaly, which is why we have government, theoretically. But remember the Facebook Senate hearings? I mean, the senators did a terrible job of holding Zuckerberg to account.

Mr. Zuckerberg, I remember well your first visit to Capitol Hill back in two thousand and ten. You spoke to the Senate Republican High-Tech Task Force, which I chair. You said back then that Facebook would always be free. How do you sustain a business model in which users don't pay for your service? Senator, we run ads. I see.

Senator, we run ads. That was Mark Zuckerberg's response to Senator Orrin Hatch, which was probably the low point in a series of low points in those Senate hearings, and also the moment that really made me want to make this podcast. Because clearly, the people who have the power and duty to question the big technology companies are either derelict in their responsibilities or they simply don't understand what's going on, and I'm not sure which is worse. Part of the issue with the Senate is that collectively they are a million years old. But part of it is also that they all rely on Facebook ads and pages for reelection, and they want Facebook money in their states. That's what we call a conflict of interest.
Conflict of interest. A friend of mine from university literally wrote the book on this issue. It's called Future Politics. He's a lawyer called Jamie Susskind. It's an ancient idea in human civilization that we don't allow great forms of power to be erected over us without some degree of transparency, so we know what's being done with technology. It's very early days, but we need to look at this stuff as citizens, like we would at any form of power that accrues over our heads, whether it's corporate power or political power or great economic power. One of the big problems, as Jamie sees it, is that as these big technology companies get involved with things like creating new drugs or fighting terrorism, we start to talk about them as nation-states, and we miss the mark. You know, you'll see journalists and commentators saying our tech firms are the new states, and I just think that's sloppy thinking. They might have some stuff in common with states, but tech firms are commercial entities operating in a market system for the pursuit of profit, answerable to their shareholders, which is obviously just a profoundly different social institution to a state.

Kara mentioned that Amazon shareholders were pressuring the company to ban facial recognition software sales, but in late May of this year, that measure was roundly defeated by a vote at the company's annual general meeting. So Amazon will continue selling facial recognition technology. And this highlights what Jamie is saying: these firms are motivated by profit, unlike states, which are motivated by protecting their citizens. Senator Elizabeth Warren has suggested one solution might be to treat the big technology companies like utility companies, with things like regulating price and access. But Jamie thinks this actually underplays just how powerful and consequential these companies really are. They're not like utility firms. The water company doesn't get you to do things you wouldn't otherwise do.
It doesn't affect the democratic process, it doesn't set the limits of your liberty, and it doesn't distribute things of importance throughout society according to principles of justice, which are all things that I would say tech firms now do. So, rowing back: we don't really have the words to describe what a tech firm is, conceptually and politically, and so it's no wonder that we're not coming up with policies and regulations and laws, because we don't even have the words to describe the future, even if we could see it. When we come back, we investigate that future and ask what we can do to regain some control.

So, Kara, everyone is now thinking a lot about the role of technology in our lives, and one of the popular things recently has been for reporters to try and live without technology. I feel like phone detox is the new Weight Watchers, and I'm kind of sick of it. But Kashmir Hill did this really interesting story for Gizmodo where she tried not to use any products from the big five tech companies. You know, she blocked Amazon, Facebook, Google, Microsoft, and Apple, and she said it was hell. You know, I think about my own life: Google and Amazon are absolutely indispensable to me. And then there are times where I'm like, oh, I don't really use Facebook that much, you know, and then I'm sitting in a public restroom at two p.m. on Instagram. And that's Facebook, right. And I think what Kashmir Hill's story also revealed is that even when we think we're not using one of those companies' products, we may well be using a website or an app that's powered by them. So it's basically impossible to opt out. Here's Lina again. If what you mean by opt out is, you know, not have an Amazon Prime account or not have Gmail:
You know, there may be ways in which you can stop using these services in a day-to-day sense. But Amazon also owns Amazon Web Services, which, you know, much of the Internet now relies on. Google is providing the back-end infrastructure for so many other services. So I think, you know, if you're trying to delete these firms from all aspects of your life, it actually becomes very difficult to live in modern-day society. And like we said, this applies to all of the big five, but let's play it out with Amazon, and we can start with the easy stuff. So, Kara, no more things for the kitchen from Amazon.com. Well, there's Walmart. Um, no more groceries from Whole Foods. Can we afford it anyway? Um, and no more Amazon Prime Video. Sad to miss Mrs. Maisel. But hello, Netflix. You know you like Maisel. Um.

I think the consumer-facing stuff is obviously interesting, but the more consequential piece of this is the back end of the Internet. You know, Lina was talking about Amazon Web Services, which is the cloud computing that powers so many of the services we use every day, and Amazon basically sells this to other companies. So actually, without this back end from Amazon, there is no Netflix. Unilever, Pfizer, General Electric: these companies all rely on Amazon Web Services. So does NASA, where I went to space camp. Sort of. And even Apple, who compete with Amazon in areas like streaming and devices, are also on track to spend more than three hundred million dollars with Amazon this year. So Apple are paying literally one of their competitors to host their data. Yes. So if you're using the Internet, you're in Amazon's territory. Territory is a good word. One of the more interesting analogies I've heard is, well, I'll just let Jack Clark say it. I think it's kind of analogous to feudalism. Like, you and I get to live on some sort of estate, like Google or Facebook or Twitter.
You know, the estate is owned by a feudal lord, which is the owners of these companies and their boards, and the estate is able to extract my labor and benefit from it, and what it gives me is stability. Jack is the policy director of OpenAI, an AI research company founded by Elon Musk and Sam Altman. When Jack says labor, he's referring to user data. Now, part of OpenAI's mission is to compete with the AI labs at Google, Facebook, and Amazon, so Jack might not be fully impartial. But let's think about that comparison to feudalism. If you were a peasant in a village owned by a feudal lord, you had some guarantee of stability as long as you stayed on that platform, but you couldn't leave the platform. And actually, as early as the thirteenth century, English peasants were being required to carry identification cards, because people would get really, really grumpy if they walked off the estate they're on and tried to go somewhere else. And I think that's actually very similar to where we are today. We don't have portable data. Our ability to economically benefit ourselves with our data is actually very, very limited, and we live in this kind of neo-feudal system where you get to pick your platform, they get all of the data and the benefit, and you get some free service in exchange.

So we perceive what we're getting from these companies as free, Kara, and all we have to do is tolerate a few advertisements. But as we talked about, we're not just giving them our eyeballs to look at ads. No, we're allowing them to understand our patterns, behaviors, and desires better than we understand them ourselves. And then they're using that knowledge about us to hold our attention, influence our behavior, and monetize us and sell us stuff that we don't need. When you think about it, historically, leaders have killed for this sort of power and influence, and we're handing it over freely to Bezos and Zuckerberg and Larry Page.
We're eroding our power as citizens because it's easier and more comfortable to be consumers. I think it's important to remember that the same place where a person can buy a twelve-pack of toilet paper is also the same company that is selling Amazon Web Services to the Pentagon. Yeah, I mean, it's sobering. It is sobering.

We focus a lot on how the big technology companies are powered by our data, but there's something else as well. Jack Clark of OpenAI argues that this focus on data is distracting us from something even more fundamental: who owns the computing power? Now, a bet that we have is that the value of those large amounts of data is going to reduce over time, as you develop algorithms that are better able to extract structure from smaller and smaller amounts of data. But you're always going to need compute to allow you to run more experiments and train bigger systems. So we think in the long term, compute might be the key determiner of AI progress. Just to clarify: compute is short for computing power, the ability to process large amounts of data. It's what's needed to help Loon's balloons model and catch the winds, and as Sal mentioned, Google uses so much of it that they build their massive data centers next to rivers, so they can use hydroelectric energy to power them. And compute has grown at an extraordinary rate as we've become able to make smaller and smaller circuit boards and pack more power into smaller cases. Now, OpenAI recently did an analysis where we looked at the amount of compute that had been used in breakthrough AI systems in recent years. And when we did this analysis, we found out that the amount of compute has grown by three hundred thousand times in six years. Three hundred thousand times in six years. What does that mean in practical terms? What does that growth actually look like? Well, I think a better measure here is to think about your phone battery. That's equivalent to your phone going from having a battery that lasted for one day six years ago to a battery that lasts for eight hundred years today.
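That analogy checks out arithmetically. Here's a quick back-of-the-envelope calculation using only the two figures quoted in the episode, the 300,000x growth factor and the six-year span:

```python
import math

growth = 300_000   # compute growth factor quoted in the episode
years_span = 6     # over six years

# The battery analogy: one day of battery life, scaled by the same factor.
battery_years = (1 * growth) / 365.25
print(f"{battery_years:.0f} years")  # ~821, i.e. roughly "eight hundred years"

# The doubling time implied by those two quoted numbers alone.
doublings = math.log2(growth)  # ~18.2 doublings in six years
print(f"{years_span * 12 / doublings:.1f} months per doubling")  # ~4.0 months
```

For reference, OpenAI's published analysis reports a doubling time of roughly 3.4 months; the small gap comes from the rounded figures quoted here.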
That's what that growth looks like, and it means that more powerful capabilities are coming into view faster than we expect. It's that kind of growth that's allowed a fleet of Loon balloons to model the winds, adjust their altitude, and sail to precise locations. But the computational power is in the hands of the private sector, and that is dramatically increasing their power. We have brilliant people come and work with us from places like, you know, MIT or Stanford, and one of the things that attracts them to work here, or attracts them to work at a Google or a Facebook, is we can give them more computers than they can get at their home institution. I think if we don't solve this disparity, we're going to really wreck the public benefits of scientific research, because you're going to have a whole class of research which only occurs in the private sector, and therefore there are very few guarantees that that research will always be public.

Earlier this year, President Trump announced an executive order called the American AI Initiative that laid out a plan to address some of the concerns we've raised in this episode, but it didn't come with any funding. So innovation and ethical decisions will remain in private hands for the time being, and this means that AI technology isn't guaranteed to serve the public interest, and may even get into the wrong hands. I think the first step we as a community need to take is to acknowledge potential harms. And once we have that mindset, I think it becomes easier to sell the scientific community on: okay, we have a sense of what harm looks like, what do we commit ourselves to in order to minimize it? So, yeah, that conversation has to happen.
I think if that conversation does not happen, then you're going to have this arms race, absent the conscious creation of norms here. I think that that's the default, and that default world terrifies me, because I can see AI research today that in two or three years is going to give us, say, unprecedented capabilities in drone autonomy, letting a drone navigate to a target in a small urban area. Now, that's obviously an amazingly good thing if we want to create rapid-response drones that can deliver, say, tools for dealing with cardiac arrest to someone undergoing that on a city street. I don't want to think about the version of this where the drone has some explosive strapped to it and is being used to assassinate someone. And I want to have the AI community confront this problem.

But the AI community won't be able to do this alone. We need our politicians to step up and create meaningful laws and regulations. Ultimately, relying on tech companies to regulate themselves is an abdication of responsibility. Here's Lina Khan again. The dominance of these tech companies is not inevitable, and none of the economic outcomes that we're seeing in these markets are inevitable. They're deeply shaped by laws and policy, and by the political choices that we're making about how we allow these firms to expand and grow, and the kinds of practices they're allowed to engage in. So I think it's really important to push back against determinism and inevitability narratives, and reassert the role of law and policy in shaping economic outcomes.

There's an important balance between public and private institutions in America, and letting either side grow too powerful creates problems. But right now, the big technology companies are becoming so large and powerful as to be ungovernable. And no matter how ethical or well-intentioned they may be, they're not motivated by fairness or protecting the weakest in society. They're motivated by shareholders and profit. That may seem like an insurmountable problem.
But we have a history in this country of making laws that disrupt special interests and raise the quality of life for everyone. Think about the formation of the EPA, the New Deal, even the Civil Rights Act. And actually, just this week, as we've been preparing to release this episode, the House Judiciary Committee has launched a major bipartisan antitrust investigation into the big tech companies. This is the first time in decades that Congress has investigated a specific industry. And who is advising the investigation? Well, that would be Lina Khan. Yeah, we got in touch with Lina to hear her thoughts on the probe and what we might be able to expect from the investigation, but she's not able to comment publicly yet. Still, she said a couple of interesting things in our original interview.

The goal of antitrust is to keep markets competitive, and I think it's important because it affects us as consumers, it affects us as workers, it affects us as entrepreneurs, and it affects us as citizens. Right? I mean, I think the structure of our economy has huge consequences for our day-to-day lives, and antitrust is a key component of that. If we saw more robust antitrust enforcement, then these markets should become more open to competition, and so in ten years you would not necessarily see Google, Amazon, Facebook, Apple, and Microsoft continue to be as dominant as they are, because we would have seen innovators. We would have seen the next wave of breakthrough disruptors. I think it's also worth acknowledging that the dominance these firms enjoy in certain markets may be something that we regulate rather than address through breaking up.

So since we've started working on Sleepwalkers, there has been tangible progress, both in the US and around the world. But this is only the beginning, and we can't relax just yet.
In the next episode, we bring things down to earth and look at how AI may help us feed the world. And along the way, we use machine learning to invent a brand-new seltzer flavor just for us. I'm Oz Veloshin. See you next time.

Sleepwalkers is a production of I Heart Radio and Unusual Productions. For the latest AI news, live interviews, and behind-the-scenes footage, find us on Instagram at Sleepwalkers Podcast, or at sleepwalkerspodcast.com. Special thanks this episode to the whole team at Weird Moved West, an incredible production company in El Paso, Texas, who helped us track down and interview Officer Castillo about his encounter with the Loon balloon. Special shout-out to J. W. Rogers, Jorge Carreon, Asail Anya, and Leonel Portillo, who voiced Officer Castillo's translated lines. Sleepwalkers is hosted by me, Oz Veloshin, and co-hosted by me, Kara Price. Produced by Julian Weller, with help from Jacopo Penzo and Taylor Chacogne. Mixing by Tristan McNeil and Julian Weller. Our story editor is Matthew Riddle. Recording assistance this episode from Miguel Paris and Chris Handbrake. Sleepwalkers is executive produced by me, Oz Veloshin, and Mangesh Hattikudur. For more podcasts from I Heart Radio, visit the I Heart Radio app, Apple Podcasts, or wherever you listen to your favorite shows.