Pushkin.

Imagine there's a place in our world where the known things go, like old modems, or, at the end of this brightly lit hallway, towers of computer servers, a little city of miniature skyscrapers. This vault, this data center, stores the facts that matter, and matters of fact. It's all that stands between a reasonable doubt and the chaos of uncertainty. The sign on the door reads: The Last Archive.

Step through the door to Washington, D.C., Capitol Hill. July twenty-sixth, nineteen sixty-six, a Tuesday. The United States was mired in a terrible war in Vietnam. Americans were engaged in a struggle for civil rights at home. There'd been riots on the streets of American cities, protests of all kinds, anti-war marches, student demonstrations. A lot of people felt as if the country was coming apart.

Inside the office building of the House of Representatives, Room twenty-two forty-seven, hearings were about to begin. Hearings that weren't about any of those protests. Hearings that were about computer data.

The subcommittee will come to order.

Just after ten a.m., Cornelius E. Gallagher, a Democratic congressman from New Jersey, took his seat.

The Special Subcommittee on Invasion of Privacy today begins its investigation into proposals to establish a National Data Center, a centralized facility within the structure of the national government into which would be poured information collected from various government agencies and from which computers could draw selected facts.

At the time, more than three billion government records were stored on punch cards and magnetic tape, housed in over twenty different federal agencies. So many facts, scattered all over the place. It was a mess. So the Johnson administration said, how about building a National Data Center? It would gather all the computer records of every federal agency, all in one place, the same way the Library of Congress holds books, or the National Archives holds manuscripts. Seems straightforward enough. Congressman Gallagher, though, he thought this was a terrifying idea.
If safeguards are not built into such a facility, it could lead to the creation of what I call the computerized man. The computerized man, as I see him, would be stripped of his individuality and his privacy.

Congress is one of those lagging indicators of American fears. By the time a fear gets articulated on Capitol Hill, it's already been infecting the population for a while. Think about the times Congress has brought in baby-faced Facebook CEO Mark Zuckerberg to ask him about computers and the invasion of privacy.

Mister Zuckerberg, would you be comfortable sharing with us the name of the hotel you stayed in last night? No. I think that may be what this is all about: your right to privacy.

It was too late. Congress is very often too late. Remember last episode, about the defeat of national health insurance? Congress: too little, too late. The National Data Center is a forgotten sequel to that story about national health insurance.

Welcome to The Last Archive, the show about how we know what we know, and about how it seems lately as if we don't know anything anymore. I'm Jill Lepore.

But of course you can know things. For instance, you can know a lot of things by listening to old congressional hearings and asking questions like: what the hell ever happened to the National Data Center? And did anyone listen to Congress in the nineteen sixties, when it issued ominous warnings about a future in which we'd all be stalked and haunted and tortured by that freak, that monster, computerized man?

Everyone in the whole world seems to have forgotten about those hearings in nineteen sixty-six, but forgetting about them means we keep having them all over again. It's like the horror movie where you think you've killed the monster, and then he digs himself out of his own grave and comes after you with the same rusty, bloody axe.

Cornelius Gallagher, the Democrat who chaired the committee, wasn't the only member of Congress who was really worried about data and privacy.
After Gallagher called the meeting to order, he was followed by Frank Horton, a Republican who felt more or less the same way about the whole thing as Gallagher did.

I have become convinced that the magnitude of the problem we now confront is akin to the changes wrought in our national life with the dawning of the nuclear age. Assuming the best for a moment, let us regard our computer systems as good and fair, and the computer men behind the console as honest and capable.

That was the best-case scenario. God forbid the machines were bad and unfair, and the computer men dishonest and incompetent. By the way, the term computer scientist didn't really exist then. They were called computer men, which I like better anyway, because it makes it sound as if they're half computer, half man.

Anyway, there were all kinds of reasons the federal government wanted a national data center. For one, the government had taken on problems of bigger and bigger scale: a social security program, the building of atomic weapons. The government needed to collect, and to conduct statistical analysis of, data that had to do with things like healthcare and economic development and racial justice. Lyndon Johnson's administration needed that information to fight the War on Poverty. It also needed that information to implement the oversight requirements of the nineteen sixty-four Civil Rights Act and the nineteen sixty-five Voting Rights Act. To do those things, it needed what we would call big data.

Modern life required taller and taller piles of facts, numbers, and data. Piles of data so tall that they had to be handled by computers. And remember, this was decades before the personal computer, before people had their own computers and knew how to use them, which meant that people had to be willing to put a whole lot of trust in the very small number of people, mainly men, who knew how to run those machines: those computer men.
Because of that, it seemed as if computer men had all kinds of information about you, and they seemed to have limitless powers.

Assuming a computer man who was dishonest, unscrupulous, or bent on injury, there would be nothing sacred. We could be destroyed.

Whoa. Crazy start to those hearings: we could all be destroyed. Then Gallagher called his first witness.

The subcommittee is very fortunate this morning in having as its first witness in this series of hearings Mister Vance Packard. Mister Packard is more responsible than any man in our country for alerting us to the dangers that lurk in the twilight of our sophisticated society.

The then famous, if now sadly much forgotten, Mister Vance Packard. I guess we'd now call this guy an influencer. He wasn't an academic, but he was often mistaken for one. And I'm sorry, but I love this guy. In nineteen sixty-six, he was a fifty-two-year-old social critic, and he was most famous for a book called The Naked Society, in which he'd issued a warning about the loss of privacy. The book spent twenty-three weeks on the New York Times bestseller list. Packard had written a whole series of bestsellers, a lot of them about things I've been talking about here on The Last Archive: lie detectors, targeted advertising, data surveillance.

The Times, in a profile, described him as a man who acts, on the whole, like a professor at a small college, a little unsure of tenure. This was actually a gentle way of saying that Packard was incredibly boring to listen to. But, the Times reporter added, at the typewriter he is something else again. In other words, in person, Vance Packard was a dud. But on the page, Vance Packard was a stud.

Sometimes newspapers referred to Packard as a sociologist. This really tugged at the well-trimmed beards of the nation's actual sociologists. One social science journal published an entire issue trying to answer the question: is Vance Packard necessary?
You'd feel bad for the guy, except that he was a famously happy man, untroubled by his critics. Packard liked to work in bed. He'd spend hours a day lying there. He clipped stories from newspapers and magazines and read through them. He played the banjo. He swam in the ocean, bare-assed, every morning. He was famously thrifty. In the winter, he wore two pairs of pants, one on top of the other, to stay warm. Famously thrifty, famously kooky, but either way, very, very famous.

We are indeed honored and privileged to have you open these hearings today. Mister Packard, please proceed.

Cornelius Gallagher welcomed him to the hearings with enthusiasm.

Mister Chairman and members of the committee, thank you for inviting me. Records on individual Americans now number in the billions. We leave a trail of records behind us from the moment of birth. The individual citizen who is concerned about the erosion of his privacy has up until now had some consolation in the knowledge that all these files about his life have been widely dispersed and often difficult to get at.

Are we getting prematurely overheated in our concern?

I do not think you are getting prematurely overheated at all. I think we should all be scared stiff about the possibility that these giant machines would be fed data about individual Americans, and that this information would be retrievable by a number of different organizations or groups. I think this would clearly create the preconditions for a totalitarian system.

He went on like that. On the first day of the hearings, nearly everyone agreed with Packard, down to the last witness on the agenda. I got to wondering whether people outside of Congress were worried about the National Data Center, and once I started looking, man, they were everywhere. The New York Times said the Data Center would bring the United States closer to an Orwellian nightmare. Big Brother Never Rests, ran the headline in the Indianapolis Star. From pulpits, priests and ministers and rabbis denounced the idea.
Another newspaper called it a giant Peeping Tom, and, I'm not kidding, even the Daughters of the American Revolution, the DAR, passed a resolution against the Data Center. But what about just regular people? Almost everyone who wrote to Gallagher about the National Data Center was opposed to it.

Dear sir, it is good to know that your committee is alert to the dangers of this gestapo scheme of the president. Imogene Armstrong, El Paso, Texas.

A lot of the letter writers feared a vast government conspiracy, a socialist conspiracy, or a communist conspiracy. Let me just add here that in the nineteen sixties this sort of cockamamie thing got sent by the postal service and read by one person, or maybe two, and then got filed in a drawer. Today this stuff is on 4chan or Twitter, read possibly by thousands, clicked on and linked to. I always think about that when I read these old letters: that no one has heard these voices since nineteen sixty-six, and that really, hardly anyone heard them then.

Dear sir, what better way to inaugurate a police state? Already we have creeping socialism in our government. God help us. Yours truly, Donald F. Hammond, Flagstaff, Arizona.

Gallagher was a Democrat, a liberal, a supporter of Johnson's Great Society, but the citizens who wrote to him on this topic were chiefly conservatives. Also, there was a lot of talk about dossiers.

Dear Mister Gallagher, a word of thanks for bringing to the light of day the machinations of the White House task force, employed in the Soviet-like procedure of turning every US citizen into a little white rabbit for governmental dissection, brainwashing, and intimidation via the dossier process. Sincerely, Elizabeth Nash, Front Royal, Virginia.

A lot of these letter writers were from what was then the very far fringe of the American right: anti-government kooks, white supremacists, John Birchers, conspiracy theorists.

Dear Mister Gallagher, please kill the idea of a central data bank. This looks like communist infiltration.
Also kill the civil rights bill, especially fair housing. Jail Martin Luther King for inciting rebellion. Most sincerely, Frank P. Sterling, Open, California, veteran, World War One.

Dear Congressman, the federal government is already operating too much of our business, controlling our manufacturing, interfering in our medical establishments, taking over our educational facilities and procedures. The Great Society now demands to have prior control and jurisdiction over even our personal thoughts. Guess we might just as well move to Moscow. Please do everything in your power to defeat and rout these enemies of freedom and liberty. James S. Christie, Upper Darby, Pennsylvania.

The congressman started to get a little worried. He'd write back, trying to calm these people down.

Dear Mister Christie, while I appreciate your support of my inquiry, I can neither agree with your views that the federal government is taking over our rights, nor that the Great Society is the instrument for such action. Sincerely, Cornelius Gallagher.

Remember, though, that at the time there was also just a lot of anxiety about computers, and especially about the men who were obsessed with them. In the nineteen sixties, Americans were terrified, too, of another machine, the war machine, that was calling up young men, number by number, for the draft, to go to Vietnam. Before nineteen sixty-six, the kids of the middle class, the college students, they got educational deferments or some other dispensation. But by nineteen sixty-six, the number of young men called up to the draft every month was being raised from seventeen thousand to thirty-five thousand. That meant that, for the first time, middle-class white men, boys really, were being sent to Vietnam. Meantime, the war began to seem, as in fact it was, hopeless and immoral.

The Vietnam War was the first war whose every move was plotted by computer. Johnson's Secretary of Defense, Robert McNamara, was a systems analyst, a computer man who appeared to be running this war with a giant computer that could calculate a victory index.
Johnson said there is no computer that can tell the hour and the day of peace. But apparently there was. As the story has it, people at the Pentagon fed all kinds of data into that computer: population, gross national product, manufacturing capability, the number of tanks, ships, and aircraft, the size of the armed forces. And then, about nineteen sixty-seven or so, men at the Pentagon went down to the basement and asked that giant computer, which had run all those numbers: when will we win the war? And the computer spat out these words: you won in nineteen sixty-four.

Conservatives hated Lyndon Johnson's Great Society and big government, and liberals were getting angrier and angrier about Vietnam, seeing it as a war being waged by the ruthless, relentless mindlessness of a giant electronic brain. That meant that things looked bad for the establishment of a National Data Center.

On the last day of Gallagher's hearings, July twenty-eighth, nineteen sixty-six, a witness appeared: Paul Baran of the RAND Corporation, a forty-year-old Polish-born computer man. Baran had grown up in Philadelphia. He'd worked for the Remington Rand Company, the company that made the UNIVAC, the machine that CBS had hired for election night in nineteen fifty-two. Baran was brilliant.

Suppose we build our future systems without any safeguards at all, and all information, this whole list of records we accumulate during our life, is available in various systems.

Baran told the committee something it hadn't expected to hear and couldn't really understand. He admitted that a national data center could destroy privacy as we knew it. But then he told the committee it actually didn't matter whether or not the national data center was built. It didn't require a building. Data was about to get all linked up anyway.

Whether the information is centralized in one central data bank or whether it is spread around the country doesn't make a darned bit of difference. The result is the same.
Basically, we could sum up, Mister Baran, that we certainly should not attempt to impede the growth of technology, but at the same time we should start devoting more time to building in the safeguards so that this technology can serve man rather than subordinate him to its decision.

That's right. I don't think we're going to be able to stop technology. I think that decision is not ours. But what we can do is provide all the safeguards we possibly can.

Baran's position came down to this: you can't stop technology, and you can't stop people from collecting data, but you can make data safer if you act right now. He told the committee, please do something. Set up a national data center, or don't set up a national data center. Just do something. Set up some rules, establish principles, before it's too late. Because unless the government comes up with a way to understand what data is and what can be done with it, we'll be left wandering around in a world where anyone can collect data about anything and do whatever they want with it.

The Special Subcommittee on Invasion of Privacy is adjourned.

God damn it. The House investigation ended and they didn't come up with any rules. But then the Senate decided to hold its own hearings. I've got the tapes stored here in this data center, in the halls of the Last Archive.

The Senate held its own hearings about the National Data Center a few months after the House, in March of nineteen sixty-seven. Its chief witness was a then-unknown law school professor from the University of Michigan. He was thirty-two at the time. He's in his eighties now. I called him up.

I'm Arthur Miller. I'm a university professor at NYU, with a long tour at Harvard Law School, and I wrote, in nineteen seventy-one, a book called The Assault on Privacy, which some say saw the future.

Miller's retired now. For a long time, in the nineteen seventies and nineteen eighties, he had a TV show called Miller's Court.
On an upcoming Miller's Court, we'll be talking about the draft, the subject everyone has an opinion about.

I watched it when I was a kid. I thought it was like free law school. Miller really is a made-for-TV law professor. He's nimble, witty, combative. He has a very distinctive style. He wears three-piece suits with flashy ties.

Isn't it time for your day in court?

I wanted to know how Miller got roped into testifying before the Senate hearing. He said it all started when he got a phone call from the office of Senator Sam Ervin, a Democrat from North Carolina who's best known as the guy who chaired the Watergate hearings.

My phone rings. It's the staffer from the Senate committee, Ervin's committee, and he says, we'd like you to be the lead witness on this hearing that we're going to hold on the National Data Center. And I say to the staffer, I don't know a thing about privacy. And he says, well, we're told that you've written about it. And I said, no, I've written about computers and privacy. And he says, hang on. And the senator gets on the phone. What do you mean you don't know anything about privacy? And he says, I still want you as the lead witness in these hearings. Senator, I know nothing about privacy. And he screams into the phone, well, learn, damn it.

Miller says, okay, okay, I'll do it. You don't say no to Sam Ervin. So he reads up. I asked him why so many people had gotten worked up about the National Data Center.

It just seemed like a damn good political issue that crossed Republican and Democratic lines, because when you're dealing with privacy, you're into the world of humanism, and the Democrats can rally around that, and the Republicans can rally around not creating big government that knows everything about you. The problem was that some people saw the government as the devil. Personally, I was concerned that there were no constraints on the broad outlines of the system that was being proposed.
So, March nineteen sixty-seven, Arthur Miller gets dressed up: three-piece suit, flashy tie, pocket handkerchief. He goes to the hearing in the Senate. Sadly, it wasn't recorded, but Miller tells the senators, with his usual flair, that it's stupid to oppose a technological development that will widen the sphere of knowledge just because it might be misused.

I was not Carrie Nation with an axe, out to smash the machine. Never have been. I was looking for rational controls. And the problem back then was not only was no one looking at that, but the high-technology people didn't know how to do it.

Miller agreed with that RAND guy, Paul Baran, who testified before the House a few months earlier. Baran said data collection was going to happen, and it wouldn't require an actual building like the Library of Congress or the National Archives. It would just be there, floating around, like a cloud.

The House and the Senate were supposed to keep on thinking about the National Data Center, but instead they tabled it, everything, including the questions it had raised about data and privacy. Basically, they hoped everyone would forget about it. They'd killed the monster and buried him.

One month after the Senate hearings where Miller testified, the first big, really big, protest against the Vietnam War took place in New York. Martin Luther King spoke out against the war.

We, the participants in today's unprecedented national peace demonstration, are united in our conviction of the imperative need for an immediate peaceful solution to an illegal and unjustifiable war.

Robert McNamara, Lyndon Johnson's Secretary of Defense, resigned a few months later. Johnson shocked the country when he announced that he would not run for reelection in nineteen sixty-eight. The Great Society was over, and the proposal for a National Data Center was dead.

What do you think was lost in the way that defeat happened?

When one looks at the National Data Center, it's child's play. It's fun for the feeble mind.
It was basically a statistical system. Its ability to harm was limited.

Yeah, so, I mean, that's exactly, squarely, what fascinates me about this story: looking back, was there ever a moment, and if there was, was it that moment in sixty-six or sixty-seven, when maybe we could have ushered in a set of principles that might have spared us the course we now seem to be doomed to?

If somebody had said, hey, this National Data Center is a great idea, sleek, efficient, economical, but we better take two years out and figure out the deleterious side effects of it. If that had occurred, the possibility that we shot ourselves in the foot because we defeated it may have been a reality. But I assure you, back in sixty-six, seven, eight, nine, seventy, there was no, no ability to do that.

In other words, a better version of the National Data Center never had a chance in the nineteen sixties, because people just couldn't wrap their heads around it. Or, some people could. They could see what was coming, but not enough people to really do something about it, like make those rules and build those safeguards. You probably know what happened next. Killing the National Data Center didn't mean the end of data.

It was then followed by years of expanded governmental and private systems, which in effect gives you a national data center, maturing into today's world, which is in fact the womb-to-tomb data center.

Womb to tomb, cradle to grave: data chronicles every moment of your life, and that data is entirely unregulated.

Even as the National Data Center was being buried, a new monster was rising. The Internet, or at least what would become the Internet, was first demonstrated in nineteen seventy-two, in the ballroom of a hotel in Washington, D.C. The ballroom floor looked like the bottom of the ocean. A huge octopus splayed across it, its tentacles so many cables, tangled in a seaweed of telephone lines and electrical cords.
Technicians had set up twenty-nine different terminals, each at its own station. Bob Kahn, a computer man, had organized the demo of what was called ARPANET, the computerized network of the Department of Defense's Advanced Research Projects Agency, or ARPA. People had been trying to build it for a long time, but it had never gotten finished and configured. They needed a deadline to give them a kick in the pants.

So can you tell me a little bit what the mood was like in the ballroom during that week?

Well, it was, I would say, exuberant. I mean, it was like, you know, what you'd find in Times Square at eleven thirty, waiting for New Year's to happen. Everybody was excited. They knew they were part of, you know, something important.

Were you hoping to get reporters to come write about this, or was it really just internal, for the community of people who were working on the system?

We weren't looking to try and advertise. I mean, like, did the Wright brothers have the press there when they had their first flight? I don't know. Maybe they didn't know if they'd get off the ground.

The news that day, October nineteen seventy-two, all had to do with the upcoming presidential election, which was just two weeks away. Henry Kissinger had just gotten back to Washington after a trip to Vietnam. A young Joe Biden was turning heads in his upstart Senate campaign in Delaware. The Washington Post had just broken a story reported by Bob Woodward and Carl Bernstein, an update about a break-in at the offices of the Democratic National Committee in the Watergate Hotel.

Meanwhile, over at the ARPANET demo at the Washington Hilton, a lot of wonderful and thrilling goofiness. The idea was to demonstrate that computers could talk to other computers. What they'd actually talk about didn't seem to matter much. The big idea was computers were changing from being storage devices to being also communication devices. It was amazing.
My name is Timmy the Terminal. What's yours? Jill Lepore.

So you could sit in D.C., follow the instructions in a brochure, and chat with a program running out of UCLA.

Pleased to meet you, Jill Lepore. Have we met before? No? Sorry, but I have a terrible memory for names. Anyway, my job is to answer your questions. So I'll ask away: when will this computer crash next? About five o'clock.

Timmy the Terminal: Clippy the Paperclip's grandfather. Some of these programs were really interesting, promising. Some of them were just unbelievably exciting. The Stanford Artificial Intelligence Lab had established a connection to the Associated Press's news hot line. Any terminal on that network could be turned into an AP news line using a program called H-O-T, hot. With this program linked to the AP wire service, you could read through all the latest news updates. You could even search by keyword. It was the first, the very first, version of what we would call a news feed.

Associated Press News. Nixon. Twelve news items found. Read which? Funds.

Which Richard Nixon story might you want to follow in October nineteen seventy-two? Hard to say, there were too many to choose from. Two weeks after the ARPANET demo, Nixon was reelected with an incredible sixty percent of the popular vote. When it became clear just how huge Nixon's victory was going to be, an anchor at ABC News called over to a correspondent where the Nixon campaign was celebrating.

Well, in Washington, President Nixon's number one domestic affairs advisor is John Ehrlichman, and David Schumacher is with him now in the Shoreham Hotel in Washington. David? Howard, he feels so good, I'm going to ask him whatever happened to Watergate. I don't know. I don't know. Apparently nothing.

In spite of all those investigations by Woodward and Bernstein, Ehrlichman was confident that there was nothing going on. But of course there was a lot going on. It led to an investigation and a House impeachment inquiry.
After two long years of scandal and strife, on August ninth, nineteen seventy-four, President Nixon went on live television and resigned.

I have never been a quitter. To leave office before my term is completed is abhorrent to every instinct in my body.

The story of Watergate is a story of illicit knowledge, the real monster: ill-gotten evidence, bad governance. Nixon's administration had bugged DNC headquarters at the Watergate Hotel. It had also directed the FBI and even the Army to conduct unlawful surveillance on American citizens. The list of citizens the Nixon administration had spied on was staggering. And of course, in a story of Shakespearean proportions, Nixon was brought down by the evidence he had gathered, because the president had even bugged his own office.

Secrecy and spying. That's not a climate conducive to a national data center, let alone to a network of government computers. But that's the climate the Internet was born into: paranoia about what the government knew and how it knew it.

NBC News has learned that the government has built a secret electronic intelligence network that gives the White House, the CIA, and the Defense Department instant access to computer files on millions of Americans.

Just months after Nixon resigned, a young NBC reporter named Ford Rowan began an investigation of his own into the last days of the Nixon White House. In a series of explosive reports on NBC Nightly News in June nineteen seventy-five, Rowan claimed to have uncovered yet another conspiracy, yet another cover-up.

The secret computer network was made possible by dramatic breakthroughs in the technique of hooking different makes and models of computers together so they can talk to one another and share information. It's a whole new technology that not many people know about.

He sounds a little like Vance Packard in The Naked Society, going on about how the government strips you down, down to your bare ass.
If you pay taxes or use a credit card, if you drive a car or have served in the military, if you've had major medical expenses or contributed to a national political party, then there is information on you somewhere in some computer.

Then Rowan, without really explaining it, invokes the failed National Data Center proposal.

NBC News has learned that while Congress was voting down plans for big computer link-ups, the Defense Department was developing exactly that capability, the technology to connect virtually every computer. The network, and it is referred to as the network, is now in operation.

NBC News has learned that the network computers are talking to one another. We don't know how or when it began. We do know this: one of the perpetrators is named Timmy the Terminal.

The Department of Defense scratched its head at NBC's report. Everyone involved in building ARPANET must have scratched their heads. This supposedly secret network had debuted in a public demo in a hotel ballroom. Gerald Ford's White House issued an official denial. The White House was not connected to ARPANET, which was also not a secret network. But the Senate, still in a Watergate mood, opened hearings on surveillance technology. So once again we're back in a hearing room to talk about monsters and computer men on Capitol Hill.

The Deputy Assistant Secretary of Defense appeared before the committee and admitted that the Army had illegally conducted surveillance and kept files on American citizens involved in political protests. That story had already come out. But as for keeping those records on a secret computer network:

Let me emphasize that it is not a secret network, that it is used for scientific research purposes, that it contains no sociological or intelligence data on personalities, and that it is a marvel in many ways, but it does not fit the Orwellian mold attributed to it.

ARPANET was, and always had been, unclassified. The public was invited to its unveiling in nineteen seventy-two at the Washington Hilton.
The public just hadn't been interested. And not only had the public not been paying much attention to ARPANET, but the public had really not been paying much attention to private, corporate ownership of data.

In nineteen sixty-four, Vance Packard, wearing two pairs of pants, published The Naked Society, warning that computers were collecting data about you and stripping you naked. In nineteen seventy-five, the United States finally withdrew from Vietnam. In the decade between, the American public got alarmed about what the American government was doing with its data. Between Vietnam and Watergate, Americans lost faith in the federal government, fearing, and having found out, that the government knew things it shouldn't know and was hiding things it shouldn't hide. All this happened just when data was becoming easier to store and faster to process, and computers were beginning to be able to communicate with one another across vast distances. That part's just a coincidence. But there are patterns here too, patterns that you can only see from the vantage of history. Because meanwhile, the people who were really stealing and hoarding data? Corporations. And hardly anyone noticed.

Maybe Congress is always too late when it comes to technology. I think Arthur Miller feels about this the same way I do. The National Data Center died, but then that monster came back, wielding that same rusty, bloody axe. It came back as the Internet, and for my entire life we've been having the same argument over it.

My God, I close my eyes and I say, I've seen this movie before, and it's just as bad as it was the first time.

That bad movie? It's still playing, on C-SPAN. In twenty eighteen, when a subcommittee of the Senate Judiciary Committee held hearings about Facebook, data, and privacy, Senator Orrin Hatch was eighty-four years old, Mark Zuckerberg of Facebook was thirty-three, and from their exchange it was pretty clear that Hatch had no idea how Facebook works.
So how do you sustain a business model in which users don't pay for your service?

Here, Zuckerberg blinked. Senator, we run ads.

Senator, we run ads: people give us their data for free, and we sell it to advertising agencies and to other companies. This problem was never a technical problem. This problem, a half century old, was always a political problem. Where is your data? Yeah, in the cloud. Where's the cloud? Everywhere, in data centers all over the world. What are the rules? There are no rules.

The Last Archive is produced by Sophie Crane McKibben and Ben Naddaff-Hafrey. Our editor is Julia Barton and our executive producer is Mia Lobell. Jason Gambrell and Martine Gonzalez are our engineers. Fact-checking by Amy Gaines. Original music by Matthias Bossi and Jon Evans of Stellwagen Symphonette. Many of our sound effects are from Harry Janette Jr. and the Star Janette Foundation. Our fool-proof players are Barlow Adamson, Daniel Berger-Jones, Jesse Hinson, John Kuntz, Becca A. Lewis, and Maurice Emmanuel Parent. The Last Archive is brought to you by Pushkin Industries. Special thanks to Ryan McKittrick at the American Repertory Theater, to Alex Allenson and the Bridge Sound and Stage, and to Simon Leak at Pushkin. Thanks to Heather Fain, Maya Koenig, Carly Migliori, Emily Rostack, Maggie Taylor, and Jacob Weisberg. Our research assistants are Michelle Gau, Olivia Oldham, Henrietta Riley, Oliver Riskin-Kutz, and Emily Spector. Thanks also to Rachel Henson at the Carl Albert Center at the University of Oklahoma and to Stephen Kohler at Lamont Library at Harvard.

I'm Jill Lepore.