Tomorrow's Monsters is a production of iHeartRadio, Flynn Picture Company, Psychopia Pictures, and Upper Room Productions.

Monday, March second. Subject oh-three-B, Tomorrow. Jessup, we are currently at... hours without sleep. Patient reporting agitation and lack of focus several hours sooner than the last dose. Are you feeling tired? Coming down? I think. Any side effects since last night? Um, a little ringing in my ear. Which one? Uh, the left ear. Anything else? Mood swings? Are you upset? No. But you are crying. Why are you... I'm just emotional. The voices, too. Voices? Yeah. Have you talked to Dr. Berkeley about that? I'm not gonna tell you that. Excuse me? No, um, no, no, no... Let me call Dr. Berkeley. Don't. I don't need your shrink. Sorry. I'm fine. Really, I'm fine. Okay. And you haven't been feeling ill? Physically ill? Look, can we get on with it and dose me already? We can do the dog and pony show some other time. Yeah... it's important to collect these data points before... I'm sorry. I'm sorry. I just need the fog to go away. All right? All right.

For the record, this is round thirteen, with dosage of two full exposures, five PRF, five seconds each. Eyes forward. Deep breath. Ready? Yes. Here's exposure one. Keep breathing. Tomorrow. Tomorrow. Tomorrow. Okay, that was exposure one. Patient is responsive. Heart rate spiked too, but seems to be stabilizing. Okay, let's go through the questions. What was your first thought? Mm. Red balloons. What did you smell? Cut grass. Any tingling fingers? Sleep. In phase two, breathe deep, let the blood get back to your head. I don't remember starting that one. I went deep that time. Ready for exposure two? Yes. Okay, and three, two... And that was number thirteen, administered. Tomorrow... shit... good morrow... hm... Christ. Where am I?

Would you like me to go online to determine location? No, stay offline. Just tell me the last mile marker we passed. Twenty-eight miles west of Eureka, Nevada. Anything else I can help with? Pull over. Okay.
Pulling over for safety. Please keep your seatbelt on until I come to a complete stop.

It's January... January, seven a.m. Everywhere I look, I see monsters and ghosts, headlights in the rear view. It feels like everyone is out to get me. I'm on the loneliest road in the world. At least that's one of the benefits of taking back roads: less chances for me to get tagged by facial recognition or motion ID. I'm using old analog paper maps. My father taught me to use them instead of virtual maps when you need to stay off the radar, which is exactly what I have to do. I'm also avoiding all major interstates and cities, which makes me harder to track, but it will take longer. It's the best chance I have to make it across country to Arlington, Virginia without getting caught. At least it's scenic. Sun is coming up now, and sagebrush for days. This drive... you know, it's the first chance I've had to think, to put all the pieces together in my head. I wasn't sure where to begin, but now I know. Let me play you something.

This is Dr. Cassandra Berkeley. It's Sunday, June sixth, session three, subject Max Fuller. Okay, let's revisit the lake, the frozen lake. I don't see why we need to dwell on that. Let's close your eyes. Okay, good. A deep breath. Good. Now, you are there. Tell me what you see. Not what you remember, but what you see. A boy. Where is the boy? He's lying flat on a thin sheet of ice cracking beneath him, spreading his arms and legs out as wide as possible. He's taking slow, shallow breaths. He's trying to distribute the weight evenly so he doesn't fall through the ice. He's struggling to remain lifeless; any small motion... That is Max Fuller. This is an excerpt from his psychiatric session. I've been listening to it, looking for new clues. He's praying for someone to see him out there, to see him before it gets dark. And what does he want most of all? He wants his father. And I know it's true.
This actually happened to Max Fuller when he was a child. I know because he told me one night when he was high on his own product. This is a blueprint for madness. This is how monsters get made.

My name is Jack. Jack Locke. I'm not a journalist, so the telling of this may be a little disjointed. I'm not a neuroscientist or a bioengineer, though I almost was once, so I can offer some perspective in that regard. So what am I, other than a thief and an addict, a con man and conspirator? I'm nothing, man, and nobody at that. Despite this, I'm asking you to believe exactly what I'm going to tell you. You may know Max Fuller, but you probably have no idea what kind of man he really is. But you need to know. It is vitally important that you know, that the world knows, who Max Fuller really is, and that he may be responsible for the single greatest spike in human evolution since the dawn of mankind. It hasn't happened yet. Oh, but it's about to. You see, Max Fuller's work is going to change the world, and we need to stop him at all costs. All right. But I'm getting ahead of myself, and I need to catch you up, because I don't have much time left, and several people already died.

We have some breaking news in southwest Oakland right now, a death investigation. A woman's body was found this morning in an apparent suicide. Officers have not confirmed as of yet whether it may be a homicide. Authorities were trying to determine how the man fell some thirteen stories to his death. As soon as we have more information, we'll bring it to you.

Some of them died of apparent suicides, all within three months of completing the first clinical trial of Max Fuller's human enhancement products. Of course, this is a connection the cops never made, or maybe weren't allowed to. Coincidentally, around that time Max Fuller's startup Next Corp was ordered to temporarily suspend testing by the Human Enhancement Administration for violating safety protocols.
It was, legally at least, a minor infraction. We are disappointed that the HEA has chosen to penalize Next Corp, but in good faith we will openly welcome the investigation and will further continue to comply with HEA and FDA guidelines as we develop our product pipeline. We welcome...

The product pipeline that Max Fuller is referring to is his entire menu of mind apps, a sort of software platform that feeds directly into the human brain, reprogramming our neural code and allowing us to author ourselves and enhance our existence in ways that were previously unimaginable, creating what Max Fuller calls his Human two point oh. But you know, no one's ever thinking about the side effects, not just for the individual. What are the repercussions for civilization when we give corporations direct access to our fucking brains? At what point are we no longer just consumers? At what point do we become their product? No one's thinking. Even though they never stopped testing these mind apps, it's not too late. It's not too late to stop what Max Fuller started. But of course, I need to get to him first, and I'm almost out of time. Jenna. Hello, Jack. Let's go. Please buckle your seatbelt.

Max Fuller, everybody. Imagine getting a third more out of life. More time. Our greatest resource, greater than energy, food, air, anything, really, is simply time. Time to do the things you really care about. And I'm not talking about living longer now. I'm talking about living better. Fuller. Imagine a world that never sleeps. Literally. Thomas Edison did, and so did my father. And we've been up to something very special at Next Corp, a secret project called ShutEye. Secret until now. It's a safe application with one very simple benefit: you never have to sleep again.

Max Fuller has been developing these technologies for over a decade. Now, it should be pointed out that these sorts of efforts have been going on for the vast bulk of human history.
Indigenous cultures have been using plants to alter consciousness for centuries, and humans have been drinking caffeine for as long as we've been able to heat water. Now you jump to a few decades ago: the entire conscious world was living on Adderall, steroids and modafinil, nootropics, and an endless list of other barely legal designer drugs. Then, with advancements in nootropic stacks and AI chip implants, we were able to treat disorders better than ever and boost our abilities in incredible ways. Cochlear implants drastically improved our hearing; corneal implants allowed us to see in the dark. And then came the brain-computer interface, and then nanotechnology gave us Thoughtware. We started to become cyborgs, superhuman. We could think faster and run farther than humans ever had. But none of the drugs of the past compared to Max Fuller's concoctions. And by the time Max made that keynote speech, Next Corp had already been courted by DARPA as well as the governments of China and India. Only at this point, none of these clients, none of these investors, had any idea how far along in the process Next Corp was, except his own team and, well, me.

Mr. Fuller. You must be Jack. And it's Mr., not Doctor. My father was the doctor. This is from the first time I met him. I was interviewing for a position at his company, Next Corp. And she's not one to ever be impressed. Oh, good. Um, I worked hard on it. Yes. So, you're from London originally. When did you make the jump over to the Bay? We moved to Irvine first when my mum... well, and then a position opened up in Northern California, and a week later we were on a plane, and that's how the journey began. Master's in computational neuroscience from UCSF, background in signal processing and Thoughtware. And you were with Stabano, right? Yes, three years. Which product lines were you on? So for the first year I worked on the NeuroStim headset. That was a big one. Yeah.
It was their best-performing product. So I ran... and I was on the research team for the Nautilus X, which... that was crazy. But I left before it went to market. What made you leave? Honestly, everything was moving too slow. Well, yeah, that's the entire business, isn't it? You have a thousand hoops to jump through before you even move to animal testing, and then more before you even get to your first... I don't mean that. I mean, I don't think anyone I was working with was forward-thinking. No, I mean, it was all about the bottom line. I mean, I get it, they have a mandate to their shareholders to make money, but I also believe corporations have a responsibility to their community. Go on. Well, look, they aren't really trying to help people. I mean, they're not in the business of helping quadriplegics operate wheelchairs with their brains. And... come on. See, that's what my thesis was based on. I wrote this whole theory... Your thesis. I read it, so I know what you're talking about. Yes. Look, I lied about your resume, because I didn't want you to be nervous. I mean, you're qualified, sure, but frankly, I have interns here with more impressive resumes. Oh. So I wanted to hear you out. Can you build on it a little bit for me? Build on... your thesis? Oh, I'm... yeah, absolutely. Sure. So, the first processing shift away from CPUs was neural networks, right? And even though the coders and computers tried new ways to mirror human cognitive function, the goal never changed. See, we still wanted to turn machines into humans instead of having humans become more like machines. I'm gonna stop you there. I'm sorry, Jack. I'm not looking for a history lesson. I'm looking for new ideas. Me too, and frankly, that's why I want to work here with you, Mr. Fuller. And I do appreciate that, Jack, I really do, thank you, but unfortunately that's not enough. I like where your head is at, thinking outside of the box.
But from what I can tell, you've been dragging this thesis around with you for a few years now, and it might make for an impressive conversation at a dinner party. But I need people with new ideas, people motivated by something greater than... I'm sorry. Shit. Hang on. I'm sorry, Mr. Fuller. I should have been up front. The reason I want, the reason I have to be here at Next Corp... it's my mother. She's dying. Her mind is going. They can't figure it out, and I don't... I didn't expect you to be able to do anything, okay, but I thought... I'm so dumb. I just thought that maybe, maybe... I don't even know now. I don't know what I was even looking for, or why the hell you would give half a shit. But you, sir, you are light years ahead of the others, Mr. Fuller. And I do have ideas, I do, for days, and I am motivated. But I understand your hesitation, and thank you for your time, man. It means... Wait. Shut the door.

This is all bullshit, of course. My mother is alive and well and living in a three-bedroom split-level ranch in Arizona. I did feel guilty about using that, I did, but how the hell else do you get into a place like this? The science behind what Max is doing is so complex as to be mind-bending. My background only took me so far. And like Max said, I was qualified, but so what? And honestly, there was no amount of preparation I could have done here. So I did what I had to do. I took a different tack. I'm nothing if not resourceful. I did enough research to assume Max had a soft spot, a vulnerability.

If you don't know Max, you've probably at least heard of his father, Dr. Walter Fuller. His groundbreaking research into the link between Alzheimer's disease and sleep deprivation earned him worldwide fame and a Nobel Prize in Medicine. A powerful man. Big shoes to fill.
Our plan is a step-by-step process with which we hope to tackle the effects of the disease and dramatically improve the quality of life of its sufferers, and eventually, with grit and determination, turn the tide on this illness and win the war against Alzheimer's once and for all.

He wasn't blowing smoke. You may know the names of some of the life-saving treatments that resulted from Walter Fuller's research. Hell, you probably know one of the five million people affected each year by the disease. And he won the prize when he changed the delivery system.

Dr. Walter Fuller is the recipient of this year's Nobel Prize for outstanding discovery in the field of life sciences, physiology, and medicine. The annual award was presented today to Dr. Fuller for uncovering the link between dementia and sleep deprivation and the development of Thoughtware, which uses nanotechnology to regulate hormone levels in the human brain. The nanites, as we call them, act as a bridge between regions of our brain and help regulate serotonin, melatonin, and dopamine levels. So the technology really has implications for treating a variety of diseases, including depression, Parkinson's, dementia, Alzheimer's.

What you may not know about Walter is the tragic and ironic turn his life took. Welcome to World News Tonight. I'm Jane Coleman. We begin tonight with devastating news from the medical community. Neuroscientist and Nobel laureate Walter Fuller has passed away, sadly, after a long battle with the very disease he worked so hard to eradicate. The guy died of Alzheimer's. Walter Fuller, for all of his brilliance and determination, for the lives he saved with his innovations, Walter Fuller's personal life was a disaster, a story of loss, of lifelong despair, of taking all of his pain and channeling it into his research. He was survived by his only living relative, his son, Max Fuller. Strong stuff here. British blood.
I captured this weeks after getting my foot in the door at Next Corp. I earned Max Fuller's trust, and I had every opportunity. My mum, I took her out to Arizona. That was long ago, man. At the time, I wasn't sure why he confided in me, but I can see now that he was desperate. He needed... Do you have any siblings? ...a friend. Uh, yeah. It was a vulnerability I was happy to exploit. Just the one. What did he do? He's dead. I'm sorry, I didn't mean to... No, no, no, no, it's okay. It was a long time ago. A long, long time ago. Were you close? I mean, identical twins, so yeah. But it was complicated. It's always complicated, though. What was he like? If that's not too forward. It's okay. Um... Benjamin... he was... he was a kind of bully. Ben detested fear. If you were afraid of something, he'd get in your head, pick on you, work on you until you gave in. Especially when it came to a game of chicken.

The day my brother died, it was winter, and it was way too late to go out. Dad said no to walking out on the ice, and Ben had to prove him wrong, defiantly. And so we went out, three steps at a time. He takes three, I take three, just like that. At one point, another step, I saw that we were too far from the shore, where if we fell through, we wouldn't be able to touch the bottom. And I remember looking at that shoreline of snow melting on the bank, air freezing in my lungs. And there was a sound, a small explosion. I thought maybe a tree branch had cracked. That was just my mind protecting me from what really happened. When I turned back around, Ben was gone. One second he's on the ice, and then there's just a dark hole and a crack running underneath my feet. Cheers.

Anyway, I really did assume Max was softening on me, letting down his guard.
But in retrospect, I can't help but wonder if he was tipping me off to something. Bright and driven, strong-willed and stubborn, but there was still a quiet and timid boy inside. There was still this broken kid who had survived the ice that day. I mean, maybe in a way he was paying homage to his brother, drowned so long ago in the frigid Midwestern lake. Or, you know, maybe he was trying to prove himself, proving his own significance. Yeah. First to his father, who apparently never got over the loss and maybe blamed his surviving son for what had happened out on the frozen lake that day, and second to himself. Either way, it drove Max to be the man he is today. A fast-talking genius, a visionary for sure.

Wait... shut the door. Again, this is from my initial job interview with Max, right after I lied to him about my mother suffering from the same disease that killed his father so he would hire me. Just wait. How long? How long has she been sick? How long has she... well... she's been foggy for a while. Hoo. Just so we're clear, you know I can't save her, right? I don't know. I mean, whatever she's sick with, if the doctors don't know, then I don't know either. I'm not... And I'm sorry about that. I really am. Thank you. I know how it feels. I know how it feels to see something that was once so vital just drain away. When my father got sick, it was like watching his entire essence just dissipate. Mum is the same. She used to have this laugh... so funny. I kind of... You can't help her. That's not why... I mean, why are you here? Because I don't want this to happen to anyone else. Well, we don't need another intern. But no, with your skill set, there is a lot you can do. We've got a massive meeting coming up for potential clients and investors, and I'm going to be busy on the product, but I'm also the face of this company.
It's, um... it's very hard being two people at once. So maybe you can shadow me, be like my body man, help me bridge the gaps between departments. Well, more like a... like a hawk, I'd say. I need someone with a thirty-thousand-foot view from all sides. I mean, you're a structural man, good with systems. You can help with everything from logistics and cost analysis to helping me with my personal schedule. I can do that. And you're not above making coffee? I make a very banging coffee. Okay. Okay. I'm... I'm taking a chance here. I know, and I understand that. I won't fuck this up, I swear. This means a lot, Mr. Fuller. You can call me Max. Yeah... Max, then. Well, I guess we'd better send you back down to health and paperwork, huh? Okay, sounds great. Oh, and hey, Jack. Yeah? We're dealing with some sensitive shit here, and I have to be able to trust the people around me. The truth is that I'm gonna need another set of eyes around here. Sometimes I'm so blinded by the work, I get a little lost. I could really use a designated driver, so to speak. So we need to trust each other with the truth, even if it's a little uncomfortable. Agreed? Agreed.

Max has a side to him, a certain softness to him, a kindness. When I brought up my mother's illness, his eyes welled up. He took it so hard, even I almost believed my story. But again, my mother is alive and healthy. And whereas the reason I gave for pursuing this job wasn't technically on the level, there was a touch of truth. Max was reckless during human trials. Under his watchful eye, something happened: his subjects walked away changed, damaged. Three of his subjects committed suicide. At least, that was the official cause of death. One of them, one of them was my brother Michael. That's why I'm heading to find Max.

Jenna, tell me the last mile marker we passed. Mile marker... Speed up, please.

Max Fuller created a series of miracles in these labs.
But these mind apps come with a steep price, and from what I've seen, that steep price will be paid again and again by anyone who uses them. Max Fuller must be stopped, and I'm the man who's going to stop him.

Tomorrow's Monsters, starring John Boyega as Jack Locke, Darren Criss as Max Fuller, Marley Shelton as Cass Berkeley, Clark Gregg as Walter Fuller, Sahr Ngaujah as David Truesdale, Nicholas Tecosky as Finn Connolly, Claire Bronson as Dr. Abbie Reynolds, David Chen as Michael Corbin, Suehyla El-Attar as Jenna, Victor Rivera as Eddie Binder, Robert Pralgo as Agent Batty, Steve Coulter as Senator Berkeley, Rhoda Griffis as Rainy Webb, with additional performances by Helen Abel, Jason Williams, Michael Anthony, Robin Bloodworth, and Teresa Davis. Our first assistant director is Michael Monty. Our second assistant director is Sarah Klein. Sound and music by Ben Lovett. Additional sound design and editing by Benjamin Belcolm, Justin Robowski, and Mike Reagan. Casting by Jessica Fox Thigpen. Our executive producers are Scott Sheldon, Shelby Thomas, Alexander Williams, and Matthew Frederick. Written by Dan Bush and Nicholas Tecosky. Created by Dan Bush and Conal Byrne. Directed by Dan Bush. Produced by Beau Flynn, Dan Bush, and John Boyega. Tomorrow's Monsters is a production of iHeartRadio, Flynn Picture Company, Psychopia Pictures, and Upper Room Productions. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.