1 00:00:15,356 --> 00:00:15,796 Speaker 1: Pushkin. 2 00:00:20,436 --> 00:00:23,036 Speaker 2: Hey, Happy New Year. We're very happy to be back, 3 00:00:23,396 --> 00:00:26,836 Speaker 2: and I have one request before we start the show. 4 00:00:26,916 --> 00:00:29,596 Speaker 2: I'm asking you a favor, and the favor is this, 5 00:00:30,156 --> 00:00:35,396 Speaker 2: would you please send us an email to problem at 6 00:00:35,396 --> 00:00:39,756 Speaker 2: pushkin dot fm and tell us what you like about 7 00:00:39,796 --> 00:00:41,876 Speaker 2: the show and what you don't like about the show, 8 00:00:41,916 --> 00:00:44,356 Speaker 2: and specifically what kinds of things you want to hear 9 00:00:44,396 --> 00:00:47,236 Speaker 2: more of, and perhaps what kinds of things you don't 10 00:00:47,236 --> 00:00:51,876 Speaker 2: want to hear. Again, it's problem at Pushkin dot fm. 11 00:00:51,956 --> 00:00:54,236 Speaker 2: I'm going to read all the emails, so thank you 12 00:00:54,356 --> 00:01:00,196 Speaker 2: in advance for sending them. Claude Shannon is this huge 13 00:01:00,236 --> 00:01:02,796 Speaker 2: figure in the history of technology. He's one of the 14 00:01:02,876 --> 00:01:05,836 Speaker 2: key people who worked at Bell Labs in the middle 15 00:01:05,836 --> 00:01:08,796 Speaker 2: of the twentieth century and really came up with the 16 00:01:08,796 --> 00:01:13,836 Speaker 2: idea that made modern technology possible. But I'm going to 17 00:01:13,876 --> 00:01:18,116 Speaker 2: be honest with you, I never really understood what Claude 18 00:01:18,116 --> 00:01:21,516 Speaker 2: Shannon figured out that was such a big deal. But 19 00:01:21,556 --> 00:01:23,836 Speaker 2: the people who know about technology, who know about the 20 00:01:23,916 --> 00:01:28,076 Speaker 2: history of ideas, they say Shannon's a giant. Claude Shannon 21 00:01:28,196 --> 00:01:32,476 Speaker 2: is like the nerds' nerd, the techno intellectuals' techno 22 00:01:32,516 --> 00:01:36,676 Speaker 2: intellectual. And so for today's show, I wanted to understand 23 00:01:37,596 --> 00:01:40,316 Speaker 2: what did Claude Shannon figure out and why is it 24 00:01:40,436 --> 00:01:41,836 Speaker 2: so important 25 00:01:41,556 --> 00:01:42,596 Speaker 1: for the modern world. 26 00:01:48,876 --> 00:01:51,676 Speaker 2: I'm Jacob Goldstein, and this is What's Your Problem. My 27 00:01:51,716 --> 00:01:55,556 Speaker 2: guest today is David Tse. David is a professor of 28 00:01:55,716 --> 00:02:00,436 Speaker 2: electrical engineering at Stanford. He has studied Shannon for decades. 29 00:02:00,676 --> 00:02:05,196 Speaker 2: He teaches Shannon's work to his students, and David used 30 00:02:05,236 --> 00:02:08,996 Speaker 2: Shannon's work to make a breakthrough in cell phone technology. 31 00:02:09,516 --> 00:02:12,396 Speaker 2: And that breakthrough, that breakthrough that came to us via 32 00:02:12,516 --> 00:02:15,996 Speaker 2: Shannon and Tse, it affects every phone call we make. 33 00:02:17,156 --> 00:02:20,436 Speaker 2: David and I talked about Shannon's key insights and about 34 00:02:20,436 --> 00:02:23,716 Speaker 2: how David's own work built on Shannon, and we also 35 00:02:23,756 --> 00:02:26,196 Speaker 2: talked about the big chunk of Shannon's life that was 36 00:02:26,196 --> 00:02:30,836 Speaker 2: taken up with juggling and riding unicycles and building mechanical toys.
37 00:02:31,516 --> 00:02:34,436 Speaker 2: But to start, we talked about how in the middle 38 00:02:34,476 --> 00:02:37,796 Speaker 2: of the twentieth century, Bell Labs wound up driving so 39 00:02:37,996 --> 00:02:39,756 Speaker 2: much technological innovation. 40 00:02:42,516 --> 00:02:47,036 Speaker 1: Yeah, so Bell Labs was the research lab of AT 41 00:02:47,156 --> 00:02:52,396 Speaker 1: and T. AT and T at that time was the phone company. Okay, 42 00:02:52,476 --> 00:02:55,796 Speaker 1: nowadays we have many phone companies. We have Verizon, we 43 00:02:55,916 --> 00:03:00,476 Speaker 1: have T Mobile, et cetera. But those days there was 44 00:03:00,516 --> 00:03:04,756 Speaker 1: only one phone company, and that's a monopoly. So a 45 00:03:04,836 --> 00:03:07,476 Speaker 1: monopoly needs to justify its existence. 46 00:03:07,676 --> 00:03:10,316 Speaker 2: Huh. So it doesn't get broken up by the government. 47 00:03:10,356 --> 00:03:12,716 Speaker 1: It doesn't get broken up. Of course, it eventually got 48 00:03:12,716 --> 00:03:15,436 Speaker 1: broken up, but at that time it was a monopoly. 49 00:03:16,396 --> 00:03:21,076 Speaker 1: And so one way of justifying its existence is to 50 00:03:21,116 --> 00:03:24,636 Speaker 1: say, okay, it says to the American people, to 51 00:03:24,676 --> 00:03:29,436 Speaker 1: the government, that we will always spend a certain percent 52 00:03:29,676 --> 00:03:34,916 Speaker 1: of our revenue on this research lab called Bell Labs, 53 00:03:35,636 --> 00:03:38,876 Speaker 1: and whatever Bell Labs comes up with is kind of 54 00:03:38,916 --> 00:03:42,916 Speaker 1: our contribution, not only to our bottom line, but also 55 00:03:43,796 --> 00:03:46,916 Speaker 1: to the technology of the country. 56 00:03:47,316 --> 00:03:49,796 Speaker 2: So they have this sort of public mission to prevent 57 00:03:49,836 --> 00:03:51,436 Speaker 2: the government from breaking them up. 58 00:03:52,196 --> 00:03:56,796 Speaker 1: Yeah, and so therefore it also allows researchers a very 59 00:03:56,836 --> 00:04:02,076 Speaker 1: free rein to do research that's not necessarily tied to, 60 00:04:02,276 --> 00:04:06,796 Speaker 1: like say, a particular business unit. Okay, So they can 61 00:04:06,836 --> 00:04:10,196 Speaker 1: be very creative. And that's the atmosphere, and so Bell 62 00:04:10,236 --> 00:04:14,036 Speaker 1: Labs attracted a bunch of very smart people, because smart 63 00:04:14,036 --> 00:04:16,196 Speaker 1: people want to work on their own problems, not the 64 00:04:16,236 --> 00:04:19,876 Speaker 1: problem that the manager gives them. Yeah, okay, that's 65 00:04:20,236 --> 00:04:23,476 Speaker 1: one characteristic of smart people. And so yeah, that 66 00:04:23,636 --> 00:04:26,756 Speaker 1: was the heyday of Bell Labs. Lots of smart people 67 00:04:27,036 --> 00:04:32,556 Speaker 1: inventing amazing stuff. The laser was invented there, information theory, the 68 00:04:32,636 --> 00:04:36,836 Speaker 1: transistor was invented there. Sort of almost all the foundations 69 00:04:37,236 --> 00:04:43,716 Speaker 1: of the information age, whether it's hardware, algorithms, software, 70 00:04:44,516 --> 00:04:47,076 Speaker 1: in some sense all have their roots at Bell Labs. 71 00:04:47,756 --> 00:04:50,836 Speaker 1: So that was the contribution to mankind, actually, I should say, not 72 00:04:50,876 --> 00:04:52,716 Speaker 1: only to America.
73 00:04:52,956 --> 00:04:56,596 Speaker 2: So Shannon gets there at this time, right, he's there 74 00:04:56,636 --> 00:05:00,556 Speaker 2: with, you know, when they're inventing certainly the transistor. 75 00:05:01,956 --> 00:05:02,476 Speaker 1: What's he do? 76 00:05:02,636 --> 00:05:04,956 Speaker 2: Tell me about his work there. When he gets there, 77 00:05:05,236 --> 00:05:06,036 Speaker 2: what's he working on? 78 00:05:07,676 --> 00:05:14,196 Speaker 1: Yeah? So I think Shannon always had his own agenda, right. 79 00:05:14,796 --> 00:05:17,316 Speaker 1: We know for a fact that he had been interested 80 00:05:17,516 --> 00:05:22,996 Speaker 1: in the problem of communication, that idea of having a 81 00:05:22,996 --> 00:05:27,156 Speaker 1: grand theory of communication, even back in nineteen thirty eight, 82 00:05:27,276 --> 00:05:29,476 Speaker 1: I think thirty seven, thirty eight, because he wrote a 83 00:05:29,556 --> 00:05:33,036 Speaker 1: letter at that time to a very famous person named 84 00:05:33,076 --> 00:05:37,796 Speaker 1: Vannevar Bush. Yeah. Vannevar Bush is very famous, as he 85 00:05:37,916 --> 00:05:40,596 Speaker 1: was I think president of MIT or dean of MIT, 86 00:05:41,316 --> 00:05:43,796 Speaker 1: and then he became sort of a scientific advisor to 87 00:05:43,836 --> 00:05:47,636 Speaker 1: the president, and so he wrote a letter to Vannevar 88 00:05:47,676 --> 00:05:49,716 Speaker 1: Bush in nineteen thirty eight and said, hey, you know what, 89 00:05:49,756 --> 00:05:52,996 Speaker 1: I'm really interested in this question of how to find 90 00:05:53,076 --> 00:05:56,356 Speaker 1: one theory that unifies all possible communication systems. There are so 91 00:05:56,396 --> 00:05:58,716 Speaker 1: many different communication systems out there, but I think there's 92 00:05:58,716 --> 00:06:01,716 Speaker 1: something at the heart of every system, and I'm trying 93 00:06:01,756 --> 00:06:02,516 Speaker 1: to get to the heart. 94 00:06:02,716 --> 00:06:05,676 Speaker 2: And like nobody had thought of it in that way, right. 95 00:06:05,716 --> 00:06:09,316 Speaker 2: It seems like part of, part of why Shannon's 96 00:06:09,356 --> 00:06:11,236 Speaker 2: such a big deal is, like, as I understand it, 97 00:06:11,276 --> 00:06:13,636 Speaker 2: it was like, you know, people understood, like, 98 00:06:13,756 --> 00:06:15,356 Speaker 2: they were trying to figure out how to make the 99 00:06:15,396 --> 00:06:17,796 Speaker 2: phone work better, and they were trying to, you know, 100 00:06:18,076 --> 00:06:20,876 Speaker 2: make movies be clearer or whatever. But there wasn't this 101 00:06:20,996 --> 00:06:24,796 Speaker 2: idea that you could abstract it until Shannon came along. 102 00:06:25,836 --> 00:06:29,396 Speaker 1: And the reason is very simple, actually, because if you 103 00:06:29,476 --> 00:06:31,596 Speaker 1: have a physical system, then you want to improve 104 00:06:32,516 --> 00:06:34,596 Speaker 1: what you see, right? You say, hey, man, the 105 00:06:35,036 --> 00:06:37,836 Speaker 1: video for example, I'm seeing you right now, I'm not 106 00:06:37,836 --> 00:06:39,756 Speaker 1: seeing you very clearly, I have to say. Yes. 107 00:06:40,956 --> 00:06:42,836 Speaker 2: I'm in a closet, a closet. 108 00:06:42,956 --> 00:06:46,116 Speaker 1: Right. Then I would say, how do I try to improve 109 00:06:46,156 --> 00:06:48,236 Speaker 1: the image?
Maybe I can try to, you know, fix 110 00:06:48,276 --> 00:06:52,476 Speaker 1: this pixel or do some filtering of your noise. So 111 00:06:52,756 --> 00:06:56,556 Speaker 1: I'm very tied to the very specific details of the 112 00:06:56,636 --> 00:06:59,356 Speaker 1: specific problem. Because why? I'm the engineer. I need to 113 00:06:59,396 --> 00:07:03,196 Speaker 1: improve the system, not in ten years, but tomorrow. You know, tomorrow. 114 00:07:03,796 --> 00:07:05,836 Speaker 2: You don't need a theory of the system. You just 115 00:07:05,876 --> 00:07:07,116 Speaker 2: want a clearer picture. 116 00:07:07,236 --> 00:07:10,316 Speaker 1: Yeah, yeah, I mean I'm in the weeds, right, I'm 117 00:07:10,356 --> 00:07:15,036 Speaker 1: in the weeds. And Shannon, because of his training, and 118 00:07:15,116 --> 00:07:18,276 Speaker 1: also because of the atmosphere of a place like Bell Labs, 119 00:07:18,756 --> 00:07:22,116 Speaker 1: could afford to step back and just look at 120 00:07:22,116 --> 00:07:25,796 Speaker 1: the broader forest as opposed to the details of specific trees. 121 00:07:26,876 --> 00:07:30,596 Speaker 2: So, okay, Shannon's big idea comes out in this 122 00:07:30,716 --> 00:07:34,076 Speaker 2: paper he publishes in nineteen forty eight. The paper is 123 00:07:34,076 --> 00:07:38,236 Speaker 2: called A Mathematical Theory of Communication. It's like his great work. 124 00:07:38,716 --> 00:07:39,796 Speaker 2: Tell me about that paper. 125 00:07:40,756 --> 00:07:43,676 Speaker 1: So that paper is actually a very interesting paper. In fact, 126 00:07:44,036 --> 00:07:49,316 Speaker 1: when I teach information theory, I teach from the paper itself, 127 00:07:49,556 --> 00:07:53,276 Speaker 1: because I thought it's an amazing way not only of 128 00:07:53,676 --> 00:07:56,196 Speaker 1: learning information theory, but learning how to write a scientific 129 00:07:56,236 --> 00:08:01,356 Speaker 1: paper properly. Huh okay. And you know, not everyone does 130 00:08:01,396 --> 00:08:04,836 Speaker 1: research in information theory, but everybody has to write, uh 131 00:08:04,916 --> 00:08:08,476 Speaker 1: huh, okay. Every researcher has to write to express their 132 00:08:08,516 --> 00:08:12,076 Speaker 1: ideas to their peers and to the audience. So in 133 00:08:12,156 --> 00:08:15,916 Speaker 1: that paper, very interesting, the first paragraph of the paper. Okay, 134 00:08:15,956 --> 00:08:19,036 Speaker 1: it's already very interesting, because typically when people write a 135 00:08:19,036 --> 00:08:22,236 Speaker 1: paper nowadays, they tell you, oh, how great my invention is. 136 00:08:22,476 --> 00:08:24,716 Speaker 1: It's going to change the world. Every paper is going 137 00:08:24,796 --> 00:08:27,996 Speaker 1: to change the world. But in fact, his first 138 00:08:28,116 --> 00:08:31,396 Speaker 1: paragraph focused on telling you what his paper is not achieving. 139 00:08:31,836 --> 00:08:36,596 Speaker 1: Ha ha, I mean that's a master, that's a master, right, 140 00:08:36,836 --> 00:08:39,756 Speaker 1: I mean, how many papers that you read nowadays tell 141 00:08:39,796 --> 00:08:43,436 Speaker 1: you in the beginning, Hey, you know what, guys, expectation 142 00:08:43,596 --> 00:08:47,636 Speaker 1: management here, this paper is not about this. Hey, don't 143 00:08:47,636 --> 00:08:50,596 Speaker 1: get your hopes up. Yeah, exactly, That's exactly what he did.
144 00:08:50,836 --> 00:08:55,036 Speaker 1: Expectation management. Nowadays, that's what we would call it, 145 00:08:55,076 --> 00:08:58,596 Speaker 1: expectation management. In those days, I guess he just 146 00:08:58,636 --> 00:09:02,916 Speaker 1: called it honesty. And his whole point was, often people 147 00:09:03,036 --> 00:09:08,836 Speaker 1: associate information with meaning, okay, and then he said, in 148 00:09:08,876 --> 00:09:14,916 Speaker 1: this paper we ignore meaning, we ignore meaning. Huh okay. So 149 00:09:15,036 --> 00:09:18,076 Speaker 1: that was the first thing he did, which is brilliant, 150 00:09:18,396 --> 00:09:22,476 Speaker 1: because once you tie information to meaning, then you will 151 00:09:22,476 --> 00:09:24,596 Speaker 1: never be able to make any progress. It's just too 152 00:09:24,636 --> 00:09:27,076 Speaker 1: difficult and too broad and too vague a problem. 153 00:09:27,716 --> 00:09:30,316 Speaker 2: Everybody gets stuck on this idea of meaning and what 154 00:09:30,476 --> 00:09:33,756 Speaker 2: is meaning? And he's like, forget about meaning. So we're 155 00:09:33,756 --> 00:09:36,476 Speaker 2: gonna forget about meaning. What is left? 156 00:09:37,876 --> 00:09:41,236 Speaker 1: Yes. Actually, the biggest breakthrough, I think, of that paper 157 00:09:41,996 --> 00:09:46,196 Speaker 1: is to really focus on the thing that matters and 158 00:09:46,876 --> 00:09:50,676 Speaker 1: cut away a lot of stuff that really doesn't matter. 159 00:09:50,756 --> 00:09:53,436 Speaker 1: Not that it doesn't matter at all, but it doesn't matter in terms 160 00:09:53,436 --> 00:09:58,676 Speaker 1: of solving the communication problem. So then he said, okay, 161 00:09:58,716 --> 00:10:01,996 Speaker 1: what is the communication problem? The communication problem is the 162 00:10:02,076 --> 00:10:06,676 Speaker 1: following: there are multiple possibilities for a word, 163 00:10:07,636 --> 00:10:11,436 Speaker 1: and my goal is to tell the receiver, the destination, which 164 00:10:11,676 --> 00:10:14,356 Speaker 1: of the multiple possibilities is the correct possibility. 165 00:10:15,076 --> 00:10:19,116 Speaker 2: Yeah, and so in language, it's basically, it's a finite set. 166 00:10:19,236 --> 00:10:21,636 Speaker 2: Language is a finite set. It's very large. But if 167 00:10:21,636 --> 00:10:24,716 Speaker 2: we're speaking and we both know that we're speaking English, 168 00:10:25,116 --> 00:10:28,676 Speaker 2: then essentially you are hearing the words and decoding them, 169 00:10:28,836 --> 00:10:30,716 Speaker 2: and you know that it is a series of words, 170 00:10:30,716 --> 00:10:33,596 Speaker 2: and you just have to figure out which words I 171 00:10:33,636 --> 00:10:37,596 Speaker 2: mean. Like that, for example. Yes, like that. Okay, so 172 00:10:37,796 --> 00:10:39,476 Speaker 2: that's the frame he builds then. 173 00:10:39,356 --> 00:10:45,996 Speaker 1: Right, okay, all right. Then once you have this framing, right, 174 00:10:46,436 --> 00:10:49,316 Speaker 1: then you can ask the question, Okay, what is the 175 00:10:49,316 --> 00:10:53,396 Speaker 1: goal of communication? The goal of communication is to communicate 176 00:10:53,836 --> 00:10:59,676 Speaker 1: as fast as I can, right, And the natural question 177 00:10:59,876 --> 00:11:04,156 Speaker 1: is why is there a limit on how fast I 178 00:11:04,196 --> 00:11:07,956 Speaker 1: can communicate to you?
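Shannon's famous measure of information drops straight out of this framing. As an editorial aside (the formula comes from the nineteen forty eight paper and is not read aloud in the episode): if a source picks each message from a finite set with probability p(x), the information it produces, in bits per message, is the entropy

```latex
H(X) = -\sum_{x} p(x) \log_2 p(x)
```

The average is over surprise, not meaning: rare messages carry more information than common ones, and nothing in the formula cares what any message says.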
Because if there's no limit, then 179 00:11:08,196 --> 00:11:11,756 Speaker 1: amazing world, right? We can communicate so fast. It's 180 00:11:11,556 --> 00:11:15,276 Speaker 2: like instant telepathy. It's like you instantly beam me every 181 00:11:15,316 --> 00:11:15,996 Speaker 2: thought in your head. 182 00:11:16,076 --> 00:11:18,756 Speaker 1: Yeah, okay, exactly. The natural question to ask, once you 183 00:11:18,756 --> 00:11:22,356 Speaker 1: set up this finite set, as you mentioned, is okay, 184 00:11:22,396 --> 00:11:25,076 Speaker 1: given these finite sets, is there a limit on how 185 00:11:25,116 --> 00:11:29,076 Speaker 1: fast I can communicate to you? And so that was 186 00:11:29,116 --> 00:11:31,196 Speaker 1: the question that was the heart of the paper, and 187 00:11:31,236 --> 00:11:37,476 Speaker 1: so he formulated this notion of a capacity. 188 00:11:37,836 --> 00:11:42,996 Speaker 1: A communication system is like a pipe. It's like you're 189 00:11:43,036 --> 00:11:46,356 Speaker 1: pushing water through this pipe, and the size of the 190 00:11:46,396 --> 00:11:49,316 Speaker 1: pipe limits how fast you can push water through it. 191 00:11:50,396 --> 00:11:54,356 Speaker 1: And now, just like that, in communication, there's this notion of the 192 00:11:54,436 --> 00:11:57,676 Speaker 1: size of the pipe, which is called the capacity. And 193 00:11:57,756 --> 00:12:02,076 Speaker 1: he figured out a way of computing this capacity for different 194 00:12:02,116 --> 00:12:07,756 Speaker 1: communication mediums. Any communication medium, you can actually compute a 195 00:12:07,876 --> 00:12:11,756 Speaker 1: capacity for that medium, and that limits how fast you 196 00:12:11,796 --> 00:12:15,716 Speaker 1: can communicate information over that medium, whether that medium is wireless, 197 00:12:16,196 --> 00:12:19,556 Speaker 1: over the air, or over the wireline. Like I'm talking 198 00:12:19,596 --> 00:12:22,236 Speaker 1: to you, I communicate over the air, I talk to 199 00:12:22,316 --> 00:12:25,476 Speaker 1: my WiFi. The WiFi goes through some copper cable, 200 00:12:25,636 --> 00:12:29,236 Speaker 1: some optical fiber. Each is a physical medium, but he 201 00:12:29,356 --> 00:12:33,516 Speaker 1: can compute a capacity for each of these different mediums. 202 00:12:34,756 --> 00:12:44,076 Speaker 2: And I know that part of the paper looks at, say, 203 00:12:44,156 --> 00:12:52,676 Speaker 2: redundancy in various modes of communication and, on a related note, patterns. Right, 204 00:12:52,876 --> 00:12:54,956 Speaker 2: there's this whole section of the paper where he looks 205 00:12:54,956 --> 00:12:59,476 Speaker 2: at the frequency with which letters occur in English and 206 00:12:59,596 --> 00:13:02,956 Speaker 2: kind of builds an idea around that. Tell me 207 00:13:02,996 --> 00:13:04,196 Speaker 2: about those pieces. 208 00:13:03,876 --> 00:13:07,076 Speaker 1: Of the paper. Yeah, so let's start with the word redundancy. 209 00:13:07,876 --> 00:13:09,956 Speaker 2: Yeah, that comes off wrong, right? 210 00:13:09,996 --> 00:13:11,596 Speaker 1: No, no, no, no, no, no no, that's not only 211 00:13:11,676 --> 00:13:14,516 Speaker 1: not the wrong word, but it's actually the most important word, 212 00:13:14,556 --> 00:13:19,996 Speaker 1: I would say, almost. Because you go back to the 213 00:13:20,076 --> 00:13:22,076 Speaker 1: question, to the thing I was talking about, which is 214 00:13:22,116 --> 00:13:26,196 Speaker 1: how fast you can communicate?
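To make "computing the capacity of a medium" concrete, here is the best-known instance, added editorially (the episode never writes it out): for a channel of bandwidth B hertz with signal-to-noise ratio S/N, the nineteen forty eight paper gives the capacity

```latex
C = B \log_2\left(1 + \frac{S}{N}\right) \ \text{bits per second}
```

Plug in any medium's bandwidth and noise level, copper, fiber, or air, and out comes the ceiling on how fast information can move through it.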
Right. So what he discovered 215 00:13:26,276 --> 00:13:28,356 Speaker 1: was actually there's no limit on how fast you can communicate. 216 00:13:28,396 --> 00:13:31,436 Speaker 1: You can always communicate very fast. But what the guy 217 00:13:31,516 --> 00:13:35,356 Speaker 1: hears is gibberish, and he cannot really distinguish what 218 00:13:35,396 --> 00:13:37,116 Speaker 1: you're trying to say. There's so much noise in 219 00:13:37,116 --> 00:13:39,676 Speaker 1: the system, okay, that he cannot really figure out what you're 220 00:13:39,556 --> 00:13:41,756 Speaker 2: saying. Even if you're face to face, right, even if 221 00:13:41,756 --> 00:13:43,836 Speaker 2: you're face to face, you're not going to, forget over the 222 00:13:43,876 --> 00:13:46,276 Speaker 2: phone or whatever. If you talk too fast, the listener 223 00:13:46,276 --> 00:13:47,676 Speaker 2: won't understand, because you're going too fast. 224 00:13:47,956 --> 00:13:51,516 Speaker 1: And anybody who goes to a crazy professor's lecture would 225 00:13:51,556 --> 00:13:54,716 Speaker 1: know about this, where the professor just keeps on talking 226 00:13:55,716 --> 00:13:58,876 Speaker 1: a million miles per hour, and the students just sit there 227 00:13:58,996 --> 00:14:01,516 Speaker 1: and nobody understands a thing, and the professor calls it a 228 00:14:01,556 --> 00:14:05,796 Speaker 1: day when it's finished. So basically what he's 229 00:14:05,836 --> 00:14:09,636 Speaker 1: saying is that, hey, you know what, to make sure 230 00:14:09,756 --> 00:14:13,836 Speaker 1: that the information goes through reliably, reliably, that's the first 231 00:14:13,876 --> 00:14:20,316 Speaker 1: word, you need to introduce redundancy, redundancy in your message, okay. 232 00:14:21,716 --> 00:14:24,676 Speaker 1: And what he figured out is in some sense the 233 00:14:24,996 --> 00:14:28,956 Speaker 1: optimal way of adding redundancy, because, you know, you can 234 00:14:28,996 --> 00:14:32,556 Speaker 1: always be stupid in adding redundancy. For example, I can 235 00:14:32,676 --> 00:14:35,116 Speaker 1: keep on repeating the same word one hundred times to 236 00:14:35,156 --> 00:14:37,476 Speaker 1: you, and then you probably get it, and then I 237 00:14:37,516 --> 00:14:39,276 Speaker 1: move on to the next word. Now I can move on 238 00:14:39,316 --> 00:14:41,836 Speaker 1: to the next word, but that would make me one hundred 239 00:14:41,956 --> 00:14:47,196 Speaker 1: times slower. Yes, right, and so that's not a very 240 00:14:47,236 --> 00:14:49,876 Speaker 1: smart way of adding redundancy. So what he figured 241 00:14:49,876 --> 00:14:52,756 Speaker 1: out is the optimal way of adding redundancy so that 242 00:14:52,876 --> 00:14:57,436 Speaker 1: you can communicate reliably and yet at the maximum rate, what 243 00:14:57,556 --> 00:15:02,676 Speaker 1: he calls the capacity limit. And that was actually a totally amazing 244 00:15:02,756 --> 00:15:08,156 Speaker 1: formulation of the problem, and highly non obvious. And 245 00:15:08,196 --> 00:15:12,116 Speaker 1: I think that is some of the amazing contribution of 246 00:15:12,156 --> 00:15:12,916 Speaker 1: this guy, Shannon. 247 00:15:12,996 --> 00:15:20,916 Speaker 2: Yeah, it's optimization. He optimizes communication across any channel where 248 00:15:20,916 --> 00:15:27,196 Speaker 2: you're balancing efficiency, or speed, and reliability. That is the tradeoff, 249 00:15:27,236 --> 00:15:29,876 Speaker 2: and he figures out how to optimize for that tradeoff.
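Here is a tiny numerical illustration of the "stupid redundancy" point, a hypothetical Python sketch added editorially (the channel model and all parameters are invented for illustration; none of this is from the episode): send bits through a channel that flips each one with probability 0.1, protect them with a three-times repetition code, and compare the rate you paid against the Shannon capacity of that same channel, 1 - H(p).

```python
import random
import math

def binary_symmetric_channel(bits, p, rng):
    """Flip each bit independently with probability p (the channel noise)."""
    return [b ^ (rng.random() < p) for b in bits]

def repetition_encode(bits, n=3):
    """'Stupid' redundancy: repeat every bit n times."""
    return [b for b in bits for _ in range(n)]

def repetition_decode(received, n=3):
    """Majority vote over each group of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2) for i in range(0, len(received), n)]

rng = random.Random(0)
p = 0.1  # channel flip probability
message = [rng.randint(0, 1) for _ in range(100_000)]

received = binary_symmetric_channel(repetition_encode(message), p, rng)
decoded = repetition_decode(received)
errors = sum(a != b for a, b in zip(message, decoded))

capacity = 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)  # 1 - H(p)
print(f"repetition code: rate {1/3:.3f}, residual bit error rate {errors / len(message):.4f}")
print(f"Shannon capacity of this channel: {capacity:.3f} bits per channel use")
```

The repetition code crawls at rate one third and still leaves errors, while Shannon's theorem says smarter codes can run at any rate up to about 0.53 bits per channel use here, with errors driven as close to zero as you like.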
250 00:15:31,276 --> 00:15:35,796 Speaker 1: Yes, yes, he figured out how to optimize that tradeoff. 251 00:15:37,516 --> 00:15:43,516 Speaker 1: But that tradeoff turns out to be very interesting. Uh huh. 252 00:15:43,876 --> 00:15:46,556 Speaker 1: It's a very interesting tradeoff. So typically when we think 253 00:15:46,556 --> 00:15:49,956 Speaker 1: about a tradeoff, we think about like a smooth curve, right, 254 00:15:50,636 --> 00:15:53,636 Speaker 1: where when you tune something, you can get better performance. 255 00:15:54,196 --> 00:15:57,236 Speaker 1: But what he showed was that there's kind of like 256 00:15:57,396 --> 00:16:02,756 Speaker 1: a cliff effect, Okay. And the cliff effect is that 257 00:16:02,836 --> 00:16:08,076 Speaker 1: if you communicate below this number called capacity, then you 258 00:16:08,116 --> 00:16:12,916 Speaker 1: can always engineer the system to make your signal, the communication, 259 00:16:13,076 --> 00:16:17,996 Speaker 1: as reliable as you want. Huh, so reliable that it's completely clean. Wow. 260 00:16:18,796 --> 00:16:23,516 Speaker 1: Whereas if you communicate above this number, the capacity, then there's 261 00:16:23,556 --> 00:16:25,596 Speaker 1: nothing you can do to make the signal clean. It's 262 00:16:25,596 --> 00:16:29,796 Speaker 1: just completely gibberish. Huh. So it's a very sharp tradeoff 263 00:16:30,396 --> 00:16:33,036 Speaker 1: that he identified. It's not a smooth tradeoff. 264 00:16:33,556 --> 00:16:35,796 Speaker 2: And if you're running the phone company, that's exactly what 265 00:16:35,876 --> 00:16:37,716 Speaker 2: you want to know, right, So then you can tune 266 00:16:37,756 --> 00:16:40,996 Speaker 2: it all the way to capacity and then not try 267 00:16:41,036 --> 00:16:43,276 Speaker 2: and tune it anymore after that, because it's not going 268 00:16:43,356 --> 00:16:44,076 Speaker 2: to get any better. 269 00:16:44,476 --> 00:16:46,956 Speaker 1: Correct. And that was the goal of sixty years of engineering, 270 00:16:46,956 --> 00:16:51,556 Speaker 1: to achieve his vision, his vision from nineteen forty eight. It 271 00:16:51,596 --> 00:16:54,716 Speaker 1: took people around sixty years to get there, to implement 272 00:16:54,796 --> 00:16:55,276 Speaker 1: his vision. 273 00:16:56,396 --> 00:16:59,596 Speaker 2: Well, so you are part of that story, right, Let's, 274 00:16:59,836 --> 00:17:03,276 Speaker 2: let's let you walk into the story now. So you 275 00:17:03,316 --> 00:17:07,236 Speaker 2: tell me about your work and how, you know, 276 00:17:07,716 --> 00:17:10,036 Speaker 2: how you built on Shannon's work. Tell me about how 277 00:17:10,036 --> 00:17:11,876 Speaker 2: you built on Shannon's work. 278 00:17:13,036 --> 00:17:16,956 Speaker 1: Yeah. So I did my PhD in the nineties, in 279 00:17:16,996 --> 00:17:23,076 Speaker 1: the nineties. My advisor was a Shannon student, and so 280 00:17:23,116 --> 00:17:26,476 Speaker 1: I learned information theory from him. Okay. Now, at that time, 281 00:17:27,316 --> 00:17:31,196 Speaker 1: information theory was almost a dead subject. Okay. When I 282 00:17:31,276 --> 00:17:34,796 Speaker 1: was a PhD student, the first thing my advisor told me, 283 00:17:35,036 --> 00:17:39,276 Speaker 1: maybe following Shannon, is, hey, don't work in information theory. Wow, 284 00:17:40,116 --> 00:17:42,156 Speaker 1: you'll never find a job. You'll never find a job 285 00:17:42,156 --> 00:17:44,596 Speaker 1: with this stuff. Okay, that's a tough moment.
286 00:17:44,956 --> 00:17:46,316 Speaker 2: That must be a tough moment for you. 287 00:17:46,316 --> 00:17:50,516 Speaker 1: Pretty tough, yeah, because at that time, there's not much 288 00:17:50,596 --> 00:17:55,356 Speaker 1: progress made in the theory, and there's no killer applications either. 289 00:17:55,476 --> 00:17:58,556 Speaker 1: There's no killer applications that need all this sophisticated 290 00:17:58,756 --> 00:18:03,036 Speaker 1: information theory. Okay. So it's like a dead field. 291 00:18:02,876 --> 00:18:05,196 Speaker 2: Was there a while when people used it to, like, 292 00:18:05,476 --> 00:18:08,476 Speaker 2: whatever, make landline phones work better, like in the fifties 293 00:18:08,556 --> 00:18:10,796 Speaker 2: or something, where people were like, oh great, now we've 294 00:18:10,796 --> 00:18:13,396 Speaker 2: got this theory and we can make the phone work better? 295 00:18:14,636 --> 00:18:20,436 Speaker 1: Yeah. So the thing is that the solutions that people 296 00:18:20,436 --> 00:18:24,116 Speaker 1: came up with to achieve these capacity limits are very complicated, okay, 297 00:18:24,916 --> 00:18:27,836 Speaker 1: and the electronics technology is just not enough to 298 00:18:27,836 --> 00:18:31,116 Speaker 1: build these complicated circuits. So information theory did not have 299 00:18:31,316 --> 00:18:35,156 Speaker 1: a very significant impact in the fifties, sixties, or 300 00:18:35,196 --> 00:18:35,996 Speaker 1: even seventies. 301 00:18:36,116 --> 00:18:38,836 Speaker 2: So it's like one of those cases where the theory 302 00:18:38,876 --> 00:18:42,076 Speaker 2: is just too far ahead of the technology. 303 00:18:42,076 --> 00:18:45,556 Speaker 1: To be useful. Yeah, and so people started 304 00:18:45,596 --> 00:18:47,596 Speaker 1: losing interest in the theory. Yes, this is 305 00:18:47,596 --> 00:18:50,476 Speaker 1: a bunch of math. It's not impacting the real world, 306 00:18:50,556 --> 00:18:53,436 Speaker 1: and so students are drifting away from the field. But 307 00:18:53,556 --> 00:18:56,756 Speaker 1: there's still always a few students, okay, who are just 308 00:18:56,956 --> 00:19:00,436 Speaker 1: so enamored by the theory that they keep on pursuing it. 309 00:19:01,116 --> 00:19:03,996 Speaker 1: And my advisor is one of the leading professors in 310 00:19:04,036 --> 00:19:08,116 Speaker 1: this area, and he would have like one student every decade, 311 00:19:08,356 --> 00:19:11,596 Speaker 1: every decade, to do research in it. 312 00:19:11,436 --> 00:19:14,076 Speaker 2: You were that student? You were? 313 00:19:14,076 --> 00:19:16,876 Speaker 1: And I was not that student, and I was not that student, Okay. 314 00:19:17,396 --> 00:19:19,596 Speaker 1: At that time, that slot was already taken by an 315 00:19:19,596 --> 00:19:22,796 Speaker 1: earlier student who was way smarter than me, who's way smarter 316 00:19:22,876 --> 00:19:25,836 Speaker 1: than me. And he was, he 317 00:19:25,916 --> 00:19:29,276 Speaker 1: was the student of the decade in information theory. Okay. Now, 318 00:19:30,156 --> 00:19:32,716 Speaker 1: so I was assigned to work on some other problems, okay, 319 00:19:32,716 --> 00:19:35,756 Speaker 1: completely unrelated, Okay. But anyway, the point, though, is 320 00:19:35,756 --> 00:19:40,036 Speaker 1: that when I graduated, something happened, okay. And that was 321 00:19:40,076 --> 00:19:45,796 Speaker 1: the beginning of the wireless revolution.
That was the time 322 00:19:45,876 --> 00:19:49,436 Speaker 1: when only a million people had cell phones, and those 323 00:19:49,436 --> 00:19:53,116 Speaker 1: cell phones, I don't even remember, it's like a gigantic brick, yeah. 324 00:19:53,156 --> 00:19:56,836 Speaker 2: Like there's that famous scene from the movie Wall Street, right, 325 00:19:56,916 --> 00:19:58,676 Speaker 2: that's the one that everybody talks about, where it's like 326 00:19:58,956 --> 00:20:01,236 Speaker 2: bigger than a brick. People say brick, but it's actually 327 00:20:01,236 --> 00:20:03,916 Speaker 2: bigger than a brick. It's like a big hardback book 328 00:20:04,036 --> 00:20:04,476 Speaker 2: or something. 329 00:20:05,676 --> 00:20:07,836 Speaker 1: Yeah. And actually in those days, because there were so few of 330 00:20:07,876 --> 00:20:11,356 Speaker 1: these things, it's like a prestige. It's like it's prestige 331 00:20:11,396 --> 00:20:14,596 Speaker 1: to have this brick. Yeah, okay, yeah. 332 00:20:14,356 --> 00:20:16,396 Speaker 2: You couldn't get that brick. You had to be rich 333 00:20:16,476 --> 00:20:18,076 Speaker 2: to get that brick. Yeah. 334 00:20:18,156 --> 00:20:21,756 Speaker 1: Yeah. And so the wireless revolution was happening, because people 335 00:20:21,796 --> 00:20:24,276 Speaker 1: realized that, hey, you know what, being able to communicate 336 00:20:24,276 --> 00:20:27,996 Speaker 1: anytime, anywhere is really valuable, and so people are now 337 00:20:28,036 --> 00:20:31,916 Speaker 1: getting interested. And at that time, what people realized is 338 00:20:31,956 --> 00:20:37,356 Speaker 1: that, whoa, this wireless physical medium, it's really tough to 339 00:20:37,396 --> 00:20:41,276 Speaker 1: communicate over, because the bandwidth is so limited and there's 340 00:20:41,316 --> 00:20:45,116 Speaker 1: so much noise. Right, the FCC was limiting the bandwidth 341 00:20:45,116 --> 00:20:46,996 Speaker 1: allocation to these applications a lot. 342 00:20:47,076 --> 00:20:52,676 Speaker 2: Aha, the Federal Communications Commission. So the government wasn't letting wireless 343 00:20:52,676 --> 00:20:54,196 Speaker 2: companies use much bandwidth for 344 00:20:54,236 --> 00:20:56,156 Speaker 1: cell phones. Yeah, because all the bandwidth, most of it 345 00:20:56,156 --> 00:20:59,636 Speaker 1: is allocated for military purposes, and there's only very little bandwidth 346 00:20:59,676 --> 00:21:03,076 Speaker 1: allocated at that time for civilians, and so those bands 347 00:21:03,116 --> 00:21:06,916 Speaker 1: were auctioned to companies at a very high price, and 348 00:21:06,956 --> 00:21:10,196 Speaker 1: so it became very important to be very efficient in 349 00:21:10,316 --> 00:21:15,996 Speaker 1: using this very expensive property. Aha, Okay, and then people realize, hey, 350 00:21:16,316 --> 00:21:19,396 Speaker 1: if we want to be really efficient, then we need 351 00:21:19,436 --> 00:21:23,356 Speaker 1: a theory which is about efficiency. So people start thinking, okay, 352 00:21:23,876 --> 00:21:26,836 Speaker 1: all right, so information theory was dead, but now it's 353 00:21:26,876 --> 00:21:29,116 Speaker 1: going to come back to life, because we have this 354 00:21:29,196 --> 00:21:32,756 Speaker 1: really important problem, really expensive spectrum that was allocated by the FCC, 355 00:21:32,956 --> 00:21:35,516 Speaker 1: and we want to squeeze as much of it as possible. 356 00:21:35,156 --> 00:21:38,476 Speaker 2: As much communication.
We need a sort of mathematical theory 357 00:21:38,476 --> 00:21:40,076 Speaker 2: of communication, if you will. 358 00:21:41,156 --> 00:21:44,596 Speaker 1: And that was the renaissance of information theory, spurred by 359 00:21:45,236 --> 00:21:48,636 Speaker 1: this amazing technology of wireless, which took us from one 360 00:21:49,236 --> 00:21:52,596 Speaker 1: million phones to ten billion phones. 361 00:21:53,196 --> 00:21:56,116 Speaker 2: Today everybody has one point one 362 00:21:55,996 --> 00:22:01,156 Speaker 1: phones, and information theory played a big role in that revolution. 363 00:22:06,236 --> 00:22:09,676 Speaker 2: In a minute, how David used Claude Shannon's nineteen 364 00:22:09,756 --> 00:22:12,436 Speaker 2: forty eight paper to come up with an idea that 365 00:22:12,476 --> 00:22:14,636 Speaker 2: we all use every time we make 366 00:22:14,556 --> 00:22:24,076 Speaker 1: a phone call. 367 00:22:24,196 --> 00:22:27,756 Speaker 2: Let's talk for a moment about your role, right, 368 00:22:27,836 --> 00:22:30,756 Speaker 2: like you actually played, played an important role there. 369 00:22:31,076 --> 00:22:34,716 Speaker 1: Yeah, so I was at Bell Labs. Uh huh, just 370 00:22:34,756 --> 00:22:37,796 Speaker 1: like Claude. So, just like Claude, yeah. Yeah, so 371 00:22:37,876 --> 00:22:40,476 Speaker 1: I spent one year at Bell Labs as a so 372 00:22:40,636 --> 00:22:44,516 Speaker 1: called postdoc right after my PhD, before I moved to 373 00:22:44,556 --> 00:22:47,836 Speaker 1: Berkeley to become a professor there. I spent one year there. 374 00:22:48,116 --> 00:22:49,836 Speaker 1: And that's what people were talking about at that time at 375 00:22:49,836 --> 00:22:53,836 Speaker 1: Bell Labs. Hey, this new thing, wireless. Information theory's come 376 00:22:53,956 --> 00:22:57,116 Speaker 1: back to life. We can try to use information theory 377 00:22:57,196 --> 00:23:01,236 Speaker 1: and adapt it and extend it to this wireless communication problem. 378 00:23:01,876 --> 00:23:05,236 Speaker 1: And so that's when I said, whoa, this information theory 379 00:23:05,236 --> 00:23:07,916 Speaker 1: I learned from Bob Gallager. Finally there's a place to 380 00:23:08,076 --> 00:23:13,436 Speaker 1: use it. Finally I can actually make a living, make 381 00:23:13,476 --> 00:23:15,436 Speaker 1: a living out of it. And unlike what my advisor 382 00:23:15,476 --> 00:23:18,236 Speaker 1: told me, it's not dead, it's come back to life. Yeah, 383 00:23:18,876 --> 00:23:21,556 Speaker 1: and so that's sort of my start in the field. 384 00:23:22,116 --> 00:23:25,716 Speaker 1: And uh yeah, so I did, I, you know, invented 385 00:23:25,716 --> 00:23:29,356 Speaker 1: a bunch of stuff and actually applied, connected this 386 00:23:29,396 --> 00:23:32,196 Speaker 1: information theory to the real world. And uh, every time 387 00:23:32,236 --> 00:23:36,196 Speaker 1: you use a phone, you're using my algorithm, which is 388 00:23:36,236 --> 00:23:38,996 Speaker 1: based on the theory of information. Huh. 389 00:23:39,036 --> 00:23:42,036 Speaker 2: And so you're, that's a cool thing to be 390 00:23:42,076 --> 00:23:44,756 Speaker 2: able to say. First of all, that's a very good 391 00:23:44,796 --> 00:23:51,876 Speaker 2: flex. Your algorithm, it's the proportional fair scheduling algorithm, right? Yes, yes. 392 00:23:52,156 --> 00:23:53,396 Speaker 2: What is that? What's it do? 393 00:23:54,756 --> 00:23:56,996 Speaker 1: All right?
So I should tell you a little bit of a story. 394 00:23:57,036 --> 00:23:58,876 Speaker 1: I think the story is, uh, and then I'll tell 395 00:23:58,876 --> 00:24:02,796 Speaker 1: you what it does. Okay. So I went to, so 396 00:24:02,836 --> 00:24:05,396 Speaker 1: that was the end of nineteen ninety nine, around 397 00:24:05,436 --> 00:24:09,036 Speaker 1: ninety nine. So I was doing all this information theory 398 00:24:09,156 --> 00:24:13,836 Speaker 1: stuff at Berkeley, writing many papers. But then I always 399 00:24:13,876 --> 00:24:16,156 Speaker 1: had a thought in the back of my mind, which is, 400 00:24:16,196 --> 00:24:18,436 Speaker 1: is this stuff going to be useful? And so I 401 00:24:18,476 --> 00:24:20,436 Speaker 1: went to, I decided to go to a company, 402 00:24:20,436 --> 00:24:22,356 Speaker 1: a wireless company who actually builds these things, and see 403 00:24:22,356 --> 00:24:24,596 Speaker 1: whether this theory can be used. And the company I went 404 00:24:24,596 --> 00:24:25,396 Speaker 1: to is called Qualcomm. 405 00:24:26,676 --> 00:24:27,916 Speaker 2: Okay, I've heard of Qualcomm. 406 00:24:28,956 --> 00:24:30,516 Speaker 1: You've heard of Qualcomm, but at that time it was 407 00:24:30,556 --> 00:24:34,116 Speaker 1: a small company, it was not very big, Okay. And 408 00:24:34,196 --> 00:24:37,556 Speaker 1: at that time they have this problem they're working on. Okay, 409 00:24:37,556 --> 00:24:41,036 Speaker 1: which is the following. All right. So in wireless communication, 410 00:24:41,116 --> 00:24:45,276 Speaker 1: there's a concept called a base station, okay, and the base 411 00:24:45,316 --> 00:24:49,436 Speaker 1: station serves many cell phones in the vicinity of the 412 00:24:49,436 --> 00:24:50,716 Speaker 1: base station. It's called a cell. 413 00:24:51,276 --> 00:24:52,876 Speaker 2: Is it like a tower? Is that what we would call 414 00:24:52,916 --> 00:24:53,836 Speaker 2: it, a tower? 415 00:24:54,036 --> 00:24:56,516 Speaker 1: That's right, it's always on a tower. There's some 416 00:24:56,596 --> 00:25:00,276 Speaker 1: electronics there. Yeah, and that's how the base station is 417 00:25:00,316 --> 00:25:03,836 Speaker 1: supposed to beam information to many phones, to many phones. 418 00:25:03,876 --> 00:25:05,676 Speaker 2: You still see them. You see them, whatever, on 419 00:25:05,756 --> 00:25:07,636 Speaker 2: top of a big building or when you're driving down 420 00:25:07,676 --> 00:25:09,116 Speaker 2: the freeway. Right, that's what you're talking about. 421 00:25:09,156 --> 00:25:12,396 Speaker 1: Yeah, that's right. And sometimes on fake trees. 422 00:25:12,596 --> 00:25:14,276 Speaker 2: I love the fake trees in New Jersey. 423 00:25:14,356 --> 00:25:18,356 Speaker 1: The fake trees, that's right, New Jersey 424 00:25:18,436 --> 00:25:22,836 Speaker 1: fake trees. Yes. So at that time they were looking 425 00:25:22,836 --> 00:25:26,996 Speaker 1: at this problem, which is, hey, okay, my bandwidth is limited, 426 00:25:27,356 --> 00:25:30,636 Speaker 1: but I have many users to serve. Yeah, okay, how 427 00:25:30,676 --> 00:25:34,876 Speaker 1: do I schedule my limited resource among all these users? Right? 428 00:25:34,916 --> 00:25:37,716 Speaker 1: Because I only have one total bandwidth. And so at 429 00:25:37,716 --> 00:25:40,956 Speaker 1: that time people think, okay, maybe something simple. I give 430 00:25:41,156 --> 00:25:43,916 Speaker 1: one nth of the time to the nth user.
Right, 431 00:25:43,956 --> 00:25:46,756 Speaker 1: so say five users, I serve this user for a 432 00:25:46,796 --> 00:25:48,556 Speaker 1: little bit, and I serve the second user for a 433 00:25:48,556 --> 00:25:49,716 Speaker 1: little bit, and serve the third user for... 434 00:25:50,196 --> 00:25:53,356 Speaker 2: The idea is, you're switching really fast. You're just, like, switching. 435 00:25:53,156 --> 00:25:56,756 Speaker 1: Switching really fast, yeah, exactly. And then when I went there, 436 00:25:56,836 --> 00:25:59,756 Speaker 1: I said, okay, good, the problem is a good problem. 437 00:25:59,796 --> 00:26:03,876 Speaker 1: And I said, hey, instead of fixating on this particular 438 00:26:03,916 --> 00:26:08,116 Speaker 1: scheduling policy, why don't we do a Shannon thing, a 439 00:26:08,196 --> 00:26:11,716 Speaker 2: Claude Shannon thing. You thought of it, you thought of it? Yeah, okay. 440 00:26:12,436 --> 00:26:15,916 Speaker 1: The Claude Shannon thing is to look at 441 00:26:15,916 --> 00:26:21,196 Speaker 1: the problem from first principles, uh, not presume a particular 442 00:26:21,876 --> 00:26:25,956 Speaker 1: solution, or even a particular class of solutions, and ask ourselves 443 00:26:26,036 --> 00:26:31,036 Speaker 1: what is the capacity of this whole system, and how 444 00:26:31,076 --> 00:26:34,636 Speaker 1: do I engineer the system to achieve that capacity? Okay? 445 00:26:35,836 --> 00:26:37,756 Speaker 1: And it turns out that if you look at the 446 00:26:37,796 --> 00:26:40,476 Speaker 1: problem this way, then it turns out that the optimal 447 00:26:40,556 --> 00:26:43,516 Speaker 1: way of scheduling is not the one that they were 448 00:26:43,556 --> 00:26:47,956 Speaker 1: trying to design. And the reason is because in wireless 449 00:26:47,956 --> 00:26:54,596 Speaker 1: communication there's a very interesting characteristic which is called fading. Okay, okay. 450 00:26:55,356 --> 00:26:59,156 Speaker 1: When I talk to you over the air, the channel 451 00:26:59,316 --> 00:27:02,556 Speaker 1: actually goes up and down, strong and weak, strong and weak, 452 00:27:02,796 --> 00:27:05,676 Speaker 1: very rapidly. What I mean is when I send an 453 00:27:05,676 --> 00:27:10,036 Speaker 1: electromagnetic signal from the, from the base to the phone, 454 00:27:10,676 --> 00:27:17,276 Speaker 1: that signal gets amplified and attenuated very rapidly. It 455 00:27:17,316 --> 00:27:18,236 Speaker 1: goes up and down. 456 00:27:18,276 --> 00:27:21,276 Speaker 2: Okay, Okay, can we say it gets stronger and weaker? 457 00:27:21,356 --> 00:27:22,676 Speaker 2: Can we say it gets stronger 458 00:27:22,436 --> 00:27:26,636 Speaker 1: and stronger and weaker? Okay? Yes. And so the optimal 459 00:27:26,676 --> 00:27:29,316 Speaker 1: way, what information theory says to do, is actually 460 00:27:29,356 --> 00:27:34,676 Speaker 1: not divide the time into slots blindly, but really try 461 00:27:34,796 --> 00:27:39,716 Speaker 1: to schedule a user when the channel is strong. 462 00:27:40,476 --> 00:27:40,516 Speaker 2: Ha. 463 00:27:42,276 --> 00:27:45,316 Speaker 1: And then from that I designed a scheduling algorithm 464 00:27:45,316 --> 00:27:48,716 Speaker 1: which is more practical, by sort of leveraging 465 00:27:48,756 --> 00:27:50,956 Speaker 1: this basic idea from information theory.
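For readers who want to see the shape of the idea, here is a minimal, hypothetical Python sketch of proportional fair scheduling as it is commonly described in textbooks, added editorially (an illustration, not Qualcomm's actual implementation; the fading model and every parameter are invented): in each slot the base station serves the user whose instantaneous rate is highest relative to the throughput that user has recently been getting.

```python
import random

def proportional_fair_schedule(num_users=5, num_slots=10_000, tc=1000, seed=0):
    """Each slot, serve the user maximizing instantaneous_rate / average_throughput.

    tc is the time constant of the exponential averaging filter; the rates
    come from a toy random fading model, purely for illustration.
    """
    rng = random.Random(seed)
    avg = [1e-6] * num_users       # smoothed throughput each user has been getting
    slots_served = [0] * num_users

    for _ in range(num_slots):
        # Toy fading: each user's instantaneous achievable rate fluctuates.
        rates = [rng.uniform(0.1, 2.0) for _ in range(num_users)]
        # Proportional fair metric: rate relative to what this user usually gets.
        k = max(range(num_users), key=lambda i: rates[i] / avg[i])
        slots_served[k] += 1
        for i in range(num_users):
            got = rates[i] if i == k else 0.0
            avg[i] = (1 - 1 / tc) * avg[i] + (1 / tc) * got

    return slots_served

print(proportional_fair_schedule())  # slots won per user; roughly even here
```

Dividing by the running average is what makes it fair: even a user with a weak channel wins slots whenever that channel is unusually good for them, which is exactly the "schedule a user when the channel is strong" idea.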
466 00:27:51,356 --> 00:27:55,236 Speaker 2: And so the base station is basically monitoring the 467 00:27:55,276 --> 00:27:58,796 Speaker 2: strength of the incoming signals from all the different phones 468 00:27:59,716 --> 00:28:02,036 Speaker 2: and saying, Oh, that one's strong, I'm gonna grab that one. 469 00:28:02,076 --> 00:28:03,556 Speaker 2: Oh, that one's strong, I'm gonna grab that one. 470 00:28:03,596 --> 00:28:05,636 Speaker 1: That's what's happening, correct, correct. 471 00:28:05,636 --> 00:28:07,796 Speaker 2: And how does that, I mean, I get it in a 472 00:28:07,916 --> 00:28:10,476 Speaker 2: kind of big first principles way, sort of analogously it 473 00:28:10,516 --> 00:28:13,876 Speaker 2: follows from Shannon. But is there anything sort of specific 474 00:28:14,116 --> 00:28:20,916 Speaker 2: in Shannon that leads you to this algorithm? 475 00:28:20,956 --> 00:28:26,516 Speaker 1: So remember, Shannon's is a very general theory. Yeah, Okay. 476 00:28:27,356 --> 00:28:31,156 Speaker 1: It basically says that given any communication medium or any 477 00:28:31,196 --> 00:28:37,156 Speaker 1: communication setting, yeah, you can try to calculate this notion 478 00:28:37,236 --> 00:28:40,996 Speaker 1: of a capacity. So it's a very general theory; what I 479 00:28:41,156 --> 00:28:44,956 Speaker 1: did was to apply it to a very specific context, 480 00:28:44,956 --> 00:28:49,956 Speaker 1: which is this base station serving multiple users setting. Yeah, 481 00:28:50,036 --> 00:28:54,716 Speaker 1: and then apply his framework to analyze the capacity of 482 00:28:54,716 --> 00:28:55,236 Speaker 1: that system. 483 00:28:55,596 --> 00:28:57,036 Speaker 2: Huh. 484 00:28:57,116 --> 00:28:59,836 Speaker 1: And in the process of analyzing the capacity, you can 485 00:28:59,876 --> 00:29:04,916 Speaker 1: also figure out what is the optimal way of achieving 486 00:29:04,956 --> 00:29:09,076 Speaker 1: that capacity. Remember, you mentioned capacity is really an optimization problem, 487 00:29:09,636 --> 00:29:13,556 Speaker 1: and Shannon was able to solve this optimization problem in general. 488 00:29:13,916 --> 00:29:16,236 Speaker 1: But now I specialized it in some sense to this 489 00:29:16,756 --> 00:29:20,636 Speaker 1: pretty specific setting, except that the setting is used by everybody. 490 00:29:21,356 --> 00:29:24,476 Speaker 1: But at that time, it was like, you know, research 491 00:29:24,516 --> 00:29:27,396 Speaker 1: is about timing, and I was there at the right 492 00:29:27,396 --> 00:29:32,076 Speaker 1: place at the right time. Because Qualcomm turned out to 493 00:29:32,116 --> 00:29:37,876 Speaker 1: completely dominate the entire third generation technology. So when I 494 00:29:37,916 --> 00:29:40,676 Speaker 1: was able to convince them that, hey, your way of 495 00:29:40,676 --> 00:29:44,156 Speaker 1: doing things is no good, this way suggested by Shannon is 496 00:29:44,196 --> 00:29:48,556 Speaker 1: actually far better, please use this way. It took me 497 00:29:48,596 --> 00:29:50,516 Speaker 1: a few months, but I was able to persuade them 498 00:29:50,556 --> 00:29:53,116 Speaker 1: to implement it, and then it got into the standard 499 00:29:53,796 --> 00:29:57,956 Speaker 1: through their domination, and then every standard after that uses 500 00:29:58,076 --> 00:30:02,396 Speaker 1: the same basic, the same algorithm.
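The Shannon-style calculation behind this exchange can be stated compactly, as an editorial gloss in standard notation (none of it is read out in the episode): for a base station sending to K users whose channel gains h_k fade independently, with transmit power P and noise power N_0, the total throughput under constant power is maximized by serving in each slot the user whose channel is momentarily strongest, giving

```latex
C_{\text{sum}} = \mathbb{E}\left[\, \max_{1 \le k \le K} \log_2\left(1 + \frac{P\,|h_k|^2}{N_0}\right) \right]
```

This is the multiuser diversity effect: fading, normally a nuisance, becomes a resource, because with many users somebody is almost always near a channel peak.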
So it was good because, 501 00:30:02,476 --> 00:30:04,036 Speaker 1: as I said, I'm at the right place at the 502 00:30:04,076 --> 00:30:07,196 Speaker 1: right time. You know, when you try to contribute to engineering, 503 00:30:07,396 --> 00:30:09,836 Speaker 1: it's too late if the system is built, because people 504 00:30:09,876 --> 00:30:12,276 Speaker 1: don't want to change the whole system to accommodate 505 00:30:12,396 --> 00:30:15,876 Speaker 1: your new idea. But it was very early in the 506 00:30:15,916 --> 00:30:17,036 Speaker 1: design phase. 507 00:30:17,956 --> 00:30:21,796 Speaker 2: So, okay. So you made this breakthrough in wireless 508 00:30:21,796 --> 00:30:26,876 Speaker 2: communications using Shannon's work. Were there similar breakthroughs in, you know, 509 00:30:26,996 --> 00:30:29,196 Speaker 2: in other domains? 510 00:30:29,236 --> 00:30:33,676 Speaker 1: Any communication medium, right. It could be optical fiber, it 511 00:30:33,716 --> 00:30:40,596 Speaker 1: could be DSL modem, DSL modem, underwater communication. Almost all 512 00:30:40,676 --> 00:30:44,036 Speaker 1: these communication systems are now designed based on his principle. 513 00:30:45,236 --> 00:30:50,596 Speaker 1: So the impact of this theory is kind of global, 514 00:30:50,716 --> 00:30:53,036 Speaker 1: it's the entire communication landscape. 515 00:30:54,076 --> 00:31:00,156 Speaker 2: There's a story I read about Shannon, that when 516 00:31:00,196 --> 00:31:05,596 Speaker 2: he is developing information theory, he takes a book 517 00:31:05,636 --> 00:31:07,956 Speaker 2: off the shelf and he reads a sentence to, it's 518 00:31:07,956 --> 00:31:10,716 Speaker 2: actually his wife, and it's something like, the lamp was 519 00:31:10,756 --> 00:31:14,036 Speaker 2: sitting on the, and she says table, and he says, no, 520 00:31:14,116 --> 00:31:16,356 Speaker 2: I'll give you a clue. The first letter is D, 521 00:31:16,756 --> 00:31:20,916 Speaker 2: and she says desk. And when I heard that story, 522 00:31:21,156 --> 00:31:24,756 Speaker 2: what I thought of was large language models. Like that 523 00:31:24,796 --> 00:31:28,396 Speaker 2: sounds exactly like a large language model. And so I'm 524 00:31:28,436 --> 00:31:34,316 Speaker 2: just fishing, I'm just curious, like, does his work matter 525 00:31:34,676 --> 00:31:38,676 Speaker 2: for machine learning, large language models, et cetera, or no? 526 00:31:40,596 --> 00:31:44,116 Speaker 1: Yeah, so that's a very interesting point. Now I'm not 527 00:31:44,156 --> 00:31:48,716 Speaker 1: an expert by any means in AI or large language models. Yeah, 528 00:31:48,756 --> 00:31:52,116 Speaker 1: I'm not a professional researcher in that area. But I 529 00:31:52,156 --> 00:31:54,796 Speaker 1: think you can actually see some commonality, right, which is that, 530 00:31:56,676 --> 00:31:58,916 Speaker 1: you know, these models, in some sense, they don't care 531 00:31:58,916 --> 00:31:59,876 Speaker 1: about meaning either. 532 00:32:00,556 --> 00:32:03,476 Speaker 2: Yeah, very good, Very good. Yeah. 533 00:32:03,596 --> 00:32:08,716 Speaker 1: Right. Actually, it just came to me, our discussion is very 534 00:32:08,716 --> 00:32:13,596 Speaker 1: interesting, because it's really just patterns. It's just which patterns 535 00:32:13,636 --> 00:32:16,196 Speaker 1: are more likely than other patterns. Right.
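The guessing game Jacob describes is next-word prediction from statistics, which is also what a language model does at enormous scale. Here is a minimal, hypothetical Python sketch of the same game, added editorially (the toy corpus is invented; a real model trains on vastly more text): count which words follow which, then guess the most likely continuation, optionally narrowed by a first-letter hint, exactly as Shannon hinted "D" to his wife.

```python
from collections import Counter, defaultdict

def train(text):
    """For each word, count which words tend to follow it in the corpus."""
    follows = defaultdict(Counter)
    words = text.lower().split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def guess(follows, prev, first_letter=None):
    """Most frequent follower of prev, optionally constrained by a hint."""
    ranked = [w for w, _ in follows[prev].most_common()]
    if first_letter:
        ranked = [w for w in ranked if w.startswith(first_letter)]
    return ranked[0] if ranked else None

corpus = ("the lamp was sitting on the table . "
          "the pen was lying on the desk . "
          "the cat sat on the table .")
model = train(corpus)
print(guess(model, "the"))                    # 'table', the most common follower
print(guess(model, "the", first_letter="d"))  # the hint forces 'desk'
```

Nothing in the counts knows what a desk or a table is; the prediction runs on patterns alone, which is the commonality Tse points to next.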
The example you 536 00:32:16,236 --> 00:32:20,436 Speaker 1: gave about desk is basically about patterns, and information 537 00:32:20,556 --> 00:32:23,796 Speaker 1: theory is really analyzing sort of the number of possible 538 00:32:23,836 --> 00:32:30,236 Speaker 1: patterns in some sense. So there is definitely a philosophical connection, 539 00:32:30,396 --> 00:32:35,156 Speaker 1: I believe, starting from Shannon to these large language models. 540 00:32:35,476 --> 00:32:36,916 Speaker 2: So let me ask you about one other, and this 541 00:32:36,996 --> 00:32:44,316 Speaker 2: is one that you are professionally involved in: cryptocurrency and blockchain. 542 00:32:44,916 --> 00:32:48,316 Speaker 2: You have studied it and you started a company. 543 00:32:48,476 --> 00:32:48,676 Speaker 1: Right. 544 00:32:49,316 --> 00:32:52,836 Speaker 2: Is there a connection between Shannon's work and cryptocurrency? 545 00:32:53,436 --> 00:32:58,596 Speaker 1: Yeah. So what attracts me to work in this area, blockchain, 546 00:32:59,636 --> 00:33:05,836 Speaker 1: is that blockchain actually has one very common philosophical connection 547 00:33:05,956 --> 00:33:09,836 Speaker 1: to information theory, which is the following. In blockchain, the 548 00:33:09,916 --> 00:33:14,076 Speaker 1: problem is not communication per se. It's called consensus. Okay, 549 00:33:14,436 --> 00:33:18,836 Speaker 1: it's a different problem, but it essentially allows a bunch 550 00:33:18,876 --> 00:33:21,396 Speaker 1: of users at different places to come to an agreement 551 00:33:22,036 --> 00:33:28,036 Speaker 1: on something. Okay, yes. Now, the goal of designing blockchain 552 00:33:28,436 --> 00:33:33,036 Speaker 1: is really to be so called fault tolerant, fault tolerant, which 553 00:33:33,076 --> 00:33:37,716 Speaker 1: means that even if, say, one 554 00:33:37,876 --> 00:33:40,876 Speaker 1: third of the users are bad guys and send you 555 00:33:40,916 --> 00:33:46,396 Speaker 1: some gibberish messages, you can still, the rest, the two thirds of 556 00:33:46,396 --> 00:33:50,396 Speaker 1: people, can still come to an agreement. Okay, all right. 557 00:33:51,516 --> 00:33:53,556 Speaker 1: So you look at this problem, it's actually not that 558 00:33:53,636 --> 00:33:58,396 Speaker 1: different from communication, information theory, because it's kind of combating... 559 00:33:58,116 --> 00:34:00,076 Speaker 2: The bad guys are the noise and the good guys 560 00:34:00,076 --> 00:34:01,076 Speaker 2: are the signal. 561 00:34:01,156 --> 00:34:03,116 Speaker 1: And the good guys are the signal, and they try 562 00:34:03,156 --> 00:34:07,996 Speaker 1: to introduce redundancy, okay, to help them fight against 563 00:34:08,036 --> 00:34:08,796 Speaker 1: these bad guys. 564 00:34:09,556 --> 00:34:12,716 Speaker 2: And there's an optimization problem where, like, the more redundancy 565 00:34:12,796 --> 00:34:14,796 Speaker 2: you have, the sort of slower the system is, the 566 00:34:14,836 --> 00:34:15,636 Speaker 2: more ponderous. 567 00:34:16,716 --> 00:34:19,556 Speaker 1: And so there's an optimization problem, which is to try 568 00:34:19,596 --> 00:34:22,956 Speaker 1: to figure out what is the maximum number of bad 569 00:34:22,996 --> 00:34:25,876 Speaker 1: guys that you can tolerate and your system still works. 570 00:34:26,436 --> 00:34:29,436 Speaker 1: That is analogous to the capacity problem.
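The "one third" is itself a sharp threshold, much like capacity. As an editorial note (standard background on Byzantine agreement, not spelled out in the episode): in the classical setting, without extra assumptions such as digital signatures, n parties can reliably reach agreement despite f arbitrarily misbehaving parties only when

```latex
n \ge 3f + 1, \quad \text{equivalently} \quad f < \frac{n}{3}
```

Below that threshold, protocols exist that always agree; at or above it, none can. It is the same kind of cliff Shannon found for noisy channels.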
So I 571 00:34:29,476 --> 00:34:35,076 Speaker 1: find the philosophical connection very appealing, and so that's sort 572 00:34:35,116 --> 00:34:37,316 Speaker 1: of one reason why I got attracted to work in 573 00:34:37,356 --> 00:34:39,116 Speaker 1: this area. 574 00:34:39,316 --> 00:34:42,636 Speaker 2: Why do you think more people don't know about Shannon? 575 00:34:42,876 --> 00:34:47,676 Speaker 2: Like, all of the sort of intellectuals in technology say 576 00:34:48,956 --> 00:34:53,756 Speaker 2: he's like one of the great thinkers of the twentieth century, 577 00:34:54,396 --> 00:34:57,636 Speaker 2: but most people have never heard of him. Why do 578 00:34:57,676 --> 00:34:58,356 Speaker 2: you think that is? 579 00:34:59,556 --> 00:35:05,756 Speaker 1: So, Shannon was actually a very shy person, very shy person. 580 00:35:06,396 --> 00:35:13,036 Speaker 1: He hated publicity. He hated when people interviewed him. He was, remember, right, 581 00:35:13,636 --> 00:35:16,156 Speaker 1: basically a very modest person. Remember the first paragraph 582 00:35:16,236 --> 00:35:19,076 Speaker 1: I told you about. Yeah, he tells you what he 583 00:35:19,116 --> 00:35:22,356 Speaker 1: is not accomplishing. Yeah. And so he's a very modest, 584 00:35:22,516 --> 00:35:26,156 Speaker 1: very shy person, not into publicity. And I think that 585 00:35:26,916 --> 00:35:32,076 Speaker 1: sort of impacted not only himself, but also everybody who 586 00:35:32,076 --> 00:35:36,316 Speaker 1: works in that field, uh huh, who adopts this as kind 587 00:35:36,316 --> 00:35:38,676 Speaker 1: of like a metric, right, that, hey, we should all 588 00:35:38,716 --> 00:35:41,436 Speaker 1: be modest, because, what, look at this guy who accomplished 589 00:35:41,476 --> 00:35:44,476 Speaker 1: so much and he's still so modest. Who are we? 590 00:35:44,996 --> 00:35:48,876 Speaker 1: Who are we? Right? So, as a result, the field 591 00:35:48,876 --> 00:35:52,996 Speaker 1: doesn't really sell itself very well. The marketing engine, the 592 00:35:53,036 --> 00:35:56,796 Speaker 1: marketing DNA, is not there. Yeah, and so people don't 593 00:35:56,836 --> 00:35:58,356 Speaker 1: know about him. 594 00:35:58,556 --> 00:36:00,516 Speaker 2: So I want to talk for a minute about the 595 00:36:00,556 --> 00:36:03,996 Speaker 2: rest of Shannon's life. He writes this huge paper when 596 00:36:03,996 --> 00:36:07,236 Speaker 2: he's in his early thirties, eventually goes on to be 597 00:36:07,276 --> 00:36:11,156 Speaker 2: a professor at MIT, and he seems to spend a 598 00:36:11,236 --> 00:36:17,916 Speaker 2: lot of his career juggling, riding a unicycle, building mechanical toys, 599 00:36:17,956 --> 00:36:21,476 Speaker 2: building games, and he never, you know, does sort of 600 00:36:21,596 --> 00:36:25,876 Speaker 2: great influential work again, and I'm curious, you know, what 601 00:36:25,916 --> 00:36:27,716 Speaker 2: do you, what do you make of that? How do 602 00:36:27,716 --> 00:36:29,796 Speaker 2: you sort of fit his whole career together? 603 00:36:31,036 --> 00:36:34,076 Speaker 1: So there's a single theme that unifies all 604 00:36:34,116 --> 00:36:39,116 Speaker 1: this in my mind, which is playfulness. Because in his mind, 605 00:36:39,476 --> 00:36:46,036 Speaker 1: research is really about puzzles. Uh, he doesn't understand something.
606 00:36:46,796 --> 00:36:48,996 Speaker 1: it's like a puzzle to him, and he's trying to 607 00:36:49,036 --> 00:36:53,476 Speaker 1: figure out the pieces of the puzzle. Information theory was 608 00:36:53,556 --> 00:36:56,916 Speaker 1: like that. He saw all these real-world systems; 609 00:36:56,996 --> 00:36:59,396 Speaker 1: they seemed to all share some commonality, but nobody understood it. 610 00:36:59,436 --> 00:37:01,596 Speaker 1: So there was a puzzle, and he was always thinking about the puzzle. 611 00:37:02,636 --> 00:37:06,316 Speaker 1: And finally his paper basically solved that puzzle. So everything 612 00:37:06,356 --> 00:37:09,436 Speaker 1: to him is playfulness. I think to him it's playing: there's a 613 00:37:09,476 --> 00:37:13,076 Speaker 1: game, a puzzle, and he needs to solve the puzzle, and that, 614 00:37:13,276 --> 00:37:15,716 Speaker 1: in his mind, that's how he saw his work. So although the 615 00:37:15,716 --> 00:37:19,956 Speaker 1: things that he did pre and post 616 00:37:20,036 --> 00:37:24,316 Speaker 1: information theory seem very different, it's actually, in my mind, quite strongly unified. 617 00:37:29,796 --> 00:37:31,756 Speaker 2: We'll be back in a minute with the lightning round. 618 00:37:40,116 --> 00:37:45,476 Speaker 2: So I read that you recently asked people at your 619 00:37:45,556 --> 00:37:50,196 Speaker 2: company to give five-minute talks. I'm curious why you 620 00:37:50,236 --> 00:37:51,916 Speaker 2: did that. That's interesting to me. Why'd you do that? 621 00:37:53,796 --> 00:37:59,396 Speaker 1: So, the shortest talks are the hardest to give. If 622 00:38:00,156 --> 00:38:04,356 Speaker 1: you can't explain an idea in five minutes, then I 623 00:38:04,356 --> 00:38:05,836 Speaker 1: think your idea is actually not very good. 624 00:38:06,036 --> 00:38:08,596 Speaker 2: Ah, that's good. 625 00:38:09,316 --> 00:38:12,196 Speaker 1: Most good ideas you can get the point across 626 00:38:12,236 --> 00:38:15,516 Speaker 1: in five minutes. Remember, I'm an information theorist by training, 627 00:38:16,796 --> 00:38:21,436 Speaker 1: so communicating at the limit is what I'm passionate about. 628 00:38:21,756 --> 00:38:25,156 Speaker 2: If you had to give a five-minute talk, what 629 00:38:25,196 --> 00:38:25,916 Speaker 2: would it be about? 630 00:38:27,316 --> 00:38:33,316 Speaker 1: How about Shannon? I guess he's my hero. He's my hero. 631 00:38:34,636 --> 00:38:39,196 Speaker 2: So, you talked about the importance of timing in 632 00:38:40,116 --> 00:38:44,076 Speaker 2: research, of not only finding the right problem, but finding 633 00:38:44,076 --> 00:38:46,036 Speaker 2: the right problem at the right time, right, both in 634 00:38:46,116 --> 00:38:49,156 Speaker 2: terms of Shannon's work and in terms of your work. 635 00:38:51,316 --> 00:38:54,876 Speaker 2: You know, you're also a professor and, you know, a manager. Like, 636 00:38:55,036 --> 00:38:58,156 Speaker 2: how do you help other people find the right problem 637 00:38:58,196 --> 00:38:58,996 Speaker 2: at the right time? 638 00:39:01,196 --> 00:39:04,716 Speaker 1: Yeah, finding the right problem at the right time is 639 00:39:04,716 --> 00:39:11,276 Speaker 1: probably the most difficult, because, you know, timing is everything. However, 640 00:39:12,076 --> 00:39:14,796 Speaker 1: this is hard to teach. What we try to do 641 00:39:16,076 --> 00:39:21,756 Speaker 1: is to be ready. So one very famous information theorist 642 00:39:21,836 --> 00:39:25,756 Speaker 1: told me this.
He said, you know, everybody will get 643 00:39:25,836 --> 00:39:30,396 Speaker 1: lucky at some point in their career. However, 644 00:39:30,556 --> 00:39:34,076 Speaker 1: most people, when they get lucky, they're not ready, so 645 00:39:34,116 --> 00:39:37,036 Speaker 1: they don't realize that they got lucky, and so they 646 00:39:37,076 --> 00:39:40,276 Speaker 1: miss the opportunity. They go in a different direction. Luck tells 647 00:39:40,276 --> 00:39:42,396 Speaker 1: you you should go this way, but you went the other way. 648 00:39:42,756 --> 00:39:43,236 Speaker 1: Lost it. 649 00:39:43,596 --> 00:39:45,516 Speaker 2: That makes me so scared. 650 00:39:45,836 --> 00:39:49,396 Speaker 1: And so what I teach my students is: always be ready. 651 00:39:49,996 --> 00:39:52,476 Speaker 1: It's like your muscles. You have to be always training 652 00:39:52,476 --> 00:39:54,796 Speaker 1: your muscles so that when you are lucky, you can 653 00:39:54,836 --> 00:39:58,476 Speaker 1: capitalize on the luck. 654 00:39:58,596 --> 00:40:02,476 Speaker 2: So you talked about Shannon's playful nature, like 655 00:40:02,556 --> 00:40:05,276 Speaker 2: he was a juggler, he rode a unicycle. Do you do 656 00:40:05,316 --> 00:40:08,876 Speaker 2: anything like that? Do you have any weird hobbies? 657 00:40:10,396 --> 00:40:14,276 Speaker 1: No, no, the only weird hobby is I love to 658 00:40:14,276 --> 00:40:15,116 Speaker 1: talk to people like 659 00:40:15,116 --> 00:40:19,636 Speaker 2: you. Fair, you love going on podcasts. That is 660 00:40:19,676 --> 00:40:24,116 Speaker 2: the juggling of the twenty-first century. Who's your second 661 00:40:24,156 --> 00:40:25,756 Speaker 2: favorite underrated thinker? 662 00:40:27,916 --> 00:40:33,516 Speaker 1: My advisor, Gallager. He taught me how 663 00:40:33,556 --> 00:40:38,916 Speaker 1: to think about research, because he learned from Shannon and 664 00:40:38,916 --> 00:40:40,596 Speaker 1: I learned from him. 665 00:40:40,876 --> 00:40:44,436 Speaker 2: And if you boil down what your advisor learned from 666 00:40:44,476 --> 00:40:47,076 Speaker 2: Shannon and what you learned from your advisor, what would 667 00:40:47,076 --> 00:40:48,116 Speaker 2: it be? What did you learn? 668 00:40:49,556 --> 00:40:54,356 Speaker 1: Yeah, I learned about taking a very complicated problem and 669 00:40:54,436 --> 00:40:59,036 Speaker 1: stripping it down to the essentials and then formulating a 670 00:40:59,076 --> 00:41:04,396 Speaker 1: problem around that and solving it. That's an art. It's 671 00:41:04,436 --> 00:41:07,756 Speaker 1: not something you can convert into a mathematical 672 00:41:07,836 --> 00:41:12,676 Speaker 1: formula and teach students. It's just based on intuition, experience. 673 00:41:13,596 --> 00:41:17,756 Speaker 1: And that's what Shannon taught my advisor, and that's what 674 00:41:17,836 --> 00:41:20,236 Speaker 1: my advisor taught me, and that's what I try to 675 00:41:20,236 --> 00:41:24,236 Speaker 1: teach my students. Really, teaching is not really about giving 676 00:41:24,276 --> 00:41:27,716 Speaker 1: the formula; it's really just learning by example. I observed 677 00:41:27,836 --> 00:41:31,116 Speaker 1: what he did, and my students observe what I 678 00:41:31,236 --> 00:41:34,756 Speaker 1: do as I interact with them. And hopefully this art 679 00:41:34,956 --> 00:41:38,196 Speaker 1: will carry on from generation to generation.
680 00:41:38,636 --> 00:41:41,276 Speaker 2: Finding the essence of the problem. 681 00:41:41,476 --> 00:41:50,036 Speaker 1: Yeah. 682 00:41:50,076 --> 00:41:54,836 Speaker 2: David Shay is a professor at Stanford. Today's show was produced 683 00:41:54,836 --> 00:41:58,036 Speaker 2: by Gabriel Hunter Chang. It was edited by Lidia Jean 684 00:41:58,116 --> 00:42:01,836 Speaker 2: Kott and engineered by Sarah Bruguier. You can email us 685 00:42:01,876 --> 00:42:05,716 Speaker 2: at problem at Pushkin dot FM. I'm Jacob Goldstein, and 686 00:42:05,756 --> 00:42:08,116 Speaker 2: we'll be back next week with another episode of What's 687 00:42:08,116 --> 00:42:14,596 Speaker 2: Your Problem.