Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? Well, it's time for a TechStuff Tidbits episode. And in the world of tech, we often refer to various laws that aren't actually laws at all, either in the legal or the scientific sense. So I thought I would dedicate a Tidbits episode to some of those laws, not all of them, and talk about where they came from and what they mean. So keep in mind, these laws are really more observations of things like trends in technology, and they have a tendency to be relevant, but there's no fundamental aspect of the universe that actually forces them to be true. Also keep in mind, again, we're just looking at some of the observations we refer to as laws in tech. There are a lot more of them out there than the ones I'm going to cover, and some of those end up getting super technical. And in fact, we're going to take a pretty high-level approach on most of these. But the first one we should start off with is, of course, Moore's law. That's perhaps the most frequently referenced law, quote unquote, in tech. Now, these days we generally interpret Moore's law to mean that every eighteen months to two years, the computers we produce double in processing power, meaning a computer produced today has twice the processing capability of a computer that was produced two years ago, assuming that you're actually listening to this episode around the time it originally aired. This is a fairly loose interpretation of the observation that Gordon Moore made in his paper "Cramming More Components onto Integrated Circuits," way back in nineteen sixty-five. Moore observed that, due to many different factors, not all of them directly technological, there was this trend for semiconductor companies, fabricators, to double the number of transistors on a silicon chip every year in the early days, and that held true for about a decade.
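If you want to play with that doubling yourself, here's a minimal sketch in Python. The starting transistor count is made up purely for illustration; only the doubling cadence comes from Moore's observation (including the two-year revision we'll get to in a moment):

```python
# A toy projection of Moore's observation: transistor counts doubling
# on a fixed cadence. The starting count of 64 is illustrative, not an
# actual industry figure.

def projected_transistors(start_count: int, start_year: int,
                          year: int, doubling_period_years: float) -> float:
    """Project a transistor count forward assuming steady doubling."""
    periods = (year - start_year) / doubling_period_years
    return start_count * 2 ** periods

# Doubling every year (Moore's original 1965 observation):
print(projected_transistors(64, 1965, 1975, 1.0))   # 65536.0

# Doubling every two years (the later revision, discussed below):
print(projected_transistors(64, 1965, 1975, 2.0))   # 2048.0
```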
But Moore would later revise his observation in the seventies to say every two years, and that's mostly where it's sat ever since. So, in other words, an integrated circuit in nineteen sixty-five would have twice the number of transistors as one that was produced in nineteen sixty-four, and an integrated circuit in nineteen sixty-six would have twice as many transistors as the one in nineteen sixty-five. And you could project this out and make predictions and so on, and then eventually we hit a point where this slowed down a bit, and it took two years to double the number of components rather than just one. Now, Moore's observation took into account not just the advance in technological capabilities that would be required to make this happen (we'd actually have to build out the systems to make these components smaller and then cram them onto a silicon chip, as he would say); we also had to take into account the economic drivers that would push companies to pursue this trend. See, there has to be a reason for the push to cram more components onto the circuit, because if there's no reason to do it, companies wouldn't pour the money into making it happen. After all, building out more complex circuits means investing a lot of money, time, and expertise into finding new ways to produce smaller and smaller components, and then designing a chip architecture that takes advantage of those smaller components. So there has to be that economic driver, or else the expense of doing this would be prohibitive. You just wouldn't do it. Now, lots of folks have predicted Moore's law would end as semiconductor fabrication facilities started hitting increasingly challenging obstacles. And one big obstacle is truly a physical one, as in physics itself. Once you get your components down to the nanoscale (a nanometer is a billionth of a meter), you start having to account for quantum effects.
And these strange effects don't happen at the macro scale, so they seem almost magical to us. They seem like stuff that would be impossible, because in our daily experience we don't encounter anything like this. So, for example, there is an effect called quantum tunneling. Imagine you've got a subatomic particle like an electron, and let's say you've got a channel, and this electron can travel down the channel. It's a one-way channel, so it can just go from one end to the other, and you've built it just for this purpose. And at the end of the channel you have a gate blocking the way. But it's a very, very thin gate, and the electron just moves down your channel and then, surprise, it gets close to the gate, and at some point it just appears on the other side of the gate, and the electron continues on its merry little way. Now, to you, it looks like the electron somehow dug a pathway through the gate and kept on going. But the electron didn't dig a path. It was just on one side of the gate at one point and then on the other side of the gate. And the reason for this is that electrons occupy more of an area than a specific point in space, or rather, they can inhabit any point within a certain area at any given time. So there's this small region where the electron could possibly occupy any of the points within that region. Now, if the gate is so thin that this region of possibility overlaps the gate to the other side, well, that means there is a chance, maybe a very small chance, but still a chance, that the electron could be on the other side of the gate without the gate ever having opened or the electron physically passing through it. And if there is a chance, that means that, given enough opportunities, it will happen. That's kind of what chance means.
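To put a rough number on that, here's a back-of-the-envelope sketch of how tunneling probability depends on barrier width, using the standard approximation for a rectangular barrier. The barrier height and widths are illustrative values, not figures from any real chip:

```python
import math

# A rough, order-of-magnitude sketch of why thin barriers leak.
# For a rectangular barrier, the tunneling probability falls off
# roughly as exp(-2 * kappa * width). All numbers are illustrative.

HBAR = 1.054571817e-34         # reduced Planck constant, J*s
M_ELECTRON = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19           # one electronvolt in joules

def tunneling_probability(barrier_ev: float, width_nm: float) -> float:
    """Approximate transmission through a rectangular barrier."""
    kappa = math.sqrt(2 * M_ELECTRON * barrier_ev * EV) / HBAR
    return math.exp(-2 * kappa * width_nm * 1e-9)

# For a 1 eV barrier, the probability rises dramatically as the gate
# thins from 5 nm down to 1 nm:
for width in (5.0, 2.0, 1.0):
    print(f"{width} nm: {tunneling_probability(1.0, width):.3e}")
```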
So this is a bad thing if you want an integrated circuit, which you can think of as being a very, very complicated system of pathways and gates for electrons to pass through or to be held back from. So for that reason, and for lots of other ones that we don't really need to get into, semiconductor manufacturers don't really reduce the size of all the components the way they used to back in the sixties and seventies, and leading up to fairly recent years, actually. But they have kept the naming convention. That is, you designate the type of the semiconductor node, the chip node, with a metric, such that you might call it a four-nanometer chip. Now, not that long ago, the metric would actually refer to the size of specific components on the chip itself. But these days it's really more of a way to say this chip performs at a higher level than, say, a ten-nanometer chip would. So it's really more to tell you about performance, but it's not necessarily in reference to the actual size of anything that's located on the chip itself. Now, there are a bunch of other laws in tech that reference or build off of Moore's law. So, for example, there's Rock's law, which is also sometimes called Moore's second law. This observes that as computational power increases, the cost to perpetuate Moore's law also increases. So, in other words, it gets progressively more expensive to meet the fabrication requirements in order to build more powerful processors that keep pace with Moore's law. Moore's law is something of a self-fulfilling prophecy. There are companies that push themselves to keep pace with Moore's law, even though, again, there's no fundamental law of the universe that requires them to do this. And this touches on something that I talked about in a recent episode about how Taiwan plays such an important part in the semiconductor industry. Taiwan began investing in fabrication facilities in the nineteen seventies and built on that considerably in the nineteen eighties.
Meanwhile, you had companies in the United States that were really starting to shift and focus primarily on chip design, but not chip fabrication, because the fabrication cycle required a huge recurring investment. You would spend millions of dollars to build out all the tools and facilities you would need to make a chip with components of a certain size. Meanwhile, your designers are coming up with the next generation of chips, and those are all going to need their own, you know, special equipment and facilities. So it's just a never-ending cycle of having to reinvest in your process. So you have these chip designers who are creating a chip architecture, but then they would outsource the actual manufacturing to companies around the world, with Taiwan taking the lead. Also, this implies that a lot of chip companies out there are just consciously working to keep Moore's law going, even if it might not economically make much sense to keep up that pace. There's almost, like, the weight of expectation upon it, if you will. Some observations play off Moore's law in other ways, or seem to evoke Moore's law. For example, there's Wirth's law, W-I-R-T-H, named after Niklaus Wirth, who wrote an article titled "A Plea for Lean Software." Wirth's observation was that as computers get faster, software is getting slower, and in effect, software gets slower at a rate that's greater than hardware's improvement. So hardware is getting faster, but software is getting slower faster than hardware is getting faster, if you will. So Wirth's law explains why you might go out and buy a brand-new computer and it doesn't necessarily feel that much faster than the one you had a couple of years ago. Now, it's pretty natural for us to think, let's say we're using a computer, and we might think, oh, if I buy a new one in a couple of years, it's gonna leave this one in the dust. It's gonna be so much faster.
But this ignores the fact that within those same two years, developers are going to make increasingly complex software that gobbles up those precious resources on the new machines. The software two years from now would likely require more than what your current machine could even handle. So Wirth was saying to software developers, hey, chill out and figure out ways to create less demanding software. Don't just pounce on computational capabilities just because they're there. Don't treat it like Everest, climbing it because it's there. And this always makes me think of triple-A video games, which frequently, on their highest settings anyway, have extremely high graphics processing demands, and they often push even the most powerful GPUs to their limits. And then you get the next entry in the franchise, and it will be even more demanding and will push whatever the current bleeding-edge GPU is to its limits, and so on. It never eases off. On a sort of similar note, and one that involves not just tech but human nature, is Brooks's law, and this comes from an observation by Fred Brooks back in nineteen seventy-five. He says that under some conditions, adding a person to a project, particularly software development, when the project is running behind schedule, will push the project even further behind. So, in other words, if a project is not going to meet its deadline, adding another person will likely make things worse. And there are lots of different reasons for that. I'm sure many of you are anticipating some of them. If you've ever gone through any sort of onboarding process, either with a company or a team, or if, bless your heart, you have been in charge of onboarding someone else, you know that it takes time for a new team member to get their bearings and get an understanding of how things are working and find ways to contribute to that process.
And until they get to that point, the new person is more likely to be a drain on resources rather than adding to them. And it's not their fault. They don't magically know where the project is in its development cycle, or what is working or what is not working, or how to make it better. It's just kind of the way things are. Other issues can also contribute to a slowdown. For example, as you add more people to a team, communication becomes more complicated. And boy howdy, do I feel this one. If you've ever used any sort of project management or communication platform to communicate with a team, you know, things like Slack or Basecamp or whatever, you know that things can get pretty chaotic as more people join a project. You can get crosstalk, you can get conversations that maybe should happen within a subgroup rather than the whole group. You can have points where various subgroups haven't communicated with one another, and so things get all jumbly. Just a lot of stuff that gets harder as more people join in. And, you know, that's something that we're going to touch on again a little bit later in this episode, when we talk about processors. Adding people or resources to some jobs sometimes makes sense, if those jobs can be divided up into individual tasks, but it makes less sense if the job isn't divisible. And we'll come back to that idea in just a moment. First, before we get into more law stuff, let's take a quick break.

We're back. Eroom's law. Here's another one that's a play on Moore's law, because Eroom is actually Moore spelled backwards. It's E-R-O-O-M. It describes the process of developing new drugs, like pharmaceuticals, and observes that developing new drugs gets progressively harder over time, which means it takes longer and costs more money to develop new drugs as time goes on.
Despite the fact that you've got scientists and engineers and chemists who have developed some really super cool high-tech equipment specifically for the purposes of drug development, even though our capabilities grow, the difficulty still grows. And generally, Eroom's law says that the cost of developing a new drug doubles pretty much every nine years. Okay, now, way back in nineteen fifty-three, which was more than a decade before Gordon Moore's article about cramming components onto integrated circuits would come out, there was a person named Herb Grosch who observed that computer performance increases as the square of the cost. So, in other words, as a computer's price doubles, its processing power should be four times as great. So if you're looking at two computers, and the first one is five hundred dollars and the second one is one thousand dollars, the second computer should be four times as powerful as the first computer. This is one of those laws that at various times wasn't, you know, totally accurate. But generally it's saying that as computers get more expensive, the price of computational performance, if you were able to divide that up on a per-unit basis, is actually coming down. Then there's Koomey's law, which describes the power efficiency of hardware over time. Essentially, Koomey said that every one point five seven years, the amount of battery you would need to handle a fixed computing load would fall by a factor of two, meaning that if you were to perform the exact same computational load on computers that were made essentially a year and a half apart from each other, the more recent computer would do it without consuming nearly as much power. After two thousand, Koomey adjusted this to say the trend would actually follow every two point six years instead of one point five seven.
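Here's a minimal sketch of both of those relationships in Python. The dollar figures and the ten-year window are illustrative, not historical benchmarks; only the square-of-cost rule and the halving periods come from the laws themselves:

```python
# Grosch's law: performance scales as the square of the cost.
def grosch_performance(cost: float, base_cost: float = 500.0,
                       base_performance: float = 1.0) -> float:
    return base_performance * (cost / base_cost) ** 2

print(grosch_performance(1000.0))  # 4.0: double the price, 4x the power

# Koomey's law: the energy needed for a fixed workload halves every
# doubling_period years (1.57 originally, 2.6 after 2000).
def koomey_energy_per_computation(years_elapsed: float,
                                  doubling_period: float = 1.57) -> float:
    return 0.5 ** (years_elapsed / doubling_period)

# Relative energy for the same workload run a decade later:
print(koomey_energy_per_computation(10, 1.57))  # ~0.012
print(koomey_energy_per_computation(10, 2.6))   # ~0.070
```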
And you might wonder, well, why doesn't this mean that we see laptop computers with batteries that give you, you know, all-day operation? Why don't they last many, many, many more hours than old laptops? After all, if we're talking about the battery load decreasing by a factor of two, even every two point six years, shouldn't that be the case? Well, then we have to remember Wirth's law and the fact that software bloats at a really fast rate, so we're not running the exact same computational loads as time goes on. The computational loads are getting bigger as time goes on, so it all kind of negates each other. All right. Then we have Gustafson's law, which gets back to that issue I was talking about with Brooks's law. You know, that was the one that says adding more people to a project doesn't necessarily speed things up. Gustafson's law covers how much faster a computer with parallel processors will complete a given task compared to a computer with a single-core processor. All right, now, when it comes to parallel processing, I use this analogy pretty much every single time to describe single-core versus multi-core processors. But let's do it again. Okay, let's say you've got yourself a math class, and there are six students in the math class. One student, and we're gonna call her Annie, is, like, super good at math. She is a genius. Now, the other five students in the class are really good at math, but they're not at Annie's level. One day, the teacher comes in and proposes a contest. She's going to hand out a pop quiz with five math problems on it. Annie will have to complete all five problems, but the other five students will each only have to solve one of the five problems. So student one gets problem one, student two gets problem two, et cetera. And then the teacher starts the clock, and lo and behold, the group of five students collectively finishes first.
Annie is faster at answering individual questions than her counterparts are, so she can answer question one before student one can do the same. But keep in mind, students two through five are still working on those other problems while Annie is still finishing up question one, so she is not able to finish her quiz faster than her collective classmates for that particular pop quiz. This is kind of like parallel processing. Parallel processors are great for certain types of computational problems, namely problems that can be solved in parts. The various cores can tackle the different parts of the problem, and then together they all arrive at the solution, and potentially they can do this much faster than a more powerful single-core processor could. However, not all problems are parallel. Some problems require a serial approach, S-E-R-I-A-L, not Cap'n Crunch. And a serial approach means that you can't just divide the problem up into parts. You have to go from the beginning all the way through to the end. And in those cases, the more powerful single-core processor is going to solve the problem faster than the parallel processor, where each core is not running at the same, you know, high clock speed. So Gustafson's law takes all this into account and gives a mathematical expression to estimate how much faster a parallel processor can solve a given type of problem, if you happen to know how much of that problem is serial rather than parallel. Alternatively, you could use this to estimate how slowly a single-core processor would solve a parallel problem.
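Here's that expression as a minimal sketch. Gustafson's law gives the scaled speedup as S = N + (1 - N) * s, where N is the number of processors and s is the serial fraction of the workload; the ten percent serial fraction below is just an illustrative number:

```python
# A minimal sketch of Gustafson's law. The serial fraction used in the
# examples is illustrative.

def gustafson_speedup(num_processors: int, serial_fraction: float) -> float:
    """Scaled speedup S = N + (1 - N) * s, where s is the fraction of
    the workload that must run serially."""
    n = num_processors
    return n + (1 - n) * serial_fraction

# With 10% of the work stuck in serial execution:
print(gustafson_speedup(8, 0.10))   # 7.3x
print(gustafson_speedup(64, 0.10))  # 57.7x

# A fully serial job gains nothing from extra cores:
print(gustafson_speedup(64, 1.0))   # 1.0x
```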
All right. Well, what if you were to look at Moore's law and say, hey, that's nice and all, but it's way too slow? Well, then you could take a gander at Neven's law. This is named after Hartmut Neven, who works in disciplines like quantum computing and robotics, and his law proposes that quantum computers increase in power at a doubly exponential rate, which is crazy fast. And we can sort of see why this is, too. So, for classical computers, the basic unit of information is the bit, and a bit is a binary digit, so we designate it as either a zero or a one, which you can think of like a switch being turned off or on. Using lots of bits, we can tell machines to do all sorts of stuff based on specific input. This is the basis of computer science. But quantum computing has a different basic unit. It is called the qubit, or quantum bit. A bit can either be a zero or a one, right? It has to be one or the other. It is binary. But a qubit, thanks to stuff like superposition, can effectively inhabit both the zero state and the one state simultaneously, as well as, technically, all states in between. So as you add in the ability to handle more qubits, as you build quantum systems that have more qubits incorporated into them, you vastly expand what the computer is capable of doing. Your collection of qubits can, in a sense, perform drastically larger numbers of simultaneous processes to solve a specific subset of computational problems. That sounds like word salad, but it does make sense if you start to break it down. And it doesn't mean that a quantum computer is good for every kind of computational problem, just like a parallel processor is not going to be better at a serial kind of computational problem than a single-core processor could be.
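Before moving on, here's a rough sketch of where that doubly exponential growth comes from: an n-qubit register is described by two-to-the-n amplitudes, and if the qubit count itself grows exponentially over time, the two exponentials compound. The starting qubit count and the doubling-per-generation assumption are both made up for illustration:

```python
# A hedged sketch of the "doubly exponential" idea behind Neven's law.
# The assumption that qubit counts double each hardware generation is
# illustrative, a Moore-style stand-in, not a measured trend.

def state_space_dimension(num_qubits: int) -> int:
    """An n-qubit register is described by 2**n complex amplitudes."""
    return 2 ** num_qubits

def qubits_at_generation(gen: int, start: int = 2) -> int:
    """Assume the qubit count doubles each hardware generation."""
    return start * 2 ** gen

for gen in range(5):
    n = qubits_at_generation(gen)
    print(f"gen {gen}: {n} qubits -> {state_space_dimension(n):,} amplitudes")
# gen 0: 2 qubits -> 4 amplitudes ... gen 4: 32 qubits -> 4,294,967,296
```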
A quantum computer is not going to be great at every single type of computational load, but for a subset of them, they could be phenomenal, as long as computer scientists develop effective algorithms to leverage the quantum computing. And these are problems that would take a traditional computer decades, or maybe centuries or thousands of years, to complete, depending upon the complexity. So one example of the type of problem that quantum computers might be really good at tackling is known as the traveling salesman problem. And here's a version of this problem. Let's say you've got a salesman whose region includes, you know, ten different cities, and the salesman's trying to figure out the most efficient route that will allow him to visit every city at least once with the least amount of travel time. And in a classical computer system, a computer would have to go through every single possible variation of every route between every city and record the results. And then, once it had run every single variation, it could compare all the results against each other and determine which route would be the fastest. And that, depending again on the complexity of the problem, could take hundreds of years. A quantum computer with a sufficient number of qubits, along with the appropriate algorithm, could potentially solve this problem much faster. Also, interestingly, the solutions that quantum computers generate are actually listed in probabilities, not certainties. So you would get an answer that might have, like, a threshold of certainty saying this is the right answer, which means it might not be, but it probably is. Now, I'm being very fast and loose and very, very high level with this description. It gets so much more complicated and technical than what I'm saying. But this is just to give you an idea of what quantum computers will be used for.
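Just to make the classical brute-force part concrete, here's a minimal sketch of it in Python. The four-city distance table is made up for illustration; the point is that the number of routes to check grows factorially, which is why ten cities already means over three hundred thousand orderings:

```python
import itertools

# A minimal sketch of the brute-force approach described above:
# enumerate every route from a fixed starting city and keep the
# shortest. The distance table is invented for illustration.

def route_length(route, distances):
    """Total travel time for visiting cities in the given order."""
    return sum(distances[a][b] for a, b in zip(route, route[1:]))

def brute_force_tsp(cities, distances, start):
    best_route, best_length = None, float("inf")
    # Try every ordering of the remaining cities: (n-1)! possibilities.
    for perm in itertools.permutations(c for c in cities if c != start):
        route = (start, *perm)
        length = route_length(route, distances)
        if length < best_length:
            best_route, best_length = route, length
    return best_route, best_length

cities = ["A", "B", "C", "D"]
distances = {
    "A": {"B": 2, "C": 9, "D": 10},
    "B": {"A": 2, "C": 6, "D": 4},
    "C": {"A": 9, "B": 6, "D": 3},
    "D": {"A": 10, "B": 4, "C": 3},
}
print(brute_force_tsp(cities, distances, "A"))  # (('A', 'B', 'D', 'C'), 9)
```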
They won't be used for everything, but for the applications that they can be used for, they can potentially be far more powerful than any classical system. All right. We've got a few more laws to cover before we wrap this up, but let's take another quick break.

Okay, let's talk about a few more laws. Robert Metcalfe made an observation that we now call Metcalfe's law, and this has to do with the value of a telecommunications network. And by value, we're kind of talking about the number of possible connections that can exist within a network. And the mathematical expression of this law is n times n minus one, all divided by two, where n represents the number of nodes in a network. All right, let's start using actual examples to kind of explain what this means. So let's say we have the simplest network imaginable. Let's say we've got a couple of kids, and they've got two tin cans and a string connecting them, so the two kids can talk to one another. But that's it, right? You can only have two people using that network, and so that means our n, in this case the number of nodes, is two. We have two people, two nodes. So then we take that mathematical expression, you know, n times n minus one, divided by two. So that means it's two times two minus one, divided by two. Two minus one is one, so then you have two times one. That means you've got two. You divide by two, you're left with one. So one is the number of connections we can make with this network. We have two nodes; only one connection can happen. What happens if we add a third person in there? Let's say we've graduated from tin cans and string to an actual, like, telephone system. Well, now our n is three, so it's three times three minus one, divided by two. Three minus one is two, two times three is six, six divided by two is three. We went from one connection to three potential connections just by adding one person.
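That arithmetic is simple enough to put in a one-line function, a minimal sketch of the formula we just walked through:

```python
# Metcalfe's law: possible pairwise connections among n nodes.
def metcalfe_connections(n: int) -> int:
    return n * (n - 1) // 2

for n in (2, 3, 20):
    print(n, "->", metcalfe_connections(n))
# 2 -> 1, 3 -> 3, 20 -> 190
```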
Now let's say we go up to twenty. We go through that same mathematical expression, which I'm not going to walk you through because you've heard it several times, but it ends up with one hundred ninety. So again, with two people, you have a maximum number of connections set at one. With twenty, you have one hundred ninety possible different connected pairs. So the more nodes you have in the communications network, the higher the value of the network. On a similar note, David P. Reed, R-E-E-D, observed that the usefulness of large networks, specifically social networks, scales exponentially with the size of the network. So even adding just a few people to a social network creates exponential growth in that network's utility. And Reed described this in terms of subgroups: the more people who join a network, the more subgroups can exist within that network. And again, you have a mathematical expression you can use to describe this. In this case, you would say that to get the number of subgroups, you take the number of people in your network, or the number of components in your network, and this is your n, and you take two to the power of n, then you subtract n from that number, and then you subtract one from that number. So let's say we've got eight people in this social network. Two to the power of eight is two hundred fifty-six. We subtract eight from that, and we get two hundred forty-eight. We subtract one from that, and we get two hundred forty-seven. So with eight people, you could potentially have as many as two hundred forty-seven subgroups. And obviously it just gets crazier from there as you start to add people. So this points out how social networks can very quickly grow and become more important.
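And here's the companion sketch for Reed's law, counting every subgroup of two or more people, just to show how much faster it blows up than Metcalfe's pairwise count:

```python
# Reed's law: 2**n - n - 1 nontrivial subgroups in a network of n people.
def reed_subgroups(n: int) -> int:
    return 2 ** n - n - 1

for n in (8, 16, 32):
    print(n, "->", reed_subgroups(n))
# 8 -> 247, 16 -> 65519, 32 -> 4294967263
```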
Though it's obviously not a guarantee, because while those eight people could potentially have as many as two hundred forty-seven subgroups, that doesn't mean you would actually see every single variation of a group play out in real life. Like, maybe Annie and her mathematics classmates are all part of those eight people, and Annie refuses to be in any subgroup with her five classmates. Well, that would eliminate a ton of options there. But you get the idea. And then we have a couple of dark laws in tech, laws or observations that show some of the not-so-pleasant sides of technology. One of those is called Zimmermann's law, which states that the capability of computational devices to track what we're doing doubles every eighteen months. And when you think about stuff like social networks, location tracking, targeted advertising, all this kind of thing, you start to see what Zimmermann was saying with this. And some of this is based on the capabilities of the technology, how the technology is getting better at doing this stuff, but some of it also has to do with how we choose to interact with tech. Like, obviously, if we weren't participants in this, or at least if we weren't eager participants in this, it wouldn't have quite the same level of effectiveness. So if more and more people were to say, you know what, I'm not going to do social network stuff anymore, maybe they don't even use a smartphone or anything like that, it would cut way back on this. But most of us are willing participants, to some degree or another, in this system, and that just makes it more effective. And then we have Godwin's law. Anyone who's been part of any online community is likely to be aware of Godwin's law.
It's named after Mike Godwin, and this law essentially says that the longer any online discussion goes, the more likely it is that someone will bring up a comparison that involves Nazis or Hitler, and that should a conversation go on long enough, the probability becomes a certainty. And Godwin observed this way back in the Usenet newsgroup days. Essentially, people would get into conversations, someone would disagree with someone else, things would get heated, and ultimately someone would compare either a person or an idea to something that the Nazis would have celebrated or something that Hitler would have proposed, and then Godwin's law would be complete. Like, you would have said, yes, this is Godwin's law; this conversation has reached the point where someone made that comparison. And essentially he was saying that if a conversation goes long enough, the probability that would happen ends up being a certainty. Now, some folks say that if someone invokes a comparison to Nazis or Hitler in a conversation, that person automatically loses whatever the argument was about. Essentially, the idea is that you were unable to defend your position, so you ended up resorting to this comparison, this emotionally charged comparison. Therefore, your position is indefensible and you lose. You get nothing, as Willy Wonka would say. Now, I do not want to end on that note. Obviously, it's really a bummer of a note. So we're gonna throw in a bonus here: the laws of robotics, as originally proposed by science fiction author Isaac Asimov. And originally there were just three laws of robotics. The first law states that a robot may not cause harm to a human being or, through inaction, allow a human to come to harm. The second law is that a robot must obey any and all orders given to it by a human, except if doing so would violate the first law.
So I couldn't tell a robot to grab a specific chair just as you're trying to sit in that chair, because if the robot did that, you would miss the chair, and you would sit down and hit the floor and possibly hurt yourself. And so the robot would not be able to do that, even though it is otherwise compelled to obey every command given to it by a human. The third law is that a robot must protect itself, unless by doing so it would come into conflict with the first or second law. So if the robot protecting itself would allow a human to come to harm, well, then the robot will take whatever action is necessary, including actions that would harm itself. Now, later, Asimov would add the zeroth law. This states that a robot may not harm humanity or, through inaction, allow humanity to come to harm. So not just a human, but humanity in general. This is the law that really helps you get around that science fiction trope in which engineers create a super powerful artificial intelligence, like a superhuman AI, and then, you know, they use the AI as a decision-making engine and they ask it to bring about world peace. Like, this is a problem so big that humans can't solve it, but you are smarter than humans, so make world peace. And the AI ends up saying, well, the only guarantee for world peace is if I wipe out all the humans, because then they can't fight each other, and that's the only guarantee for world peace, so I guess I'd better get to it, launch nuclear weapons. Now, the zeroth law would presumably tell the AI, nope, that's off the table, try again. And then the AI would probably dissolve into goo, because the problem we gave it was way too hard. Anyway, those are the basic laws of robotics.
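Just for fun, here's a toy sketch of how that precedence ordering might be encoded. Everything in it, the Action type, the predicates, the whole idea that you could reduce the laws to boolean flags, is invented for illustration; real robots are, thankfully and unfortunately, not this simple:

```python
from dataclasses import dataclass

# A toy encoding of the precedence in Asimov's laws: the zeroth law
# outranks the first, which outranks the second, which outranks the
# third. All fields here are hypothetical simplifications.

@dataclass
class Action:
    description: str
    harms_humanity: bool = False
    harms_a_human: bool = False
    ordered_by_human: bool = False
    protects_robot: bool = False

def may_perform(action: Action) -> bool:
    if action.harms_humanity:      # zeroth law: overrides everything
        return False
    if action.harms_a_human:       # first law
        return False
    if action.ordered_by_human:    # second law: obey, if laws 0-1 allow
        return True
    return action.protects_robot   # third law: self-preservation last

print(may_perform(Action("grab the chair as someone sits down",
                         harms_a_human=True, ordered_by_human=True)))
# False: the first law overrides the human's order, as in the example above
```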
Obviously, science fiction authors, including Isaac Asimov, have played with those laws in various ways to create scenarios in which robots would behave in what you might consider an unpredictable manner, because it was the robot's method of attempting to complete a task while also trying to obey the laws of robotics. As the science fiction stories show, these basic ideas, while they might seem like they cover all your bases, don't necessarily result in that when you put them into practice. Which, again, is what good science fiction should do: teach us something, either about ourselves or about our potential to have blinders on when we make certain decisions and not foresee the consequences of our actions, so that maybe we spend a little more time considering things before we act on them. Anyway, I hope you enjoyed this TechStuff Tidbits. It ended up being almost the length of a regular episode, so I'm, again, really bad at this whole tidbit thing. But yes, that's just a selection of some of the quote unquote laws in technology. Maybe sometime I'll do a follow-up, or I'll cover some of the more specific laws and talk about, you know, how those came to be. They're, I would argue, more obscure except within specific sectors of the tech industry, and thus you're less likely to come across them. But we might do a follow-up episode in the future. If you have suggestions for topics I should cover in episodes of TechStuff, feel free to reach out to me. The best way to do that is over on Twitter, and the handle for the show is TechStuffHSW. And I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.