1 00:00:04,440 --> 00:00:12,200 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, 2 00:00:12,240 --> 00:00:15,720 Speaker 1: and welcome to TechStuff. I'm your host, Jonathan Strickland. 3 00:00:15,760 --> 00:00:18,680 Speaker 1: I'm an executive producer with iHeartRadio. And how the tech 4 00:00:18,720 --> 00:00:21,360 Speaker 1: are you today? I thought I would talk about a 5 00:00:21,400 --> 00:00:26,040 Speaker 1: few fundamental principles that underlie technology, at least as identified 6 00:00:26,079 --> 00:00:29,880 Speaker 1: by a blog called Simplicable. The name of the author 7 00:00:29,920 --> 00:00:32,559 Speaker 1: on all the articles I was looking at was John Spacey, 8 00:00:33,320 --> 00:00:35,440 Speaker 1: so it appears to be John Spacey at the helm 9 00:00:35,560 --> 00:00:37,960 Speaker 1: of this blog, which I think is based out of Singapore. 10 00:00:38,680 --> 00:00:43,600 Speaker 1: So these are concepts that apply to or drive technology 11 00:00:43,640 --> 00:00:47,880 Speaker 1: and technological change and innovation. And it goes beyond things 12 00:00:47,920 --> 00:00:52,479 Speaker 1: like circuits or electricity or mechanical systems or anything like that. 13 00:00:52,560 --> 00:00:58,520 Speaker 1: These are more like ideas and observations that underlie technology, 14 00:00:59,640 --> 00:01:03,160 Speaker 1: again as identified by Simplicable. So I don't wish to 15 00:01:03,200 --> 00:01:08,720 Speaker 1: suggest that these are universal fundamental principles, rather just that 16 00:01:08,800 --> 00:01:12,120 Speaker 1: I came across this list on Simplicable and I thought it 17 00:01:12,160 --> 00:01:13,920 Speaker 1: was interesting, so I thought I would talk about some 18 00:01:14,240 --> 00:01:18,119 Speaker 1: of them today. So first up, we've got a concept 19 00:01:18,160 --> 00:01:22,720 Speaker 1: that ties in with Moore's law to an extent. You 20 00:01:22,800 --> 00:01:26,280 Speaker 1: could argue that Moore's law is another fundamental principle. In fact, 21 00:01:26,319 --> 00:01:30,200 Speaker 1: it's one of the ones that's listed by Simplicable. I 22 00:01:30,240 --> 00:01:33,520 Speaker 1: think that's going a little far to call it a 23 00:01:33,520 --> 00:01:38,959 Speaker 1: fundamental principle. It's certainly an observation that people have attempted 24 00:01:39,000 --> 00:01:46,040 Speaker 1: to push beyond some pretty hard boundaries. But as a refresher, 25 00:01:46,120 --> 00:01:48,080 Speaker 1: just so that you understand what I'm talking about here, 26 00:01:48,640 --> 00:01:52,960 Speaker 1: Moore's law stems from an observation that Gordon Moore made 27 00:01:53,000 --> 00:01:56,000 Speaker 1: back in the middle of the last century, actually, when 28 00:01:56,040 --> 00:02:00,240 Speaker 1: he saw that economic and technological factors were contributing 29 00:02:00,320 --> 00:02:05,280 Speaker 1: to a system where there was an economic demand for 30 00:02:05,400 --> 00:02:11,560 Speaker 1: progressively more powerful microprocessors and later on computer chips, and 31 00:02:11,560 --> 00:02:16,280 Speaker 1: that this demand in turn created an incentive for fabrication 32 00:02:16,360 --> 00:02:19,240 Speaker 1: companies to come up with ways to meet that demand. Like, 33 00:02:19,280 --> 00:02:23,240 Speaker 1: it wasn't just magical that the fabrication companies were able 34 00:02:23,240 --> 00:02:26,680 Speaker 1: to make more powerful processors.
They saw the demand there 35 00:02:26,720 --> 00:02:29,200 Speaker 1: and then they said, okay, well, how can we make 36 00:02:29,280 --> 00:03:33,120 Speaker 1: something that meets that demand? It's not just magically happening 37 00:02:33,120 --> 00:02:36,040 Speaker 1: on its own. So every so often we would see 38 00:02:36,040 --> 00:02:40,639 Speaker 1: companies find new ways to fit more discrete components onto 39 00:02:40,720 --> 00:02:45,080 Speaker 1: a square inch of silicon wafer than they were able 40 00:02:45,080 --> 00:02:49,200 Speaker 1: to a couple of years previously. So generally speaking, 41 00:02:49,320 --> 00:02:53,960 Speaker 1: Moore's law boiled down to this: every eighteen to 42 00:02:54,040 --> 00:02:57,440 Speaker 1: twenty four months, fabrication companies would find a way to 43 00:02:57,639 --> 00:03:01,359 Speaker 1: double the number of transistors that they could fit onto 44 00:03:01,560 --> 00:03:06,120 Speaker 1: a square inch of silicon wafer. Now, 45 00:03:06,120 --> 00:03:11,440 Speaker 1: over time, this concept morphed into a similar but different one. 46 00:03:12,400 --> 00:03:14,440 Speaker 1: We still call it Moore's law, but now we don't 47 00:03:14,480 --> 00:03:18,919 Speaker 1: necessarily say that the processors of today have twice as 48 00:03:18,960 --> 00:03:22,480 Speaker 1: many transistors on them as the processors from two years ago, 49 00:03:22,520 --> 00:03:26,320 Speaker 1: for example. Now we say every two years or so 50 00:03:27,280 --> 00:03:30,840 Speaker 1: the chips that fabrication companies are producing are twice as 51 00:03:30,880 --> 00:03:34,160 Speaker 1: powerful as the ones from two years earlier. So, in 52 00:03:34,200 --> 00:03:36,520 Speaker 1: other words, if you bought a high end processor in 53 00:03:36,560 --> 00:03:40,200 Speaker 1: twenty ten, and then you bought another high end processor 54 00:03:40,360 --> 00:03:43,600 Speaker 1: in twenty twelve, the twenty twelve one should be twice 55 00:03:43,600 --> 00:03:47,200 Speaker 1: as powerful as the twenty ten one. And then in 56 00:03:47,200 --> 00:03:50,280 Speaker 1: twenty fourteen, if you bought one, that chip should be 57 00:03:50,320 --> 00:03:52,600 Speaker 1: twice as powerful as the one that you had bought 58 00:03:52,640 --> 00:03:56,680 Speaker 1: back in twenty twelve. Now we also tend to get 59 00:03:56,720 --> 00:03:59,560 Speaker 1: a little loosey goosey with the whole concept of what 60 00:03:59,680 --> 00:04:04,160 Speaker 1: power means in this context. Often this comes down to 61 00:04:04,320 --> 00:04:10,360 Speaker 1: processing speed, how fast can the chip process information, how 62 00:04:10,400 --> 00:04:16,120 Speaker 1: fast can it complete executions of operations. But powerful can 63 00:04:16,200 --> 00:04:20,840 Speaker 1: also include some other concepts like bit width, and you can 64 00:04:20,880 --> 00:04:23,479 Speaker 1: think of bit width as how large a chunk of 65 00:04:23,560 --> 00:04:28,560 Speaker 1: data the processor can handle. So if the bit width is greater, 66 00:04:29,080 --> 00:04:33,120 Speaker 1: the processor can handle larger chunks of data. To me, 67 00:04:33,279 --> 00:04:36,000 Speaker 1: one interesting thing about Moore's law is how we choose 68 00:04:36,040 --> 00:04:39,520 Speaker 1: to interpret it so that it will remain relevant over 69 00:04:39,560 --> 00:04:43,200 Speaker 1: the years.
Because pretty much everyone has acknowledged we're running 70 00:04:43,279 --> 00:04:46,880 Speaker 1: up against some obstacles that are just impossible to get 71 00:04:46,880 --> 00:04:49,400 Speaker 1: around based on the technology we have developed so far. 72 00:04:50,480 --> 00:04:53,120 Speaker 1: Let me explain that a little bit more. So, as 73 00:04:53,200 --> 00:04:56,120 Speaker 1: you reduce the size of these components so that you 74 00:04:56,160 --> 00:05:00,080 Speaker 1: can fit more of them onto a silicon wafer, you 75 00:05:00,120 --> 00:05:03,920 Speaker 1: start getting down to a size where you're encountering issues 76 00:05:03,960 --> 00:05:10,120 Speaker 1: with quantum mechanics, and these quantum mechanics issues are not 77 00:05:10,920 --> 00:05:14,440 Speaker 1: in alignment with the way that we want our electronics 78 00:05:14,480 --> 00:05:17,880 Speaker 1: to work. Essentially, we get to a point where the 79 00:05:17,920 --> 00:05:22,080 Speaker 1: pathways we have created for electrons are so small that 80 00:05:22,240 --> 00:05:25,520 Speaker 1: the electrons have the potential to exist in a different 81 00:05:25,560 --> 00:05:28,560 Speaker 1: part of the pathway than where we want them to be. So, 82 00:05:28,760 --> 00:05:32,679 Speaker 1: if you think of transistor gates as actually being physical gates, 83 00:05:33,360 --> 00:05:35,479 Speaker 1: as in, you know, when it's closed, you're not 84 00:05:35,520 --> 00:05:38,880 Speaker 1: allowed to go through. Once you get to a certain size, 85 00:05:38,920 --> 00:05:41,240 Speaker 1: there's a potential for an electron to be on the 86 00:05:41,279 --> 00:05:44,440 Speaker 1: other side of the gate without having to actually physically 87 00:05:44,480 --> 00:05:47,279 Speaker 1: pass through the gate. It just means that there's the 88 00:05:47,400 --> 00:05:50,000 Speaker 1: possibility that the electron could be on the other side. 89 00:05:50,040 --> 00:05:52,000 Speaker 1: And as long as there's a possibility, it means that 90 00:05:52,040 --> 00:05:56,400 Speaker 1: sometimes that's what happens. And if you can't control where 91 00:05:56,400 --> 00:05:59,479 Speaker 1: the electrons are, then the gates mean nothing, and it 92 00:05:59,520 --> 00:06:02,600 Speaker 1: means that you're going to get computational errors. So there 93 00:06:02,640 --> 00:06:05,960 Speaker 1: are fundamental physical limitations to how small we can make 94 00:06:06,040 --> 00:06:09,680 Speaker 1: things if they're working the way that we have designed 95 00:06:09,760 --> 00:06:13,000 Speaker 1: microprocessors for, you know, the better part of a century. 96 00:06:14,120 --> 00:06:17,960 Speaker 1: So that means that we have to come up with 97 00:06:18,000 --> 00:06:22,039 Speaker 1: other means, other ways to try and eke out more 98 00:06:22,120 --> 00:06:27,159 Speaker 1: performance in these chips if we want Moore's law to 99 00:06:27,200 --> 00:06:30,720 Speaker 1: remain relevant. And that's only if we say, all right, 100 00:06:30,720 --> 00:06:33,160 Speaker 1: we want Moore's law to remain relevant, but not the 101 00:06:33,240 --> 00:06:38,000 Speaker 1: original Moore's law. We're talking about our reinterpretation of that 102 00:06:38,120 --> 00:06:41,800 Speaker 1: observation that was made, you know, in the middle of 103 00:06:42,160 --> 00:06:46,960 Speaker 1: the nineteen hundreds. So that is kind of where we 104 00:06:47,000 --> 00:06:49,880 Speaker 1: are with Moore's law.
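Just to make the doubling arithmetic from a moment ago concrete, here is a minimal, purely illustrative sketch. The two-year doubling period and the twenty ten baseline are assumptions chosen to match the reinterpreted version of Moore's law described above; they are not figures taken from the episode or from Gordon Moore's original observation.

# Illustrative only: the two-year doubling period and the 2010 baseline are
# assumptions for this sketch, not figures from the episode.
def relative_performance(year, base_year=2010, doubling_period_years=2):
    """Relative 'power' of a chip from `year` versus one from `base_year`,
    assuming performance doubles every `doubling_period_years` years."""
    doublings = (year - base_year) / doubling_period_years
    return 2 ** doublings

for year in (2010, 2012, 2014, 2020):
    print(year, relative_performance(year))
# Prints 1x for 2010, 2x for 2012, 4x for 2014, and 32x for 2020, which is
# the compounding behind the "twice as powerful every two years" framing.

So under that assumed schedule, a twenty twenty chip would land at roughly thirty two times the twenty ten chip, which is why even a slightly slower real-world cadence still adds up dramatically over a decade.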
It's still sort of relevant, but 105 00:06:50,040 --> 00:06:53,599 Speaker 1: mostly because we're willing to bend on how we interpret 106 00:06:53,720 --> 00:06:59,400 Speaker 1: it and how we define it. So no one wants 107 00:06:59,400 --> 00:07:01,600 Speaker 1: to reach a point where they have to admit that 108 00:07:01,680 --> 00:07:05,400 Speaker 1: the rate of improvement is slowing down. No one wants 109 00:07:05,400 --> 00:07:08,159 Speaker 1: to get to that point. So we'll just keep on 110 00:07:08,279 --> 00:07:11,320 Speaker 1: moving things around, like doing the balls and cups routine, 111 00:07:11,960 --> 00:07:15,280 Speaker 1: until we can no longer fool ourselves into thinking that 112 00:07:15,400 --> 00:07:18,920 Speaker 1: we're able to keep this rate of change up. And 113 00:07:18,960 --> 00:07:22,040 Speaker 1: this is what brings us up to a different fundamental principle, 114 00:07:22,120 --> 00:07:27,160 Speaker 1: as defined by Simplicable, called accelerated change or accelerating change, 115 00:07:27,200 --> 00:07:29,800 Speaker 1: I should say, and that's just what it sounds like, 116 00:07:30,240 --> 00:07:34,440 Speaker 1: because we all realize that over time, stuff changes. Accelerating 117 00:07:34,560 --> 00:07:39,040 Speaker 1: change means that the rate of change itself is changing. 118 00:07:39,480 --> 00:07:42,400 Speaker 1: It's not just that things are changing day to day. 119 00:07:43,000 --> 00:07:46,640 Speaker 1: It's the concept that they're changing more quickly than the 120 00:07:46,720 --> 00:07:49,360 Speaker 1: rate of change was before, and that this is driven 121 00:07:49,400 --> 00:07:54,280 Speaker 1: by technology and how we use that technology. So with 122 00:07:54,400 --> 00:07:56,760 Speaker 1: this kind of concept, we would be able to look 123 00:07:56,760 --> 00:07:59,320 Speaker 1: at any ten year span. So let's say we looked 124 00:07:59,360 --> 00:08:03,560 Speaker 1: at nineteen fourteen to nineteen twenty three, and then let's 125 00:08:03,600 --> 00:08:08,000 Speaker 1: say we compared that with a more recent ten year span, 126 00:08:08,080 --> 00:08:12,240 Speaker 1: so let's say twenty fourteen to twenty twenty three. Then 127 00:08:12,280 --> 00:08:15,720 Speaker 1: we ask the question: in which of those ten year spans, 128 00:08:15,720 --> 00:08:19,480 Speaker 1: in which of those decades, would we see more change? 129 00:08:19,960 --> 00:08:21,840 Speaker 1: And if we want to get specific, where did we 130 00:08:21,880 --> 00:08:27,160 Speaker 1: see more technologically driven or oriented change? Since the nineteen 131 00:08:27,200 --> 00:08:30,680 Speaker 1: fourteen to nineteen twenty three decade predates the invention of 132 00:08:30,720 --> 00:08:33,920 Speaker 1: the transistor, it's pretty easy for us to say, well, 133 00:08:34,440 --> 00:08:37,600 Speaker 1: the most recent decade has seen way more innovation driven 134 00:08:37,640 --> 00:08:42,520 Speaker 1: by technology. It's not even comparable. And that's true, but 135 00:08:42,559 --> 00:08:45,679 Speaker 1: we should also remember once again, these advancements in technology, 136 00:08:45,679 --> 00:08:50,600 Speaker 1: they're not happening in a vacuum. It's not like technology, 137 00:08:50,679 --> 00:08:53,960 Speaker 1: if left to itself, will evolve and improve over time.
138 00:08:54,360 --> 00:08:58,320 Speaker 1: Stuff in the world shapes our approach to technology, and 139 00:08:58,360 --> 00:09:01,880 Speaker 1: then our technology shapes the stuff in the world and 140 00:09:02,000 --> 00:09:06,280 Speaker 1: our interaction with it. So in nineteen fourteen, you would say, well, 141 00:09:06,280 --> 00:09:10,600 Speaker 1: what kind of factors were influencing technological development in nineteen fourteen? Well, 142 00:09:10,640 --> 00:09:13,000 Speaker 1: there was a big one. It was the Great War, 143 00:09:13,320 --> 00:09:16,160 Speaker 1: which was later called World War One. They didn't call 144 00:09:16,200 --> 00:09:18,600 Speaker 1: it that at the beginning because there were still optimists 145 00:09:18,640 --> 00:09:21,680 Speaker 1: back then. They weren't expecting there to be a second one. 146 00:09:21,880 --> 00:09:23,840 Speaker 1: In fact, they called it the War to End all wars. 147 00:09:24,120 --> 00:09:29,040 Speaker 1: It turns out that was wrong. Anyway, this war spurred 148 00:09:29,320 --> 00:09:33,360 Speaker 1: a ton of innovation as various countries tried to find 149 00:09:33,360 --> 00:09:36,280 Speaker 1: more efficient ways to kill the enemy while sustaining fewer 150 00:09:36,320 --> 00:09:38,560 Speaker 1: losses of their own, or at the very least just 151 00:09:38,640 --> 00:09:40,960 Speaker 1: killing more of them than they managed to kill of us. 152 00:09:41,640 --> 00:09:45,880 Speaker 1: So we got lots of stuff like machine guns, motorized 153 00:09:45,960 --> 00:09:51,839 Speaker 1: military vehicles, airplanes were used in warfare. Chemical warfare became 154 00:09:51,880 --> 00:09:54,840 Speaker 1: a thing. We also got some other stuff that wasn't 155 00:09:55,040 --> 00:09:58,560 Speaker 1: expressly made to kill people, like gas masks, which were 156 00:09:58,559 --> 00:10:02,800 Speaker 1: meant to save people, and field radios. Anyway, my point 157 00:10:02,840 --> 00:10:07,080 Speaker 1: is that the concept of accelerating change isn't something that 158 00:10:07,080 --> 00:10:09,600 Speaker 1: we can just isolate from the rest of the world. 159 00:10:09,960 --> 00:10:13,079 Speaker 1: It's also something that can lead people to make predictions 160 00:10:13,080 --> 00:10:16,199 Speaker 1: that I think may at best be a long shot. 161 00:10:16,600 --> 00:10:20,359 Speaker 1: It's what has fueled a ton of discussion around concepts 162 00:10:20,480 --> 00:10:24,280 Speaker 1: like the singularity. This is the idea that we reach 163 00:10:24,320 --> 00:10:27,880 Speaker 1: a point where change is happening so quickly and it's 164 00:10:27,920 --> 00:10:31,720 Speaker 1: so constant that there is no way that you can 165 00:10:31,840 --> 00:10:35,120 Speaker 1: meaningfully talk about the way things are, because by the 166 00:10:35,160 --> 00:10:37,520 Speaker 1: time you're done making a sentence, they ain't that way 167 00:10:37,559 --> 00:10:42,160 Speaker 1: no more. They've changed already. So this idea relates 168 00:10:42,200 --> 00:10:45,400 Speaker 1: to other concepts like humans becoming something more than human, 169 00:10:45,960 --> 00:10:51,200 Speaker 1: the so-called transhuman approach. Transhuman in this case meaning 170 00:10:51,520 --> 00:10:55,680 Speaker 1: something beyond humanity. It doesn't have anything to do with 171 00:10:56,320 --> 00:11:01,240 Speaker 1: something like the transgender community.
That's a different thing, 172 00:11:01,840 --> 00:11:05,880 Speaker 1: a very important thing, but different. This concept of transhumanism is 173 00:11:05,880 --> 00:11:09,520 Speaker 1: about no longer being strictly human the way we would 174 00:11:09,559 --> 00:11:13,240 Speaker 1: define it today, and that could include lots of things. 175 00:11:13,720 --> 00:11:17,880 Speaker 1: It usually includes some form of augmented intelligence. Either we 176 00:11:17,960 --> 00:11:21,040 Speaker 1: figured out a way to boost our own biological intelligence, 177 00:11:21,800 --> 00:11:24,679 Speaker 1: or we've incorporated technology into us in some way that 178 00:11:24,679 --> 00:11:29,199 Speaker 1: then boosts our intelligence. Sometimes we also have an 179 00:11:29,200 --> 00:11:33,520 Speaker 1: idea of digital immortality thrown in there. Mostly it feels 180 00:11:33,559 --> 00:11:35,920 Speaker 1: a lot like stuff that's in the realm of science 181 00:11:35,920 --> 00:11:38,760 Speaker 1: fiction rather than stuff that's in reality. But I would 182 00:11:38,840 --> 00:11:41,800 Speaker 1: argue that it's based on this perception that change is 183 00:11:41,880 --> 00:11:45,800 Speaker 1: happening faster with every passing year. And if that's your 184 00:11:45,840 --> 00:11:48,560 Speaker 1: basic argument, well, then it stands to reason that at 185 00:11:48,600 --> 00:11:52,200 Speaker 1: some point the rate of change will be such that 186 00:11:52,200 --> 00:11:55,160 Speaker 1: there will be no meaningful way to quantify it. But 187 00:11:55,360 --> 00:11:57,800 Speaker 1: like Moore's law, I would say this belief is based 188 00:11:57,840 --> 00:12:01,720 Speaker 1: off things that may not be universally or permanently true. 189 00:12:02,400 --> 00:12:06,200 Speaker 1: I think one of the mistakes that some futurists make 190 00:12:06,880 --> 00:12:09,960 Speaker 1: is that they equate all change with what we see 191 00:12:10,040 --> 00:12:14,400 Speaker 1: in things like Moore's law, because Moore's law describes exponential 192 00:12:14,520 --> 00:12:19,400 Speaker 1: rates of improvement, right, at least with processor complexity, 193 00:12:19,440 --> 00:12:23,640 Speaker 1: which then we redefined as processor power or performance. So 194 00:12:23,679 --> 00:12:27,120 Speaker 1: that same rate of change would then apply to everything, 195 00:12:27,240 --> 00:12:30,280 Speaker 1: the way some futurists frame stuff out. But that's 196 00:12:30,360 --> 00:12:34,800 Speaker 1: just not true. We don't see exponential change in everything 197 00:12:34,840 --> 00:12:37,560 Speaker 1: that's related to technology. Some things actually change at an 198 00:12:37,559 --> 00:12:41,120 Speaker 1: even faster rate; they might start to approach a hyperbolic 199 00:12:41,280 --> 00:12:45,120 Speaker 1: rate of change. But some things experience much slower change. 200 00:12:45,160 --> 00:12:48,320 Speaker 1: They don't advance that quickly, like battery technology does not 201 00:12:48,480 --> 00:12:53,040 Speaker 1: advance nearly as quickly as microprocessor technology.
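As a rough, purely illustrative comparison, and with growth rates invented just for the sketch rather than measured figures for processors or batteries, here is how differently an exponential curve and a roughly linear one behave over the same stretch of time:

# Illustrative only: the doubling period and the annual gain below are made up
# for the comparison; they are not real figures for chips or batteries.
def exponential_factor(years, doubling_period=2):
    """Improvement factor when performance doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

def linear_factor(years, annual_gain=0.05):
    """Improvement factor when capability grows by a fixed five percent of the
    original value each year."""
    return 1 + annual_gain * years

for years in (2, 10, 20):
    print(f"after {years:2d} years: exponential {exponential_factor(years):7.1f}x, linear {linear_factor(years):4.2f}x")
# After 20 years the exponential curve sits around 1024x while the linear one
# is only at 2x, which is why assuming every technology follows the processor
# curve can lead to some very optimistic predictions.

That gap is the whole point: extrapolating one technology's curve onto everything else is where a lot of singularity-style forecasting goes wrong.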
So it appears 202 00:12:53,040 --> 00:12:55,000 Speaker 1: to me that the singularity is going to require a 203 00:12:55,000 --> 00:12:59,280 Speaker 1: lot more than just super fast processors, unless we decide 204 00:12:59,320 --> 00:13:02,320 Speaker 1: to redefine what the singularity is, and then we could 205 00:13:02,320 --> 00:13:05,320 Speaker 1: say we're already in it because we've redefined it, which 206 00:13:05,400 --> 00:13:07,280 Speaker 1: we kind of have done with Moore's law. I guess 207 00:13:07,320 --> 00:13:11,480 Speaker 1: we could do that, but it's not very satisfying. So 208 00:13:11,600 --> 00:13:15,160 Speaker 1: accelerating change is one of those fundamental ideas in tech 209 00:13:15,200 --> 00:13:17,920 Speaker 1: that may or may not apply depending on the tech 210 00:13:18,240 --> 00:13:21,480 Speaker 1: type we're talking about. I would say that for some technologies, 211 00:13:21,600 --> 00:13:26,840 Speaker 1: like autonomous vehicles, for example, we have not seen accelerating change. Instead, 212 00:13:26,880 --> 00:13:32,280 Speaker 1: we saw an initial burst of innovation, incredible innovation, and 213 00:13:32,360 --> 00:13:35,760 Speaker 1: for a few years that continued and it did look 214 00:13:35,800 --> 00:13:39,880 Speaker 1: like it was accelerating change. But now we're seeing engineers 215 00:13:39,920 --> 00:13:44,400 Speaker 1: having to hone in on specific limitations and problems and 216 00:13:44,480 --> 00:13:50,360 Speaker 1: challenges within autonomous vehicle technology, and these require careful solutions, 217 00:13:51,080 --> 00:13:53,560 Speaker 1: and some problems might be accounted for, but lots more 218 00:13:53,840 --> 00:13:56,520 Speaker 1: have been discovered or are not accounted for, and we 219 00:13:56,559 --> 00:14:01,480 Speaker 1: aren't seeing accelerating change in that field anymore, just iterative changes, 220 00:14:02,040 --> 00:14:04,800 Speaker 1: which is still good, like we're still seeing advancements, but 221 00:14:04,840 --> 00:14:09,000 Speaker 1: it's not going at this, you know, breakneck speed that 222 00:14:09,120 --> 00:14:13,080 Speaker 1: accelerating change suggests. Okay, we've got a lot more principles 223 00:14:13,120 --> 00:14:14,960 Speaker 1: to cover. I think it's time for us to take 224 00:14:15,040 --> 00:14:26,360 Speaker 1: a quick break. All right, we're gonna move on to 225 00:14:26,560 --> 00:14:31,600 Speaker 1: a different fundamental principle that Simplicable has identified as being, 226 00:14:31,720 --> 00:14:37,280 Speaker 1: you know, fundamental to technology. And so let's talk about complexity. 227 00:14:37,320 --> 00:14:40,920 Speaker 1: You can think of complexity falling into one of two categories. 228 00:14:41,080 --> 00:14:46,760 Speaker 1: Simplicable calls it essential complexity and then accidental complexity, or 229 00:14:46,760 --> 00:14:50,440 Speaker 1: what I would refer to as non essential complexity. Now, 230 00:14:50,520 --> 00:14:54,880 Speaker 1: if something has essential complexity, that doesn't necessarily mean there's 231 00:14:55,000 --> 00:14:58,480 Speaker 1: no way to simplify the technology. It might be possible 232 00:14:58,480 --> 00:15:02,840 Speaker 1: to simplify it, but it has essential complexity.
That means that 233 00:15:02,920 --> 00:15:06,680 Speaker 1: if we were to try and simplify this technology, to 234 00:15:06,760 --> 00:15:10,600 Speaker 1: streamline it or remove features or anything like that, then 235 00:15:10,880 --> 00:15:14,400 Speaker 1: in the process, we would also reduce the usability or 236 00:15:14,560 --> 00:15:18,760 Speaker 1: value of that technology. So, in other words, we could 237 00:15:18,800 --> 00:15:21,640 Speaker 1: make it less complex, but we would also make it 238 00:15:22,000 --> 00:15:26,800 Speaker 1: less good, or less desirable or less useful. This can 239 00:15:26,960 --> 00:15:30,960 Speaker 1: fall into subjective perceptions. It's not just an objective truth. 240 00:15:31,360 --> 00:15:33,800 Speaker 1: So for example, let's say you've got a team and 241 00:15:33,800 --> 00:15:40,560 Speaker 1: they're developing a smartphone, and their initial lineup of features 242 00:15:41,120 --> 00:15:43,600 Speaker 1: are all the typical ones you would find in a smartphone. 243 00:15:44,400 --> 00:15:46,720 Speaker 1: It'll be able to make calls, it'll be able to 244 00:15:46,800 --> 00:15:50,200 Speaker 1: send and receive emails and text messages. It'll be able 245 00:15:50,240 --> 00:15:54,200 Speaker 1: to take photos, all that kind of stuff. The complexity 246 00:15:54,440 --> 00:15:58,560 Speaker 1: necessitates certain design decisions, right, Like, in order to achieve 247 00:15:58,720 --> 00:16:02,200 Speaker 1: the things you have listed as what the smartphone 248 00:16:02,240 --> 00:16:05,280 Speaker 1: has to do, you've got to make certain design decisions 249 00:16:05,320 --> 00:16:08,840 Speaker 1: to support it. This is all just logical, right? And 250 00:16:08,880 --> 00:16:11,520 Speaker 1: that could cover everything from the size of the 251 00:16:11,520 --> 00:16:13,720 Speaker 1: battery you're gonna need if you want to have a 252 00:16:13,840 --> 00:16:16,960 Speaker 1: useful battery life so the phone lasts at least a day. 253 00:16:17,880 --> 00:16:20,720 Speaker 1: It may involve things like screen resolution. You want to 254 00:16:20,720 --> 00:16:22,880 Speaker 1: make sure that people can see whatever it is that's 255 00:16:22,920 --> 00:16:26,640 Speaker 1: being displayed there, processor power to support all these 256 00:16:26,640 --> 00:16:31,720 Speaker 1: different functions. Like, all these things become necessary considerations in 257 00:16:31,800 --> 00:16:34,400 Speaker 1: order to provide a good experience. But let's say there's 258 00:16:34,440 --> 00:16:37,120 Speaker 1: someone on your team who just doesn't see the value 259 00:16:37,160 --> 00:16:39,680 Speaker 1: in having a camera in a smartphone. Maybe this person 260 00:16:39,720 --> 00:16:43,080 Speaker 1: never uses the camera on their smartphone at all. Maybe 261 00:16:43,120 --> 00:16:45,680 Speaker 1: they just don't take pictures. They don't see it as 262 00:16:45,720 --> 00:16:51,000 Speaker 1: being fundamentally important to a smartphone's design. They could argue 263 00:16:51,080 --> 00:16:55,479 Speaker 1: that you could drastically simplify the design of the smartphone 264 00:16:55,480 --> 00:16:57,960 Speaker 1: if you just ditched the camera, right?
That means you 265 00:16:57,960 --> 00:17:00,440 Speaker 1: get rid of the lenses, you get rid of all 266 00:17:00,440 --> 00:17:03,400 Speaker 1: the sensors, you get rid of all the stuff that 267 00:17:03,560 --> 00:17:05,960 Speaker 1: otherwise would have to be in the smartphone for the 268 00:17:06,000 --> 00:17:08,720 Speaker 1: camera to work. And then you could either make the 269 00:17:08,800 --> 00:17:12,080 Speaker 1: smartphone smaller, you know, create a smaller form factor because 270 00:17:12,480 --> 00:17:16,440 Speaker 1: it no longer has to house those components, or dedicate 271 00:17:16,480 --> 00:17:20,119 Speaker 1: that space for something else. Or this could even precipitate 272 00:17:20,119 --> 00:17:22,760 Speaker 1: changes for things like the battery life and the 273 00:17:22,800 --> 00:17:25,000 Speaker 1: processor, because it no longer would need to support the 274 00:17:25,040 --> 00:17:28,080 Speaker 1: functions of a camera. Like, these are sort of a 275 00:17:28,119 --> 00:17:32,800 Speaker 1: cascading list of decisions that this could affect. However, someone 276 00:17:32,840 --> 00:17:35,680 Speaker 1: else might say, well, this smartphone has no camera. It's 277 00:17:35,720 --> 00:17:38,560 Speaker 1: not a good smartphone. I would argue it's not even 278 00:17:38,560 --> 00:17:40,520 Speaker 1: a smartphone at all because there's no camera in it, 279 00:17:41,320 --> 00:17:46,320 Speaker 1: and to them, the reduction in complexity has resulted in 280 00:17:46,520 --> 00:17:50,920 Speaker 1: a reduction of usability or value. Again, there's no objective 281 00:17:50,960 --> 00:17:54,120 Speaker 1: truth here. It's all dependent upon how you feel about it. 282 00:17:54,640 --> 00:17:57,639 Speaker 1: But there is a general understanding that technologies can only 283 00:17:57,720 --> 00:18:01,760 Speaker 1: be simplified and made more efficient and less clunky up 284 00:18:01,800 --> 00:18:04,240 Speaker 1: to a point, and once you get beyond that point, 285 00:18:04,320 --> 00:18:06,520 Speaker 1: you start to lose whatever it was that makes the 286 00:18:06,560 --> 00:18:09,240 Speaker 1: technology useful in the first place. Now, if we go 287 00:18:09,359 --> 00:18:14,280 Speaker 1: to accidental or non essential complexity, we've got the opposite. 288 00:18:14,560 --> 00:18:20,560 Speaker 1: This describes a technology's tendency to have functions or features 289 00:18:20,840 --> 00:18:25,199 Speaker 1: or elements to it that make it more complex but 290 00:18:25,840 --> 00:18:29,720 Speaker 1: add no extra value to the technology itself, which means 291 00:18:29,880 --> 00:18:32,280 Speaker 1: if you get rid of it, not only do you 292 00:18:32,400 --> 00:18:35,680 Speaker 1: not eliminate something valuable, you might actually increase the value 293 00:18:35,760 --> 00:18:38,000 Speaker 1: of the technology because you've gotten rid of some clutter. 294 00:18:38,520 --> 00:18:41,760 Speaker 1: Now this could be software, it could be hardware. Like, 295 00:18:41,800 --> 00:18:44,320 Speaker 1: it doesn't have to be a physical technology. It could 296 00:18:44,359 --> 00:18:46,960 Speaker 1: be something like a bug in software that, when you 297 00:18:47,040 --> 00:18:50,280 Speaker 1: eliminate it, not only does it make it less complex, 298 00:18:50,359 --> 00:18:53,600 Speaker 1: but now the software works more effectively. So you've increased 299 00:18:54,240 --> 00:18:57,399 Speaker 1: the value of the software.
Even if you were to 300 00:18:57,520 --> 00:19:01,399 Speaker 1: just argue that eliminating the bug reduces the size, like 301 00:19:01,440 --> 00:19:04,080 Speaker 1: the data size of the software itself, you have increased 302 00:19:04,080 --> 00:19:08,160 Speaker 1: its value, right? Because size is not an infinite 303 00:19:08,200 --> 00:19:12,320 Speaker 1: resource that we have. Like, your machines that run software 304 00:19:12,400 --> 00:19:15,640 Speaker 1: have a limitation on how much they can handle. And 305 00:19:16,040 --> 00:19:19,520 Speaker 1: if you start to reduce the demands of software by 306 00:19:19,560 --> 00:19:24,000 Speaker 1: eliminating bugs, that increases that software's value, maybe not monetarily, 307 00:19:24,080 --> 00:19:30,080 Speaker 1: but certainly from a process standpoint. So, with essential complexity, 308 00:19:30,119 --> 00:19:34,720 Speaker 1: reducing complexity reduces the tech's value. With accidental complexity, reducing 309 00:19:34,720 --> 00:19:39,480 Speaker 1: complexity increases the technology's value. Now there's a related tech 310 00:19:39,520 --> 00:19:42,040 Speaker 1: issue I would like to kind of dip my toe 311 00:19:42,080 --> 00:19:46,560 Speaker 1: in and mention here. That's called feature creep. Now, this 312 00:19:46,640 --> 00:19:49,760 Speaker 1: is when a team is building out a technology and 313 00:19:49,800 --> 00:19:52,479 Speaker 1: then they begin to add in features that were not 314 00:19:52,600 --> 00:19:56,800 Speaker 1: included in the original plan for the tech. You know, oh, 315 00:19:56,840 --> 00:19:59,320 Speaker 1: what if we were to add neon lights, or what 316 00:19:59,400 --> 00:20:01,920 Speaker 1: about we put speakers on the outside of the car? 317 00:20:02,600 --> 00:20:06,480 Speaker 1: Feature creep happens a lot in the tech space. It can 318 00:20:06,600 --> 00:20:09,920 Speaker 1: again happen in hardware and in software. It can also 319 00:20:09,920 --> 00:20:13,879 Speaker 1: get to a point where it will doom a project 320 00:20:13,960 --> 00:20:16,760 Speaker 1: or at the very least delay it for ages and potentially 321 00:20:17,840 --> 00:20:20,520 Speaker 1: mean that you end up with something that is less 322 00:20:20,640 --> 00:20:24,840 Speaker 1: valuable because of all that feature creep, which ties into 323 00:20:24,880 --> 00:20:30,159 Speaker 1: another fundamental principle that Simplicable identifies where they say worse 324 00:20:30,359 --> 00:20:33,399 Speaker 1: is better. By that, they don't mean that if a 325 00:20:33,440 --> 00:20:36,919 Speaker 1: technology is worse, it's better than a better technology. But 326 00:20:37,520 --> 00:20:41,800 Speaker 1: sometimes a technology that has fewer functions but works really 327 00:20:41,840 --> 00:20:45,160 Speaker 1: well is going to be viewed as more valuable than 328 00:20:45,760 --> 00:20:48,480 Speaker 1: a device that has way more functions but is harder 329 00:20:48,520 --> 00:20:51,480 Speaker 1: to use, which makes sense, right?
If you make something 330 00:20:51,560 --> 00:20:53,800 Speaker 1: that is easy to use and it does what it's 331 00:20:53,840 --> 00:20:56,760 Speaker 1: supposed to do, then people are going to gravitate toward 332 00:20:56,800 --> 00:21:00,359 Speaker 1: that more than they will toward technologies that might have 333 00:21:00,680 --> 00:21:02,600 Speaker 1: a lot more bells and whistles but don't do 334 00:21:02,600 --> 00:21:06,320 Speaker 1: anything particularly well. That doesn't typically stand the test of 335 00:21:06,359 --> 00:21:09,320 Speaker 1: time well. Feature creep plays into both of these things here, 336 00:21:09,840 --> 00:21:14,920 Speaker 1: both the complexity issue and the worse is better issue. Now, 337 00:21:14,960 --> 00:21:18,040 Speaker 1: to me, the definitive example of feature creep, the one 338 00:21:18,080 --> 00:21:20,800 Speaker 1: I would use if I were doing my TED talk 339 00:21:20,880 --> 00:21:24,439 Speaker 1: on what feature creep is and why it's bad, would 340 00:21:24,480 --> 00:21:29,200 Speaker 1: be the game Duke Nukem Forever. It was an infamous 341 00:21:29,240 --> 00:21:32,119 Speaker 1: game while it was in development, and once it was published 342 00:21:32,160 --> 00:21:34,399 Speaker 1: it was no longer infamous. It just kind of became 343 00:21:34,720 --> 00:21:38,600 Speaker 1: a bit of a punching bag, or was sometimes just completely dismissed. 344 00:21:39,080 --> 00:21:42,080 Speaker 1: So if you're not familiar with the Duke Nukem franchise, 345 00:21:42,200 --> 00:21:48,560 Speaker 1: it follows this overly macho male hero who's based a 346 00:21:48,600 --> 00:21:53,360 Speaker 1: lot on characters that Arnold Schwarzenegger has played or Bruce Campbell. 347 00:21:53,480 --> 00:21:56,960 Speaker 1: In fact, it lifts a lot of lines straight from 348 00:21:57,840 --> 00:22:01,880 Speaker 1: the Evil Dead movies, the Bruce Campbell Evil Dead movies, when 349 00:22:01,880 --> 00:22:04,800 Speaker 1: they were really campy and stuff. And it's a first 350 00:22:04,840 --> 00:22:07,320 Speaker 1: person shooter game that a company called three D Realms 351 00:22:07,400 --> 00:22:10,760 Speaker 1: originally announced in nineteen ninety seven. Now, keep in mind, 352 00:22:10,800 --> 00:22:14,200 Speaker 1: when studios announce a game, it typically means that they've 353 00:22:14,200 --> 00:22:16,959 Speaker 1: already been working on it for a while, so they 354 00:22:17,000 --> 00:22:20,359 Speaker 1: announced it in nineteen ninety seven, but the game wouldn't 355 00:22:20,400 --> 00:22:26,119 Speaker 1: actually come out until two thousand and eleven. Now, video 356 00:22:26,160 --> 00:22:29,760 Speaker 1: games can take a long time in development, but fourteen 357 00:22:29,880 --> 00:22:34,280 Speaker 1: years is atypical, although Star Citizen might catch up in 358 00:22:34,320 --> 00:22:36,880 Speaker 1: a couple of years if they don't have a full 359 00:22:37,000 --> 00:22:41,640 Speaker 1: game released before then. And one of the many problems 360 00:22:41,840 --> 00:22:45,879 Speaker 1: that was causing a lot of these delays was feature creep. See.
361 00:22:45,920 --> 00:22:48,479 Speaker 1: While the team at three D Realms was working on 362 00:22:48,520 --> 00:22:52,320 Speaker 1: the game, other companies were releasing updated game engines that 363 00:22:52,400 --> 00:22:57,520 Speaker 1: supported more features, so you could build on the game 364 00:22:57,560 --> 00:23:01,440 Speaker 1: engine you were depending upon already and continue building out 365 00:23:01,440 --> 00:23:04,359 Speaker 1: your game. But the fear was that when Duke Nukem Forever 366 00:23:04,440 --> 00:23:07,960 Speaker 1: would release, it would look dated against games that were 367 00:23:08,200 --> 00:23:13,479 Speaker 1: created on the more recent game engines. So you get 368 00:23:13,600 --> 00:23:16,840 Speaker 1: the head of the project who suddenly demands that the 369 00:23:16,880 --> 00:23:20,560 Speaker 1: team swaps to a different game engine, a more recent one, 370 00:23:21,200 --> 00:23:25,240 Speaker 1: and that necessitates starting from scratch for most aspects of 371 00:23:25,320 --> 00:23:27,600 Speaker 1: the game, like almost all the assets needed to be 372 00:23:27,720 --> 00:23:30,760 Speaker 1: redone in order to work with this new game engine, 373 00:23:30,800 --> 00:23:33,480 Speaker 1: and it sets the entire development process back to the beginning. 374 00:23:34,400 --> 00:23:36,480 Speaker 1: Then you would have times where the leader of the 375 00:23:36,520 --> 00:23:39,199 Speaker 1: project would want capabilities that were starting to show up 376 00:23:39,240 --> 00:23:43,240 Speaker 1: in other games to then be incorporated into Duke Nukem Forever, 377 00:23:43,359 --> 00:23:47,359 Speaker 1: when that had not been a consideration earlier in development. 378 00:23:47,760 --> 00:23:51,960 Speaker 1: So as a result, the game was constantly going through development, 379 00:23:52,480 --> 00:23:57,000 Speaker 1: then revisions, and sometimes complete restarts, and by the end 380 00:23:57,040 --> 00:23:59,359 Speaker 1: of it all, I think it was pretty safe to 381 00:23:59,440 --> 00:24:02,880 Speaker 1: argue that the game was filled with non essential complexity. 382 00:24:03,200 --> 00:24:06,800 Speaker 1: And of course, by the time it finally published, under 383 00:24:06,800 --> 00:24:10,800 Speaker 1: a different company at that point because it had changed hands, 384 00:24:11,520 --> 00:24:14,520 Speaker 1: it was no longer really a relevant game to the 385 00:24:14,600 --> 00:24:18,119 Speaker 1: sensibilities of most gamers. You know, even if the 386 00:24:18,160 --> 00:24:22,040 Speaker 1: gameplay had been stellar and free of things like bugs 387 00:24:22,080 --> 00:24:25,960 Speaker 1: and other issues, the tone of the game no longer 388 00:24:26,000 --> 00:24:30,520 Speaker 1: fit what people wanted anymore because more than ten years 389 00:24:30,520 --> 00:24:33,639 Speaker 1: had gone by since the game had been announced, and 390 00:24:33,720 --> 00:24:39,399 Speaker 1: people's tastes in gameplay and game tone had changed in 391 00:24:39,480 --> 00:24:44,120 Speaker 1: that time. So feature creep really was a huge problem 392 00:24:44,240 --> 00:24:48,960 Speaker 1: for that game. Well, here's another concept that Simplicable includes 393 00:24:49,000 --> 00:24:54,240 Speaker 1: as a fundamental principle of technology, the creativity of constraints. 394 00:24:54,320 --> 00:24:58,800 Speaker 1: This one really speaks to me.
Basically, this idea says 395 00:24:59,320 --> 00:25:03,840 Speaker 1: it is much easier to be creative and innovative when 396 00:25:03,840 --> 00:25:09,320 Speaker 1: you're working within some form of constraint, because constraints drive decisions, 397 00:25:09,960 --> 00:25:13,520 Speaker 1: and if you are without constraint, you have no limiting 398 00:25:13,560 --> 00:25:16,440 Speaker 1: factors that make it necessary to decide to go one 399 00:25:16,480 --> 00:25:20,520 Speaker 1: way versus another. Any decision you make appears to be 400 00:25:20,960 --> 00:25:25,160 Speaker 1: as valid and viable as any other decision you could make, 401 00:25:25,240 --> 00:25:29,000 Speaker 1: because there's nothing pushing back against you. And that can 402 00:25:29,160 --> 00:25:31,680 Speaker 1: mean you just end up spinning your wheels a lot 403 00:25:31,720 --> 00:25:35,560 Speaker 1: and you don't really make any progress. Constraints can be 404 00:25:35,600 --> 00:25:39,160 Speaker 1: pretty much anything. Budget is probably the big one, right? 405 00:25:39,280 --> 00:25:42,400 Speaker 1: Usually you're working within some sort of budget, and that 406 00:25:42,440 --> 00:25:46,080 Speaker 1: means, at least in theory, you can't make decisions that 407 00:25:46,119 --> 00:25:49,800 Speaker 1: would require more money than the budget allows. Obviously, lots 408 00:25:49,800 --> 00:25:52,760 Speaker 1: of projects go over budget, but the budget is meant 409 00:25:52,800 --> 00:25:57,760 Speaker 1: to serve as that constraint. A deadline is another constraint 410 00:25:57,840 --> 00:26:00,000 Speaker 1: you might face. You know, you have to finish your 411 00:26:00,080 --> 00:26:03,280 Speaker 1: task by some appointed time. But there can be lots 412 00:26:03,280 --> 00:26:06,520 Speaker 1: of other types of constraints, even in tech. Some of 413 00:26:06,560 --> 00:26:10,399 Speaker 1: them could be technological constraints, like it's just physically not 414 00:26:10,520 --> 00:26:14,000 Speaker 1: possible for you to go beyond a certain level of 415 00:26:14,040 --> 00:26:17,600 Speaker 1: performance because of the limitations of technology. There can be 416 00:26:17,640 --> 00:26:22,679 Speaker 1: material constraints. Maybe, you know, you can't go further in 417 00:26:22,760 --> 00:26:26,800 Speaker 1: any particular direction because the materials that you can use 418 00:26:27,040 --> 00:26:29,159 Speaker 1: have their own physical limitations, and if you were to 419 00:26:29,160 --> 00:26:33,120 Speaker 1: try and push beyond that, you would break the device 420 00:26:33,200 --> 00:26:36,240 Speaker 1: or whatever it might be. You can have social constraints; 421 00:26:36,920 --> 00:26:39,600 Speaker 1: maybe there are things that are not socially acceptable that 422 00:26:39,680 --> 00:26:44,119 Speaker 1: you back away from as you're making decisions about, you know, 423 00:26:44,160 --> 00:26:47,320 Speaker 1: this technology. There could be legal constraints. Maybe there are 424 00:26:47,640 --> 00:26:52,840 Speaker 1: regulations or laws that mean that you can't do certain things, 425 00:26:52,840 --> 00:26:54,760 Speaker 1: and that means you have to come up with creative 426 00:26:54,760 --> 00:26:58,480 Speaker 1: solutions to get your technology to work properly while still 427 00:26:58,480 --> 00:27:02,000 Speaker 1: being within that legal framework. So with constraints, you end 428 00:27:02,080 --> 00:27:05,480 Speaker 1: up saying, I need to do X, that's my goal.
429 00:27:05,920 --> 00:27:09,080 Speaker 1: But meanwhile, A, B, and C are all in my way, 430 00:27:09,520 --> 00:27:12,440 Speaker 1: So how can I achieve my goal? And you start 431 00:27:12,480 --> 00:27:15,640 Speaker 1: problem solving, and the problem solving shapes not just how 432 00:27:15,680 --> 00:27:20,000 Speaker 1: you get around those challenges. That problem solving actually shapes 433 00:27:20,040 --> 00:27:23,119 Speaker 1: the end product itself. The thing you make in part 434 00:27:23,680 --> 00:27:27,440 Speaker 1: is a reflection of the constraints that you encountered while 435 00:27:27,440 --> 00:27:30,320 Speaker 1: you were making it. But if you don't have constraints, 436 00:27:30,960 --> 00:27:33,240 Speaker 1: then you don't have those guidelines, right? You don't have 437 00:27:33,280 --> 00:27:36,720 Speaker 1: anything to push against, no hard edges that you're gonna 438 00:27:36,760 --> 00:27:39,680 Speaker 1: bump up against and have to work around, and honestly, 439 00:27:39,760 --> 00:27:43,840 Speaker 1: that can stifle creativity. That particular concept really rings out 440 00:27:43,840 --> 00:27:47,960 Speaker 1: to me because I've encountered it myself when I've played 441 00:27:48,080 --> 00:27:51,399 Speaker 1: certain types of video games, right? Like, there are big, 442 00:27:51,720 --> 00:27:56,880 Speaker 1: open world video games that are exploration based and there's 443 00:27:56,960 --> 00:27:59,920 Speaker 1: very little direction, and that can feel too vast. I 444 00:28:00,080 --> 00:28:02,439 Speaker 1: feel like I'm not really doing anything. And then there 445 00:28:02,440 --> 00:28:06,000 Speaker 1: are smaller, more modest games. They might be much more 446 00:28:06,040 --> 00:28:10,080 Speaker 1: directed or have very defined goals that you need to achieve, 447 00:28:10,560 --> 00:28:13,399 Speaker 1: and those really resonate with me because I feel like 448 00:28:13,440 --> 00:28:16,119 Speaker 1: I'm making progress as I play it. Now, this is 449 00:28:16,160 --> 00:28:18,959 Speaker 1: not to say the big open world games with very 450 00:28:19,000 --> 00:28:22,320 Speaker 1: little direction are bad. They're not bad, and the people 451 00:28:22,320 --> 00:28:24,560 Speaker 1: who love them are not bad people. They're not wrong 452 00:28:24,680 --> 00:28:28,280 Speaker 1: for loving them. It's just something that fundamentally doesn't work 453 00:28:28,280 --> 00:28:31,280 Speaker 1: with the way my brain works. And so for me, 454 00:28:31,440 --> 00:28:35,679 Speaker 1: those constraints really are important because they provide structure, and 455 00:28:35,760 --> 00:28:38,120 Speaker 1: with that structure, I can then feel if I'm doing 456 00:28:38,160 --> 00:28:41,320 Speaker 1: well or not well. Without structure, I don't know that, 457 00:28:41,680 --> 00:28:45,080 Speaker 1: and then I start tumbling into an existential crisis. And y'all, 458 00:28:45,120 --> 00:28:47,720 Speaker 1: you've heard enough episodes of this show. You know nobody 459 00:28:47,760 --> 00:28:51,840 Speaker 1: wants that. Okay, we're going to come back in just 460 00:28:51,920 --> 00:28:55,880 Speaker 1: a moment and talk about one other principle that Simplicable identifies. 461 00:28:56,160 --> 00:28:58,720 Speaker 1: Keeping in mind they have others beyond the ones I've 462 00:28:58,760 --> 00:29:03,480 Speaker 1: just mentioned, and we'll finish off this episode from there.
463 00:29:03,520 --> 00:29:15,640 Speaker 1: But first, let's take another quick break to thank our sponsors. Now, 464 00:29:15,680 --> 00:29:17,640 Speaker 1: as I mentioned at the top of this episode, I 465 00:29:17,680 --> 00:29:21,360 Speaker 1: was just kind of surfing around the web, which dates 466 00:29:21,400 --> 00:29:24,400 Speaker 1: me, right, using terminology like that. That's fine. I'm an 467 00:29:24,400 --> 00:29:27,840 Speaker 1: old man, I get it, and I found this Simplicable 468 00:29:27,880 --> 00:29:30,520 Speaker 1: blog which I had never seen before, and I started 469 00:29:30,560 --> 00:29:34,280 Speaker 1: reading this article about the different fundamental principles of technology, 470 00:29:34,320 --> 00:29:36,520 Speaker 1: and this one also really stood out to me because 471 00:29:36,560 --> 00:29:41,560 Speaker 1: I think it's one that we can easily contextualize right 472 00:29:41,600 --> 00:29:44,239 Speaker 1: now based upon things that are playing out at this 473 00:29:44,360 --> 00:29:48,600 Speaker 1: very moment. And that is the principle of cultural lag. 474 00:29:49,160 --> 00:29:52,440 Speaker 1: And again, these are things that surround technology 475 00:29:52,440 --> 00:29:57,000 Speaker 1: and technological development. Right? We're not talking about the actual 476 00:29:57,280 --> 00:30:01,000 Speaker 1: things that make the technology work. So what is cultural lag? Well, 477 00:30:01,040 --> 00:30:03,760 Speaker 1: it's pretty much what it sounds like. It's essentially when 478 00:30:04,040 --> 00:30:09,760 Speaker 1: technology outpaces society in some way. Technology, or the things 479 00:30:09,760 --> 00:30:14,200 Speaker 1: that the technology introduces, like the possibilities the technology creates, 480 00:30:14,680 --> 00:30:18,360 Speaker 1: are ones that society lacks the facility to handle. So 481 00:30:18,440 --> 00:30:21,440 Speaker 1: I would argue, right now, we're really seeing this with 482 00:30:21,560 --> 00:30:25,680 Speaker 1: generative AI and then artificial intelligence in general, because keep 483 00:30:25,720 --> 00:30:30,920 Speaker 1: in mind, generative AI is one application of artificial intelligence. 484 00:30:31,560 --> 00:30:35,000 Speaker 1: Generative AI is a type of AI, but not all 485 00:30:35,040 --> 00:30:39,320 Speaker 1: AI is generative AI. Right, all cats are mammals, but 486 00:30:39,360 --> 00:30:44,680 Speaker 1: not all mammals are cats. So generative AI has some 487 00:30:44,720 --> 00:30:48,560 Speaker 1: potential applications that society is just not prepared to handle. 488 00:30:49,160 --> 00:30:54,720 Speaker 1: Everything from copyright infringement, to plagiarism, to the capacity to 489 00:30:54,800 --> 00:31:00,280 Speaker 1: generate and disseminate misinformation. All of these are issues with 490 00:31:00,360 --> 00:31:03,600 Speaker 1: generative AI that we just don't have the ability to handle, 491 00:31:04,160 --> 00:31:07,480 Speaker 1: or even being able to differentiate between something that was 492 00:31:07,600 --> 00:31:11,000 Speaker 1: created by generative AI versus something that was created by 493 00:31:11,040 --> 00:31:14,480 Speaker 1: a person. We're not really able to handle that either. 494 00:31:15,400 --> 00:31:18,920 Speaker 1: And again, artificial intelligence in general also falls into the 495 00:31:18,960 --> 00:31:22,560 Speaker 1: category of a technology that we have, you know, cultural 496 00:31:22,640 --> 00:31:27,400 Speaker 1: lag associated with it.
So often I talk about this 497 00:31:28,080 --> 00:31:32,840 Speaker 1: within the context of legislation as various politicians and leaders 498 00:31:32,880 --> 00:31:37,800 Speaker 1: around the world struggle with the challenges created by technological 499 00:31:37,880 --> 00:31:41,200 Speaker 1: innovation: how can they take advantage of that innovation, 500 00:31:41,760 --> 00:31:45,360 Speaker 1: how can they try not to stifle innovation, but at 501 00:31:45,400 --> 00:31:47,880 Speaker 1: the same time, how can they protect the people and 502 00:31:48,000 --> 00:31:53,120 Speaker 1: institutions of a country from harm based upon what this 503 00:31:53,200 --> 00:31:56,600 Speaker 1: technological innovation can do? And we're even still seeing it 504 00:31:56,640 --> 00:31:59,880 Speaker 1: here in the United States with regard to like base 505 00:32:00,240 --> 00:32:03,480 Speaker 1: principles of what the Internet in general and the Web 506 00:32:03,520 --> 00:32:06,680 Speaker 1: in particular allow, right? I mean, that's why we get 507 00:32:06,800 --> 00:32:09,920 Speaker 1: arguments about stuff like Section two thirty here in the 508 00:32:10,000 --> 00:32:14,280 Speaker 1: United States. There's a cultural lag that is significant because, 509 00:32:14,320 --> 00:32:16,480 Speaker 1: keep in mind Section two thirty. Section two thirty is 510 00:32:16,520 --> 00:32:21,160 Speaker 1: what protects online platforms from being held liable for the 511 00:32:21,280 --> 00:32:25,880 Speaker 1: content that users post to those platforms. Right, Like, if 512 00:32:25,880 --> 00:32:30,280 Speaker 1: you were solely responsible for the content that goes up 513 00:32:30,320 --> 00:32:32,280 Speaker 1: on a website, all the content that goes up on 514 00:32:32,280 --> 00:32:35,160 Speaker 1: the website comes from you, and you start posting stuff 515 00:32:35,160 --> 00:32:38,720 Speaker 1: that's illegal, well, logic dictates you should be able to 516 00:32:38,720 --> 00:32:43,520 Speaker 1: be held accountable for posting illegal material. You posted it, 517 00:32:43,640 --> 00:32:46,600 Speaker 1: you created it, you posted it, you're the one responsible. 518 00:32:47,400 --> 00:32:50,680 Speaker 1: But if instead you create a website that allows anyone 519 00:32:50,800 --> 00:32:54,120 Speaker 1: to post there, and some other person you've never heard of, 520 00:32:54,600 --> 00:32:57,360 Speaker 1: you don't know them, you've never met them, they come 521 00:32:57,400 --> 00:33:00,480 Speaker 1: to your website and they post something illegal, Section two 522 00:33:00,520 --> 00:33:03,760 Speaker 1: thirty would protect you from being held accountable for the 523 00:33:03,760 --> 00:33:06,680 Speaker 1: thing that this other person did. You provided the space, 524 00:33:07,000 --> 00:33:10,360 Speaker 1: but you didn't create the content. And Section two thirty 525 00:33:10,400 --> 00:33:13,400 Speaker 1: itself has got some limitations. You're supposed to at least 526 00:33:13,440 --> 00:33:18,120 Speaker 1: put forth reasonable effort to remove illegal material, or else 527 00:33:18,160 --> 00:33:21,320 Speaker 1: you can lose the protection of Section two thirty. Anyway, 528 00:33:21,360 --> 00:33:23,160 Speaker 1: all of this was worked out as part of the 529 00:33:23,160 --> 00:33:28,840 Speaker 1: Communications Decency Act of nineteen ninety six, the year before 530 00:33:29,040 --> 00:33:33,760 Speaker 1: three D Realms announced the development of Duke Nukem Forever.
531 00:33:34,440 --> 00:33:38,000 Speaker 1: So it was nineteen ninety six when Section two thirty 532 00:33:38,080 --> 00:33:43,400 Speaker 1: was first written into law, and we're still struggling with 533 00:33:43,480 --> 00:33:46,560 Speaker 1: it today. You still have people on either side of 534 00:33:46,680 --> 00:33:51,480 Speaker 1: the political ideologies who want to either eliminate Section two 535 00:33:51,560 --> 00:33:57,640 Speaker 1: thirty or to amend it significantly for very different ideological reasons. 536 00:33:57,640 --> 00:34:00,400 Speaker 1: But you know, there's this agreement on both sides 537 00:34:00,440 --> 00:34:05,600 Speaker 1: that it's not what they want, and that shows a 538 00:34:05,600 --> 00:34:08,400 Speaker 1: cultural lag that's really significant. I mean, we tried to 539 00:34:09,160 --> 00:34:14,160 Speaker 1: acknowledge it back in ninety six, and more than 540 00:34:14,200 --> 00:34:16,919 Speaker 1: two and a half decades later, 541 00:34:17,080 --> 00:34:21,719 Speaker 1: we're still trying to grapple with it. That's a significant 542 00:34:21,760 --> 00:34:26,040 Speaker 1: cultural lag. Now, there are several other topics that Simplicable 543 00:34:26,120 --> 00:34:29,640 Speaker 1: lists as foundational principles of technology. I'm not sure that 544 00:34:29,719 --> 00:34:33,359 Speaker 1: I agree with that classification in every case. I think 545 00:34:33,400 --> 00:34:37,640 Speaker 1: they are things that relate to technology and are to 546 00:34:37,800 --> 00:34:40,560 Speaker 1: varying degrees important. I don't know if I would call 547 00:34:40,600 --> 00:34:45,160 Speaker 1: them fundamental principles. However, I think all of the ideas 548 00:34:45,160 --> 00:34:48,840 Speaker 1: are well worth discussing, and I will likely do another 549 00:34:48,920 --> 00:34:51,359 Speaker 1: episode on this in the future because I find it 550 00:34:51,480 --> 00:34:56,920 Speaker 1: really interesting to think about these concepts and observations and 551 00:34:56,960 --> 00:35:01,680 Speaker 1: how they interact with our approach to technology. And 552 00:35:01,760 --> 00:35:04,040 Speaker 1: there are tons more that they list, so we'll 553 00:35:04,040 --> 00:35:07,960 Speaker 1: get to those in another episode. If you'd like to 554 00:35:08,000 --> 00:35:10,200 Speaker 1: read up on them, by the way, and just to 555 00:35:10,280 --> 00:35:16,080 Speaker 1: see what Simplicable lists as fundamental principles of technology, as 556 00:35:16,120 --> 00:35:19,439 Speaker 1: well as all the other stuff that's on Simplicable, the URL 557 00:35:19,560 --> 00:35:23,040 Speaker 1: is just simplicable dot com. That's s I M P 558 00:35:23,480 --> 00:35:27,799 Speaker 1: L I C A B L E dot com. Now, 559 00:35:28,239 --> 00:35:31,560 Speaker 1: full disclosure, I do not have any connection to that site. 560 00:35:31,760 --> 00:35:34,640 Speaker 1: I didn't know it existed before today. I don't know 561 00:35:34,680 --> 00:35:37,520 Speaker 1: anyone who writes for it. I just stumbled across it 562 00:35:37,560 --> 00:35:41,000 Speaker 1: by chance and thought that the pages about, you know, 563 00:35:41,040 --> 00:35:44,719 Speaker 1: these foundational principles of technology were really interesting, and there's 564 00:35:44,760 --> 00:35:46,440 Speaker 1: tons more on the site as well, so if you're 565 00:35:46,440 --> 00:35:49,560 Speaker 1: so inclined, you should check it out.
I'm very thankful 566 00:35:49,560 --> 00:35:51,400 Speaker 1: that I came across them because it gave me a 567 00:35:51,440 --> 00:35:54,399 Speaker 1: lot to think about. And as I said, I don't 568 00:35:54,440 --> 00:35:58,360 Speaker 1: agree with all the conclusions made by Simplicable, but I 569 00:35:58,360 --> 00:36:02,560 Speaker 1: think it ends up being kind of fine details that 570 00:36:02,719 --> 00:36:09,759 Speaker 1: are arguably subjective. So it could just be because my 571 00:36:09,800 --> 00:36:13,319 Speaker 1: point of view is slightly different, but it ultimately may 572 00:36:13,400 --> 00:36:15,640 Speaker 1: mean that we're both arguing the same thing, we're just 573 00:36:15,680 --> 00:36:20,200 Speaker 1: doing it in slightly different terms. So again, no slight on Simplicable. 574 00:36:20,239 --> 00:36:23,480 Speaker 1: I think the goal is really an ideal one. They 575 00:36:23,560 --> 00:36:31,640 Speaker 1: want to produce informative, straightforward and objective information to help 576 00:36:31,760 --> 00:36:33,920 Speaker 1: educate people, which I think is a great thing to do. 577 00:36:34,840 --> 00:36:38,879 Speaker 1: Certainly I strive to do some of those things, but 578 00:36:38,920 --> 00:36:40,919 Speaker 1: not all of them, at least not all the time. 579 00:36:41,520 --> 00:36:44,080 Speaker 1: All right. Well, that's it for this episode. Like I said, 580 00:36:44,080 --> 00:36:47,520 Speaker 1: I'll do another one coming up. Also, I've got a 581 00:36:47,520 --> 00:36:50,000 Speaker 1: lot of travel coming up in the near future where 582 00:36:50,040 --> 00:36:54,480 Speaker 1: I'll be recording remotely. Got a really exciting opportunity to 583 00:36:54,560 --> 00:36:59,040 Speaker 1: record an interview in a studio that's in Las Vegas, 584 00:36:59,719 --> 00:37:02,480 Speaker 1: which I will be doing pretty soon, so be on 585 00:37:02,600 --> 00:37:05,920 Speaker 1: the lookout or listen out for those. We've got some 586 00:37:05,920 --> 00:37:08,520 Speaker 1: more episodes of Smart Talks with IBM that are going 587 00:37:08,560 --> 00:37:12,160 Speaker 1: to publish in the near future in this feed, and 588 00:37:12,239 --> 00:37:16,400 Speaker 1: also an episode of The Restless Ones, the show that 589 00:37:16,520 --> 00:37:21,840 Speaker 1: I host where I talk with various chief officers, usually 590 00:37:21,960 --> 00:37:25,760 Speaker 1: CIOs or CTOs of companies, to kind of get insight 591 00:37:25,880 --> 00:37:30,120 Speaker 1: into their leadership process and their approach to technology. We'll 592 00:37:30,160 --> 00:37:33,360 Speaker 1: have one of those episodes published in this feed in 593 00:37:33,400 --> 00:37:35,880 Speaker 1: the not too distant future as well, so you can 594 00:37:35,920 --> 00:37:38,840 Speaker 1: hear my other work besides the stuff that I do 595 00:37:38,920 --> 00:37:43,560 Speaker 1: here for TechStuff. I'm still the same doofus no 596 00:37:43,600 --> 00:37:46,479 Speaker 1: matter where you put me, so no fear there, 597 00:37:47,320 --> 00:37:48,759 Speaker 1: but I wanted to give you the heads up on that.
598 00:37:49,680 --> 00:37:52,759 Speaker 1: And yeah, this month, at least the back 599 00:37:52,760 --> 00:37:55,080 Speaker 1: half of this month, has turned into something that's far 600 00:37:55,200 --> 00:37:58,880 Speaker 1: busier than I had originally anticipated when the month started, 601 00:37:59,280 --> 00:38:01,240 Speaker 1: so I wanted to just kind of give a shout 602 00:38:01,239 --> 00:38:02,960 Speaker 1: out and make you all aware of what was going on. 603 00:38:03,400 --> 00:38:07,480 Speaker 1: All right, that's it. I'm getting out of here. I 604 00:38:07,520 --> 00:38:10,080 Speaker 1: hope you are all well, and I'll talk to you 605 00:38:10,120 --> 00:38:20,840 Speaker 1: again really soon. TechStuff is an iHeartRadio production. For 606 00:38:20,960 --> 00:38:25,799 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 607 00:38:25,920 --> 00:38:31,759 Speaker 1: or wherever you listen to your favorite shows.