Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? Here in the United States, we are celebrating Labor Day, a federal holiday here in the States. It's also one that I find really interesting, because it's all about celebrating the American labor movement. But if you pay attention, especially in the tech space, there are a lot of entities out there, a lot of companies, that are eagerly opposing the labor movement and trying to do things like prevent workers from organizing and forming unions. Yeah, it's a complicated thing: we have a holiday to celebrate it, and yet we have plenty of examples of companies and organizations out there dedicated to preventing more labor organization from happening. But let's put all of that aside. Since we're on holiday today, I wanted to make sure that you had an episode anyway. So we're actually going to listen to an episode that came out a few years ago, back on July thirteenth, twenty twenty.
In fact, it is titled The Robots Are Coming for Your Job, and it's all about robots and automated systems and the anxiety that exists around this idea of automation eliminating jobs, which I think has only become even more of a talking point in the wake of things like generative AI, for example. So let's listen to this episode from twenty twenty, The Robots Are Coming for Your Job, and I'll chat with you again at the end. I read a lot of tech news, and sometimes that ends up inspiring me to do an episode of TechStuff. That happened to me recently when I read this headline off of the website TechSpot: "Sony factory assembles PS four in thirty seconds, only four humans involved in the process." A PS four, in case you're not aware, is a PlayStation four video game console. So this factory can build a video game console from parts in half a minute, and only four human beings touch the ding dang thing in the process. Those four humans, by the way, are involved in the beginning and the end of the process.
Two of them load motherboards onto the assembly line (a motherboard is the primary circuit board for a computer system), and the other two human beings are at the end of the assembly line, where their job is to package the completed consoles. All the actual assembly work is done by robots. Now, you may be experiencing a couple of different responses to this information. I know I did. One of those was: wow, that's seriously impressive. The PS four, like many computer systems, has a lot of components, many of which attach to one another by wire or cable. So these robots have to be able to take these flexible components and join them at their proper anchor points with the appropriate amount of pressure and precision to make a good connection. Now, if any of you out there have ever built your own PC, you know that plugging cables in can get a little tricky depending on the layout of the motherboard and the various components.
And if you're someone like me, you're likely putting stuff together only to realize that maybe you should have done some of that before you mounted the components in a computer case, because now you just don't have the space to work in properly. So it's pretty darn impressive that robots can do this consistently and correctly at that level of speed. Another response I had was: that's kind of scary. I mean, typically you would have dozens of people employed on the assembly line to do this sort of work, but in this factory it's been stripped down to thirty two robots and four human beings. The article in TechSpot points out that twenty six of those thirty two robots are just attaching flexible components together inside the console. Now, I have no idea how much these robots cost, but I'd wager that they are expensive enough to equal the salary of a standard human employee on the assembly line. However, you don't pay robots.
You do have to spend money to maintain and repair them, but assuming whatever you're making is going to be around for a little while, they'll pay for themselves, because eventually you'll get to a point where the salaries you'd be paying the humans would be more than the purchase and maintenance cost of the robots. And the increase in efficiency means you can produce a whole lot more stuff in a given amount of time than you would with a human-centric assembly line, so you'll have more product to sell in a shorter amount of time. When you start crunching numbers, you discover your robotic assembly line can make more stuff at a lower cost over a given period of time, like, you know, over a couple of years, than what you would accomplish with human beings on that assembly line. And you don't have to worry about the robots taking a vacation. They don't take sick time, they don't even take the night off; they can work around the clock. They don't need health insurance, though I would guess that most companies insure the heck out of these things just in case one breaks down.
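That break-even reasoning can be made concrete with a quick back-of-the-envelope calculation. To be clear, this is just an illustrative sketch: the dollar figures below are invented for the example, not numbers from the episode.

```python
# Hypothetical break-even estimate for replacing an assembly-line worker
# with a robot. All figures are made-up illustrative numbers.

def breakeven_years(robot_price, robot_annual_maintenance,
                    human_annual_salary):
    """Years until the robot's cumulative cost drops below the cumulative
    cost of the worker it replaces (one robot per worker assumed)."""
    # Each year the robot "saves" the salary minus its upkeep.
    annual_savings = human_annual_salary - robot_annual_maintenance
    if annual_savings <= 0:
        return None  # upkeep exceeds the salary: the robot never pays off
    return robot_price / annual_savings

# Example: a $250,000 robot with $10,000/year upkeep vs. a $40,000 salary.
years = breakeven_years(250_000, 10_000, 40_000)
print(f"Break-even after about {years:.1f} years")
```

And as the episode notes, the real payoff tends to come sooner than this simple model suggests, since one robot can run around the clock and so cover more than one shift's worth of labor.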
But from a financial point of view, they make sense if you're building stuff at a large enough scale, stuff like video game consoles such as the PlayStation four. It's a no-brainer, because that console has sold around one hundred and ten million units so far. That's a number large enough that I can't even imagine what it would look like if you had all those consoles together in one place. So if there's enough demand for you to sell one hundred and ten million of whatever it is you want to sell, you need to have a way to make those as efficiently as possible, and that will help maximize your profits. And the more efficient the process, the more competitively you can price your product and still make a profit. But the idea of robots performing jobs far more effectively, consistently, and efficiently than humans raises a lot of questions.
And these are not new questions either, but they are questions like: if more factories rely on robots for production, particularly if those robots can be reprogrammed to produce new products once older ones go obsolete, what happens to the job market? What happens to the millions of people who work in manufacturing on assembly lines? Where do they go? What will this do to economies around the world? Lots of people have tried to answer these questions, sometimes giving drastically different answers. We're going to take a look at the history and evolution of industrial robots in this episode and explore the ramifications of automated manufacturing. And this is where I dive into history. I've talked about the history of robots before, so I'll try to restrict my focus to industrial robots. But before I get into that, let's just address the fact that the use of machinery to increase efficiency has been a controversial subject since long before there ever was such a thing as a robot.
Generally speaking, machines are meant to make work easier, or in some cases to make the work possible to begin with. They are labor-saving devices, requiring humans to put forth less effort to get the same or better results. This applies to the simplest of machines, I mean stuff like levers or pulleys or an inclined plane, and it applies to very complex machines as well. Before the Industrial Revolution, most stuff like textiles was made by craftspeople out of their own homes. This was literally the cottage industry. Tradespeople would travel and become the lifeline for the cottage industry, supplying raw materials, buying finished products, and selling those products off at a profit elsewhere. Many tradespeople built a good deal of wealth working this way, and they had the means to look at alternatives to this decentralized cottage industry approach. An idea began to form.
If you brought craftspeople together in a centralized location, and if you simplified the process of production, you could make way more stuff, which in turn means you could sell way more stuff, which in turn means you could make way more money, and money makes the world go round. This thought process helped fuel a similar line of thinking: if you could design machines that could do a lot of the work that typically fell to skilled craftspeople, you wouldn't need the craftspeople at all. You could train anyone, even if that person had no experience with the process, just to work the machine. And while it might take years of dedication to go through the process of being an apprentice to learn a trade well enough so that you can actually make a living at it, with a machine you can skip right over that, as long as the machine's end product was good enough. It didn't have to be better than the stuff craftspeople were making. It just had to be good enough and cheap enough and fast enough to produce.
Then you could sell the finished product at a lower cost than what craftspeople would charge, because not as much time and effort went into making the thing. Now, I guess it's clunky to talk about this without using an example, so let's go with a poster child for the Industrial Revolution: weaving. The weaving trade is an ancient one, and it requires a good deal of skill to do it well by hand. In the late seventeen hundreds, a man named Edmund Cartwright patented a loom powered by a water wheel. The loom's operation was such that a person who had no training in weaving could operate the machine and produce finished textiles. Cartwright's design would be built upon by other inventors, who turned to steam power and other means to operate the power loom. Many cottage industry weavers found themselves out of work. They could potentially opt to work in the textile factories, as those were popping up all over the place, particularly in England, but the wages were low. As you can imagine, this didn't sit well with the weavers. There were protests, including some that incorporated violence and destruction.
Ultimately, the factory process won out, and along with it some really awful working conditions followed, including stuff like child labor, ridiculously low wages, and dangerous working conditions. This led to more protests, including the type that would give us the word sabotage. And let's do a quick side note on that one, as it is the source of a little mythology, or misinformation. See, the apocryphal story goes that the word sabotage comes from the word sabot, which describes the wooden shoes worn by laborers, mainly Dutch laborers, but also laborers in France. And according to the story, these laborers wore those shoes and used them to great effect. In an effort to protest the conditions in factories, they would toss their wooden shoes into the machinery to break the various gears and literally grind production to a halt, as it were. But this story, while compelling, isn't really the truth. Sabotage does stem from the word sabot, but in French there is a verb, saboter. This verb means to make a loud noise with wooden shoes. Now, isn't it great that there's a verb for that?
And it makes sense: wooden shoes would make a great deal of racket as people walked around. Heck, if a toddler wore wooden shoes, I think it would probably sound as though the world were shaking apart. I don't know how toddlers manage to sound like they weigh eight hundred pounds, but they do it, and if you have a toddler, you know what I'm talking about. And in the culture of France, the idea of a clumsy, slow worker was often linked to someone who wore wooden shoes, because they're awkward to wear. Anyway, the reason saboter led to sabotage is because factory workers who were protesting their work conditions and wages would purposefully work more slowly and less efficiently in order to affect the overall output of a factory. It was related to a similar strategy that British laborers employed; their version was called ca'canny, a saying from Scotland which essentially means don't do so much, man. Now, I would argue this also feeds into a strategy that we see to this very day in certain government offices, where the idea is there's no need to do too much too quickly, as it doesn't result in increased compensation and it also sets a really high bar of expectation, so why not just take it easy, you know, have a coffee break? Now, in the early twentieth century, people began to use the word sabotage to really refer to a purposeful approach to undermining the output of factories, and it had nothing to do with tossing wooden shoes into machinery, though it did also pertain to instances in which workers purposefully damaged equipment and tried to slow down production that way. While this isn't directly tied to the idea that machines themselves are displacing workers, it is related to the effect of moving toward a manufacturing-based economy and how that allows for the exploitation of workers. The machines themselves aren't really at fault, but they facilitate the system of operations that leads to exploitation.
Now, that's something that'll be a theme in this episode, and we can't ignore the social aspect of what's going on here, or else we miss the whole point. But let's skip ahead. I've spoken about this before, but we get the word robot from a Czech author named Karel Čapek. He wrote a play called Rossum's Universal Robots, or R.U.R., in nineteen twenty. Čapek took an older word, robota, which means forced labor. In Europe, this concept was tied to that of the old system of serfdom, in which people would do work on behalf of a landowner; in return, those people would be allowed to live on part of that landowner's land. In R.U.R., factory owners devise a way to build laborers from raw materials. Now, in the play, they are indistinguishable from humans other than that they have no inner desires. But in the course of the play, these laborers eventually take over all the jobs that humans previously held, and humans themselves become a threatened species as these laborers begin to understand the power that they hold by occupying all the positions of employment, including as soldiers in the military.
And so with the introduction of the concept of the robot, we actually get the very first robotic uprising all the way back in nineteen twenty. See, I told you it was an old idea. It's important to remember that in the play, the robots are nearly identical to humans. They aren't mechanical the way our robots of today are, but the idea of creating machines that can do work without a will of their own is a part of robotics in general, and industrial robotics in particular. When we come back, we'll talk about the earliest industrial robots and what they did. But first, let's take a quick break. It's interesting to me that the tech world adopted the term robot when we think about the origins of that word. In Čapek's work, robots were sentient slaves. They could perform the work humans would otherwise do, but they lacked the emotions that humans have, and the whole idea is that these devices could do our work for us without question or protest.
They would, in theory, endure conditions that people wouldn't or couldn't. But in the play, they ultimately lead to the destruction of the human race, and potentially they become the new dominant species on the planet. Now, I say potentially because part of the play's plot involves the destruction of the formula that scientists use to produce the robots in the first place. That is an important plot point: the robots are not sure how to make more robots, so they might just die out. Now, it seems to me as though that's a pretty emotionally charged term to adopt for an entire discipline of technology, right? Robots. Especially if you are actually aware of that play, and by the way, I recommend people read it. It's a good play. But then, a lot of people are not aware of the origins of the word, or at least not beyond knowing that it came from a play in nineteen twenty. So I guess for them, it's just, you know, a word. A robot by any other name would smell as sweet, as it were.
And we've definitely seen the themes of R.U.R. serving as an undercurrent for stuff that's happening in robotics in general. But let's move ahead. In nineteen fifty four, an engineer named George Devol designed an industrial robot. He was nine years old when Čapek coined the term robot. He called his design the Programmed Article Transfer device, for which he received a US patent in nineteen sixty one. This machine was a robotic arm, and it was capable of picking up something and then transferring it a short distance away, just within reach of the arm. The arm itself couldn't move; it was anchored in place. It could also, and in fact this is the important part, follow a pre-programmed series of instructions to do this. Devol's argument for his device was that up to this point, mechanical handling of objects fell into two broad categories. Either stuff got moved by humans, typically operating a powerful machine like a crane or a forklift, or stuff got moved by a device that operated under cam control. Now, manual control is self-explanatory, so let's talk about cams. A cam is a rotating component in machinery.
Typically, a cam has some variation in its surface. So let's start with a wheel. Just imagine a wheel that is spinning on an axle. Well, you wouldn't typically have a perfectly smooth wheel as a cam. Part of that surface might be flat, or it might have dips in it. And when the cam rotates, these variations apply force to some other mechanical component that is held against the cam, and it causes that mechanical component to move in specific ways. A cam-operated system can work on its own, but it will always repeat the exact same motions. As long as everything is working, it'll just repeat those steps once the cams complete a full rotation. You can't really adapt it to do anything else. The movements depend entirely on the cams themselves, so if you wanted it to do something else, you would first have to swap out the cams, and even then you would be under whatever the limitations of the device itself were, like it wouldn't have full range of motion.
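That fixed, repeating behavior can be sketched in a few lines of code. This is a loose software analogy, not anything from the episode: the cam profile is just a fixed table of follower positions, and the only way to change the motion is to swap in a different table, the same way you'd physically swap out the cams.

```python
# A cam profile as a fixed lookup table: the follower's displacement at
# each step of one rotation. The machine can only ever replay this
# sequence; "reprogramming" it means physically swapping the profile.

LIFT_AND_DROP_CAM = [0, 2, 5, 8, 10, 10, 8, 5, 2, 0]  # one full rotation

def run_cam(profile, rotations):
    """Replay the cam profile for a number of full rotations."""
    motions = []
    for _ in range(rotations):
        motions.extend(profile)  # identical motion every single cycle
    return motions

# Two rotations produce exactly the same motion twice, and nothing else.
motion = run_cam(LIFT_AND_DROP_CAM, 2)
assert motion == LIFT_AND_DROP_CAM * 2
```

Devol's programmable machine, by contrast, is like being able to load a new sequence of instructions into the same hardware, which is exactly the point he was making.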
Moreover, this level of specialization also means 321 00:20:45,880 --> 00:20:50,240 Speaker 1: that it's typically really expensive to rely upon cam-based systems, 322 00:20:50,359 --> 00:20:53,560 Speaker 1: so it was really only useful if the application had 323 00:20:53,560 --> 00:20:56,879 Speaker 1: to do with mass manufacturing, or else you're looking at 324 00:20:56,920 --> 00:20:59,280 Speaker 1: an economic loss. The cost of the system was just 325 00:20:59,359 --> 00:21:02,760 Speaker 1: too much. So Devol was proposing a machine that could 326 00:21:02,800 --> 00:21:06,119 Speaker 1: be programmed to do operations. And this would let a 327 00:21:06,160 --> 00:21:10,639 Speaker 1: programmer create different processes using the same machine. Or you 328 00:21:10,640 --> 00:21:13,720 Speaker 1: could get a whole bunch of the same basic machine 329 00:21:14,320 --> 00:21:17,960 Speaker 1: and program each one to do a particular job. Meanwhile, 330 00:21:18,359 --> 00:21:20,440 Speaker 1: you'd free people up to work on other stuff in 331 00:21:20,480 --> 00:21:23,520 Speaker 1: the manufacturing process, and you could take the most dangerous 332 00:21:23,520 --> 00:21:26,119 Speaker 1: stuff and give it to the robots. Now, the story 333 00:21:26,200 --> 00:21:29,399 Speaker 1: goes that Devol was at a party in nineteen fifty 334 00:21:29,440 --> 00:21:31,720 Speaker 1: six and he got into a conversation with a man 335 00:21:31,880 --> 00:21:37,159 Speaker 1: named Joseph Engelberger. Joseph was a scientist and an entrepreneur, 336 00:21:37,600 --> 00:21:41,680 Speaker 1: and the subject turned to Devol's Programmed Article Transfer device, 337 00:21:41,720 --> 00:21:44,399 Speaker 1: as well as the work of a science fiction author 338 00:21:44,480 --> 00:21:48,320 Speaker 1: known as Isaac Asimov, you know, the father of robotics.
339 00:21:48,359 --> 00:21:51,800 Speaker 1: He famously incorporated a concept of the laws of robotics 340 00:21:51,800 --> 00:21:54,520 Speaker 1: in his works. We won't really go into that in 341 00:21:54,560 --> 00:21:57,840 Speaker 1: this episode, but the laws of robotics still play a 342 00:21:57,960 --> 00:22:02,600 Speaker 1: big part in the discipline of robotics in general. But 343 00:22:02,720 --> 00:22:06,480 Speaker 1: it's kind of outside the focus of this episode. Engelberger 344 00:22:06,800 --> 00:22:10,080 Speaker 1: used his connections to get funding for Devol to create 345 00:22:10,119 --> 00:22:13,640 Speaker 1: a more advanced version of the Programmed Article Transfer machine, 346 00:22:14,040 --> 00:22:17,000 Speaker 1: and it would be a robotic arm capable of making repeated, 347 00:22:17,160 --> 00:22:21,280 Speaker 1: precise movements while holding very heavy objects. They called it 348 00:22:21,520 --> 00:22:28,880 Speaker 1: the Unimate, and the first prototype, Unimate zero zero one, 349 00:22:29,080 --> 00:22:31,880 Speaker 1: would go to General Motors to work on a die 350 00:22:31,960 --> 00:22:36,600 Speaker 1: casting assembly line. Now, according to the company RobotWorx, 351 00:22:36,760 --> 00:22:42,120 Speaker 1: that's with a W-O-R-X, this robot cost around sixty five thousand 352 00:22:42,119 --> 00:22:45,000 Speaker 1: dollars to produce, and Engelberger sold it off at a 353 00:22:45,080 --> 00:22:50,159 Speaker 1: tremendous loss. General Motors only paid eighteen thousand dollars for 354 00:22:50,240 --> 00:22:53,879 Speaker 1: a sixty-five-thousand-dollar machine. But Engelberger really wanted 355 00:22:53,920 --> 00:22:56,920 Speaker 1: to establish that robots were a way to perform repetitive, 356 00:22:57,119 --> 00:23:01,240 Speaker 1: dangerous functions at a lower risk to humans.
Welding die-cast 357 00:23:01,280 --> 00:23:05,679 Speaker 1: components on auto bodies was a great first application of 358 00:23:05,680 --> 00:23:10,240 Speaker 1: industrial robots for a few reasons. Die casting is a 359 00:23:10,280 --> 00:23:14,960 Speaker 1: process involving molten metal. You take that molten metal and 360 00:23:15,000 --> 00:23:17,960 Speaker 1: you force it into steel molds, and these are what are 361 00:23:18,200 --> 00:23:22,359 Speaker 1: called dies. The molten metal cools in the exact shape 362 00:23:22,400 --> 00:23:25,160 Speaker 1: of the mold. So this is a way to make, 363 00:23:25,480 --> 00:23:28,560 Speaker 1: or cast, a bunch of identical parts out of metal 364 00:23:28,800 --> 00:23:33,800 Speaker 1: and get consistent quality out of it rather than forging 365 00:23:33,920 --> 00:23:37,280 Speaker 1: each piece and then fitting them together. A die can 366 00:23:37,320 --> 00:23:41,120 Speaker 1: have complex shapes in it, such as external threads, which 367 00:23:41,119 --> 00:23:44,160 Speaker 1: means you don't have to make a pipe, for example, 368 00:23:44,200 --> 00:23:47,720 Speaker 1: and then do a secondary process on that pipe to 369 00:23:47,800 --> 00:23:49,520 Speaker 1: get the result you want. So you wouldn't have to 370 00:23:49,600 --> 00:23:53,960 Speaker 1: carve those threads into an otherwise smooth pipe. You could 371 00:23:54,000 --> 00:23:58,480 Speaker 1: just cast the pipe with the threads incorporated on it already. 372 00:23:58,920 --> 00:24:03,240 Speaker 1: But welding die-cast parts onto auto bodies is hard work.
373 00:24:03,440 --> 00:24:06,479 Speaker 1: The components are really heavy, so you're at risk of 374 00:24:06,760 --> 00:24:09,960 Speaker 1: immediate injury if something goes wrong, like let's say you 375 00:24:10,080 --> 00:24:13,360 Speaker 1: drop a weighty component on your foot, or you might 376 00:24:13,600 --> 00:24:17,399 Speaker 1: develop a repetitive stress injury after going through the same 377 00:24:17,520 --> 00:24:21,840 Speaker 1: welding motions over and over again. In addition, the fumes 378 00:24:21,880 --> 00:24:25,600 Speaker 1: given off while welding were sometimes toxic, still are, so 379 00:24:26,040 --> 00:24:28,119 Speaker 1: it's not great to have people exposed to them for 380 00:24:28,240 --> 00:24:31,199 Speaker 1: very long. So a robot was a great substitute for 381 00:24:31,280 --> 00:24:33,919 Speaker 1: a person. The robot could handle much greater weight than 382 00:24:33,960 --> 00:24:37,280 Speaker 1: people could. The robot didn't breathe, so there was no 383 00:24:37,600 --> 00:24:41,000 Speaker 1: respiratory issue there, and it didn't get tired. I mean, 384 00:24:41,040 --> 00:24:43,600 Speaker 1: it would wear down over time, but you could repair 385 00:24:43,640 --> 00:24:47,640 Speaker 1: it in fairly short order. The Unimate worked with 386 00:24:47,680 --> 00:24:52,200 Speaker 1: computer-controlled hydraulic systems. A hydraulic system uses a liquid that's 387 00:24:52,359 --> 00:24:56,120 Speaker 1: under pressure in order to do work, like pushing against 388 00:24:56,119 --> 00:24:58,800 Speaker 1: a piston to power an actuator of some sort, say to 389 00:24:58,920 --> 00:25:03,520 Speaker 1: lift a platform.
The Unimate zero zero one weighed twenty 390 00:25:03,680 --> 00:25:07,320 Speaker 1: seven hundred pounds, or about one thousand two hundred and twenty 391 00:25:07,359 --> 00:25:10,720 Speaker 1: five kilograms, and it could work twenty four hours a day, 392 00:25:10,800 --> 00:25:15,080 Speaker 1: placing components with a precision of within one fifty-thousandth 393 00:25:15,320 --> 00:25:17,680 Speaker 1: of an inch. Now I'm not going to do the 394 00:25:17,720 --> 00:25:20,120 Speaker 1: conversion on that, because I think it's sufficient to say 395 00:25:20,200 --> 00:25:25,119 Speaker 1: that it was just really precise. According to a charmingly 396 00:25:25,240 --> 00:25:29,720 Speaker 1: dated newsreel from Britain, complete with swinging sixties music that 397 00:25:29,800 --> 00:25:32,480 Speaker 1: sounded like it came straight off an Austin Powers movie, 398 00:25:32,840 --> 00:25:35,600 Speaker 1: the robot could operate for five hundred hours without the 399 00:25:35,600 --> 00:25:38,840 Speaker 1: need for a human to check in on it. Engelberger, 400 00:25:39,040 --> 00:25:43,520 Speaker 1: a savvy businessman and promoter, would arrange for Unimate to 401 00:25:43,640 --> 00:25:47,120 Speaker 1: show what it could do at trade shows and in 402 00:25:47,240 --> 00:25:52,200 Speaker 1: TV appearances, including one on The Tonight Show with Johnny Carson. 403 00:25:52,640 --> 00:25:55,280 Speaker 1: If you don't know who that is, ask your parents, 404 00:25:55,520 --> 00:26:01,080 Speaker 1: and if they don't know, ask your grandparents. By nineteen sixty nine, 405 00:26:01,240 --> 00:26:05,080 Speaker 1: General Motors had jumped on board the robot train, as 406 00:26:05,119 --> 00:26:09,720 Speaker 1: it were. They rebuilt a manufacturing plant in Lordstown, Ohio, 407 00:26:09,960 --> 00:26:13,320 Speaker 1: and they installed Unimate robots to perform spot welding on 408 00:26:13,440 --> 00:26:17,520 Speaker 1: car bodies, and the results spoke for themselves.
The plant 409 00:26:17,640 --> 00:26:21,560 Speaker 1: was capable of producing one hundred and ten cars per hour, 410 00:26:21,840 --> 00:26:24,840 Speaker 1: which was more than double the speed that the plant 411 00:26:24,880 --> 00:26:29,000 Speaker 1: could manage before the installation of the robots. The business 412 00:26:29,040 --> 00:26:33,240 Speaker 1: case for the robots seemed clear. After a hefty upfront cost, 413 00:26:33,600 --> 00:26:36,800 Speaker 1: you could produce way more stuff per day, and as 414 00:26:36,840 --> 00:26:39,840 Speaker 1: long as the demand for that stuff is high enough, 415 00:26:40,320 --> 00:26:42,960 Speaker 1: it could mean greater revenue. You could also bring the 416 00:26:43,000 --> 00:26:46,439 Speaker 1: cost of production for an individual unit down. Then you 417 00:26:46,480 --> 00:26:49,200 Speaker 1: could pass savings on to customers and get really competitive 418 00:26:49,240 --> 00:26:52,159 Speaker 1: with your pricing, or you could just keep everything priced 419 00:26:52,160 --> 00:26:55,520 Speaker 1: the same and try to increase your profit margin. The 420 00:26:55,600 --> 00:26:58,040 Speaker 1: key to all this was that you had to be 421 00:26:58,160 --> 00:27:01,160 Speaker 1: sure the thing you were producing would bring in enough 422 00:27:01,200 --> 00:27:04,320 Speaker 1: money to offset the cost of automation, so it would 423 00:27:04,400 --> 00:27:07,800 Speaker 1: not make sense to spend millions of dollars building out 424 00:27:07,800 --> 00:27:10,840 Speaker 1: a factory staffed with robots if you were making something 425 00:27:10,920 --> 00:27:13,960 Speaker 1: that had a very small market to begin with. Yes, 426 00:27:14,320 --> 00:27:17,160 Speaker 1: you'd be able to produce way more whatchamacallits 427 00:27:17,160 --> 00:27:20,159 Speaker 1: than you could before.
But if the demand for 428 00:27:20,280 --> 00:27:24,200 Speaker 1: whatchamacallits is really modest, that doesn't do you any good. 429 00:27:24,720 --> 00:27:27,239 Speaker 1: In fact, you might end up flooding the market and 430 00:27:27,359 --> 00:27:31,359 Speaker 1: devaluing your product. So while robots were taking on jobs 431 00:27:31,359 --> 00:27:34,000 Speaker 1: that were previously held by humans, there was no real 432 00:27:34,080 --> 00:27:37,480 Speaker 1: danger of a massive upheaval where everything would be automated. 433 00:27:37,760 --> 00:27:40,960 Speaker 1: The limitations in the technology were just too great and 434 00:27:41,000 --> 00:27:43,560 Speaker 1: the cost was too high for most companies to go 435 00:27:43,640 --> 00:27:46,960 Speaker 1: that route. And this also became the starting point for 436 00:27:47,040 --> 00:27:51,000 Speaker 1: something that would become really important: that the main goal 437 00:27:51,119 --> 00:27:55,840 Speaker 1: of developing industrial robots wasn't to displace humans. It was 438 00:27:55,880 --> 00:28:00,520 Speaker 1: meant to offload duties that were dull, dirty, or dangerous. 439 00:28:00,600 --> 00:28:04,120 Speaker 1: You'll often hear those terms being used with robotics. If 440 00:28:04,160 --> 00:28:06,640 Speaker 1: it is a job that carries with it a significant 441 00:28:06,720 --> 00:28:09,399 Speaker 1: risk to the person performing it, or a job so 442 00:28:09,600 --> 00:28:12,000 Speaker 1: demanding that you can only expect a person to stick 443 00:28:12,000 --> 00:28:14,040 Speaker 1: with it for a short while before they need to 444 00:28:14,080 --> 00:28:17,679 Speaker 1: do something else, then building a robot to do that job, 445 00:28:17,800 --> 00:28:21,040 Speaker 1: or at least that list of tasks, makes sense. The 446 00:28:21,200 --> 00:28:24,760 Speaker 1: robot is just a thing.
It can endure conditions that 447 00:28:24,840 --> 00:28:27,679 Speaker 1: humans can't, and it doesn't get sick, and it doesn't 448 00:28:27,680 --> 00:28:31,520 Speaker 1: get hurt. If something breaks down, you can typically repair 449 00:28:31,560 --> 00:28:35,800 Speaker 1: it pretty quickly. We humans don't have that luxury. Now, 450 00:28:35,840 --> 00:28:38,400 Speaker 1: I'm not going to go and run down a full 451 00:28:38,520 --> 00:28:42,200 Speaker 1: history of all industrial robots because that would mostly involve 452 00:28:42,240 --> 00:28:45,960 Speaker 1: me talking about model numbers with slight differences like the 453 00:28:46,080 --> 00:28:49,320 Speaker 1: number of axes of movement or points of articulation for 454 00:28:49,440 --> 00:28:52,120 Speaker 1: one robot versus another, and that's not really interesting. But 455 00:28:52,160 --> 00:28:55,480 Speaker 1: I do want to hit a couple of highlights. One 456 00:28:55,640 --> 00:29:02,440 Speaker 1: is that in nineteen seventy five, the ASEA IRB robot would 457 00:29:02,480 --> 00:29:07,320 Speaker 1: be the first fully electrically driven robot. It also used 458 00:29:07,400 --> 00:29:11,600 Speaker 1: Intel's first chip set as processors. Now, this was not 459 00:29:11,840 --> 00:29:16,960 Speaker 1: a super strong robot because those electrically driven limbs just 460 00:29:17,040 --> 00:29:19,720 Speaker 1: can't pack the same punch as a hydraulic system, which 461 00:29:19,760 --> 00:29:24,520 Speaker 1: typically moves much more slowly but can handle much heavier payloads.
462 00:29:24,920 --> 00:29:28,600 Speaker 1: So this particular robot could only lift weights up to 463 00:29:28,760 --> 00:29:33,040 Speaker 1: around thirteen pounds or six kilograms, but the move toward 464 00:29:33,160 --> 00:29:37,720 Speaker 1: processors and electrically driven components marked a big technological step, 465 00:29:37,920 --> 00:29:42,360 Speaker 1: even if the arm's physical capabilities were much less impressive 466 00:29:42,520 --> 00:29:46,920 Speaker 1: than a hydraulic system. By the end of the nineteen seventies, 467 00:29:47,320 --> 00:29:50,680 Speaker 1: Japan was getting into the robotics game with arc welding 468 00:29:50,760 --> 00:29:55,040 Speaker 1: robots for assembly lines, and then it was off to 469 00:29:55,120 --> 00:29:58,600 Speaker 1: the robotic races, with the eighties seeing a surge in 470 00:29:58,720 --> 00:30:05,440 Speaker 1: advances with industrial robots. Soon, massive manufacturing facilities were installing 471 00:30:05,520 --> 00:30:08,720 Speaker 1: robots to take over elements of the assembly line process, 472 00:30:08,840 --> 00:30:14,000 Speaker 1: particularly in that dirty, dull, and dangerous category. The robots 473 00:30:14,000 --> 00:30:18,440 Speaker 1: became more sophisticated, which also added to their value. When 474 00:30:18,440 --> 00:30:21,360 Speaker 1: we come back, I'll talk more about why that's important, 475 00:30:21,360 --> 00:30:32,240 Speaker 1: but first let's take another quick break. By the mid 476 00:30:32,360 --> 00:30:36,480 Speaker 1: nineteen nineties, robotics companies were making machines that could coordinate 477 00:30:36,560 --> 00:30:39,479 Speaker 1: and synchronize the movements of more than one robot at 478 00:30:39,520 --> 00:30:44,200 Speaker 1: the same time, allowing for more complex manufacturing processes. 
By 479 00:30:44,200 --> 00:30:47,320 Speaker 1: the early two thousands, there were systems that could synchronize 480 00:30:47,360 --> 00:30:50,120 Speaker 1: the actions of up to four robots at a time, 481 00:30:50,560 --> 00:30:54,480 Speaker 1: further adding to the overall system flexibility. Now, I mentioned 482 00:30:54,560 --> 00:30:58,920 Speaker 1: earlier that a programmable robot is more versatile than something 483 00:30:59,000 --> 00:31:03,680 Speaker 1: like a cam-operated system. Well, more sophisticated robots, with 484 00:31:03,760 --> 00:31:08,120 Speaker 1: more axes of motion and more points of articulation, have 485 00:31:08,240 --> 00:31:11,959 Speaker 1: the potential to do lots of different types of jobs, 486 00:31:12,360 --> 00:31:15,560 Speaker 1: and this is of critical importance. If the robot is 487 00:31:15,600 --> 00:31:19,000 Speaker 1: too limited, if it can only do a small range 488 00:31:19,080 --> 00:31:24,200 Speaker 1: of motions, you can't necessarily repurpose it for new processes. 489 00:31:24,400 --> 00:31:27,440 Speaker 1: And as markets change, you may find yourself needing to 490 00:31:27,480 --> 00:31:30,320 Speaker 1: be flexible when it comes to the stuff you're manufacturing. 491 00:31:30,720 --> 00:31:34,920 Speaker 1: So let's use an extreme hypothetical example that would probably 492 00:31:35,040 --> 00:31:38,880 Speaker 1: never happen. So let's say that you run an auto 493 00:31:38,880 --> 00:31:43,479 Speaker 1: manufacturing facility, but then there's a massive market change and 494 00:31:43,520 --> 00:31:47,160 Speaker 1: it drastically affects the demand for your cars. There's just 495 00:31:47,280 --> 00:31:51,840 Speaker 1: not enough demand to support the production.
So rather than 496 00:31:51,960 --> 00:31:55,480 Speaker 1: just, you know, closing up shop and calling it a day, 497 00:31:55,960 --> 00:31:59,160 Speaker 1: your business decides to do an amazing pivot and you 498 00:31:59,200 --> 00:32:03,320 Speaker 1: begin to convert your manufacturing facility over to, I don't know, 499 00:32:04,080 --> 00:32:08,040 Speaker 1: home appliances. Now, again, this is an extreme hypothetical example, 500 00:32:08,080 --> 00:32:11,239 Speaker 1: but let's just go with it. Okay, so here we go. 501 00:32:11,560 --> 00:32:15,000 Speaker 1: If the robots on your assembly line are powerful but 502 00:32:15,320 --> 00:32:19,080 Speaker 1: limited in movement and function, you may find it impossible 503 00:32:19,120 --> 00:32:21,480 Speaker 1: to adapt them to your new line of business, which 504 00:32:21,520 --> 00:32:24,960 Speaker 1: would mean you'd need to either invest in new robots 505 00:32:25,440 --> 00:32:28,440 Speaker 1: or hire human workers to put together 506 00:32:28,480 --> 00:32:31,760 Speaker 1: your appliances. And it would also mean that your old 507 00:32:31,880 --> 00:32:34,240 Speaker 1: robots would be a sunk cost. You would need to 508 00:32:34,280 --> 00:32:37,200 Speaker 1: either sell them off or put them in storage or something. 509 00:32:37,640 --> 00:32:40,920 Speaker 1: If the robots are really sophisticated, however, you might be 510 00:32:40,960 --> 00:32:43,720 Speaker 1: able to program them to do some of the operations 511 00:32:43,760 --> 00:32:47,240 Speaker 1: on the new assembly line, and that would keep them useful 512 00:32:47,240 --> 00:32:50,479 Speaker 1: and lower the cost of production. Or, for a 513 00:32:50,720 --> 00:32:54,959 Speaker 1: less extreme example, you introduce a new model of whatever 514 00:32:55,120 --> 00:32:58,800 Speaker 1: thing it is that you're producing.
Anything new will require 515 00:32:58,840 --> 00:33:02,000 Speaker 1: adjustments in the assembly line process, and if the changes 516 00:33:02,040 --> 00:33:04,600 Speaker 1: are big enough, the robots may not be able to 517 00:33:05,080 --> 00:33:08,320 Speaker 1: make as big a contribution in the process. That's something 518 00:33:08,320 --> 00:33:11,800 Speaker 1: that could happen with the example of the PlayStation we 519 00:33:11,800 --> 00:33:14,160 Speaker 1: were talking about. Yeah, those robots can put together a 520 00:33:14,200 --> 00:33:17,320 Speaker 1: PS four in thirty seconds. There's no guarantee they'll be 521 00:33:17,360 --> 00:33:19,400 Speaker 1: able to do the same thing with a PS five, 522 00:33:19,680 --> 00:33:23,880 Speaker 1: at least not without a major overhaul of their assembly line system. 523 00:33:24,360 --> 00:33:27,120 Speaker 1: While the manufacturing facility can churn out a finished PS 524 00:33:27,160 --> 00:33:30,560 Speaker 1: four in thirty seconds, we might not see them work 525 00:33:30,600 --> 00:33:33,200 Speaker 1: at all with the PS five, at least not right away. 526 00:33:33,560 --> 00:33:36,280 Speaker 1: It would all have to be optimized. So for decades, 527 00:33:36,520 --> 00:33:40,600 Speaker 1: industrial robots were kept as separate from human workers as 528 00:33:40,840 --> 00:33:45,000 Speaker 1: was possible. You wanted to keep them well away from 529 00:33:45,040 --> 00:33:46,920 Speaker 1: all the people, or keep the people well away from 530 00:33:46,920 --> 00:33:50,520 Speaker 1: all the robots. Often the robots would operate within cages 531 00:33:50,880 --> 00:33:55,000 Speaker 1: specifically to limit the possibility of a human coming within range.
532 00:33:55,720 --> 00:34:00,880 Speaker 1: After all, these robots are large, they're heavy, powerful, and 533 00:34:01,120 --> 00:34:04,440 Speaker 1: many of them are incapable of sensing stuff in their 534 00:34:04,640 --> 00:34:08,160 Speaker 1: environment and whether or not a human is within their 535 00:34:08,280 --> 00:34:11,279 Speaker 1: range of motion. Instead, they're just going through that 536 00:34:11,400 --> 00:34:15,000 Speaker 1: pre-programmed series of motions and they're not going to stop 537 00:34:15,080 --> 00:34:18,600 Speaker 1: unless someone turns them off. A robot is performing that 538 00:34:18,680 --> 00:34:20,640 Speaker 1: same series of steps over and over, and that can 539 00:34:20,680 --> 00:34:24,759 Speaker 1: mean that if a human in that area gets near 540 00:34:24,840 --> 00:34:27,440 Speaker 1: the robot, they could end up getting injured or worse. 541 00:34:27,480 --> 00:34:30,120 Speaker 1: And in fact, this has happened several times over the 542 00:34:30,120 --> 00:34:32,600 Speaker 1: course of the last few decades, and at least in 543 00:34:32,640 --> 00:34:35,520 Speaker 1: some cases it seems as though the robot might have 544 00:34:35,600 --> 00:34:38,799 Speaker 1: been at fault, meaning it's not always a case of 545 00:34:38,920 --> 00:34:43,640 Speaker 1: human carelessness. For example, an engineer in twenty fifteen died 546 00:34:43,719 --> 00:34:46,400 Speaker 1: when a robot arm from one section of the factory 547 00:34:46,440 --> 00:34:52,200 Speaker 1: floor moved beyond its operating area and into the neighboring 548 00:34:52,280 --> 00:34:55,520 Speaker 1: section that the engineer was working in. This is something 549 00:34:55,560 --> 00:34:57,880 Speaker 1: that should not have happened. The robot arm should not 550 00:34:57,960 --> 00:35:02,640 Speaker 1: have moved that far into the neighboring section.
The robot 551 00:35:02,719 --> 00:35:05,480 Speaker 1: arm hit the engineer on the head, and she later 552 00:35:05,600 --> 00:35:09,640 Speaker 1: died from her injuries. In the United States, the government 553 00:35:09,719 --> 00:35:13,520 Speaker 1: has listed thirty three workplace deaths due to accidents with 554 00:35:13,600 --> 00:35:18,000 Speaker 1: industrial robots between the years nineteen eighty four and twenty fourteen. 555 00:35:18,600 --> 00:35:23,839 Speaker 1: The investigations also found that the majority of those tragedies 556 00:35:24,640 --> 00:35:28,640 Speaker 1: were typically the result of human error: a 557 00:35:28,680 --> 00:35:32,400 Speaker 1: person wandering into the operation zone of a robot. 558 00:35:32,560 --> 00:35:36,759 Speaker 1: That twenty fifteen incident was an outlier. Not that any 559 00:35:36,800 --> 00:35:39,520 Speaker 1: of this makes the thought of working around industrial robots 560 00:35:39,600 --> 00:35:43,120 Speaker 1: less scary or those other accidents any less tragic. They're 561 00:35:43,120 --> 00:35:47,920 Speaker 1: all terribly tragic. Moreover, we're seeing more robots that are 562 00:35:47,920 --> 00:35:51,680 Speaker 1: capable of roaming a work space. They are no longer 563 00:35:51,719 --> 00:35:54,640 Speaker 1: anchored to a specific spot on the floor in some cases. 564 00:35:55,000 --> 00:35:59,960 Speaker 1: They also, unlike the first industrial robots, typically have external sensors. 565 00:36:00,840 --> 00:36:03,840 Speaker 1: These not only help the robots navigate their environments, but 566 00:36:03,920 --> 00:36:10,240 Speaker 1: also hopefully avoid accidents with human workers. Let's take Amazon's 567 00:36:10,480 --> 00:36:14,960 Speaker 1: warehouse robots for example. These robots look like really big 568 00:36:15,080 --> 00:36:19,480 Speaker 1: robotic vacuum cleaners.
They are designed to roll under shelves 569 00:36:19,800 --> 00:36:23,800 Speaker 1: and the shelves are just slightly larger than the dimensions 570 00:36:23,840 --> 00:36:26,600 Speaker 1: of the robot. And when an order comes in, a 571 00:36:26,680 --> 00:36:29,560 Speaker 1: robot from the warehouse rolls over to a shelf that 572 00:36:29,760 --> 00:36:34,040 Speaker 1: holds the respective item on it according to the inventory system, 573 00:36:34,320 --> 00:36:37,600 Speaker 1: and the robot goes under the shelf, then lifts the 574 00:36:37,600 --> 00:36:41,319 Speaker 1: shelf by raising a platter-like platform on the top 575 00:36:41,360 --> 00:36:42,879 Speaker 1: of the robot. Think of it as like a little 576 00:36:42,920 --> 00:36:45,200 Speaker 1: forklift, except it's more like, I don't know, 577 00:36:45,480 --> 00:36:47,920 Speaker 1: like a tray that a waiter would use to carry 578 00:36:48,000 --> 00:36:51,200 Speaker 1: drinks to a table. But it carries the whole shelf 579 00:36:51,400 --> 00:36:54,319 Speaker 1: up and over to the edge of a cage, where 580 00:36:54,320 --> 00:36:58,000 Speaker 1: a human operator will take the respective item off the 581 00:36:58,080 --> 00:37:01,200 Speaker 1: shelf and scan it and put it into a bin. 582 00:37:01,360 --> 00:37:03,960 Speaker 1: And then those bins go to other humans who further 583 00:37:04,080 --> 00:37:06,840 Speaker 1: scan those items and then put them into other bins 584 00:37:06,840 --> 00:37:09,680 Speaker 1: that ultimately go to the packing department. And if you 585 00:37:09,719 --> 00:37:12,600 Speaker 1: watch videos of these robots, it looks like they're doing 586 00:37:12,640 --> 00:37:17,400 Speaker 1: a complicated ballet as they maneuver through this warehouse, avoiding 587 00:37:17,400 --> 00:37:22,240 Speaker 1: other robots and shelves as they bring those shelves to humans.
588 00:37:23,200 --> 00:37:26,399 Speaker 1: Markings on the warehouse floor tell the robots where they 589 00:37:26,400 --> 00:37:29,200 Speaker 1: are with respect to everything else in the warehouse, and 590 00:37:29,280 --> 00:37:33,640 Speaker 1: the robots will even position shelves that have items that 591 00:37:33,680 --> 00:37:36,400 Speaker 1: are being ordered a lot toward the edges of this 592 00:37:36,560 --> 00:37:39,839 Speaker 1: space so that they're easier to get to and move 593 00:37:39,880 --> 00:37:42,480 Speaker 1: over to the human beings. So it's kind of 594 00:37:42,520 --> 00:37:45,239 Speaker 1: an interesting dynamic system. It's not like they pick up 595 00:37:45,280 --> 00:37:48,200 Speaker 1: the shelf and then bring the shelf immediately right back 596 00:37:48,239 --> 00:37:50,600 Speaker 1: to where it started. The shelf can end up in 597 00:37:50,640 --> 00:37:55,720 Speaker 1: a different location entirely. In addition, cameras give the robots 598 00:37:55,760 --> 00:37:59,479 Speaker 1: the ability to sense any obstacles that might block their path, 599 00:37:59,600 --> 00:38:01,920 Speaker 1: allowing the robot to come to a stop and 600 00:38:01,920 --> 00:38:06,480 Speaker 1: await further instructions and report that it has found something 601 00:38:06,680 --> 00:38:11,440 Speaker 1: unusual on the warehouse floor. Even so, typically humans are 602 00:38:11,480 --> 00:38:14,640 Speaker 1: not allowed to roam the area where the robots pick 603 00:38:14,719 --> 00:38:17,680 Speaker 1: up shelves. If something has fallen on the warehouse floor, 604 00:38:17,760 --> 00:38:22,200 Speaker 1: a designated troubleshooter gets an alert, and that person must 605 00:38:22,280 --> 00:38:24,920 Speaker 1: use an interface to draw the path that they are 606 00:38:24,960 --> 00:38:28,839 Speaker 1: going to take from the entrance of the cage all 607 00:38:28,920 --> 00:38:32,520 Speaker 1: the way to the point of trouble.
Like let's say 608 00:38:32,560 --> 00:38:35,000 Speaker 1: that a product has fallen off a shelf and 609 00:38:35,000 --> 00:38:36,920 Speaker 1: has hit the floor, and a robot has reported it. 610 00:38:37,640 --> 00:38:39,920 Speaker 1: If you're the troubleshooter, you'd 611 00:38:40,000 --> 00:38:42,440 Speaker 1: use a tablet and you would draw, almost like a maze, 612 00:38:42,800 --> 00:38:46,920 Speaker 1: the path you would take to get to that particular item, 613 00:38:47,080 --> 00:38:51,200 Speaker 1: and you would follow that path out and back. In addition, 614 00:38:51,760 --> 00:38:55,080 Speaker 1: you'd wear a radio transmitter that would send a signal 615 00:38:55,120 --> 00:38:57,600 Speaker 1: out that the robots could all detect, and that would 616 00:38:57,600 --> 00:39:00,839 Speaker 1: alert the robots to the presence of you, the troubleshooter. 617 00:39:01,520 --> 00:39:04,080 Speaker 1: That helps prevent a situation in which the robots are 618 00:39:04,120 --> 00:39:08,920 Speaker 1: going to collide with you, right, you want to avoid that. Now, 619 00:39:08,920 --> 00:39:11,560 Speaker 1: there's a lot of work that goes into designing robots 620 00:39:11,600 --> 00:39:14,920 Speaker 1: that can operate in a space that's occupied by humans, 621 00:39:15,400 --> 00:39:18,080 Speaker 1: and it's a very challenging line of technology because it 622 00:39:18,120 --> 00:39:21,240 Speaker 1: takes more than just thinking about how the machines work; 623 00:39:21,480 --> 00:39:25,120 Speaker 1: you also have to think about how people work, and moreover, 624 00:39:25,520 --> 00:39:28,200 Speaker 1: you have to think about how people change the way 625 00:39:28,239 --> 00:39:30,560 Speaker 1: they work when they're in the company of a robot.
626 00:39:30,800 --> 00:39:35,680 Speaker 1: It's kind of similar to the concept in quantum theory, right, 627 00:39:35,719 --> 00:39:38,839 Speaker 1: the idea that you change a thing you observe just 628 00:39:38,880 --> 00:39:43,239 Speaker 1: through the act of observation. Well, you can have a 629 00:39:43,680 --> 00:39:47,200 Speaker 1: workspace that only humans had been working in for a while, 630 00:39:48,040 --> 00:39:49,839 Speaker 1: and you could say, all right, well, I've observed how 631 00:39:49,840 --> 00:39:51,680 Speaker 1: the humans work, and I'm going to build a robot 632 00:39:51,680 --> 00:39:55,080 Speaker 1: that does this one task that the humans do, and 633 00:39:55,280 --> 00:39:58,239 Speaker 1: I'm just going to incorporate it right into their workspace. But 634 00:39:58,320 --> 00:40:00,440 Speaker 1: then you find out that when you do that, the 635 00:40:00,520 --> 00:40:03,040 Speaker 1: humans all behave in a new way because there's a 636 00:40:03,080 --> 00:40:06,239 Speaker 1: new thing in the environment that you didn't account for, 637 00:40:06,680 --> 00:40:09,360 Speaker 1: and now the design of your robot doesn't work as well. 638 00:40:10,000 --> 00:40:14,200 Speaker 1: We humans are tricky like that. Now we need to 639 00:40:14,200 --> 00:40:17,480 Speaker 1: get to that threat that weavers were worried about more 640 00:40:17,520 --> 00:40:22,400 Speaker 1: than a century ago. Is automation going to take our jobs? Now, 641 00:40:22,440 --> 00:40:26,400 Speaker 1: there have been a few studies, all using different methodologies, 642 00:40:26,760 --> 00:40:29,480 Speaker 1: with some of those studies coming under criticism for the 643 00:40:29,560 --> 00:40:32,839 Speaker 1: approaches that were used.
But there have been a few 644 00:40:32,840 --> 00:40:36,400 Speaker 1: studies that suggest we'll see automation continue to impact jobs 645 00:40:36,440 --> 00:40:40,200 Speaker 1: in the near future and drastically so over the course 646 00:40:40,239 --> 00:40:43,799 Speaker 1: of the long run. The interpretations of those results have 647 00:40:43,840 --> 00:40:47,960 Speaker 1: been reported in ways that range from "automation is going 648 00:40:48,000 --> 00:40:51,400 Speaker 1: to be disruptive" (that's on the light end) to "fifty 649 00:40:51,400 --> 00:40:53,319 Speaker 1: percent of all jobs are going to be taken by 650 00:40:53,320 --> 00:40:57,800 Speaker 1: the robots." So what's the actual truth? Well, the truth, 651 00:40:57,880 --> 00:41:02,200 Speaker 1: as it turns out, is complex. For one thing, automation 652 00:41:02,719 --> 00:41:06,600 Speaker 1: rarely takes over an entire job. What is far more 653 00:41:06,800 --> 00:41:10,160 Speaker 1: likely to happen is that automation will take over certain 654 00:41:10,400 --> 00:41:15,080 Speaker 1: tasks that are part of a job, or perhaps multiple jobs. 655 00:41:15,680 --> 00:41:18,880 Speaker 1: So if a job requires a wide variety of tasks, 656 00:41:19,280 --> 00:41:23,080 Speaker 1: some of which may require critical thinking, it's really hard 657 00:41:23,080 --> 00:41:25,640 Speaker 1: to design a robot that can do all of that. 658 00:41:26,239 --> 00:41:30,360 Speaker 1: It's far more likely that you would automate certain job responsibilities, 659 00:41:30,680 --> 00:41:34,240 Speaker 1: which would mean that those jobs themselves wouldn't go away, 660 00:41:34,600 --> 00:41:40,000 Speaker 1: they would just change. The repetitive responsibilities would be offloaded, 661 00:41:40,080 --> 00:41:42,680 Speaker 1: and you would focus on something else.
You might have 662 00:41:42,719 --> 00:41:45,360 Speaker 1: to spend more time doing other duties rather than the 663 00:41:45,480 --> 00:41:49,960 Speaker 1: routine ones, which isn't necessarily a bad thing, but there 664 00:41:49,960 --> 00:41:54,800 Speaker 1: are cases where automation would likely take over an entire job, 665 00:41:55,120 --> 00:42:00,239 Speaker 1: for example, truck drivers in shipping trucks. Much of the 666 00:42:00,280 --> 00:42:04,480 Speaker 1: work in autonomous vehicles is really focusing not necessarily on 667 00:42:04,520 --> 00:42:10,480 Speaker 1: replacing passenger vehicles so much as commercial vehicles like shipping trucks. 668 00:42:11,480 --> 00:42:15,560 Speaker 1: The Bureau of Labor Statistics in the United States estimated 669 00:42:15,600 --> 00:42:18,520 Speaker 1: that the age of the average US truck driver is 670 00:42:18,640 --> 00:42:22,279 Speaker 1: fifty five and more than ninety percent of all truck 671 00:42:22,360 --> 00:42:26,080 Speaker 1: drivers in the US are male, and that will present 672 00:42:26,200 --> 00:42:31,160 Speaker 1: a challenge. See, generally, the pro argument for automation is 673 00:42:31,200 --> 00:42:35,760 Speaker 1: that while robots and automated systems will eliminate some jobs, 674 00:42:36,120 --> 00:42:41,520 Speaker 1: they will create other jobs, presumably better jobs. And this 675 00:42:41,719 --> 00:42:45,000 Speaker 1: is true. At the turn of the twentieth century, forty 676 00:42:45,080 --> 00:42:48,760 Speaker 1: percent of all jobs in the United States were on farms. 677 00:42:49,360 --> 00:42:52,120 Speaker 1: So that means four out of ten people in the 678 00:42:52,280 --> 00:42:56,480 Speaker 1: US who had a job were working on a farm. Today, 679 00:42:57,160 --> 00:43:01,200 Speaker 1: agriculture and all the related food sectors make up just 680 00:43:01,400 --> 00:43:04,200 Speaker 1: eleven percent of all jobs in the United States.
And 681 00:43:04,440 --> 00:43:06,319 Speaker 1: if we just limit this to the people who are 682 00:43:06,360 --> 00:43:10,080 Speaker 1: working on farms, you know, not all agricultural jobs and 683 00:43:10,120 --> 00:43:13,279 Speaker 1: food sector jobs, just the farm jobs. If we do that, 684 00:43:13,640 --> 00:43:17,680 Speaker 1: we're talking about only one point three percent of all 685 00:43:17,880 --> 00:43:21,600 Speaker 1: US employment. So going from forty percent to one point 686 00:43:21,640 --> 00:43:25,239 Speaker 1: three percent, that's a drastic change. Now, clearly automation has 687 00:43:25,440 --> 00:43:29,480 Speaker 1: transformed agriculture. It allows us to do a lot more 688 00:43:29,719 --> 00:43:34,400 Speaker 1: while relying on fewer people, and new jobs did come around, 689 00:43:34,520 --> 00:43:38,800 Speaker 1: so we didn't see an unemployment rate reaching levels higher 690 00:43:38,840 --> 00:43:45,720 Speaker 1: than forty percent pre-COVID. The pro-automation argument states 691 00:43:45,760 --> 00:43:49,839 Speaker 1: that new jobs, which again should ideally be better than 692 00:43:49,960 --> 00:43:53,279 Speaker 1: existing jobs, as in less strenuous and less dangerous and 693 00:43:53,320 --> 00:43:57,480 Speaker 1: more interesting, will emerge as older jobs are phased out. 694 00:43:58,239 --> 00:44:01,440 Speaker 1: Now that works fine on a macro scale when you're 695 00:44:01,480 --> 00:44:04,880 Speaker 1: taking a really big picture look at the overall trends, 696 00:44:04,920 --> 00:44:08,000 Speaker 1: but when you consider the particulars, like our truck drivers, 697 00:44:08,400 --> 00:44:12,200 Speaker 1: you start to see some obstacles.
See this year, I 698 00:44:12,280 --> 00:44:16,080 Speaker 1: turned forty five, so I'm a lot closer to the 699 00:44:16,120 --> 00:44:18,920 Speaker 1: average age of a truck driver in the United States 700 00:44:19,440 --> 00:44:21,719 Speaker 1: than I am to someone who's just getting into the 701 00:44:21,800 --> 00:44:24,520 Speaker 1: job market for the first time. And I can tell 702 00:44:24,560 --> 00:44:28,759 Speaker 1: you that even as a relatively tech savvy guy, I 703 00:44:28,800 --> 00:44:32,640 Speaker 1: would find it really challenging to pick up the job skills 704 00:44:32,680 --> 00:44:35,640 Speaker 1: I would need to go into a different line of work, 705 00:44:35,920 --> 00:44:40,400 Speaker 1: particularly one where I'm competing against people who already have 706 00:44:40,640 --> 00:44:44,560 Speaker 1: training and experience in that field. So imagine having to 707 00:44:44,560 --> 00:44:47,120 Speaker 1: tell a group of fifty five year old truck drivers 708 00:44:47,640 --> 00:44:50,440 Speaker 1: that they're out of a job. But good news: if 709 00:44:50,440 --> 00:44:52,960 Speaker 1: you just start taking classes, you can learn to code 710 00:44:53,440 --> 00:44:56,680 Speaker 1: and make less money than you did in your old job. 711 00:44:57,480 --> 00:45:00,600 Speaker 1: It's not great, is what I'm saying. That being said, 712 00:45:01,200 --> 00:45:04,440 Speaker 1: automation is clearly not going anywhere. It's going to continue 713 00:45:04,520 --> 00:45:07,480 Speaker 1: to play a big role in how we get work done, 714 00:45:07,680 --> 00:45:10,960 Speaker 1: and in our best case scenarios, it's going to augment 715 00:45:11,120 --> 00:45:14,920 Speaker 1: the work that humans do, leading to better, more efficient, 716 00:45:15,040 --> 00:45:18,520 Speaker 1: and more cost effective outcomes.
It will free us up 717 00:45:18,600 --> 00:45:20,440 Speaker 1: to focus on the parts of our jobs that we 718 00:45:20,520 --> 00:45:23,759 Speaker 1: find the most fulfilling. We can handle the stuff that 719 00:45:23,800 --> 00:45:27,720 Speaker 1: requires flexibility and intuitive thinking, and the machines can handle 720 00:45:27,920 --> 00:45:32,759 Speaker 1: the routine and the dangerous. But in a worst case scenario, 721 00:45:33,280 --> 00:45:37,920 Speaker 1: we'll see an unprepared population of former workers who are 722 00:45:37,960 --> 00:45:40,880 Speaker 1: now out of a job and without the support system 723 00:45:40,960 --> 00:45:44,640 Speaker 1: there to help them transition into something new so that 724 00:45:44,680 --> 00:45:49,719 Speaker 1: they can continue to contribute to society and earn a living. Now, 725 00:45:49,760 --> 00:45:53,320 Speaker 1: this is why you will often hear conversations about automation 726 00:45:53,840 --> 00:45:59,120 Speaker 1: get tied into concepts like a guaranteed jobs program. This 727 00:45:59,160 --> 00:46:02,280 Speaker 1: is typically where something like a government creates a system 728 00:46:02,480 --> 00:46:05,480 Speaker 1: that makes certain every person who wants a job can 729 00:46:05,520 --> 00:46:10,160 Speaker 1: get a job. Or you'll hear about guaranteed basic income. 730 00:46:10,320 --> 00:46:12,840 Speaker 1: This is a strategy in which tax dollars go to 731 00:46:12,920 --> 00:46:17,319 Speaker 1: fund a standard income payout to all citizens so that 732 00:46:17,360 --> 00:46:20,960 Speaker 1: they can meet their most basic needs. 
Now, these are 733 00:46:21,000 --> 00:46:25,759 Speaker 1: big ideas. They aren't easy to implement or administer, and 734 00:46:25,800 --> 00:46:28,920 Speaker 1: they're not cheap. But it may be that they will 735 00:46:28,960 --> 00:46:33,160 Speaker 1: become necessary, or some similar strategy will be needed, to 736 00:46:33,239 --> 00:46:35,960 Speaker 1: make certain that we have a plan to move toward 737 00:46:36,480 --> 00:46:39,640 Speaker 1: rather than being caught in a world where a disproportionate 738 00:46:39,680 --> 00:46:44,360 Speaker 1: percentage of people can't find gainful employment. Heck, we're seeing 739 00:46:44,400 --> 00:46:47,600 Speaker 1: something like that right now due to the COVID crisis, 740 00:46:48,000 --> 00:46:51,759 Speaker 1: which is also underlining the importance of automation in a 741 00:46:51,800 --> 00:46:54,719 Speaker 1: world where it's not necessarily safe to have a bunch 742 00:46:54,760 --> 00:46:57,600 Speaker 1: of human beings all gathered in the same place at 743 00:46:57,640 --> 00:47:03,040 Speaker 1: the same time. Robots coming for our jobs? Well, for 744 00:47:03,120 --> 00:47:06,480 Speaker 1: some of our jobs, definitely. Many of those jobs come 745 00:47:06,560 --> 00:47:10,080 Speaker 1: with some pretty tough consequences for humans who are working 746 00:47:10,280 --> 00:47:14,440 Speaker 1: those jobs today. Those jobs may have high injury rates, 747 00:47:14,480 --> 00:47:17,400 Speaker 1: the people who work them may have lower life expectancies, 748 00:47:17,880 --> 00:47:19,680 Speaker 1: and there are a whole host of health issues that 749 00:47:19,680 --> 00:47:22,279 Speaker 1: can come along with certain jobs.
So you could make 750 00:47:22,520 --> 00:47:25,000 Speaker 1: a strong argument that really this is for the best 751 00:47:25,120 --> 00:47:27,880 Speaker 1: because it will help save lives and reduce the chance 752 00:47:27,920 --> 00:47:31,719 Speaker 1: for injury or illness for a lot of people. But 753 00:47:31,960 --> 00:47:34,879 Speaker 1: for other jobs, the robots aren't likely to take over 754 00:47:34,960 --> 00:47:38,360 Speaker 1: in the near future. For a lot of jobs, automated systems, 755 00:47:38,719 --> 00:47:44,640 Speaker 1: not necessarily robots, but perhaps software-based AI, will augment 756 00:47:45,000 --> 00:47:49,200 Speaker 1: what humans are doing. It's important that we have conversations about 757 00:47:49,200 --> 00:47:52,040 Speaker 1: this stuff and talk about how to address the 758 00:47:52,080 --> 00:47:56,560 Speaker 1: consequences of increased automation. There are ways we can enjoy 759 00:47:56,560 --> 00:47:59,960 Speaker 1: the benefits of automation, but only if we think critically 760 00:48:00,400 --> 00:48:05,680 Speaker 1: about it and create policies and procedures accordingly. Now I 761 00:48:05,719 --> 00:48:09,400 Speaker 1: gotta get going. I hear robo Jonathan is going to 762 00:48:09,400 --> 00:48:11,960 Speaker 1: host the next episode of Tech Stuff, and I have 763 00:48:12,040 --> 00:48:14,000 Speaker 1: to train them on how to make puns and pop 764 00:48:14,040 --> 00:48:19,320 Speaker 1: culture references. I hope you liked that episode from twenty twenty.
765 00:48:19,560 --> 00:48:21,520 Speaker 1: As I said, we've got a lot more to talk 766 00:48:21,520 --> 00:48:24,239 Speaker 1: about these days because of things like generative AI and 767 00:48:24,280 --> 00:48:27,239 Speaker 1: the concern that that could impact jobs that for a 768 00:48:27,280 --> 00:48:31,440 Speaker 1: long time people assumed were safe from automation, especially from 769 00:48:31,480 --> 00:48:34,920 Speaker 1: things like robotics, you know, white collar jobs that people 770 00:48:35,000 --> 00:48:37,920 Speaker 1: just thought were kind of the domain of humans. And 771 00:48:37,960 --> 00:48:41,040 Speaker 1: now there's a real question as to whether or not 772 00:48:41,120 --> 00:48:44,080 Speaker 1: that's actually the case. So I think it's even more relevant. 773 00:48:44,200 --> 00:48:47,640 Speaker 1: And again, seeing the labor movement in the tech sector 774 00:48:47,760 --> 00:48:50,560 Speaker 1: in particular over the last few years tells us that 775 00:48:51,200 --> 00:48:54,759 Speaker 1: there are some very important issues still at the very heart 776 00:48:54,840 --> 00:48:59,440 Speaker 1: of technology and the way we do business that relate 777 00:48:59,520 --> 00:49:04,040 Speaker 1: back to the foundations of the labor movement here in America. 778 00:49:04,960 --> 00:49:07,440 Speaker 1: So I hope you enjoyed that episode. For those of 779 00:49:07,440 --> 00:49:09,480 Speaker 1: you in the United States, I hope you're having a 780 00:49:09,920 --> 00:49:14,040 Speaker 1: healthy and safe and fun Labor Day. For everyone else, 781 00:49:14,080 --> 00:49:16,160 Speaker 1: I hope you're having a great Monday, you know. I 782 00:49:16,200 --> 00:49:18,520 Speaker 1: hope your day is fantastic too, and I'll talk to 783 00:49:18,600 --> 00:49:28,800 Speaker 1: you again really soon. Tech Stuff is an iHeartRadio production.
784 00:49:29,080 --> 00:49:34,120 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 785 00:49:34,239 --> 00:49:36,240 Speaker 1: or wherever you listen to your favorite shows.