Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I am your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. You know, guys, I read a lot of tech news, and sometimes that ends up inspiring me to do an episode of TechStuff. That happened to me recently when I read this headline off of the website TechSpot: Sony factory assembles PS4 in thirty seconds, only four humans involved in the process. A PS4, in case you're not aware, is a PlayStation 4 video game console. So this factory can build a video game console from parts in half a minute, and only four human beings touch the dang thing in the process. Those four humans, by the way, are involved in the beginning and the end of the process. Two of them load motherboards onto the assembly line, and a motherboard is the primary circuit board for a computer system, and the other two human beings are at the end of the assembly line, and their job is to package the completed consoles.
Speaker 1: All the actual assembly work is done by robots. Now, you may be experiencing a couple of different responses to this information. I know I did. One of those was: wow, that's seriously impressive. The PS4, like many computer systems, has a lot of components, many of which attach to one another by wire or cable. So these robots have to be able to take these flexible components and join them at their proper anchor points with the appropriate amount of pressure and precision to make a good connection. Now, if any of you out there have ever built your own PC, you know that plugging cables in can get a little tricky depending on the layout of the motherboard and the various components. And if you're someone like me, you're likely putting stuff together only to realize that maybe you should have done some of that before you mounted them in a computer case, because now you just don't have the space to work in properly. So it's pretty darn impressive that robots can do this consistently and correctly at that level of speed. Another response I had was: it's kind of scary.
Speaker 1: I mean, typically you would have dozens of people employed on the assembly line to do this sort of work, but in this factory it's been stripped down to thirty-two robots and four human beings. The article on TechSpot points out that twenty-six of those thirty-two robots are just attaching flexible components together inside the console. Now, I have no idea how much these robots cost, but I'd wager that they are expensive enough to equal the salary of a standard human employee on the assembly line. However, you don't pay robots. You do have to spend money to maintain and repair them, but assuming whatever you're making is going to be around for a little while, they'll pay for themselves, because eventually you'll get to a point where the salaries you'd be paying for humans would be more than the purchase and maintenance cost of the robots. And the increase in efficiency means you can produce a whole lot more stuff in a given amount of time than you would with a human-centric assembly line, so you'll have more product to sell in a shorter amount of time.
Speaker 1: When you start crunching numbers, you discover your robotic assembly line can make more stuff at a lower cost over a given period of time, like, you know, over a couple of years, than what you would accomplish with human beings on that assembly line. And you don't have to worry about the robots taking a vacation. They don't take sick time, they don't even take the night off. They can work around the clock. They don't need health insurance, though I would guess that most companies insure the heck out of these things just in case one breaks down. But from a financial point of view, they make sense if you're building stuff at a large enough scale, stuff like video game consoles. For the PlayStation 4, it's a no-brainer, because that console has sold around one hundred ten million units so far. That's a number large enough that I can't even imagine what it would look like if you had all those consoles together in one place.
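The break-even logic here comes down to some quick arithmetic: a robot is a big one-time cost plus upkeep, while a worker is a recurring salary. Here's a minimal Python sketch of that comparison. Every dollar figure below is a made-up illustrative assumption, not a number from the episode:

```python
# Rough break-even comparison between one robot and one human
# assembly-line worker. All dollar figures are illustrative assumptions.
ROBOT_PRICE = 250_000   # one-time purchase cost
ROBOT_UPKEEP = 10_000   # maintenance and repair per year
HUMAN_COST = 45_000     # salary plus benefits per year

def cumulative_cost_robot(years: int) -> int:
    """Total cost of owning one robot for the given number of years."""
    return ROBOT_PRICE + ROBOT_UPKEEP * years

def cumulative_cost_human(years: int) -> int:
    """Total cost of employing one worker for the same period."""
    return HUMAN_COST * years

# Find the first year in which the robot becomes the cheaper option.
break_even = next(y for y in range(1, 100)
                  if cumulative_cost_robot(y) < cumulative_cost_human(y))
print(break_even)  # with these made-up figures, year 8
```

Change the assumed figures and the break-even year moves, but the shape of the argument (one-time cost amortized against recurring salaries) stays the same.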
Speaker 1: So if there's enough demand for you to sell a hundred ten million of whatever it is you want to sell, you need to have a way to make those as efficiently as possible, and that will help maximize your profits. And the more efficient the process, the more competitively you can price your product and still make a profit. But the idea of robots performing jobs far more effectively, consistently, and efficiently than humans raises a lot of questions, and these are not new questions either. They are questions like: if more factories rely on robots for production, particularly if those robots can be reprogrammed to produce new products once older ones go obsolete, what happens to the job market? What happens to the millions of people who work in manufacturing on assembly lines? Where will they go? What will this do to economies around the world? Lots of people have tried to answer these questions, sometimes giving drastically different answers. And we're going to take a look at the history and evolution of industrial robots in this episode and explore the ramifications of automated manufacturing.
Speaker 1: And this is where I dive into history. I've talked about the history of robots before, so I'll try to restrict my focus to industrial robots. And before I get into that, let's just address the fact that the use of machinery to increase efficiency has been a controversial subject since long before there ever was such a thing as a robot. Generally speaking, machines are meant to make work easier, or in some cases, make the work possible at all. To begin with, they are labor-saving devices, requiring humans to put forth less effort to get the same or better results. This applies to the simplest of machines, I mean stuff like levers or pulleys or an inclined plane, and it applies to very complex machines as well. Before the Industrial Revolution, most stuff, like textiles, was made by craftspeople out of their own homes. This was literally the cottage industry. Tradespeople would travel and become the lifeline for the cottage industry, supplying raw materials, buying finished products, and selling those products off at a profit elsewhere.
Speaker 1: Many tradespeople built a good deal of wealth working this way, and they had the means to look at alternatives to this decentralized cottage-industry approach. An idea began to form: if you brought craftspeople together in a centralized location, and if you simplified the process of production, you could make way more stuff, which in turn means you could sell way more stuff, which in turn means you could make way more money, and money makes the world go round. This thought process helped fuel a similar line of thinking: if you could design machines that could do a lot of the work that typically fell to skilled craftspeople, you wouldn't need the craftspeople at all. You could train anyone, even if that person had no experience with the process, just to work the machine. And while it might take years of dedication to go through the process of being an apprentice to learn a trade well enough so that you can actually make a living at it, with a machine you can skip right over that, as long as the machine's end product was good enough.
Speaker 1: It didn't have to be better than the stuff craftspeople were making. It just had to be good enough, and cheap enough and fast enough to produce. Then you could sell the finished product at a lower cost than what craftspeople would charge, because not as much time and effort went into making the thing. Now, I guess it's clunky to talk about this without using an example, so let's go with a poster child for the Industrial Revolution: weaving. The weaving trade is an ancient one, and it requires a good deal of skill to do it well by hand. In the late seventeen hundreds, a man named Edmund Cartwright patented a loom powered by a water wheel. The loom's operation was such that a person who had no training in weaving could operate the machine and produce finished textiles. Cartwright's design would be built upon by other inventors, who turned to steam power and other means to operate the power loom. Many cottage-industry weavers found themselves out of work. They could potentially go to work in the textile factories, as those were popping up all over the place, particularly in England, but the wages were low.
Speaker 1: As you can imagine, this didn't sit well with the weavers. There were protests, including some that incorporated violence and destruction. Ultimately, the factory process won out, and along with it came some really awful working conditions, including stuff like child labor and ridiculously low wages and dangerous working conditions. This led to more protests, including the type that would give us the word sabotage. And let's do a quick side note on that one, as it is the source of a little mythology, or misinformation. See, the apocryphal story goes that the word sabotage comes from the word sabot, which describes the wooden shoes worn by laborers, mainly Dutch laborers, but also laborers in France. And according to the story, these laborers wore those shoes and used them to great effect. In an effort to protest the conditions in factories, they would toss their wooden shoes into the machinery to break the various gears and literally grind production to a halt, as it were. But the story, while compelling, isn't really the truth. Sabotage does stem from the word sabot, but in French there is a verb, saboter.
Speaker 1: This verb means to make a loud noise with wooden shoes. Now, isn't it great that there's a verb for that? And it makes sense: wooden shoes would make a great deal of racket as people walked around. Heck, if a toddler wore wooden shoes, I think it would probably sound as though the world were shaking apart. I don't know how toddlers manage to sound like they weigh a hundred pounds, but they do it. And if you have a toddler, you know what I'm talking about. And in the culture of France, the idea of a clumsy, slow worker was often linked to someone who wore wooden shoes, because they're awkward to wear. Anyway, the reason saboter led to sabotage is because factory workers who were protesting their work conditions and wages would purposefully work more slowly and less efficiently in order to affect the overall output of a factory. It was related to a similar strategy that British laborers employed, and their version was called ca'canny. It was a saying from Scotland which essentially means "don't do so much, man."
Speaker 1: Now, I would argue this also feeds into a strategy that we see to this very day in certain government offices, where the idea is there's no need to do too much too quickly, as it doesn't result in increased compensation, and it also sets a really high bar of expectations. So why not just take it easy, and why not have a coffee break? Now, in the early twentieth century, people began to use the word sabotage to really refer to a purposeful approach to undermining the output of factories, and it had nothing to do with tossing wooden shoes into machinery, though it did also pertain to instances in which workers purposefully damaged equipment and tried to slow down production that way. While this isn't directly tied to the idea that machines themselves are displacing workers, it is related to the effect of moving towards a manufacturing-based economy and how that allowed for the exploitation of workers. The machines themselves aren't really at fault, but they facilitate the system of operations that leads to exploitation.
Speaker 1: Now, that's something that will be a theme in this episode, and we can't ignore the social aspect of what's going on here, or else we miss the whole point. But let's skip ahead. I've spoken about this before, but we get the word robot from a Czech author named Karel Čapek. He wrote a play called Rossum's Universal Robots, or R.U.R., in nineteen twenty. Čapek took an older word, robota, which means forced labor. In Europe, this concept was tied to that of the old system of serfdom, in which people would do work on behalf of a landowner. In return, those people would be allowed to live on part of that landowner's land. In R.U.R., factory owners devise a way to build laborers from raw materials. Now, in the play, they are indistinguishable from humans, other than that they have no inner desires.
Speaker 1: But in the course of the play, these laborers eventually take over all the jobs that humans previously held, and humans themselves become a threatened species as these laborers begin to understand the power that they hold by occupying all the positions of employment, including as soldiers in the military. And so with the introduction of the concept of the robot, we actually get the very first robotic uprising all the way back in nineteen twenty. See, I told you it was an old idea. It's important to remember that in the play, the robots are nearly identical to humans. They aren't mechanical the way our robots of today are, but the idea of creating machines that can do work without a will of their own is a part of robotics in general and industrial robotics in particular. When we come back, we'll talk about the earliest industrial robots and what they did, but first let's take a quick break.

Speaker 1: It's interesting to me that the tech world adopted the term robot when we think about the origins of that word. In Čapek's work, robots were sentient slaves.
Speaker 1: They could perform the work humans would otherwise do, but they lack the emotions that humans have, and the whole idea is that these devices could do our work for us without question or protest. They would, in theory, endure conditions that people wouldn't or couldn't. But in the play, they ultimately lead to the destruction of the human race, and potentially they become the new dominant species on the planet. Now, I say potentially because part of the play's plot involves the destruction of the formula that scientists used to produce the robots in the first place. That is an important plot point. The robots are not sure how to make more robots, so they might just die out. Now, it seems to me as though that's a pretty emotionally charged term to adopt for an entire discipline of technology, right? Robots. Especially if you are actually aware of that play, and by the way, I recommend people read it. It's a good play. But then, a lot of people are not aware of the origins of the word, or at least not beyond knowing that it came from a play in the nineteen twenties.
Speaker 1: So I guess for them it's just, you know, a word. A robot by any other name would smell as sweet, as it were. And we've definitely seen the themes of R.U.R. serving as an undercurrent for stuff that's happening in robotics in general. But let's move ahead. In nineteen fifty-four, an engineer named George Devol designed an industrial robot. He was nine years old when Čapek coined the term robot. He called his design the Programmed Article Transfer device, for which he received a US patent in nineteen sixty-one. This machine was a robotic arm, and it was capable of picking up something and then transferring it a short distance away, just within reach of the arm. The arm itself couldn't move; it was anchored in place. And, in fact, this is the important part: it would follow a pre-programmed series of instructions to do this. Devol's argument for his device was that up to this point, mechanical handling of objects fell into two broad categories.
Speaker 1: Either stuff got moved by humans, typically operating a powerful machine like a crane or a forklift, or stuff got moved by a device that operated under cam control. Now, manual control is self-explanatory, so let's talk about cams. A cam is a rotating component in machinery. Typically, a cam has some variation in its surface. So let's start with a wheel. Just imagine a wheel that is spinning on an axle. Well, you wouldn't typically have a perfectly smooth wheel as a cam. Part of that surface might be flat, or it might have dips in it, and when the cam rotates, these variations apply force to some other mechanical component that is held against the cam, and it causes that mechanical component to move in specific ways. A cam-operated system can work on its own, but it will always repeat the exact same motions. As long as everything is working, it'll just repeat those steps each time the cams complete a full rotation. You can't really adapt it to do anything else.
Speaker 1: The movements depend entirely on the cams themselves, so if you wanted it to do something else, you would first have to swap out the cams, and even then you would be under whatever the limitations of the device itself were, like, it wouldn't have full range of motion. Moreover, this level of specialization also means that it's typically really expensive to rely upon cam-based systems, so it was really only useful if the application had to do with mass manufacturing, or else you're looking at economic loss; the cost of the system was just too much. So Devol was proposing a machine that could be programmed to do operations, and this would let a programmer create different processes using the same machine. Or you could get a whole bunch of the same basic machine and program each one to do a particular job. Meanwhile, you'd free people up to work on other stuff in the manufacturing process, and you could take the most dangerous stuff and give it to the robots.
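The contrast between cam control and a programmable machine can be sketched in a few lines of code. This is a toy illustration, not the actual control scheme of Devol's Programmed Article Transfer device; all the function names and values here are invented for the example:

```python
# Toy contrast between cam control and a programmable arm.
# Illustrative sketch only; not based on any real robot's design.
import math

def cam_follower_height(angle_deg: float) -> float:
    """A cam's motion is fixed by its physical shape: the follower's
    position is purely a function of rotation angle. Changing the
    motion means machining and swapping in a different cam."""
    return 1.0 + 0.5 * math.sin(math.radians(angle_deg))

def run_programmed_arm(program: list[tuple[str, float]]) -> list[str]:
    """A programmable machine replays a stored instruction list instead.
    Re-tasking the same hardware just means loading a new program."""
    log = []
    for action, value in program:
        log.append(f"{action} -> {value}")
    return log

# The same machine, two different jobs, no new hardware:
pick_and_place = [("move_to", 10.0), ("grip", 1.0), ("move_to", 42.0), ("release", 0.0)]
stack_parts = [("move_to", 5.0), ("grip", 1.0), ("move_to", 5.0), ("release", 0.0)]
print(run_programmed_arm(pick_and_place))
```

The point of the sketch: with the cam, the motion is baked into hardware; with the program, swapping `pick_and_place` for `stack_parts` changes the job without touching the machine.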
Speaker 1: Now, the story goes that Devol was at a party in nineteen fifty-six, and he got into a conversation with a man named Joseph Engelberger. Joseph was a scientist and an entrepreneur, and the subject turned to Devol's Programmed Article Transfer device, as well as the work of a science fiction author known as Isaac Asimov, you know, the father of robotics. He famously incorporated the concept of the laws of robotics in his works. We won't really go into that in this episode; the laws of robotics still play a big part in the discipline of robotics in general, but it's kind of outside the focus of this episode. Engelberger used his connections to get funding for Devol to create a more advanced version of the Programmed Article Transfer machine, and it would be a robotic arm capable of making repeated, precise movements while holding very heavy objects. They called it the Unimate, U-N-I-M-A-T-E, and the first prototype, Unimate 001, would go to General Motors to work on a die-casting assembly line. Now, according to the company RobotWorx, that's spelled with a W-O-R-X.
This robot cost around sixty five thousand dollars 330 00:21:13,000 --> 00:21:16,760 Speaker 1: to produce, and Engelberger sold it off at a tremendous loss. 331 00:21:17,280 --> 00:21:21,480 Speaker 1: General Motors only paid eighteen thousand dollars for a sixty five 332 00:21:21,480 --> 00:21:25,119 Speaker 1: thousand dollar machine. But Engelberger really wanted to establish that 333 00:21:25,240 --> 00:21:28,840 Speaker 1: robotics was a way to perform repetitive, dangerous functions at 334 00:21:28,880 --> 00:21:33,000 Speaker 1: a lower risk to humans. Welding die cast components on 335 00:21:33,160 --> 00:21:37,200 Speaker 1: auto bodies was a great first application of industrial robots 336 00:21:37,240 --> 00:21:42,000 Speaker 1: for a few reasons. Die casting is a process involving 337 00:21:42,160 --> 00:21:45,920 Speaker 1: molten metal. You take that molten metal and you force 338 00:21:46,000 --> 00:21:49,399 Speaker 1: it into steel molds, and these are what are called dies. 339 00:21:50,160 --> 00:21:53,320 Speaker 1: The molten metal cools in the exact shape of the mold. 340 00:21:53,840 --> 00:21:56,919 Speaker 1: So this is a way to make or cast a 341 00:21:56,960 --> 00:22:00,000 Speaker 1: bunch of identical parts out of metal and get consistent 342 00:22:00,040 --> 00:22:04,240 Speaker 1: quality out of it rather than, you know, forging 343 00:22:04,400 --> 00:22:07,720 Speaker 1: each piece and then fitting them together. A die can 344 00:22:07,760 --> 00:22:11,560 Speaker 1: have complex shapes in it, such as external threads, which 345 00:22:11,560 --> 00:22:14,600 Speaker 1: means you don't have to make a pipe, for example, 346 00:22:14,640 --> 00:22:18,159 Speaker 1: and then do a secondary process on that pipe to 347 00:22:18,240 --> 00:22:19,919 Speaker 1: get the result you want. So you wouldn't have to 348 00:22:20,040 --> 00:22:24,360 Speaker 1: carve those threads into an otherwise smooth pipe.
You could 349 00:22:24,440 --> 00:22:28,920 Speaker 1: just cast the pipe with the threads incorporated on it already. 350 00:22:29,359 --> 00:22:33,639 Speaker 1: But welding die cast parts onto auto bodies is hard work. 351 00:22:33,880 --> 00:22:36,919 Speaker 1: The components are really heavy, so you're at risk of 352 00:22:37,160 --> 00:22:40,399 Speaker 1: immediate injury if something goes wrong, like let's say you 353 00:22:40,520 --> 00:22:43,840 Speaker 1: drop a weighty component on your foot, or you might 354 00:22:44,040 --> 00:22:47,840 Speaker 1: develop a repetitive stress injury after going through the same 355 00:22:47,960 --> 00:22:52,320 Speaker 1: welding motions over and over again. In addition, the fumes 356 00:22:52,320 --> 00:22:56,200 Speaker 1: given off while welding were sometimes toxic, still are, so 357 00:22:56,520 --> 00:22:58,600 Speaker 1: it's not great to have people exposed to them for 358 00:22:58,680 --> 00:23:01,639 Speaker 1: very long. So a robot was a great substitute for 359 00:23:01,720 --> 00:23:04,360 Speaker 1: a person. The robot could handle much greater weight than 360 00:23:04,400 --> 00:23:07,760 Speaker 1: people could. The robot didn't breathe, so there was no 361 00:23:08,000 --> 00:23:11,440 Speaker 1: respiratory issue there, and it didn't get tired. I mean 362 00:23:11,480 --> 00:23:14,040 Speaker 1: it would wear down over time, but you could repair 363 00:23:14,080 --> 00:23:18,080 Speaker 1: it in fairly short order. The Unimate worked with computer 364 00:23:18,119 --> 00:23:23,040 Speaker 1: controlled hydraulic systems. A hydraulic system uses a liquid that's under 365 00:23:23,080 --> 00:23:26,639 Speaker 1: pressure in order to do work, like pushing against a 366 00:23:26,680 --> 00:23:29,680 Speaker 1: piston to power an actuator of some sort, like lifting 367 00:23:29,720 --> 00:23:34,439 Speaker 1: a platform.
The Unimate zero zero one weighed twenty seven 368 00:23:34,600 --> 00:23:39,399 Speaker 1: hundred pounds or about one thousand two hundred kilograms, and it 369 00:23:39,440 --> 00:23:42,200 Speaker 1: could work twenty four hours a day, placing components with 370 00:23:42,240 --> 00:23:47,399 Speaker 1: a precision of within one fiftieth of an inch. Now, 371 00:23:47,440 --> 00:23:49,280 Speaker 1: I'm not going to do the conversion on that, because 372 00:23:49,280 --> 00:23:51,639 Speaker 1: I think it's sufficient to say that it was just 373 00:23:52,400 --> 00:23:57,439 Speaker 1: really precise. According to a charmingly dated newsreel from Britain, 374 00:23:57,600 --> 00:24:01,040 Speaker 1: complete with swinging sixties music that sounded like it came 375 00:24:01,119 --> 00:24:04,439 Speaker 1: straight out of an Austin Powers movie, the robot could operate 376 00:24:04,520 --> 00:24:07,000 Speaker 1: for five hours without the need for a human to 377 00:24:07,080 --> 00:24:11,320 Speaker 1: check in on it. Engelberger, a savvy businessman and promoter, 378 00:24:11,840 --> 00:24:15,720 Speaker 1: would arrange for Unimate to show what it could do 379 00:24:15,960 --> 00:24:20,320 Speaker 1: at trade shows and on TV appearances, including one on 380 00:24:20,400 --> 00:24:23,679 Speaker 1: The Tonight Show with Johnny Carson. If you don't know 381 00:24:23,720 --> 00:24:26,879 Speaker 1: who that is, ask your parents, and if they don't know, 382 00:24:28,280 --> 00:24:32,720 Speaker 1: ask your grandparents. By nineteen sixty nine, General Motors had 383 00:24:32,800 --> 00:24:36,600 Speaker 1: jumped on board the robot train, as it were. They 384 00:24:36,760 --> 00:24:41,240 Speaker 1: rebuilt a manufacturing plant in Lordstown, Ohio, and they installed 385 00:24:41,320 --> 00:24:45,280 Speaker 1: Unimate robots to perform spot welding on car bodies, and 386 00:24:45,320 --> 00:24:48,880 Speaker 1: the results spoke for themselves.
The plant was capable of 387 00:24:48,920 --> 00:24:52,920 Speaker 1: producing one hundred ten cars per hour, which was more 388 00:24:53,000 --> 00:24:56,400 Speaker 1: than double the speed that the plant could manage before 389 00:24:56,600 --> 00:24:59,840 Speaker 1: the installation of the robots. The business case for the 390 00:25:00,040 --> 00:25:04,320 Speaker 1: robots seemed clear. After a hefty upfront cost, you could 391 00:25:04,359 --> 00:25:07,639 Speaker 1: produce way more stuff per day, and as long as 392 00:25:07,680 --> 00:25:11,040 Speaker 1: the demand for that stuff is high enough, it could 393 00:25:11,119 --> 00:25:13,800 Speaker 1: mean greater revenue. You could also bring the cost of 394 00:25:13,880 --> 00:25:17,359 Speaker 1: production for an individual unit down. Then you could pass 395 00:25:17,440 --> 00:25:20,360 Speaker 1: savings on to customers and get really competitive with your pricing, 396 00:25:20,960 --> 00:25:23,280 Speaker 1: or you could just keep everything priced the same and 397 00:25:23,359 --> 00:25:26,800 Speaker 1: try to increase your profit margin. The key to all 398 00:25:26,880 --> 00:25:29,440 Speaker 1: this was that you had to be sure the thing 399 00:25:29,480 --> 00:25:32,600 Speaker 1: you were producing would bring in enough money to offset 400 00:25:32,600 --> 00:25:35,840 Speaker 1: the cost of automation. So it would not make sense 401 00:25:35,920 --> 00:25:39,280 Speaker 1: to spend millions of dollars building out a factory staffed 402 00:25:39,280 --> 00:25:41,879 Speaker 1: with robots if you were making something that had a 403 00:25:42,000 --> 00:25:45,280 Speaker 1: very small market to begin with. Yes, you'd be able 404 00:25:45,280 --> 00:25:48,159 Speaker 1: to produce way more whatchamacallits than you 405 00:25:48,160 --> 00:25:51,359 Speaker 1: could before.
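That offset calculation is really just break-even arithmetic, and it can be sketched in a few lines. The dollar figures below are hypothetical, chosen only to illustrate the idea, not numbers from Sony or General Motors:

```python
def breakeven_units(automation_cost, unit_cost_manual, unit_cost_automated):
    """How many units the line must produce before the per-unit
    savings from automation repay the upfront cost of the robots."""
    savings_per_unit = unit_cost_manual - unit_cost_automated
    if savings_per_unit <= 0:
        raise ValueError("automation must lower the per-unit cost to ever pay off")
    # Ceiling division: a fraction of a unit doesn't count.
    return -(-automation_cost // savings_per_unit)

# A hypothetical $65,000 robot install that shaves $5 off each unit built:
print(breakeven_units(65_000, 20, 15))  # 13000 units to break even
```

If the whole addressable market is smaller than that break-even count, the robots never pay for themselves, which is exactly the niche-market problem described here.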
But if the demand for whatchamacallits 406 00:25:51,440 --> 00:25:54,680 Speaker 1: is really modest, that doesn't do you any good. 407 00:25:55,160 --> 00:25:57,719 Speaker 1: In fact, you might end up flooding the market and 408 00:25:57,800 --> 00:26:01,800 Speaker 1: devaluing your product. So while robots were taking on jobs 409 00:26:01,840 --> 00:26:04,440 Speaker 1: that were previously held by humans, there was no real 410 00:26:04,560 --> 00:26:08,159 Speaker 1: danger of a massive upheaval where everything would be automated. 411 00:26:08,200 --> 00:26:11,399 Speaker 1: The limitations in the technology were just too great and 412 00:26:11,440 --> 00:26:14,000 Speaker 1: the cost was too high for most companies to go 413 00:26:14,080 --> 00:26:17,439 Speaker 1: that route. And this also became the starting point for 414 00:26:17,520 --> 00:26:21,440 Speaker 1: something that would become really important: that the main goal 415 00:26:21,560 --> 00:26:26,280 Speaker 1: of developing industrial robots wasn't to displace humans. It was 416 00:26:26,320 --> 00:26:30,959 Speaker 1: meant to offload duties that were dull, dirty, or dangerous. 417 00:26:31,040 --> 00:26:34,560 Speaker 1: You'll often hear those terms being used with robotics. If 418 00:26:34,600 --> 00:26:37,080 Speaker 1: it is a job that carries with it a significant 419 00:26:37,160 --> 00:26:39,920 Speaker 1: risk to the person performing it, or a job so 420 00:26:40,040 --> 00:26:42,440 Speaker 1: demanding that you can only expect a person to stick 421 00:26:42,480 --> 00:26:44,520 Speaker 1: with it for a short while before they need to 422 00:26:44,520 --> 00:26:48,119 Speaker 1: do something else, then building a robot to do that job, 423 00:26:48,240 --> 00:26:51,520 Speaker 1: or at least that list of tasks, makes sense. The 424 00:26:51,640 --> 00:26:55,240 Speaker 1: robot is just a thing.
It can endure conditions that 425 00:26:55,320 --> 00:26:58,119 Speaker 1: humans can't, and it doesn't get sick, and it doesn't 426 00:26:58,119 --> 00:27:02,000 Speaker 1: get hurt. If something breaks down, you can typically repair 427 00:27:02,040 --> 00:27:06,199 Speaker 1: it pretty quickly. We humans don't have that luxury. Now, 428 00:27:06,280 --> 00:27:08,840 Speaker 1: I'm not going to go and run down a full 429 00:27:08,960 --> 00:27:12,639 Speaker 1: history of all industrial robots because that would mostly involve 430 00:27:12,680 --> 00:27:16,439 Speaker 1: me talking about model numbers with slight differences, like the 431 00:27:16,520 --> 00:27:19,840 Speaker 1: number of axes of movement or points of articulation for 432 00:27:19,880 --> 00:27:22,600 Speaker 1: one robot versus another, and that's not really interesting. But 433 00:27:22,600 --> 00:27:25,840 Speaker 1: I do want to hit a couple of highlights. One 434 00:27:26,080 --> 00:27:30,480 Speaker 1: is that in nineteen seventy four, the ASEA I 435 00:27:30,880 --> 00:27:36,400 Speaker 1: RB robot would be the first fully electrically driven robot. 436 00:27:36,960 --> 00:27:41,560 Speaker 1: It also used Intel's first chipset as its processors. Now, 437 00:27:41,600 --> 00:27:45,720 Speaker 1: this was not a super strong robot because those electrically 438 00:27:45,840 --> 00:27:49,000 Speaker 1: driven limbs just can't pack the same punch as a 439 00:27:49,080 --> 00:27:52,720 Speaker 1: hydraulic system, which typically moves much more slowly but can 440 00:27:52,840 --> 00:27:57,440 Speaker 1: handle much heavier payloads. So this particular robot could only 441 00:27:57,560 --> 00:28:01,800 Speaker 1: lift weights up to around thirteen pounds or six kilograms.
442 00:28:02,480 --> 00:28:06,720 Speaker 1: But the move toward processors and electrically driven components marked 443 00:28:06,720 --> 00:28:10,480 Speaker 1: a big technological step, even if the arm's physical capabilities 444 00:28:10,760 --> 00:28:15,560 Speaker 1: were much less impressive than a hydraulic system. By the 445 00:28:15,680 --> 00:28:19,040 Speaker 1: end of the nineteen seventies, Japan was getting into the 446 00:28:19,119 --> 00:28:23,720 Speaker 1: robotics game with arc welding robots for assembly lines, and 447 00:28:23,760 --> 00:28:27,280 Speaker 1: then it was off to the robotic races, with the 448 00:28:27,320 --> 00:28:32,480 Speaker 1: eighties seeing a surge in advances with industrial robots. Soon, 449 00:28:32,760 --> 00:28:37,640 Speaker 1: massive manufacturing facilities were installing robots to take over elements 450 00:28:37,640 --> 00:28:41,480 Speaker 1: of the assembly line process, particularly in that dirty, dull, 451 00:28:41,560 --> 00:28:46,920 Speaker 1: and dangerous category. The robots became more sophisticated, which also 452 00:28:47,000 --> 00:28:49,880 Speaker 1: added to their value. When we come back, I'll talk 453 00:28:49,960 --> 00:28:53,560 Speaker 1: more about why that's important, but first let's take another 454 00:28:53,680 --> 00:29:05,120 Speaker 1: quick break. By the mid nineteen nineties, robotics companies were 455 00:29:05,160 --> 00:29:08,600 Speaker 1: making machines that could coordinate and synchronize the movements of 456 00:29:08,640 --> 00:29:11,600 Speaker 1: more than one robot at the same time, allowing for 457 00:29:11,680 --> 00:29:16,200 Speaker 1: more complex manufacturing processes. By the early two thousands, there 458 00:29:16,200 --> 00:29:19,040 Speaker 1: were systems that could synchronize the actions of up to 459 00:29:19,160 --> 00:29:22,600 Speaker 1: four robots at a time, further adding to the overall 460 00:29:22,720 --> 00:29:27,720 Speaker 1: system flexibility.
Now I mentioned earlier that a programmable robot 461 00:29:27,960 --> 00:29:31,680 Speaker 1: is more versatile than something like a cam operated system. Well, 462 00:29:32,280 --> 00:29:36,640 Speaker 1: more sophisticated robots with more axes of motion and more 463 00:29:36,680 --> 00:29:41,160 Speaker 1: points of articulation have the potential to do lots of 464 00:29:41,200 --> 00:29:44,960 Speaker 1: different types of jobs, and this is of critical importance. 465 00:29:45,120 --> 00:29:47,440 Speaker 1: If the robot is too limited, if it can only 466 00:29:47,520 --> 00:29:53,080 Speaker 1: do a small range of motions, you can't necessarily repurpose 467 00:29:53,120 --> 00:29:56,760 Speaker 1: it for new processes. And as markets change, you may 468 00:29:56,760 --> 00:29:59,280 Speaker 1: find yourself needing to be flexible when it comes to 469 00:29:59,360 --> 00:30:03,800 Speaker 1: the stuff you're manufacturing. So let's use an extreme hypothetical 470 00:30:03,840 --> 00:30:07,960 Speaker 1: example that would probably never happen. So let's say that 471 00:30:08,040 --> 00:30:12,360 Speaker 1: you run an auto manufacturing facility, but then there's a 472 00:30:12,400 --> 00:30:16,440 Speaker 1: massive market change and it drastically affects the demand for 473 00:30:16,560 --> 00:30:20,960 Speaker 1: your cars. There's just not enough demand to support the production. 474 00:30:21,520 --> 00:30:25,040 Speaker 1: So rather than just, you know, closing up shop and 475 00:30:25,120 --> 00:30:28,000 Speaker 1: calling it a day, your business decides to do an 476 00:30:28,080 --> 00:30:31,840 Speaker 1: amazing pivot and you begin to convert your manufacturing facility 477 00:30:31,880 --> 00:30:36,680 Speaker 1: over to, I don't know, home appliances. Now, again, this 478 00:30:36,720 --> 00:30:40,960 Speaker 1: is an extreme hypothetical example, but let's just go with it. Okay, 479 00:30:41,080 --> 00:30:44,000 Speaker 1: So here we go.
If the robots on your assembly 480 00:30:44,040 --> 00:30:48,240 Speaker 1: line are powerful but limited in movement and function, you 481 00:30:48,280 --> 00:30:51,000 Speaker 1: may find it impossible to adapt them to your new 482 00:30:51,040 --> 00:30:53,680 Speaker 1: line of business, which would mean you need to either 483 00:30:53,840 --> 00:30:57,479 Speaker 1: invest in new robots, or you'd have to hire human 484 00:30:57,720 --> 00:31:01,560 Speaker 1: workers to put together your appliances, and it would also 485 00:31:01,600 --> 00:31:04,120 Speaker 1: mean that your old robots would be a sunk cost. 486 00:31:04,160 --> 00:31:06,280 Speaker 1: You would need to either sell them off or put 487 00:31:06,320 --> 00:31:10,600 Speaker 1: them in storage or something. If the robots are really sophisticated, however, 488 00:31:10,720 --> 00:31:13,000 Speaker 1: you might be able to program them to do some 489 00:31:13,120 --> 00:31:16,440 Speaker 1: of the operations on the new assembly line, and that 490 00:31:16,440 --> 00:31:19,520 Speaker 1: would keep them useful and lower the cost of production. 491 00:31:20,240 --> 00:31:24,240 Speaker 1: Or for a less extreme example, you introduce a new 492 00:31:24,320 --> 00:31:27,240 Speaker 1: model of whatever thing it is that you're producing. 493 00:31:27,600 --> 00:31:31,440 Speaker 1: Anything new will require adjustments in the assembly line process, 494 00:31:31,680 --> 00:31:34,000 Speaker 1: and if the changes are big enough, the robots may 495 00:31:34,000 --> 00:31:36,960 Speaker 1: not be able to make as big a contribution in 496 00:31:37,000 --> 00:31:41,040 Speaker 1: the process. That's something that could happen with the example 497 00:31:41,080 --> 00:31:43,880 Speaker 1: of the PlayStation we were talking about.
Yeah, those robots 498 00:31:43,920 --> 00:31:46,800 Speaker 1: can put together a PS four in thirty seconds, but there's 499 00:31:46,800 --> 00:31:48,920 Speaker 1: no guarantee they'll be able to do the same thing 500 00:31:48,960 --> 00:31:51,440 Speaker 1: with a PS five, at least not without a major 501 00:31:51,600 --> 00:31:56,200 Speaker 1: overhaul of their assembly line system. While the manufacturing facility 502 00:31:56,240 --> 00:31:58,640 Speaker 1: can churn out a finished PS four in thirty seconds, 503 00:31:59,400 --> 00:32:02,440 Speaker 1: we might not see them work at all with PS five, 504 00:32:02,480 --> 00:32:04,720 Speaker 1: at least not right away. It would all have to 505 00:32:04,760 --> 00:32:08,920 Speaker 1: be optimized. So for decades, industrial robots were kept as 506 00:32:09,120 --> 00:32:12,840 Speaker 1: separate from human workers as was possible. You wanted to 507 00:32:12,920 --> 00:32:16,400 Speaker 1: keep them well away from all the people, or keep 508 00:32:16,440 --> 00:32:18,560 Speaker 1: the people well away from all the robots. Often the 509 00:32:18,640 --> 00:32:24,000 Speaker 1: robots would operate within cages specifically to limit the possibility 510 00:32:24,040 --> 00:32:27,560 Speaker 1: of a human coming within range. After all, these robots 511 00:32:27,800 --> 00:32:32,080 Speaker 1: are large, they're heavy, they're powerful, and many of them 512 00:32:32,080 --> 00:32:36,920 Speaker 1: are incapable of sensing stuff in their environment, like 513 00:32:37,000 --> 00:32:39,960 Speaker 1: whether or not a human is within their range of motion. Instead, 514 00:32:39,960 --> 00:32:43,720 Speaker 1: they're just going through that pre programmed series of motions 515 00:32:44,240 --> 00:32:47,280 Speaker 1: and they're not going to stop unless someone turns them off.
516 00:32:47,680 --> 00:32:50,360 Speaker 1: A robot is performing that same series of steps over 517 00:32:50,360 --> 00:32:52,000 Speaker 1: and over, and that can mean that if a human 518 00:32:52,480 --> 00:32:56,479 Speaker 1: in that area gets near the robot, they could end 519 00:32:56,520 --> 00:32:58,960 Speaker 1: up getting injured or worse. And in fact, this has 520 00:32:59,000 --> 00:33:01,200 Speaker 1: happened several times over the course of the last 521 00:33:01,200 --> 00:33:04,600 Speaker 1: few decades, and at least in some cases it seems 522 00:33:04,600 --> 00:33:07,240 Speaker 1: as though the robot might have been at fault, meaning 523 00:33:07,280 --> 00:33:11,600 Speaker 1: it's not always a case of human carelessness. For example, 524 00:33:12,000 --> 00:33:15,160 Speaker 1: an engineer in twenty fifteen died when a robot arm 525 00:33:15,280 --> 00:33:19,200 Speaker 1: from one section of the factory floor moved beyond its 526 00:33:19,240 --> 00:33:24,280 Speaker 1: operating area and into the neighboring section that the engineer 527 00:33:24,360 --> 00:33:27,240 Speaker 1: was working in. This is something that should not have happened. 528 00:33:27,240 --> 00:33:30,800 Speaker 1: The robot arm should not have moved that far into 529 00:33:30,880 --> 00:33:34,360 Speaker 1: the neighboring section. The robot arm hit the engineer on 530 00:33:34,400 --> 00:33:38,240 Speaker 1: the head, and she later died from her injuries. In 531 00:33:38,280 --> 00:33:42,400 Speaker 1: the United States, the government has listed thirty three workplace 532 00:33:42,520 --> 00:33:45,960 Speaker 1: deaths due to accidents with industrial robots between the years 533 00:33:46,080 --> 00:33:51,440 Speaker 1: nineteen eighty four and two thousand fourteen. The investigations also found 534 00:33:51,520 --> 00:33:56,920 Speaker 1: that the majority of those tragedies were typically the fault 535 00:33:57,080 --> 00:33:59,880 Speaker 1: of human error.
There was a person who was wandering 536 00:34:00,120 --> 00:34:03,360 Speaker 1: into the operation zone of a robot. That two 537 00:34:03,400 --> 00:34:07,320 Speaker 1: thousand fifteen incident was an outlier. Not that any of 538 00:34:07,360 --> 00:34:10,279 Speaker 1: this makes the thought of working around industrial robots less 539 00:34:10,320 --> 00:34:14,160 Speaker 1: scary or those other accidents any less tragic. They're all 540 00:34:14,280 --> 00:34:18,800 Speaker 1: terribly tragic. Moreover, we're seeing more robots that are capable 541 00:34:18,960 --> 00:34:22,640 Speaker 1: of roaming a work space. They are no longer anchored 542 00:34:22,719 --> 00:34:25,080 Speaker 1: to a specific spot on the floor in some cases. 543 00:34:25,440 --> 00:34:31,240 Speaker 1: They also, unlike the first industrial robots, typically have external sensors. 544 00:34:31,280 --> 00:34:34,279 Speaker 1: These not only help the robots navigate their environments, but 545 00:34:34,360 --> 00:34:40,640 Speaker 1: also hopefully avoid accidents with human workers. Let's take Amazon's 546 00:34:40,920 --> 00:34:45,400 Speaker 1: warehouse robots for example. These robots look like really big 547 00:34:45,520 --> 00:34:50,200 Speaker 1: robotic vacuum cleaners. They are designed to roll under shelves, 548 00:34:50,239 --> 00:34:53,040 Speaker 1: and the shelves are just, you know, slightly larger than 549 00:34:53,080 --> 00:34:56,840 Speaker 1: the dimensions of the robot. And when an order comes in, 550 00:34:56,960 --> 00:34:59,840 Speaker 1: a robot from the warehouse rolls over to a shelf 551 00:35:00,040 --> 00:35:03,040 Speaker 1: that holds the respective item, according to the 552 00:35:03,080 --> 00:35:07,359 Speaker 1: inventory system, and the robot goes under the shelf, then 553 00:35:07,520 --> 00:35:11,359 Speaker 1: lifts the shelf by raising a platter like platform on 554 00:35:11,440 --> 00:35:13,000 Speaker 1: the top of the robot.
Think of it as like 555 00:35:13,040 --> 00:35:15,640 Speaker 1: a little forklift, except it's more like a, I don't know, 556 00:35:15,920 --> 00:35:18,360 Speaker 1: like a tray that a waiter would use to carry 557 00:35:18,440 --> 00:35:21,719 Speaker 1: drinks to a table. But it carries the whole shelf 558 00:35:21,840 --> 00:35:24,759 Speaker 1: up and over to the edge of a cage, where 559 00:35:24,760 --> 00:35:28,440 Speaker 1: a human operator will take the respective item off the 560 00:35:28,480 --> 00:35:31,600 Speaker 1: shelf and scan it and put it into a bin. 561 00:35:31,800 --> 00:35:34,400 Speaker 1: And then those bins go to other humans who further 562 00:35:34,560 --> 00:35:37,280 Speaker 1: scan those items and then put them into other bins 563 00:35:37,280 --> 00:35:40,160 Speaker 1: that ultimately go to the packing department. And if you 564 00:35:40,160 --> 00:35:43,040 Speaker 1: watch videos of these robots, it looks like they're doing 565 00:35:43,040 --> 00:35:47,840 Speaker 1: a complicated ballet as they maneuver through this warehouse, avoiding 566 00:35:47,840 --> 00:35:52,720 Speaker 1: other robots and shelves as they bring those shelves to humans. 567 00:35:53,600 --> 00:35:56,839 Speaker 1: Markings on the warehouse floor tell the robots where they 568 00:35:56,840 --> 00:35:59,680 Speaker 1: are with respect to everything else in the warehouse, and 569 00:35:59,719 --> 00:36:04,080 Speaker 1: the bots will even position shelves that have items that 570 00:36:04,120 --> 00:36:06,799 Speaker 1: are being ordered a lot toward the edges of this 571 00:36:07,000 --> 00:36:10,279 Speaker 1: space so that they're easier to get to and move 572 00:36:10,360 --> 00:36:12,960 Speaker 1: them over to the human beings. So it's kind of 573 00:36:12,960 --> 00:36:15,680 Speaker 1: an interesting dynamic system.
It's not like they pick up 574 00:36:15,719 --> 00:36:18,600 Speaker 1: the shelf and then bring the shelf immediately right back 575 00:36:18,680 --> 00:36:21,040 Speaker 1: to where it started. The shelf can end up in 576 00:36:21,080 --> 00:36:26,200 Speaker 1: a different location entirely. In addition, cameras give the robots 577 00:36:26,239 --> 00:36:29,919 Speaker 1: the ability to sense any obstacles that might block their path, 578 00:36:30,080 --> 00:36:32,600 Speaker 1: allowing the robot to come to a stop, wait 579 00:36:32,640 --> 00:36:37,799 Speaker 1: for further instructions, and report that it has found something unusual 580 00:36:38,360 --> 00:36:42,040 Speaker 1: on the warehouse floor. Even so, typically humans are not 581 00:36:42,160 --> 00:36:45,760 Speaker 1: allowed to roam the area where the robots pick up shelves. 582 00:36:46,239 --> 00:36:49,120 Speaker 1: If something has fallen on the warehouse floor, a designated 583 00:36:49,120 --> 00:36:53,040 Speaker 1: troubleshooter gets an alert, and that person must use an 584 00:36:53,040 --> 00:36:55,759 Speaker 1: interface to draw the path that they are going to 585 00:36:55,840 --> 00:36:59,719 Speaker 1: take from the entrance of the cage all the way 586 00:36:59,800 --> 00:37:03,080 Speaker 1: to the point of trouble. Like, let's say that 587 00:37:03,640 --> 00:37:05,600 Speaker 1: a product has fallen off of a shelf and has 588 00:37:05,640 --> 00:37:08,160 Speaker 1: hit the floor, and a robot has reported it. If 589 00:37:08,160 --> 00:37:10,560 Speaker 1: you're the troubleshooter, you'd use 590 00:37:10,600 --> 00:37:12,880 Speaker 1: a tablet and you would draw, almost like a maze, 591 00:37:13,200 --> 00:37:17,400 Speaker 1: the path you would take to get to that particular item, 592 00:37:17,560 --> 00:37:21,680 Speaker 1: and you would follow that path out and back.
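The core idea behind that drawn path can be sketched as a toy routing problem: treat the floor as a grid, reserve the cells the troubleshooter has drawn through, and have the robots route around them. Everything here, the grid size, the coordinates, the function name, is invented purely for illustration; Amazon's actual system is far more involved:

```python
from collections import deque

def safe_route(width, height, start, goal, reserved):
    """Breadth-first search on a grid, treating the cells of the
    troubleshooter's drawn path (`reserved`) as off-limits to robots."""
    if start in reserved:
        return None
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (x, y), path = frontier.popleft()
        if (x, y) == goal:
            return path
        for cell in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            cx, cy = cell
            if (0 <= cx < width and 0 <= cy < height
                    and cell not in reserved and cell not in seen):
                seen.add(cell)
                frontier.append((cell, path + [cell]))
    return None  # no safe route: the robot stops and waits

# The drawn path reserves a partial wall of cells; robots route around it.
drawn_path = {(2, y) for y in range(4)}
print(safe_route(5, 5, (0, 0), (4, 0), drawn_path))
```

If the reserved cells cut the grid in two, the search returns `None`, which corresponds to a robot halting and waiting rather than crossing the human's path.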
In addition, 593 00:37:22,200 --> 00:37:25,480 Speaker 1: you'd wear a radio transmitter that would send a signal 594 00:37:25,560 --> 00:37:28,040 Speaker 1: out that the robots could all detect, and that would 595 00:37:28,040 --> 00:37:31,319 Speaker 1: alert the robots to the presence of you, the troubleshooter. 596 00:37:32,000 --> 00:37:34,560 Speaker 1: That helps prevent a situation in which the robots are 597 00:37:34,600 --> 00:37:39,359 Speaker 1: going to collide with you, right, you want to avoid that. Now, 598 00:37:39,360 --> 00:37:42,000 Speaker 1: there's a lot of work that goes into designing robots 599 00:37:42,080 --> 00:37:45,400 Speaker 1: that can interoperate in a space that's occupied by humans, 600 00:37:45,840 --> 00:37:48,560 Speaker 1: and it's a very challenging line of technology because it 601 00:37:48,560 --> 00:37:51,600 Speaker 1: takes more than just thinking about how the machines work. 602 00:37:51,920 --> 00:37:55,560 Speaker 1: You also have to think about how people work, and moreover, 603 00:37:56,000 --> 00:37:58,640 Speaker 1: you have to think about how people change the way 604 00:37:58,680 --> 00:38:01,000 Speaker 1: they work when they're in the company of a robot. 605 00:38:01,280 --> 00:38:06,160 Speaker 1: It's kind of similar to the concept in quantum theory, right, 606 00:38:06,200 --> 00:38:09,239 Speaker 1: the idea that you change a thing you observe just 607 00:38:09,320 --> 00:38:13,880 Speaker 1: through the act of observation. 
Well, you can have a 608 00:38:14,080 --> 00:38:17,600 Speaker 1: workspace that only humans had been working in for a while, 609 00:38:18,480 --> 00:38:20,160 Speaker 1: and you could say, all right, well, I have observed 610 00:38:20,160 --> 00:38:21,719 Speaker 1: how the humans work, and I'm going to build a 611 00:38:21,800 --> 00:38:24,480 Speaker 1: robot that does this one task that the humans do, 612 00:38:25,400 --> 00:38:28,680 Speaker 1: and I'm just gonna incorporate it right into their workspace. But 613 00:38:28,760 --> 00:38:30,920 Speaker 1: then you find out that when you do that, the 614 00:38:31,000 --> 00:38:33,480 Speaker 1: humans all behave in a new way because there's a 615 00:38:33,520 --> 00:38:36,680 Speaker 1: new thing in the environment that you didn't account for, 616 00:38:37,120 --> 00:38:39,880 Speaker 1: and now the design of your robot doesn't work as well. 617 00:38:40,440 --> 00:38:44,640 Speaker 1: We humans are tricky like that. Moreover, we need to 618 00:38:44,640 --> 00:38:47,919 Speaker 1: get to that threat that weavers were worried about more 619 00:38:47,920 --> 00:38:52,879 Speaker 1: than a century ago. Is automation going to take our jobs? Now, 620 00:38:52,880 --> 00:38:56,920 Speaker 1: there have been a few studies, all using different methodologies, 621 00:38:57,239 --> 00:38:59,920 Speaker 1: with some of those studies coming under criticism for the 622 00:39:00,040 --> 00:39:03,279 Speaker 1: approaches that were used. But there have been a few 623 00:39:03,320 --> 00:39:06,840 Speaker 1: studies that suggest we'll see automation continue to impact jobs 624 00:39:06,880 --> 00:39:10,640 Speaker 1: in the near future and drastically so over the course 625 00:39:10,680 --> 00:39:14,160 Speaker 1: of the long run.
The interpretations of those results have 626 00:39:14,280 --> 00:39:18,399 Speaker 1: been reported in ways that range from automation is going 627 00:39:18,400 --> 00:39:22,239 Speaker 1: to be disruptive, that's on the light end, to claims that 628 00:39:22,280 --> 00:39:25,360 Speaker 1: huge shares of all jobs are going to be taken by the robots. 629 00:39:25,400 --> 00:39:28,600 Speaker 1: So what's the actual truth? Well, the truth, as it 630 00:39:28,640 --> 00:39:34,000 Speaker 1: turns out, is complicated. For one thing, automation rarely takes 631 00:39:34,040 --> 00:39:37,600 Speaker 1: over an entire job. What is far more likely to 632 00:39:37,680 --> 00:39:42,160 Speaker 1: happen is that automation will take over certain tasks that 633 00:39:42,320 --> 00:39:46,239 Speaker 1: are part of a job, or perhaps multiple jobs. So 634 00:39:46,280 --> 00:39:49,799 Speaker 1: if a job requires a wide variety of tasks, some 635 00:39:49,960 --> 00:39:53,640 Speaker 1: of which may require critical thinking, it's really hard to 636 00:39:53,680 --> 00:39:56,879 Speaker 1: design a robot that can do all of that. It's 637 00:39:56,920 --> 00:40:00,760 Speaker 1: far more likely that you would automate certain job responsibilities, 638 00:40:01,120 --> 00:40:05,000 Speaker 1: which would mean that those jobs themselves wouldn't go away, 639 00:40:05,040 --> 00:40:10,480 Speaker 1: they would just change. The repetitive responsibilities would be offloaded 640 00:40:10,560 --> 00:40:13,120 Speaker 1: and you would focus on something else. You might have 641 00:40:13,160 --> 00:40:15,840 Speaker 1: to spend more time doing other duties rather than the 642 00:40:15,920 --> 00:40:20,399 Speaker 1: routine ones, which isn't necessarily a bad thing. But there 643 00:40:20,400 --> 00:40:25,160 Speaker 1: are cases where automation would likely take over an entire job. 644 00:40:25,560 --> 00:40:30,439 Speaker 1: For example, truck drivers, you know, in shipping trucks.
Much 645 00:40:30,440 --> 00:40:33,880 Speaker 1: of the work in autonomous vehicles is really focusing not 646 00:40:34,000 --> 00:40:39,839 Speaker 1: necessarily on replacing passenger vehicles so much as commercial vehicles 647 00:40:39,880 --> 00:40:44,640 Speaker 1: like shipping trucks. The Bureau of Labor Statistics in the 648 00:40:44,680 --> 00:40:47,920 Speaker 1: United States estimated that the average age of a US 649 00:40:48,000 --> 00:40:52,480 Speaker 1: truck driver is fifty five, and the vast majority of all 650 00:40:52,520 --> 00:40:56,160 Speaker 1: truck drivers in the US are male, and that will 651 00:40:56,200 --> 00:41:00,360 Speaker 1: present a challenge. See, generally, the pro argument for 652 00:41:00,400 --> 00:41:05,160 Speaker 1: automation is that while robots and automated systems will eliminate 653 00:41:05,320 --> 00:41:10,760 Speaker 1: some jobs, they will create other jobs, presumably better jobs. 654 00:41:11,520 --> 00:41:15,479 Speaker 1: And this is true. At the turn of the twentieth century, 655 00:41:16,080 --> 00:41:19,759 Speaker 1: forty percent of all jobs in the United States were on farms, 656 00:41:19,840 --> 00:41:22,600 Speaker 1: so that means four out of ten people in the 657 00:41:22,719 --> 00:41:26,920 Speaker 1: US who had a job were working on a farm. Today, 658 00:41:27,560 --> 00:41:31,600 Speaker 1: agriculture and all the related food sectors make up just 659 00:41:31,840 --> 00:41:35,040 Speaker 1: eleven percent of all jobs in the United States. And if 660 00:41:35,080 --> 00:41:37,160 Speaker 1: we just limit this to the people who are working 661 00:41:37,160 --> 00:41:40,840 Speaker 1: on farms, you know, not all agricultural jobs and food 662 00:41:40,840 --> 00:41:43,759 Speaker 1: sector jobs, just the farm jobs. If we do that, 663 00:41:44,080 --> 00:41:48,120 Speaker 1: we're talking about only one point three percent of all 664 00:41:48,320 --> 00:41:52,600 Speaker 1: US employment.
So going from forty percent to one point three percent, 665 00:41:52,760 --> 00:41:57,680 Speaker 1: that's a drastic change. Now, clearly automation has transformed agriculture. 666 00:41:57,719 --> 00:42:00,880 Speaker 1: It allows us to do a lot more while relying 667 00:42:00,960 --> 00:42:05,120 Speaker 1: on fewer people, and new jobs did come around, so 668 00:42:05,600 --> 00:42:09,439 Speaker 1: we didn't see the unemployment rate reach catastrophically high levels, 669 00:42:11,960 --> 00:42:17,360 Speaker 1: at least pre-COVID. The pro-automation argument states that new jobs, 670 00:42:17,800 --> 00:42:21,680 Speaker 1: which again should ideally be better than existing jobs, as in 671 00:42:21,760 --> 00:42:25,759 Speaker 1: less strenuous, less dangerous, and more interesting, will emerge 672 00:42:26,160 --> 00:42:29,760 Speaker 1: as older jobs are phased out. Now, that works fine 673 00:42:30,160 --> 00:42:33,120 Speaker 1: on a macro scale, when you're taking a really big 674 00:42:33,200 --> 00:42:36,200 Speaker 1: picture look at the overall trends. But when you consider 675 00:42:36,280 --> 00:42:39,560 Speaker 1: the particulars, like our truck drivers, you start to see 676 00:42:39,600 --> 00:42:44,520 Speaker 1: some obstacles. See, this year I turned forty five, so 677 00:42:44,719 --> 00:42:47,719 Speaker 1: I'm a lot closer to the average age of a 678 00:42:47,800 --> 00:42:50,560 Speaker 1: truck driver in the United States than I am to 679 00:42:50,600 --> 00:42:53,239 Speaker 1: someone who's just getting into the job market for the 680 00:42:53,280 --> 00:42:56,440 Speaker 1: first time, and I can tell you that, even as 681 00:42:56,440 --> 00:43:00,400 Speaker 1: a relatively tech-savvy guy, I would find it really 682 00:43:00,520 --> 00:43:03,719 Speaker 1: challenging to pick up the job skills.
I would need 683 00:43:04,160 --> 00:43:08,000 Speaker 1: to go into a different line of work, particularly one 684 00:43:08,200 --> 00:43:11,600 Speaker 1: where I'm competing against people who already have training and 685 00:43:11,719 --> 00:43:15,319 Speaker 1: experience in that field. So imagine having to tell a 686 00:43:15,320 --> 00:43:18,560 Speaker 1: group of fifty-five-year-old truck drivers that they're 687 00:43:18,600 --> 00:43:21,160 Speaker 1: out of a job. But good news: if you just 688 00:43:21,200 --> 00:43:24,319 Speaker 1: start taking classes, you can learn to code and make 689 00:43:25,080 --> 00:43:28,160 Speaker 1: less money than you did in your old job. It's 690 00:43:28,200 --> 00:43:31,000 Speaker 1: not great, is what I'm saying. Now, that being said, 691 00:43:31,640 --> 00:43:34,920 Speaker 1: automation is clearly not going anywhere. It's going to continue 692 00:43:34,960 --> 00:43:37,680 Speaker 1: to play a big role in how we get work done, 693 00:43:38,120 --> 00:43:41,400 Speaker 1: and in our best-case scenarios, it's going to augment 694 00:43:41,560 --> 00:43:45,400 Speaker 1: the work that humans do, leading to better, more efficient, 695 00:43:45,480 --> 00:43:48,960 Speaker 1: and more cost-effective outcomes. It will free us up 696 00:43:49,040 --> 00:43:50,920 Speaker 1: to focus on the parts of our jobs that we 697 00:43:50,960 --> 00:43:54,160 Speaker 1: find the most fulfilling. We can handle the stuff that 698 00:43:54,200 --> 00:43:58,120 Speaker 1: requires flexibility and intuitive thinking, and the machines can handle 699 00:43:58,360 --> 00:44:03,200 Speaker 1: the routine and the dangerous.
But in a worst-case scenario, 700 00:44:03,719 --> 00:44:08,359 Speaker 1: we'll see an unprepared population of former workers who are 701 00:44:08,400 --> 00:44:11,320 Speaker 1: now out of a job and without the support system 702 00:44:11,400 --> 00:44:15,080 Speaker 1: there to help them transition into something new so that 703 00:44:15,120 --> 00:44:20,160 Speaker 1: they can continue to contribute to society and earn a living. Now, 704 00:44:20,200 --> 00:44:23,919 Speaker 1: this is why you will often hear conversations about automation 705 00:44:24,280 --> 00:44:29,560 Speaker 1: get tied into concepts like a guaranteed jobs program. This 706 00:44:29,640 --> 00:44:32,760 Speaker 1: is typically where something like a government creates a system 707 00:44:32,920 --> 00:44:35,919 Speaker 1: that makes certain every person who wants a job can 708 00:44:36,000 --> 00:44:40,600 Speaker 1: get a job. Or you'll hear about guaranteed basic income. 709 00:44:40,760 --> 00:44:43,279 Speaker 1: This is a strategy in which tax dollars go to 710 00:44:43,360 --> 00:44:47,759 Speaker 1: fund a standard income payout to all citizens so that 711 00:44:47,800 --> 00:44:51,400 Speaker 1: they can meet their most basic needs. Now, these are 712 00:44:51,480 --> 00:44:56,200 Speaker 1: big ideas. They aren't easy to implement or administer, and 713 00:44:56,239 --> 00:44:59,359 Speaker 1: they're not cheap. But it may be that they will 714 00:44:59,400 --> 00:45:02,920 Speaker 1: become necessary, or some similar strategy will be 715 00:45:02,960 --> 00:45:05,560 Speaker 1: needed, to make certain that we have a plan to 716 00:45:05,680 --> 00:45:09,080 Speaker 1: move toward, rather than being caught in a world where 717 00:45:09,080 --> 00:45:14,240 Speaker 1: a disproportionate percentage of people can't find gainful employment.
Heck, 718 00:45:14,320 --> 00:45:17,000 Speaker 1: we're seeing something like that right now due to the 719 00:45:17,040 --> 00:45:21,879 Speaker 1: COVID crisis, which is also underlining the importance of automation 720 00:45:22,000 --> 00:45:24,719 Speaker 1: in a world where it's not necessarily safe to have 721 00:45:24,800 --> 00:45:27,160 Speaker 1: a bunch of human beings all gathered in the same 722 00:45:27,200 --> 00:45:31,359 Speaker 1: place at the same time. Are the robots coming for 723 00:45:31,400 --> 00:45:35,920 Speaker 1: our jobs? Well, for some of our jobs, definitely. Many 724 00:45:35,960 --> 00:45:39,160 Speaker 1: of those jobs come with some pretty tough consequences for 725 00:45:39,280 --> 00:45:43,040 Speaker 1: the humans who are working those jobs today. Those jobs may 726 00:45:43,080 --> 00:45:46,000 Speaker 1: have high injury rates, the people who work them may 727 00:45:46,040 --> 00:45:49,239 Speaker 1: have lower life expectancies, and there are a whole host 728 00:45:49,360 --> 00:45:51,600 Speaker 1: of health issues that can come along with certain jobs. 729 00:45:51,600 --> 00:45:54,600 Speaker 1: So you could make a strong argument that really this 730 00:45:54,680 --> 00:45:57,360 Speaker 1: is for the best, because it will help save lives 731 00:45:57,360 --> 00:46:00,440 Speaker 1: and reduce the chance of injury or illness for 732 00:46:00,520 --> 00:46:03,840 Speaker 1: a lot of people. But for other jobs, the robots 733 00:46:03,960 --> 00:46:06,719 Speaker 1: aren't likely to take over in the near future. For 734 00:46:06,800 --> 00:46:11,120 Speaker 1: a lot of jobs, automated systems, not necessarily robots, but 735 00:46:11,200 --> 00:46:16,120 Speaker 1: perhaps, you know, software-based AI, will augment what humans 736 00:46:16,160 --> 00:46:20,240 Speaker 1: are doing.
It's important we have conversations about this stuff 737 00:46:20,520 --> 00:46:23,640 Speaker 1: and talk about how to address the consequences of 738 00:46:23,680 --> 00:46:27,560 Speaker 1: increased automation. There are ways we can enjoy the benefits 739 00:46:27,600 --> 00:46:31,359 Speaker 1: of automation, but only if we think critically about it 740 00:46:31,480 --> 00:46:36,960 Speaker 1: and create policies and procedures accordingly. Now I gotta get going. 741 00:46:37,680 --> 00:46:40,439 Speaker 1: I hear Robo Jonathan is going to host the next 742 00:46:40,480 --> 00:46:43,000 Speaker 1: episode of TechStuff, and I have to train it 743 00:46:43,000 --> 00:46:46,400 Speaker 1: on how to make puns and pop culture references. But 744 00:46:46,560 --> 00:46:49,520 Speaker 1: if you guys have suggestions for future topics I could 745 00:46:49,560 --> 00:46:52,319 Speaker 1: tackle here on TechStuff, please reach out to me 746 00:46:52,360 --> 00:46:54,759 Speaker 1: and let me know what those are. You can reach 747 00:46:54,760 --> 00:46:57,360 Speaker 1: out on Twitter. The handle for the show is 748 00:46:57,400 --> 00:47:03,000 Speaker 1: TechStuff HSW, and I'll talk to you again really soon. 749 00:47:07,280 --> 00:47:10,319 Speaker 1: TechStuff is an iHeartRadio production. For more 750 00:47:10,400 --> 00:47:13,800 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 751 00:47:13,920 --> 00:47:17,080 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.