Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Okay, so: I'm Lauren Vogelbaum, sitting in for Jonathan Strickland today for iHeartRadio's International Women's Day podcast takeover. Some of y'all might remember me from way back, when I was Jonathan's co-host here for a minute, when I was just a tiny little baby podcaster, or from Forward Thinking, another show that we worked on together, or from other podcasts that I work on myself, or you might find my voice new and strange. But in any case: hello, thank you for existing and for being here today. In honor of International Women's Day, I wanted to do this episode about ethics in technology, and specifically in artificial intelligence; you know, about the ways that tech can hurt or help the quest to make the world more equitable. And you might be going: aren't computers essentially, or even quintessentially, unbiased? You know, a program doesn't have feelings; it only has code that it executes. And of course that's true, but the humans who write the code do have biases, some conscious, some unconscious, and so the ways that we tell programs to work can carry those biases. One example that I always think of, and you might have seen headlines about this a while back, is how digital cameras can behave in biased ways. There was this whole thing where some webcams weren't tracking and focusing on the faces of black users, and some other cameras were flagging photos of Asian subjects because the software insisted that their eyes were closed. In both of these cases, it was clear that the programs had been trained on photos of a vast majority of white faces. The programs didn't know what to do with skin that reflected light differently or with eyes that were a different shape. And I will say, this is not a purely digital issue; this sort of thing has been an issue in photography for as long as photography has existed.
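To make that concrete, here is a minimal, hypothetical sketch in Python (toy numbers, not any real camera's code) of how a detector trained mostly on one group of faces ends up with very different error rates for another group, even though nothing in the code mentions race at all:

# Hypothetical illustration: a "face detector" tuned on a skewed training set.
import numpy as np

rng = np.random.default_rng(0)

def make_faces(n, contrast_mean):
    # One toy feature per photo: how strongly the subject registers against
    # the background under the camera's default exposure settings.
    return rng.normal(loc=contrast_mean, scale=1.0, size=n)

# Training set: 95% group A (registers strongly) and 5% group B (registers
# weakly, because the defaults were tuned around group A in the first place).
train = np.concatenate([make_faces(950, contrast_mean=3.0),
                        make_faces(50, contrast_mean=1.0)])
background = rng.normal(loc=0.0, scale=1.0, size=1000)  # scenes with no face

# "Training" here is just picking a detection threshold halfway between the
# average face and the average empty scene; it is dominated by group A.
threshold = (train.mean() + background.mean()) / 2

def detection_rate(faces):
    return float(np.mean(faces > threshold))

print("Detection rate, group A:", detection_rate(make_faces(10_000, 3.0)))
print("Detection rate, group B:", detection_rate(make_faces(10_000, 1.0)))
# Typically prints roughly 0.94 for group A and 0.33 for group B.

The skew comes entirely from the data and the defaults, which is exactly the pattern in the webcam and blink-detection stories.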
Speaker 1: Film stocks were originally created with only white subjects in mind, and it wasn't until furniture and chocolate companies started lodging complaints with Kodak in the nineteen sixties and seventies that Kodak started to adjust their films to better capture different shades of brown. On this small-example scale, you know, that sucks for the users of these cameras, but these problems are really compounded when you start getting into big data and machine learning and artificial intelligence. Artificial intelligence has the capacity to totally change our world for the better: everything from making our energy grid more efficient and more adaptable, preventing tragic outages like we saw in Texas recently, to helping farmers make the most of their resources and getting more fresh foods to people who currently don't have good access to them, to making autonomous vehicles possible, to letting your doctor just, like, real quick consult every case of a disease ever while making a decision about how to proceed with your treatment, to, I don't know, stopping the thing where you always get served ads for the thing you just bought. Yes, I like that T-shirt; that's why I just bought it. Okay. Anyway, it's not as simple as Asimov's laws of robotics when you start getting into the wider consequences of AI. You know: a robot may not injure a human being or, through inaction, allow a human being to come to harm. Which of course didn't even work in the fictional world that Asimov set up, and robot ethics is totally a thing that is also not simple. And of course it is ideal if your Roomba or your autonomous car does not kill you, but we're talking about designing AI systems that will change the way of life for whole societies. It is a big deal, and it could lead to some big problems. So we need to talk about how to train your algorithm.
Speaker 1: These big systems start small, with designers training algorithms on data sets. So right from the start, you have the issue of what data is going in and, within that data, what's being paid attention to and what's being ignored. Now, I'm not saying that these designers are all mustache-twirling villains out there to do evil, but they are human. We're each moving through the world with our own set of experiences, and there are so many other experiences out there that we're bound to fail to take some of them into consideration, or to misunderstand some of those circumstances. Which is why it's so important to have people from a variety of backgrounds on these projects. And right now, diversity in tech is, uh, not great. Back in twenty fourteen, a whole bunch of big tech companies got together and pledged to increase diversity in their workforces and to make their results public. And every year these reports come out, and the numbers haven't changed that much in some of these categories in six years. Women are better represented now, or somewhat better represented, rather, their share of the workforce having inched up at places like Google and Facebook. But the only company that showed a comparable jump in black employees was Amazon, and they're including their distribution center employees. And other categories of underrepresented people, like people with disabilities, aren't even being reported in all of these public results, but studies show that they're dramatically underrepresented in the workforce, which just isn't great, you know, when we're designing technology like self-driving cars that will need to take into consideration the movement of wheelchair users.
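As a small illustration of that "what data is going in" question, here is a sketch, in Python, of the kind of pre-training audit a team could run to see who is and isn't represented in a data set; the field names and records here are hypothetical, not from any particular product:

# Hypothetical pre-training audit: count how each attribute is represented.
from collections import Counter

def audit(records, fields):
    """Print how often each value of each field appears in the data set."""
    total = len(records)
    for field in fields:
        counts = Counter(r.get(field, "<missing>") for r in records)
        print(f"{field} (n={total}):")
        for value, count in counts.most_common():
            print(f"  {value}: {count} ({100 * count / total:.1f}%)")

training_records = [
    {"skin_tone": "light", "age_band": "25-40", "label": "face"},
    {"skin_tone": "light", "age_band": "25-40", "label": "face"},
    {"skin_tone": "light", "age_band": "60+", "label": "face"},
    {"skin_tone": "dark", "age_band": "25-40", "label": "face"},
    # ...in practice, this would be the full training set.
]

audit(training_records, ["skin_tone", "age_band"])

It won't catch every problem, but it at least makes the composition of the training data a deliberate decision instead of an accident.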
Speaker 1: And there are, unfortunately, instances of programs being made specifically with bias, like in twenty sixteen, when the Boston Police Department used a social media surveillance system to flag posts made by regular citizens who used certain terms, for example colloquial Arabic and Muslim words, or words like Ferguson or protest. And we are all in this, whether we like it or not. Again, just as one example: everything that we do online, from, you know, what we type into search engines and social media sites, to our location data, to how we move our mice or, like, tap at our smartphones, all of that has the potential to be recorded and collected and sold and referenced and cross-referenced and used to track us in any number of ways. And AI systems are a huge industry worldwide. Business spending on artificial intelligence hit about fifty billion dollars in twenty twenty, and it's expected to more than double that by twenty twenty-four. Retail, banking, media, governments: all kinds of industries are investing in this. And all of this is fairly new, but of course the field of ethics, and even computer ethics, is not new at all. Ethics and technology have always been tied together, because every time that we humans create some new technology that changes our world and how we interact with it and each other, we have to reconsider our world and our interactions. You could even argue that, in that way, like, philosophically, ethics is itself a type of technology. But I'm not going to go that deep today; I'm backing away from that precipice. So, let's skip ahead from, you know, the beginning of human consciousness to the nineteen forties, because that's when digital computers were being invented and the field of cybernetics got started. That's the science of information feedback systems, right.
Speaker 1: Cybernetics was pioneered by MIT mathematician Norbert Wiener and some of his colleagues as they were working during World War Two to develop an anti-aircraft cannon that could (a) detect and track a fast-moving airplane, then (b) extrapolate the airplane's probable location in the immediate future and aim, and then (c) signal the firing mechanism to fire. And that internal communication that the machine was doing really got Wiener thinking. In nineteen forty-eight he published a book called Cybernetics, or Control and Communication in the Animal and the Machine, and in it he mused that these new computing machines could very easily become central nervous systems for processing all kinds of data from all kinds of instruments, and that that potential was huge. He compared it to nuclear weapons. He wrote: "Long before Nagasaki and the public awareness of the atomic bomb, it had occurred to me that we were here in the presence of another social potentiality of unheard-of importance for good and for evil." Wiener expounded on this in a book he published in nineteen fifty called The Human Use of Human Beings, which basically predicted that integrating computer technology into society was going to be another revolution, just as sweeping and messy as the Industrial Revolution, and he tried to lay out a bit of groundwork for how to not, like, totally bork it up. Spoiler alert: people borked it up anyway. There wasn't a whole lot more work done in the field of computer ethics until the nineteen sixties and seventies, when computer-based crime and information security and privacy concerns had already become a problem. By the mid sixties, corporations and the government had both begun collecting just giant amounts of personal data about US citizens in literally massive computers. I guess all computers are literally massive in that they have mass, but okay, you know what I mean. From, you know, medical records to military records to legal documents to shopping habits.
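Circling back to Wiener's gun for a second: the "extrapolate the probable future location" step is, at its simplest, dead reckoning, that is, assume the target keeps its current velocity and aim at where it will be when the shell arrives. Wiener's actual predictor was statistical, so treat this Python sketch, with made-up numbers, as the flavor of the idea rather than his method:

# Dead-reckoning sketch of the "predict where the plane will be" step.
from dataclasses import dataclass

@dataclass
class Fix:
    t: float  # time of the radar fix, seconds
    x: float  # position, meters
    y: float  # position, meters

def predict(prev: Fix, curr: Fix, lead_time: float) -> tuple[float, float]:
    """Assume constant velocity and project the position lead_time ahead."""
    dt = curr.t - prev.t
    vx = (curr.x - prev.x) / dt
    vy = (curr.y - prev.y) / dt
    return curr.x + vx * lead_time, curr.y + vy * lead_time

# Two fixes one second apart; the shell needs about three seconds to arrive.
aim_point = predict(Fix(0.0, 1000.0, 5000.0), Fix(1.0, 1100.0, 4980.0),
                    lead_time=3.0)
print("Aim at:", aim_point)  # (1400.0, 4920.0): ahead of the plane, not at it

Run that prediction, re-aim, observe, and predict again, over and over, and you have the kind of feedback loop that cybernetics is named for. Okay, back to all of that personal data piling up in the sixties.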
Speaker 1: And this journalist by the name of Vance Packard wrote a book called The Naked Society, published in nineteen sixty-four, about the inherent privacy issue of having that information collected and available for instantaneous reference. Something of an uproar ensued, focusing, as Packard's book did, on the US government's use of citizens' information, and it led to just a whole bunch of data transparency legislation over the next decade: the Freedom of Information Act, the Fair Credit Reporting Act, all designed to make sure that citizens are able to know what data is being collected about them by the government and how it's being used. And, to be fair, at that time most computational power was in the hands of the government. But this legislation and conversation really ignored the activities of private corporations, and it never really questioned the ethics of collecting all of that data in the first place. And this is apparently a very American thing, the concept that information is inherently good and, like, more is better, but that is a rabbit hole for another day. It's not that people weren't thinking about the ethics, though. One important thing that happened around the same time was that a researcher by the name of Donn Parker, who was looking into crime being committed via computers, proposed to the leading industry professional organization, the Association for Computing Machinery, that they develop a code of ethics for their members, and they were like, yeah, cool, you do that. So he headed up a committee, and the ACM adopted their first code of ethics in nineteen seventy-three. They've updated it, I think, about once a decade since then, with the most recent update being in twenty eighteen. It's really thoughtful. One of my favorite bits from the intro: it specifies that it's, quote, not an algorithm for solving ethical problems; rather, it serves as a basis for ethical decision-making.
Speaker 1: You can read it, if you're into that sort of thing, by going to acm.org. And then, back to our timeline. In nineteen seventy-six, Joseph Weizenbaum, who had created the psychotherapy-mimicking chatbot ELIZA a decade earlier, published his book Computer Power and Human Reason: From Judgment to Calculation. This book was a response to the response that he had gotten from people to his chatbot ELIZA, and I know that Jonathan has talked about this chatbot on the show before. It comes up a lot in discussions about the Turing test and how convincingly computers can approximate human communication, because it was one of the first that was effective. The thing is, though, that Weizenbaum designed ELIZA as a demonstration of how bad computers inherently are at this type of communication, but he got the opposite response from people who tried the program out. In talking to, I mean, you know, typing with ELIZA about their psychological problems, people felt like ELIZA understood them, even when they knew it was a robot. So Weizenbaum was kind of like, whoa, wait. And so he wrote this book to really explain the differences between computation and human intelligence, and to assert that ethics are imperative in the design of artificial intelligence, because people will forget that computers do not have the understanding, the wisdom, the moral and emotional consideration of human beings. And this is when things really started picking up in the field of computer ethics. Also in nineteen seventy-six, this professor who was teaching medical ethics at the time, Walter Maner, noticed how computers were complicating that field, and he became the first person in academia to really decide that computer ethics should be its own field of research and application. He wrote that computer technology was creating whole new ethical concerns that needed to be taken into consideration, so he popularized the term computer ethics, did speeches and workshops and everything.
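To see how little machinery was behind ELIZA's apparent empathy, here is a minimal Python sketch of the keyword-and-reflection trick it relied on. The real 1966 program used a whole script of ranked keywords and decomposition rules, so this toy version only shows the flavor:

# Toy ELIZA-style responder: match a pattern, reflect pronouns, fill a template.
import re

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you", "you": "I"}

RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please go on."),  # fallback when nothing else matches
]

def reflect(fragment):
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(statement):
    text = statement.lower().strip(".!? ")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))

print(respond("I am unhappy with my job"))
# How long have you been unhappy with your job?
print(respond("My mother ignores me"))
# Tell me more about your mother ignores you.

There is no model of the user, no memory, no understanding; and yet, as Weizenbaum found, people poured their hearts out to it.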
Speaker 1: And jumping ahead to nineteen eighty-five, the first major textbook on the subject was published, called Computer Ethics, by one Deborah Johnson. And Johnson disagreed with Maner: she argued that computers weren't creating new problems, but rather exacerbating old problems around things like privacy, ownership, power, and responsibility. We will get into what the future may hold after we get back from a quick break. Welcome back. As the computer industry grew, with the consumer adoption of home computing, with internet access growing, and with both the capacity and the ubiquity of computers just absolutely snowballing, the field of computer ethics exploded, and more and more groups formed to help everyone make sense of all of this. For example, the Electronic Frontier Foundation was founded in nineteen ninety. They're that advocacy and activism group that's dedicated to defending civil liberties as technology advances. And, okay, this is kind of a side quest, but I did not know this: they actually formed up in response to the federal seizure of a bunch of computer equipment belonging to Steve Jackson Games. Yep, the company that brings us Apples to Apples and Munchkin and lots of other good stuff. So what happened here was that there was this BellSouth digital document that explained how the emergency telephone system worked, and it leaked, and the U.S. Secret Service was concerned that having that info out there was a security risk, you know, that hackers might overwhelm the system, something like that. So the Secret Service was conducting raids, tracking this document's digital distribution. Steve Jackson was innocent, but they got this warrant, conducted this raid, took all this stuff, and wound up accessing and deleting a bunch of personal bulletin board messages from the company's website in the process. And now, the Secret Service, you know, didn't find anything, so they gave the equipment back and didn't press charges, but Steve Jackson was like, no, no, no, y'all almost tanked my business.
Speaker 1: You violated the privacy of my bulletin board users. We are pressing charges against you. But there was really no civil rights organization that was prepared to take on the case, due to the technological complexity of the issue, so the EFF formed in order to bring that suit to court. And that was the first time that a court recognized that email should have protection equal to phone calls or any other kind of communication. So thanks, Steve Jackson, for everything. The EFF has done a whole lot of important stuff since; they got encryption technology taken off the list of nationally regulated weapons. That's a separate episode, though. Anyway, side quest over, back to the main quest. All of this computer stuff exploded, and the field of computer ethics specialized, so now you've got internet ethics, information systems ethics, robot ethics. And I do want to say, all of this does coincide with work being done in other fields, in engineering and information science; I don't want to imply that computer technology was the only field that had been working with and contributing to these ethical theories and the practical application of them. But like I was saying at the top of this episode, one of the really interesting specialties to me is AI ethics, because it does have such potentially sweeping effects, and also because it keeps coming up in the news for less-than-rad reasons. You have probably seen headlines over the past few years about algorithms gone wrong. There's the story from twenty sixteen where ProPublica looked into the software used by some courts to determine the risk of criminal defendants committing further offenses, and therefore to determine whether to detain those defendants until trial, or what kind of bail to set for them.
Speaker 1: At least one of those algorithms, called COMPAS, regularly found black people riskier and white people less risky, even when everything else about the defendants' cases was comparable. And this kind of issue crops up in discussions about hiring software as well: because of a lack of care in their design, programs that automatically sort through resumes have ranked applicants who have woman-sounding or black-sounding names as lower in consideration. And then there are Google's search algorithms. Over the past several years there were all of these headlines: reverse image searches using photos of black people were returning images of gorillas, because no one had taught the system how to consider dark skin tones in people. Or, if you searched the seemingly innocuous term black girls, the first page of results, out of trillions of web-indexed pages, included porn. Or there was research into the search history of the white guy who killed nine black people in a church in Charleston, South Carolina. It's highly likely that he was radicalized at least in part due to the way that Google search works, taking into account location and demographics about him and other searchers in his area, and returning white supremacist propaganda when he searched the term black on white crime. By the way, related to this, you can make Google give you less personalized search results, and I am looking into that and doing it as soon as I finish recording this, because it is continually infuriating to me when I'm doing my reading for podcast episodes like this and trying to find stuff that isn't, like, a restaurant in my area when I'm talking about a larger concern. Anyway. Also, remember how I was talking about the flaws in the programming of digital cameras, and how they sometimes have trouble discerning specific features of people of color? Extrapolate that out to how facial recognition software is used with surveillance footage by police departments.
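Whether it's recidivism scores or face matching, the basic audit behind stories like ProPublica's comes down to computing error rates separately for each group and comparing them. Here is a minimal Python sketch with made-up records; a real audit would use actual risk scores and actual outcomes:

# Toy audit: compare false positive rates across groups.
def false_positive_rate(records):
    """Share of people who did NOT reoffend but were still scored high-risk."""
    did_not_reoffend = [r for r in records if not r["reoffended"]]
    flagged = [r for r in did_not_reoffend if r["scored_high_risk"]]
    return len(flagged) / len(did_not_reoffend)

def audit_by_group(records):
    for group in sorted({r["group"] for r in records}):
        subset = [r for r in records if r["group"] == group]
        print(f"group {group}: false positive rate = {false_positive_rate(subset):.0%}")

records = [
    {"group": "A", "scored_high_risk": True, "reoffended": False},
    {"group": "A", "scored_high_risk": True, "reoffended": False},
    {"group": "A", "scored_high_risk": False, "reoffended": False},
    {"group": "B", "scored_high_risk": False, "reoffended": False},
    {"group": "B", "scored_high_risk": True, "reoffended": False},
    {"group": "B", "scored_high_risk": False, "reoffended": False},
    # ...thousands of real rows in an actual analysis.
]

audit_by_group(records)  # group A: 67%, group B: 33% in this toy data

If those rates differ a lot between groups, the system is making its mistakes unevenly, no matter how neutral the code looks.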
Speaker 1: That software is more likely to make a mistake in identifying a black person's face, because the software just isn't as good at seeing that face, because of how it was programmed, which can lead to false identifications and thus wrongful harassment and arrests. Georgetown University put out a whole report on this in twenty sixteen, called The Perpetual Line-Up. That report found that half of Americans have photos in police facial recognition databases, by the way, which includes just lots of people with no criminal backgrounds. And of course, even if we fix those algorithms, that isn't going to fix the fact that communities of color are subject to more surveillance in the first place. More recently, there have been headlines and a whole discussion about this new rule that was issued in September of twenty twenty by the U.S. Department of Housing and Urban Development. This rule essentially makes it super difficult for banks or landlords or homeowners insurance companies to be sued for denying housing to people of color if an algorithm was used to make that determination, based on the concept that algorithms cannot be racist. The rule was immediately challenged, and in January, President Biden issued an executive order directing the Department of Housing and Urban Development to examine the rule's effects. But, oof. You know, it's not easy; none of this is easy. In order to build a non-biased artificial intelligence system, we, and I mean, like, humans collectively, not just, you know, you and me, dear listener, we need to change the systems that lead to the building of artificial intelligence. We need to examine how the design and programming is taught, how companies conduct their business, how policy is written, and who has access to seats at all of those tables. It's also not technically easy: when you build and train these systems, just adding more diverse data isn't magically going to make the system create better, less biased rules.
Speaker 1: It might create conflicting rules. You know, it is expensive, in terms of time and effort and just pure physical energy, to do this work. But the pitfalls of not doing this work are tremendous; it can cause measurable hurt in people's lives. And as Dr. Deb Chachra, a materials science engineer, has said, any sufficiently advanced neglect is indistinguishable from malice. The benefits of doing this work are equally measurable and tremendous. One consideration moving into the future is how to square the very concept of ethics with an increasingly multicultural digital world. You know, not everyone on the planet grew up with European philosophy, going back to the ancient Greeks, as the basis of their ethical conceptions. And we also have to acknowledge that, whatever a culture's philosophical basis, it's probably rooted in some biases of its own. Just for example, just throwing it out there: if your society has coded emotions as feminine and feminine as bad, then you're probably not giving emotional harm as much weight, if any weight at all, as physical harm in your considerations of justice. And you can see the effects of this in things like the care that we give our veterans with physical injuries versus veterans with PTSD, or just the general ways that our society handles mental health versus physical health, or any kind of neurodivergence. All of this work in artificial intelligence, and the ethics thereof, is really requiring us to redefine intelligence: to fully consider what we mean by human intelligence, what logic and emotion and experience go into that, and the ways in which machine intelligence might differ. The Stanford Encyclopedia of Philosophy, referencing Minsky's book The Society of Mind, says we do not wish to restrict intelligence to what would require intelligence if done by humans. And of course that's true; AI can do stuff that we can't, and that's arguably the whole point.
Speaker 1: But it all has to be done with the best of what human intelligence can be in mind. And I'm just now realizing that, like, I might have written a Forward Thinking episode instead of a TechStuff episode, but that's what I've got for you today. So if you have enjoyed this episode and would like to hear more from me, you can find me on podcasts like BrainStuff, which is a daily short-form general science and culture show, or Savor, which is a food science and history show, or American Shadows, which is produced with Aaron Mahnke's company Grim & Mild; it's a show about some of the darker bits of American history and the ways in which even those dire situations had light brought to them. I would like to give a quick shout-out to my friend Damien Patrick Williams. He works in this field and has made me more familiar with a lot of the concepts that I talked about today; you can find lots more from him at afutureworththinkingabout.com. This podcast is produced by Tari Harrison, and sincere thanks to her for being so kind and accommodating and helping me with this episode. The executive producer is Jonathan Strickland, and thanks to him for trusting me with his podcast for a day. Thanks to you for listening, and he'll talk to you again real soon. Yeah. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.