Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you today? I want to tell the story of Lena Forsén, her role in the history of image processing and technical papers, and why some publications and organizations are now banning papers that contain her photograph. I hadn't actually heard of any of this before I read an article in Ars Technica by Benj Edwards titled "Playboy image from nineteen seventy two gets banned from IEEE computer journals." Yeah, we're going to be telling you about Playboy and nude modeling as well, because that all is involved in this story. All right, first up: who is Lena Forsén? Well, she's a model from Sweden. She moved to America, and in her early twenties she was a model, and Playboy discovered her and she became Miss November nineteen seventy two.
Speaker 1: The centerfold photo of her pictorial would become entwined with tech history, and not just because some tech heads were subscribed to Playboy, although I suspect that played a large part in it. And her image is one that you've probably seen, at least the portion of her centerfold that has been used in the tech sector, because it's literally been used thousands of times. I know that when I saw the picture, I thought, oh, that's where that's from, because I had never thought to look any further into it. So a photographer named Dwight Hooker took the photo in question, and in the photo, Lena Forsén stands facing a mirror with her back to the camera. She's turned her head so that she's looking back over her shoulder at the camera lens. And the centerfold version of this photo is full body, you know, from head to toe. Well, technically hat to toe, because she is wearing a hat and a boa and really nothing else.
Speaker 1: The section of the image that matters the most to our story is from her shoulders up, shoulders to the top of the feathered hat she is wearing, because that's the part of the image that would play a huge role in the development of image processing in general. And the fact that it was a cropped image from a centerfold photo in Playboy would become a source of debate. I hesitate to use the word controversy; I'm not sure that it was that controversial as much as it was concerning. But we'll talk all about that when we get toward the end of this episode. But let's go back to the early nineteen seventies, more specifically the summer of nineteen seventy three. Getting more specific than just the summer of seventy three is a bit tricky, because the folks who have related the story of how this image came to be used in the first place were all working from memory, so they were just like, it's June or July nineteen seventy three. That's about as specific as we can get. The place, however, we can narrow down.
Speaker 1: The place was the University of Southern California, and specifically the Signal and Image Processing Institute, or SIPI, or "Sippy" as I will call it. The institute at that point was pretty darn young. An alumnus of USC named William Pratt led a group of alumni to establish this institute in nineteen seventy two, and they had the goal of tackling three big challenges in computing and digital images. According to the organization's fiftieth anniversary celebration page (they just recently celebrated that a couple of years ago), the challenges were to solve, quote, image coding, image restoration, and image data extraction. So remember, this is decades before we would get industry standards like JPEG, and even still more than a decade before the bitmap image file format. So the folks at USC's SIPI, which back then was just called IPI, were doing work that would inform these later standards. This is the early work where they're coming up with the methodologies that ultimately would find their way into various file formats much further down the line. So they were laying the groundwork.
Speaker 1: Now, in that summer of nineteen seventy three, an electrical engineering assistant professor named Alexander Sawchuk was working with grad students to find an image that they were going to scan. Sawchuk had a colleague who was preparing a paper for a conference, and this paper was all about the process they were using at USC to scan images and digitize them. And Sawchuk wanted to grab something that they hadn't already used a dozen times in various test scans. Like, they had some stock photos that were their go-to, but everyone had seen those already, and moreover, the folks in the lab were just sick of them, so they wanted to get something new. Plus, they wanted something that would really show off their methodology to good effect. So they wanted something that was special. They wanted a glossy photo that had a lot of intricate detail in it and also a high level of contrast. They wanted something that had a lot of dynamic elements so that their methodology could be shown off in the best light, so to speak.
Speaker 1: So he also wanted the photo to be of a person's face, to really, you know, have something that people could see and recognize immediately as, oh, that's a human being, and thus be able to tell how well the process worked. Now, history has lost the name of the hero who walked by carrying an issue of the November nineteen seventy two Playboy magazine. Let me just say, the nineteen seventies were definitely a different time. But even as someone who was born in the nineteen seventies (and I was born in the seventies, y'all), it is hard for me to imagine just casually bringing a Playboy magazine into an academic lab like that, or just carrying it around on a college campus. It's hard for me to imagine doing that. But then I'm also not an electrical engineer, so maybe I'm just built different. The scanner that the team was using was a Muirhead (I'm probably butchering the pronunciation) wire photo machine, and it had a scanning resolution of one hundred lines per inch.
Speaker 1: Their plan was to scan a five twelve by five twelve image, so five hundred and twelve lines by five hundred and twelve lines. That meant the photo they scanned could only be five point one two inches to a side, right? One hundred lines to an inch. So that would mean that they could not do a full scan of the centerfold, and they shouldn't even have thought of it anyway, and I'm sure they didn't, because it would have been incredibly inappropriate to present the centerfold as a scanned image in a conference paper, considering the nudity. That would just be inappropriate. So they just scanned the top five point one two inches of the image, which would crop the photo at Lena's shoulders, so it would just be the shoulders up. Well, let's talk about the actual scanning technology for a minute. We're in the early nineteen seventies, and you might wonder how old wire photo technology was at that point. Believe it or not, wire photo services had been in operation for around fifty years by the time we're talking about making computer scans.
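That crop arithmetic can be sketched in a few lines of Python. The constant names are mine; the numbers (one hundred lines per inch, a five-hundred-twelve-line target) are the ones from the episode:

```python
# Scan-size arithmetic: a 100-lines-per-inch scanner producing a
# 512 x 512 image can only cover 512 / 100 = 5.12 inches per side,
# which is why only the top of the centerfold could be captured.
LINES_PER_INCH = 100
TARGET_LINES = 512

crop_inches = TARGET_LINES / LINES_PER_INCH
print(crop_inches)  # 5.12
```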
Speaker 1: Now, of course, in the early twentieth century, people were not scanning photographs into computers. That wasn't happening. But what they were doing was using technology to turn photographs into electrical signals that could then be transmitted over wire or over radio waves. Now, there were different particulars for the various methodologies; there wasn't just one device out there that did this. But generally speaking, a typical wire photo scanner was a machine that consisted of a drum, and on this drum you would place the image, kind of similar to a photocopier in that respect. The drum would rotate, and as the drum rotated, there would be this tiny dot of light that would hit the photograph and slowly make its way across the width of the photograph. So as the light was moving, the drum would rotate fast enough that the light would scan the entire length of the photograph, and the photograph would reflect some of that light onto a photovoltaic cell, kind of similar to what you would find in a solar cell.
Speaker 1: Now, the intensity of the light that hit that photovoltaic cell would determine the amount of electrical charge that the cell could generate. So that means you would end up with a variable electrical signal, and that signal would represent the amount of light that was reflected off the photograph. You can think of it as: a really strong signal shows a bright part of the image, and a weak signal shows a dark part of the image, to oversimplify it. Now, you could then take this electrical signal and send it to a destination. You could do so directly over a wire, like a power line or phone line or something. Or you could further transform the information by converting the signal into radio waves and then broadcast the radio waves to a receiver, which would then reverse the process: capture the radio waves, convert them back into an electrical signal, and send them on. So either way, this electrical signal would make its way to the other end, where you would have a device similar to the scanner.
Speaker 1: It would also have a drum on it, but instead of a photo, you would have photoreactive paper or film on it, and the drum would rotate in synchronization with the rotation of the original drum, in synchronization with what it was doing when it was scanning, that is. And it would also project a light, and the projection of that light would depend upon that varying electrical signal. So in strong parts of the signal, the light would be more intense, and in weaker parts of the signal it would be dimmer. And this light would hit the photoreactive film on the drum. As the drum rotated, the light would also scan across the width of the film. And then you would take the film and develop it, and you would end up with a copy of the original image that was used on the first device. This was really clever. It's a clever way to take an image, transform it into a signal, and then take that signal and transform it back into a copy of the image. All right, we're going to take a quick break here.
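The round trip described above (drum sweeps a light spot over the image to produce a signal; the receiver drives an exposing light from that same signal) can be modeled as a toy Python sketch. The function names and the tiny four-by-four "photo" of brightness values are illustrative, not from the episode:

```python
# Toy model of the wire-photo round trip: sender turns an image into a
# stream of brightness readings, receiver turns the stream back into
# an image, line by line, as the two drums rotate in sync.

def scan_to_signal(photo):
    """Sweep the light spot across each line; reflected brightness
    becomes the (idealized, noise-free) transmitted signal."""
    signal = []
    for line in photo:            # one drum rotation per scan line
        for brightness in line:   # the spot moves across the line
            signal.append(brightness)
    return signal

def signal_to_print(signal, width):
    """Receiver: expose the film from the signal, one line per rotation."""
    return [signal[i:i + width] for i in range(0, len(signal), width)]

photo = [
    [0.0, 0.2, 0.2, 0.0],
    [0.2, 0.9, 0.9, 0.2],
    [0.2, 0.9, 0.9, 0.2],
    [0.0, 0.2, 0.2, 0.0],
]
copy = signal_to_print(scan_to_signal(photo), width=4)
assert copy == photo  # a faithful copy comes out the other end
```

In the real analog system the "signal" is a continuous voltage rather than a list of samples; the list here just stands in for that continuum.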
Speaker 1: When we come back, I'll talk more about how this centerfold image found its way into tech history, but first let's take a break to thank our sponsors. Okay, we're back. So I talked before the break about the actual technical approach to wire photo scanning, and it's a very clever approach. However, it's also an analog approach. In other words, it's not directly compatible with the world of computers, which is a digital world built on top of the concept of binary, right? Zeros and ones. You can think of a computer as viewing the world as stuff that's either off or on, whereas analog is more like a continuum. This, by the way, is one of the many bases of arguments that audiophiles make that analog audio is inherently better than digital audio, because analog is a representation of an unbroken signal, whereas digital is a bunch of zeros and ones that give you little, tiny steps. And yes, the steps can be very, very tiny, so tiny that to our perception it's no different than an unbroken signal. But if you get down far enough, it is broken up.
Speaker 1: And apparently that's enough to make digital worse than analog. Not every audiophile believes that, by the way; I don't want to paint everyone with the same brush, but that is one of the arguments that audiophiles make. I don't personally buy into it. I do think you can reach a level of fidelity that is so indistinguishable from any other format that it makes no difference, but that's my own personal opinion. Anyway, the wire photo scanning approach is an analog technology. You're getting this variable electrical signal that's a continuous thing, not a bunch of zeros and ones. So in order to be able to process this in a computer and make a computer scan of it, the lab actually had to use analog-to-digital converters connected to the scanner in order to change those signals into binary data that a computer could make sense of. Actually, the lab had three analog-to-digital converters.
Speaker 1: They had one for red, one for green, and one for blue, and collectively these three converters would supply all the information needed to recreate the image on a computer with the proper colors. Because obviously, if you didn't have that, then you would have to work with a monochromatic digital image, right? You would just have information as to the brightness or darkness, but nothing related to things like color. By using filters and three different analog-to-digital converters, they could recreate that. Now, the computational side of their work was handled by a Hewlett-Packard twenty one hundred minicomputer. We don't tend to use the term minicomputer anymore these days, and the name might give you the wrong impression if you aren't ancient like I am. Minicomputers were not teeny tiny machines despite the name. Minicomputers were still honking big computers. They could weigh more than two hundred pounds, easy.
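The digitizing step described above (a continuous voltage per color channel quantized into binary values) can be sketched like so. The `adc` helper, the eight-bit depth, and the sample voltages are all illustrative assumptions; the episode only tells us there were three converters, one per channel:

```python
# Sketch of analog-to-digital conversion: quantize a continuous
# 0.0-1.0 "voltage" into one of 2**bits - 1 + 1 integer codes.

def adc(voltage, bits=8):
    """Map an analog level in [0.0, 1.0] to an integer code."""
    levels = 2 ** bits - 1          # 255 for 8 bits
    return round(voltage * levels)

# One pixel's reflected light after the red, green, and blue filters
# (made-up values), digitized by the three converters:
red, green, blue = 0.8, 0.4, 0.2
pixel = (adc(red), adc(green), adc(blue))
print(pixel)  # (204, 102, 51)
```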
Speaker 1: Now, they were called minicomputers because, despite their size, they were smaller and less powerful than the giant mainframe machines you might find in some places. Minicomputers came out before the era of personal computers, but there was quite a bit of overlap between the era of minicomputers and the era of PCs. When PCs did come out, minicomputers were still very much the regular type of machine you would encounter in, say, a scientific lab, a research lab, or maybe a financial institution, that kind of thing. You were more likely to run across a minicomputer than a personal computer, because for lots of years, PCs just couldn't match the performance of minicomputers, and a lot of organizations had higher needs than a PC could provide, but they didn't have such high demands that they needed to invest millions of dollars into a mainframe computer. So the minicomputer was kind of the best solution. Now, by the time you get to the late nineties and early two thousands, personal computers were at a performance level where minicomputers weren't really relevant anymore.
Speaker 1: Minicomputers also were shrinking in size. There was kind of this convergence happening where PCs were essentially able to do the stuff that the minicomputers of the past could do. But back in the nineteen seventies, if you were doing serious scientific research on a computer, chances are it was a minicomputer, unless the geeks in the college mainframe center really liked you and gave you time on their machines. But time on a mainframe was a really sought-after commodity. Anyway, the HP twenty one hundred was a sixteen-bit machine. That means it could handle data units that were sixteen bits wide, which means it could store two to the sixteenth power values, which is sixty-five thousand, five hundred and thirty-six values. These days, you probably own a PC running, at minimum, in thirty-two-bit mode, if not sixty-four-bit mode. So we have come a very long way from those days in the early nineteen seventies. Now, the scanning process itself took quite a bit of time, and it was a little bit finicky.
Speaker 1: In fact, during that first scan there were a couple of errors in the process. For one thing, there was a software issue, which meant that at the end of the day there was a single line missing from the scanned photo. It only had five hundred eleven lines, not five hundred twelve. So the team decided that in order to fix that, they would just copy the very top line of the image and then place it at the very top, so the top line of the image was there twice, and that way they got the five hundred and twelve lines that they wanted. They also found out that their analog-to-digital converters were not properly synchronized with the drum, so the image they had was just a touch distorted. Not to a terrible level, but it was a tiny bit elongated. Now, from what I understand, you wouldn't really know it unless you were comparing the scanned photo to the original photograph. It wasn't like it was to a point where it was, you know, disturbing or something.
Speaker 1: But you know, if you were going to compare it to the original, it means that you just brought a copy of Playboy into the conference, and that probably is a reflection on your own sense of propriety. But who am I to judge? I'm somewhat amused that the team felt that replicating a top line and accepting a little deformation in their copied image were both within acceptable limits for their colleague's conference paper. So my guess is they must have really been under the gun to meet a deadline. Otherwise, I can't imagine that they wouldn't just try the scan again to get a better result. Then again, maybe the process really was so slow and cumbersome and unpredictable that no one really had the desire to give it another go without a guaranteed success, right? So maybe the scan was actually better than what they normally got out of the process, and they felt they were lucky getting away with what they got.
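The duplicated-top-line fix described above is simple enough to sketch directly. The function name and the tiny stand-in image are mine; the episode only says the team copied the first line to bring a five-hundred-eleven-line scan up to five hundred twelve:

```python
# Sketch of the team's fix: if the scan came out one line short,
# repeat the top line once so the image reaches the target height.

def pad_missing_top_line(scan, target_lines=512):
    """Pad a scan that is exactly one line short by duplicating
    the first line; otherwise return it unchanged."""
    if len(scan) == target_lines - 1:
        return [scan[0]] + scan
    return scan

# Pretend the scanner produced 3 of an expected 4 lines:
short_scan = [[10, 20], [30, 40], [50, 60]]
fixed = pad_missing_top_line(short_scan, target_lines=4)
assert len(fixed) == 4
assert fixed[0] == fixed[1] == [10, 20]  # the top line appears twice
```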
Speaker 1: Whatever the reason, this imperfect scan was what made it into that conference presentation, and ultimately it's what would make Lenna something of a celebrity within the image processing world. Well, that and the fact that Lenna's photograph showcased her youth and beauty, and, you know, the jaunty tilt of her hat with the long purple feather added some flair to it. And there's no denying that the photograph has just got great composition and lighting and that it captures Lenna's looks really well. I mean, it's a picture of a beautiful young woman, so it obviously had some captivating factors all by itself. Now, the story goes that attendees at this conference wanted to be able to test their own processes and methodologies against the Sippy lab's version. But to test like against like, you know, they would need to have two things. They would need a copy of Sippy's version of the scanned image, and they would need access to the original image so that they could perform their own scans and then compare them to Sippy's results.
Well, access to the original was 328 00:19:50,080 --> 00:19:52,920 Speaker 1: pretty easy because it was in a published magazine, which, 329 00:19:52,960 --> 00:19:56,000 Speaker 1: from what I gather, had a pretty large circulation in general, 330 00:19:56,040 --> 00:20:00,720 Speaker 1: but seemed particularly popular in computer engineering circles. So apparently 331 00:20:00,720 --> 00:20:06,359 Speaker 1: there was no shortage of that original centerfold. The scanned 332 00:20:06,600 --> 00:20:09,320 Speaker 1: version they would need to get from SIPI, but SIPI 333 00:20:09,640 --> 00:20:13,120 Speaker 1: chose to share it liberally. They said, sure, yeah, 334 00:20:13,160 --> 00:20:17,560 Speaker 1: absolutely, share this image of our scan, you can have 335 00:20:17,600 --> 00:20:19,719 Speaker 1: a copy of it. Now, the result of all this 336 00:20:19,840 --> 00:20:23,040 Speaker 1: is that Lenna's image became a de facto standard for 337 00:20:23,119 --> 00:20:27,560 Speaker 1: testing scanning technologies and compression algorithms, and her photograph, or 338 00:20:27,640 --> 00:20:32,360 Speaker 1: rather more specifically, a scan of her photograph, would find 339 00:20:32,400 --> 00:20:37,560 Speaker 1: its way into countless journals and papers about the different 340 00:20:37,640 --> 00:20:42,480 Speaker 1: methods for creating digital image files. This actually reminds me 341 00:20:42,560 --> 00:20:45,960 Speaker 1: a little bit of the history of the MP three format.
342 00:20:46,119 --> 00:20:49,399 Speaker 1: So you might remember that the engineers behind the MP 343 00:20:49,520 --> 00:20:52,000 Speaker 1: three algorithm, you know, when they were trying to figure 344 00:20:52,000 --> 00:20:56,280 Speaker 1: out the compression algorithm to use, they used a version 345 00:20:56,400 --> 00:21:01,879 Speaker 1: of Suzanne Vega's song Tom's Diner to calibrate their approach, 346 00:21:02,280 --> 00:21:05,640 Speaker 1: so they would make tweaks to how the algorithm would 347 00:21:05,680 --> 00:21:08,760 Speaker 1: choose which information to keep and which information could be 348 00:21:08,800 --> 00:21:13,000 Speaker 1: tossed aside. That's how the MP three file format really works. 349 00:21:13,040 --> 00:21:16,600 Speaker 1: The compression algorithm, and I'm talking about file size compression, 350 00:21:16,640 --> 00:21:21,760 Speaker 1: not audio compression, looks at the 351 00:21:21,760 --> 00:21:25,600 Speaker 1: elements of a sound file and says, what can we 352 00:21:25,640 --> 00:21:29,440 Speaker 1: get rid of that isn't going to compromise the quality 353 00:21:29,440 --> 00:21:33,159 Speaker 1: of the audio to such a degree that it's undesirable? 354 00:21:33,720 --> 00:21:38,320 Speaker 1: So they would make changes to their algorithm. Then they 355 00:21:38,320 --> 00:21:42,040 Speaker 1: would put Tom's Diner through the algorithm. Then they would 356 00:21:42,080 --> 00:21:45,760 Speaker 1: listen to the compressed song and find out if the 357 00:21:45,840 --> 00:21:49,239 Speaker 1: changes they made were manifest, if they were obvious when 358 00:21:49,280 --> 00:21:52,040 Speaker 1: the song was played, if they were perceptible. So the goal 359 00:21:52,119 --> 00:21:55,480 Speaker 1: obviously was to maintain the song's quality as much as possible, 360 00:21:55,720 --> 00:21:58,239 Speaker 1: even as they would reduce the file size.
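That tweak, compress, and listen loop can be sketched in a few lines of code. To be clear, this is just my toy illustration, not the actual MP3 psychoacoustic model: here, quantizing samples to a coarser bit depth stands in for discarding information, and a simple error threshold stands in for a human listener judging whether the change is perceptible.

```python
import math

def compress(samples, bits):
    # Toy lossy "compression": snap each sample (range -1.0 to 1.0) onto a
    # coarser grid. Fewer bits means less data kept and more error.
    levels = 2 ** bits - 1
    return [round((s + 1) / 2 * levels) / levels * 2 - 1 for s in samples]

def distortion(original, compressed):
    # Root-mean-square error: a crude stand-in for "can a listener tell?"
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(original, compressed))
                     / len(original))

def tune(samples, threshold):
    # The tuning loop: try ever-coarser settings and keep the smallest
    # bit depth whose distortion still stays under the threshold.
    best = 16
    for bits in range(15, 1, -1):
        if distortion(samples, compress(samples, bits)) <= threshold:
            best = bits
    return best

# A 440 Hz sine wave standing in for the test track.
signal = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(800)]
chosen = tune(signal, threshold=0.01)  # smallest bit depth that still passes
```

The real encoder's "listen" step was human ears on Tom's Diner, and its knobs were far subtler than bit depth, but the shape of the loop is the same: change a setting, compress, measure, repeat.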
Well, in 361 00:21:58,280 --> 00:22:02,840 Speaker 1: some ways, this Playboy centerfold photograph of Lenna served 362 00:22:03,040 --> 00:22:06,840 Speaker 1: much the same purpose, but in image research labs, you know, 363 00:22:07,160 --> 00:22:10,280 Speaker 1: the labs that were working on the future of digital imagery. 364 00:22:10,640 --> 00:22:14,000 Speaker 1: And what's interesting to me about this part of it 365 00:22:14,040 --> 00:22:17,840 Speaker 1: is that Playboy found out about this. The magazine found 366 00:22:17,840 --> 00:22:20,000 Speaker 1: out about this, and in a move that is a 367 00:22:20,040 --> 00:22:24,080 Speaker 1: little bit surprising considering how we typically see companies really 368 00:22:24,119 --> 00:22:28,719 Speaker 1: move swiftly to protect their intellectual property, Playboy chose not 369 00:22:28,880 --> 00:22:32,119 Speaker 1: to do anything about it. For one thing, the distribution 370 00:22:32,200 --> 00:22:36,240 Speaker 1: of Lena's scanned image was pretty good publicity for the magazine. Plus, 371 00:22:36,280 --> 00:22:38,919 Speaker 1: the various labs that wanted to test their own processes 372 00:22:38,960 --> 00:22:42,280 Speaker 1: would need copies of the nineteen seventy two November issue 373 00:22:42,359 --> 00:22:46,200 Speaker 1: to measure their results against what SIPI produced, so you 374 00:22:46,320 --> 00:22:49,520 Speaker 1: had some guaranteed magazine sales out there. So Playboy's like, 375 00:22:49,560 --> 00:22:52,160 Speaker 1: you know what, this is not bad for us. We're 376 00:22:52,280 --> 00:22:55,480 Speaker 1: not gonna pursue the fact that this image that we 377 00:22:55,600 --> 00:23:00,119 Speaker 1: own the copyright to is being used in various academic papers, 378 00:23:00,440 --> 00:23:04,440 Speaker 1: in thousands of journals.
As for Lena, she got word 379 00:23:04,440 --> 00:23:06,879 Speaker 1: that her picture was being used by computer labs to 380 00:23:06,920 --> 00:23:10,199 Speaker 1: advance digital imagery technology, and it sounds to me like 381 00:23:10,240 --> 00:23:13,120 Speaker 1: she was pretty tickled by the whole thing. She certainly 382 00:23:13,680 --> 00:23:16,560 Speaker 1: was happy to appear at a nineteen ninety seven conference 383 00:23:16,560 --> 00:23:20,359 Speaker 1: in Boston. Playboy actually helped track her down so that 384 00:23:20,520 --> 00:23:25,000 Speaker 1: she would be invited to this engineering conference. So there 385 00:23:25,040 --> 00:23:30,080 Speaker 1: she was, a former Playboy model appearing at an electrical 386 00:23:30,119 --> 00:23:34,720 Speaker 1: engineering conference in Boston, Massachusetts, where she signed autographs for 387 00:23:34,840 --> 00:23:38,000 Speaker 1: people who had been using her photograph while working on 388 00:23:38,080 --> 00:23:43,760 Speaker 1: their own imagery projects, like she was a celebrity there. Okay, 389 00:23:44,040 --> 00:23:47,520 Speaker 1: we've got more to say about this iconic image and 390 00:23:47,600 --> 00:23:49,919 Speaker 1: its place in the tech sector and how that place 391 00:23:50,200 --> 00:23:53,200 Speaker 1: is changing. But before we get to all that, let's 392 00:23:53,240 --> 00:24:05,840 Speaker 1: take another quick break to thank our sponsors. All right, 393 00:24:05,880 --> 00:24:09,840 Speaker 1: we're back now. According to numerous sources that I came 394 00:24:09,840 --> 00:24:13,399 Speaker 1: across when I was researching this, the image of Lenna 395 00:24:13,520 --> 00:24:18,320 Speaker 1: found its way into literally thousands of papers and articles 396 00:24:18,680 --> 00:24:22,080 Speaker 1: over the following decades, in the nineteen seventies, the eighties, 397 00:24:22,119 --> 00:24:25,560 Speaker 1: the nineties, the two thousands.
I mean, it's still 398 00:24:26,440 --> 00:24:30,280 Speaker 1: found in circulating articles and papers to this day. And 399 00:24:30,960 --> 00:24:36,080 Speaker 1: not only was it essentially ubiquitous in the tech literature 400 00:24:36,560 --> 00:24:40,760 Speaker 1: and often used by people who had no idea of 401 00:24:41,000 --> 00:24:45,880 Speaker 1: that image's history, it also became sort of an icon 402 00:24:46,280 --> 00:24:51,280 Speaker 1: for an ongoing conversation that's really nuanced and complicated and 403 00:24:51,359 --> 00:24:55,159 Speaker 1: important in the tech space, and it was an easy 404 00:24:55,200 --> 00:24:57,919 Speaker 1: way to kind of point to this one image and 405 00:24:58,040 --> 00:25:01,080 Speaker 1: use it as sort of the entry point for that conversation. 406 00:25:01,640 --> 00:25:04,879 Speaker 1: And there's been a movement now to really call a 407 00:25:04,920 --> 00:25:09,800 Speaker 1: stop to using this picture in academic papers and articles 408 00:25:09,840 --> 00:25:14,320 Speaker 1: because it reinforces an aspect of a harmful environment in 409 00:25:14,320 --> 00:25:18,200 Speaker 1: the tech space. Again, this is nuanced. This is complicated. 410 00:25:18,359 --> 00:25:21,679 Speaker 1: So I don't want to just say using a picture 411 00:25:21,720 --> 00:25:27,159 Speaker 1: that originated from Playboy magazine is already inherently bad, and 412 00:25:27,200 --> 00:25:29,280 Speaker 1: that's the start and end of it. It's not the 413 00:25:29,320 --> 00:25:31,840 Speaker 1: start and end of it. It's one aspect of a 414 00:25:31,920 --> 00:25:36,479 Speaker 1: much larger conversation. There is no denying that historically the 415 00:25:36,480 --> 00:25:41,000 Speaker 1: field of computer science has been dominated by men.
Now 416 00:25:41,040 --> 00:25:46,040 Speaker 1: this is despite the fact that many women, incredibly intelligent 417 00:25:46,080 --> 00:25:50,080 Speaker 1: women, have made phenomenal contributions to computer science. Some of 418 00:25:50,119 --> 00:25:53,320 Speaker 1: the most important contributions to computer science came from women, 419 00:25:53,640 --> 00:25:58,600 Speaker 1: women like Ada Lovelace. Ada Lovelace envisioned that numbers and 420 00:25:58,640 --> 00:26:01,600 Speaker 1: mathematics could be used to represent everything from paintings 421 00:26:01,600 --> 00:26:05,639 Speaker 1: to music. She was arguing for computer science before a 422 00:26:05,680 --> 00:26:09,520 Speaker 1: computer existed. Or you had women like Grace Hopper, who 423 00:26:09,600 --> 00:26:14,719 Speaker 1: led teams developing early computer programming languages. Or you had 424 00:26:14,760 --> 00:26:18,480 Speaker 1: the women who worked for the US military who calculated 425 00:26:18,560 --> 00:26:23,560 Speaker 1: ballistics charts for artillery. Again, for the military, these women 426 00:26:23,760 --> 00:26:27,119 Speaker 1: were actually the first computers. That was their job title. 427 00:26:27,280 --> 00:26:32,840 Speaker 1: They were computers. They computed those ballistic charts. So they 428 00:26:33,080 --> 00:26:36,920 Speaker 1: clearly have made some massive contributions to the field. However, 429 00:26:37,240 --> 00:26:40,359 Speaker 1: when you looked at a typical computer lab, especially like 430 00:26:40,359 --> 00:26:43,080 Speaker 1: in the nineteen seventies, that was a space that was 431 00:26:43,119 --> 00:26:47,000 Speaker 1: filled almost exclusively with young men.
In the United States, 432 00:26:47,320 --> 00:26:50,359 Speaker 1: it was usually young white men at that point, and 433 00:26:50,400 --> 00:26:54,040 Speaker 1: this homogeneous group of people fell into a tendency that 434 00:26:54,240 --> 00:26:56,800 Speaker 1: typically happens when you get a bunch of people together 435 00:26:57,160 --> 00:27:00,439 Speaker 1: who all share the same backgrounds. It doesn't just happen 436 00:27:00,520 --> 00:27:03,399 Speaker 1: with white guys, like it happens with any group of 437 00:27:03,480 --> 00:27:06,639 Speaker 1: people who are all very similar to one another. Certain 438 00:27:06,720 --> 00:27:10,479 Speaker 1: things are going to rise to the top and become 439 00:27:10,600 --> 00:27:13,679 Speaker 1: norms within that group. And if you didn't belong to 440 00:27:13,720 --> 00:27:17,280 Speaker 1: that group, those norms might not be very welcoming. In fact, 441 00:27:17,280 --> 00:27:20,480 Speaker 1: they could feel downright hostile or demeaning. And again, this 442 00:27:20,560 --> 00:27:24,440 Speaker 1: happens with any group of homogeneous folks, like these norms 443 00:27:24,520 --> 00:27:28,240 Speaker 1: just kind of establish themselves over time. Now, that was 444 00:27:28,280 --> 00:27:32,760 Speaker 1: a large part of what would become the Losing Lena movement. 445 00:27:33,640 --> 00:27:36,200 Speaker 1: You had a lot of different organizations that took part 446 00:27:36,240 --> 00:27:39,520 Speaker 1: in this. In fact, there was a documentary called Losing 447 00:27:39,600 --> 00:27:43,359 Speaker 1: Lena. Women and these organizations really began to push for 448 00:27:43,400 --> 00:27:48,200 Speaker 1: publications and the various technical societies to abandon the use 449 00:27:48,240 --> 00:27:52,159 Speaker 1: of Lena's image as a testing standard.
They argued that 450 00:27:52,200 --> 00:27:56,239 Speaker 1: the tech world is far more diverse than that, that 451 00:27:56,760 --> 00:27:59,359 Speaker 1: there are lots of different voices in the tech space, 452 00:27:59,560 --> 00:28:02,680 Speaker 1: and this image taking such a prominent role in tech 453 00:28:03,160 --> 00:28:07,280 Speaker 1: was really a byproduct of bias from these homogeneous groups 454 00:28:07,320 --> 00:28:12,040 Speaker 1: of mostly white guys who really kind of created the 455 00:28:12,240 --> 00:28:16,719 Speaker 1: culture of the tech space, and they felt that this 456 00:28:17,000 --> 00:28:21,160 Speaker 1: was something that they could actually address and maybe start 457 00:28:21,160 --> 00:28:24,480 Speaker 1: a conversation about all of that. As for Lenna herself, 458 00:28:25,000 --> 00:28:28,000 Speaker 1: she came out in favor of retiring her photos. In 459 00:28:28,000 --> 00:28:31,560 Speaker 1: twenty nineteen, she made a statement saying she had retired 460 00:28:31,560 --> 00:28:33,800 Speaker 1: from modeling and it was time for her to retire 461 00:28:33,840 --> 00:28:37,040 Speaker 1: from technology as well. Now she remains proud of her work, 462 00:28:37,359 --> 00:28:40,880 Speaker 1: and she should, but she also recognizes how the almost 463 00:28:41,160 --> 00:28:45,800 Speaker 1: mandated use of her photograph in digital imagery could reinforce 464 00:28:45,960 --> 00:28:51,440 Speaker 1: an unwelcome environment considering the origin for that picture, Playboy. 465 00:28:51,960 --> 00:28:55,719 Speaker 1: And this is where the conversation really needs to get nuanced. 466 00:28:56,120 --> 00:28:58,880 Speaker 1: I think most folks who criticize the use of Lenna's 467 00:28:58,880 --> 00:29:03,280 Speaker 1: photograph in technology don't harbor any animosity toward Lenna herself.
468 00:29:03,480 --> 00:29:06,200 Speaker 1: They aren't saying that she was a bad person for 469 00:29:06,280 --> 00:29:09,280 Speaker 1: taking the job or working with Playboy or anything like that. 470 00:29:09,680 --> 00:29:11,560 Speaker 1: I don't think most of them are even calling out 471 00:29:11,600 --> 00:29:17,200 Speaker 1: Playboy beyond the fact that Playboy was catering to a group, 472 00:29:17,240 --> 00:29:20,800 Speaker 1: a demographic that also happened to be the same demographic 473 00:29:20,840 --> 00:29:23,520 Speaker 1: that kind of defined the culture of the tech space. 474 00:29:23,960 --> 00:29:27,800 Speaker 1: So that's really the problem, right? There's this deep culture 475 00:29:27,800 --> 00:29:32,280 Speaker 1: in the tech space that has at best been unwelcoming toward women, 476 00:29:32,400 --> 00:29:35,280 Speaker 1: and at worst has been downright hostile. Now, this 477 00:29:35,320 --> 00:29:38,880 Speaker 1: does not mean that every tech department in every business, 478 00:29:39,000 --> 00:29:43,120 Speaker 1: or every school or organization or whatever is bad or 479 00:29:43,160 --> 00:29:46,880 Speaker 1: just staffed entirely with misogynists. That's not what that means. 480 00:29:47,240 --> 00:29:50,040 Speaker 1: But there's also no denying that many of the institutions 481 00:29:50,040 --> 00:29:53,960 Speaker 1: in tech are male centric and male dominated, and this 482 00:29:54,040 --> 00:29:57,640 Speaker 1: poses a challenge. How do you address a male dominated 483 00:29:57,680 --> 00:30:00,400 Speaker 1: industry in a way that opens it up and makes 484 00:30:00,400 --> 00:30:03,480 Speaker 1: it more welcoming to people who aren't male? And to 485 00:30:03,520 --> 00:30:06,640 Speaker 1: be clear, I really do think opening up is a 486 00:30:06,680 --> 00:30:10,800 Speaker 1: good thing.
I think welcoming people from different demographics ultimately 487 00:30:10,840 --> 00:30:14,240 Speaker 1: results in better output, whether it's a service or 488 00:30:14,280 --> 00:30:19,040 Speaker 1: a product. Bringing in people from a diverse collection of experiences 489 00:30:19,040 --> 00:30:23,320 Speaker 1: and backgrounds helps you get new ideas and approaches that 490 00:30:23,360 --> 00:30:26,640 Speaker 1: wouldn't have occurred if you were just working with a homogeneous group. 491 00:30:26,880 --> 00:30:30,320 Speaker 1: You find perspectives that you hadn't considered before. And I 492 00:30:30,440 --> 00:30:34,080 Speaker 1: know that the whole diversity, equity and inclusion topic, the 493 00:30:34,160 --> 00:30:37,760 Speaker 1: DEI topic, is a hot button issue, and often folks 494 00:30:37,800 --> 00:30:41,240 Speaker 1: will weaponize the idea to suggest that the real goal 495 00:30:41,280 --> 00:30:44,600 Speaker 1: of DEI is to water everything down. For some reason, 496 00:30:44,960 --> 00:30:47,200 Speaker 1: I don't know why that's the go to. I don't 497 00:30:47,200 --> 00:30:49,480 Speaker 1: think most people are like, hey, I want everything to 498 00:30:49,520 --> 00:30:53,920 Speaker 1: be crappy. In fact, I think it's the exact opposite. You know, personally, 499 00:30:53,960 --> 00:30:57,640 Speaker 1: I do not believe that making DEI a focus is 500 00:30:57,680 --> 00:31:00,760 Speaker 1: a bad thing. However, I do understand that other folks 501 00:31:00,800 --> 00:31:04,719 Speaker 1: have a very different opinion. And certainly there are instances 502 00:31:04,720 --> 00:31:08,320 Speaker 1: in which a company or organization might make token gestures 503 00:31:08,480 --> 00:31:12,400 Speaker 1: toward DEI but ultimately cause more harm than good in 504 00:31:12,440 --> 00:31:15,080 Speaker 1: the process. I've seen that happen a lot.
I've seen 505 00:31:15,120 --> 00:31:19,320 Speaker 1: companies that have done these sort of token gestures, and 506 00:31:19,440 --> 00:31:24,080 Speaker 1: ultimately it is demoralizing, it can harm their output, and 507 00:31:24,120 --> 00:31:31,680 Speaker 1: it's because there's no genuine commitment toward improving diversity and representation. 508 00:31:32,280 --> 00:31:35,160 Speaker 1: It's more about how can we look good on the 509 00:31:35,200 --> 00:31:38,520 Speaker 1: surface level without having to do the hard work of 510 00:31:38,560 --> 00:31:43,160 Speaker 1: addressing the underlying issues. But that's a topic for another time. Anyway, 511 00:31:43,440 --> 00:31:46,000 Speaker 1: I feel it really does benefit a group to consider 512 00:31:46,040 --> 00:31:48,520 Speaker 1: the opinions and expertise of folks who come from other 513 00:31:48,560 --> 00:31:51,120 Speaker 1: backgrounds and points of view. But to do that, you 514 00:31:51,200 --> 00:31:53,920 Speaker 1: first have to make sure that you're not just discouraging 515 00:31:53,960 --> 00:31:56,640 Speaker 1: these people from participating in the first place. And that's 516 00:31:56,680 --> 00:32:00,480 Speaker 1: what the Losing Lena movement was really about: taking on just 517 00:32:00,800 --> 00:32:05,360 Speaker 1: one component of a much larger cultural issue in technology. 518 00:32:05,720 --> 00:32:08,040 Speaker 1: And there's been some progress on that front. I mean, 519 00:32:08,160 --> 00:32:11,600 Speaker 1: just this week, the Institute of Electrical and Electronics Engineers, 520 00:32:11,720 --> 00:32:14,080 Speaker 1: or I Triple E, or as I like to say, 521 00:32:14,600 --> 00:32:17,960 Speaker 1: I E, just announced that they would no longer 522 00:32:18,000 --> 00:32:21,800 Speaker 1: accept papers that included the Lena image in them.
In fact, editors, 523 00:32:22,000 --> 00:32:24,240 Speaker 1: if they find the image in a paper, are supposed 524 00:32:24,240 --> 00:32:27,200 Speaker 1: to reach out to the authors of that paper and 525 00:32:27,560 --> 00:32:30,040 Speaker 1: work with them to select a different image to go 526 00:32:30,160 --> 00:32:33,959 Speaker 1: in its place, rather than just an outright rejection. So 527 00:32:34,200 --> 00:32:38,840 Speaker 1: the IEEE sent out an email to their members that, 528 00:32:38,920 --> 00:32:42,280 Speaker 1: among other things, stated quote in alignment with this culture 529 00:32:42,320 --> 00:32:44,800 Speaker 1: and with respect to the wishes of the subject of 530 00:32:44,840 --> 00:32:49,480 Speaker 1: the image, Lena Forsén, IEEE will no longer accept 531 00:32:49,520 --> 00:32:53,920 Speaker 1: submitted papers which include the quote unquote Lena image end 532 00:32:53,960 --> 00:32:56,120 Speaker 1: full quote. So the I Triple E is not the 533 00:32:56,120 --> 00:32:59,240 Speaker 1: only organization to announce this kind of a ban. In fact, 534 00:32:59,280 --> 00:33:02,520 Speaker 1: a year before Lenna herself came out to support 535 00:33:02,600 --> 00:33:06,360 Speaker 1: the removal of her image from 536 00:33:06,440 --> 00:33:10,120 Speaker 1: various papers and journals, the scientific journal Nature said they 537 00:33:10,120 --> 00:33:13,400 Speaker 1: would ban any papers that were submitted that included the 538 00:33:13,480 --> 00:33:16,280 Speaker 1: Lena image. So this has been going on for a 539 00:33:16,320 --> 00:33:18,960 Speaker 1: few years. And again, I do think this is a 540 00:33:19,040 --> 00:33:23,320 Speaker 1: very complex issue and it requires compassion and critical thinking 541 00:33:23,680 --> 00:33:27,200 Speaker 1: to approach it properly.
I see really the whole Lena 542 00:33:27,400 --> 00:33:30,800 Speaker 1: image thing as an opportunity to start having deeper, more 543 00:33:30,840 --> 00:33:35,000 Speaker 1: meaningful conversations about the cultural climate in the tech sector 544 00:33:35,120 --> 00:33:38,560 Speaker 1: in general, and really to examine what is and isn't 545 00:33:38,560 --> 00:33:43,200 Speaker 1: working on that cultural level. And yes, that might mean 546 00:33:43,240 --> 00:33:46,440 Speaker 1: that ultimately you have to make some changes in 547 00:33:46,600 --> 00:33:51,080 Speaker 1: how your organization does stuff. Those changes might seem to some 548 00:33:51,200 --> 00:33:54,560 Speaker 1: to be arbitrary or irritating, but if it means 549 00:33:54,600 --> 00:33:57,840 Speaker 1: creating a more welcoming environment where innovation can come from 550 00:33:57,920 --> 00:34:02,240 Speaker 1: new sources, ultimately everybody benefits from that. So 551 00:34:02,240 --> 00:34:04,800 Speaker 1: that's one reason to support these kinds of changes, at 552 00:34:04,880 --> 00:34:07,120 Speaker 1: least the ones that are made at a genuine level 553 00:34:07,160 --> 00:34:10,640 Speaker 1: and not just a way to get good quote unquote optics, 554 00:34:10,760 --> 00:34:14,520 Speaker 1: since we're talking digital imagery here. Plus, if you're really 555 00:34:14,640 --> 00:34:18,600 Speaker 1: upset about the Lena image being phased out of the 556 00:34:18,640 --> 00:34:22,279 Speaker 1: whole tech sphere, that picture's everywhere on the web. It's 557 00:34:22,320 --> 00:34:25,360 Speaker 1: not disappearing. Like, you could do a quick Google image 558 00:34:25,400 --> 00:34:29,120 Speaker 1: search and you're gonna find countless examples of the Lena 559 00:34:29,200 --> 00:34:31,440 Speaker 1: image out there.
I mean, if you wanted to, you 560 00:34:31,440 --> 00:34:34,080 Speaker 1: could probably even track down a vintage copy of the 561 00:34:34,160 --> 00:34:37,439 Speaker 1: nineteen seventy two November issue of Playboy if you look 562 00:34:37,480 --> 00:34:40,400 Speaker 1: hard enough. It's okay if the imagery world moves on 563 00:34:40,520 --> 00:34:43,600 Speaker 1: to adopt other pictures as a means of testing technologies 564 00:34:43,600 --> 00:34:48,160 Speaker 1: and algorithms. I mean, Lenna said so. All right, that's 565 00:34:48,239 --> 00:34:51,040 Speaker 1: it for this episode of tech Stuff. I hope you 566 00:34:51,080 --> 00:34:54,759 Speaker 1: are all well, and I'll talk to you again really soon. 567 00:35:01,280 --> 00:35:05,920 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 568 00:35:06,239 --> 00:35:09,960 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 569 00:35:10,000 --> 00:35:11,080 Speaker 1: to your favorite shows.