1 00:00:04,400 --> 00:00:07,760 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:11,760 --> 00:00:14,000 Speaker 1: Hey there, and welcome to TechStuff. I'm your host, 3 00:00:14,080 --> 00:00:16,560 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio, 4 00:00:16,600 --> 00:00:19,000 Speaker 1: and how the tech are you? It is time for 5 00:00:19,040 --> 00:00:21,680 Speaker 1: another classic episode. We're actually doing a follow-up to 6 00:00:22,000 --> 00:00:26,680 Speaker 1: last week's classic episode. This week's episode is Photo Editing 7 00:00:26,680 --> 00:00:31,200 Speaker 1: and Manipulation, Part Two, originally published on September two, two 8 00:00:31,200 --> 00:00:35,640 Speaker 1: thousand fifteen. Dylan was a guest host on this show. 9 00:00:35,720 --> 00:00:38,160 Speaker 1: He is now a managing executive producer with iHeart, 10 00:00:38,560 --> 00:00:42,560 Speaker 1: so he's stuck with us and has done incredible work. 11 00:00:42,760 --> 00:00:46,800 Speaker 1: The guy's very busy. But way back then we had 12 00:00:46,880 --> 00:00:49,479 Speaker 1: him on as a guest to talk about photo manipulation, 13 00:00:50,000 --> 00:00:54,480 Speaker 1: and yeah, I hope you enjoy this classic episode. Images 14 00:00:54,560 --> 00:00:57,120 Speaker 1: on computers, that's nothing new. We've had those for quite 15 00:00:57,120 --> 00:01:01,240 Speaker 1: some time. I would argue that perhaps the pivotal moment 16 00:01:01,960 --> 00:01:05,640 Speaker 1: in the post-digital era would be nineteen eighty-seven. 17 00:01:05,880 --> 00:01:08,480 Speaker 1: That's when you have a PhD student at the University 18 00:01:08,480 --> 00:01:11,920 Speaker 1: of Michigan named Thomas Knoll who builds a program for 19 00:01:11,959 --> 00:01:15,840 Speaker 1: the Mac Plus. He calls it Display. Oh yeah, yeah, Display. 20 00:01:15,880 --> 00:01:18,319 Speaker 1: Everyone's using that these days. But they are! They just 21 00:01:18,360 --> 00:01:21,280 Speaker 1: don't know. Yeah, it's just not called that anymore. Yeah. 22 00:01:21,360 --> 00:01:25,840 Speaker 1: So in nineteen eighty-eight, he and his brother John Knoll... John Knoll 23 00:01:26,080 --> 00:01:29,959 Speaker 1: was an effects expert at a little company called Industrial Light 24 00:01:30,000 --> 00:01:35,520 Speaker 1: and Magic. Yeah, so a little, little-known Hollywood effects studio, 25 00:01:36,040 --> 00:01:38,800 Speaker 1: just one of the smaller ones. Yeah. So they 26 00:01:40,080 --> 00:01:42,840 Speaker 1: put their heads together, and together they come up with 27 00:01:42,880 --> 00:01:45,800 Speaker 1: an idea. They decide to rename Display, and they call 28 00:01:45,880 --> 00:01:50,680 Speaker 1: it Photoshop, and they license it to a manufacturer of 29 00:01:50,800 --> 00:01:56,480 Speaker 1: slide scanners, and the manufacturer is called Barneyscan. So the 30 00:01:56,600 --> 00:02:00,000 Speaker 1: very first version of Photoshop, which was not version one point oh, 31 00:02:00,120 --> 00:02:03,680 Speaker 1: it preceded version one point oh, was version zero point 32 00:02:03,760 --> 00:02:09,240 Speaker 1: eight seven. It was included with about two hundred scanners. 33 00:02:09,600 --> 00:02:12,320 Speaker 1: So there are about two hundred editions of Photoshop zero point 34 00:02:12,360 --> 00:02:17,000 Speaker 1: eight seven out there? Wow. That's a... yeah, I guess.
35 00:02:18,440 --> 00:02:22,000 Speaker 1: And in nineteen eighty-nine they would sign a distribution deal 36 00:02:22,080 --> 00:02:26,600 Speaker 1: with Adobe, and thus the relationship between Adobe and Photoshop begins. 37 00:02:28,520 --> 00:02:33,799 Speaker 1: Nineteen eighty-nine was also when we get a famous incident of photo manipulation, 38 00:02:34,000 --> 00:02:38,959 Speaker 1: a huge flap, I would say. This was a TV 39 00:02:39,120 --> 00:02:42,480 Speaker 1: Guide cover. I don't know if you're particularly familiar with 40 00:02:42,520 --> 00:02:49,880 Speaker 1: this instance. It is Oprah. Yes, so this was pretty awful, 41 00:02:49,960 --> 00:02:53,880 Speaker 1: I would say. Yeah. So what happened was TV 42 00:02:54,000 --> 00:02:57,600 Speaker 1: Guide puts Oprah on the cover. That's not the egregious, nasty, 43 00:02:57,720 --> 00:03:01,520 Speaker 1: horrible part. That's perfectly fine. She was a ratings champion, 44 00:03:01,720 --> 00:03:06,760 Speaker 1: absolutely deserved to be on the cover of TV Guide. What 45 00:03:06,760 --> 00:03:09,320 Speaker 1: she did not deserve is the way she 46 00:03:09,440 --> 00:03:13,760 Speaker 1: was portrayed. So what happened was they had cut her 47 00:03:13,840 --> 00:03:17,800 Speaker 1: head off of one picture and then superimposed it on 48 00:03:17,880 --> 00:03:21,200 Speaker 1: top of the body of Ann-Margret, and a ten- 49 00:03:21,280 --> 00:03:25,760 Speaker 1: year-old photo of Ann-Margret at that, from nineteen... yeah, 50 00:03:25,960 --> 00:03:30,919 Speaker 1: a nineteen seventy-nine publicity shot of Ann-Margret. Different woman, 51 00:03:31,200 --> 00:03:38,240 Speaker 1: different race. And it was immediately recognized by the 52 00:03:38,280 --> 00:03:40,960 Speaker 1: fashion designer for Ann-Margret, the person who designed the 53 00:03:41,080 --> 00:03:45,080 Speaker 1: dress Ann-Margret was wearing in that publicity photo, and 54 00:03:45,320 --> 00:03:49,520 Speaker 1: it clearly was the same photo, obviously the exact same pose 55 00:03:49,600 --> 00:03:52,880 Speaker 1: and same body. It just had been manipulated so that 56 00:03:53,120 --> 00:03:56,800 Speaker 1: it was supposed to look like Oprah's body. This was not 57 00:03:57,120 --> 00:04:02,200 Speaker 1: one of TV Guide's best moments, and it certainly was 58 00:04:02,240 --> 00:04:06,640 Speaker 1: one of those that really brought this photo 59 00:04:06,680 --> 00:04:11,320 Speaker 1: manipulation concept back into the public eye. Keep in 60 00:04:11,360 --> 00:04:14,280 Speaker 1: mind that a lot of the manipulations we talked about 61 00:04:14,280 --> 00:04:18,360 Speaker 1: in our first episode were, for a really long time, 62 00:04:18,400 --> 00:04:23,920 Speaker 1: not necessarily public knowledge. People weren't aware that a lot 63 00:04:23,920 --> 00:04:26,200 Speaker 1: of this was happening, you know. I think a lot 64 00:04:26,240 --> 00:04:30,200 Speaker 1: of people still believed that when you see a photograph, 65 00:04:30,480 --> 00:04:33,320 Speaker 1: what you're seeing is exactly the way it rolled 66 00:04:33,320 --> 00:04:35,440 Speaker 1: out in real life.
And I think a lot of 67 00:04:35,440 --> 00:04:38,240 Speaker 1: people now would still look at some of these photographs 68 00:04:38,240 --> 00:04:41,120 Speaker 1: from, like, the eighteen sixties and not know that there 69 00:04:41,240 --> 00:04:44,600 Speaker 1: was the ability to do manipulations like that, that they were possible. Right, 70 00:04:44,680 --> 00:04:46,480 Speaker 1: so if you were looking at, say, some of the 71 00:04:46,520 --> 00:04:49,680 Speaker 1: spirit photography that we talked about in our last episode, 72 00:04:50,080 --> 00:04:54,120 Speaker 1: you might be led to believe that somehow the photographers 73 00:04:54,120 --> 00:04:58,160 Speaker 1: of the nineteenth century had an ability to capture spirits 74 00:04:58,200 --> 00:05:02,279 Speaker 1: in photographs. They didn't. They used double exposure. Double exposure 75 00:05:02,279 --> 00:05:04,600 Speaker 1: is no longer a thing, so maybe that's part of 76 00:05:04,600 --> 00:05:08,560 Speaker 1: the problem, because unless you're using film, double exposure... like, 77 00:05:08,600 --> 00:05:11,640 Speaker 1: you could digitally create an effect that's like 78 00:05:11,680 --> 00:05:15,640 Speaker 1: double exposure. You can overlay one layer in Photoshop on 79 00:05:15,680 --> 00:05:19,560 Speaker 1: another and change the opacity. Yes, but as far as 80 00:05:20,040 --> 00:05:25,240 Speaker 1: putting two pictures on the same piece of film, it's 81 00:05:25,240 --> 00:05:28,320 Speaker 1: not exactly the same. Yeah, yeah, I mean, 82 00:05:28,360 --> 00:05:30,520 Speaker 1: you can fake it so that you get 83 00:05:30,560 --> 00:05:32,960 Speaker 1: the same effect, but it's definitely not the 84 00:05:33,000 --> 00:05:39,160 Speaker 1: same process. Nineteen ninety was when Adobe Photoshop one point oh 85 00:05:39,320 --> 00:05:43,359 Speaker 1: would ship. Happy anniversary, Photoshop. Hey, there you go. I 86 00:05:43,400 --> 00:05:45,680 Speaker 1: didn't even make you do the math, because I'm an English 87 00:05:45,720 --> 00:05:50,200 Speaker 1: major, don't blame me. So here's something that's hard 88 00:05:50,240 --> 00:05:55,080 Speaker 1: to believe. Adobe Photoshop one point oh fit on a 89 00:05:55,160 --> 00:05:59,960 Speaker 1: single three-and-a-half-inch floppy disk. That's incredible. Yeah, yeah, 90 00:06:00,080 --> 00:06:03,000 Speaker 1: you think about the size of those 91 00:06:03,080 --> 00:06:06,599 Speaker 1: programs now, and, well, first of all, we don't use 92 00:06:06,640 --> 00:06:10,760 Speaker 1: floppy disks anymore. You know, I don't know if Photoshop 93 00:06:10,839 --> 00:06:13,239 Speaker 1: still... I guess you can still buy it physically, but most people 94 00:06:14,040 --> 00:06:16,040 Speaker 1: download it from the cloud now, right? Yeah, you get a 95 00:06:16,080 --> 00:06:20,080 Speaker 1: digital download, and it's sizeable. It would take many three- 96 00:06:20,080 --> 00:06:22,480 Speaker 1: and-a-half-inch disks to hold it now. And 97 00:06:22,760 --> 00:06:26,279 Speaker 1: Photoshop one point oh contained several of the tools 98 00:06:26,320 --> 00:06:29,839 Speaker 1: that would later become the standard features of Photoshop. That's 99 00:06:29,839 --> 00:06:33,280 Speaker 1: pretty amazing to me. Yeah, the clone tool was included 100 00:06:33,320 --> 00:06:36,279 Speaker 1: in Photoshop one point oh.
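The digital stand-in for double exposure that comes up above, overlaying one image on another and dialing down the opacity, can be sketched in a few lines of Python with the Pillow imaging library. This is a minimal illustration, not anything from the episode, and the file names are placeholders.

```python
from PIL import Image

# Load a base photo and a second "ghost" image at the same size.
base = Image.open("portrait.jpg").convert("RGBA")
ghost = Image.open("figure.jpg").convert("RGBA").resize(base.size)

# Image.blend mixes the two images; a low alpha keeps the overlay
# faint, much like the weaker second exposure in a spirit photograph.
spirit = Image.blend(base, ghost, alpha=0.3)
spirit.convert("RGB").save("spirit_photo.jpg")
```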
Yeah, and you 101 00:06:36,360 --> 00:06:40,800 Speaker 1: already had the ability to select certain parts of 102 00:06:40,800 --> 00:06:48,479 Speaker 1: an image and change hue and saturation levels or tones. Since 103 00:06:48,520 --> 00:06:51,920 Speaker 1: I never used one point oh, thinking back and imagining 104 00:06:51,960 --> 00:06:55,880 Speaker 1: that that was possible back then, already in some form, 105 00:06:56,040 --> 00:07:00,000 Speaker 1: is pretty amazing. Although the great thing about Photoshop 106 00:07:00,200 --> 00:07:04,120 Speaker 1: was that, in comparison to its competitors, it 107 00:07:04,240 --> 00:07:08,839 Speaker 1: was easier, more intuitive, and that's the way it was marketed. Right. 108 00:07:08,880 --> 00:07:11,920 Speaker 1: So keep in mind that what we talked about in 109 00:07:12,000 --> 00:07:16,360 Speaker 1: Part One, those means of manipulating and editing photos, 110 00:07:17,080 --> 00:07:19,920 Speaker 1: that was a specialized skill. Like, if you were not 111 00:07:20,040 --> 00:07:23,840 Speaker 1: a photographer, chances are you were unaware of how this 112 00:07:23,840 --> 00:07:29,560 Speaker 1: trickery or manipulation took place. And you 113 00:07:29,600 --> 00:07:31,960 Speaker 1: can almost think of it like a guild system. 114 00:07:32,000 --> 00:07:35,240 Speaker 1: It wasn't formal like that, but it was this thing 115 00:07:35,240 --> 00:07:37,640 Speaker 1: that if you were not part of that club, you 116 00:07:37,800 --> 00:07:40,840 Speaker 1: were largely ignorant of what was going on in order 117 00:07:40,880 --> 00:07:44,800 Speaker 1: for that stuff to happen. Photoshop's approach meant that 118 00:07:44,840 --> 00:07:47,480 Speaker 1: it suddenly became much more accessible. It still had a 119 00:07:47,600 --> 00:07:50,040 Speaker 1: barrier to entry. There was still a learning curve to 120 00:07:50,160 --> 00:07:53,560 Speaker 1: be conquered, but it was much lower than, 121 00:07:53,920 --> 00:07:58,560 Speaker 1: say, having to go enroll in classes in photography 122 00:07:58,680 --> 00:08:00,760 Speaker 1: and all the chemistry involved. There was 123 00:08:00,800 --> 00:08:04,200 Speaker 1: no longer chemistry involved, no chemistry in 124 00:08:04,240 --> 00:08:07,760 Speaker 1: a photo that didn't need to have chemistry, and you 125 00:08:07,800 --> 00:08:10,080 Speaker 1: didn't have to go into the darkroom and stand 126 00:08:10,080 --> 00:08:12,280 Speaker 1: there with the tools if you wanted to lighten a 127 00:08:12,360 --> 00:08:15,120 Speaker 1: part of an image or darken it. Yeah. I loved 128 00:08:15,160 --> 00:08:17,080 Speaker 1: that in Part One you talked about the 129 00:08:17,120 --> 00:08:19,760 Speaker 1: experience of going into a pitch-black room in order 130 00:08:19,800 --> 00:08:23,240 Speaker 1: to manipulate a camera so that you don't expose the 131 00:08:23,280 --> 00:08:26,600 Speaker 1: film to light, thus ruining the film, because obviously the 132 00:08:26,600 --> 00:08:30,840 Speaker 1: film is photoreactive. Since it's photoreactive, any 133 00:08:30,920 --> 00:08:34,280 Speaker 1: light is going to make it react.
And the idea 134 00:08:34,320 --> 00:08:36,400 Speaker 1: of fumbling around in the dark that first time, where 135 00:08:36,400 --> 00:08:39,120 Speaker 1: you're trying to build that familiarity, it made me think 136 00:08:39,160 --> 00:08:43,439 Speaker 1: of every scene in every military movie where someone has 137 00:08:43,520 --> 00:08:46,320 Speaker 1: to take apart a gun and then reassemble it. It 138 00:08:46,400 --> 00:08:49,080 Speaker 1: sounds like that: like, after 139 00:08:49,160 --> 00:08:52,480 Speaker 1: week five or six, you just do it confidently. You're not 140 00:08:52,520 --> 00:08:54,920 Speaker 1: even thinking about it. It's no longer conscious thought. It's just 141 00:08:55,000 --> 00:08:58,520 Speaker 1: muscle memory. Yeah, and that's great. But it's 142 00:08:58,559 --> 00:09:01,920 Speaker 1: also great now, as much as I love film photography, 143 00:09:01,960 --> 00:09:03,800 Speaker 1: to do that on a daily basis is not as 144 00:09:03,840 --> 00:09:06,679 Speaker 1: easy as taking a digital SLR and putting an SD 145 00:09:06,840 --> 00:09:10,320 Speaker 1: card in the slot. Yeah, it turns out that, you know, you 146 00:09:10,360 --> 00:09:12,320 Speaker 1: can even do that with the lights on. You can. 147 00:09:14,440 --> 00:09:18,560 Speaker 1: That's great. So the cool thing also about Adobe Photoshop 148 00:09:18,679 --> 00:09:20,840 Speaker 1: one point oh is it was being used for some 149 00:09:20,960 --> 00:09:25,440 Speaker 1: really high-profile feature films at the time. There 150 00:09:25,480 --> 00:09:29,720 Speaker 1: were actual effects studios that were using elements of Photoshop 151 00:09:30,120 --> 00:09:34,679 Speaker 1: to work on composite digital shots of their films, including 152 00:09:34,720 --> 00:09:39,200 Speaker 1: movies like The Abyss. So that's James Cameron's big movie 153 00:09:39,280 --> 00:09:42,280 Speaker 1: that, depending on which version you're seeing, you're dedicating about 154 00:09:42,320 --> 00:09:46,040 Speaker 1: four hours of your life to. There's The Rocketeer, which is a 155 00:09:46,559 --> 00:09:51,120 Speaker 1: highly underrated, fantastic sci-fi fantasy film. Yeah, yeah, I 156 00:09:51,160 --> 00:09:54,000 Speaker 1: remember seeing that when I was really young. It's a 157 00:09:54,480 --> 00:09:56,439 Speaker 1: great movie. Go check it out. If you 158 00:09:56,480 --> 00:09:58,280 Speaker 1: haven't seen The Rocketeer, you've got to see The Rocketeer. 159 00:09:58,360 --> 00:10:00,520 Speaker 1: It's actually a really good superhero film. It 160 00:10:00,600 --> 00:10:05,200 Speaker 1: just didn't get adopted by a larger audience the way 161 00:10:05,200 --> 00:10:08,440 Speaker 1: some of the ones today did. Terminator Two was one 162 00:10:08,480 --> 00:10:13,400 Speaker 1: of the movies that Photoshop one point oh influenced. James Cameron again, yeah. 163 00:10:13,559 --> 00:10:16,040 Speaker 1: So if you, you know, if you 164 00:10:16,120 --> 00:10:19,439 Speaker 1: really want to celebrate, give Photoshop one point oh a 165 00:10:19,600 --> 00:10:24,680 Speaker 1: very slowly descending thumbs up. And then one more, a 166 00:10:24,679 --> 00:10:28,000 Speaker 1: movie that I love despite its flaws: Hook, 167 00:10:28,600 --> 00:10:33,320 Speaker 1: the Steven Spielberg movie.
Yeah, the Steven Spielberg retelling of 168 00:10:33,400 --> 00:10:36,120 Speaker 1: Peter Pan, where Robin Williams is a grown-up Peter 169 00:10:36,200 --> 00:10:39,920 Speaker 1: Pan and Dustin Hoffman is Captain Hook. Dustin Hoffman, by 170 00:10:39,920 --> 00:10:42,960 Speaker 1: the way, phenomenal Captain Hook. I loved 171 00:10:42,960 --> 00:10:45,080 Speaker 1: his performance. I wish there were more Hook in the 172 00:10:45,120 --> 00:10:47,320 Speaker 1: movie Hook, because every time he was on screen, I 173 00:10:47,360 --> 00:10:50,520 Speaker 1: was happy. He's just pretty phenomenal. Yeah. It was a 174 00:10:50,520 --> 00:10:54,160 Speaker 1: totally different style of Hook than I was anticipating, and 175 00:10:54,200 --> 00:10:57,120 Speaker 1: I loved every second of it. Anyway, all of those 176 00:10:57,280 --> 00:11:01,360 Speaker 1: had influence from Photoshop. They were digital composites, and we 177 00:11:01,400 --> 00:11:05,400 Speaker 1: talked previously, in the earlier episode, about composite photographs. In 178 00:11:05,440 --> 00:11:08,280 Speaker 1: that sense, there were composites that were completely made through 179 00:11:08,360 --> 00:11:11,600 Speaker 1: analog methods, where you're literally cutting and pasting, either in 180 00:11:11,600 --> 00:11:13,880 Speaker 1: negative form or print form, and taking a new 181 00:11:13,920 --> 00:11:17,840 Speaker 1: photograph, whatever it may be. But you were putting together 182 00:11:17,880 --> 00:11:21,880 Speaker 1: two or more images to create a new image that's 183 00:11:21,880 --> 00:11:25,439 Speaker 1: a composite of those previous ones. Now we're talking about 184 00:11:25,440 --> 00:11:29,160 Speaker 1: doing the same thing but using digital tools, which gives 185 00:11:29,160 --> 00:11:32,240 Speaker 1: you a lot more freedom. It does. And soon 186 00:11:32,800 --> 00:11:36,200 Speaker 1: thereafter, I think three point oh was the introduction of layers, 187 00:11:36,200 --> 00:11:41,559 Speaker 1: which changes everything. Yeah, you have non-destructive editing. Please, 188 00:11:41,760 --> 00:11:44,720 Speaker 1: can you explain what that is? Non-destructive editing is 189 00:11:44,960 --> 00:11:50,680 Speaker 1: a godsend, because, I guess, when most people 190 00:11:50,679 --> 00:11:53,840 Speaker 1: open up Photoshop for the first time and they're messing around, 191 00:11:53,920 --> 00:11:56,080 Speaker 1: you see things like the eraser, things that kind of 192 00:11:56,120 --> 00:11:58,080 Speaker 1: make sense. Like, maybe if you ever opened up a 193 00:11:58,120 --> 00:11:59,880 Speaker 1: picture in MS Paint and you thought, like, I can 194 00:12:00,040 --> 00:12:04,240 Speaker 1: paint on this. But once you do that, you're 195 00:12:04,280 --> 00:12:06,400 Speaker 1: limited by how many times you can hit the 196 00:12:06,480 --> 00:12:11,840 Speaker 1: undo button. But with layers, you can continually add 197 00:12:11,880 --> 00:12:16,920 Speaker 1: on different elements, on different... the best way 198 00:12:16,960 --> 00:12:18,920 Speaker 1: to say it is layers.
So it's like putting pieces of 199 00:12:18,920 --> 00:12:21,920 Speaker 1: paper over one another, kind of, or clear acetate, 200 00:12:21,920 --> 00:12:28,120 Speaker 1: maybe. And so if you subtract from one 201 00:12:28,400 --> 00:12:30,480 Speaker 1: part of that layer, if you still have a copy 202 00:12:30,520 --> 00:12:34,560 Speaker 1: of that underneath, nothing's going to happen to the original 203 00:12:34,600 --> 00:12:36,199 Speaker 1: part of that image, and you're still going to 204 00:12:36,320 --> 00:12:38,959 Speaker 1: have it on the bottom layer. If you don't edit that 205 00:12:39,040 --> 00:12:41,360 Speaker 1: layer, the whole image... you're still going to have the 206 00:12:41,360 --> 00:12:44,160 Speaker 1: whole image. So you have your original as the base, 207 00:12:44,240 --> 00:12:47,120 Speaker 1: the foundation. Yes. And then you lay other layers on 208 00:12:47,160 --> 00:12:49,360 Speaker 1: top of it. That's where you can do your manipulation. 209 00:12:49,480 --> 00:12:51,720 Speaker 1: And then maybe you decide, you know, I 210 00:12:51,760 --> 00:12:54,400 Speaker 1: took that balloon out, but I actually kind of want 211 00:12:54,400 --> 00:12:56,480 Speaker 1: that balloon back in now that I've done these other edits. 212 00:12:56,480 --> 00:12:59,040 Speaker 1: I'm gonna go back and re-add that one layer, or 213 00:12:59,120 --> 00:13:01,800 Speaker 1: undo what I did on that one layer. And now, 214 00:13:02,120 --> 00:13:05,360 Speaker 1: as if by magic, we get the balloon back. Exactly. 215 00:13:05,400 --> 00:13:07,360 Speaker 1: And I'm sure a lot of people have noticed 216 00:13:07,400 --> 00:13:11,079 Speaker 1: that at the top of your application 217 00:13:11,160 --> 00:13:13,880 Speaker 1: you have different tabs, menus like Adjustments, and 218 00:13:13,960 --> 00:13:17,280 Speaker 1: you can go and you can change the contrast, the saturation, 219 00:13:17,400 --> 00:13:21,120 Speaker 1: things like that. Well, with adjustment layers, you can add 220 00:13:21,200 --> 00:13:24,360 Speaker 1: that on, and then you can turn off 221 00:13:24,360 --> 00:13:27,280 Speaker 1: that layer and the effect goes away. If you do 222 00:13:27,320 --> 00:13:30,559 Speaker 1: that through the file menu at the top instead, it changes 223 00:13:30,840 --> 00:13:34,400 Speaker 1: the picture permanently. If you don't have 224 00:13:34,960 --> 00:13:37,360 Speaker 1: the back button, if you don't have undo... yeah, 225 00:13:37,360 --> 00:13:40,079 Speaker 1: as long as you avoid saving it, 226 00:13:41,040 --> 00:13:43,880 Speaker 1: you still have your original. But then what's the point 227 00:13:43,920 --> 00:13:46,560 Speaker 1: of doing all the work? Yeah, it's a great point. 228 00:13:46,640 --> 00:13:49,920 Speaker 1: I mean, I've often wondered, because as someone who has 229 00:13:49,960 --> 00:13:54,200 Speaker 1: only casually used these editing tools myself, it took 230 00:13:54,240 --> 00:13:56,760 Speaker 1: a while for me to figure out what layers were.
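A rough sketch of the layer model being described, in Python with Pillow. This assumes nothing about Photoshop's actual internals; the Layer and Document names are invented for illustration. The point is that the base image is never modified, so toggling a layer off and re-flattening is the non-destructive "get the balloon back" move.

```python
from dataclasses import dataclass, field
from PIL import Image

@dataclass
class Layer:
    image: Image.Image    # this layer's pixels (RGBA)
    opacity: float = 1.0  # 0.0 fully transparent, 1.0 fully opaque
    visible: bool = True  # setting this False "turns off" the layer

@dataclass
class Document:
    base: Image.Image     # the untouched original
    layers: list = field(default_factory=list)

    def flatten(self) -> Image.Image:
        """Composite visible layers over a copy of the base;
        the original pixels are never touched."""
        out = self.base.convert("RGBA").copy()
        for layer in self.layers:
            if not layer.visible:
                continue  # a hidden layer contributes nothing
            overlay = layer.image.convert("RGBA").copy()
            # Scale the layer's alpha channel by its opacity setting.
            alpha = overlay.getchannel("A").point(lambda a: int(a * layer.opacity))
            overlay.putalpha(alpha)
            out.alpha_composite(overlay)
        return out
```

Flip a layer's visible flag back on and call flatten() again, and the balloon reappears: exactly the undo-any-time behavior the hosts are describing.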
231 00:13:57,200 --> 00:14:01,200 Speaker 1: And, like, I remember specifically using them to 232 00:14:01,280 --> 00:14:05,240 Speaker 1: do things like, if I wanted to take an image... 233 00:14:06,120 --> 00:14:10,000 Speaker 1: like, remove an image on a white background. 234 00:14:10,040 --> 00:14:12,560 Speaker 1: Let's say it's a stock photo of something, and 235 00:14:12,600 --> 00:14:14,720 Speaker 1: I want to put that image, the image of that 236 00:14:14,800 --> 00:14:17,360 Speaker 1: thing, on some other background and not have a big 237 00:14:17,400 --> 00:14:20,720 Speaker 1: white box behind it. That's when it was 238 00:14:20,800 --> 00:14:23,080 Speaker 1: useful if that was a layer itself, like I could 239 00:14:23,080 --> 00:14:26,040 Speaker 1: take the layer that had the object and leave the 240 00:14:26,040 --> 00:14:29,800 Speaker 1: white background behind, so I could actually put that against 241 00:14:29,800 --> 00:14:31,840 Speaker 1: a different background and have it look like it 242 00:14:31,840 --> 00:14:34,840 Speaker 1: was meant to be there. Another huge breakthrough 243 00:14:34,880 --> 00:14:38,320 Speaker 1: is masks. And if you use layers and masks together, you 244 00:14:38,520 --> 00:14:42,320 Speaker 1: have so much control, and you can go 245 00:14:42,520 --> 00:14:45,360 Speaker 1: backwards or forwards as much as you want in your document. 246 00:14:45,440 --> 00:14:49,560 Speaker 1: With masks, you can select a certain part of 247 00:14:49,720 --> 00:14:52,680 Speaker 1: the image and then you can mask out the rest 248 00:14:52,720 --> 00:14:56,120 Speaker 1: of it, so you just have that part selected, 249 00:14:56,160 --> 00:14:59,440 Speaker 1: and then you can use the paintbrush to paint 250 00:14:59,480 --> 00:15:02,360 Speaker 1: in or paint out to kind of clean up the 251 00:15:02,400 --> 00:15:04,960 Speaker 1: lines that you just selected. If you want more of 252 00:15:05,080 --> 00:15:07,080 Speaker 1: that layer that you just blanked part of out, 253 00:15:07,120 --> 00:15:08,840 Speaker 1: you can erase part of it and you 254 00:15:08,840 --> 00:15:13,720 Speaker 1: can bring it back in. So layers and masks, I 255 00:15:13,760 --> 00:15:17,640 Speaker 1: think, were big game changers for photo editors and photo 256 00:15:17,720 --> 00:15:20,960 Speaker 1: manipulators, whether they used it as an art or they used 257 00:15:20,960 --> 00:15:22,920 Speaker 1: it to deceive, whatever you want to use it for. 258 00:15:23,080 --> 00:15:27,440 Speaker 1: Photoshop was making it possible. As we have 259 00:15:27,600 --> 00:15:30,480 Speaker 1: said many times on TechStuff, it's not the tool 260 00:15:31,280 --> 00:15:35,080 Speaker 1: that's necessarily the issue or the problem. It's the way 261 00:15:35,080 --> 00:15:38,480 Speaker 1: the tool is implemented. And the purpose of these tools was 262 00:15:38,520 --> 00:15:41,720 Speaker 1: to give greater freedom to people who work in images, 263 00:15:42,280 --> 00:15:45,480 Speaker 1: and what you do with that freedom is up to 264 00:15:45,600 --> 00:15:48,240 Speaker 1: the choice of the individual. And in some cases it 265 00:15:48,320 --> 00:15:50,600 Speaker 1: was to make art.
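And the mask idea from just above, again as a hedged sketch with placeholder file names: Pillow's Image.composite takes a grayscale mask, so painting the mask lighter or darker "brings back" or "paints out" parts of the top image without destroying either original.

```python
from PIL import Image

subject = Image.open("subject_on_white.png").convert("RGBA")
background = Image.open("new_background.jpg").convert("RGBA").resize(subject.size)

# Grayscale mask: white areas keep the subject, black areas
# reveal the background, and gray values blend the two.
mask = Image.open("mask.png").convert("L")

result = Image.composite(subject, background, mask)
result.convert("RGB").save("composited.jpg")
```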
In some cases it was just 266 00:15:50,640 --> 00:15:53,600 Speaker 1: to clean up an image so that the point that 267 00:15:53,680 --> 00:15:56,560 Speaker 1: the photographer wanted you to focus on was in fact 268 00:15:56,720 --> 00:16:00,200 Speaker 1: the most notable element of that image. Like, if I'm 269 00:16:00,240 --> 00:16:02,880 Speaker 1: taking a photo of something and I think, like, oh, 270 00:16:02,880 --> 00:16:06,520 Speaker 1: it's this great background, and I've got a wonderful subject 271 00:16:06,560 --> 00:16:09,800 Speaker 1: in the foreground, and I'm really concentrating on my subject, 272 00:16:09,840 --> 00:16:12,960 Speaker 1: and I don't notice that there's some jackass twenty feet 273 00:16:13,000 --> 00:16:16,720 Speaker 1: back who's, you know, waving maniacally into the camera. And 274 00:16:16,720 --> 00:16:19,760 Speaker 1: then I think, I really don't want that guy there. 275 00:16:19,920 --> 00:16:23,400 Speaker 1: I mean, you know, just pop open a new layer 276 00:16:23,400 --> 00:16:27,480 Speaker 1: in Photoshop and clone stamp him out. I like 277 00:16:27,560 --> 00:16:29,800 Speaker 1: the way you think. Well, when you see that guy 278 00:16:29,880 --> 00:16:32,800 Speaker 1: on the street, you're like, I clone stamped you. Exactly. He 279 00:16:32,840 --> 00:16:35,920 Speaker 1: doesn't know what you mean. There's a good chance 280 00:16:35,960 --> 00:16:38,520 Speaker 1: he has no idea what a sick burn that is, 281 00:16:40,560 --> 00:16:43,240 Speaker 1: which is essentially saying, like, I did 282 00:16:43,240 --> 00:16:47,200 Speaker 1: to you what Stalin did to the commissar, only in 283 00:16:47,240 --> 00:16:51,760 Speaker 1: the photo sense, though. You're free to go. We'll be 284 00:16:51,880 --> 00:16:55,520 Speaker 1: back to talk about more photo editing and manipulation in 285 00:16:55,720 --> 00:17:09,119 Speaker 1: just a moment, after this break. So, some other instances 286 00:17:09,160 --> 00:17:12,680 Speaker 1: of photo manipulation, some of which were not necessarily meant 287 00:17:12,720 --> 00:17:17,359 Speaker 1: to be taken as a representation of the actual person, 288 00:17:17,680 --> 00:17:22,520 Speaker 1: some of which, it's arguable. Texas Monthly magazine had a 289 00:17:22,640 --> 00:17:26,840 Speaker 1: great image of Governor Ann Richards on a Harley-Davidson. 290 00:17:27,119 --> 00:17:30,080 Speaker 1: That's a great one, and she liked it. Yeah, yeah, 291 00:17:30,080 --> 00:17:32,600 Speaker 1: she looked great. But if you actually looked in the 292 00:17:32,640 --> 00:17:35,720 Speaker 1: credit page of the magazine, they indicate that the 293 00:17:36,480 --> 00:17:40,400 Speaker 1: motorcyclist was a stock image, and so if you were 294 00:17:40,520 --> 00:17:42,879 Speaker 1: to take the trouble of looking at that 295 00:17:42,920 --> 00:17:46,280 Speaker 1: credit page, you would see that they don't hide the 296 00:17:46,280 --> 00:17:49,200 Speaker 1: fact that this was a manipulation. And that's almost become 297 00:17:49,280 --> 00:17:54,320 Speaker 1: common practice for manipulated covers. Maybe if it's someone's head 298 00:17:54,359 --> 00:17:57,480 Speaker 1: on someone else's body, yeah, which is a big thing, 299 00:17:57,840 --> 00:17:59,919 Speaker 1: it will have a small disclaimer, either on the 300 00:18:00,000 --> 00:18:02,560 Speaker 1: credit page or very small on the bottom of the 301 00:18:02,600 --> 00:18:08,639 Speaker 1: cover, that it is an illustration.
Yeah. Yeah. And nineteen ninety- 302 00:18:08,720 --> 00:18:11,600 Speaker 1: three, that's when Photoshop would come to Windows. So finally 303 00:18:11,640 --> 00:18:14,399 Speaker 1: Windows users were able to stand side by side with 304 00:18:14,480 --> 00:18:17,960 Speaker 1: Mac users, with sixteen-bit support. That was 305 00:18:18,000 --> 00:18:22,040 Speaker 1: the big thing that year, I think, for Windows. 306 00:18:22,160 --> 00:18:28,280 Speaker 1: Here's a famous instance of photo manipulation that was a 307 00:18:28,320 --> 00:18:32,600 Speaker 1: big controversy. This was just in 308 00:18:32,640 --> 00:18:34,760 Speaker 1: the wake of O. J. Simpson being arrested on the 309 00:18:34,840 --> 00:18:38,000 Speaker 1: charge of murder, and two magazines came out at the 310 00:18:38,040 --> 00:18:43,199 Speaker 1: same time, both using O. J. Simpson's mug shot. Newsweek 311 00:18:43,320 --> 00:18:48,760 Speaker 1: had the unaltered mug shot as the cover. Time magazine 312 00:18:48,800 --> 00:18:52,639 Speaker 1: had an altered version of that mug shot. Yes, they 313 00:18:53,520 --> 00:18:57,960 Speaker 1: manipulated it to make him appear darker and more menacing, 314 00:18:59,240 --> 00:19:08,600 Speaker 1: which is, I would argue, bad photojournalism. And yes, yes, 315 00:19:08,840 --> 00:19:14,200 Speaker 1: both of those things. It is, I would argue, indefensible. Yeah, 316 00:19:14,280 --> 00:19:16,439 Speaker 1: that was one of the things that they show you 317 00:19:16,480 --> 00:19:21,040 Speaker 1: in most photography, history-of-photography classes. Yeah, 318 00:19:21,160 --> 00:19:25,400 Speaker 1: quite literally pointing at that and saying, do not be this person. Yes, 319 00:19:26,200 --> 00:19:30,600 Speaker 1: it's a bad idea. It says a 320 00:19:30,600 --> 00:19:34,920 Speaker 1: lot of really ugly things about not just the mentality 321 00:19:35,200 --> 00:19:38,800 Speaker 1: of the editors of the magazine at the time, 322 00:19:38,840 --> 00:19:43,120 Speaker 1: but also their opinion of the American public, and perhaps 323 00:19:43,280 --> 00:19:48,560 Speaker 1: even, by extension, the overall dominant opinion of the American public. 324 00:19:48,600 --> 00:19:50,679 Speaker 1: None of that comes out well 325 00:19:51,000 --> 00:19:53,560 Speaker 1: in that instance. To say that this would 326 00:19:53,600 --> 00:19:57,080 Speaker 1: be seen as a reasonable use of photo editing and 327 00:19:57,119 --> 00:20:00,720 Speaker 1: that it would be accepted... all 328 00:20:00,720 --> 00:20:03,840 Speaker 1: of that is just ugly, and it's unfortunate. Yeah, and 329 00:20:04,040 --> 00:20:07,560 Speaker 1: it's good that Newsweek ran the same picture, 330 00:20:08,040 --> 00:20:11,240 Speaker 1: which is something that I think is uniquely pre- 331 00:20:11,760 --> 00:20:16,720 Speaker 1: internet-saturation, that it would happen every 332 00:20:16,760 --> 00:20:19,320 Speaker 1: once in a while that Newsweek or Time would run 333 00:20:19,359 --> 00:20:22,359 Speaker 1: the same image on their... on their front... I almost 334 00:20:22,359 --> 00:20:26,200 Speaker 1: said their homepage. Yeah, well, the same image. In that world, 335 00:20:25,800 --> 00:20:29,000 Speaker 1: I will often end up referring to old media with 336 00:20:29,080 --> 00:20:31,960 Speaker 1: new media terms.
And then I realize, like, I've been 337 00:20:31,960 --> 00:20:34,280 Speaker 1: in this business a while now, and this 338 00:20:34,320 --> 00:20:37,840 Speaker 1: is reality now. That's so weird. I remember before there 339 00:20:37,880 --> 00:20:41,800 Speaker 1: was a World Wide Web. Showing my age. All right, 340 00:20:41,880 --> 00:20:44,399 Speaker 1: I've got one more specific one I want to 341 00:20:44,400 --> 00:20:47,240 Speaker 1: talk about, and I'm sure you have some more examples too. 342 00:20:47,280 --> 00:20:49,280 Speaker 1: But the one that I was thinking of immediately 343 00:20:49,560 --> 00:20:52,040 Speaker 1: kind of ties in, in a way, in that 344 00:20:52,200 --> 00:20:56,400 Speaker 1: it has to do with race. So the Time image 345 00:20:56,480 --> 00:21:02,040 Speaker 1: was a racist image that was, again, just terrible photojournalism. 346 00:21:02,080 --> 00:21:04,840 Speaker 1: The one I was going to mention was an attempt 347 00:21:05,520 --> 00:21:09,320 Speaker 1: by the University of Wisconsin to demonstrate that they had 348 00:21:09,359 --> 00:21:15,439 Speaker 1: a diverse student population by showing a group of 349 00:21:15,480 --> 00:21:18,800 Speaker 1: football fans at a game that included a black student 350 00:21:19,119 --> 00:21:21,480 Speaker 1: in the middle of, to the side of, several white 351 00:21:21,480 --> 00:21:24,560 Speaker 1: students, all cheering on a football game. The only problem 352 00:21:24,640 --> 00:21:28,919 Speaker 1: was it was a composite photo. The black student's photo 353 00:21:29,080 --> 00:21:32,560 Speaker 1: had been taken a year apart from the football fans' photo, 354 00:21:32,640 --> 00:21:34,919 Speaker 1: and then the black student had been inserted into the 355 00:21:34,960 --> 00:21:39,640 Speaker 1: picture in order to perpetuate this idea that the University 356 00:21:39,640 --> 00:21:43,320 Speaker 1: of Wisconsin had a diverse student body. Yes, and it 357 00:21:43,480 --> 00:21:46,800 Speaker 1: is bad. And I mean, like, the photoshopping itself is bad. 358 00:21:46,840 --> 00:21:48,600 Speaker 1: It does not look like that person is in the 359 00:21:48,640 --> 00:21:51,400 Speaker 1: same place. There was a publication in Toronto. I think 360 00:21:51,440 --> 00:21:57,200 Speaker 1: it was, like, a come-to-Toronto 361 00:21:57,280 --> 00:21:59,199 Speaker 1: publication, and they were trying to show on the 362 00:21:59,280 --> 00:22:01,560 Speaker 1: cover a family, and so they did the same 363 00:22:01,600 --> 00:22:04,080 Speaker 1: thing that they did with the Wisconsin cover. And it 364 00:22:04,400 --> 00:22:07,200 Speaker 1: hits me on two levels, both of these covers: 365 00:22:07,480 --> 00:22:12,480 Speaker 1: it's not good practice, and 366 00:22:12,720 --> 00:22:15,920 Speaker 1: with all the effort and time 367 00:22:15,960 --> 00:22:18,400 Speaker 1: that you went through to fake those photographs, you could 368 00:22:18,440 --> 00:22:22,439 Speaker 1: have gotten an actual photograph. Number one, it would have 369 00:22:22,480 --> 00:22:25,400 Speaker 1: probably been less time-consuming just to get a group 370 00:22:25,440 --> 00:22:27,920 Speaker 1: of people together and say, hey, let's take your picture, 371 00:22:28,000 --> 00:22:29,960 Speaker 1: we're gonna pose you here, even if there's a game 372 00:22:30,040 --> 00:22:32,080 Speaker 1: and we don't take a picture.
So it's not very ethical, 373 00:22:32,200 --> 00:22:35,520 Speaker 1: it's not very well done, and it's not in good taste. 374 00:22:35,520 --> 00:22:39,159 Speaker 1: And then also, like, the editing is really bad. You 375 00:22:39,200 --> 00:22:41,880 Speaker 1: look at it and you know. Yeah, I mean, even 376 00:22:41,880 --> 00:22:44,600 Speaker 1: with an untrained eye, it definitely does not look like 377 00:22:44,640 --> 00:22:46,800 Speaker 1: these photos were taken on the same day, because they weren't. 378 00:22:46,800 --> 00:22:49,119 Speaker 1: They were taken a year apart. And, you know, 379 00:22:49,400 --> 00:22:51,600 Speaker 1: the University of Wisconsin, what they said was, they said, 380 00:22:52,720 --> 00:22:56,040 Speaker 1: we do have black students in our student body. We 381 00:22:56,080 --> 00:23:00,840 Speaker 1: didn't have any photos that represented the diversity, so we 382 00:23:01,400 --> 00:23:04,240 Speaker 1: made one. Yeah. And not to get on that 383 00:23:04,320 --> 00:23:08,720 Speaker 1: soapbox, but if you want diversity, do it. Yeah. Like, 384 00:23:08,960 --> 00:23:12,280 Speaker 1: don't create the illusion of it by manipulation of photographs, 385 00:23:12,320 --> 00:23:15,639 Speaker 1: but put the effort in to get the photograph, the 386 00:23:15,640 --> 00:23:18,280 Speaker 1: real photograph; get the real diversity of the campus, things 387 00:23:18,320 --> 00:23:24,080 Speaker 1: like that. Yeah, I agree, because 388 00:23:24,119 --> 00:23:26,280 Speaker 1: even if you do the 389 00:23:26,280 --> 00:23:30,440 Speaker 1: photo manipulation really, really well, all it takes is someone who has any 390 00:23:30,480 --> 00:23:33,120 Speaker 1: knowledge of what's going on to say, yeah, it didn't 391 00:23:33,160 --> 00:23:36,879 Speaker 1: really happen, and then everything unravels, and then you have 392 00:23:36,920 --> 00:23:41,880 Speaker 1: a PR problem that's way bigger than the perceived lack 393 00:23:41,920 --> 00:23:47,119 Speaker 1: of diversity. So, definitely agree: address the underlying problem 394 00:23:47,160 --> 00:23:50,760 Speaker 1: of diversity before you start worrying about, you know, 395 00:23:50,840 --> 00:23:54,760 Speaker 1: the brochures you were handing out. I'm 396 00:23:54,800 --> 00:23:58,000 Speaker 1: definitely on the same page. Yeah. And I all of 397 00:23:58,000 --> 00:24:00,320 Speaker 1: a sudden thought of, like... there are a lot of 398 00:24:00,320 --> 00:24:04,520 Speaker 1: examples of people who have pointed out photos in 399 00:24:04,560 --> 00:24:09,040 Speaker 1: this post-digital era, people who have experience with Photoshop, 400 00:24:09,080 --> 00:24:13,720 Speaker 1: who work with digital photography fairly extensively, who very quickly 401 00:24:13,760 --> 00:24:19,080 Speaker 1: can point out when less-than-professional 402 00:24:19,400 --> 00:24:23,400 Speaker 1: manipulation has taken place. In some 403 00:24:23,440 --> 00:24:26,480 Speaker 1: cases it's fairly obvious, even to the casual viewer, 404 00:24:26,920 --> 00:24:28,920 Speaker 1: and in some cases it takes a little bit more 405 00:24:29,040 --> 00:24:32,600 Speaker 1: of a trained eye to start noticing the indicators.
If 406 00:24:32,640 --> 00:24:37,199 Speaker 1: it's done really, really well, then theoretically you wouldn't be 407 00:24:37,200 --> 00:24:40,800 Speaker 1: able to tell. What you could say is, well, 408 00:24:40,840 --> 00:24:43,439 Speaker 1: it's possible that was manipulated. It's also possible that it 409 00:24:43,600 --> 00:24:47,080 Speaker 1: wasn't manipulated. Some of the well-done ones go viral 410 00:24:47,160 --> 00:24:50,280 Speaker 1: so quickly that they can make the rounds everywhere before 411 00:24:50,920 --> 00:24:54,240 Speaker 1: someone comes out, even if it's like three hours later; 412 00:24:54,280 --> 00:24:57,320 Speaker 1: it's already hit all the news, you know, and then 413 00:24:57,359 --> 00:24:59,480 Speaker 1: they're like, oh, hey, I just opened this up in 414 00:24:59,480 --> 00:25:02,640 Speaker 1: Photoshop, zoomed in, and I can tell you where 415 00:25:02,840 --> 00:25:06,240 Speaker 1: this picture ends and this picture begins, and this edit... 416 00:25:06,920 --> 00:25:11,480 Speaker 1: this is definitely manipulated. Yeah. I suppose if you were 417 00:25:11,520 --> 00:25:16,080 Speaker 1: to take the greatest pains possible and go pixel by 418 00:25:16,119 --> 00:25:20,000 Speaker 1: pixel, and you had lots of spare time on your hands, 419 00:25:20,600 --> 00:25:25,600 Speaker 1: you could perhaps create an image that was essentially undetectable, 420 00:25:25,640 --> 00:25:28,320 Speaker 1: but it's a huge amount of effort in order to 421 00:25:28,359 --> 00:25:32,080 Speaker 1: do that. You can probably get past a large percentage 422 00:25:32,080 --> 00:25:37,040 Speaker 1: of the viewing public with a decent manipulation job, 423 00:25:37,119 --> 00:25:39,080 Speaker 1: but there are always going to be those experts out there 424 00:25:39,119 --> 00:25:40,679 Speaker 1: who know what to look for. And 425 00:25:40,680 --> 00:25:43,840 Speaker 1: now, in two thousand fifteen, 426 00:25:43,840 --> 00:25:46,200 Speaker 1: there are going to be services that you can 427 00:25:46,320 --> 00:25:48,679 Speaker 1: use where you can upload your image and it's going 428 00:25:48,720 --> 00:25:50,760 Speaker 1: to tell you whether or not it thinks it's been 429 00:25:51,040 --> 00:25:54,600 Speaker 1: digitally altered, because they have an algorithm for finding those 430 00:25:54,640 --> 00:25:57,600 Speaker 1: things out. Yeah. Once you have the algorithms go 431 00:25:57,680 --> 00:26:00,919 Speaker 1: in there, they're looking at it pixel by pixel. 432 00:26:00,960 --> 00:26:03,520 Speaker 1: Like, they can look at two adjacent pixels, and if 433 00:26:03,560 --> 00:26:05,720 Speaker 1: there's enough of a difference between the two that can't 434 00:26:05,760 --> 00:26:10,399 Speaker 1: be explained by other elements within the image, then that's 435 00:26:10,880 --> 00:26:13,960 Speaker 1: a possible red flag. Yeah. Or people who forget to 436 00:26:14,160 --> 00:26:17,560 Speaker 1: alter their metadata or erase their metadata. There are a 437 00:26:17,640 --> 00:26:22,199 Speaker 1: lot of ways to find fakes, and more often than not, 438 00:26:22,359 --> 00:26:24,760 Speaker 1: they're found out pretty quickly.
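The adjacent-pixel check described here can be sketched as a naive heuristic in Python with NumPy and Pillow. Real forensic tools (error level analysis, noise analysis, the commercial services mentioned) are far more sophisticated; this toy version just flags unusually sharp jumps that might mark a paste seam, and a legitimate hard edge will trigger it too.

```python
import numpy as np
from PIL import Image

def suspicious_seams(path: str, threshold: int = 60):
    """Return (row, col) positions where horizontally adjacent
    pixels differ sharply: a crude possible-edit red flag."""
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.int16)
    diffs = np.abs(np.diff(gray, axis=1))  # each pixel vs. its right-hand neighbor
    return np.argwhere(diffs > threshold)
```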
There have also been some pretty 439 00:26:24,760 --> 00:26:28,240 Speaker 1: embarrassing situations where people have uploaded a photo that they 440 00:26:28,240 --> 00:26:33,760 Speaker 1: had cropped, and yet the original picture is 441 00:26:33,800 --> 00:26:39,920 Speaker 1: still accessible through that image, where someone's like, oh, I 442 00:26:40,000 --> 00:26:41,960 Speaker 1: just expanded it and I saw that your room is 443 00:26:42,000 --> 00:26:46,959 Speaker 1: a total mess. That would be the less scandalous version 444 00:26:47,000 --> 00:26:50,120 Speaker 1: of some of those photos that have made their rounds. Yeah, 445 00:26:50,200 --> 00:26:52,520 Speaker 1: and then some of them, like you said, are just bad. 446 00:26:52,880 --> 00:26:54,800 Speaker 1: I mean, not to skip around too much, but one 447 00:26:54,840 --> 00:26:57,280 Speaker 1: of my favorite examples recently has been... there have been, 448 00:26:57,320 --> 00:27:02,320 Speaker 1: like, a number of photographs of Chinese officials inspecting things 449 00:27:03,119 --> 00:27:06,280 Speaker 1: that are pretty poorly done. I haven't seen these. There's 450 00:27:06,280 --> 00:27:09,959 Speaker 1: just one in particular; it's a doctored 451 00:27:10,000 --> 00:27:15,119 Speaker 1: photo of three local officials inspecting a highway project, 452 00:27:15,240 --> 00:27:20,480 Speaker 1: where they're not on the road, they're just placed kind 453 00:27:20,480 --> 00:27:22,800 Speaker 1: of above the road. So not only 454 00:27:22,840 --> 00:27:25,960 Speaker 1: are they inspecting, they're hovering. Yeah, like, no shadow work. 455 00:27:26,040 --> 00:27:28,240 Speaker 1: And also it's kind of like a stock photo, where 456 00:27:28,280 --> 00:27:30,360 Speaker 1: they have these looks on their 457 00:27:30,400 --> 00:27:33,560 Speaker 1: faces that are, like, very surprised at how well the 458 00:27:33,640 --> 00:27:36,720 Speaker 1: progress is going on the road. They're like, oh gosh, wow, 459 00:27:36,760 --> 00:27:40,520 Speaker 1: look at this road. And it's pretty embarrassing when 460 00:27:40,640 --> 00:27:44,199 Speaker 1: something like that gets out. Yeah. And 461 00:27:44,200 --> 00:27:46,840 Speaker 1: I meant to bring this up when 462 00:27:46,840 --> 00:27:51,639 Speaker 1: we were talking about stock photography earlier: there have also been 463 00:27:51,680 --> 00:27:59,080 Speaker 1: several instances of various companies or news agencies or 464 00:27:59,560 --> 00:28:04,679 Speaker 1: political campaigns that have used stock images of people and 465 00:28:05,200 --> 00:28:07,800 Speaker 1: used them in ways to claim that this is representing 466 00:28:07,880 --> 00:28:11,520 Speaker 1: a specific individual, when it isn't the person who posed for 467 00:28:11,560 --> 00:28:15,720 Speaker 1: the stock photography, in order to portray a particular viewpoint 468 00:28:15,920 --> 00:28:18,800 Speaker 1: or to put forth a story or something along those lines. 469 00:28:18,800 --> 00:28:21,680 Speaker 1: We've seen that happen a few times, where people 470 00:28:21,680 --> 00:28:24,679 Speaker 1: have said, this image that you're using to create 471 00:28:24,760 --> 00:28:28,240 Speaker 1: this narrative that furthers your cause in one way or another... 472 00:28:28,280 --> 00:28:30,760 Speaker 1: and it doesn't matter what side of the political spectrum 473 00:28:30,800 --> 00:28:34,240 Speaker 1: you're on.
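On the metadata angle from a moment ago: editors often stamp a Software tag into a JPEG's EXIF data, and many JPEGs carry an embedded EXIF thumbnail generated before any crop, which is how an "original" can leak out of an edited upload. A sketch using the third-party piexif library (a dependency assumed here, not something from the episode; Pillow alone reads tags but does not hand you the thumbnail as easily). File names are placeholders.

```python
import piexif  # third-party: pip install piexif

exif = piexif.load("uploaded.jpg")

# Editing software frequently writes the EXIF Software tag.
software = exif["0th"].get(piexif.ImageIFD.Software)
if software:
    print("Software tag:", software.decode(errors="replace"))

# The embedded preview may still show the uncropped scene.
if exif.get("thumbnail"):
    with open("embedded_preview.jpg", "wb") as f:
        f.write(exif["thumbnail"])
```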
This has been done across the board by 474 00:28:34,320 --> 00:28:38,400 Speaker 1: various people at various times, sometimes not necessarily maliciously, but 475 00:28:38,440 --> 00:28:41,960 Speaker 1: certainly to mislead. They find the stock 476 00:28:42,000 --> 00:28:44,760 Speaker 1: image that seems to represent... like, diversity would be a 477 00:28:44,800 --> 00:28:46,640 Speaker 1: good one. I've seen this happen a couple of 478 00:28:46,640 --> 00:28:50,840 Speaker 1: times, and people point out, actually, that's a stock image. 479 00:28:50,880 --> 00:28:54,280 Speaker 1: That's not a picture of their employees or their campaign. 480 00:28:54,720 --> 00:28:58,080 Speaker 1: That's not a representative of that group. That's 481 00:28:58,120 --> 00:29:01,000 Speaker 1: actually just a stock photo, and here's where you can 482 00:29:01,120 --> 00:29:04,040 Speaker 1: find it. Yeah. Like, you took this crowd and you 483 00:29:04,040 --> 00:29:06,760 Speaker 1: put it with this picture of you, and that crowd 484 00:29:06,760 --> 00:29:09,960 Speaker 1: didn't actually come to your event, so do you 485 00:29:10,000 --> 00:29:11,840 Speaker 1: really have the support that you say that you have? 486 00:29:12,640 --> 00:29:15,640 Speaker 1: Or, not even a stock photo, but one I 487 00:29:15,680 --> 00:29:19,000 Speaker 1: think that you could get from an image database, and 488 00:29:19,040 --> 00:29:21,560 Speaker 1: put two photos together. One that was pretty famous is back 489 00:29:21,600 --> 00:29:24,520 Speaker 1: in two thousand four, when there was the image of 490 00:29:24,600 --> 00:29:27,960 Speaker 1: John Kerry sitting next to Jane Fonda at an anti- 491 00:29:28,080 --> 00:29:33,880 Speaker 1: Vietnam War protest. Yeah. And back 492 00:29:33,880 --> 00:29:36,080 Speaker 1: in two thousand four, I think, it took a little bit longer 493 00:29:36,680 --> 00:29:40,000 Speaker 1: for that news to be made public, that it was fake, 494 00:29:41,000 --> 00:29:44,440 Speaker 1: where now I think it would circle back around 495 00:29:44,440 --> 00:29:47,160 Speaker 1: pretty quickly. But that was an image that 496 00:29:47,200 --> 00:29:49,360 Speaker 1: I think a lot of people looked at and thought, wow, 497 00:29:49,360 --> 00:29:51,720 Speaker 1: that looks legitimate. That really looks like they're 498 00:29:51,720 --> 00:29:54,160 Speaker 1: sitting there together, that they had a closer relationship than 499 00:29:54,200 --> 00:29:55,760 Speaker 1: they actually had. A lot of people do that, for 500 00:29:55,800 --> 00:29:58,920 Speaker 1: either positive or negative: they take pictures of themselves and 501 00:29:58,920 --> 00:30:01,800 Speaker 1: put them with famous people, for either a positive or 502 00:30:01,840 --> 00:30:06,640 Speaker 1: a negative connotation. Yeah. This kind of also leads into 503 00:30:07,360 --> 00:30:11,600 Speaker 1: a more lighthearted version of people using Photoshop for this 504 00:30:11,680 --> 00:30:15,200 Speaker 1: kind of purpose. It's my favorite... I mean, it's my 505 00:30:15,280 --> 00:30:18,040 Speaker 1: favorite thing about Photoshop ever. It's my favorite way that 506 00:30:18,080 --> 00:30:21,240 Speaker 1: people use Photoshop.
I mean, I appreciate, again, the 507 00:30:21,320 --> 00:30:24,080 Speaker 1: technical skill and artistry that is required for you to 508 00:30:24,240 --> 00:30:26,920 Speaker 1: make good use of Photoshop and not make it look crappy, 509 00:30:27,080 --> 00:30:30,240 Speaker 1: because I do not have that skill. If I use Photoshop, 510 00:30:30,280 --> 00:30:32,800 Speaker 1: it's gonna look like I used finger paints to cover up 511 00:30:32,840 --> 00:30:38,240 Speaker 1: a smudge or something. But I love, with 512 00:30:38,320 --> 00:30:42,000 Speaker 1: a passion that is difficult to describe, the funny uses 513 00:30:42,040 --> 00:30:45,479 Speaker 1: of Photoshop people will do when someone presents a photo 514 00:30:45,600 --> 00:30:48,000 Speaker 1: and they have a request. This happens a lot 515 00:30:48,000 --> 00:30:50,840 Speaker 1: on Reddit, where they say, here's a picture I took, 516 00:30:51,240 --> 00:30:53,960 Speaker 1: this is what I wanted. Can someone photoshop this for 517 00:30:54,240 --> 00:31:00,560 Speaker 1: me so that this outcome happens? And the trend is 518 00:31:00,600 --> 00:31:05,680 Speaker 1: for people to do ridiculous photoshops that end 519 00:31:05,760 --> 00:31:08,960 Speaker 1: up fulfilling the letter of what the person asked for, 520 00:31:09,040 --> 00:31:11,480 Speaker 1: but not the spirit of what they asked for. I 521 00:31:11,520 --> 00:31:13,400 Speaker 1: love that. It's like, I don't 522 00:31:13,440 --> 00:31:16,240 Speaker 1: have a picture of me and my cat together. Can 523 00:31:16,280 --> 00:31:17,960 Speaker 1: you put me and my cat in the same picture? 524 00:31:18,120 --> 00:31:20,160 Speaker 1: And then, like, someone will make the cat huge and 525 00:31:20,400 --> 00:31:23,600 Speaker 1: the guy tiny. Yeah, yeah, exactly. Or they'll, like, 526 00:31:23,680 --> 00:31:27,080 Speaker 1: just, you know, end up having, like, half a 527 00:31:27,080 --> 00:31:29,240 Speaker 1: cat on the picture and the other half of the cat 528 00:31:29,280 --> 00:31:30,920 Speaker 1: is on the other side of the picture. And 529 00:31:30,920 --> 00:31:34,840 Speaker 1: sometimes the results are terrible, like, terrible 530 00:31:34,880 --> 00:31:38,360 Speaker 1: as in, oh... and sometimes they're just funny. My favorite 531 00:31:38,400 --> 00:31:41,240 Speaker 1: one that I've seen recently, and I mentioned this 532 00:31:41,320 --> 00:31:42,920 Speaker 1: to you a couple of days ago when we were 533 00:31:42,920 --> 00:31:45,920 Speaker 1: talking about this podcast, was one where it was a 534 00:31:46,000 --> 00:31:48,720 Speaker 1: guy standing next to the Eiffel Tower and he had 535 00:31:48,760 --> 00:31:52,200 Speaker 1: his hand up in the air, and what he had 536 00:31:52,240 --> 00:31:55,360 Speaker 1: wanted was the photographer to take a picture so that 537 00:31:55,720 --> 00:31:58,080 Speaker 1: the perspective made it look as if he had his 538 00:31:58,120 --> 00:32:01,280 Speaker 1: hand on the top of the Eiffel Tower. Classic Eiffel photo, 539 00:32:01,280 --> 00:32:04,400 Speaker 1: classic leaning tower photo. Yeah, exactly, get it.
The one 540 00:32:04,440 --> 00:32:08,240 Speaker 1: that, you know... everyone's gotta have one, right? I would 541 00:32:08,280 --> 00:32:10,520 Speaker 1: be the guy who would want the photo to be 542 00:32:10,560 --> 00:32:14,680 Speaker 1: taken incorrectly on purpose. Like, I'd want 543 00:32:14,680 --> 00:32:16,120 Speaker 1: to be the guy next to the Leaning Tower of Pisa 544 00:32:16,120 --> 00:32:18,600 Speaker 1: with my hands up and there's clearly, like, three 545 00:32:18,640 --> 00:32:23,400 Speaker 1: feet of space between... wrong direction, exactly. Like, yeah, I'm not 546 00:32:23,640 --> 00:32:25,040 Speaker 1: even on the right side of the tower, 547 00:32:25,080 --> 00:32:28,240 Speaker 1: or something like that. Because I think that's funny, because 548 00:32:28,360 --> 00:32:30,680 Speaker 1: so many people have done the other one. So this 549 00:32:30,720 --> 00:32:32,920 Speaker 1: one I saw, where the guy said, could you photoshop 550 00:32:32,960 --> 00:32:34,680 Speaker 1: this so that my hand is on the top 551 00:32:34,720 --> 00:32:37,760 Speaker 1: of the Eiffel Tower? Obviously people had a lot of 552 00:32:37,760 --> 00:32:39,560 Speaker 1: fun with it. I think my favorite was one where 553 00:32:39,560 --> 00:32:42,600 Speaker 1: a guy had... I shouldn't say a 554 00:32:42,600 --> 00:32:44,200 Speaker 1: guy, might have been a lady, I have no idea... 555 00:32:44,280 --> 00:32:50,120 Speaker 1: the manipulator made the guy's arm look 556 00:32:50,200 --> 00:32:53,520 Speaker 1: like Mr. Fantastic's, or Plastic Man's. And it was not 557 00:32:53,640 --> 00:32:56,960 Speaker 1: done subtly. It was very cartoonish, where 558 00:32:56,960 --> 00:32:59,760 Speaker 1: it looped around all over the photo until the palm 559 00:32:59,800 --> 00:33:02,160 Speaker 1: was on the top of the Eiffel Tower. Now, clearly 560 00:33:02,160 --> 00:33:04,440 Speaker 1: what the guy had meant was, can you reduce the 561 00:33:04,520 --> 00:33:07,160 Speaker 1: size of the Eiffel Tower in the perspective of this 562 00:33:07,200 --> 00:33:09,840 Speaker 1: photo and then edge it over to the side so 563 00:33:09,880 --> 00:33:12,480 Speaker 1: that my hand is on top of it. But that's 564 00:33:12,600 --> 00:33:16,120 Speaker 1: obviously not what the redditors provided. And that's just one 565 00:33:16,160 --> 00:33:20,200 Speaker 1: example, just one example of one request. There are 566 00:33:20,360 --> 00:33:25,160 Speaker 1: entire threads of these sorts of things, and they always 567 00:33:25,200 --> 00:33:27,880 Speaker 1: make me laugh, because people are so creative. Like, yeah, 568 00:33:27,960 --> 00:33:30,320 Speaker 1: it takes a certain level of skill to do Photoshop 569 00:33:30,400 --> 00:33:35,120 Speaker 1: badly on purpose. It's an art within itself, I wanna argue. 570 00:33:35,200 --> 00:33:38,960 Speaker 1: Yes, indeed. And it kind of also goes back 571 00:33:39,000 --> 00:33:41,840 Speaker 1: to something that came out of a very tragic event 572 00:33:42,360 --> 00:33:47,480 Speaker 1: but led to ridiculous examples of Photoshop: the famous example 573 00:33:47,960 --> 00:33:51,040 Speaker 1: of the guy on the top of the World Trade 574 00:33:51,080 --> 00:33:54,440 Speaker 1: Center on September eleventh, two thousand one, one of 575 00:33:54,480 --> 00:33:58,680 Speaker 1: the most famous manipulations I can think of in the modern age. Yeah, 576 00:33:58,880 --> 00:34:00,920 Speaker 1: so, you know, you've probably seen it.
It's the 577 00:34:00,920 --> 00:34:03,360 Speaker 1: guy who's waving, and then in the background you see 578 00:34:03,400 --> 00:34:07,440 Speaker 1: the plane headed for the World Trade Center. And 579 00:34:07,480 --> 00:34:10,120 Speaker 1: this was a manipulated photo. The plane 580 00:34:10,160 --> 00:34:13,640 Speaker 1: was not there in the original photo. And as a meme, 581 00:34:14,480 --> 00:34:17,640 Speaker 1: people started to use Photoshop to cut the guy out 582 00:34:17,680 --> 00:34:20,319 Speaker 1: of the picture and put him in various other 583 00:34:20,400 --> 00:34:28,520 Speaker 1: disaster scenarios, like the Hindenburg or JFK or a volcano exploding, 584 00:34:28,719 --> 00:34:31,319 Speaker 1: or sometimes it would be movies, like there would just be 585 00:34:31,320 --> 00:34:32,880 Speaker 1: a still from a movie and this guy would just 586 00:34:32,880 --> 00:34:34,960 Speaker 1: be popped up in the back. And I love that. 587 00:34:35,040 --> 00:34:37,360 Speaker 1: It's like, what are some other 588 00:34:37,480 --> 00:34:39,759 Speaker 1: big events throughout history that we can just put 589 00:34:39,800 --> 00:34:43,040 Speaker 1: this person in? And part of you feels bad for 590 00:34:43,080 --> 00:34:46,279 Speaker 1: the guy, because he was just 591 00:34:46,320 --> 00:34:48,360 Speaker 1: posing for a picture on top of the World Trade 592 00:34:48,400 --> 00:34:52,400 Speaker 1: Center, and then somebody, presumably (I don't think it 593 00:34:52,440 --> 00:34:55,080 Speaker 1: was him), manipulated the picture so that there was this 594 00:34:55,200 --> 00:34:59,200 Speaker 1: plane in the background. And then suddenly he becomes 595 00:34:59,239 --> 00:35:01,520 Speaker 1: the center of a meme, even though he's not necessarily the 596 00:35:01,520 --> 00:35:04,439 Speaker 1: person who did the manipulation in the first place. But 597 00:35:04,920 --> 00:35:06,239 Speaker 1: at any rate, maybe he has a good sense of 598 00:35:06,280 --> 00:35:09,000 Speaker 1: humor about it. I don't know. It's a very, very weird 599 00:35:09,040 --> 00:35:10,920 Speaker 1: thing to have happen to you. I don't know how 600 00:35:10,960 --> 00:35:13,160 Speaker 1: I would feel if my image started... this is not 601 00:35:13,239 --> 00:35:16,120 Speaker 1: an invitation, listeners... I don't know how I would feel 602 00:35:16,160 --> 00:35:18,600 Speaker 1: if my image started popping up everywhere. Go for it. 603 00:35:18,640 --> 00:35:21,560 Speaker 1: Go for it. Dylan and I have a bit more 604 00:35:21,600 --> 00:35:35,040 Speaker 1: to say about photo editing and manipulation after this quick break. Now, 605 00:35:35,160 --> 00:35:38,360 Speaker 1: Dylan, are you aware of our superfan, Aaron Cooper? 606 00:35:38,960 --> 00:35:42,799 Speaker 1: I am not. So, Aaron Cooper has done tons of 607 00:35:43,560 --> 00:35:48,320 Speaker 1: photo editing of various Stuff You Should Know, TechStuff, and Forward 608 00:35:48,560 --> 00:35:53,440 Speaker 1: Thinking folks. All the posters in Studio A, all of 609 00:35:53,480 --> 00:35:57,200 Speaker 1: those are his work. So, like, Monty Python and the 610 00:35:57,239 --> 00:36:00,520 Speaker 1: Holy Grail featuring the people from HowStuffWorks. That's, 611 00:36:00,719 --> 00:36:03,719 Speaker 1: you know, Aaron Cooper's work. I commend you on 612 00:36:03,880 --> 00:36:06,280 Speaker 1: that, it is very good.
He's done a couple 613 00:36:06,360 --> 00:36:09,200 Speaker 1: of Star Wars ones for me, because he did one 614 00:36:09,280 --> 00:36:12,120 Speaker 1: where I was Luke and Chris Pollette was Leia, 615 00:36:12,640 --> 00:36:16,160 Speaker 1: and then when Lauren came in as my co-host, 616 00:36:16,280 --> 00:36:18,959 Speaker 1: he replaced Chris with Lauren. But then he put Chris 617 00:36:18,960 --> 00:36:21,680 Speaker 1: in the background as a ghost Obi-Wan version, 618 00:36:22,840 --> 00:36:24,840 Speaker 1: and I thought it was brilliant. And then when 619 00:36:24,960 --> 00:36:26,880 Speaker 1: Lauren left the show, he wrote me and 620 00:36:26,880 --> 00:36:30,720 Speaker 1: said, I'm not doing another one. I said, I don't blame 621 00:36:30,719 --> 00:36:32,920 Speaker 1: you, man. You've done more than your share. 622 00:36:33,760 --> 00:36:36,520 Speaker 1: But he is an example of the people who like 623 00:36:36,600 --> 00:36:39,480 Speaker 1: to use Photoshop in order to make a joke and 624 00:36:39,560 --> 00:36:43,560 Speaker 1: make a statement and show his appreciation for something that 625 00:36:43,600 --> 00:36:47,680 Speaker 1: he really enjoys. And so big props to Aaron. I 626 00:36:47,719 --> 00:36:50,279 Speaker 1: think that his work is great, and I'm glad to 627 00:36:50,280 --> 00:36:52,919 Speaker 1: put a name with someone who's made me laugh quite 628 00:36:52,920 --> 00:36:56,240 Speaker 1: a few times in this studio. Yeah, it's great stuff. 629 00:36:56,320 --> 00:36:59,480 Speaker 1: And when this show is over, 630 00:36:59,560 --> 00:37:01,600 Speaker 1: I'll definitely show you the pictures he's done for TechStuff. 631 00:37:01,800 --> 00:37:05,239 Speaker 1: Are there any other famous examples of photo manipulation 632 00:37:05,239 --> 00:37:07,879 Speaker 1: you'd like to talk about? Well, speaking of Star Wars, 633 00:37:07,920 --> 00:37:13,719 Speaker 1: in two thousand and eight there was an Iranian missile test. Yeah, 634 00:37:13,800 --> 00:37:18,400 Speaker 1: and there was an image released of the missile 635 00:37:18,440 --> 00:37:21,399 Speaker 1: test that a bunch of news outlets picked up, and 636 00:37:21,840 --> 00:37:25,880 Speaker 1: it didn't take long for someone to realize that 637 00:37:26,160 --> 00:37:28,799 Speaker 1: two of the missiles in the test that were being 638 00:37:28,920 --> 00:37:31,320 Speaker 1: launched had been replicated, so it's the same missile twice. 639 00:37:32,440 --> 00:37:35,839 Speaker 1: And the internet had a field day with it. 640 00:37:35,920 --> 00:37:38,840 Speaker 1: It eventually ended up that someone put a bunch of 641 00:37:38,880 --> 00:37:41,680 Speaker 1: missiles going off in opposite directions; they 642 00:37:41,680 --> 00:37:44,399 Speaker 1: were going in both directions, kind of shooting into each other. 643 00:37:44,719 --> 00:37:47,920 Speaker 1: There's smoke everywhere, and then at the very bottom of the 644 00:37:47,960 --> 00:37:53,320 Speaker 1: frame there's Jar Jar Binks, just his head. And, 645 00:37:53,480 --> 00:37:59,040 Speaker 1: you know, fast forward four years: another Iranian news 646 00:37:59,080 --> 00:38:03,879 Speaker 1: outlet picks up that picture, the edited picture, for their 647 00:38:03,880 --> 00:38:10,040 Speaker 1: website on an article about Iranian missile tests.
So someone 648 00:38:10,120 --> 00:38:13,360 Speaker 1: just searched for a picture of the tests and found 649 00:38:13,520 --> 00:38:18,799 Speaker 1: the wrong one. Yeah, and went, like, good enough. Yeah, 650 00:38:18,840 --> 00:38:22,560 Speaker 1: that's unfortunate. We have obviously 651 00:38:22,719 --> 00:38:28,120 Speaker 1: seen images that people have presented as being genuine 652 00:38:28,520 --> 00:38:31,880 Speaker 1: without actually knowing that there had been manipulation. If 653 00:38:31,920 --> 00:38:35,520 Speaker 1: someone spreads these kinds of photos around, 654 00:38:36,120 --> 00:38:39,080 Speaker 1: it doesn't always mean that they are aware of the 655 00:38:39,280 --> 00:38:41,880 Speaker 1: manipulation. They may be victims of that manipulation, 656 00:38:41,880 --> 00:38:45,319 Speaker 1: and then they further perpetuate it by sharing it. It's very 657 00:38:45,360 --> 00:38:48,080 Speaker 1: easy to lose original credit on the Internet, because things 658 00:38:48,120 --> 00:38:51,880 Speaker 1: are passed around. Even things that you can verify 659 00:38:51,920 --> 00:38:56,640 Speaker 1: fairly easily can still spread. A famous example, which will finally, 660 00:38:56,680 --> 00:39:00,000 Speaker 1: I think, be put to bed this year (we're almost there, 661 00:39:00,000 --> 00:39:06,120 Speaker 1: October is so close), is the image of the digital readout 662 00:39:06,480 --> 00:39:09,680 Speaker 1: in Back to the Future Part II. I've seen that manipulated 663 00:39:09,760 --> 00:39:13,080 Speaker 1: to be at least five different dates. Yeah, at least, 664 00:39:14,120 --> 00:39:17,080 Speaker 1: sometimes more than once a year. Like, I would 665 00:39:17,080 --> 00:39:19,680 Speaker 1: say at least since two thousand and ten, I've seen 666 00:39:20,200 --> 00:39:24,040 Speaker 1: quite a few. And it's in October this year, folks. 667 00:39:24,400 --> 00:39:27,319 Speaker 1: So once we get past October, I think that will 668 00:39:27,320 --> 00:39:29,200 Speaker 1: be the end of it. Once we get past October 669 00:39:29,239 --> 00:39:33,960 Speaker 1: and everyone has self-lacing shoes and hoverboards. You know, 670 00:39:34,200 --> 00:39:39,160 Speaker 1: just remember, they don't work on water unless you got power. 671 00:39:39,640 --> 00:39:41,920 Speaker 1: Unless you got that Lexus hoverboard. Ah, that's right, the 672 00:39:42,000 --> 00:39:45,560 Speaker 1: Lexus one. It's different, yeah. That one includes 673 00:39:45,600 --> 00:39:47,960 Speaker 1: supercooled magnets. You want to be real careful with 674 00:39:48,000 --> 00:39:51,800 Speaker 1: that one. But I think one of the issues 675 00:39:51,840 --> 00:39:56,160 Speaker 1: that's pretty prevalent these days with photo manipulation is 676 00:39:56,360 --> 00:40:03,319 Speaker 1: magazine covers. Yes. And there's been 677 00:40:03,360 --> 00:40:05,920 Speaker 1: a couple of breakthroughs in the past few years on 678 00:40:06,239 --> 00:40:09,320 Speaker 1: kind of reining that in. Because I'm sure that everyone's 679 00:40:09,360 --> 00:40:12,239 Speaker 1: seen this, and it's easy to tell that these 680 00:40:12,239 --> 00:40:15,799 Speaker 1: photos have been manipulated, just because whose skin is that 681 00:40:15,960 --> 00:40:19,000 Speaker 1: perfect, and things like that? Whose waist is that small?
682 00:40:19,360 --> 00:40:23,600 Speaker 1: Especially if you see any, like, casual photo, 683 00:40:23,719 --> 00:40:29,320 Speaker 1: especially now that we have a flood 684 00:40:29,880 --> 00:40:33,120 Speaker 1: of photos hitting our social networks that are taken by 685 00:40:33,160 --> 00:40:36,319 Speaker 1: the subjects themselves, and you start to see what they 686 00:40:36,360 --> 00:40:39,279 Speaker 1: really look like. And I'm not saying that they 687 00:40:39,640 --> 00:40:42,080 Speaker 1: are unattractive at all. I'm talking, like, you know, everyone 688 00:40:42,120 --> 00:40:46,560 Speaker 1: from supermodels to actors to authors to whatever. You see 689 00:40:46,560 --> 00:40:48,600 Speaker 1: what they really look like, and then you compare that 690 00:40:48,680 --> 00:40:52,120 Speaker 1: to the magazine version of them, the magazine cover version, 691 00:40:52,239 --> 00:40:56,080 Speaker 1: and sometimes they look like two totally different people. Yeah. 692 00:40:56,120 --> 00:40:59,480 Speaker 1: And also, a lot of the originals get leaked, 693 00:40:59,520 --> 00:41:04,040 Speaker 1: and people make, you know, comparisons where they overlay the 694 00:41:04,040 --> 00:41:06,440 Speaker 1: original with the edit, and you can see all of 695 00:41:06,480 --> 00:41:09,960 Speaker 1: the work that they've done, the perceived flaws that have 696 00:41:10,040 --> 00:41:13,879 Speaker 1: been corrected. And in two thousand nine, there was a 697 00:41:14,000 --> 00:41:20,439 Speaker 1: really extreme case from Ralph Lauren. They edited one 698 00:41:20,440 --> 00:41:24,839 Speaker 1: of their models to have just possibly inhuman proportions, 699 00:41:24,880 --> 00:41:28,719 Speaker 1: just a tiny, tiny waist, just very small arms, 700 00:41:28,719 --> 00:41:32,359 Speaker 1: and there was a big backlash. And that kind 701 00:41:32,400 --> 00:41:37,080 Speaker 1: of... that wasn't exactly the change of tide, but in 702 00:41:37,280 --> 00:41:41,919 Speaker 1: the years following there have been such instances as, 703 00:41:43,120 --> 00:41:46,040 Speaker 1: in two thousand twelve, Israel became the first country to 704 00:41:46,320 --> 00:41:51,480 Speaker 1: require advertisements to say when they were digitally manipulating photos 705 00:41:51,560 --> 00:41:55,480 Speaker 1: to make people appear thinner. It also set, like, 706 00:41:55,480 --> 00:41:59,640 Speaker 1: a minimum body mass index that the models 707 00:41:59,760 --> 00:42:03,440 Speaker 1: could have, to ensure that no underweight models were used 708 00:42:03,480 --> 00:42:06,600 Speaker 1: in advertisements. So there was actually, like, not a 709 00:42:06,640 --> 00:42:10,640 Speaker 1: weight limit, but a body mass limit. Yeah. And then 710 00:42:10,719 --> 00:42:15,000 Speaker 1: there was a teenager that started a campaign that 711 00:42:15,080 --> 00:42:19,279 Speaker 1: wanted Seventeen magazine to stop retouching their models, in two 712 00:42:19,320 --> 00:42:24,560 Speaker 1: thousand twelve, and Seventeen magazine made, like, an eight-point 713 00:42:24,600 --> 00:42:28,279 Speaker 1: pact that said that they would never change body or 714 00:42:28,280 --> 00:42:31,959 Speaker 1: face shapes and that they will only use models who 715 00:42:32,440 --> 00:42:35,759 Speaker 1: give the perception of healthiness. Right.
So, in other words, 716 00:42:36,080 --> 00:42:41,400 Speaker 1: in order to not perpetuate an unrealistic ideal of physical 717 00:42:41,440 --> 00:42:48,160 Speaker 1: beauty, or to create this kind of unhealthy 718 00:42:48,160 --> 00:42:52,200 Speaker 1: obsession with images that may in fact be unattainable because 719 00:42:52,200 --> 00:42:54,319 Speaker 1: they've been manipulated to the point where it is not 720 00:42:54,440 --> 00:42:56,759 Speaker 1: representative of what a human being looks like. Yeah, and 721 00:42:57,040 --> 00:42:59,840 Speaker 1: just make people feel really bad about themselves. Yeah. 722 00:43:00,280 --> 00:43:03,520 Speaker 1: Of course, there have been several Dove advertisement campaigns that 723 00:43:03,560 --> 00:43:08,760 Speaker 1: have shown the process of taking a model: they 724 00:43:08,920 --> 00:43:14,000 Speaker 1: take the subject and show the whole process, going 725 00:43:14,080 --> 00:43:17,160 Speaker 1: from the way she looks before anything's been done, 726 00:43:17,520 --> 00:43:22,000 Speaker 1: to the wardrobe and makeup and hair process, which already 727 00:43:22,080 --> 00:43:25,080 Speaker 1: changes her appearance significantly from what she looked like before, 728 00:43:25,800 --> 00:43:29,719 Speaker 1: then the photography process, then the editing process after 729 00:43:29,719 --> 00:43:32,440 Speaker 1: the photography, where they make even further changes to her 730 00:43:32,480 --> 00:43:35,840 Speaker 1: appearance. And then they do the side by side and say, 731 00:43:35,880 --> 00:43:39,080 Speaker 1: you know, this is an example of how 732 00:43:39,680 --> 00:43:43,680 Speaker 1: the industry, the beauty industry, has created this unrealistic 733 00:43:44,239 --> 00:43:48,399 Speaker 1: and possibly, you know, damaging image of what beauty is. 734 00:43:48,680 --> 00:43:53,600 Speaker 1: So in some instances, hopefully, even if 735 00:43:53,640 --> 00:43:57,239 Speaker 1: it doesn't cease completely, there are at least disclaimers; there's 736 00:43:57,280 --> 00:44:00,440 Speaker 1: at least someone, a watchdog of some 737 00:44:00,480 --> 00:44:04,280 Speaker 1: sort, kind of looking out for things like this. 738 00:44:04,320 --> 00:44:08,400 Speaker 1: But, like I think I said earlier, there's also this 739 00:44:08,400 --> 00:44:11,040 Speaker 1: feeling, I think, now from people who are 740 00:44:11,120 --> 00:44:15,719 Speaker 1: inundated with so many photographs, and there's just 741 00:44:15,800 --> 00:44:20,319 Speaker 1: this different access to photography, that you see 742 00:44:20,360 --> 00:44:22,600 Speaker 1: so many ads every day, you go to so 743 00:44:22,640 --> 00:44:27,120 Speaker 1: many websites, that any photograph that you see is 744 00:44:27,120 --> 00:44:30,440 Speaker 1: going to have edits, and that people are 745 00:44:30,440 --> 00:44:33,640 Speaker 1: probably going to feel awkward if they don't see the 746 00:44:33,760 --> 00:44:37,560 Speaker 1: edits that they might expect on an image 747 00:44:37,600 --> 00:44:40,960 Speaker 1: right out of the camera: changing the contrast, fixing 748 00:44:40,960 --> 00:44:44,319 Speaker 1: the exposure, correcting the color. Like, we've gotten to a 749 00:44:44,360 --> 00:44:46,800 Speaker 1: point now where, with a lot of the cameras inside phones,
750 00:44:46,880 --> 00:44:50,680 Speaker 1: the apps have software that automatically does certain enhancements 751 00:44:51,440 --> 00:44:54,920 Speaker 1: to contrast or lighting, or makes sure that things 752 00:44:54,960 --> 00:44:57,440 Speaker 1: like red-eye aren't a problem, that kind of stuff. 753 00:44:57,560 --> 00:44:59,920 Speaker 1: Or, in the past few years, you know, putting the 754 00:45:00,040 --> 00:45:03,160 Speaker 1: hashtag no filter on your Instagram photo when you haven't 755 00:45:03,239 --> 00:45:05,960 Speaker 1: put on a filter that changes the quality of the photo. 756 00:45:06,760 --> 00:45:09,840 Speaker 1: That kind of... that's kind of interesting. Yeah, 757 00:45:10,120 --> 00:45:13,440 Speaker 1: I never really use filters. Occasionally, like, 758 00:45:13,719 --> 00:45:17,680 Speaker 1: Google Photos, well, once in a blue moon, and 759 00:45:17,719 --> 00:45:21,520 Speaker 1: I don't know what the algorithm chooses as the criteria 760 00:45:21,600 --> 00:45:25,000 Speaker 1: for this, but it will single out an image and do 761 00:45:25,080 --> 00:45:27,480 Speaker 1: an auto enhancement and show you what it looks like, 762 00:45:27,520 --> 00:45:30,000 Speaker 1: and it doesn't automatically share it. You actually have the 763 00:45:30,040 --> 00:45:33,440 Speaker 1: option to say, you know, that's fine, awesome, or forget it. 764 00:45:33,960 --> 00:45:36,040 Speaker 1: But I took a picture of my dog when I 765 00:45:36,080 --> 00:45:39,360 Speaker 1: was walking my dog in the woods of North Georgia. 766 00:45:39,400 --> 00:45:41,839 Speaker 1: We were right next to a river, and I just wanted 767 00:45:41,800 --> 00:45:44,600 Speaker 1: to take a picture of my dog because he had 768 00:45:44,680 --> 00:45:46,600 Speaker 1: never seen a river before, so this was a neat 769 00:45:46,600 --> 00:45:49,520 Speaker 1: experience for him. He was incredibly cute. He was flipping out. Yeah, 770 00:45:49,560 --> 00:45:51,680 Speaker 1: it was great. So I took a picture of him, 771 00:45:51,840 --> 00:45:55,239 Speaker 1: and I have an Android phone, so my photos automatically 772 00:45:55,280 --> 00:45:59,080 Speaker 1: back up to my Google Photos account, and I get 773 00:45:59,080 --> 00:46:00,920 Speaker 1: a message a little bit later, and it says 774 00:46:01,160 --> 00:46:04,279 Speaker 1: I've got an enhanced image of my dog. So I 775 00:46:04,320 --> 00:46:07,759 Speaker 1: pull it up, and it has added filters so it 776 00:46:07,840 --> 00:46:12,520 Speaker 1: looks almost like a painting, almost like a painting style, 777 00:46:13,160 --> 00:46:16,359 Speaker 1: to the point where I said, if this looked 778 00:46:16,400 --> 00:46:17,880 Speaker 1: a little bit more like a painting, it would be 779 00:46:17,920 --> 00:46:19,520 Speaker 1: exactly the sort of thing you would see in, like, 780 00:46:19,520 --> 00:46:24,799 Speaker 1: a doctor's waiting room. Yeah, a very Rockwellian kind of thing. 781 00:46:25,239 --> 00:46:28,359 Speaker 1: And so, you know, it's interesting: we've reached 782 00:46:28,400 --> 00:46:33,000 Speaker 1: this point now where the photo manipulation is happening 783 00:46:33,000 --> 00:46:35,520 Speaker 1: on its own, like, due to algorithms. It's not even 784 00:46:35,560 --> 00:46:38,759 Speaker 1: necessarily a human that's responsible for it. Yeah. And I 785 00:46:38,760 --> 00:46:42,040 Speaker 1: know you know something about Google Deep Dream.
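A quick aside for the curious reader before the hosts get into Deep Dream: here is a minimal sketch of what that kind of automatic enhancement could look like in code, using Python's Pillow imaging library. The specific adjustments and factors are illustrative assumptions, not the actual pipeline of any phone app or of Google Photos.

```python
# A minimal sketch of an "auto enhance" pass, assuming a simple pipeline of
# contrast stretching plus mild color and sharpness boosts. The factors are
# illustrative guesses, not any vendor's real settings.
from PIL import Image, ImageEnhance, ImageOps

def auto_enhance(path_in: str, path_out: str) -> None:
    img = Image.open(path_in).convert("RGB")
    img = ImageOps.autocontrast(img, cutoff=1)      # stretch a flat exposure
    img = ImageEnhance.Color(img).enhance(1.15)     # slightly richer color
    img = ImageEnhance.Sharpness(img).enhance(1.1)  # mild sharpening
    img.save(path_out)

# Hypothetical filenames, for illustration only.
auto_enhance("river_dog.jpg", "river_dog_enhanced.jpg")
```

Real apps layer on smarter steps, such as face-aware exposure or red-eye removal, but the shape is the same: a chain of small automatic adjustments applied without a human in the loop.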
I'm not 786 00:46:42,120 --> 00:46:45,600 Speaker 1: too familiar with exactly what it does, but is 787 00:46:45,600 --> 00:46:47,959 Speaker 1: there a quick way that you could break it down? Sure. 788 00:46:48,080 --> 00:46:52,319 Speaker 1: Google Deep Dream... Deep Dream 789 00:46:52,360 --> 00:46:54,640 Speaker 1: itself is kind of an extension of what the original purpose was. 790 00:46:54,680 --> 00:46:58,560 Speaker 1: The purpose was to look at an image: an 791 00:46:58,600 --> 00:47:04,040 Speaker 1: algorithm examines an image and looks for 792 00:47:04,120 --> 00:47:07,319 Speaker 1: areas of that image that it can enhance so that 793 00:47:07,800 --> 00:47:12,400 Speaker 1: you can see stuff in greater detail. So maybe 794 00:47:12,440 --> 00:47:16,360 Speaker 1: there's, like, a blurry photo of someone's face. It 795 00:47:16,440 --> 00:47:20,440 Speaker 1: might be able to bring those features into more 796 00:47:20,440 --> 00:47:23,200 Speaker 1: of a sharp appearance so that you can actually see who 797 00:47:23,200 --> 00:47:25,680 Speaker 1: that person is. And, you know, it could 798 00:47:25,719 --> 00:47:28,160 Speaker 1: be used for anything. It could be used for facial recognition. 799 00:47:28,200 --> 00:47:31,520 Speaker 1: It could be used just to make an image look nicer. 800 00:47:31,560 --> 00:47:35,399 Speaker 1: It could be used to look at 801 00:47:35,440 --> 00:47:38,719 Speaker 1: old photos and see if you can figure out... like, 802 00:47:39,040 --> 00:47:41,880 Speaker 1: imagine opening up a cold case. You've got this photograph 803 00:47:41,920 --> 00:47:43,520 Speaker 1: of a face, and you've never been able to identify 804 00:47:43,560 --> 00:47:45,719 Speaker 1: that person. Use this technology to try and see if 805 00:47:45,719 --> 00:47:47,640 Speaker 1: it can at least make a guess as to what 806 00:47:47,719 --> 00:47:50,520 Speaker 1: that person actually looks like. That's the sort of thing 807 00:47:51,120 --> 00:47:53,239 Speaker 1: it does. So in a way, it's like the old 808 00:47:53,280 --> 00:47:56,839 Speaker 1: science fiction zoom-and-enhance, you know, that good 809 00:47:56,840 --> 00:48:01,080 Speaker 1: old TV trope. It's like that, but again, it's making guesses. 810 00:48:01,239 --> 00:48:04,200 Speaker 1: It's trying to fill in gaps where data does not exist. 811 00:48:04,880 --> 00:48:08,080 Speaker 1: So it's looking for patterns. It's looking to try and 812 00:48:08,200 --> 00:48:12,120 Speaker 1: insert information into those patterns that would make the most sense, 813 00:48:12,160 --> 00:48:16,600 Speaker 1: which means that sometimes it makes mistakes. Well, the Deep 814 00:48:16,680 --> 00:48:20,880 Speaker 1: Dream part is cranking that up to, like, eleven. 815 00:48:20,880 --> 00:48:23,719 Speaker 1: It's like oversaturating a photo. It's like making all the 816 00:48:23,760 --> 00:48:26,000 Speaker 1: colors bleed, except in this case, what it's doing is 817 00:48:26,000 --> 00:48:28,600 Speaker 1: it's saying, hey, you know that pattern recognition you have, 818 00:48:28,719 --> 00:48:30,720 Speaker 1: where you're looking for stuff that looks like a face 819 00:48:30,840 --> 00:48:33,480 Speaker 1: or whatever, or looks like a plant or a 820 00:48:33,520 --> 00:48:36,239 Speaker 1: dog or a lizard, whatever it may be?
I'm going 821 00:48:36,320 --> 00:48:38,440 Speaker 1: to turn that way up, so that now you're going 822 00:48:38,480 --> 00:48:40,560 Speaker 1: to be looking for that even harder. So anything that 823 00:48:40,680 --> 00:48:43,560 Speaker 1: remotely looks like one of those things, you're going to 824 00:48:43,600 --> 00:48:46,560 Speaker 1: interpret that as a representation, and then you're going to 825 00:48:46,600 --> 00:48:48,480 Speaker 1: fill in the gaps. It's like, hey, what if that 826 00:48:48,520 --> 00:48:52,319 Speaker 1: recognition software got really into psychedelic rock? Yeah. So if 827 00:48:52,360 --> 00:48:55,480 Speaker 1: you look at these Google Deep Dream images... there's 828 00:48:55,480 --> 00:48:58,120 Speaker 1: one in particular that's really good 829 00:48:58,120 --> 00:49:02,040 Speaker 1: at finding dog faces in everything. So you might take 830 00:49:02,040 --> 00:49:04,680 Speaker 1: a picture of a table that has a really interesting 831 00:49:04,719 --> 00:49:07,120 Speaker 1: wood grain to it, and you throw it through 832 00:49:07,200 --> 00:49:10,560 Speaker 1: Google Deep Dream, and it just finds dog faces everywhere 833 00:49:10,600 --> 00:49:12,759 Speaker 1: and starts filling in the information, so that now 834 00:49:12,880 --> 00:49:17,640 Speaker 1: it's just an H.P. Lovecraftian horror show of 835 00:49:17,760 --> 00:49:21,880 Speaker 1: a table covered in dog faces. Or there 836 00:49:21,880 --> 00:49:24,760 Speaker 1: are a lot of pictures of, again, taking pictures 837 00:49:24,800 --> 00:49:26,600 Speaker 1: of dogs, where you look at the picture of the 838 00:49:26,680 --> 00:49:28,719 Speaker 1: dog and it's looking at patterns in the fur, and 839 00:49:28,800 --> 00:49:31,360 Speaker 1: suddenly you see, like, all these other faces of dogs 840 00:49:31,480 --> 00:49:36,040 Speaker 1: emerging from these patterns of fur, and it definitely starts 841 00:49:36,080 --> 00:49:41,200 Speaker 1: to feel like you are in a trippy nightmare scenario. 842 00:49:41,840 --> 00:49:44,600 Speaker 1: Like a nightmare, but you can close the browser window. Yeah. 843 00:49:44,719 --> 00:49:48,000 Speaker 1: So I'm sure that Salvador Dali would have said this 844 00:49:48,640 --> 00:49:52,920 Speaker 1: is amazing, and it should be included on all cameras, 845 00:49:52,960 --> 00:49:56,399 Speaker 1: and you should never be able to turn it off. Yeah, 846 00:49:57,239 --> 00:49:59,840 Speaker 1: because it is that kind of trippy sort of experience. 847 00:50:00,200 --> 00:50:02,920 Speaker 1: It's all based, again, on pattern recognition, only in 848 00:50:03,000 --> 00:50:06,600 Speaker 1: this case it's recognizing patterns that are not really representative 849 00:50:06,719 --> 00:50:09,239 Speaker 1: of the thing it interprets them as. It's very much 850 00:50:09,280 --> 00:50:10,759 Speaker 1: the same thing as looking up at the clouds and 851 00:50:10,840 --> 00:50:13,920 Speaker 1: seeing a face. We humans do this 852 00:50:14,040 --> 00:50:16,719 Speaker 1: all the time. We see patterns where there really isn't 853 00:50:16,840 --> 00:50:19,320 Speaker 1: a pattern. We recognize what appears to be 854 00:50:19,400 --> 00:50:23,040 Speaker 1: a pattern in something that's largely a chaotic system. Same 855 00:50:23,080 --> 00:50:25,800 Speaker 1: sort of thing, and it's actually pretty fascinating.
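For readers who want to see the mechanism behind the "turn that pattern detector way up" description, here is a minimal sketch of the Deep Dream idea in PyTorch: run gradient ascent on the input image itself so that it more strongly activates a chosen layer of a pretrained network, which amplifies whatever patterns that layer already responds to. The layer choice, step size, and iteration count below are illustrative assumptions, not Google's published settings, and the input file name is hypothetical.

```python
# A minimal Deep-Dream-style loop: nudge the image's pixels to maximize the
# activations of one layer of a pretrained GoogLeNet, amplifying whatever
# patterns (faces, fur, eyes) that layer responds to. Input normalization and
# the multi-scale "octave" trick are omitted for brevity.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.googlenet(weights="DEFAULT").eval()

# Capture activations from an intermediate layer with a forward hook.
activations = {}
model.inception4c.register_forward_hook(
    lambda module, inputs, output: activations.update(feat=output)
)

img = T.Compose([T.Resize(512), T.ToTensor()])(
    Image.open("table.jpg")  # hypothetical input file
).unsqueeze(0).requires_grad_(True)

for _ in range(20):                    # a few gradient-ascent steps
    model(img)
    loss = activations["feat"].norm()  # "turn the pattern detector way up"
    loss.backward()
    with torch.no_grad():
        img += 0.01 * img.grad / img.grad.abs().mean()  # normalized ascent step
        img.grad.zero_()
        img.clamp_(0, 1)               # keep pixels in a valid range

T.ToPILImage()(img.detach().squeeze(0)).save("table_dream.jpg")
```

More iterations, or a deeper layer, exaggerate the hallucinated features; that is the "dog faces everywhere" effect the hosts describe above.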
It's a 856 00:50:25,880 --> 00:50:29,680 Speaker 1: kind of artificial intelligence, in a way. And that wraps 857 00:50:29,800 --> 00:50:32,560 Speaker 1: up this classic episode of TechStuff. Hope you enjoyed it. 858 00:50:33,040 --> 00:50:35,600 Speaker 1: Thanks again to Dylan for joining me, you know, seven 859 00:50:35,680 --> 00:50:40,400 Speaker 1: years ago, to talk about photo manipulation and editing. Obviously, 860 00:50:40,480 --> 00:50:44,400 Speaker 1: the tools today are even more sophisticated, far more sophisticated, 861 00:50:44,440 --> 00:50:47,680 Speaker 1: than they were seven years ago, and it is 862 00:50:48,360 --> 00:50:51,320 Speaker 1: phenomenal, and frightening in many ways, what 863 00:50:51,480 --> 00:50:55,200 Speaker 1: can be done with photo and video editing and manipulation. 864 00:50:56,120 --> 00:50:58,799 Speaker 1: So maybe at some point I will do a follow 865 00:50:58,920 --> 00:51:01,640 Speaker 1: up episode on this and kind of talk about sort 866 00:51:01,680 --> 00:51:06,560 Speaker 1: of the huge leaps in capabilities that we've seen since, 867 00:51:07,920 --> 00:51:10,959 Speaker 1: because it's pretty remarkable. But until then, if you would 868 00:51:11,000 --> 00:51:13,040 Speaker 1: like to suggest a topic for me to cover on 869 00:51:13,160 --> 00:51:14,680 Speaker 1: TechStuff, there are a couple of different ways you 870 00:51:14,719 --> 00:51:18,040 Speaker 1: can reach out. One is you can download the 871 00:51:18,160 --> 00:51:20,920 Speaker 1: iHeartRadio app, navigate over to TechStuff, and use that 872 00:51:21,000 --> 00:51:23,880 Speaker 1: little microphone icon. That'll let you leave a message, up 873 00:51:23,920 --> 00:51:26,160 Speaker 1: to thirty seconds in length, telling me what you would 874 00:51:26,200 --> 00:51:28,960 Speaker 1: like me to cover. Or you can reach out to 875 00:51:29,040 --> 00:51:31,480 Speaker 1: me on Twitter. The handle that we use for the 876 00:51:31,520 --> 00:51:34,520 Speaker 1: show is TechStuffHSW, and I'll talk to 877 00:51:34,560 --> 00:51:43,680 Speaker 1: you again really soon. TechStuff is an iHeart 878 00:51:43,800 --> 00:51:47,480 Speaker 1: Radio production. For more podcasts from iHeartRadio, visit 879 00:51:47,560 --> 00:51:50,560 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 880 00:51:50,719 --> 00:51:52,040 Speaker 1: listen to your favorite shows.