Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. And today we are continuing our long technology streak. There's a lot to talk about in the world of technology, there really, really is, and this one is a hot topic right now. It is a hot button issue, because we are talking about artificial intelligence and women. Um, anything we mention in this is not currently sponsored. We're just talking about some of the bigger players in this field. You can see our episode that just published recently with Katrina, who works with AI and has been in that community for a while, about her experience. The date is February seventh. And I'm saying that because, like, every day I get a new piece of information about this.

Speaker 1: Well, you know, it's interesting. Um, not only do we get more information, but the way the programs are going, they are updating so quickly that it has grown immensely. So every time we talk about it, they've gone from, like, version three to version twelve. So it's kind of, uh, we're behind, obviously, talking about it. And then, as it goes on, when we talk about things like ChatGPT, they're being acquired and there's this competition, a huge competition, happening within the technology world. So yeah, things are changing very quickly.

Speaker 1: Yes, yes. And Samantha and I are not at all experts. This is a complicated topic. Um, so you're getting, you know, kind of a very basic layout and understanding of what's going on right now. But I just, I would, I would hate for people who really know about this to be, like, pulling out your hair. It's, it's fine. Um, I know that I am giving the most basic definitions and everything, but as always, please write in, let us know, uh, if you have any more information or resources about this topic. Okay. Um, so we did want to give a very brief history of artificial intelligence, or AI. So by the nineteen fifties, scientists, engineers, philosophers, mathematicians all were familiar with the concept of artificial intelligence.
Speaker 1: In his nineteen fifty paper Computing Machinery and Intelligence, Alan Turing speculated that machines could use available information to make decisions and problem solve, and proposed building intelligent machines along with ways to test their intelligence. And if you'll remember, like, this was, you know, a lot of science fiction was happening around this time too. Like, it really captured the public imagination. Technology was changing very quickly then, not nearly as quickly as it is now, but a lot of things that were very, very new were coming out, TV, more people were getting these, so it was, it was a real time for people to think about what was our future with technology and with machines.

Speaker 1: However, when this paper comes out, the technology just wasn't there for what Alan Turing was proposing, and everything that was available was prohibitively expensive. That didn't stop people from trying, though, and a couple of years later, a few men came together to create a program called the Logic Theorist, which many labeled the first AI program. Um, basically, it was designed to copy the problem solving skills of a human, and the term artificial intelligence was coined in nineteen fifty six, after this program was presented. So over the next two decades, the realm of AI really grew as technology improved. Computers were now able to store commands where they previously couldn't. Problem solving improved, along with interpreting spoken languages. Government agencies like DARPA started funding AI research, and by nineteen seventy people were predicting we'd have a workable AI within the decade. However, the tech still wasn't there, mostly issues around data storage and slow processing speeds. In the eighties, a couple of big things happened. One was the introduction of deep learning, or machine learning through experience. Another was expert systems, which allowed for computers to copy experts in terms of decision making.
Speaker 1: It learned by asking the expert questions and storing the answers, so that non-experts could ask and get the same answer, right. The funding did dry up in the nineties, because basically everyone, and this is kind of like a running joke for things like Back to the Future, right, where we're like, why don't we have flying cars yet? Everyone thought this was going to happen much more quickly than it was happening. But even though the funding did dry up in the nineties, AI flourished in the nineties and two thousands. Um, World Chess Champion and Grandmaster Garry Kasparov was beaten by Deep Blue, which was an IBM chess playing program. Uh, speech recognition was implemented on Windows that same year. The real game changer in all of this has been the increase in computer storage and processing speed. So this more or less brings us to where we are today. Industries ranging from banking to marketing to entertainment utilize AI, all kinds of things that, when I was reading it, I was like, oh, I never thought of that, but that's true. It's, like, it's used in a lot more things than I bet you a lot of us would think at first. It really encompasses a lot more than I would say most people imagine, but that could just be me. I don't want to project.

Speaker 1: There was a huge spike in popularity of apps, um, programs and platforms like Lensa and Midjourney and ChatGPT, which stands for Chat Generative Pre-trained Transformer. And I mean, if you haven't, I'd be surprised if you haven't heard of them, but if you haven't experimented with them, they're basically, like, text to image prompts and text to text prompts, um. So basically, you type in something, we'll go over some examples in a minute, and then the AI generates an image or generates some text based on what you put in, um. And these platforms have access to billions of images and texts. It's kind of a key part of this whole conversation. I did want to throw in here, I saw M3GAN.
Speaker 1: I'm sure a lot of people working in AI were like, no, don't mention M3GAN. Um, let me ask, let me ask, why the three? Why is M3GAN, the three? It stands for, I can't remember what it is, but there's a three in it and it's an acronym. Okay, okay, so I think it's spelled M, three, G, A, N, but, cutely, Megan, I think. Okay, okay, no spoilers, but it had an interesting commentary about AI, and then about how the issue is more the people and biases programmed in and learned with AI. I read an essay about it that has stuck with me to this day. It was about teaching an AI program to, like, I don't know, sign the prettiest signature you've ever seen, and nothing nefarious, but just the way that the AI interpreted the code or the prompts that people were putting in, it ended up destroying the world. But it wasn't designed to do that.

Speaker 1: Yes, yes, yes, yes. But it's also, M3GAN also has a lot of stuff around, like, the relationships we've formed with machines, but I would say it was more about replacing human connection with machines, was the commentary that was happening. I do think this is the fear, the discomfort, a lot of, a lot of us have after seeing so many movies. Again, going back to that kind of sci-fi thing, a lot of times the first thing that comes up is, like, what about what happened in the Avengers? You know, that kind of, right. I think of The Matrix, and I just hope that we have a Neo. We gotta get to training, alright.

Speaker 1: So we do have a few numbers about this rapidly growing market for you. First, it is expected to be worth about two hundred and sixty seven billion dollars by twenty twenty seven, and it will contribute an estimated fifteen point seven trillion dollars to the global economy by twenty thirty. A large share of businesses and organizations use some type of AI. Um, actually, we just had a fellow podcaster use ChatGPT to do some scripts, so that was interesting. Yeah, yeah, that number. So here's an interesting number.
Speaker 1: It's estimated that two years from now, AI will destroy eighty five million jobs but create ninety seven million jobs. So y'all better get to learning this AI thing, I guess. Yeah, yeah, I know, I know. Well, and so I've seen a lot of conflicting numbers about this, but I do think we even kind of used a loaded term, destroy. I feel like a lot of us are not talking about the creation part so much as these jobs will go away, which is scary. Like, that's a scary thing, don't get me wrong, especially if that's your job, that has been your job. Um, but it will create jobs as well, all right.

Speaker 1: So when it comes to gender, the large majority of AI specialists are men. Men in the field receive a higher average salary, with women making almost a hundred and ten thousand dollars. When it comes to race, most specialists are white, followed by Latino and Black at about nine percent each of those populations, and Asian at about three percent of the workforce. The rest is categorized as other or indigenous. Um, Black and Latino specialists received the lowest pay of these categories. There aren't too many numbers about queer folks in this space, but one chart we found suggested that, in a breakdown of job titles, queer people generally hovered around five percent, but in some areas reached around ten percent. I think, again, some of these numbers are hard to track down, based on what constitutes AI and also how are they reporting or not reporting those types of numbers. But this brings us to lawsuits. So one of the biggest controversies about AI has been around art and copyright. So if you don't know, recent AI programs and platforms have written children's books, they've created so much art, all kinds of things, much to the outrage of a lot of folks in the creative space.
Speaker 1: So if you haven't used these platforms, you can basically go to a site, type in pretty much anything you want, from very clear to super obscure, and a handful of images will get spit back out at you, or, in the case of ChatGPT, an essay or something like that. Uh, the AI tool in question sort of guesses what you want based on the prompts and then uses all of the images and data available on the Internet to paint a picture, so to speak. Um, people have won awards with AI generated art, and I read some really interesting articles about it, which, we're going to touch on some of them in a minute.

Speaker 1: Beginning in January of twenty twenty three, lawsuits against AI started rolling out, these, these newer platforms that got really popular. Anyway, Getty Images sued Stability AI, alleging that they had quote, chosen to ignore viable licensing options and long standing legal protections in pursuit of their standalone commercial interests. And if you remember, this was the big thing people were warning about with things like Lensa, that any picture you uploaded belonged to them. Um, three visual artists out of San Francisco, on behalf of the visual arts community, also sued Stability AI, uh, naming Midjourney and DeviantArt as well. Essentially, the lawsuit alleges that these AI tools and sites comb the web for images and use them without crediting the artist, especially because most of them charge a fee, there's a tier where a fee can be involved. Um, some images generated by Lensa included distorted artist signatures, indicating that it used specific art from these artists but did not credit them. Um, many of the AI companies in question, um, argue that this falls under fair use, which is a very, very murky idea, and we've had to have a lot of meetings about it in our very own field, so it is a mess. Um, most of the entities suing are looking to get credit and compensation, as opposed to shutting these AI platforms down.
Speaker 1: So is DALL-E not a part of the lawsuit? I think they might be, because, like I said, I wrote, I wrote this section several weeks ago, and so much keeps changing. DALL-E is one of the originals, even before Midjourney. Like, Midjourney has been around, but DALL-E was the original that people were playing with, probably last year, like the beginning of last year. So I'm kind of surprised that they wouldn't be a part of that. They do have a free portion, I believe, but then you hit a wall and then you have to pay, so maybe that's why. But that's interesting.

Speaker 1: Yeah, I mean, there very well could be one. I think this is the big example that's happening right now, but there are multiple lawsuits against these, these companies, and it is key. As we discussed with Katrina, most experts don't believe AI generated images will erase art, like human made art, because we inherently place a human value on it, as in, I want to support this person, I'll pay for their work. I mean, most people do, not everybody, but, you know, you kind of are like, oh, I want to support this person, or this feels different than something I could just make online. Um, but that being said, uh, there are indeed questions about the rights to styles and the like. And on top of that, some artists argue that some, not all, but some AI generated images are art, and they require creativity and experimentation. So it's like a big, there's so many conversations happening around this right now, because it feels new. And as Katrina also mentioned in that interview we did with her, it's kind of, sometimes these shifts just happen, like when Photoshop came on the scene and we were all kind of trying to figure out what that means, and it's similar. Um, but, and also, as we discussed in that episode with Katrina, who does work in this space, there is a question of accessibility and how AI can help in that way.
Speaker 1: These kinds of AI generated programs, these kinds of AI generated text to art or text to text things, could be a really beneficial tool for people with accessibility issues, right. And yeah, like you said, it does take a skill for prompting, which is how you kind of pull things out of it. Um, and it's also worrisome what you prompt, what people can prompt. Yeah.

Speaker 1: And, and speaking of, another concern, uh, is around things like Lensa, that it owns images you upload, and that was a big to-do. People did not read the terms and conditions. After being called out on Twitter, they updated their terms of use in December. They still own your images to train AI, but they now allow users to delete their data, which I'm kind of concerned about, because, you know, we talked about that with 23andMe and DNA testing, how one company may say so, but what does the other company get rights to? And yes, that means they own your image, your public image. There's a whole conversation about, like, that whole conversation makes me worried about, like, deepfakes, which is a whole big issue right now in the world of technology as well. But if you get these images through these applications, who owns them, who can use them, and what can they be used for? This has kick-started conversations again around privacy and how it seems hopeless to protect ourselves unless we exit the grid completely, because there's so many, like, every single social media platform that allows for selfies with filters has a type of AI filter. But that's, like, the conversation in general when you upload things into social media platforms in itself. That's a good point to make, as we do with all of these technology episodes, like, these AI platforms are not alone in doing this. Like, this is an issue across a lot of platforms.
Speaker 1: And I remember one of the big things that happened after Lensa blew up in December was everyone was like, take them down, all of your selfies, like, take all of them down. And then, I know, like, Google got in big trouble with that several years ago, where people would suddenly see an ad and their picture is in the ad, and they didn't, right, do that, um, but, I believe at the time Google's argument was, well, you uploaded it to Drive. So that's what... here's one that carries over from TV days to digital. There's an adage that's been knocking around the Internet for years in various forms: if you're not paying for the product, then you are the product. Actually, this quote can be traced back to a nineteen seventy three video by the artists Richard Serra and Carlota Fay Schoolman, Television Delivers People. Quote, commercial television delivers twenty million people a minute, read in scrolling text. It is the consumer who is consumed. You are the product of TV. You are delivered to the advertiser, who is the customer. He consumes you. You are the end product. Sound familiar? Kind of like the recent episode we did with Bridget.

Speaker 1: So in twenty twenty two, Clearview AI, which has come under fire for far-right ties, offered its facial recognition technology to Ukraine for free. The country has used it to identify hundreds of deceased Russian soldiers in order to alert their families. But some have argued that this is a way of chipping away at morale. And we've discussed before how facial recognition AI is not infallible, especially when it comes to people of color, and especially Black people. Um, the EU has introduced legislation that anything created with AI has to be properly labeled. However, I know a recent app came out that removes watermarks, which is one of the solutions they were also proposing for labeling this. Um, it's really, really complicated, especially for journalists, um.
Speaker 1: But ideas that have been suggested are metadata, titling reveals, um, interstitials, bylines, annotations, side by sides, platform and non-platform warnings, and none of this is uniform across platforms. So basically, a lot of this would require the person to be acting in good faith and labeling things properly, which I don't believe is a thing we could trust people to do. Right, they are constantly stealing videos, constantly claiming videos or ideas, especially in today's age of trying to be an influencer. Isn't it quite, quite interesting. Yes, yes.

Speaker 1: Um, and, and I, I don't know if you saw this, there's also a huge kerfuffle in the sports world. The guy who owns, like, I don't, I'm going to butcher it, but basically, this guy owns this basketball team, he's getting sued, and the facial recognition technology recognized one of the lawyers who's suing him and wouldn't let her and her daughter into Madison Square Garden. Right. So that's kind of, like, the level we're talking about, too, of how this is. Yeah. So, you know, the, the event, it wasn't a, it wasn't a basketball game. A company is suing Madison Square Garden, the event space. Uh, through that, they, they found a person who wasn't even on the case, she just actually works in the same law firm, and therefore denied her access and her daughter access to an event. Um, and apparently she wasn't the only one. There have been several who have been denied just because of their association, not even being on the case, which does make you think, like, oh, what does this mean for later? Just because you don't like me, or like something that someone else did that I'm connected to, you're going to deny me service? Yes, yes, yes, yes.

Speaker 1: Um, there's also been a lot of concern, no lawsuits as of recording, I don't think, around ChatGPT, about students using it to write essays, about it eliminating jobs. Uh, there are, there are a lot of conflicting reports about how effective it is.
Speaker 1: As you said, things are improving very quickly, though, and that probably depends on the topic and level of expertise involved. Then somebody created a tool to detect ChatGPT, but I feel like that's just going to be a game of, like, constantly one-upping, like, right, right. It definitely is, yeah. But it did recently pass an exam at an Ivy League business school, and it's passed several exams at colleges. And this has started a whole conversation and arguments around, well, what does it mean for plagiarism in schools, and how worried should we be about it? Anyway, so that's the whole thing, too, right. I actually, I just read a title of an article, I did not read the article, y'all, but it said that there had been three examples of ChatGPT, um, writing out resumes that were perfect, and they were, these people were hired based on those resumes to executive level positions, like, high level positions, through this, which, that means we need to work on it. Kidding. Alright, let's talk about the women portion, which is what we are coming to.

Speaker 1: Uh, so, why is AI women? Uh, there's been a lot of debate and conversation about why so many of our AI voices are women. Siri, Alexa, Cortana, the default setting on most map apps, uh, though this is not the case in all countries. Even in fiction: Her, M3GAN, Ex Machina, Mother from Alien, though you don't hear her voice, the Red Queen from Resident Evil. One of the theories behind why this gendering exists is based on the roles we typically assign women, yep, scheduling appointments, setting up reminders, answering questions, looking up data, and communicating. It's, it's literally, literally thought of as an assistant, something typically feminine coded in our society, perceived as more welcoming, more soothing, less threatening. Others have cautioned about the message this sends, that with enough programming we can create the perfect subservient, pleasing woman. Mm hmm, which is that whole movie.
Speaker 1: Yes, yep, yep, Her. Um, most sex robots are in the image of a woman, though I think there's a couple points we could expand upon within that. But some suggest that this indicates there might be some sexualization of this AI voice that's going on. Some also argue, which I thought was interesting, that this is the reason so many of our AI voices are feminine, because people were so traumatized by the male voice of HAL from 2001. I wouldn't even know what that is. Oh, I, I remember. Uh, I'm sure, maybe, it's probably a lot of things. Yeah, I've just learned, um, that the train announcement from, uh, in England, that says mind the gap, has switched from a man to a woman. Um, and there was this whole sweet story of this woman who used to go to, it was her husband who did this voice, and after he died, she would go to the Tube just to hear his voice. And when they were switching over to the woman, the transit department actually gave her the recording to have, but, like, she was so sad and it just felt wrong when it went away. That was a side, side piece of love.

Speaker 1: But that is interesting, because I didn't know this, but the UK kept coming up as the example of, like, the default voice on map apps is a man, um, a lot of their default voices are masculine, and they were kind of talking about why that might be. But, interesting. This brings us to the quote, perfect woman, the sort of idea of, you know, mostly male users trying to create what they see as the perfect woman through these prompts, and then, what does the AI return? Usually a white woman, very traditionally beautiful, brown haired.
Speaker 1: But I did want to include some of the very specific requests that male creators, male users, have input for women, starting with this one: desirable gorgeous woman in her fifties enjoying a pint at the pub, four stools down, no one in between us, posh, wearing thin white gauze blouse tailored for a very close fit, unbuttoned, black demi cup push up bra, thin white gauze, I keep saying gauze, I think that's right, whatever, miniskirt tailored for a very close fit, gartered fishnet hose, high heels, long straight hair, redhead with large bust, narrow waist, wide hips, trim fit body, curvy, full lips, very little makeup, nice muscular legs, legs crossed, smiling, back to the counter, leaning back on the counter, propped up on elbows, revealing skin, side profile view, ultra wide angle lens, hyperrealistic 3D render. Okay, um, right.

Speaker 1: So, you know, like, we had a whole experiment, um, with some of the things, and just put in a woman prompt, and most of them came out redheaded, busty, small, very gorgeous, unrealistic looking women. And it was kind of interesting that that was the first basis, to the point that there was conversation with the creators, like, why is this a thing? And that's what we talked about with Katrina. That's what's on the internet, that is what is out there, so therefore that is what it's, like, bringing back in.
Speaker 1: But yeah, stuff like that prompting is more common than not, and it's getting very specific, to the point I think there's so much conversation that we need to have about who is controlling, just like on any social media platform, as we've talked about before, like, the guidelines, and making sure they stick to the guidelines, without question, because there's so many things that could happen, including the fact that I've seen obvious anime versions of young girls scantily dressed, and you know who this is for, and it's so concerning that this continues, like, this shouldn't be a thing, like, this is not helping in any way for anyone.

Speaker 1: Yeah, yeah. And that's kind of the, it gets really complicated really quickly, because that's part of the issue, is also, as you said, like, what are we upholding as a society? Is it, like, you know, the beautiful, perfect woman? What is a woman? Like, when you just type that in, what does the AI return to you? Because that says a lot. But then also, by doing this, by putting in these prompts and by being like, oh, this is what I meant, it's like reinforcing the learning on the AI side as well. But it's also, like, like we said with Katrina, you know, I think part of the issue, too, is it's a lot of men doing this so they can kind of objectify women, which, you know, uh, but, like, if you're a woman and you want to have a, you know, sexy picture of you, or if you want to look at sexy pictures in a safe situation, I don't know, that's fine. So it's like, I don't want to stigmatize women's bodies or sexualize women's bodies even more than they already have been. But at the same time, they are being sexualized through these AI platforms. So it's just, it's a lot. It's a lot. That prompt I read, apparently this guy, it's the same guy, and they just followed him trying to find the perfect woman in his mind over several hours' worth of prompts.
Speaker 1: So you can read, if you wanted to, you can read the whole journey he went on. Um, but I don't think he, no, no, no. OpenAI recently changed its algorithm to be more racially diverse. We'll see how that goes, because part of this is, yeah, they're all white women, from what I understand. Samantha told me a story about how it's even sexualizing dogs. There's, it's really interesting, because you can upload photos and it will try to translate that, and sometimes it works, sometimes it doesn't. It is, it hasn't gotten there, y'all. It hasn't, it hasn't gotten to that point. Um, even, I think, people who have used the AI filters have realized, this is not me, this is kind of me, but not me, that kind of version.

Speaker 1: And there was an experiment when, uh, they uploaded my dog, they being my partner, and it just did some prompts. None of it had to do with women, none of it had to do with that, but a couple of pictures actually gave her breasts, gave her a shapely look, and she, she was humanized, which was a nightmare, y'all. I never want to see those images. Why did it have to happen? But it was interesting that that was a prompt, that, I think at one point it was, um, something about, like, sophisticated dog, essentially, because my dog is very sophisticated, if you were wondering, of course, so the AI must be as well. But it, kind of, like, one was really funny. It looked like, her scruff around her neck looked like the royal old school British wear, so she kind of looked like British royalty at one point.
Speaker 1: The male version, I guess, and then others were just literally feminine human features on her, because that word, sophisticated, or any of these things, maybe, like, just automatically made her have breasts, which was like, why? Right. Yeah, yeah, I mean, there's a lot of stuff there we could unpack, but it just seems like if there's an opportunity to sexualize something, that's what happened. Not to be dramatic, but how dare you?

Speaker 1: Um, I, I had to find out more information about this. We'll probably come back to it in the future. But the first AI model, which is, is a thing, was a Black woman named Shudu, and here's a quote from the outlet: Within the first two years of her career, she was featured in Vogue, Hypebeast, V Magazine, and WWD, fronted campaigns for Balmain and Ellesse, graced the red carpet at the BAFTA 2019 Awards wearing a bespoke gown by Swarovski, released her own record, and was named one of the most influential people on the Internet by Time. However, her creator is a white man who is making money off of this, and in the words of Twitter user Vanessa, that's what you're enabling Lensa to do when you give them your likeness. The more images you give it, the smarter and more lifelike it becomes. In addition to deepfakes and putting your face somewhere it's never been, or saying something you never said, AI is also, like, blackface. Yeah. Because I thought, I thought I was going to find more about this, because I remember that kind of being a big conversation at one point, of, like, these creators you're following on social media aren't real, and who's running them? But I couldn't actually find that much about it. So again, listeners, if you know, uh, please point us in the right direction. But yeah, I mean, we're going to talk about deepfakes more in a second.
Speaker 1: But it is very scary, the idea that somebody could be using your likeness to say something you wouldn't want to say, and to make money, without your permission. And then there's the TikTok, as I've kind of already mentioned. Uh, there's a popular AI filter on TikTok right now, AI Manga, um, and people are tricking it into generating images of women with big boobs by holding up items like cups or hats over their chests, which, but why? Yeah, so basically they're, like, putting, yeah, hats over their chests, and then it tricks the filter into making, like, big breasts, I guess. Right. I haven't gone to that side of TikTok. I've seen the ghost TikToks with the AI, like, if the image pops up and that's, that's that, and then the, like, the dog, are they human, or the animals, are they human? Because, like, sometimes when you use the manga thing on animals, they appear as human, and so they're skinwalkers, is what they say. My dog is not, if you're wondering. Peaches is not a skinwalker. I see.

Speaker 1: But then, yeah, let's talk about revenge porn, which, you know, um, as AI has exploded, moderators and guides have scrambled to keep up with it, especially in terms of banning harmful content, and this is complicated, to say the least. Some things are outright banned, others are debated, and of course this involves individual biases and can vary around the world. Uh, some platforms have rules against generating images of celebrities, uh, for one instance. You know, which is interesting, because I know recently something came up, this is the kind of thing that I know because my partner's really into it. The word for black in Spanish has actually been banned, and a lot of Spanish and Latino people are like, you can't do that to us, and at the same time the company is like, we're so sorry, we're a company based out of the US and racism is a big thing, we really can't do this, I apologize that you can't use just Spanish words, but can you do this? And that is complicated.
Speaker 1: You do have the moment of, like, oh, how does that go? Yeah, yeah, yeah, especially, as we said, things around the world do vary, and it is complicated, for sure. Um, and this is one of the things I was really worried about with AI, was revenge porn. It was one of the first things I was like, oh no, I'm sure. Um, there hasn't been too much written about it yet, but one article over on Wired wrote about how Lensa generated nudes. The author wrote about how Lensa generated nudes from her childhood photos, despite having the rules no nudes and no kids, adults only. Um, women have reported uploading modest images of themselves only to get nudes or cartoonishly sexualized images in return. Um, and the article warns this is the danger of how much we sexualize women. Um, here's a quote: I, for example, received several fully nude results despite uploading only headshots. The sexualization was also often racialized. Nearly a dozen women of color told me that Lensa whitened their skin and anglicized their features, and Lensa widely seeks to quote, beautify images of women, meaning whitened and sexualized. Many reported feeling violated after seeing these images. Right, um.

Speaker 1: And then the author continued: I'm used to feeling violated by the Internet, having been the target of several harassment campaigns. I've seen my image manipulated, distorted, and distributed without my consent on multiple occasions. Because I am not face out as a sex worker, the novelty of hunting down and circulating my likeness is, for some, a sport. Because sex workers are not perceived by the general public as human or deserving of basic rights, this behavior is celebrated rather than condemned, and because sex work is so often presumed to be a moral failing rather than a job, our dehumanization is redundant. I've logged onto Twitter to see my face photoshopped onto other women's bodies, pictures of myself and unclothed clients in session, and once even a word search comprised of my face, personal details, and research interests.
I'm not afraid of Lensa. Yeah. Basically 597 00:35:46,960 --> 00:35:50,520 Speaker 1: she was like, I've seen this before, Lensa, yeah, and 598 00:35:50,520 --> 00:35:52,000 Speaker 1: I'm going to see how bad you are, because she 599 00:35:52,080 --> 00:35:55,560 Speaker 1: was doing tests to see like what would happen, and 600 00:35:55,600 --> 00:35:57,880 Speaker 1: the images were horrifying, a lot of them with a 601 00:35:58,000 --> 00:36:03,000 Speaker 1: child's face on clearly an adult nude body. Right. And 602 00:36:03,120 --> 00:36:05,279 Speaker 1: again, like I said, some of the images that 603 00:36:05,320 --> 00:36:08,040 Speaker 1: I've seen, that's just what's projected, and you're like, why? We 604 00:36:08,080 --> 00:36:10,640 Speaker 1: know what your purpose is. Why are you doing this? And 605 00:36:10,640 --> 00:36:13,600 Speaker 1: it's obvious what it is, to sexualize young women. And 606 00:36:13,680 --> 00:36:17,280 Speaker 1: without any moderation or oversight, the potential for AI generated 607 00:36:17,320 --> 00:36:22,280 Speaker 1: violence inherent in quote magic avatars is staggering. Lensa doesn't 608 00:36:22,320 --> 00:36:26,600 Speaker 1: seem to enforce its policies prohibiting nudity and minors, and 609 00:36:26,640 --> 00:36:30,480 Speaker 1: it doesn't have any policies at all stipulating that users 610 00:36:30,520 --> 00:36:34,640 Speaker 1: can only upload images of themselves. Its only relevant specifications 611 00:36:34,680 --> 00:36:38,520 Speaker 1: are quote same person on all photos and uh no 612 00:36:38,719 --> 00:36:42,879 Speaker 1: other people on the photo. And like most other tech innovations, 613 00:36:43,040 --> 00:36:48,200 Speaker 1: Lensa's misuse will most severely harm those already at risk, children, 614 00:36:48,239 --> 00:36:51,080 Speaker 1: women of color, and sex workers. Yes, so that was 615 00:36:51,120 --> 00:36:54,640 Speaker 1: another quote from the article. Um. The author also points 616 00:36:54,640 --> 00:36:57,799 Speaker 1: out how this could impact the sex work industry, not 617 00:36:57,920 --> 00:37:02,000 Speaker 1: just the art industry, and sex workers incurring blame for 618 00:37:02,200 --> 00:37:06,120 Speaker 1: quote teaching the AI by posting adult content when it 619 00:37:06,160 --> 00:37:09,200 Speaker 1: was never intended to be used that way. A lot 620 00:37:09,239 --> 00:37:11,200 Speaker 1: of platforms like Midjourney do have a PG 621 00:37:11,360 --> 00:37:13,880 Speaker 1: thirteen rating, but it's it's it's similar to what we 622 00:37:13,880 --> 00:37:16,759 Speaker 1: talked about with YouTube, where people are still finding ways 623 00:37:16,800 --> 00:37:20,640 Speaker 1: around this stuff. Right, well, TikTok is supposed to be like, 624 00:37:20,760 --> 00:37:24,320 Speaker 1: I think twelve and older or something, or thirteen and older, 625 00:37:24,440 --> 00:37:26,160 Speaker 1: and it ends up being younger kids and you're just 626 00:37:26,200 --> 00:37:29,560 Speaker 1: trying to figure out why. Right, right, right. Well, speaking 627 00:37:29,600 --> 00:37:36,960 Speaker 1: of, it's time to talk about Loab, the creepy 628 00:37:37,040 --> 00:37:40,280 Speaker 1: AI woman who is haunting the Internet and our dreams. 629 00:37:40,719 --> 00:37:42,759 Speaker 1: She was the original impetus for this episode because I 630 00:37:42,800 --> 00:37:46,520 Speaker 1: hadn't heard of this.
Yes, you brought it to my attention, 631 00:37:46,520 --> 00:37:50,400 Speaker 1: and I actually was thinking of something totally different. We 632 00:37:50,400 --> 00:37:54,960 Speaker 1: were on two different pages. Like no, go look though, go look. Yes, yes, 633 00:37:55,160 --> 00:37:59,320 Speaker 1: you're welcome for the nightmare. Yeah, thank you, thank you. First 634 00:37:59,360 --> 00:38:03,960 Speaker 1: introduced to the Internet on September six, this text-to-image 635 00:38:04,040 --> 00:38:07,360 Speaker 1: AI generated woman has been haunting the internet. Um. She 636 00:38:07,400 --> 00:38:09,800 Speaker 1: has been called everything from the first cryptid of latent 637 00:38:10,080 --> 00:38:14,400 Speaker 1: space to a demon to a queer icon. I was 638 00:38:14,440 --> 00:38:16,719 Speaker 1: gonna try to describe her, but honestly, she appears in 639 00:38:16,800 --> 00:38:19,600 Speaker 1: so much stuff. I would say generally she's like a 640 00:38:19,640 --> 00:38:23,680 Speaker 1: creepy older woman. She's generally creepy. We'll talk about it 641 00:38:23,680 --> 00:38:26,040 Speaker 1: more in a second. Not always creepy. An older woman with 642 00:38:26,080 --> 00:38:30,360 Speaker 1: brown hair. She's white, and she's got dark eyes. Uh. 643 00:38:30,520 --> 00:38:37,360 Speaker 1: Distorted face, distorted face usually. Um, Loab was created in 644 00:38:37,440 --> 00:38:41,960 Speaker 1: April by Twitter user Supercomposite, who detailed the whole 645 00:38:41,960 --> 00:38:44,960 Speaker 1: process in a Twitter thread. The super short version is 646 00:38:45,040 --> 00:38:48,279 Speaker 1: that this user put in a negative prompt, Brando, 647 00:38:49,320 --> 00:38:52,640 Speaker 1: on an AI platform, meaning the AI generates the opposite 648 00:38:53,120 --> 00:38:55,279 Speaker 1: of the prompt, and it returned an image of a 649 00:38:55,360 --> 00:38:59,480 Speaker 1: skyline with the letters DIGITA PNTICS 650 00:38:59,760 --> 00:39:04,280 Speaker 1: on it, um. The user put that in as a negative prompt, 651 00:39:04,400 --> 00:39:09,160 Speaker 1: so that phrase plus skyline logo, and it returned Loab. 652 00:39:09,880 --> 00:39:13,160 Speaker 1: We call her that because that text 653 00:39:13,560 --> 00:39:16,560 Speaker 1: seems to appear in one of the images. And again, 654 00:39:16,600 --> 00:39:19,239 Speaker 1: we're not experts at this. I hope we haven't like 655 00:39:19,280 --> 00:39:21,520 Speaker 1: totally butchered it. But it's a really interesting process because 656 00:39:21,760 --> 00:39:25,040 Speaker 1: you're essentially asking for the opposite of something, and opposites 657 00:39:25,080 --> 00:39:28,359 Speaker 1: can be interpreted in all kinds of ways. But it's 658 00:39:28,360 --> 00:39:32,120 Speaker 1: also interesting because you might guess it would return Brando, 659 00:39:32,360 --> 00:39:35,920 Speaker 1: the original negative prompt, but it didn't. And, of note, 660 00:39:35,960 --> 00:39:39,680 Speaker 1: negative prompts don't always return the same images, so it's 661 00:39:39,719 --> 00:39:45,480 Speaker 1: puzzling that Loab consistently returns this woman.
One explanation for 662 00:39:45,520 --> 00:39:49,759 Speaker 1: this is the fact that she's the opposite of logos, or 663 00:39:49,800 --> 00:39:52,160 Speaker 1: at the very least, because it's kind of like the 664 00:39:52,160 --> 00:39:54,840 Speaker 1: way they explained it, it's like imagine a brain, like 665 00:39:54,880 --> 00:39:59,600 Speaker 1: a map, a neural map, and it's not necessarily like 666 00:40:00,440 --> 00:40:04,040 Speaker 1: the opposite is how you and I might think of opposite; 667 00:40:04,080 --> 00:40:07,000 Speaker 1: the opposite is the thing that's the farthest away 668 00:40:07,120 --> 00:40:11,879 Speaker 1: from whatever you put in. So the opposite for an 669 00:40:11,920 --> 00:40:18,800 Speaker 1: AI of logos might be something that is scary, gory imagery. 670 00:40:19,080 --> 00:40:22,040 Speaker 1: And these images, to be clear, don't previously exist. They 671 00:40:22,080 --> 00:40:26,520 Speaker 1: are being created in response to the prompt, though now 672 00:40:26,560 --> 00:40:28,440 Speaker 1: it could be in something of a feedback loop because 673 00:40:28,520 --> 00:40:32,400 Speaker 1: people are feeding images of Loab into AI platforms, and 674 00:40:32,440 --> 00:40:37,000 Speaker 1: Supercomposite has confirmed Loab does exist on multiple platforms. They 675 00:40:37,040 --> 00:40:39,480 Speaker 1: haven't said which one was the original one they used 676 00:40:39,480 --> 00:40:42,560 Speaker 1: because they didn't want people to, I don't know, mess up, 677 00:40:43,560 --> 00:40:48,160 Speaker 1: like all flood to it and mess it up. It 678 00:40:48,160 --> 00:40:51,480 Speaker 1: feels like the newer Slender Man to me. But anyway, 679 00:40:52,080 --> 00:40:55,800 Speaker 1: so Supercomposite didn't stop there, feeding the AI that image 680 00:40:55,800 --> 00:40:59,279 Speaker 1: and instructing it to create something new using Loab as 681 00:40:59,320 --> 00:41:03,839 Speaker 1: the base image. The results were unsettling, to say the least, 682 00:41:03,840 --> 00:41:07,680 Speaker 1: including images of decapitation and gore, adults and 683 00:41:07,800 --> 00:41:12,160 Speaker 1: children alike. Supercomposite said Loab quote haunts everything she touches, 684 00:41:12,360 --> 00:41:15,560 Speaker 1: end quote, I guess because she's very far away from 685 00:41:15,600 --> 00:41:17,799 Speaker 1: a lot of concepts and so it's hard to get 686 00:41:17,840 --> 00:41:20,960 Speaker 1: out of her little spooky area in latent space. The 687 00:41:21,000 --> 00:41:23,719 Speaker 1: cultural question of why the data put this woman way 688 00:41:23,760 --> 00:41:26,439 Speaker 1: out there at the edge of the latent space near 689 00:41:26,520 --> 00:41:30,080 Speaker 1: gory horror imagery is another thing to think about. Um, 690 00:41:30,200 --> 00:41:33,560 Speaker 1: it's possible there's more to the story Supercomposite isn't telling, 691 00:41:33,680 --> 00:41:38,720 Speaker 1: but the story certainly grabbed the public imagination. Me too, yes, 692 00:41:38,840 --> 00:41:42,640 Speaker 1: it did, it did. Um. I mean, when I 693 00:41:42,640 --> 00:41:44,000 Speaker 1: was researching this, I was looking at some of the 694 00:41:44,000 --> 00:41:48,720 Speaker 1: pictures like, yeah, what is this? You're welcome, thank you, 695 00:41:49,200 --> 00:41:52,080 Speaker 1: thank you.
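For readers curious how a negative prompt works mechanically, here is a minimal sketch using the Hugging Face diffusers library's Stable Diffusion pipeline. It is purely illustrative: Supercomposite never disclosed which platform they used, their technique relied on negatively weighted prompts rather than the plain negative_prompt argument shown here, and the model name and prompt strings below are assumptions chosen for the example.

import torch
from diffusers import StableDiffusionPipeline

# Load a publicly available Stable Diffusion checkpoint (an illustrative choice,
# not the platform from the episode).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# The negative prompt steers generation *away* from a concept during
# classifier-free guidance, which is the general idea behind "asking for
# the opposite" of something. "Brando" mirrors the prompt described in the
# episode; the positive prompt is a placeholder.
image = pipe(
    prompt="a minimalist city skyline logo",
    negative_prompt="Brando",
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]

image.save("negative_prompt_demo.png")

Running this repeatedly with different random seeds returns different images, which is part of why it is notable that Supercomposite's negative prompt kept converging on the same face.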
Some have questioned why our culture, as Supercomposite 696 00:41:52,080 --> 00:41:55,120 Speaker 1: was saying, and by extension AI, associate older women 697 00:41:55,120 --> 00:41:58,120 Speaker 1: with horror, and you know, that highlights some ageism, um, 698 00:41:58,239 --> 00:42:02,080 Speaker 1: ableism, um, sexism. Supercomposite spoke about people's reactions 699 00:42:02,120 --> 00:42:04,640 Speaker 1: to the first images of Loab, because the first images 700 00:42:04,680 --> 00:42:08,120 Speaker 1: weren't that creepy, um, or at all creepy. They kind 701 00:42:08,120 --> 00:42:11,239 Speaker 1: of went on this process to find out like what's 702 00:42:11,280 --> 00:42:13,640 Speaker 1: going on in this latent space, but it took a 703 00:42:13,680 --> 00:42:16,480 Speaker 1: minute before they became like the really creepy, gory ones. Right. 704 00:42:17,120 --> 00:42:20,320 Speaker 1: So Supercomposite said, clearly the AI made an association 705 00:42:20,360 --> 00:42:22,399 Speaker 1: it shouldn't have. I also think some people are being 706 00:42:22,480 --> 00:42:24,480 Speaker 1: very stupid and making fun of how Loab looks in 707 00:42:24,520 --> 00:42:27,160 Speaker 1: the first pictures, like that's the horror show. That's not 708 00:42:27,200 --> 00:42:28,840 Speaker 1: the point at all, and it bums me out. She 709 00:42:28,880 --> 00:42:31,759 Speaker 1: looks like an average person to me, just really sad. Yeah, 710 00:42:31,760 --> 00:42:33,880 Speaker 1: so you can go, you can follow the Twitter thread 711 00:42:33,920 --> 00:42:37,960 Speaker 1: and see the first ones and what it eventually became. 712 00:42:39,080 --> 00:42:43,239 Speaker 1: But that is interesting. Gosh, there's so much, so much 713 00:42:43,280 --> 00:42:46,960 Speaker 1: to talk about here. Yeah, there are efforts to make 714 00:42:47,040 --> 00:42:49,000 Speaker 1: AI more inclusive, though there's a long way to go 715 00:42:49,080 --> 00:42:54,800 Speaker 1: when it comes to pretty much every part of this world. Um, 716 00:42:54,920 --> 00:42:57,360 Speaker 1: like who's working on it, who's using it, because 717 00:42:57,400 --> 00:43:01,080 Speaker 1: there's not as many women using it, and these kinds 718 00:43:01,080 --> 00:43:05,600 Speaker 1: of things, with what images get returned and generated. I 719 00:43:05,640 --> 00:43:08,000 Speaker 1: did find an organization called Black Women in AI that 720 00:43:08,040 --> 00:43:11,280 Speaker 1: has a podcast. I would love if anybody else knows 721 00:43:11,360 --> 00:43:15,120 Speaker 1: of more things like that, because we talked about that 722 00:43:15,160 --> 00:43:18,359 Speaker 1: with, um, some of the newer technologies where people are trying 723 00:43:18,400 --> 00:43:20,799 Speaker 1: to, like at the beginning, like VR, we talked about 724 00:43:20,800 --> 00:43:25,120 Speaker 1: it, take steps so it is more inclusive. But 725 00:43:25,960 --> 00:43:27,960 Speaker 1: we'll see, like like we've been saying this whole time, 726 00:43:28,000 --> 00:43:34,080 Speaker 1: this is, it's growing so rapidly. M hmmm mm hmm. 727 00:43:34,239 --> 00:43:42,960 Speaker 1: Well, we'll see, maybe one day ChatGPT will replace us, 728 00:43:46,040 --> 00:43:49,000 Speaker 1: write for us, we'll voice it. I mean he went 729 00:43:49,080 --> 00:43:56,799 Speaker 1: to parents. I like to write my own scripts, ChatGPT. Wow.
730 00:43:57,840 --> 00:44:02,480 Speaker 1: This has been quite a roller coaster, but listeners, if 731 00:44:02,560 --> 00:44:06,080 Speaker 1: you have any experience with this, any thoughts, any images 732 00:44:06,160 --> 00:44:09,919 Speaker 1: you'd like to share, uh, any resources you'd like to share, 733 00:44:09,960 --> 00:44:11,960 Speaker 1: we would love to hear from you. You can email us 734 00:44:11,960 --> 00:44:13,879 Speaker 1: at Stephanie and Mom Stuff at iHeartMedia dot com. 735 00:44:13,920 --> 00:44:15,719 Speaker 1: You can find us on Twitter at Mom Stuff Podcast, or 736 00:44:15,760 --> 00:44:18,399 Speaker 1: on Instagram and TikTok at Stuff Mom Never Told You. Thanks 737 00:44:18,400 --> 00:44:21,760 Speaker 1: as always to our super producer, Christina. Thank you, Christina. Yes, 738 00:44:21,920 --> 00:44:24,200 Speaker 1: and thanks to you for listening. Stuff Mom Never Told You is a 739 00:44:24,239 --> 00:44:26,239 Speaker 1: production of iHeartRadio. For more podcasts from iHeartRadio, 740 00:44:26,239 --> 00:44:27,960 Speaker 1: you can check out the iHeartRadio app, Apple Podcasts, 741 00:44:28,000 --> 00:44:29,520 Speaker 1: or wherever you listen to your favorite shows.