1 00:00:05,200 --> 00:00:07,360 Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff 2 00:00:07,360 --> 00:00:19,200 Speaker 1: I Never Told You, a production of iHeartRadio. And today we 3 00:00:19,280 --> 00:00:21,760 Speaker 1: have our promised follow-up on AI, which is kind 4 00:00:21,760 --> 00:00:25,200 Speaker 1: of just our personal experiences that we've had lately, because 5 00:00:25,560 --> 00:00:28,640 Speaker 1: it has really blown up since we talked about it, 6 00:00:29,760 --> 00:00:32,639 Speaker 1: which, speaking of, you can see our past AI episodes 7 00:00:32,680 --> 00:00:35,960 Speaker 1: that we did, including our recent Monday mini where we 8 00:00:36,000 --> 00:00:40,839 Speaker 1: compared two different prompts on ChatGPT. Today 9 00:00:41,120 --> 00:00:44,960 Speaker 1: is April twenty first, twenty twenty three. I just 10 00:00:45,080 --> 00:00:48,360 Speaker 1: put that in there. I also want to say there's 11 00:00:48,400 --> 00:00:52,159 Speaker 1: a lot going on in the world news-wise. I 12 00:00:52,200 --> 00:00:54,160 Speaker 1: know there always is, but today is like a huge 13 00:00:54,200 --> 00:00:59,279 Speaker 1: day when it comes to abortion. Perhaps we'll see. There's a lot going 14 00:00:59,320 --> 00:01:02,920 Speaker 1: on with the, like, LGBTQ stuff specifically. So we are 15 00:01:02,960 --> 00:01:06,880 Speaker 1: planning on talking about that, but right now we're kind 16 00:01:06,880 --> 00:01:09,480 Speaker 1: of waiting for something to happen and then we can 17 00:01:09,920 --> 00:01:10,560 Speaker 1: discuss it. 18 00:01:11,520 --> 00:01:13,600 Speaker 2: Waiting for so many things to happen. 19 00:01:13,640 --> 00:01:14,679 Speaker 3: We are, and I hate it. 20 00:01:14,720 --> 00:01:20,320 Speaker 1: I've been so stressed all day. Yeah, but I just 21 00:01:20,360 --> 00:01:24,200 Speaker 1: wanted to talk about some things that I've experienced with 22 00:01:24,240 --> 00:01:28,520 Speaker 1: AI lately. I know, I know in the news recently 23 00:01:28,640 --> 00:01:32,840 Speaker 1: there's been a lot of discussion around music and 24 00:01:32,880 --> 00:01:35,560 Speaker 1: what's gonna happen with copyright around music, which was already 25 00:01:35,560 --> 00:01:37,480 Speaker 1: a mess. But I know there was that AI song 26 00:01:38,200 --> 00:01:41,760 Speaker 1: with The Weeknd and Drake. It's Drake featuring The Weeknd, 27 00:01:41,800 --> 00:01:44,200 Speaker 1: I think. Yeah. And 28 00:01:43,959 --> 00:01:46,880 Speaker 2: I talked about it on our episode with The Daily Zeitgeist, 29 00:01:46,880 --> 00:01:49,360 Speaker 2: because Miles, who's the host of that, brought it up, 30 00:01:49,720 --> 00:01:51,520 Speaker 2: and then I had all these questions that he could 31 00:01:51,600 --> 00:01:54,800 Speaker 2: not answer, because I'm like, wait, what? And he's like, stop, 32 00:01:54,840 --> 00:01:56,840 Speaker 2: don't do this. He didn't say that to me, but 33 00:01:56,880 --> 00:01:58,880 Speaker 2: you could tell, like, I'm asking way too many questions 34 00:01:58,920 --> 00:02:01,400 Speaker 2: about it, because I'm like, wait, is this kind of 35 00:02:01,440 --> 00:02:05,600 Speaker 2: going to be like how actors are being used without 36 00:02:05,600 --> 00:02:07,520 Speaker 2: their permission, because they didn't know that they had to 37 00:02:07,560 --> 00:02:09,959 Speaker 2: put this down in their contracts.
Like we had this 38 00:02:10,000 --> 00:02:15,000 Speaker 2: whole discussion, because, like, you know, when your technology exceeds it, 39 00:02:15,480 --> 00:02:20,079 Speaker 2: what happens? Music is a little better about infringement and 40 00:02:20,120 --> 00:02:24,160 Speaker 2: copyrighting specific voices, tones and stuff. But I don't know. 41 00:02:25,280 --> 00:02:28,240 Speaker 2: I also saw that BuzzFeed News laid off a ton 42 00:02:28,320 --> 00:02:30,200 Speaker 2: of people, and one of the reasons is that they're 43 00:02:30,200 --> 00:02:31,560 Speaker 2: going to use AI, apparently. 44 00:02:31,800 --> 00:02:35,880 Speaker 3: Yeah, I saw that too. 45 00:02:36,160 --> 00:02:38,639 Speaker 1: Well, I was thinking about that in terms of, first 46 00:02:38,680 --> 00:02:40,360 Speaker 1: of all, that song is really good. If you haven't 47 00:02:40,360 --> 00:02:40,800 Speaker 1: heard it, I was 48 00:02:40,840 --> 00:02:41,320 Speaker 3: kind of mad. 49 00:02:41,360 --> 00:02:42,040 Speaker 2: It was interesting. 50 00:02:42,400 --> 00:02:46,359 Speaker 1: Yeah, it's interesting, but it got pulled because people were 51 00:02:46,360 --> 00:02:48,679 Speaker 1: freaking out about it. And there has been a 52 00:02:48,720 --> 00:02:51,160 Speaker 1: lot of concern, and, you know, rightfully so in a 53 00:02:51,160 --> 00:02:53,040 Speaker 1: lot of ways. But I was thinking about it, because 54 00:02:53,040 --> 00:02:56,680 Speaker 1: when we did that comparison of the podcast intros about 55 00:02:56,720 --> 00:03:00,200 Speaker 1: feminism that ChatGPT wrote, and we were saying, this 56 00:03:00,240 --> 00:03:02,520 Speaker 1: sounds kind of like a high schooler writing an 57 00:03:02,520 --> 00:03:03,400 Speaker 3: essay or something. 58 00:03:04,720 --> 00:03:08,240 Speaker 1: One thing we didn't talk about, but this whole 59 00:03:08,280 --> 00:03:10,640 Speaker 1: thing made me think about, is you and I have 60 00:03:10,720 --> 00:03:14,040 Speaker 1: a huge library of ourselves talking for the Internet to 61 00:03:14,120 --> 00:03:17,320 Speaker 1: pull from, for the AI to learn from. And I 62 00:03:17,400 --> 00:03:19,760 Speaker 1: got kind of, I had a little moment of panic 63 00:03:19,760 --> 00:03:22,520 Speaker 1: about, well, now someone could use my voice. 64 00:03:22,800 --> 00:03:27,359 Speaker 2: Oh wow, they really could. Yeah, especially with, well, 65 00:03:27,480 --> 00:03:30,919 Speaker 2: like, the whole deepfake thing has recently been a thing, 66 00:03:31,760 --> 00:03:35,720 Speaker 2: so taking our images and then using that with our voices. 67 00:03:35,800 --> 00:03:42,920 Speaker 3: Ooh, yeah. See, now I'm passing on my concern to you. 68 00:03:44,240 --> 00:03:49,720 Speaker 1: Ah, indeed. And I mean, it's like you said, 69 00:03:49,920 --> 00:03:52,960 Speaker 1: this has been a problem with deepfakes with actors, 70 00:03:53,000 --> 00:03:55,440 Speaker 1: where, you know, people didn't know they had to put 71 00:03:55,440 --> 00:03:59,440 Speaker 1: that in their contract. And then Disney, for example, owns 72 00:03:59,480 --> 00:04:02,440 Speaker 1: your whole image and all of your voice clips, and 73 00:04:02,480 --> 00:04:07,520 Speaker 1: now it's you, but it's not you. It's not something 74 00:04:07,600 --> 00:04:10,360 Speaker 1: you agreed to, but they kind of own your face 75 00:04:10,440 --> 00:04:14,760 Speaker 1: and your voice.
So I mean, it's not like it's 76 00:04:14,760 --> 00:04:17,479 Speaker 1: a new concern, but it's just a new, it's like 77 00:04:17,560 --> 00:04:21,520 Speaker 1: an update on the concern as technology gets better. 78 00:04:21,960 --> 00:04:26,680 Speaker 2: Well, now people are losing jobs. Yeah, and we knew it 79 00:04:26,720 --> 00:04:29,520 Speaker 2: was coming, but this is quick. This seems very quick. 80 00:04:30,520 --> 00:04:31,120 Speaker 3: Yeah. 81 00:04:31,320 --> 00:04:33,000 Speaker 1: Yeah. And I've heard so many different takes on that, 82 00:04:33,040 --> 00:04:37,360 Speaker 1: because somebody was saying, you know, there's no way 83 00:04:37,960 --> 00:04:43,040 Speaker 1: AI could replace, like, they're still going to need humans 84 00:04:43,040 --> 00:04:45,760 Speaker 1: for it. Sounds so high tech and, like, sci-fi 85 00:04:45,839 --> 00:04:51,080 Speaker 1: right now, but to do more, like, expert things. But 86 00:04:51,120 --> 00:04:54,039 Speaker 1: then somebody else was like, I've never gotten 87 00:04:54,080 --> 00:04:56,480 Speaker 1: this good quality of work out of a freelancer 88 00:04:57,200 --> 00:05:02,760 Speaker 3: as I'm getting from AI. Like, oh no. What? I know, right, 89 00:05:04,880 --> 00:05:06,200 Speaker 3: that's salt in the wound. 90 00:05:10,560 --> 00:05:12,159 Speaker 1: But one of the reasons I wanted to talk about this, 91 00:05:12,200 --> 00:05:13,960 Speaker 1: I mentioned it in our last Monday mini, was 92 00:05:14,640 --> 00:05:16,599 Speaker 1: I have, I don't know if you've heard me say 93 00:05:16,640 --> 00:05:19,159 Speaker 1: it, but I have started publishing some fan fiction lately. 94 00:05:19,640 --> 00:05:20,480 Speaker 2: What's the name of it? 95 00:05:20,920 --> 00:05:24,920 Speaker 3: No, it would be so easy to find. It really would. 96 00:05:25,680 --> 00:05:29,200 Speaker 1: I actually made a pretty obvious accidental mistake in revealing 97 00:05:29,240 --> 00:05:30,320 Speaker 1: who I was in one of them. 98 00:05:30,839 --> 00:05:35,320 Speaker 3: That's amazing. It was really, really funny. It was like 99 00:05:35,360 --> 00:05:39,640 Speaker 3: I was doing a podcast outro. It was not good. 100 00:05:42,520 --> 00:05:46,960 Speaker 1: Yeah, yeah. But I started noticing, I told 101 00:05:47,000 --> 00:05:48,520 Speaker 1: all of you that I've been having this kind of, 102 00:05:48,560 --> 00:05:53,800 Speaker 1: like, notification anxiety. And I got some of these reviews, 103 00:05:53,800 --> 00:05:55,719 Speaker 1: which largely have been very good, but you get an 104 00:05:55,760 --> 00:05:58,920 Speaker 1: email every time you get a review, and I got 105 00:05:58,920 --> 00:06:02,040 Speaker 1: a bunch of them that said, like, this particular service, 106 00:06:02,040 --> 00:06:03,840 Speaker 1: which you have to pay for, has detected that there 107 00:06:03,960 --> 00:06:06,719 Speaker 1: is AI. This is written by a lying author. They're false, 108 00:06:06,720 --> 00:06:09,000 Speaker 1: they're not real, and they're tricking you. Here's how you 109 00:06:09,040 --> 00:06:11,800 Speaker 1: could find out that they're liars. 110 00:06:13,160 --> 00:06:13,440 Speaker 2: I know. 111 00:06:14,000 --> 00:06:17,360 Speaker 1: And then, because I've never encountered this before, 112 00:06:17,760 --> 00:06:20,200 Speaker 1: I was gonna type back, like, oh.
113 00:06:19,839 --> 00:06:25,840 Speaker 3: You mean this Star Wars love story that 114 00:06:26,040 --> 00:06:30,160 Speaker 3: I wrote? How dare you? 115 00:06:30,320 --> 00:06:31,360 Speaker 2: How dare you? 116 00:06:32,040 --> 00:06:34,400 Speaker 1: But then I was like, I couldn't click on their name, 117 00:06:34,480 --> 00:06:36,799 Speaker 1: so I couldn't see what else they had, like, reviewed 118 00:06:36,880 --> 00:06:40,039 Speaker 1: or written or anything like that. And then I noticed 119 00:06:40,080 --> 00:06:42,279 Speaker 1: that most of them were worded pretty much the same. 120 00:06:43,080 --> 00:06:46,839 Speaker 1: So I was like, oh, either this is just some 121 00:06:47,320 --> 00:06:51,880 Speaker 1: automated chatbot, or my conspiracy theory brain was like, what 122 00:06:52,000 --> 00:06:55,440 Speaker 1: if this is, like, AI? It's finding things and then 123 00:06:55,440 --> 00:06:57,120 Speaker 1: telling you you need to buy these other services. Like, 124 00:06:57,160 --> 00:07:00,640 Speaker 1: what if they're working together, the AI detection service and 125 00:07:00,680 --> 00:07:04,400 Speaker 1: the AI? But then I was like, is it true 126 00:07:04,480 --> 00:07:07,120 Speaker 1: that it's ninety one percent matched? I don't think so. 127 00:07:07,160 --> 00:07:09,800 Speaker 1: I think it's a lie. No one could match my 128 00:07:11,120 --> 00:07:29,840 Speaker 1: beautiful writing style, this very original Star Wars idea. 129 00:07:30,440 --> 00:07:34,760 Speaker 1: So that was happening, and then I've started to get 130 00:07:34,800 --> 00:07:38,200 Speaker 1: another thing that I think has to do with AI learning. 131 00:07:38,880 --> 00:07:43,400 Speaker 3: But I'll get these reviews that are not human beings, 132 00:07:43,440 --> 00:07:44,400 Speaker 3: I can say that for 133 00:07:44,360 --> 00:07:48,360 Speaker 1: sure. But it will say, like, it's like a Wikipedia entry, 134 00:07:48,920 --> 00:07:51,840 Speaker 1: and then it'll be like, this character was first born 135 00:07:51,880 --> 00:07:54,760 Speaker 1: in the Clone Wars era, at this time, and all of 136 00:07:54,760 --> 00:07:56,400 Speaker 1: this stuff, and I'm like, are you trying to learn 137 00:07:56,400 --> 00:07:59,400 Speaker 1: from my fan fiction? 138 00:08:00,080 --> 00:08:01,840 Speaker 2: You're stealing content. 139 00:08:02,520 --> 00:08:08,480 Speaker 1: Don't do it. It's very precious to me, AI. So 140 00:08:08,520 --> 00:08:10,800 Speaker 1: it's just, I'm glad I didn't panic. And now I 141 00:08:10,880 --> 00:08:14,120 Speaker 1: just mark those reviews as spam. I'm glad I didn't 142 00:08:14,160 --> 00:08:17,040 Speaker 1: engage in some kind of argument around it, but that 143 00:08:17,200 --> 00:08:19,800 Speaker 1: is something that I've noticed. And then I texted you. 144 00:08:21,320 --> 00:08:24,520 Speaker 1: But some people have been using AI to write fan 145 00:08:24,600 --> 00:08:28,880 Speaker 1: fiction or to edit it, which I still can't quite 146 00:08:28,880 --> 00:08:29,400 Speaker 1: figure out. 147 00:08:29,720 --> 00:08:32,800 Speaker 2: But yeah, I've been told to use that for edits, 148 00:08:32,800 --> 00:08:34,280 Speaker 2: and I'm like, I don't know what that means. I 149 00:08:34,280 --> 00:08:34,959 Speaker 2: don't want to do that. 150 00:08:35,440 --> 00:08:36,640 Speaker 3: I can't figure that out.
151 00:08:38,040 --> 00:08:41,199 Speaker 1: If someone knows, please write in, because, as you 152 00:08:41,240 --> 00:08:43,360 Speaker 1: know, there are, like, tags on fan fiction, 153 00:08:43,480 --> 00:08:46,840 Speaker 1: and one of them is now, like, beta read by AI, 154 00:08:47,280 --> 00:08:49,439 Speaker 1: which basically means edited by AI, 155 00:08:49,679 --> 00:08:50,800 Speaker 3: I think. So some 156 00:08:50,760 --> 00:08:53,560 Speaker 1: people are doing it and they are marking it, but 157 00:08:53,640 --> 00:08:55,920 Speaker 1: it's just interesting to see it 158 00:08:55,880 --> 00:08:56,840 Speaker 3: kind of play out. 159 00:08:57,559 --> 00:08:59,400 Speaker 1: And then you and I were talking about, I hadn't 160 00:08:59,400 --> 00:09:05,040 Speaker 1: really heard about this, but people getting AI calls from 161 00:09:05,240 --> 00:09:08,840 Speaker 1: people posing as someone they know and kind of saying, 162 00:09:08,880 --> 00:09:12,439 Speaker 1: like, your loved one is in this dangerous situation, right? 163 00:09:12,480 --> 00:09:15,840 Speaker 2: Right. So essentially what happened was they took the number, 164 00:09:15,960 --> 00:09:18,199 Speaker 2: because we know we can do that, like with Google. You 165 00:09:18,240 --> 00:09:20,319 Speaker 2: copy, you replicate a number, and then say, I've got your 166 00:09:20,320 --> 00:09:23,720 Speaker 2: person here, you need to send this much money. Essentially, 167 00:09:23,800 --> 00:09:30,360 Speaker 2: it's like phishing with AI, demanding ransom and 168 00:09:30,400 --> 00:09:31,760 Speaker 2: scaring the hell out of people. 169 00:09:32,320 --> 00:09:35,320 Speaker 3: Yeah, yeah, that is creepy. 170 00:09:36,080 --> 00:09:38,200 Speaker 2: Right. Well, now that you say that, and the fact 171 00:09:38,240 --> 00:09:41,440 Speaker 2: that we have our voices out there, what if they, like, they 172 00:09:41,559 --> 00:09:43,480 Speaker 2: use our voices to say, hey, I need money? And 173 00:09:43,520 --> 00:09:44,640 Speaker 2: now that I said the word money. 174 00:09:45,160 --> 00:09:47,839 Speaker 3: I think you've probably said it before here. 175 00:09:48,559 --> 00:09:51,520 Speaker 2: That's not good. To be fair, not many people are 176 00:09:51,520 --> 00:09:53,760 Speaker 2: just gonna outright give me money, because that's not something 177 00:09:53,800 --> 00:09:58,160 Speaker 2: I ask for. So if I call you to ask for money, 178 00:09:58,360 --> 00:09:59,959 Speaker 2: question that, because I don't typically do that. 179 00:10:01,040 --> 00:10:04,560 Speaker 3: Okay, I've got some calls I need to go back 180 00:10:04,600 --> 00:10:07,120 Speaker 3: over then, where my money's been going to. 181 00:10:09,520 --> 00:10:11,360 Speaker 1: And then I was kind of thinking about this, because 182 00:10:12,160 --> 00:10:14,920 Speaker 1: I was clearing out my voicemail and there was this 183 00:10:15,000 --> 00:10:17,680 Speaker 1: promotion, that you, Samantha, did to me first, with the 184 00:10:17,720 --> 00:10:21,440 Speaker 1: newest Scream movie, yes, where Ghostface would call you 185 00:10:21,520 --> 00:10:24,800 Speaker 1: and generally scare you, but it would say your name, 186 00:10:25,720 --> 00:10:29,080 Speaker 1: and I got to thinking about, like, oh dear, 187 00:10:29,760 --> 00:10:33,160 Speaker 1: someone could really get me pretty good in the future. 188 00:10:34,679 --> 00:10:35,080 Speaker 3: I did it.
189 00:10:35,360 --> 00:10:38,240 Speaker 2: I saw it on TikTok, as per usual, and I was like, oh, 190 00:10:38,320 --> 00:10:40,800 Speaker 2: I must do this immediately, because you had just seen it, 191 00:10:41,360 --> 00:10:43,880 Speaker 2: I believe. Yeah. And I was like, oh, I must 192 00:10:43,880 --> 00:10:45,880 Speaker 2: have been one of the first people, because you had 193 00:10:45,880 --> 00:10:47,840 Speaker 2: said later, when you tried it, it was too busy 194 00:10:48,360 --> 00:10:49,960 Speaker 2: and overwhelmed. So I 195 00:10:51,760 --> 00:10:57,880 Speaker 3: was, it was so much fun, okay? I was impressed, though, because 196 00:10:57,920 --> 00:10:59,079 Speaker 3: it said my name correctly. 197 00:10:59,120 --> 00:11:03,040 Speaker 1: Normally the 'ie' throws off technologies like that, but nope. 198 00:11:03,679 --> 00:11:05,880 Speaker 1: And it left a voicemail, because after you did it, 199 00:11:05,920 --> 00:11:07,559 Speaker 1: and yes, you were the first, after yours a 200 00:11:07,640 --> 00:11:10,240 Speaker 1: bunch of people did it to me, and so 201 00:11:11,880 --> 00:11:13,560 Speaker 3: I just, one of them left, it would leave 202 00:11:13,559 --> 00:11:19,400 Speaker 3: a voicemail. I kept it because I liked it. Oh yeah, yeah. 203 00:11:19,440 --> 00:11:22,400 Speaker 1: But it shut down after that, that weekend, because it 204 00:11:22,440 --> 00:11:23,560 Speaker 1: got so overwhelmed. 205 00:11:23,559 --> 00:11:24,800 Speaker 3: I tried to do it to someone else and it 206 00:11:24,880 --> 00:11:27,160 Speaker 3: was like, nope, stop. Leave me alone, 207 00:11:27,320 --> 00:11:29,160 Speaker 2: I'm murdering people. 208 00:11:29,800 --> 00:11:31,040 Speaker 3: Oh, yep. 209 00:11:31,160 --> 00:11:33,000 Speaker 1: And that was one of the cool things, was, I mean, 210 00:11:33,000 --> 00:11:36,000 Speaker 1: it's creepy, but as you said, the number said it 211 00:11:36,000 --> 00:11:38,040 Speaker 1: was from New York, which is where the movie takes place. 212 00:11:39,520 --> 00:11:39,880 Speaker 2: Yee. 213 00:11:41,080 --> 00:11:46,160 Speaker 1: Well, one other thing I want to mention, and this 214 00:11:46,240 --> 00:11:49,800 Speaker 1: has just been a whole bunch of random thoughts 215 00:11:49,840 --> 00:11:53,760 Speaker 1: I have right now, but also one thing I've learned, 216 00:11:54,040 --> 00:11:56,760 Speaker 1: I'm trying to learn, through fan fiction, is I don't 217 00:11:56,760 --> 00:11:59,800 Speaker 1: know a lot of lingo. I'm having to look up 218 00:12:00,400 --> 00:12:03,920 Speaker 1: a lot of new words, because, I'm guessing, young people... 219 00:12:04,280 --> 00:12:05,120 Speaker 1: I don't know. 220 00:12:05,160 --> 00:12:09,959 Speaker 2: Like, like, new phrases or new things specific to fan fiction? 221 00:12:10,920 --> 00:12:14,520 Speaker 1: New phrases. I'm just learning a lot. I'm learning a lot. 222 00:12:14,559 --> 00:12:16,320 Speaker 1: So far they've all been positive. But you know, you 223 00:12:16,360 --> 00:12:19,840 Speaker 1: have that moment, like, if I look this up, is it terrible? Yeah, 224 00:12:20,000 --> 00:12:25,320 Speaker 1: so far, yes, yes, yes, yes. Well, thank you so 225 00:12:25,440 --> 00:12:28,000 Speaker 1: much for going on this journey.
If you have any 226 00:12:28,000 --> 00:12:32,800 Speaker 1: thoughts about this or any experiences around this really burgeoning 227 00:12:32,800 --> 00:12:38,040 Speaker 1: world of AI or fan fiction or Ghostface phone calls, 228 00:12:39,160 --> 00:12:42,000 Speaker 1: please let us know. You can email us at 229 00:12:42,000 --> 00:12:44,600 Speaker 1: mom Stuff at iHeartMedia dot com. You can find 230 00:12:44,679 --> 00:12:47,120 Speaker 1: us on Twitter at mom Stuff Podcast, or on Instagram 231 00:12:47,120 --> 00:12:49,080 Speaker 1: and TikTok at Stuff I Never Told You. You can also 232 00:12:49,120 --> 00:12:51,400 Speaker 1: find us on YouTube. We also have a book that 233 00:12:51,480 --> 00:12:54,600 Speaker 1: you can pre-order at Stuff You Should Read Books dot com. 234 00:12:54,760 --> 00:12:57,880 Speaker 1: Thanks as always to our super producer Christina, our executive 235 00:12:57,920 --> 00:13:00,800 Speaker 1: producer Maya, and our contributor Joey. Y'all are the best, 236 00:13:01,080 --> 00:13:04,520 Speaker 1: you are. And thanks to you for listening. Stuff I Never 237 00:13:04,520 --> 00:13:06,520 Speaker 1: Told You is a production of iHeartRadio. For more podcasts from 238 00:13:06,520 --> 00:13:08,240 Speaker 1: iHeartRadio, you can check out the iHeartRadio app, 239 00:13:08,240 --> 00:13:10,400 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.