1 00:00:00,120 --> 00:00:04,480 Speaker 1: It's nine O two one OMG with Jennie Garth and 2 00:00:04,559 --> 00:00:05,360 Speaker 1: Tori Spelling. 3 00:00:09,480 --> 00:00:11,080 Speaker 2: This is so exciting, you guys. 4 00:00:11,320 --> 00:00:16,360 Speaker 1: His celebrity baby videos have gone viral, including the one 5 00:00:16,400 --> 00:00:19,160 Speaker 1: he made of our Beverly Hills nine oh two one oh cast, 6 00:00:19,160 --> 00:00:21,320 Speaker 1: which we're all obsessed with, right? 7 00:00:21,440 --> 00:00:24,000 Speaker 3: Oh my gosh, us as babies. 8 00:00:24,360 --> 00:00:27,520 Speaker 1: Yes, we're so excited to chat with the talented content 9 00:00:27,600 --> 00:00:32,279 Speaker 1: creator Michael Waller of JunkBoxAI dot com, who is 10 00:00:32,320 --> 00:00:36,519 Speaker 1: doing amazing things with artificial intelligence. Today we're going to 11 00:00:36,560 --> 00:00:41,199 Speaker 1: pick his brain about everything AI. Welcome, Michael Waller, to 12 00:00:41,240 --> 00:00:42,840 Speaker 1: the nine O two one OMG podcast. 13 00:00:43,280 --> 00:00:45,000 Speaker 4: Hi Tori, Hi Jenny. 14 00:00:44,880 --> 00:00:47,920 Speaker 2: Hi Mike. Good to have you. Okay, this is, this 15 00:00:48,080 --> 00:00:48,640 Speaker 2: is wild. 16 00:00:48,840 --> 00:00:52,120 Speaker 4: We know this is wild. You guys have any. 17 00:00:52,560 --> 00:00:55,360 Speaker 3: Thanks for making us relevant on social media again. 18 00:00:55,400 --> 00:00:57,040 Speaker 2: Oh my god, your posting. 19 00:00:58,000 --> 00:01:02,520 Speaker 1: Your post had us group chatting and laughing so hard 20 00:01:02,720 --> 00:01:06,160 Speaker 1: and just loving what you did. For everybody that doesn't 21 00:01:06,200 --> 00:01:08,759 Speaker 1: know, there was a post put up. You made this 22 00:01:08,880 --> 00:01:14,200 Speaker 1: with AI, I'm assuming, and then it was the cast 23 00:01:14,240 --> 00:01:16,679 Speaker 1: when they were babies, and this is our cast.
24 00:01:16,760 --> 00:01:18,039 Speaker 2: This is not the first cast that you've done this 25 00:01:18,040 --> 00:01:21,080 Speaker 5: for, right? I've done quite a few. Now, it's 26 00:01:21,200 --> 00:01:22,800 Speaker 5: hundreds of babies at this point. 27 00:01:23,000 --> 00:01:28,039 Speaker 3: Hundreds of babies. JunkBoxAI dot com. 28 00:01:28,520 --> 00:01:30,959 Speaker 2: Yes, that's your website. 29 00:01:31,440 --> 00:01:32,759 Speaker 3: I couldn't stop watching it. 30 00:01:33,840 --> 00:01:36,560 Speaker 4: No, yeah, and actually, let's see how many views does 31 00:01:36,600 --> 00:01:36,920 Speaker 4: it have? 32 00:01:37,000 --> 00:01:37,160 Speaker 5: Now? 33 00:01:37,280 --> 00:01:37,680 Speaker 2: Oh yeah? 34 00:01:37,720 --> 00:01:38,080 Speaker 3: How many? 35 00:01:38,880 --> 00:01:44,319 Speaker 5: Five million views on my page, over sixty-one thousand shares, 36 00:01:44,400 --> 00:01:46,679 Speaker 5: so everybody can't stop watching it, apparently. 37 00:01:47,000 --> 00:01:47,240 Speaker 2: Yeah. 38 00:01:47,400 --> 00:01:49,080 Speaker 1: Some you watch over and over and you're like, oh 39 00:01:49,080 --> 00:01:51,880 Speaker 1: my god, look how cute they were when they were babies. 40 00:01:51,920 --> 00:01:55,760 Speaker 1: But he told us that he wasn't chubby like that 41 00:01:55,880 --> 00:01:59,400 Speaker 1: as a baby. He was really skinny. 42 00:02:00,080 --> 00:02:02,760 Speaker 3: Oh my gosh. And for me, I wish I looked 43 00:02:02,880 --> 00:02:06,680 Speaker 3: like that as a baby. You were so cute, and 44 00:02:06,760 --> 00:02:08,800 Speaker 3: so were you. I couldn't. Oh my god. 45 00:02:08,919 --> 00:02:12,440 Speaker 5: And it kind of takes the adult features and brings 46 00:02:12,480 --> 00:02:16,240 Speaker 5: them back, kind of like, you know, twenty years, you know, 47 00:02:16,400 --> 00:02:17,079 Speaker 5: it's pretty great. 48 00:02:17,400 --> 00:02:18,919 Speaker 2: What gave you the idea to do this?
49 00:02:19,680 --> 00:02:19,960 Speaker 4: Well, 50 00:02:20,040 --> 00:02:22,560 Speaker 5: I experiment with a lot of different art styles on 51 00:02:22,600 --> 00:02:25,000 Speaker 5: my page. If you scroll down past the babies, you'll 52 00:02:25,000 --> 00:02:28,200 Speaker 5: see a bunch of different styles of animation. I've been 53 00:02:28,480 --> 00:02:30,919 Speaker 5: kind of experimenting with different styles for the past two 54 00:02:31,000 --> 00:02:33,960 Speaker 5: and a half years now with that page, and I 55 00:02:34,080 --> 00:02:36,080 Speaker 5: just kind of landed on that, but I gave it 56 00:02:36,120 --> 00:02:38,920 Speaker 5: a shot. I wound up doing a piece with a 57 00:02:38,960 --> 00:02:44,920 Speaker 5: bunch of iconic celebrities like Snoop Dogg, Bruce Lee, Mike Tyson, 58 00:02:46,360 --> 00:02:49,440 Speaker 5: Leonardo DiCaprio, and I put it out on my page 59 00:02:49,560 --> 00:02:51,560 Speaker 5: just like a random post. I didn't really think 60 00:02:51,600 --> 00:02:54,000 Speaker 5: about making much of a campaign out of it, and 61 00:02:54,040 --> 00:02:59,000 Speaker 5: I randomly collabed Snoop Dogg on it because he's collaborated 62 00:02:59,040 --> 00:03:01,000 Speaker 5: with me on a few posts, randomly, like, you 63 00:03:01,040 --> 00:03:02,720 Speaker 5: know how you can add somebody as a collaborator. 64 00:03:03,680 --> 00:03:06,320 Speaker 4: I added him as a collaborator on it. Within ten minutes, 65 00:03:06,360 --> 00:03:07,760 Speaker 4: he collaborated 66 00:03:07,120 --> 00:03:10,680 Speaker 5: on it, and it got such an awesome reaction and 67 00:03:11,240 --> 00:03:12,799 Speaker 5: it went like mega viral. 68 00:03:12,880 --> 00:03:14,320 Speaker 4: So I was like, I gotta do this again.
69 00:03:14,560 --> 00:03:17,160 Speaker 5: Did it again, and then it went just as viral, 70 00:03:17,200 --> 00:03:18,679 Speaker 5: and I was like, all right, I got something here. 71 00:03:18,760 --> 00:03:21,079 Speaker 5: And then I just kind of kept trying it out 72 00:03:21,120 --> 00:03:23,120 Speaker 5: to see, like, how long or how many babies I 73 00:03:23,160 --> 00:03:26,040 Speaker 5: could do before people got sick of them, and they're 74 00:03:25,880 --> 00:03:26,480 Speaker 2: not sick of it. 75 00:03:26,880 --> 00:03:27,760 Speaker 3: They want more and more. 76 00:03:28,480 --> 00:03:31,200 Speaker 1: Yeah, I know. The Friends cast video that you did 77 00:03:31,320 --> 00:03:32,840 Speaker 1: got, what, eighty million? 78 00:03:33,639 --> 00:03:36,560 Speaker 5: Yeah, almost, almost at eighty million views. You know what's 79 00:03:36,600 --> 00:03:37,640 Speaker 5: crazy about that video? 80 00:03:38,400 --> 00:03:39,680 Speaker 3: Film? 81 00:03:38,880 --> 00:03:42,840 Speaker 4: Okay, there's a statistic about that video that blew my mind. 82 00:03:43,280 --> 00:03:47,800 Speaker 5: So that video has sixteen years of watch time and it's 83 00:03:47,840 --> 00:03:49,240 Speaker 5: like a fifteen-second video. 84 00:03:49,560 --> 00:03:52,240 Speaker 4: Wow. So that means that it's out for, like, 85 00:03:52,280 --> 00:03:56,760 Speaker 1: people left, people spent fifteen, sixteen, what'd you say, sixteen years? 86 00:03:56,760 --> 00:03:59,480 Speaker 2: Sixteen years of their life? 87 00:03:59,640 --> 00:04:04,160 Speaker 5: Well, if you have a, yeah, cumulatively, it's sixteen years 88 00:04:04,160 --> 00:04:07,200 Speaker 5: of time that people have spent watching the video, which 89 00:04:07,240 --> 00:04:08,040 Speaker 5: blows my mind. 90 00:04:09,040 --> 00:04:11,360 Speaker 2: Does that, like, just get you so excited and want 91 00:04:11,400 --> 00:04:11,760 Speaker 2: to do it? 92 00:04:11,800 --> 00:04:12,040 Speaker 3: Really?
93 00:04:12,040 --> 00:04:14,080 Speaker 4: It does. It motivates me a lot. 94 00:04:14,160 --> 00:04:16,080 Speaker 5: It makes me want to stay up all night just 95 00:04:16,440 --> 00:04:17,920 Speaker 5: cranking out these babies. 96 00:04:20,040 --> 00:04:23,599 Speaker 1: I'm glad you're cranking out babies on the internet, because yeah, 97 00:04:23,600 --> 00:04:24,960 Speaker 1: that would be too many babies for you. 98 00:04:25,160 --> 00:04:27,120 Speaker 3: What's next? Are you going to keep doing babies? 99 00:04:27,480 --> 00:04:30,880 Speaker 5: That's a good question. For now, I'm gonna, I think 100 00:04:30,920 --> 00:04:33,320 Speaker 5: it's gonna be a baby-filled spring for me. I 101 00:04:33,560 --> 00:04:36,760 Speaker 5: just launched this, I know the context of it is 102 00:04:36,800 --> 00:04:39,760 Speaker 5: kind of weird, but I just launched this book, the 103 00:04:39,839 --> 00:04:42,160 Speaker 5: Junk Box Baby Book, and that's kind of brand new. 104 00:04:42,400 --> 00:04:44,440 Speaker 5: So even though I've been doing the babies for, like, 105 00:04:44,480 --> 00:04:47,799 Speaker 5: maybe a month now, the book still has a lot 106 00:04:47,880 --> 00:04:51,560 Speaker 5: of, uh, kind of work ahead of it before I, 107 00:04:51,120 --> 00:04:53,640 Speaker 5: uh, kind of start something else. So I think I'm 108 00:04:53,680 --> 00:04:56,680 Speaker 5: going to run that campaign for spring, and I'm also 109 00:04:56,680 --> 00:04:59,200 Speaker 5: going to experiment with some other stuff. But honestly, it 110 00:04:59,279 --> 00:05:01,800 Speaker 5: usually just kind of comes to me naturally, and 111 00:05:01,839 --> 00:05:04,719 Speaker 5: when I feel like doing something different, if my audience 112 00:05:04,760 --> 00:05:05,760 Speaker 5: reacts to it, I'll 113 00:05:05,560 --> 00:05:08,159 Speaker 4: continue to do it. And so far they just keep 114 00:05:08,200 --> 00:05:09,159 Speaker 4: asking for babies.
115 00:05:09,200 --> 00:05:11,560 Speaker 5: So that's pretty much, that's all that's in my sights 116 00:05:11,560 --> 00:05:11,960 Speaker 5: for now. 117 00:05:12,200 --> 00:05:13,640 Speaker 2: I mean, what you do is so cool. 118 00:05:13,839 --> 00:05:17,520 Speaker 1: We hear, everybody hears about AI, and you know, being 119 00:05:17,800 --> 00:05:21,280 Speaker 1: as gifted as you are at using it, should we 120 00:05:21,360 --> 00:05:22,240 Speaker 1: be scared of it at all? 121 00:05:22,279 --> 00:05:23,360 Speaker 2: I mean, what's your take on 122 00:05:23,240 --> 00:05:25,400 Speaker 3: that? In the future, are we never going to work again? 123 00:05:25,600 --> 00:05:26,600 Speaker 3: Just break it down for us. 124 00:05:26,880 --> 00:05:28,240 Speaker 4: No, I don't think that. So, 125 00:05:28,400 --> 00:05:33,120 Speaker 5: like, I have a pretty extensive background in traditional videography. 126 00:05:33,160 --> 00:05:37,080 Speaker 5: Before I became an AI visual creator, 127 00:05:37,240 --> 00:05:41,680 Speaker 5: I shot a lot of videos: music videos, short films, commercials, 128 00:05:41,680 --> 00:05:44,560 Speaker 5: and advertisements. When social media reels became popular, I kind 129 00:05:44,560 --> 00:05:47,279 Speaker 5: of moved on to doing that. But I will tell 130 00:05:47,320 --> 00:05:52,160 Speaker 5: you from my experience, you can't really replace, like, traditional 131 00:05:52,200 --> 00:05:54,839 Speaker 5: videography with AI, not yet at least, and I don't 132 00:05:54,839 --> 00:05:58,680 Speaker 5: think anytime soon. Creators like myself are good at 133 00:05:58,760 --> 00:06:01,479 Speaker 5: figuring out what AI is good at and capable of, 134 00:06:01,520 --> 00:06:01,840 Speaker 5: and then 135 00:06:01,800 --> 00:06:03,679 Speaker 4: kind of working within that pocket.
136 00:06:04,400 --> 00:06:07,200 Speaker 5: But there's so many, as many things as AI 137 00:06:07,360 --> 00:06:10,200 Speaker 5: is capable of, there's even more that it's not capable of. 138 00:06:12,000 --> 00:06:15,560 Speaker 5: Going back to your question about it being scary and 139 00:06:15,680 --> 00:06:19,280 Speaker 5: kind of like, is it destructive, I think that in 140 00:06:19,320 --> 00:06:22,880 Speaker 5: the wrong hands, it definitely can be misused. 141 00:06:23,800 --> 00:06:24,720 Speaker 4: Well, I mean, like, 142 00:06:26,200 --> 00:06:29,599 Speaker 5: well, even in terms of just, like, general CG, 143 00:06:29,920 --> 00:06:31,840 Speaker 5: people have been able to kind of do what we're 144 00:06:31,839 --> 00:06:34,120 Speaker 5: doing with AI for years now. 145 00:06:34,200 --> 00:06:36,120 Speaker 4: It's just that we're able to do it a lot faster. 146 00:06:37,320 --> 00:06:40,200 Speaker 2: Yeah, how long does it take to make a baby video? 147 00:06:40,680 --> 00:06:42,560 Speaker 4: I do about one baby video a night. 148 00:06:42,680 --> 00:06:46,040 Speaker 5: It takes me about four hours from conception to the 149 00:06:46,080 --> 00:06:50,000 Speaker 5: final kind of run-through, and then sometimes it takes 150 00:06:50,000 --> 00:06:52,680 Speaker 5: a little bit longer. But the longest part actually is 151 00:06:53,600 --> 00:06:56,680 Speaker 5: creating the data sets. Like, so, like, for you guys, 152 00:06:56,720 --> 00:06:59,840 Speaker 5: for example, to create a data set, which is pretty 153 00:06:59,920 --> 00:07:03,120 Speaker 5: much, it's just like a hard drive 154 00:07:03,279 --> 00:07:06,599 Speaker 5: full of information that I feed to an AI model 155 00:07:06,800 --> 00:07:10,560 Speaker 5: to pretty much understand what it is that makes up 156 00:07:11,080 --> 00:07:14,800 Speaker 5: Tori and Jenny.
So, like, for you guys, I went 157 00:07:14,800 --> 00:07:18,880 Speaker 5: on Google and I just typed in Tori Spelling, Jennie Garth, 158 00:07:18,960 --> 00:07:22,520 Speaker 5: and I downloaded fifty images of Tori, fifty images of Jenny, 159 00:07:22,920 --> 00:07:26,240 Speaker 5: and then I trained an AI data set. It's called 160 00:07:26,240 --> 00:07:28,920 Speaker 5: a LoRA, that's the technical term for it. 161 00:07:29,480 --> 00:07:32,080 Speaker 4: Sorry, LoRA. Yeah, LoRA, L-o-R-A. 162 00:07:33,280 --> 00:07:35,120 Speaker 5: It's an acronym for something, but I can't remember what 163 00:07:35,160 --> 00:07:37,560 Speaker 5: it is. And then when you're done with that, you 164 00:07:37,680 --> 00:07:40,560 Speaker 5: essentially have the ability to kind of type in whatever 165 00:07:40,640 --> 00:07:44,640 Speaker 5: you want and then get an image that's based on 166 00:07:45,160 --> 00:07:47,960 Speaker 5: the data that you trained it on. So, like, if 167 00:07:48,000 --> 00:07:50,760 Speaker 5: I was to say, create me a baby Tori, it 168 00:07:50,760 --> 00:07:53,280 Speaker 5: would use Tori's face and create a baby. If I 169 00:07:53,320 --> 00:07:56,640 Speaker 5: was to say, create me a medieval kind of hammer. 170 00:07:56,360 --> 00:07:59,720 Speaker 3: Stream to put postnoes job or like like that. 171 00:08:00,560 --> 00:08:04,520 Speaker 4: No, to answer that question, really, it really all depends 172 00:08:04,520 --> 00:08:04,840 Speaker 4: on the. 173 00:08:05,200 --> 00:08:06,560 Speaker 2: content knows it's perfect. 174 00:08:06,640 --> 00:08:08,840 Speaker 3: I loved it. Yeah, you guys' 175 00:08:08,960 --> 00:08:11,800 Speaker 5: data sets came out flawless, so there was no extra 176 00:08:11,840 --> 00:08:13,440 Speaker 5: tweaking or anything I had to do to those. 177 00:08:13,640 --> 00:08:14,880 Speaker 2: Well, you captivated us. 178 00:08:15,040 --> 00:08:15,720 Speaker 4: It's funny though.
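A note for readers on the "LoRA" mentioned above: it stands for Low-Rank Adaptation, a fine-tuning trick where a large frozen weight matrix W gets a small trainable low-rank correction B times A added on top, so only a handful of numbers are trained instead of the whole matrix. The sketch below illustrates just that arithmetic in plain Python; it is not the actual tooling or code used for the baby videos, and the matrices here are made-up toy values.

```python
# Toy illustration of the low-rank update behind a LoRA
# (Low-Rank Adaptation). Matrices are plain lists of rows.

def matmul(a, b):
    """Multiply matrix a (m x k) by matrix b (k x n)."""
    return [[sum(a[i][p] * b[p][j] for p in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def lora_apply(W, A, B, alpha=1.0):
    """Return W + alpha * (B @ A): frozen base weights W plus a
    small trainable low-rank correction."""
    delta = matmul(B, A)
    return [[W[i][j] + alpha * delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

# A 4x4 base weight matrix (kept frozen during fine-tuning)...
W = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
# ...adapted by rank-1 factors: B is 4x1 and A is 1x4, so only
# 8 numbers are trained instead of all 16 entries of W.
B = [[1.0], [0.0], [0.0], [0.0]]
A = [[0.0, 0.5, 0.0, 0.0]]

print(lora_apply(W, A, B)[0][1])  # prints 0.5, the one entry the update changed
```

Real LoRA training for an image model like Flux happens inside an ML framework; the point here is only that the adapter trains far fewer parameters than the base model holds.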
179 00:08:16,280 --> 00:08:19,520 Speaker 5: Last night I actually put together an animation to kind 180 00:08:19,560 --> 00:08:21,880 Speaker 5: of show you a little bit more of kind of 181 00:08:21,880 --> 00:08:24,240 Speaker 5: what can be done with AI. Since I already had 182 00:08:24,280 --> 00:08:26,200 Speaker 5: your data sets, I figured it'd be cool to kind 183 00:08:26,200 --> 00:08:29,280 Speaker 5: of make something to show you guys here today with 184 00:08:29,320 --> 00:08:30,400 Speaker 5: some data sets. 185 00:08:30,440 --> 00:08:33,000 Speaker 2: I'm so impressed. So very personal. 186 00:08:33,559 --> 00:08:33,920 Speaker 3: I know. 187 00:08:35,679 --> 00:08:39,760 Speaker 2: You've got us flustered, blushing. Mike, tell me again, what was that? 188 00:08:40,280 --> 00:08:40,440 Speaker 3: Is it 189 00:08:40,480 --> 00:08:40,960 Speaker 2: software? 190 00:08:41,360 --> 00:08:41,640 Speaker 4: Yeah. 191 00:08:41,679 --> 00:08:44,440 Speaker 5: So there's a bunch of different programs that I use. 192 00:08:44,640 --> 00:08:48,720 Speaker 5: The main one to generate images is actually a developer 193 00:08:48,800 --> 00:08:52,319 Speaker 5: tool called Flux, F-L-U-X. But I think 194 00:08:52,360 --> 00:08:55,120 Speaker 5: what you were asking was, the technical term for 195 00:08:55,200 --> 00:08:56,480 Speaker 5: the data set is called 196 00:08:56,280 --> 00:08:57,560 Speaker 4: a LoRA. LoRA. 197 00:08:57,960 --> 00:09:01,599 Speaker 5: Yeah. Okay, a lot of this stuff is really, a 198 00:09:01,679 --> 00:09:04,360 Speaker 5: lot of the information that I'm talking about is, it's 199 00:09:04,400 --> 00:09:07,120 Speaker 5: all available on the internet.
If anybody in 200 00:09:07,160 --> 00:09:09,319 Speaker 5: your audience wants to learn about this stuff, you can 201 00:09:09,360 --> 00:09:13,800 Speaker 5: easily just find it by typing in image generation or 202 00:09:14,320 --> 00:09:18,200 Speaker 5: data set creation and then, like, write AI after that 203 00:09:18,280 --> 00:09:19,840 Speaker 5: and you'll see thousands of videos. 204 00:09:19,880 --> 00:09:21,120 Speaker 4: I have a couple of videos up too. 205 00:09:21,920 --> 00:09:24,000 Speaker 3: There's so many apps, though. How do you know which 206 00:09:24,000 --> 00:09:26,520 Speaker 3: ones are worthwhile for the beginner? 207 00:09:27,240 --> 00:09:29,520 Speaker 5: That's the secret sauce, and you guys happen to be 208 00:09:29,600 --> 00:09:32,480 Speaker 5: with the experts, so the best 209 00:09:32,280 --> 00:09:34,000 Speaker 4: ones to use right now. Do you guys 210 00:09:34,040 --> 00:09:35,880 Speaker 4: use ChatGPT regularly? 211 00:09:36,600 --> 00:09:41,120 Speaker 5: Okay, ChatGPT is a very commonly used AI software. 212 00:09:41,559 --> 00:09:42,920 Speaker 4: So I figured you guys might know that. 213 00:09:43,640 --> 00:09:46,640 Speaker 3: We're traditional in so many ways that it's so hard, 214 00:09:46,920 --> 00:09:50,000 Speaker 3: and everyone's like, it's life-changing, you've got to try it, 215 00:09:50,120 --> 00:09:51,800 Speaker 3: and I just, it's been hard. 216 00:10:04,520 --> 00:10:06,400 Speaker 4: So I have a couple of things I want to 217 00:10:06,400 --> 00:10:07,000 Speaker 4: show you guys. 218 00:10:07,280 --> 00:10:11,040 Speaker 5: This is called the Tori and Jenny Multiverse. 219 00:10:11,920 --> 00:10:13,880 Speaker 3: Oh my goodness, Jen. A 220 00:10:13,800 --> 00:10:14,880 Speaker 4: couple of different looks.
221 00:10:15,600 --> 00:10:18,000 Speaker 5: I just kind of went wild and put you guys 222 00:10:18,040 --> 00:10:19,760 Speaker 5: in a bunch of different lights and a bunch of 223 00:10:19,840 --> 00:10:23,640 Speaker 5: different environments. I wanted to say this: a lot of 224 00:10:23,679 --> 00:10:26,280 Speaker 5: the decisions in attire and clothing I kind of just 225 00:10:26,360 --> 00:10:27,199 Speaker 5: left up to the AI. 226 00:10:27,559 --> 00:10:31,559 Speaker 4: So don't pick it apart, uh, too much, you know 227 00:10:31,640 --> 00:10:31,960 Speaker 4: what I mean? 228 00:10:32,120 --> 00:10:34,080 Speaker 2: No problem, we're so impressed. 229 00:10:34,920 --> 00:10:37,840 Speaker 3: We're already blown away. No picking here. 230 00:10:38,760 --> 00:10:42,479 Speaker 2: This is so bizarre, Mike. What? 231 00:10:44,920 --> 00:10:44,959 Speaker 4: That? 232 00:10:46,200 --> 00:10:46,800 Speaker 2: Yeah? 233 00:10:46,840 --> 00:10:49,719 Speaker 3: Thank you? Oh my god, that just made our, look 234 00:10:49,720 --> 00:10:54,280 Speaker 3: at us, both smiles. It's fun, like it's just fun. 235 00:10:54,720 --> 00:10:56,680 Speaker 4: Yeah, it's super cool to kind of see 236 00:10:56,440 --> 00:10:58,640 Speaker 2: yourself in things that you would never see, like. 237 00:10:58,880 --> 00:11:01,560 Speaker 5: Totally, totally, that's the whole point of it. I think 238 00:11:01,600 --> 00:11:03,679 Speaker 5: that's the kind of, that's the gap that it kind 239 00:11:03,679 --> 00:11:07,040 Speaker 5: of fills for people like me. For the longest time, 240 00:11:07,880 --> 00:11:11,000 Speaker 5: creating stuff like that was only kind of accessible for 241 00:11:11,080 --> 00:11:14,319 Speaker 5: people who had an extreme amount of knowledge or had 242 00:11:14,360 --> 00:11:17,839 Speaker 5: an extremely extensive background in that type of stuff.
And 243 00:11:19,640 --> 00:11:21,760 Speaker 5: to be honest with you, there's so many more creative 244 00:11:21,800 --> 00:11:24,800 Speaker 5: people who are creative up here but don't 245 00:11:24,840 --> 00:11:27,080 Speaker 5: have the hands or the tools to get out what 246 00:11:27,120 --> 00:11:30,160 Speaker 5: they have in their head. And I look at AI 247 00:11:30,320 --> 00:11:32,920 Speaker 5: as kind of like the bridge to kind of allow 248 00:11:33,040 --> 00:11:35,560 Speaker 5: people like myself to kind of get out what's in 249 00:11:35,600 --> 00:11:40,199 Speaker 5: my head without having to, you know, walk a mile backwards, 250 00:11:40,440 --> 00:11:41,000 Speaker 5: you know what I mean? 251 00:11:41,240 --> 00:11:41,400 Speaker 2: Right? 252 00:11:41,520 --> 00:11:44,080 Speaker 3: Like, are you an artist? Like, can you paint? 253 00:11:44,960 --> 00:11:47,280 Speaker 4: Yeah. So I've always been a visual artist. 254 00:11:47,400 --> 00:11:52,040 Speaker 5: I can draw, I've practiced every medium from photo manipulation 255 00:11:52,240 --> 00:11:55,280 Speaker 5: to video. I've always gone with the most kind of, 256 00:11:55,360 --> 00:11:59,360 Speaker 5: like, dense visual art form. So I went from photo 257 00:11:59,400 --> 00:12:02,839 Speaker 5: manipulation to video because video just seemed like more, and then 258 00:12:02,920 --> 00:12:06,160 Speaker 5: AI kind of unlocks this other kind of realm of creativity.
259 00:12:06,160 --> 00:12:08,800 Speaker 5: But it also kind of rides the video train, so 260 00:12:08,840 --> 00:12:11,079 Speaker 5: it's even more than that to me. So, and also, 261 00:12:11,160 --> 00:12:14,199 Speaker 5: having the ability to be able to kind of create 262 00:12:14,320 --> 00:12:18,560 Speaker 5: these worlds from start to finish without leaving my... Sometimes 263 00:12:18,600 --> 00:12:20,080 Speaker 5: I'm lying in my bed while I do it, and 264 00:12:20,120 --> 00:12:22,880 Speaker 5: you don't have to go out and shoot anything. It's, 265 00:12:23,040 --> 00:12:25,920 Speaker 5: it's really, I don't know, it's really cool. This is 266 00:12:25,960 --> 00:12:26,440 Speaker 5: really fun. 267 00:12:26,600 --> 00:12:29,640 Speaker 1: It's a whole different art form. But what about the 268 00:12:29,720 --> 00:12:32,880 Speaker 1: lost art form of actually getting out of your 269 00:12:32,880 --> 00:12:34,360 Speaker 1: bed and going and filming something? 270 00:12:35,160 --> 00:12:37,640 Speaker 5: I will say from my experience, I can definitely tell 271 00:12:37,679 --> 00:12:42,360 Speaker 5: you that I pretty much took my own job as 272 00:12:42,360 --> 00:12:46,400 Speaker 5: a videographer. You know how they say, like, AI creators 273 00:12:46,440 --> 00:12:49,040 Speaker 5: are going to take the jobs of videographers? I think 274 00:12:49,080 --> 00:12:53,760 Speaker 5: that if you look at it as a tool, like 275 00:12:53,880 --> 00:12:56,559 Speaker 5: the way I do, you could kind of use it 276 00:12:56,600 --> 00:12:59,560 Speaker 5: to kind of help yourself as a videographer and grow 277 00:12:59,640 --> 00:13:02,560 Speaker 5: your brand and business. Like, I don't think I ever 278 00:13:02,640 --> 00:13:05,960 Speaker 5: had any opportunities like the opportunities I'm getting today.
Even 279 00:13:06,600 --> 00:13:08,880 Speaker 5: being able to talk to people like you guys and 280 00:13:09,040 --> 00:13:13,319 Speaker 5: get on shows like this wouldn't really... I've been doing 281 00:13:13,720 --> 00:13:16,480 Speaker 5: content creation for ten years and I haven't really been 282 00:13:16,480 --> 00:13:18,920 Speaker 5: able to make as much of an impact as 283 00:13:18,960 --> 00:13:24,600 Speaker 5: I am now without these tools. So I'm very grateful for them, 284 00:13:24,600 --> 00:13:26,640 Speaker 5: and I feel blessed that I happen to 285 00:13:26,640 --> 00:13:29,440 Speaker 5: be able to create in a time that they're available, 286 00:13:29,640 --> 00:13:31,440 Speaker 5: you know, or as available as they are. 287 00:13:32,120 --> 00:13:33,360 Speaker 2: Yeah, I guess with anything. 288 00:13:33,920 --> 00:13:36,760 Speaker 1: I was recently having a conversation with someone and talking 289 00:13:36,760 --> 00:13:39,600 Speaker 1: about AI and how, you know, it can be a 290 00:13:39,640 --> 00:13:44,800 Speaker 1: scary thing, and he pointed out that, you know, a 291 00:13:44,840 --> 00:13:48,400 Speaker 1: surgeon holding a knife, a knife can be a scary thing, 292 00:13:48,440 --> 00:13:50,520 Speaker 1: but when a surgeon is holding it to save your life, 293 00:13:50,559 --> 00:13:54,080 Speaker 1: it's a good thing. Agreed. And, like, you know, if someone 294 00:13:54,120 --> 00:13:55,800 Speaker 1: bad is using it to hurt someone, then it's a 295 00:13:55,800 --> 00:13:58,040 Speaker 1: bad thing. So it's very similar with AI, I guess, 296 00:13:58,080 --> 00:14:01,480 Speaker 1: like, it can be good, can be bad. Hopefully 297 00:14:01,480 --> 00:14:04,199 Speaker 1: people will use it for good, but I don't know 298 00:14:04,200 --> 00:14:05,400 Speaker 1: if I have that much faith. 299 00:14:05,800 --> 00:14:07,320 Speaker 4: I try to. I try to.
300 00:14:07,480 --> 00:14:10,520 Speaker 5: I know, I know that that's a pretty hot topic 301 00:14:10,679 --> 00:14:14,240 Speaker 5: when it comes to AI, and I'm very, very aware 302 00:14:14,320 --> 00:14:16,520 Speaker 5: of how far you can push it in terms of 303 00:14:16,600 --> 00:14:19,640 Speaker 5: kind of, like, deceiving people or manipulating people's thoughts and 304 00:14:19,640 --> 00:14:23,880 Speaker 5: stuff like that. And I think it's really important to 305 00:14:24,840 --> 00:14:28,720 Speaker 5: find a pocket where you can make art but still 306 00:14:28,800 --> 00:14:31,000 Speaker 5: kind of hold a level of respect and kind of 307 00:14:31,040 --> 00:14:34,720 Speaker 5: have, like, a sense of honor when you're creating it. 308 00:14:35,080 --> 00:14:38,080 Speaker 5: I try to think a lot about, like, what would 309 00:14:38,080 --> 00:14:40,680 Speaker 5: the family members of the people who I created this 310 00:14:40,840 --> 00:14:43,640 Speaker 5: art about, or what would their fans, or what would 311 00:14:43,640 --> 00:14:47,040 Speaker 5: the people think first when I create the art, 312 00:14:47,120 --> 00:14:50,920 Speaker 5: not, like, what would be an awesome joke. Because even 313 00:14:50,960 --> 00:14:53,760 Speaker 5: if you guys don't ever see it, I still think 314 00:14:53,800 --> 00:14:58,800 Speaker 5: that it's important to kind of, kind of, like, show 315 00:14:58,840 --> 00:15:01,000 Speaker 5: that level of respect, just to kind of put a 316 00:15:01,000 --> 00:15:02,920 Speaker 5: good name out for AI. And I feel like if 317 00:15:02,920 --> 00:15:06,520 Speaker 5: more creators like myself are passionate about making art like that, 318 00:15:06,800 --> 00:15:10,440 Speaker 5: maybe we can, like, kind of lean Hollywood's perspective a 319 00:15:10,440 --> 00:15:12,680 Speaker 5: little bit more and kind of see, maybe we can 320 00:15:12,760 --> 00:15:13,680 Speaker 5: work together in a way.
321 00:15:13,760 --> 00:15:14,480 Speaker 4: You know what I'm saying? 322 00:15:14,720 --> 00:15:18,360 Speaker 3: I agree. I've heard so many people speak the 323 00:15:18,400 --> 00:15:21,400 Speaker 3: way you're speaking about it, where it is about bridging 324 00:15:21,440 --> 00:15:25,720 Speaker 3: a gap and being respectful to the original form of 325 00:15:25,880 --> 00:15:30,600 Speaker 3: art and content creating, and just, kind of, I mean, 326 00:15:30,680 --> 00:15:33,600 Speaker 3: we have to evolve, things have to go a certain way. 327 00:15:33,760 --> 00:15:37,000 Speaker 3: It just can't all be manipulated one way or another. 328 00:15:37,200 --> 00:15:40,040 Speaker 3: So, and I know there's so many people that speak 329 00:15:40,080 --> 00:15:44,080 Speaker 3: towards your movement of incorporating it and using it the 330 00:15:44,160 --> 00:15:44,640 Speaker 3: right way. 331 00:15:45,080 --> 00:15:48,040 Speaker 5: Yeah, I mean, a lot of people nowadays, I mean honestly, 332 00:15:48,160 --> 00:15:50,760 Speaker 5: a lot of people are copying my videos and it's 333 00:15:50,800 --> 00:15:52,600 Speaker 5: awesome to see a lot of other people kind of 334 00:15:52,640 --> 00:15:54,760 Speaker 5: taking those kind of ideas and pushing it more in 335 00:15:54,760 --> 00:15:58,600 Speaker 5: that direction. It definitely helps in that area of trying 336 00:15:58,600 --> 00:16:02,480 Speaker 5: to make a lighter kind of environment for people 337 00:16:02,520 --> 00:16:04,680 Speaker 5: to come and create. And it's not, it's not just 338 00:16:04,720 --> 00:16:07,840 Speaker 5: this scary world where we have these, uh, like, where 339 00:16:07,840 --> 00:16:09,680 Speaker 5: our fingers are on the nuke button and we can just 340 00:16:09,760 --> 00:16:11,440 Speaker 5: kind of blow up Hollywood at any moment. 341 00:16:11,480 --> 00:16:12,520 Speaker 4: It's really not like that. 342 00:16:12,760 --> 00:16:14,000 Speaker 3: It's not like War Games.
343 00:16:14,200 --> 00:16:16,120 Speaker 4: No, not really. I'm really just kidding. 344 00:16:16,120 --> 00:16:18,840 Speaker 3: Do you guys remember that movie? Matthew Broderick. 345 00:16:18,880 --> 00:16:20,920 Speaker 4: Yes, I should do a War Games animation. 346 00:16:21,120 --> 00:16:21,960 Speaker 2: Oh my gosh. 347 00:16:22,160 --> 00:16:26,880 Speaker 1: Well, like, we both commend you for using AI in 348 00:16:26,920 --> 00:16:32,000 Speaker 1: this fun, creative, artistic way. It's so entertaining, it's so refreshing. 349 00:16:32,600 --> 00:16:34,520 Speaker 1: I think my mom is going to love to see it. 350 00:16:34,560 --> 00:16:36,800 Speaker 1: I'm gonna show it to her for sure. 351 00:16:37,360 --> 00:16:39,360 Speaker 5: If you guys have any requests, if your mom likes 352 00:16:39,360 --> 00:16:43,520 Speaker 5: any shows, let me know. I love, I have such 353 00:16:43,520 --> 00:16:44,600 Speaker 5: a huge running list. 354 00:16:45,200 --> 00:16:47,760 Speaker 3: Oh my god, I'm sure everyone and their mother are 355 00:16:47,800 --> 00:16:49,560 Speaker 3: now saying, like, you have to do this for me. 356 00:16:49,680 --> 00:16:53,080 Speaker 4: And, right, oh my god, yeah, that's the only text message. 357 00:16:53,760 --> 00:16:55,720 Speaker 3: You're screwed. Wait, are you on Cameo? 358 00:16:56,160 --> 00:16:58,000 Speaker 4: No, that's a good idea. I should, maybe I should 359 00:16:58,040 --> 00:16:58,600 Speaker 4: make one of those. 360 00:16:59,360 --> 00:17:00,600 Speaker 3: Okay, give me the referral. 361 00:17:00,880 --> 00:17:07,119 Speaker 1: Thanks. Well, good luck on your adorable coffee table book. Yes, 362 00:17:07,280 --> 00:17:10,719 Speaker 1: coming out, Junk Box Babies, right? Where can people find it? 363 00:17:11,080 --> 00:17:13,280 Speaker 5: You can find it on Amazon, so it's sold right 364 00:17:13,320 --> 00:17:16,200 Speaker 5: on Amazon.
Also my website, there's a ton of links 365 00:17:16,200 --> 00:17:18,600 Speaker 5: on there. You guys can definitely find it on there, 366 00:17:18,960 --> 00:17:20,560 Speaker 5: and on my Instagram, I have a bunch of links 367 00:17:20,600 --> 00:17:22,400 Speaker 5: in my bio for it. It's pretty much everywhere. 368 00:17:22,440 --> 00:17:22,800 Speaker 3: All right. 369 00:17:22,840 --> 00:17:24,239 Speaker 2: Well, we're going to be looking for you. 370 00:17:24,560 --> 00:17:26,159 Speaker 4: I really appreciate you guys having me. 371 00:17:26,640 --> 00:17:29,800 Speaker 3: Oh my god, thank you, and thanks for including us. 372 00:17:29,880 --> 00:17:32,240 Speaker 3: There's so many shows and movies you could have 373 00:17:32,280 --> 00:17:35,040 Speaker 3: done, so many you do do, and we're grateful to be 374 00:17:35,080 --> 00:17:37,840 Speaker 3: a part of it. And our kids, well, my kids, 375 00:17:37,960 --> 00:17:38,480 Speaker 3: right, like. 376 00:17:39,320 --> 00:17:41,920 Speaker 5: You guys will definitely be featured on my page again 377 00:17:42,000 --> 00:17:44,320 Speaker 5: in the future. So I appreciate the time. 378 00:17:44,359 --> 00:17:47,320 Speaker 2: Thank you, guys. Thank you, Mike. Be well.