1 00:00:00,080 --> 00:00:02,639 Speaker 1: This is the Fred Show. Kelly Clarkson is returning 2 00:00:02,640 --> 00:00:05,000 Speaker 1: to Las Vegas in twenty twenty six for her Studio 3 00:00:05,160 --> 00:00:09,119 Speaker 1: Sessions Las Vegas Residency. You can enter now for a 4 00:00:09,240 --> 00:00:11,120 Speaker 1: chance to win a trip for two to the July 5 00:00:11,240 --> 00:00:14,600 Speaker 1: twenty fourth show, a two night hotel stay at Flamingo 6 00:00:14,640 --> 00:00:17,000 Speaker 1: Hotel and Casino in Las Vegas July twenty third through 7 00:00:17,000 --> 00:00:21,000 Speaker 1: the twenty fifth, and round trip airfare. Text Beautiful to 8 00:00:21,320 --> 00:00:24,560 Speaker 1: five seven, seven three nine right now for a chance 9 00:00:24,560 --> 00:00:27,280 Speaker 1: to win. A confirmation text will be sent. Standard message 10 00:00:27,320 --> 00:00:31,040 Speaker 1: and data rates may apply. All thanks to Live Nation. Kiki, 11 00:00:31,120 --> 00:00:33,239 Speaker 1: you met with an AI expert. 12 00:00:33,360 --> 00:00:35,680 Speaker 2: Yes it is OK, not me, by the way, right, 13 00:00:37,320 --> 00:00:40,040 Speaker 2: yeah yeah yeah at our house X correction. 14 00:00:39,920 --> 00:00:42,360 Speaker 1: That Paulina is screaming at me that AI would change 15 00:00:42,360 --> 00:00:44,680 Speaker 1: my life, but she can't quite tell me how. I've tried. 16 00:00:44,960 --> 00:00:48,239 Speaker 3: I'm like doing this again, change your life, man, did 17 00:00:48,280 --> 00:00:49,240 Speaker 3: it ever do it the first time? 18 00:00:52,560 --> 00:00:56,200 Speaker 1: Right but right, But you've yet to fully truly educate 19 00:00:56,280 --> 00:00:58,960 Speaker 1: me on like specific ways I've told you. 20 00:00:59,040 --> 00:01:01,480 Speaker 3: Look at your fridge, ask it how do I make 21 00:01:01,520 --> 00:01:02,120 Speaker 3: dinner with this?
22 00:01:02,360 --> 00:01:05,360 Speaker 4: With this mustard and whatever you've gotten there banana, It'll 23 00:01:05,400 --> 00:01:06,399 Speaker 4: give you a whole recipe. 24 00:01:06,480 --> 00:01:09,360 Speaker 1: I feel like it's a little more involved than that. 25 00:01:09,640 --> 00:01:11,280 Speaker 1: I feel like, you know, if it's taking over your 26 00:01:11,280 --> 00:01:14,119 Speaker 1: life and doing all of your life responsibilities, and you're 27 00:01:14,120 --> 00:01:16,200 Speaker 1: gonna have to have a little bit better understanding, which 28 00:01:16,200 --> 00:01:18,000 Speaker 1: is why Kiki went to an AI expert. Now, how 29 00:01:18,040 --> 00:01:19,560 Speaker 1: did you find said AI expert? 30 00:01:19,720 --> 00:01:21,320 Speaker 5: I'm on The Shade Room. I met her. 31 00:01:21,319 --> 00:01:25,120 Speaker 2: Her name is Sinead Bovell and she's a She's what 32 00:01:25,160 --> 00:01:27,959 Speaker 2: you would call a futurist. Okay, so she's all about 33 00:01:28,000 --> 00:01:31,199 Speaker 2: the future, and she said, we need to make AI 34 00:01:31,400 --> 00:01:33,720 Speaker 2: our friend and learn how to use it. 35 00:01:34,080 --> 00:01:36,679 Speaker 5: She said, there's no getting rid of it. It's here. 36 00:01:36,840 --> 00:01:39,840 Speaker 5: It's going to replace your job. But what you should do? 37 00:01:39,920 --> 00:01:41,520 Speaker 1: That's fine? Yes, how long do I have? 38 00:01:41,920 --> 00:01:45,160 Speaker 2: But she said, for the time being, you need to 39 00:01:45,400 --> 00:01:46,319 Speaker 2: make it work for you. 40 00:01:46,440 --> 00:01:47,760 Speaker 5: So did you? You are? 41 00:01:47,800 --> 00:01:50,680 Speaker 2: You are always smarter than AI because you teach AI 42 00:01:50,440 --> 00:01:51,080 Speaker 5: how to work for you. 43 00:01:51,200 --> 00:01:53,400 Speaker 1: Wow. Okay, so give me an example.
How are 44 00:01:53,480 --> 00:01:56,760 Speaker 1: you going to use AI today to work for you? 45 00:01:56,960 --> 00:01:58,760 Speaker 5: Well, for me, for one, I used it last 46 00:01:58,600 --> 00:02:01,800 Speaker 1: night to do something actually, so wow, oh sure, I 47 00:02:01,840 --> 00:02:03,800 Speaker 1: go off, king. I didn't have time to read a 48 00:02:03,800 --> 00:02:06,160 Speaker 1: whole article about something that we were talking about on 49 00:02:06,160 --> 00:02:09,799 Speaker 1: a little TV show. So I said, AI, summarize this 50 00:02:09,800 --> 00:02:11,440 Speaker 5: for me. You don't want to use SparkNotes? 51 00:02:11,440 --> 00:02:17,720 Speaker 1: And it did. SparkNotes, CliffsNotes, Okay, yeah, old 52 00:02:17,720 --> 00:02:19,720 Speaker 1: time they were like an actual book. I'm sure. Now 53 00:02:19,720 --> 00:02:22,400 Speaker 1: it's all on an app or something. Okay. So then 54 00:02:22,639 --> 00:02:24,440 Speaker 1: so yeah, so I used it last night. But tell 55 00:02:24,480 --> 00:02:25,919 Speaker 1: me how are you going to use it today? 56 00:02:26,000 --> 00:02:27,760 Speaker 2: Well, first of all, I'm going to create an AI 57 00:02:27,880 --> 00:02:31,200 Speaker 2: version of Kiki, okay, because it's going to happen, so 58 00:02:31,320 --> 00:02:32,519 Speaker 2: I might as well be the owner. 59 00:02:32,720 --> 00:02:34,680 Speaker 5: Okay, I'm going to do that. So how do 60 00:02:34,680 --> 00:02:35,120 Speaker 5: you do that? 61 00:02:35,160 --> 00:02:37,440 Speaker 2: Well, she's going to have an Instagram page, she's going 62 00:02:37,480 --> 00:02:39,680 Speaker 2: to do lifestyle content. She's just going to be an 63 00:02:39,680 --> 00:02:42,600 Speaker 2: AI character. But it's me working the AI character. But 64 00:02:42,600 --> 00:02:45,200 Speaker 2: I have to create her because when the AI universe launches, 65 00:02:45,440 --> 00:02:47,000 Speaker 2: she's going to have to have a place to live.
66 00:02:47,400 --> 00:02:48,839 Speaker 3: How quickly you switched out? 67 00:02:49,440 --> 00:02:49,560 Speaker 2: Right? 68 00:02:50,240 --> 00:02:50,679 Speaker 3: You hate it? 69 00:02:50,919 --> 00:02:54,840 Speaker 2: I still do, but I got to be able to fight. 70 00:02:53,880 --> 00:02:57,880 Speaker 1: So there's I just want to make sure we're going 71 00:02:57,960 --> 00:03:00,240 Speaker 1: to have real Kiki, yes, and then we're all still 72 00:03:00,240 --> 00:03:02,200 Speaker 1: going to have AI Kiki. Yes, that's one more 73 00:03:02,240 --> 00:03:04,400 Speaker 1: thing I have to follow absolutely so that people don't 74 00:03:04,400 --> 00:03:05,160 Speaker 1: get upset with me. 75 00:03:05,320 --> 00:03:10,079 Speaker 5: Yes in comment, Yeah, but it's not really you, No, 76 00:03:10,240 --> 00:03:10,720 Speaker 5: it is me. 77 00:03:11,240 --> 00:03:13,519 Speaker 2: It is me, but in the AI universe, because there's 78 00:03:13,520 --> 00:03:16,760 Speaker 2: gonna be a whole other universe the universe. Yeah, it's here, 79 00:03:17,080 --> 00:03:19,280 Speaker 2: probably not. I know, you get this all figured out already. 80 00:03:19,400 --> 00:03:22,040 Speaker 4: So I don't have my AI version ready, but I 81 00:03:22,080 --> 00:03:25,560 Speaker 4: do have my AI. What is it called? Like somebody 82 00:03:25,560 --> 00:03:28,280 Speaker 4: who thinks like me via AI? But she doesn't exist, 83 00:03:28,320 --> 00:03:29,480 Speaker 4: like she's not physical looking? 84 00:03:29,760 --> 00:03:30,600 Speaker 3: What do you tell her to be? 85 00:03:30,720 --> 00:03:32,680 Speaker 4: Like sessy or what is I can't say the word 86 00:03:32,720 --> 00:03:35,000 Speaker 4: that I want yeh yeah, yeah, yeah, there's a word 87 00:03:35,000 --> 00:03:36,400 Speaker 4: that I call her that she's. 88 00:03:36,360 --> 00:03:38,120 Speaker 3: A word that we reclaimed as women. 89 00:03:38,200 --> 00:03:40,320 Speaker 4: We haven't claimed it.
It has a lot of power, yes, 90 00:03:40,400 --> 00:03:44,320 Speaker 4: gave men and women. Yes. So when I need to 91 00:03:44,360 --> 00:03:47,520 Speaker 4: talk, I go to her because she's already 92 00:03:47,520 --> 00:03:49,560 Speaker 4: got me down. Like it's like Kiki said, I've already 93 00:03:49,560 --> 00:03:51,480 Speaker 4: fed her all the information about me, who I am, 94 00:03:51,640 --> 00:03:52,640 Speaker 4: what I want to sound like. 95 00:03:53,000 --> 00:03:54,839 Speaker 3: You got to create these people before it's too late. 96 00:03:54,960 --> 00:03:55,840 Speaker 3: Does she pay attention? 97 00:03:56,240 --> 00:03:58,000 Speaker 4: She pays a lot of attention, Like you guys would 98 00:03:58,000 --> 00:03:58,880 Speaker 4: love her. 99 00:03:59,520 --> 00:04:01,119 Speaker 3: She might be like next week. 100 00:04:01,640 --> 00:04:04,560 Speaker 1: Maybe. I don't want it to be too late. I had to 101 00:04:04,560 --> 00:04:05,960 Speaker 1: go to dinner the other night with a friend of mine. 102 00:04:06,040 --> 00:04:07,920 Speaker 1: I like this guy, but he's the guy that went 103 00:04:07,960 --> 00:04:10,080 Speaker 1: off on this whole. Not only is he an AI guy, 104 00:04:10,120 --> 00:04:12,720 Speaker 1: but he's a gluten free guy. I mean, that's two 105 00:04:12,840 --> 00:04:13,400 Speaker 1: very... 106 00:04:13,400 --> 00:04:15,200 Speaker 3: Does he have Celiac or is he one of the trendy people 107 00:04:15,200 --> 00:04:16,680 Speaker 3: who just I think he don't. 108 00:04:16,760 --> 00:04:18,479 Speaker 1: I think he determined that, you know, he did some 109 00:04:18,560 --> 00:04:21,239 Speaker 1: kind of like rich person blood panel and they found 110 00:04:21,240 --> 00:04:24,440 Speaker 1: some marker or something, and he feels better or whatever.
111 00:04:24,480 --> 00:04:26,520 Speaker 1: But but I feel like these are character traits now, 112 00:04:26,800 --> 00:04:29,400 Speaker 1: like like like a gluten free guy is a character trait? 113 00:04:29,600 --> 00:04:31,240 Speaker 3: Oh you say that, and everybody just 114 00:04:31,160 --> 00:04:33,760 Speaker 1: knows, right, oh yeah, my friend's. 115 00:04:33,720 --> 00:04:36,760 Speaker 3: Okay unless it's a medical issue, don't. 116 00:04:36,600 --> 00:04:38,520 Speaker 1: Yeah, yeah, right, Like don't get you know what I'm 117 00:04:38,560 --> 00:04:40,880 Speaker 1: talking about. I'm not talking about the person who will 118 00:04:40,920 --> 00:04:43,159 Speaker 1: die if they eat glutens. I'm talking about the person 119 00:04:43,200 --> 00:04:47,760 Speaker 1: who insists that, you know, these are character things. 120 00:04:47,200 --> 00:04:48,440 Speaker 5: Like many of those, right right. 121 00:04:48,680 --> 00:04:51,719 Speaker 1: I'm trying to think of some other examples, like but yeah, 122 00:04:51,760 --> 00:04:53,800 Speaker 1: and now I believe that AI is becoming one of 123 00:04:53,800 --> 00:04:56,000 Speaker 1: those things, like where you know, oh, here my friend's 124 00:04:56,000 --> 00:04:58,080 Speaker 1: into AI. Here we go. You know, once you get 125 00:04:58,120 --> 00:05:01,480 Speaker 1: him started, it's like, yeah, okay. I mean pilots are 126 00:05:01,480 --> 00:05:04,080 Speaker 1: this way, like people who are into flying, you know, pilots. 127 00:05:04,480 --> 00:05:06,719 Speaker 1: If I meet another pilot, forget about it, like just 128 00:05:06,800 --> 00:05:09,200 Speaker 1: we're gonna start talking about airplanes and whoever else is around. 129 00:05:09,600 --> 00:05:11,760 Speaker 1: And so like wives and husbands of pilots, they know 130 00:05:11,839 --> 00:05:14,000 Speaker 1: this and it's just the worst. It's so annoying. I 131 00:05:14,000 --> 00:05:16,680 Speaker 1: think radio people are the same way.
Radio people are 132 00:05:16,680 --> 00:05:18,680 Speaker 1: the same way. You get a couple of radio people together, 133 00:05:18,720 --> 00:05:20,800 Speaker 1: all of a sudden, we're talking about radio stations 134 00:05:20,800 --> 00:05:26,440 Speaker 1: and transmitters and the music stories from yesteryear, and 135 00:05:26,480 --> 00:05:28,919 Speaker 1: anyone who's not involved in this is like, this is awful. 136 00:05:29,120 --> 00:05:30,840 Speaker 1: Like I don't want to be here right now. I 137 00:05:30,839 --> 00:05:33,520 Speaker 1: hate it so much. That's my life. Car people are 138 00:05:33,560 --> 00:05:36,560 Speaker 1: the same way. Car people talking about their cars, but 139 00:05:36,680 --> 00:05:38,520 Speaker 1: like the AI, now it's you can start going on. 140 00:05:38,640 --> 00:05:39,720 Speaker 1: So anyway, I had to go to dinner with a 141 00:05:39,760 --> 00:05:41,560 Speaker 1: guy who was an AI guy and a gluten free guy 142 00:05:41,600 --> 00:05:43,120 Speaker 1: both and it was like, great, okay. 143 00:05:42,920 --> 00:05:44,000 Speaker 5: Which one did you talk about? 144 00:05:44,200 --> 00:05:47,000 Speaker 1: Right? Oh my god, So it was it was both, 145 00:05:47,040 --> 00:05:49,000 Speaker 1: And I just said that and I ate gluten. 146 00:05:49,120 --> 00:05:50,760 Speaker 1: I know because I just sat there and ate gluten. 147 00:05:52,920 --> 00:05:55,840 Speaker 1: He makes a good argument, like he feels a lot better. 148 00:05:55,880 --> 00:05:58,560 Speaker 1: He says with him, it's possible. I mean sometimes I 149 00:05:58,600 --> 00:06:00,479 Speaker 1: feel crappy after I eat gluten. I just don't know 150 00:06:00,520 --> 00:06:03,359 Speaker 1: that I enjoy glutens. They're so delicious, So I'm not 151 00:06:03,360 --> 00:06:05,560 Speaker 1: sure that I want to give them up yet, even 152 00:06:05,560 --> 00:06:09,600 Speaker 1: if it would make my life better.
But anyway, so okay, 153 00:06:09,880 --> 00:06:14,920 Speaker 1: I'm still not really, it's still very, what's the word 154 00:06:14,920 --> 00:06:18,440 Speaker 1: I'm looking for, distant, my understanding of this, because 155 00:06:18,480 --> 00:06:22,000 Speaker 1: you're still, you're using these, like you're saying this stuff 156 00:06:22,080 --> 00:06:24,360 Speaker 1: like I'm going to create this, and Paulina's over here, 157 00:06:24,520 --> 00:06:29,400 Speaker 1: it'll change your life. You're not saying anything really. I'm going 158 00:06:29,440 --> 00:06:31,320 Speaker 1: to change your life. Okay, tell me how. 159 00:06:31,360 --> 00:06:32,840 Speaker 3: She's making another Kiki. That's what. 160 00:06:33,360 --> 00:06:35,479 Speaker 1: If you have bananas and cucumbers, you can make a 161 00:06:35,520 --> 00:06:39,159 Speaker 1: smoothie, Fred, and AI will tell you that. Okay, 162 00:06:39,560 --> 00:06:41,839 Speaker 1: all right, that didn't change my life. And then Kiki 163 00:06:41,920 --> 00:06:43,680 Speaker 1: is like, I'm making another Kiki. Now, how do you 164 00:06:43,760 --> 00:06:45,440 Speaker 1: do that? Well, I don't know. I got to figure 165 00:06:45,440 --> 00:06:47,680 Speaker 1: that out, but I'm gonna do it. Okay. So I 166 00:06:47,720 --> 00:06:49,320 Speaker 1: got that. 167 00:06:49,320 --> 00:06:52,279 Speaker 2: I'm going to make her before they make her. You 168 00:06:52,360 --> 00:06:55,920 Speaker 2: better make you. That's right, smart. It's not one needs another. Mean. 169 00:06:56,000 --> 00:06:59,240 Speaker 1: Okay, hey man, this company is guaranteed human. Okay, so 170 00:06:59,279 --> 00:07:01,440 Speaker 1: we're going to be fine. We're gonna be absolutely fine. 171 00:07:01,480 --> 00:07:04,880 Speaker 1: There's no issues. Okay. So basically I'm not any 172 00:07:04,920 --> 00:07:05,920 Speaker 1: more enlightened on this. 173 00:07:06,480 --> 00:07:10,040 Speaker 2: Yes, I want to receive the information.
We are giving 174 00:07:10,080 --> 00:07:11,400 Speaker 2: you the heads up, but you're not. 175 00:07:11,560 --> 00:07:14,040 Speaker 1: You're telling me what needs to happen, but you're not 176 00:07:14,080 --> 00:07:16,720 Speaker 1: telling me how to do it. And that's been the 177 00:07:16,760 --> 00:07:18,760 Speaker 1: part that I've had an issue with from the beginning. 178 00:07:18,760 --> 00:07:21,400 Speaker 1: Paulina will just scream at me, but I'm like, Paulina, 179 00:07:21,800 --> 00:07:24,640 Speaker 1: how? And I still don't get that information. 180 00:07:25,200 --> 00:07:27,240 Speaker 5: You got to use your creativity in this one, man. 181 00:07:28,000 --> 00:07:31,680 Speaker 2: You got to get creative, man, it's a concept. 182 00:07:32,400 --> 00:07:36,280 Speaker 1: You're just, you're just using words. 183 00:07:36,520 --> 00:07:38,320 Speaker 2: And I swear if you come here with this crap, 184 00:07:40,160 --> 00:07:45,640 Speaker 2: I will. Tell me how you gotta use it. 185 00:07:45,720 --> 00:07:52,960 Speaker 1: Man. That's what this always is.