Speaker 1: Yuck, digital? No thanks. It's one more thing I'm strong.
Speaker 2: Indeed. Before we get to "Yuck, digital? No thanks," I get way too many phishing texts. Drives me crazy. That's why it makes me angry when I go to a restaurant and it's, "How many? Two? Inside or outside?" Doesn't matter to me. "Okay, can I have your phone number?"
Speaker 1: No! Why do you need my phone number?
Speaker 2: For me to eat at this restaurant? Because if I give it to you, I'm gonna get all these damn texts, because you're selling it to people. I know you are. Drives me nuts. Anyway, I got one of these texts the other day.
Speaker 1: The thing is, it's probably reached critical mass by now, and there are so many freaking companies and Russian mobsters who have my phone number, I might as well just give it out to everybody. I ought to post it on my roof so airplanes going overhead can see it.
Speaker 2: I got this one yesterday, though. Usually it's pretty obviously a scam of some sort, but this one was pretty good.
Speaker 2: "If I had to choose my friend all over again, I'd still choose you every single time." From a random phone number. And they're just hoping that I'm a lonely person that thinks, wow... Is there any link attached? No, it's just a phone number I could text back. But I assume, if I do it, well, I assume that alerts them that this is a live number, and then I just get endless...
Speaker 1: Katie, Katie, too soon. He's still bouncing back from the affair with the Ukrainian, that's right, yeah, scammer. That relationship ending.
Speaker 2: You think I should have pursued this?
Speaker 3: Hell yeah. Every time I get one, and I get them all the time, I'll get these texts that say, "Hey, Ashley, are we still on for lunch later?" And I always go, "You have the wrong number. Sorry." "Oh, I do? I'm so sorry to have bothered you." And then I'm like, okay, I know what this is. "You didn't bother me. It's all good. So now that we're talking, what do you like to do for fun?"
Speaker 3: And, oh, I just rope these people in for, if I have time, as long as I can, because that means they're not doing it to somebody else.
Speaker 2: Well, have you ever seen Theo Von's stuff on that? The podcast? The comedian? He's got a whole website of it. He is so good at it. It is hilarious.
Speaker 1: Oh yeah?
Speaker 2: "I'm here at the hospital. Thanks for getting a hold of me. I just had my legs removed and I'm really needing someone to talk to."
Speaker 3: Oh yeah. And they'll keep going until you use the word.
Speaker 2: "Scammer," and then they stop. Yeah. Oh, it's so fun. Wow. Yeah. Okay, I'll text this person back, and then tomorrow we'll check in and see. How should I respond to this person who said, "If I had to choose my friend all over again, I'd still choose you every single time"?
Speaker 3: "I feel the same way about you."
Speaker 2: I was going to go with some version of, "You betrayed me, and you are now my mortal enemy. Where do you live?"
Speaker 1: I like it. Oh, I like that even better.
Speaker 1: See, I'm probably technically, digitally naive. I don't want to respond in any way, so they don't know it's a live phone number and it landed. But they can tell that anyway.
Speaker 3: Yeah. I send read receipts, so if I open it, it shows, on both Android and Apple, that I...
Speaker 1: Don't.
Speaker 2: I don't. I want plausible deniability, to say...
Speaker 1: It's always plausible deniability in everything you do, kids. Here's a tip from your old uncles Joe and Jack.
Speaker 3: And then if you don't send the read receipt, it...
Speaker 1: It says "delivered," so the scammers would know, okay, that is a phone. You think? Okay, Joe, I might as well engage them and mess with them, because that would delight me.
Speaker 3: Yeah, as long as you don't click any links, you're fine.
Speaker 1: I was made for that. So, my introduction: digital yuck. Essentially, it's about this piece we referenced very briefly during the radio show slash podcast earlier, by Ted Gioia, on how being human is cool as AI saturates everyday life.
Speaker 1: People are seeking refuge in flesh-and-blood alternatives, from bookstore signings to vinyl record sales. There are a bunch of different examples of this. I don't care if this has reached any sort of critical mass. I have given up on quote-unquote humanity. But if individuals, or small groups of people, or my friends, or my family can do the right, healthy thing, that's good enough for me.
Speaker 2: You've given up on humanity? That's an interesting place to be.
Speaker 1: Oh no, I never had any particular hope for humanity before. That's one of the reasons I'm a conservative. That's why you have to have incentives and disincentives to have an orderly society, because you're not going to reform humanity, and all of a sudden human nature is going to change after a million years. It'll never, ever happen. Anyway, so here's this bookstore in Alabama. It's gotten a fair amount of coverage in the national news. It's a small, almost windowless business in Birmingham, Alabama, and it doesn't look like much either. But their thing is, uh...
Speaker 1: While many indie bookstores barely survive, the Alabama Booksmith is flourishing. But it has a crazy strategy that draws customers who bypass stores in their own cities to purchase from a distant retailer. Here's the secret: every book in the store is signed by the author. The store owner wants the author to travel to Alabama to sign copies in the store. He likes to see the human author in the flesh. In an age of AI slop and books on Amazon, this is the ultimate verification of authorship, blah blah blah. The Alabama Booksmith can guarantee sales of several hundred copies of a new book, and maybe more, and that's enough to convince authors to fit Alabama in on their book tours. And here's something for you: the Alabama Booksmith rarely charges a premium for the signed copies. "Our books don't cost more," the owner told The New Yorker, "but they're worth more." Interesting. Customers repay them with their loyalty. The store does not solicit business.
Speaker 1: It now has approximately five thousand customers on its email list, and so many people travel from out of town to visit the store that it has negotiated a discounted rate with a local hotel. Isn't that crazy? Then Ted writes about the same dynamics fueling the vinyl revival in music. Musicians sell these at gigs. Many do it themselves; they man the merch table. I've actually seen this.
Speaker 2: Oh, cool.
Speaker 1: I bought some CDs, but I liked the idea of vinyl even... well, I had a lovely, lovely conversation with a songwriter. I thought it was terrific. Anyway, you transact directly with fans here too. A real human does something no AI bot can replace. Everybody enjoys it. And then Ted writes: "I speak with authority as someone who's been on both sides of the transaction. I've bought directly from musicians, relishing the opportunity to chat for a few seconds with the person behind the album. And I've peddled my own works at public events, welcoming my chance to do the three S's: sell, sign, and schmooze." Let's see... You see the same thing...
Speaker 1: Oh, go ahead. Yeah, jump in anytime. I'm sorry, there's a lot here and I tend to go kind of quickly. But there's a new secret strategy in the arts, and it's built on the simplest thing you can imagine, namely existing as a human being. Here's Adweek, which I used to read all the time: you see the same thing in media right now, where live streaming is taking off. For viewers, live streaming offers a refuge from the growing glut of AI-generated content on their feeds. And I would also suggest it's a contrast with, like, the super-edited, insanely fast-paced stuff. They write: in a social media landscape where the difference between real and artificial has grown nearly imperceptible, the unmistakable humanity of real-time video is a refreshing draw. And then they get into it. I think Amazon shut down its Fresh and Go stores, whatever they're called, where you could buy groceries without dealing with any checkout clerks, and people didn't want this.
Speaker 1: I know there's a divide on the show over this, and maybe I'm in a minority. I'd rather do the self-checkout, but I know you don't like it.
Speaker 2: No, I love self-checkout. I just don't like hassling with the computer thing, because I don't know how to handle three bananas.
Speaker 1: Oh, wow. So your inner misanthrope is in conflict with your inner Luddite.
Speaker 2: Yes, absolutely. That must be terrible. And the inner "why am I doing your job?"
Speaker 1: Yeah?
Speaker 2: But then, you know, grapes. I don't know, they don't have a bar code on them, for frig's sake.
Speaker 1: I think it's fun to figure it out. That's weird. I mean, I'm weird. I know I'm weird. Uh, "as AI customer service becomes more pervasive..." Oh my God. Oh my God. Judy and I were trying to rebalance what we have in our 401(k). You've got a little of this fund, a little of that fund, a little less of that fund. And we talked to our angel guru, and he said, yeah, you're super heavily weighted in this.
Speaker 1: "I'd sell some of this, Bubba." And we thought, all right, no problem, I think I'll do it on the website. How hard can it be? Harder than getting a camel through the eye of a needle, to paraphrase the Bible, as it turns out. And then I tried the effing chatbot that was going to help me. I might as well have asked my dog: "Hey, Baxter. Baxter, how do you figure I sell some of this and buy some of the other thing? Baxter, listen to me." A complete waste of time. So we finally used the phone to call a human being, and he helped us brilliantly in about five minutes.
Speaker 2: Well, congratulations on getting a hold of a human being. Oh my God, I was trying to do a thing with a bank yesterday and could not figure out how to get to a human. Just couldn't do it.
Speaker 1: Yeah. Okay, so I interrupted myself, but I like this couple of sentences: as AI customer service becomes more pervasive, the luxury brands will survive by offering this human touch. I'm now encountering this term.
Speaker 1: "Concierge service" is a marketing angle in the digital age, and it's a fancy way to say "I can speak to a human."
Speaker 2: If customer service makes a comeback, I'll be so happy.
Speaker 1: Oh yeah, yeah. It might cost you, though. It's like, you know, at this point in my career I own a pretty nice car, and where I go for service is different than if you own a pretty cheap car. They take care of you. It's humans at every step, and they want you to be happy. I'd love to see that penetrate, you know, further down the market, like it always used to be. I get efficiencies, I get it.
Speaker 3: I'd like to see it go down to the peasants.
Speaker 1: I would like that for you peasants.
Speaker 2: That's how I heard it too. Hey, I bought some jeans the other day, brand-name jeans, and they sent me an email saying, "We want to make sure you get the right size so you don't have to return them." And they explained to me: take your favorite pair of jeans, measure them this way, and see if they're the right size.
Speaker 2: And I did, and so they sent them to me and they fit perfectly, as opposed to, like, half the time when I order stuff online it doesn't fit, and then you're supposed to send it back, but I never get around to it. So I have stacks and stacks of clothes that don't fit.
Speaker 1: How about your local thrift stores? Oh yeah. Yeah. So this is interesting: even tech companies are figuring this out. Spotify now boasts that it has human curators, not just cold algorithms. Cool. Apple Music says human curation is more important than ever.
Speaker 2: Wow. I would think in that world that maybe the AI would be better than a human, but maybe not.
Speaker 1: Huh. I don't know, I haven't paid attention lately. I remember early on thinking they sucked, the algorithms. I don't want to go off into musical explorations, but they're just terrible. I'd listen to something really quirky and esoteric from a certain era, and it would give me the big smash hits from that era, and I'm like, no, I'm a hipster. How do you not get that I'm a hipster?
Speaker 3: Spotify has gotten really good with that.
Speaker 3: If you listen to something, it's more exploring things that are kind of in that realm, rather than just throwing the, you know, the hits at you.
Speaker 1: That's what I want.
Speaker 2: But what I hate about Spotify is having to search for the hot hits. So if I type in Fleetwood Mac and you have your albums listed, you don't have Rumours. I've got to type that in by title to get it. If I'm listening to Fleetwood Mac, you've got all these other live albums and recent albums nobody wants; you can't give me the biggest hit they ever had? And that happens over and over and over again. Drives me nuts. I almost don't use Spotify because of that.
Speaker 1: Interesting. Yeah, I'm not familiar with it. I mostly listen to Apple Music. Interestingly, Bandcamp is organizing Bandcamp clubs, where members get special music selections, listening parties, and other perks from human curators. Then they, like, talk to each other about whether they liked the song and stuff. Imagine that. Let's see.
Speaker 1: Recently, the streamer Qobuz... somebody? anybody?... Q-o-b-u-z, yeah, announced the launch of a proprietary AI-detection tool. The company published a set of principles that include this paragraph, quote: "Our conviction: AI can be a valuable amplifier, but never a substitute for human judgment," for their human curation, that sort of thing.
Speaker 2: Qobuz, or whatever you said. That could be nobody, or it could have eighty million followers on some platform and I don't know them.
Speaker 3: It's a music streaming service that's considered better than Spotify for sound quality.
Speaker 2: Okay, I like that too. I like that. Got my attention.
Speaker 1: Yeah. I've got to send this to my son, who, as I've said several times, would be an editor at Rolling Stone if he had been born thirty years sooner. But there's so much content everywhere, you can't make any money from it. So I thought this was interesting. Final note from this, blah blah: the paradox of living in a digital age is that human beings have more prestige than ever, and they get it just by showing up. Just being a human is so attractive in the digital world. This won't change.
Speaker 1: In fact, the Silicon Valley elites forcing tech down our throats will only make us hate cold, sterile tech more than ever. They have me. And they won't fix that problem by training AI to pretend to be human. That just adds insult to injury. Oh, so true. Yeah. This might even be the hot new career path, ready-made for curators, concierges, caregivers, conversationalists, and other people who love being around people. As the old pop song anticipated, they might just end up being the happiest people of them all. So welcome to the lovely new economy where humans actually matter. Go ahead, try it out. Be cool. Be a human. All the bots and bot-dom will never be able to take that away from you. Well...
Speaker 2: I don't particularly like humans, but, uh, since all the AI customer service stuff sucks, I prefer a human to that.
Speaker 1: Yeah, because it's just horrible. I like some humans. It's just a question of how many and how much.
Speaker 2: There's like five on planet Earth. The rest of y'all can pound sand. What if they're friendly?
Speaker 1: Do they annoy you, or...
Speaker 2: Good looks?
Speaker 1: Sincerely friendly, or, like, clearly friendly for a purpose?
Speaker 2: I just, again, we were talking about this earlier in the show, the idea that AI is going to take all our jobs. Okay, I'll believe it, you're a genius, but it sure ain't close yet. I mean, you'd think customer service would be better than it is. You can call a multi-billion-dollar bank, and their AI customer service thing is shit. I mean, it's just shit. Whatever question you have: "I'm sorry, did you say accounts?" No, I didn't say accounts. It's just... I can't believe it's not better than it is. How's it going to replace every white-collar job in America if they haven't gotten better than this so far?
Speaker 1: And you speak more distinctly than the average human being. What about your average marble-
Speaker 2: mouthed half-wit? Boomhauer calls in, needs to do something with his checking.
Speaker 1: [garbled Boomhauer impression] ...checking account...
Speaker 2: ...little interest rates.
Speaker 1: I like In-N-Out for their customer service. I actually think their customer service...
Speaker 2: High school kids standing out in the parking lot, in the long line, taking your order.
Speaker 1: I like it.
Speaker 3: Yep.
Speaker 1: Nothing I love better than the old In-N-Out, Michael.
Speaker 2: Huh. Well, I guess that's it. With pickles.