Welcome to Worst Year Ever, a production of iHeartRadio.

Welcome, welcome to the Worst Year Ever podcast. My name is Katy Stoll.

And I'm Robert Evans. And in case you guys didn't hear that, this is going to be the worst year ever, because Cody is continuing to do his time machine noises. Uh, and they are grotesque.

I was told that there were some mixed reviews -- uh, it didn't go over well with some people -- and now I'm doubling down. That's where we are.

This is your fault, you guys. You've encouraged it. I forgave O. J. Simpson when he came back on Twitter with that series of charming videos, but I can't forgive this?

Interesting. Interesting forgiveness mechanism you have. Wait, so when he posted -- when he posted the videos saying, like, "I got a lot of" -- what did he say? Like, "I got a lot of scores to settle," or something like that -- that garnered your forgiveness?

When I was a child and I would do something wrong, or when, you know, my parents would do something wrong --
I was raised to believe that when you apologize, you pose with a set of golf clubs on a golf course and talk about all the scores you have to settle, and that is the truest way to express contrition. So I can assume that O. J. Simpson meant to express his deep sorrow over his past actions.

And, uh, yeah. Okay, well then I apologize for the time travel noise, and I promise to get revenge on you by doing it.

I don't believe it without the golf course.

We can edit this out, but can I read it? It's really -- the review says that time machine noise was all it took: "back to Rachel Maddow." Which, I was like, you know what, cut that part out, Daniel. By the way --

Oh no, no. I say we leave it in. The world needs to know I'm deradicalizing people with my time machine noises.

I'm imagining that person was, like, on their way to bomb a bank, with, like, a mask on and stuff, and they heard your time machine noises and said, "You know what? I'm going -- Joe, we gotta get rid of -- we gotta get rid of Drumpf." Turns him into a moderate.

Cool.
Well, today, guys, we're gathered here to discuss not Joe Biden, not Rachel Maddow, not time machine noises, but Andrew Yang: the meme king, the unabashed lover of SNL.

He loves SNL.

Well, he was tweeting about how much he liked this last week's episode, or something like that, and, you know, he wanted to sit down with Shane Gillis. I don't know, guys. This is going to be a different kind of episode, because Andrew Yang has never been a politician. Uh, so we're going to be digging in, right? There's not a lot of record, and we're talking about his life, his career. Are you ready for this?

I am so ready.

Everyone seems pumped. I'm gonna need to take the energy down a little. Okay, okay, let me try this again. Katy, I am -- I am so into Andrew Yang that whenever I read studies on the efficacy of universal basic income, the crutch of my pants explodes.

Okay, that's the kind of setup -- like, we are so on. We're on fire right now.

Yeah. Like, it's "crotch," geez.

My crotch.
When I even think about Andrew Yang's policy statements --

This is going to be a very uncomfortable episode for you, then.

I can't wear corduroy anymore. Too many third-degree burns.

Just take off your pants now, then.

Okay, I'm pantsless because of Andrew Yang -- which, interestingly enough, is his new slogan going into 2020. He doesn't need any more memes, but he's got one that'll -- that'll do.

That'll do, pig. That'll do.

Andrew Yang, leader of the Yang Gang, self-described serial entrepreneur, was born in Schenectady, New York, to Taiwanese immigrants. Growing up, Yang was bullied for his race.

That sucks.

Both of his parents went to grad school at Berkeley, and Yang attended Brown as an undergrad before getting his law degree from Columbia. Uh, he quickly got a job as a corporate attorney at Davis Polk & Wardwell, which is an international law firm in New York City, but that did not last long. Apparently he left after only five months, because he did not feel that was the right path for him.

It's interesting -- that's real quick, after putting in all that time and money, and then immediately --

And that's fine, that's fine.
I kind of respect that a lot. It's hard -- especially when you put in, like -- it's hard to quit a job. But, like, when you've really put the time in on the back end, and you realize, like, "I hate this"?

No, I mean, like I say, it's interesting, but you're right. It's also like, who wants to be a corporate attorney?

A lot of people.

Yeah. Even, like, I've got friends who went to law school because they were, like, told they should, or, like, there was a lot of pressure -- "I guess I'll go to law school." And now they don't do anything with it, because they're like, "I don't care for it."

So, after leaving his law job, he started a nonprofit called Stargiving, which was a website which connected people to celebrity charities, making small donations each time a visitor clicked a button. The site would enter visitors into raffles to meet early-2000s stars like Magic Johnson and Hootie and the Blowfish. But despite the draw of those magnetic personalities, the company only lasted a couple of years. Uh, like, 2000 -- that's good, that's towards the end of the era.
There's a correlation here to why his company didn't last, and I don't think it's just the dot-com bubble bursting.

Probably -- well, that's what they would point to. I think it's -- it's "Who He Blew All His Fish."

Um -- hey, yes, that's the late-2000s reboot of Hootie and the Blowfish: "He Blew All of It."

Perfect.

Uh, he briefly sold Cutco knives to make ends meet before --

I was hoping that would get a little -- should we, should we give our listeners the heads-up? Because I'm going to guess about fifty percent of them have gotten slightly conned by Cutco in the past.

I imagine people are saying, "Why did Robert just burst out laughing?" Do you want to give us some background on Cutco? Do you have, like, a personal relationship with Cutco knives?

I have a little bit of one. I have a little bit of one.
When I was nineteen years old and looking for a job, I saw an ad up in my college for a fifteen-dollar-an-hour -- which is a pretty good wage back in, like, 2007 -- um, job as a -- I think it advertised it as, like, working as a secretary for a Cutco knives endeavor. And I had to, like, come to -- they wanted me to come to this orientation seminar to see if I could. And I called them and stuff, and, like, "Oh yeah, you sound perfect for the job. Come to this orientation seminar and, like, we'll get everything started." And so I was, like, really excited, because I needed money. And I, like, told my dad, and he, like, looked at it and said, "Do you know what Cutco knives are?" And I was like, "No." And he says, "Okay, well, it's a pyramid scheme based around selling kitchen knives for way too much money, and there's no job for you. They're going to try to get you to buy a bunch of kitchen knives." And then I looked it up online, and I found, like, dozens and dozens -- more than that -- of, like, people's reports.
It is like -- they get you into a room and they try to sell you a bunch of overpriced knives that you can then go do knife demonstrations with, about how --

Well, see, that might have been a good job for you, though, Robert. You love knives.

No, because they're bad knives. I would love to sell good knives. When I get tired of being heartbroken that my friends in Syria are being betrayed by the US government, um, I will absolutely sell quality handmade knives to people by hacking through pieces of wood and old steak gristle. That sounds like a great way to spend my life.

Feels like -- Cutco knives are bad. It feels like a good time to put out there, to anybody that's selling nice knives, that we're looking for knife sponsors.

We're looking for a knife sponsor. Uh, Gerber, hit us up. You know -- Worst Year Cutlery. Worst Year Cutlery.

Anyway, I will sharpen pieces of metal on the side of a trash can in the alley near my house. We're, you know, we're taking all comers.

All right.
After his brief stint as a Cutco knives salesman, uh, he became a part-time tutor for Manhattan Prep, a tutoring company in New York. He then began tutoring full time, and actually writing the questions for the tests, uh, and was appointed CEO in 2006 after the CEO left. Um, the company's hook was that it paid tutors a hundred dollars an hour, which is like four times the market rate, and they were very selective about which tutors they accepted. Um, but then in 2008, when the recession was peaking, Yang started to get worried that his company would be hit hard. But it turned out the opposite was true. With the loss of all these jobs, uh, they actually saw a dramatic spike, both in people going back to business school and enrolling in GMAT courses, as well as tutors looking for employment, and the company became very profitable very quickly. Um, and it seems that he was a very affable, likable boss. I included this little quote to give some perspective. Quote:
"After joining the company, Yang's former employees say they found a chill startup vibe and a boss with a whimsical presence in the office. Yang constantly sang musical narrations of what he was doing, like sending emails, and would strike a gong on Friday evenings to tell everyone to go home." That's from Slate.

Sounds like Michael Scott a little bit.

Michael -- yeah, yeah. Yes.

People liked that?

People seem to like it, I guess. Or just noted it. I don't know.

Right, and it starts to read like -- it could be like, "Oh yeah, they found, like, a 'chill startup vibe' and a boss with, like, a 'whimsical presence' in the office. He hit a gong every --" Like, the tone can shift. I chose to read it as, yeah, you know, additive. But, you know, a little insight into the kind of boss that he is. Um. Interestingly, Yang does have mixed feelings about standardized testing. That's probably because of his time in the standardized-test world. Um, but he believes that it contributes to the myth of meritocracy, and he criticizes it in his book The War on Normal People:
"Most of success today is about how good you are at certain tests and what kind of family background you have. Intellect, as narrowly defined by academics and test scores, is now the proxy for human worth."

Yeah, I agree with him. He's right on the money with that one.

Yeah. He also told Slate that the fact that so many kids have their ambition shaped by their performance on these tests is unfortunate.

Right. Oh yeah, it sticks with you. Um, I'm on board with all that.

Yeah. Yeah. So anyway, Manhattan Prep was sold to Kaplan, the tutoring behemoth, in 2009, which earned Yang millions of dollars. Uh, he went on the Freakonomics podcast earlier this year, after, you know, running for office, and said that "we were acquired for low tens of millions, so I walked away with some number in the millions." Um, and I'm bringing that up because obviously that was probably a very exciting sale for him, and it made him a wealthy man. But it's very different than the picture that I had of him as, like, a tech billionaire.
I don't know where I got that idea from, but everyone's like, you know, "tech billionaire guy." But, you know, he's actually -- it's a much more modest sum than that. Um, and -- yeah, like, what he's done, and then, like, the company --

That's a lot of gongs, Katy.

It's a lot of gongs. You can buy a lot more gongs for millions of dollars.

Or you can buy one nice -- I was gonna say a big one, but a nice one, sure.

I mean, the size doesn't necessarily matter with gongs. It's, like, you know, the resonance, the type of metal.

I would say a big one resonates. You're right -- size is not the --

The possibility of him being our first gong president really, actually does make me consider voting for him. Just, at five o'clock Eastern Standard Time every day: presidential gong.

Yeah. We could just -- we could replace -- I don't know, what's one of the states we don't need? Um, Oklahoma -- with a gong. Uh, like a giant -- or, no, it needs to be more central. Kansas. Kansas -- we'll make Kansas one big gong, and we can just ring out the day every night.

It sounds lovely. Sorry, Kansas.

Yeah. Um.
After the sale of Manhattan Prep, Yang briefly worked as a health tech entrepreneur -- couldn't find the name of that company, but that was very brief -- before starting Venture for America in 2011. The company's stated goal is to find the next generation of entrepreneurs by providing a two-year fellowship to recent college graduates and pairing them with startups. Um, according to their website, quote: "Fellows learn important startup skills at our month-long training camp, apply for jobs within our vetted company network, and work for two years as full-time salaried employees in one of fourteen cities. When fellows are ready to start a company, be it two years after college or ten, VFA has the resources," um, "to help make that dream a reality." And I actually think that's kind of interesting -- um, an interesting idea. Uh, circling back to what I was just saying about his money and income: even there, um, he isn't earning an astronomical amount of money. In 2016, as CEO, he earned two hundred thousand dollars, uh, when the company's revenue was almost at seven million.
So, again: wealthy and successful, but by no means, like, a hoarding-his-wealth billionaire.

It seems like also fair compensation for the CEO of a company of that size.

It does. Not, like, a Musk-style tech billionaire. And ain't all that just to say, like, it seems like a generally okay dude, you know? Um. And while he was there at Venture for America, that's when Yang's metamorphosis from regular old entrepreneur to tech-savvy politician started to take root. It is kind of like his origin story, if you will. Uh, while running VFA, Yang traveled across the Midwest and saw the problems that automation has caused, and he realized that automation was changing how people function in the economy. So he started thinking about the benefits of universal basic income. Now, obviously UBI is not a new idea -- there have been experiments with it all over the world. Um, but it certainly is a new concept for many Americans. Uh, and this is kind of the central tenet of his candidacy: he believes that automation will eliminate jobs and make basic income necessary. Um.
That, and economic inequality can be addressed with more entrepreneurship, and that universal basic income will help people to become entrepreneurs. And we're going to talk more about that in just a minute. We're going to circle back, because there's a lot to unpack here; I'm just getting through his backstory first. Um. But then, after that, Yang stepped down as CEO of Venture for America and filed with the FEC to run for president as a Democrat -- and also unknowingly added his name to our list of people that are helping make this the worst year ever.

Uh, yeah, okay. Can't wait.

Cool. Uh, yeah. So, Yang the politician. Like I mentioned up top, he's different than all of our other candidates, because Yang has never held an elected office. He's never even run for office.

Hmm -- sounds like our current president.

Sure, sure -- well, except that at this point, he has run and won. Well, also, at that point, when Trump was running, he had -- he had run before, um, not successfully. But so, this guy's a little different. Um.
But he qualified for the debates by getting one percent in the polls -- he's now at three percent -- uh, and getting a minimum of sixty-five thousand donors. Yang believes that the reason Trump is president is, quote, "we automated away four million manufacturing jobs in Michigan, Ohio, Iowa, and all the swing states he needed to win." I'm not sure where he got that four million number. Predictions about how many jobs will be lost to automation are kind of all over the place, uh, but Oxford studies suggest that forty-seven percent of all US jobs are at risk of being automated in the coming decades. Um. So, that brings us back to universal basic income. Um, Yang's plan is a proposed UBI of a thousand dollars a month for every American, um, and it's also known as his Freedom Dividend. So: a thousand dollars a month, or twelve thousand dollars a year, to all US citizens over the age of eighteen. His website claims it will be paid for by a value-added tax of ten percent and by consolidating it with existing welfare programs.
So, basically, people would need to choose between keeping their welfare -- their food stamps, or what have you -- or getting UBI. Not both. And that's a problem, to me. And I know that you have thoughts about that.

Well, he's talked about it on The Rubin Report specifically. I know, like, sort of -- like, that's kind of the end goal: get rid of these welfare programs and just have this one thing. My issue with it is, A, it's called a Freedom Dividend, and I think we need to sort of collectively reevaluate what freedom means to us, and, like, what is living free? How do you live free? Um? And we know that, like, there are these basic needs that everybody has to survive, and it's shelter, and it's food. Um, community is important. But, like, the idea that you have this Freedom Dividend and it's meant to sort of free people up to live in a society where jobs are disappearing -- like, truck driving is the number one job in America, and that will eventually be gone.
But if you have a thousand dollars a month, and you're calling it a Freedom Dividend, and it's for people -- it's for everybody -- then you need to pair that with: you are able to survive on a thousand dollars a month. You can have -- you can get shelter, you can get food, and you can get clean water, and get all these things, if you have a thousand dollars a month. Right? Like, a thousand dollars a month, as is, doesn't really help people, especially if you're taking away their other benefits. So it would be helpful to people if they were able to get, like -- if we had functioning food stamps, if we had, um, a universal health care system, if his plan didn't involve, like, kind of the gutting and reduction of, like, food stamps or Medicaid --

Yeah, he's for Medicare for All. He's -- yeah, he's on board with that now.

But, like, it doesn't work just on its own, and it doesn't work if you're also cutting these other social services that people need. Because, like, um, you know, a thousand dollars a month isn't nothing.
It 338 00:19:20,760 --> 00:19:23,360 Speaker 1: could significantly help people, but if you're also cutting their 339 00:19:23,400 --> 00:19:26,760 Speaker 1: access to other benefits, then, like, you're kind of just 340 00:19:27,200 --> 00:19:31,800 Speaker 1: robbing Peter to pay... also Peter. Yeah, Peter and Peter. 341 00:19:32,000 --> 00:19:35,400 Speaker 1: Peter's still fucked. It just doesn't go very far. And 342 00:19:35,440 --> 00:19:37,439 Speaker 1: then you're getting people... and he says it's to 343 00:19:37,560 --> 00:19:42,760 Speaker 1: encourage entrepreneurship. Like, that's what we need in America, more entrepreneurship. 344 00:19:43,119 --> 00:19:45,399 Speaker 1: And like, I understand the idea of wanting to promote 345 00:19:45,400 --> 00:19:48,720 Speaker 1: and encourage growth and progress and all of that, but 346 00:19:49,680 --> 00:19:52,080 Speaker 1: really what that does to me is that it continues to, 347 00:19:52,880 --> 00:19:58,040 Speaker 1: um, uh, exacerbate the income inequality gap. So, like, people 348 00:19:58,080 --> 00:20:00,880 Speaker 1: that have more money will be getting this thousand, 349 00:20:00,960 --> 00:20:02,439 Speaker 1: like maybe they will have more money to play with 350 00:20:02,480 --> 00:20:05,919 Speaker 1: to encourage entrepreneurship. People that need money are still going 351 00:20:05,960 --> 00:20:07,800 Speaker 1: to be needing to spend that on their food and 352 00:20:07,840 --> 00:20:09,919 Speaker 1: their basic living expenses. Right, and then two or three 353 00:20:09,920 --> 00:20:11,720 Speaker 1: weeks go by and they're like, well, that Freedom Dividend 354 00:20:11,800 --> 00:20:13,520 Speaker 1: is gone. And I still... he did the math on 355 00:20:13,560 --> 00:20:15,400 Speaker 1: the way over here; that comes out to like six 356 00:20:15,440 --> 00:20:17,680 Speaker 1: dollars an hour for like forty hours a week.
It's 357 00:20:17,680 --> 00:20:21,119 Speaker 1: like something like that, which again, it's like, I don't know. 358 00:20:21,520 --> 00:20:23,480 Speaker 1: If you're giving people a thousand dollars a month and you're like, 359 00:20:23,640 --> 00:20:26,359 Speaker 1: now this is good, then make sure that it can 360 00:20:26,440 --> 00:20:29,000 Speaker 1: help people survive, even if they're, like, if 361 00:20:29,040 --> 00:20:31,600 Speaker 1: they're on disability, or they just lost 362 00:20:31,600 --> 00:20:33,920 Speaker 1: their job, or they have a medical emergency or something. Well, 363 00:20:34,000 --> 00:20:36,479 Speaker 1: at least I can pay rent and food with this 364 00:20:36,560 --> 00:20:39,560 Speaker 1: Freedom Dividend, right? Um. I think one of the 365 00:20:39,640 --> 00:20:42,800 Speaker 1: just one of the, like, most insidious and 366 00:20:42,880 --> 00:20:46,680 Speaker 1: clever things that the Founding Fathers did was not give 367 00:20:46,760 --> 00:20:49,320 Speaker 1: us a UBI. Well, the phrase "life, liberty 368 00:20:49,320 --> 00:20:51,639 Speaker 1: and the pursuit of happiness," the pursuit of happiness was 369 00:20:52,040 --> 00:20:55,040 Speaker 1: not the original phrasing. It was life, liberty and property. 370 00:20:55,920 --> 00:20:59,560 Speaker 1: But they were like, oh, yeah, you can't guarantee property, 371 00:20:59,560 --> 00:21:03,040 Speaker 1: and we're gonna change it to pursuit of happiness and 372 00:21:03,080 --> 00:21:06,320 Speaker 1: this sort of vague idea. Um. And I don't know. 373 00:21:06,400 --> 00:21:10,040 Speaker 1: I just think it's... yeah. And it's just... 374 00:21:10,119 --> 00:21:12,920 Speaker 1: I don't think that UBI is... I think 375 00:21:12,920 --> 00:21:15,840 Speaker 1: it's an important conversation and I'm glad that 376 00:21:15,880 --> 00:21:19,080 Speaker 1: we're having it.
I just think this specifically is the 377 00:21:19,119 --> 00:21:23,359 Speaker 1: wrong plan, and I don't want that to 378 00:21:23,720 --> 00:21:26,960 Speaker 1: detract from people's acceptance of the idea that we might 379 00:21:27,040 --> 00:21:29,160 Speaker 1: need to keep having this conversation in the future as 380 00:21:29,320 --> 00:21:32,120 Speaker 1: this is increasing. Do you know what I mean? Um, 381 00:21:32,320 --> 00:21:36,560 Speaker 1: that's kind of my perspective on your Freedom Dividend. It's 382 00:21:36,600 --> 00:21:39,800 Speaker 1: not a terrible idea when you're not using it 383 00:21:39,800 --> 00:21:43,560 Speaker 1: to screw over people on food stamps, or, uh, 384 00:21:43,840 --> 00:21:47,600 Speaker 1: when it's not collaterally doing that. And then Yang made waves in 385 00:21:47,640 --> 00:21:49,960 Speaker 1: September with a stunt that he did at the Democratic 386 00:21:50,000 --> 00:21:51,760 Speaker 1: Debate, when he announced that he will give a 387 00:21:51,800 --> 00:21:55,600 Speaker 1: Freedom Dividend to ten families: a thousand dollars a month for 388 00:21:55,680 --> 00:21:58,440 Speaker 1: a year for each family. Um, the funds are coming 389 00:21:58,440 --> 00:22:00,439 Speaker 1: from his campaign and will pay out even if Yang does 390 00:22:00,480 --> 00:22:04,840 Speaker 1: not become president or the nominee. But he's been 391 00:22:04,840 --> 00:22:09,480 Speaker 1: accused of bribing voters with this, uh, of doing something illegal. 392 00:22:09,800 --> 00:22:14,920 Speaker 1: But PolitiFact reached out to FEC spokesman, spokeswoman, 393 00:22:15,040 --> 00:22:17,960 Speaker 1: excuse me, Judith Ingram, uh, and they said it's not 394 00:22:18,160 --> 00:22:20,919 Speaker 1: covered by the commission's rules for the Federal Election 395 00:22:21,000 --> 00:22:24,280 Speaker 1: Campaign Act. Um.
It's kind of a legal gray area, uh, 396 00:22:24,359 --> 00:22:26,840 Speaker 1: you know, and it's kind of shaky legal reasoning 397 00:22:27,000 --> 00:22:29,520 Speaker 1: that they have against him, because it can be argued 398 00:22:29,520 --> 00:22:32,240 Speaker 1: that it's a campaign promotion. Um, but it is definitely 399 00:22:32,240 --> 00:22:35,000 Speaker 1: a stunt. Yeah. I don't really see it as any 400 00:22:35,119 --> 00:22:38,639 Speaker 1: different, though, than, um, spending money on an advertisement or whatever. 401 00:22:38,680 --> 00:22:40,760 Speaker 1: That's essentially what he's doing. That's what this is for, 402 00:22:40,840 --> 00:22:44,240 Speaker 1: is to make political ads. Like, I don't get, yeah, anyone 403 00:22:44,280 --> 00:22:47,399 Speaker 1: making a big deal about that. It seems, um, 404 00:22:47,640 --> 00:22:50,600 Speaker 1: like a pretty reasonable thing to do. Now, I 405 00:22:50,640 --> 00:22:53,359 Speaker 1: don't think it actually makes an argument in favor 406 00:22:53,359 --> 00:22:57,480 Speaker 1: of UBI, because what I would judge the 407 00:22:57,520 --> 00:22:59,439 Speaker 1: success or failure of a UBI program on is its 408 00:22:59,480 --> 00:23:04,280 Speaker 1: ability to help people who are in trouble. Um. Reaching 409 00:23:04,280 --> 00:23:06,560 Speaker 1: out to the Yang campaign for some... there are 410 00:23:06,600 --> 00:23:09,520 Speaker 1: probably not a lot of people who are on disability, 411 00:23:09,560 --> 00:23:14,439 Speaker 1: who are on Medicare, who require food stamps. Like, 412 00:23:14,520 --> 00:23:16,400 Speaker 1: those are the people I'm curious to see, like, does 413 00:23:16,440 --> 00:23:18,680 Speaker 1: this really help them? There's a lot of data 414 00:23:18,720 --> 00:23:22,040 Speaker 1: that UBI could help people in those positions, 415 00:23:22,080 --> 00:23:24,600 Speaker 1: but, like, I don't...
We're not gonna learn anything from this. 416 00:23:24,680 --> 00:23:28,000 Speaker 1: But I think it's a reasonably intelligent marketing strategy. Absolutely. 417 00:23:28,080 --> 00:23:30,240 Speaker 1: I just, you know, I call it a stunt. 418 00:23:30,280 --> 00:23:33,600 Speaker 1: I bring this up to transition, after the ad break 419 00:23:33,600 --> 00:23:36,359 Speaker 1: we're about to take, uh, to talking about the fact 420 00:23:36,359 --> 00:23:40,159 Speaker 1: that he is a social media savvy candidate. He is 421 00:23:40,280 --> 00:23:43,159 Speaker 1: using the Internet in a different way than other candidates 422 00:23:43,320 --> 00:23:45,840 Speaker 1: are or have in the past. Uh, and that's kind 423 00:23:45,840 --> 00:23:47,960 Speaker 1: of an example of it. But we're gonna get to 424 00:23:48,000 --> 00:23:49,560 Speaker 1: that in a minute, because we've got to talk about 425 00:23:49,600 --> 00:23:55,080 Speaker 1: products and services. Because, yeah... I mean, 426 00:23:56,040 --> 00:24:00,439 Speaker 1: when you said, uh, social media, 427 00:24:00,440 --> 00:24:06,440 Speaker 1: that got my pants exploding. You said ads, that made them double exploding. Um, 428 00:24:06,760 --> 00:24:12,080 Speaker 1: it's more like a friction burn. But, yeah, lots 429 00:24:12,080 --> 00:24:15,399 Speaker 1: of friction burns. You guys can cut "wonderful friction burns." 430 00:24:15,840 --> 00:24:31,280 Speaker 1: This is terrible. These ads have made us together everything, 431 00:24:31,480 --> 00:24:37,399 Speaker 1: So don't don't. We're back, and I now have headphones 432 00:24:37,480 --> 00:24:42,080 Speaker 1: on, um, and am competently recording this podcast again, because 433 00:24:42,160 --> 00:24:44,160 Speaker 1: we realized he had a TV on in the background. 434 00:24:44,920 --> 00:24:48,080 Speaker 1: Oh no, yeah, it's terrible. It's all fine.
You guys 435 00:24:48,160 --> 00:24:50,320 Speaker 1: maybe won't even notice, because Dan was going to edit 436 00:24:50,400 --> 00:24:53,239 Speaker 1: that out, but now we've mentioned it, 437 00:24:53,440 --> 00:24:56,200 Speaker 1: so maybe that's not gonna happen. I feel like it's 438 00:24:56,200 --> 00:24:59,880 Speaker 1: important to mention Daniel's sacrifices for this podcast. Thank you, Dan. 439 00:25:00,119 --> 00:25:03,920 Speaker 1: Like a seventeen-year-old boy from Kentucky going 440 00:25:03,960 --> 00:25:09,760 Speaker 1: ashore on Omaha Beach on D-Day, Daniel is 441 00:25:09,800 --> 00:25:12,160 Speaker 1: going to have to scrub the sounds of the TV 442 00:25:12,240 --> 00:25:16,720 Speaker 1: in the background out of my podcast recording. That's the 443 00:25:16,720 --> 00:25:19,600 Speaker 1: comparison I was going to make too. Thank you. Okay, 444 00:25:19,640 --> 00:25:21,640 Speaker 1: back to how Yang is helping make this the worst 445 00:25:21,720 --> 00:25:24,879 Speaker 1: year ever. Like a seventeen-year-old boy from Canada. 446 00:25:26,880 --> 00:25:32,800 Speaker 1: Yang's supporters are known online as the Yang Gang. Andrew... Andrews... God, 447 00:25:32,840 --> 00:25:37,560 Speaker 1: none of it works. Go back to the 448 00:25:37,560 --> 00:25:43,040 Speaker 1: time machine noises. Yang Gang... don't. Yang Gang in and 449 00:25:43,080 --> 00:25:46,440 Speaker 1: of itself is enough to make this the worst year. 450 00:25:47,280 --> 00:25:51,639 Speaker 1: I just think it's so dumb. Um, it rhymes, unlike the 451 00:25:51,640 --> 00:25:55,840 Speaker 1: Trump Train. You know, you're right, that is... 452 00:25:56,960 --> 00:25:59,919 Speaker 1: it was weird. Andrew... Andrew... all of it. All right, 453 00:26:00,040 --> 00:26:05,360 Speaker 1: just keep doing it. Oh, Trump Train. I understand.
As 454 00:26:05,400 --> 00:26:07,920 Speaker 1: I mentioned, it is undeniable that Andrew Yang has done 455 00:26:07,920 --> 00:26:11,359 Speaker 1: an excellent job harnessing social media to gain traction and 456 00:26:11,440 --> 00:26:14,840 Speaker 1: appeal to a wide swath of voters. He's made himself 457 00:26:14,840 --> 00:26:18,160 Speaker 1: into the meme king, which, again, is a really great 458 00:26:18,200 --> 00:26:22,119 Speaker 1: skill to have as an entrepreneur. Uh, but there's something about 459 00:26:22,119 --> 00:26:27,560 Speaker 1: it from a presidential candidate that I don't love. Um. 460 00:26:27,960 --> 00:26:33,359 Speaker 1: Sometimes his campaign feels gamified, you know, like that corporate 461 00:26:33,760 --> 00:26:36,879 Speaker 1: term for applying game design to other situations to 462 00:26:36,920 --> 00:26:39,399 Speaker 1: make them more appealing. And, um, well, it feels that 463 00:26:39,400 --> 00:26:41,200 Speaker 1: way because it is that way. That's what he's doing. 464 00:26:41,200 --> 00:26:43,840 Speaker 1: He comes from the corporate tech world and he's taking 465 00:26:43,880 --> 00:26:48,080 Speaker 1: that into this presidential campaign. Surprisingly, his supporters seem 466 00:26:48,160 --> 00:26:50,520 Speaker 1: to be drawn from all across the political spectrum, with 467 00:26:50,720 --> 00:26:54,000 Speaker 1: crazy memes showing up all over the internet, especially on Reddit. 468 00:26:54,240 --> 00:26:56,280 Speaker 1: I've got this one here, Cody. Do you want to 469 00:26:56,320 --> 00:26:59,280 Speaker 1: describe it for us? Oh, yeah. It's a 470 00:26:59,440 --> 00:27:02,320 Speaker 1: meme, a bunch of people's swords, you know, meeting 471 00:27:02,320 --> 00:27:05,879 Speaker 1: in the middle.
One of the people is Bernie Bros, ex- 472 00:27:05,920 --> 00:27:09,320 Speaker 1: MAGA Bros, Zoomers that are turning eighteen, and they're all 473 00:27:09,400 --> 00:27:13,199 Speaker 1: joining swords together to equal the Yang Gang. So 474 00:27:13,240 --> 00:27:17,719 Speaker 1: that's a nice example of the memes floating around. Um. 475 00:27:18,160 --> 00:27:19,719 Speaker 1: To be fair, there are also, like, a 476 00:27:19,720 --> 00:27:23,160 Speaker 1: lot of racist anti-Yang memes coming out, and that's 477 00:27:23,359 --> 00:27:26,280 Speaker 1: very ugly. Um. But he's very good at branding, which 478 00:27:26,280 --> 00:27:29,119 Speaker 1: is why he's garnered as much support as he has. Um. 479 00:27:29,119 --> 00:27:31,639 Speaker 1: According to The Daily Beast, Yang saw a surge of 480 00:27:31,640 --> 00:27:34,320 Speaker 1: new supporters after an appearance on The Joe Rogan Experience 481 00:27:34,320 --> 00:27:38,320 Speaker 1: in February. He's also been on Tucker Carlson's show. Uh, 482 00:27:38,480 --> 00:27:43,040 Speaker 1: he's even appeared at a Turning Point USA event. Interesting. 483 00:27:43,359 --> 00:27:46,879 Speaker 1: Uh, and according to The Verge, white nationalist supporters have 484 00:27:47,000 --> 00:27:51,520 Speaker 1: begun selectively choosing statements from Yang, uh, to support the 485 00:27:51,560 --> 00:27:54,200 Speaker 1: idea that he wants to stop the decline of 486 00:27:54,240 --> 00:27:57,920 Speaker 1: the white race, something over which Yang, the kid 487 00:27:57,920 --> 00:28:01,080 Speaker 1: of Taiwanese immigrants, grew up being viciously bullied. You know, 488 00:28:01,160 --> 00:28:04,280 Speaker 1: he's emphatically disavowing that support.
He says, "I denounce and 489 00:28:04,359 --> 00:28:08,240 Speaker 1: disavow hatred, bigotry, racism, white nationalism, anti-Semitism, and the 490 00:28:08,280 --> 00:28:10,520 Speaker 1: alt-right in all its many forms. Full stop. For 491 00:28:10,560 --> 00:28:12,679 Speaker 1: anyone with this agenda: we do not want your support, 492 00:28:12,720 --> 00:28:14,439 Speaker 1: we do not want your votes. You're not welcome in 493 00:28:14,440 --> 00:28:17,720 Speaker 1: this campaign." Cool. But it is unclear why so many 494 00:28:17,760 --> 00:28:21,120 Speaker 1: alt-right, uh, types are attracted to Yang. It might 495 00:28:21,160 --> 00:28:25,199 Speaker 1: come from his focus on automation and the opioid crisis, um, 496 00:28:25,400 --> 00:28:28,320 Speaker 1: which, you know, can be seen as helping Middle America. 497 00:28:28,680 --> 00:28:33,560 Speaker 1: I can answer that. Um, yeah, they talk about it 498 00:28:33,600 --> 00:28:35,880 Speaker 1: a lot. So there's a chunk of the sort 499 00:28:35,920 --> 00:28:37,720 Speaker 1: of fascist right that, like, a lot of them were 500 00:28:37,800 --> 00:28:39,960 Speaker 1: very excited after Trump won and then have kind of 501 00:28:40,000 --> 00:28:43,360 Speaker 1: been increasingly dejected in the years since, when he didn't, 502 00:28:43,440 --> 00:28:46,360 Speaker 1: you know, kill all the Jews or any of the 503 00:28:46,360 --> 00:28:51,480 Speaker 1: other things they hoped for, when he wasn't explicitly doing their... Yeah.
Um, 504 00:28:51,600 --> 00:28:54,720 Speaker 1: so they've been... I don't know, there's a 505 00:28:54,760 --> 00:28:57,320 Speaker 1: bunch of different terms they'd use for it, but like, 506 00:28:57,400 --> 00:29:01,520 Speaker 1: some of them have kind of come to the 507 00:29:01,960 --> 00:29:04,920 Speaker 1: understanding that there's no saving the world by their standards, 508 00:29:04,960 --> 00:29:07,880 Speaker 1: which means ending the white genocide that they think is 509 00:29:07,880 --> 00:29:11,160 Speaker 1: going on, and there's no political solution, so the best 510 00:29:11,200 --> 00:29:13,680 Speaker 1: thing they can do is get a thousand dollars a 511 00:29:13,720 --> 00:29:16,840 Speaker 1: month, move into a house with some friends, and play video games 512 00:29:16,880 --> 00:29:19,080 Speaker 1: until it all collapses. Like, that really is a big 513 00:29:19,200 --> 00:29:22,320 Speaker 1: chunk of it. Um, it's these people who are just like, well, 514 00:29:22,360 --> 00:29:25,120 Speaker 1: I might as well get some money. It's 515 00:29:25,160 --> 00:29:27,160 Speaker 1: not quite like black-pilling, but it is in that 516 00:29:27,880 --> 00:29:31,920 Speaker 1: they're just like, well, we didn't do it. Yeah. And 517 00:29:31,960 --> 00:29:36,520 Speaker 1: that's... it's upsetting. I mean, it's like, okay, so we 518 00:29:36,600 --> 00:29:39,680 Speaker 1: have this interesting conversation about UBI and all this stuff, 519 00:29:39,720 --> 00:29:42,040 Speaker 1: and the people that it's getting the most traction 520 00:29:42,080 --> 00:29:45,360 Speaker 1: with are the worst, the very worst of us.
Um, 521 00:29:45,600 --> 00:29:48,360 Speaker 1: it's probably also slightly related to the fact that, 522 00:29:48,400 --> 00:29:51,360 Speaker 1: because one of his main slogans is "not left, not 523 00:29:51,520 --> 00:29:54,680 Speaker 1: right, forward," I think for them it's like, all right, 524 00:29:54,680 --> 00:29:57,600 Speaker 1: well, let's just, like, fuck it all and do whatever 525 00:29:57,680 --> 00:30:00,640 Speaker 1: crazy thing, just sort of, like, destabilize stuff, sort 526 00:30:00,680 --> 00:30:05,000 Speaker 1: of hoping that shaking things up will either lead to 527 00:30:05,360 --> 00:30:08,640 Speaker 1: that collapse, or just, yeah, be, all right, well, we're 528 00:30:08,680 --> 00:30:11,840 Speaker 1: gonna be gamers for a thousand bucks a month from 529 00:30:11,840 --> 00:30:15,200 Speaker 1: now on, we'll form our own ethnic state, and it 530 00:30:15,200 --> 00:30:20,280 Speaker 1: will be a big house with video games, and 531 00:30:20,360 --> 00:30:24,960 Speaker 1: everyone will be white. Uh. Other than universal basic income, 532 00:30:25,440 --> 00:30:27,800 Speaker 1: Yang is for Medicare for All, like we talked about, 533 00:30:28,040 --> 00:30:31,760 Speaker 1: and dealing with climate change. Very vaguely, though; that's the thing. 534 00:30:32,040 --> 00:30:33,560 Speaker 1: That's the thing when you're looking at it. It's 535 00:30:33,600 --> 00:30:36,760 Speaker 1: a lot of stuff where he's just generally in alignment 536 00:30:36,880 --> 00:30:41,080 Speaker 1: with liberal policies. But dealing with climate change, it's like, 537 00:30:41,640 --> 00:30:46,960 Speaker 1: rejoining the Paris Climate Agreement. But okay, what else? Yeah, 538 00:30:47,280 --> 00:30:49,200 Speaker 1: you know. Um, and he talks a lot about human- 539 00:30:49,320 --> 00:30:53,040 Speaker 1: centered capitalism; the basic idea is that the base unit in 540 00:30:53,080 --> 00:30:57,160 Speaker 1: society should be people and human welfare, not money.
Um. 541 00:30:57,200 --> 00:31:01,640 Speaker 1: But there's nothing more specific. I don't know what 542 00:31:01,680 --> 00:31:04,920 Speaker 1: that really means, you know. It's... I'd like to see, 543 00:31:05,080 --> 00:31:07,960 Speaker 1: you know... it's just stuff that's, like... I mean, 544 00:31:07,960 --> 00:31:10,120 Speaker 1: I think there's a way of looking at UBI where 545 00:31:10,120 --> 00:31:12,440 Speaker 1: you can be putting people in the center, and there's 546 00:31:12,440 --> 00:31:14,320 Speaker 1: a way of looking at UBI where you're putting money 547 00:31:14,360 --> 00:31:17,320 Speaker 1: in the center, and Yang's plan strikes me as putting 548 00:31:17,320 --> 00:31:20,080 Speaker 1: money in the center, especially since it's kind of paired 549 00:31:20,120 --> 00:31:22,640 Speaker 1: with cuts to the social programs that people can 550 00:31:22,760 --> 00:31:26,719 Speaker 1: use and benefit from. Um. So the focus is 551 00:31:26,760 --> 00:31:29,160 Speaker 1: just that, like, oh, these people's problems can be solved 552 00:31:29,200 --> 00:31:31,440 Speaker 1: with money. And it's like, no, there are actually other resources 553 00:31:31,480 --> 00:31:34,440 Speaker 1: that we need that our society does provide, although it 554 00:31:34,440 --> 00:31:38,040 Speaker 1: doesn't provide enough of them, and, like, UBI could be 555 00:31:38,160 --> 00:31:40,760 Speaker 1: part of a reformation of society that puts people at 556 00:31:40,760 --> 00:31:43,760 Speaker 1: the center. But just giving them the money isn't 557 00:31:43,800 --> 00:31:47,840 Speaker 1: the answer. Right, like you were saying earlier, okay, 558 00:31:47,880 --> 00:31:51,680 Speaker 1: but how about a plan for affordable housing, 559 00:31:52,080 --> 00:31:56,120 Speaker 1: a plan for mental healthcare, 560 00:31:56,280 --> 00:31:59,040 Speaker 1: all of it, healthcare?
Yeah, if you're not providing these 561 00:31:59,120 --> 00:32:01,760 Speaker 1: basic needs... Like, if you have these basic needs met, 562 00:32:01,800 --> 00:32:05,560 Speaker 1: then you are more likely to aspire to greater things 563 00:32:05,640 --> 00:32:08,560 Speaker 1: and, like, work toward, you know... these needs 564 00:32:08,600 --> 00:32:10,560 Speaker 1: are taken care of, now I have these emotional needs, 565 00:32:10,560 --> 00:32:12,120 Speaker 1: now I have these spiritual needs, and you sort of, like, 566 00:32:12,160 --> 00:32:14,320 Speaker 1: build yourself up. When he's talking about it, like, he 567 00:32:14,320 --> 00:32:17,040 Speaker 1: doesn't want to, like you said, center money. But one 568 00:32:17,040 --> 00:32:19,960 Speaker 1: of the solutions is, like, money. Well, money equals anything. 569 00:32:20,440 --> 00:32:23,880 Speaker 1: It's not food, necessarily; it's not housing, necessarily. It's just, 570 00:32:23,920 --> 00:32:26,480 Speaker 1: like, you can spend this money on anything, because money 571 00:32:26,640 --> 00:32:29,480 Speaker 1: is a thing that we made up, you know, to 572 00:32:29,640 --> 00:32:33,840 Speaker 1: be, like, anything. Um. So that's why his approach 573 00:32:34,000 --> 00:32:36,400 Speaker 1: sort of rubs me the wrong way, to where it's like, 574 00:32:36,960 --> 00:32:39,440 Speaker 1: but what are we actually doing if you're saying that 575 00:32:39,520 --> 00:32:43,040 Speaker 1: you want to focus on human beings, but you're just 576 00:32:43,240 --> 00:32:47,400 Speaker 1: giving them a thing that can pay for, like, literally anything? 577 00:32:48,040 --> 00:32:50,600 Speaker 1: Um, yeah. It just doesn't 578 00:32:50,600 --> 00:32:54,239 Speaker 1: speak to what he's saying 579 00:32:54,320 --> 00:32:57,400 Speaker 1: he wanted it to speak to.
He's also very vague 580 00:32:57,480 --> 00:33:02,600 Speaker 1: on foreign policy, which isn't surprising given his literal zero 581 00:33:02,840 --> 00:33:07,200 Speaker 1: years of experience. Uh, he says that America has made mistakes, 582 00:33:07,280 --> 00:33:10,000 Speaker 1: but that we've been a positive force in world history, 583 00:33:10,360 --> 00:33:13,600 Speaker 1: leading to the spread of peace, prosperity, and democracy. Um... 584 00:33:15,360 --> 00:33:19,800 Speaker 1: I know, it's just... the peace we spread to Cambodia. It's 585 00:33:19,840 --> 00:33:25,440 Speaker 1: like an attempt at being just diplomatic 586 00:33:24,320 --> 00:33:30,360 Speaker 1: politician-speak, but it's not very good politician-speak, right? It 587 00:33:30,360 --> 00:33:33,000 Speaker 1: is super vague, and also, I mean, it doesn't speak to 588 00:33:33,080 --> 00:33:35,000 Speaker 1: our current moment at all. No, it doesn't. I mean, 589 00:33:35,080 --> 00:33:37,440 Speaker 1: regarding the border, he says that there are issues that 590 00:33:37,480 --> 00:33:39,720 Speaker 1: need to be fixed to provide security for Americans and 591 00:33:39,760 --> 00:33:45,200 Speaker 1: equity in our immigration system. Okay, elaborate. You know? Yeah, 592 00:33:45,480 --> 00:33:47,920 Speaker 1: you know. And as I've said, you know, he does 593 00:33:49,000 --> 00:33:52,800 Speaker 1: support the obvious liberal policies: LGBTQ rights, 594 00:33:52,800 --> 00:33:56,840 Speaker 1: abortion rights, campaign finance reform, common sense gun reform, all 595 00:33:56,880 --> 00:33:59,560 Speaker 1: of that. But then there are other things that just kind 596 00:33:59,560 --> 00:34:02,640 Speaker 1: of seem silly or not well thought out, or just 597 00:34:02,800 --> 00:34:06,480 Speaker 1: kind of pandering to our social media-addled brains. Uh, 598 00:34:06,520 --> 00:34:11,640 Speaker 1: for example, this one's fun: making taxes fun.
"Currently, paying 599 00:34:11,680 --> 00:34:14,239 Speaker 1: taxes is a slog. Let's make it a celebration." That's 600 00:34:14,239 --> 00:34:17,080 Speaker 1: from his website. Uh, this is basically like a Tax 601 00:34:17,160 --> 00:34:20,239 Speaker 1: Day rebrand, which is, okay, there's something to that. 602 00:34:20,320 --> 00:34:22,680 Speaker 1: But there are added perks: citizens could direct 603 00:34:22,680 --> 00:34:25,560 Speaker 1: one percent of their taxes to a specific project. You know, 604 00:34:25,760 --> 00:34:31,239 Speaker 1: it's fine, but okay. Uh, modern time banking, which would 605 00:34:31,239 --> 00:34:35,240 Speaker 1: give you points for doing things like volunteering, working at fairs, 606 00:34:35,480 --> 00:34:40,040 Speaker 1: even fixing a neighbor's appliance. Uh, that sounds not at 607 00:34:40,040 --> 00:34:44,480 Speaker 1: all like China's social credit system. Yeah. And then you can 608 00:34:44,480 --> 00:34:48,640 Speaker 1: exchange those points for prizes, I guess. Like, the website 609 00:34:48,760 --> 00:34:51,240 Speaker 1: suggests trading in your points for tickets to a local 610 00:34:51,280 --> 00:34:54,080 Speaker 1: ball game or a chance to talk to your elected officials. 611 00:34:54,560 --> 00:34:56,640 Speaker 1: Aren't we supposed to be able to talk to them anyway? 612 00:34:59,719 --> 00:35:02,440 Speaker 1: You've got to fix seven lawnmowers 613 00:35:02,480 --> 00:35:07,239 Speaker 1: to talk to them? Like, wait, what is this? 614 00:35:07,400 --> 00:35:10,200 Speaker 1: Because that also speaks to what 615 00:35:10,200 --> 00:35:13,080 Speaker 1: we've been talking about, of, like, you're saying that you 616 00:35:13,160 --> 00:35:17,200 Speaker 1: get, like, human community points for these acts, 617 00:35:17,200 --> 00:35:20,319 Speaker 1: and then, for that...
The example of the tickets for 618 00:35:20,320 --> 00:35:22,319 Speaker 1: a baseball game is weird, because it's like, you get 619 00:35:22,400 --> 00:35:24,960 Speaker 1: universal basic income, which is money, which could buy you 620 00:35:25,000 --> 00:35:29,560 Speaker 1: baseball tickets, or you can get your human points and 621 00:35:29,600 --> 00:35:33,600 Speaker 1: then use those to buy baseball tickets too. It's weird. Food, 622 00:35:33,840 --> 00:35:38,960 Speaker 1: like, do you get food if you fix the lawnmowers? In this world, 623 00:35:39,040 --> 00:35:43,480 Speaker 1: we all get baseball tickets. And that's beautiful. Beautiful. It's 624 00:35:43,480 --> 00:35:45,800 Speaker 1: just so interesting. Like, there are elements of it that feel 625 00:35:46,360 --> 00:35:52,040 Speaker 1: like some weird socialist thing, but also he's very much 626 00:35:52,080 --> 00:35:56,200 Speaker 1: a capitalist, with human-centered capitalism, and so it's unclear. 627 00:35:56,360 --> 00:36:00,200 Speaker 1: It's like he's reaching out for the things that might 628 00:36:00,200 --> 00:36:03,759 Speaker 1: appeal to people in, like, an instinctual, oh-that- 629 00:36:03,800 --> 00:36:07,000 Speaker 1: sounds-kind-of-cool way, but it's not substantive, it's 630 00:36:07,080 --> 00:36:12,000 Speaker 1: not well thought out.
Yeah. Still, I 631 00:36:12,040 --> 00:36:14,040 Speaker 1: get what he's trying to go for with, like, human- 632 00:36:14,120 --> 00:36:17,640 Speaker 1: centered capitalism. Like, that sounds like, oh, well, that's better 633 00:36:17,719 --> 00:36:21,240 Speaker 1: than the other kind, but it still seems to center 634 00:36:21,280 --> 00:36:24,319 Speaker 1: on money, right? Yeah. But it's, like, still trying to 635 00:36:24,360 --> 00:36:27,799 Speaker 1: appeal to people that were maybe Bernie supporters while being 636 00:36:28,200 --> 00:36:31,200 Speaker 1: very different from what Bernie is offering, you know what 637 00:36:31,239 --> 00:36:36,799 Speaker 1: I mean? Um, and it's still denying, I 638 00:36:36,840 --> 00:36:41,440 Speaker 1: think at a fundamental level, the value that social 639 00:36:41,440 --> 00:36:44,920 Speaker 1: programs aimed at actually helping people can have, like, 640 00:36:45,360 --> 00:36:48,799 Speaker 1: giving people things that aren't treated as 641 00:36:48,800 --> 00:36:53,759 Speaker 1: a commodity. So when you have, like, state- 642 00:36:53,800 --> 00:36:56,640 Speaker 1: sponsored daycare, which many nations in the West have for kids, 643 00:36:56,800 --> 00:37:00,000 Speaker 1: when you have paternity leave for fathers, um, in addition 644 00:37:00,000 --> 00:37:03,520 Speaker 1: to maternity leave, which many states have, that's not a monetizable 645 00:37:03,520 --> 00:37:05,200 Speaker 1: thing. I guess you could say that, like, oh, 646 00:37:05,280 --> 00:37:06,719 Speaker 1: but if you're getting a thousand dollars a month, you 647 00:37:06,719 --> 00:37:09,520 Speaker 1: don't have to work. But, like, that doesn't go super 648 00:37:09,560 --> 00:37:13,600 Speaker 1: far with a baby.
Um, having those 649 00:37:13,640 --> 00:37:18,360 Speaker 1: things which are not easily, um, priced, having, like, access 650 00:37:18,360 --> 00:37:20,279 Speaker 1: to mental health care and stuff that you don't have 651 00:37:20,400 --> 00:37:23,359 Speaker 1: to budget for, because, you know... again, 652 00:37:23,360 --> 00:37:25,840 Speaker 1: you could say you could spend, you know, six hundred 653 00:37:26,000 --> 00:37:27,840 Speaker 1: dollars a month from your thousand dollars a month on 654 00:37:28,239 --> 00:37:30,399 Speaker 1: seeing a therapist four times a month. But how many 655 00:37:30,400 --> 00:37:32,040 Speaker 1: people are going to do that when they have rent 656 00:37:32,080 --> 00:37:37,879 Speaker 1: and food bills due? Yeah. It's just... it's 657 00:37:37,920 --> 00:37:39,600 Speaker 1: not that UBI is a bad idea. I 658 00:37:39,640 --> 00:37:41,720 Speaker 1: think UBI might be part of a solution 659 00:37:41,760 --> 00:37:44,000 Speaker 1: to many of the social ills we have. But treating 660 00:37:44,040 --> 00:37:47,360 Speaker 1: it as the center of the solution, I think, mistakes 661 00:37:47,400 --> 00:37:50,440 Speaker 1: the center of many of the problems. Yeah, I agree 662 00:37:50,440 --> 00:37:55,319 Speaker 1: with you completely. Yeah, I mean, you have to. Yeah, 663 00:37:55,480 --> 00:37:59,640 Speaker 1: when you're not pairing it with giving people the things 664 00:37:59,840 --> 00:38:05,479 Speaker 1: they need... Um, the tax thing sounds nice. There's 665 00:38:05,520 --> 00:38:09,000 Speaker 1: one more that Vice reported about. I mean, there's lots more; 666 00:38:09,239 --> 00:38:13,160 Speaker 1: this is another weird one.
Reduce harm to children caused 667 00:38:13,200 --> 00:38:18,360 Speaker 1: by smartphones: a plan, uh, by creating a Department of 668 00:38:18,400 --> 00:38:22,640 Speaker 1: the Attention Economy, which would regulate smartphones, social media, gaming, 669 00:38:22,800 --> 00:38:26,719 Speaker 1: and chat apps. I don't know, man. Um, at the same 670 00:38:26,760 --> 00:38:28,319 Speaker 1: time, it's interesting. It's the kind of thing that 671 00:38:28,400 --> 00:38:31,279 Speaker 1: like you want, uh, people to be out there talking 672 00:38:31,360 --> 00:38:34,440 Speaker 1: about, like... Yeah, but I don't necessarily need it 673 00:38:34,480 --> 00:38:37,960 Speaker 1: to be a presidential platform. Um, but maybe it's good. 674 00:38:38,000 --> 00:38:40,359 Speaker 1: Maybe that's good that we have these conversations, and 675 00:38:40,360 --> 00:38:43,040 Speaker 1: you know, he's putting forth these things so that we're 676 00:38:43,040 --> 00:38:47,480 Speaker 1: having these conversations. But yeah, I, I actually don't... I, 677 00:38:47,480 --> 00:38:49,960 Speaker 1: I don't think Andrew Yang has a real chance of 678 00:38:50,280 --> 00:38:53,040 Speaker 1: winning the presidency just looking at his poll numbers, and 679 00:38:53,080 --> 00:38:55,880 Speaker 1: I don't think he has a very comprehensive attitude for 680 00:38:55,920 --> 00:38:58,560 Speaker 1: what's needed in the country. I do think it's good 681 00:38:58,600 --> 00:39:01,399 Speaker 1: that people are talking about UBI, at least 682 00:39:01,440 --> 00:39:04,320 Speaker 1: some people, as as a thing that might actually work, 683 00:39:04,360 --> 00:39:07,440 Speaker 1: as opposed to just, um, it being a thing I 684 00:39:07,440 --> 00:39:09,880 Speaker 1: read about in zines that are printed by my friends and 685 00:39:09,960 --> 00:39:14,280 Speaker 1: handed out at protests.
So, um, I'm on board 686 00:39:14,280 --> 00:39:17,400 Speaker 1: with what he's done in regards to that. Um, I 687 00:39:17,520 --> 00:39:20,640 Speaker 1: just, I hope that we can arrive at kind of 688 00:39:20,640 --> 00:39:25,000 Speaker 1: a more, like, a more realistic way to institute 689 00:39:25,000 --> 00:39:28,080 Speaker 1: it. Totally. Uh, we're gonna have to take a quick 690 00:39:28,080 --> 00:39:31,120 Speaker 1: break for, you know, products and services and so on and 691 00:39:31,160 --> 00:39:35,279 Speaker 1: so forth, and then... yep, but then we're gonna come back 692 00:39:35,320 --> 00:39:40,120 Speaker 1: and then finish talking about the Yang Gang. Andrew, Andrew... 693 00:39:40,120 --> 00:39:51,080 Speaker 1: together everything, so don't don't... Then we're back from 694 00:39:51,080 --> 00:39:55,480 Speaker 1: those ads. They were great, weren't they? Loved them. I know, 695 00:39:55,640 --> 00:39:57,879 Speaker 1: and I know you the listeners love ads too. Yeah, 696 00:39:57,880 --> 00:40:01,239 Speaker 1: who doesn't buy up some product and stuff? It's my 697 00:40:01,280 --> 00:40:04,520 Speaker 1: favorite part of, uh, when I'm streaming. Like, I hope 698 00:40:04,520 --> 00:40:06,360 Speaker 1: an ad comes up. I sure hope I have a 699 00:40:06,400 --> 00:40:10,799 Speaker 1: commercial break right now. Um. So, yeah, we left off 700 00:40:11,200 --> 00:40:13,960 Speaker 1: talking about some of his policies. There's one more policy 701 00:40:14,000 --> 00:40:16,520 Speaker 1: I wanted to bring up, and that was his policy 702 00:40:16,600 --> 00:40:21,640 Speaker 1: to closely monitor the mental health of White House staff. Uh, 703 00:40:21,680 --> 00:40:26,320 Speaker 1: and this is clearly a direct response to our current president.
704 00:40:26,840 --> 00:40:30,560 Speaker 1: And it does make sense to have mental health support 705 00:40:30,640 --> 00:40:33,000 Speaker 1: for people that work in a high stress situation and whatnot. 706 00:40:33,040 --> 00:40:36,080 Speaker 1: But there's a world where that's a little bit dangerous. 707 00:40:36,120 --> 00:40:39,440 Speaker 1: I don't know, if you're like, you know, what if, 708 00:40:39,480 --> 00:40:41,759 Speaker 1: if it's the wrong pair of hands that are in 709 00:40:41,800 --> 00:40:47,000 Speaker 1: the White House that are, you know, rooting people out, 710 00:40:47,040 --> 00:40:49,200 Speaker 1: like, are people gonna get fired for having 711 00:40:49,239 --> 00:40:53,520 Speaker 1: anxiety and depression? And you know, it's like it's very 712 00:40:53,560 --> 00:40:59,600 Speaker 1: easy to pin mental illness on people who are just 713 00:41:00,080 --> 00:41:04,040 Speaker 1: bad actors. Also, like, and that seems unfair too, right? 714 00:41:04,160 --> 00:41:08,520 Speaker 1: Blaming... Yeah, especially now, and obviously the president does 715 00:41:08,560 --> 00:41:13,360 Speaker 1: this too, but like blaming terrible behavior on mental illness, 716 00:41:13,680 --> 00:41:16,480 Speaker 1: and then also like blaming, yeah, blaming mental illness on 717 00:41:16,640 --> 00:41:18,279 Speaker 1: like a lot of problems that we have, and sort 718 00:41:18,280 --> 00:41:22,239 Speaker 1: of conflating things that don't necessarily pair together. Um, and 719 00:41:22,600 --> 00:41:28,600 Speaker 1: just, everyone should have it. Everybody. For everybody, exactly. Everybody.
720 00:41:28,640 --> 00:41:32,000 Speaker 1: And even, there's a reaction because he had talked about, 721 00:41:32,160 --> 00:41:40,279 Speaker 1: um, couples counseling, um, where everyone can have... yeah, and 722 00:41:40,400 --> 00:41:44,200 Speaker 1: I know, like, Meghan McCain was freaking out about it, 723 00:41:44,320 --> 00:41:46,319 Speaker 1: like, my marriage is fine. If we, if we 724 00:41:46,320 --> 00:41:48,440 Speaker 1: are having trouble, we like get drunk, we go 725 00:41:48,480 --> 00:41:51,760 Speaker 1: shoot guns in the woods or whatever. And, um, 726 00:41:51,800 --> 00:41:53,480 Speaker 1: I'm not gonna, I'm not gonna have the government, like, 727 00:41:53,480 --> 00:41:56,719 Speaker 1: make me go to, uh, go to couples counseling. And 728 00:41:56,840 --> 00:41:58,880 Speaker 1: all, like, all the stuff we're talking about, it's so 729 00:41:58,960 --> 00:42:03,279 Speaker 1: weird, because like, it's not, it's not mandatory, everything we're 730 00:42:03,280 --> 00:42:05,000 Speaker 1: talking about. It's like, no, if you are 731 00:42:05,200 --> 00:42:08,279 Speaker 1: a human being in this country, you have access to 732 00:42:08,320 --> 00:42:11,120 Speaker 1: this thing that benefits you if you do it. If 733 00:42:11,160 --> 00:42:12,759 Speaker 1: you don't need it, you don't have to do it. 734 00:42:13,520 --> 00:42:15,480 Speaker 1: But you can, and it will help you, help all 735 00:42:15,520 --> 00:42:18,200 Speaker 1: of us. And framing it like all these things 736 00:42:18,200 --> 00:42:21,120 Speaker 1: are mandatory is just so, it's so disingenuous and weird. 737 00:42:21,960 --> 00:42:25,280 Speaker 1: It's, it's like me getting pissed off that elementary schools exist, 738 00:42:25,360 --> 00:42:27,200 Speaker 1: because what, are they going to make me go? I 739 00:42:27,200 --> 00:42:33,480 Speaker 1: don't want to go learn to count to ten!
Trying to, educational... 740 00:42:33,520 --> 00:42:36,600 Speaker 1: destroy the schools, tear them down! Cody... Oh hey hey, 741 00:42:36,640 --> 00:42:41,320 Speaker 1: now hey, I'm, I'm, I've been radicalized. You got de-radicalized, 742 00:42:41,320 --> 00:42:44,160 Speaker 1: and now I'm radicalized. I'm an anti-school crusader now. 743 00:42:45,880 --> 00:42:47,640 Speaker 1: What can I do for you, Robert, real quick, to 744 00:42:47,680 --> 00:42:53,719 Speaker 1: de-radicalize you? No, I gotta, I can take out 745 00:42:53,760 --> 00:42:57,759 Speaker 1: my headphones. You are losing us subscribers left and right. 746 00:42:59,080 --> 00:43:05,480 Speaker 1: I'm kind of thinking, though, Hillary, there were headlines about that, 747 00:43:05,719 --> 00:43:09,719 Speaker 1: people wanting her to run, um. I think that might be my take 748 00:43:09,719 --> 00:43:13,080 Speaker 1: away with Andrew Yang. He seems like a nice guy. 749 00:43:13,280 --> 00:43:18,279 Speaker 1: He's got some good ideas, um. I like that he's 750 00:43:17,960 --> 00:43:21,320 Speaker 1: not a billionaire, I'm glad he's talking about them, he seems 751 00:43:21,360 --> 00:43:24,759 Speaker 1: to have people's best interests at heart. Um, I don't 752 00:43:24,800 --> 00:43:27,799 Speaker 1: think he's qualified for the job. I don't even think 753 00:43:27,800 --> 00:43:30,400 Speaker 1: you can point to his history as an entrepreneur to 754 00:43:30,520 --> 00:43:33,960 Speaker 1: suggest that he is a successful leader. Like, he's had 755 00:43:34,000 --> 00:43:37,120 Speaker 1: companies that didn't work out. He became the CEO of 756 00:43:37,160 --> 00:43:41,680 Speaker 1: a company that already existed. It coincided with the recession, 757 00:43:41,880 --> 00:43:44,359 Speaker 1: and his company happened to take off, and he made 758 00:43:44,400 --> 00:43:46,200 Speaker 1: a good amount of money from it.
And then he 759 00:43:46,239 --> 00:43:49,920 Speaker 1: created a tech startup that's doing fine, you know, which 760 00:43:50,320 --> 00:43:53,520 Speaker 1: encapsulates a lot of the ideas and things that he believes. 761 00:43:53,920 --> 00:43:59,200 Speaker 1: And yeah, I mean, like, to encourage entrepreneurship and people 762 00:43:59,200 --> 00:44:03,200 Speaker 1: to, like, you know, innovate, and that's cool. But it's 763 00:44:03,200 --> 00:44:07,600 Speaker 1: not like he's a wildly successful leader. He seems, again, 764 00:44:07,640 --> 00:44:14,399 Speaker 1: affable, but maybe annoying in the workplace with his gong. Yeah, 765 00:44:14,520 --> 00:44:16,839 Speaker 1: I did not know that. I've seen, I've seen him both. 766 00:44:16,920 --> 00:44:19,360 Speaker 1: He's better, okay, well, more confident. I would love to 767 00:44:19,400 --> 00:44:25,439 Speaker 1: see that skate-off between them. No, Sophie says no, no, no, no, 768 00:44:25,480 --> 00:44:28,719 Speaker 1: never mind. I would not love to see that. I'd 769 00:44:28,960 --> 00:44:33,759 Speaker 1: love to see, um... And I don't even know if, 770 00:44:33,760 --> 00:44:39,040 Speaker 1: I think, I would suggest him for a cabinet position, you know? Like, yeah, 771 00:44:39,080 --> 00:44:41,279 Speaker 1: you know, I, I hope... I don't know if I'd 772 00:44:41,280 --> 00:44:44,160 Speaker 1: suggest a cabinet position either. I like his attitude towards 773 00:44:44,200 --> 00:44:48,120 Speaker 1: standardized tests. Yes. Hopefully somebody with that opinion talks to 774 00:44:48,400 --> 00:44:51,600 Speaker 1: a president at some point. But I don't think he 775 00:44:51,600 --> 00:44:53,879 Speaker 1: would be the most, certainly not the most, qualified person 776 00:44:53,920 --> 00:44:56,680 Speaker 1: in the country to be the Secretary of Education, although 777 00:44:56,719 --> 00:45:00,239 Speaker 1: apparently that doesn't really matter.
I mean, yeah, comparably, like, 778 00:45:00,360 --> 00:45:04,200 Speaker 1: I think, I, I mean, but if you decide that 779 00:45:04,239 --> 00:45:06,840 Speaker 1: we're going to, like, try to comprehensively reform our social 780 00:45:06,880 --> 00:45:08,920 Speaker 1: safety net, I can think of worse people to have 781 00:45:08,960 --> 00:45:10,800 Speaker 1: a seat at the table. He could be one person 782 00:45:10,800 --> 00:45:13,879 Speaker 1: who could, yeah, you know, contribute. Give him some credit. Yeah, 783 00:45:13,920 --> 00:45:17,239 Speaker 1: but no, I, I think he's not someone who I 784 00:45:17,280 --> 00:45:21,760 Speaker 1: think desperately deserves to have a cabinet position. 785 00:45:22,080 --> 00:45:24,920 Speaker 1: That does not strike me as a thing. But maybe 786 00:45:25,000 --> 00:45:27,520 Speaker 1: he will run for other offices after this, so he 787 00:45:27,880 --> 00:45:32,439 Speaker 1: obviously will have gained a lot from... mayor, shoot, maybe mayor, 788 00:45:32,440 --> 00:45:35,200 Speaker 1: maybe even governor. We don't care about governors, like, fuck it. 789 00:45:35,280 --> 00:45:39,759 Speaker 1: Why not give it a shot? Arnold Schwarzenegger got to become 790 00:45:39,800 --> 00:45:43,279 Speaker 1: the governor of California. You can't do worse than Arnold did. Like, like, 791 00:45:43,600 --> 00:45:48,000 Speaker 1: it's a totally reasonable thing to, to go for it. And again, 792 00:45:48,040 --> 00:45:50,920 Speaker 1: I appreciate that a lot, a lot of these conversations are 793 00:45:50,920 --> 00:45:54,279 Speaker 1: being had. Um, he is not the worst offender of 794 00:45:54,280 --> 00:45:57,560 Speaker 1: contributing to the worst year ever, is I guess my 795 00:45:57,760 --> 00:46:00,319 Speaker 1: takeaway. I don't even know that, like, I would... 796 00:46:00,360 --> 00:46:02,520 Speaker 1: I think some, a lot of his fans have contributed 797 00:46:02,600 --> 00:46:04,960 Speaker 1: to that. I don't.
I haven't seen him do anything 798 00:46:05,040 --> 00:46:07,480 Speaker 1: I think is wrong. He's a guy, I don't think 799 00:46:07,480 --> 00:46:09,360 Speaker 1: he's the best pick for president, with a couple of 800 00:46:09,360 --> 00:46:13,040 Speaker 1: good ideas and some bad ones, saying what he believes, 801 00:46:13,160 --> 00:46:17,080 Speaker 1: and that's fine. The only thing I'd say that he 802 00:46:17,120 --> 00:46:19,480 Speaker 1: did wrong was suggesting he sit down with Shane Gillis. 803 00:46:19,719 --> 00:46:22,440 Speaker 1: But yeah, that was a bad call. That was a 804 00:46:22,480 --> 00:46:24,400 Speaker 1: bad call. That was a bad call. But I understand 805 00:46:24,440 --> 00:46:25,880 Speaker 1: where it came from. Yeah, I mean, he's got a 806 00:46:25,880 --> 00:46:29,960 Speaker 1: lot of drive for, for the unity. It's the, it's, 807 00:46:29,960 --> 00:46:33,600 Speaker 1: it's an offshoot of his not left, not right, forward, 808 00:46:34,000 --> 00:46:35,600 Speaker 1: the whole thing. I'm just like, no, we all can 809 00:46:35,640 --> 00:46:39,960 Speaker 1: do it together. Um. And that said, Andrew Yang, if 810 00:46:40,000 --> 00:46:42,360 Speaker 1: you're listening and aren't turned off by this conversation, we 811 00:46:42,360 --> 00:46:46,720 Speaker 1: would love to have a chat with you. And, uh, 812 00:46:46,760 --> 00:46:49,719 Speaker 1: you know, you can even bring your gong. You can, 813 00:46:49,840 --> 00:46:52,240 Speaker 1: you can bang a gong, Cody can make his horrible 814 00:46:52,320 --> 00:46:57,480 Speaker 1: time travel noises, um, and together we can give Daniel 815 00:46:57,520 --> 00:47:00,600 Speaker 1: an aneurysm. Wouldn't that be fun? You can do it. 816 00:47:00,680 --> 00:47:03,480 Speaker 1: We can do it. We can do it. And you, guys, 817 00:47:03,520 --> 00:47:09,880 Speaker 1: but only with you. Mr. Daniearism. Danielism. That's all good, Sophie.
818 00:47:09,920 --> 00:47:13,200 Speaker 1: Is he happy with what we're doing? I don't want 819 00:47:13,239 --> 00:47:19,680 Speaker 1: to talk... He's happy. Daniel's head banging in 820 00:47:20,080 --> 00:47:27,000 Speaker 1: joy, and yeah, fist raised in celebration. Andrew Yang, he 821 00:47:27,120 --> 00:47:31,200 Speaker 1: exists and is running for president. You know what 822 00:47:31,239 --> 00:47:35,200 Speaker 1: I'll say about Andrew Yang, one last positive thing? He 823 00:47:35,280 --> 00:47:37,439 Speaker 1: didn't grab a baby by the dick, and he didn't 824 00:47:37,440 --> 00:47:42,000 Speaker 1: grab a woman by the pussy. Nope. As, as far 825 00:47:42,040 --> 00:47:46,160 Speaker 1: as I know, he has never grabbed anybody without consent, as 826 00:47:46,239 --> 00:47:50,400 Speaker 1: far as we know. And that's great. That's the low benchmark 827 00:47:50,480 --> 00:47:54,600 Speaker 1: we now have, really, slipping by that extremely low bar 828 00:47:55,000 --> 00:48:00,160 Speaker 1: and crawling under it, chanting: Andrew "consensual grabbing" Yang. There 829 00:48:00,239 --> 00:48:04,839 Speaker 1: we go. God, very catchy. Yeah, he's getting all sorts of 830 00:48:04,880 --> 00:48:09,600 Speaker 1: good stuff from us today. Alright, cool. You guys, can 831 00:48:09,640 --> 00:48:13,360 Speaker 1: I just say that his family is really cute? Is it? Yeah? 832 00:48:14,200 --> 00:48:18,640 Speaker 1: Precious. They have color coordinated outfits, and like, unlike Beto, 833 00:48:18,719 --> 00:48:23,200 Speaker 1: I'm like, oh, you know, just like wonderful. I would 834 00:48:23,239 --> 00:48:25,280 Speaker 1: love to sit down and have dinner with them. Yeah, 835 00:48:25,560 --> 00:48:28,520 Speaker 1: he seems great.
There's one other thing I wanted to 836 00:48:28,520 --> 00:48:32,520 Speaker 1: get to, which is my suspicion that, or more of 837 00:48:32,520 --> 00:48:35,920 Speaker 1: why the Internet got behind him so far, is that 838 00:48:36,040 --> 00:48:40,120 Speaker 1: Yang Gang rhymes. For sure. For sure. I think that's 839 00:48:40,200 --> 00:48:42,239 Speaker 1: a lot of it. Like, it's a low bar to 840 00:48:42,239 --> 00:48:45,040 Speaker 1: get somebody to just do that retweet, um, which is 841 00:48:45,080 --> 00:48:48,279 Speaker 1: a lot of the, the activism you've seen. And I 842 00:48:48,320 --> 00:48:52,279 Speaker 1: think people just think it's funny. Um. Yeah, it's fairly, 843 00:48:52,360 --> 00:49:00,280 Speaker 1: it's, yeah, it's way better than Bernie Bro. Way better. Yeah. No, 844 00:49:00,280 --> 00:49:02,680 Speaker 1: nobody wants to identify as a Bernie Bro. People want 845 00:49:02,680 --> 00:49:04,719 Speaker 1: to be on the Trump train, which I don't understand, 846 00:49:04,719 --> 00:49:06,200 Speaker 1: and they also want to be on the Yang Gang, 847 00:49:06,840 --> 00:49:10,640 Speaker 1: and we don't have, uh, Bernie doesn't have... He never 848 00:49:10,680 --> 00:49:13,520 Speaker 1: came with a good, good... He's not a nickname guy, 849 00:49:13,719 --> 00:49:16,040 Speaker 1: Bernie Sanders. He never came up with a good, pithy 850 00:49:16,160 --> 00:49:20,759 Speaker 1: nickname for his fans. Neither has Elizabeth Warren. Sanders Fanders. 851 00:49:21,160 --> 00:49:26,239 Speaker 1: Oh, Sanders Fanders is noted. For the shop? Get on 852 00:49:26,280 --> 00:49:31,759 Speaker 1: the phone with Bernie. I would love to. Warren's quorumzars, right, 853 00:49:32,160 --> 00:49:36,840 Speaker 1: they, those, those angry misogynist Liz Lads in everybody's mentions. 854 00:49:37,360 --> 00:49:44,800 Speaker 1: Fucking Liz Lads. Uh, that does sound like an insulting lad. 855 00:49:45,600 --> 00:49:47,960 Speaker 1: We'll hear that by the end of the year.
Um, 856 00:49:48,840 --> 00:49:53,440 Speaker 1: quorum shouldn't be in a nickname. It was... that, that, 857 00:49:54,160 --> 00:49:56,799 Speaker 1: I was hoping we would just gloss right over it. 858 00:49:56,880 --> 00:49:59,200 Speaker 1: But no, you got to bring it back around. Warren's forum? 859 00:49:59,360 --> 00:50:08,000 Speaker 1: More informed? It's not working. Biden? Baby Biden? Sweet 860 00:50:08,000 --> 00:50:12,080 Speaker 1: little babies? No, keep babies away from Joe Biden. Yeah, 861 00:50:12,120 --> 00:50:15,200 Speaker 1: that's a, that's a good, solid, rememberable nickname: Keep Babies 862 00:50:15,239 --> 00:50:20,200 Speaker 1: Away from Joe Biden. I would, I gotta say, it 863 00:50:20,200 --> 00:50:21,640 Speaker 1: would be the end of the world, but I would 864 00:50:21,719 --> 00:50:24,680 Speaker 1: kind of love it if the hat-based competition that 865 00:50:24,719 --> 00:50:29,120 Speaker 1: occurs is Make America Great Again versus Keep Babies 866 00:50:29,160 --> 00:50:33,799 Speaker 1: Away from Joe Biden. Wait, I have one more thing, 867 00:50:34,760 --> 00:50:38,560 Speaker 1: one more thing that's going to make you take Yang 868 00:50:38,640 --> 00:50:42,480 Speaker 1: down just a notch. His answer to MAGA is MATH: 869 00:50:43,120 --> 00:50:49,279 Speaker 1: Make America Think. Here's, here's the thing. Is he 870 00:50:49,360 --> 00:50:51,680 Speaker 1: just using two letters... wait a second, wait a second, 871 00:50:51,680 --> 00:50:54,719 Speaker 1: wait a second. Is he making an acronym where the 872 00:50:54,800 --> 00:50:57,680 Speaker 1: last two letters of the acronym are the first two 873 00:50:57,800 --> 00:51:03,800 Speaker 1: letters of the last word? I believe that's true. You are clearly 874 00:51:03,840 --> 00:51:07,800 Speaker 1: a smart man. You know that's not how acronyms work, right? 875 00:51:08,640 --> 00:51:14,000 Speaker 1: That's, that's just not how it works.
Yeah, that's really... 876 00:51:14,120 --> 00:51:17,080 Speaker 1: You just make... All you gotta do is say Make 877 00:51:17,120 --> 00:51:21,759 Speaker 1: America Think Hard. Yeah. Like, that's an acronym. That's an acronym. 878 00:51:21,840 --> 00:51:26,160 Speaker 1: That's at least something, right? It's not good, but it's better. 879 00:51:26,760 --> 00:51:29,399 Speaker 1: Sophie's really making us look at this photo of Andrew 880 00:51:29,480 --> 00:51:37,560 Speaker 1: Yang just devouring a turkey leg, and that's all right, 881 00:51:37,680 --> 00:51:39,759 Speaker 1: turkeys are a vegetable. One second, I gotta come up 882 00:51:39,760 --> 00:51:44,520 Speaker 1: with a good acronym here. Make America Think for the 883 00:51:44,600 --> 00:51:53,759 Speaker 1: First Time? Finally? Here's the thing about this, 884 00:51:53,880 --> 00:51:56,359 Speaker 1: all right, and here's the thing about the worst year 885 00:51:56,400 --> 00:52:02,480 Speaker 1: ever that we're all going to experience. Don't. Nobody, nobody, 886 00:52:03,080 --> 00:52:06,760 Speaker 1: no matter who you are, if you're running for president, 887 00:52:07,600 --> 00:52:11,160 Speaker 1: do not do an acronym that plays off of MAGA. 888 00:52:11,480 --> 00:52:14,560 Speaker 1: Just don't do it. Don't try to Make America It 889 00:52:14,719 --> 00:52:18,160 Speaker 1: Was Already Great, Make America Think, Make America Good, Good Back... 890 00:52:18,239 --> 00:52:20,200 Speaker 1: you know, like, whatever it is, do not do it. 891 00:52:20,360 --> 00:52:22,840 Speaker 1: We don't like it, nobody likes it. It doesn't work. 892 00:52:23,080 --> 00:52:27,759 Speaker 1: It's really cringey. It makes you seem simple and ineffective, 893 00:52:27,880 --> 00:52:33,000 Speaker 1: and it's just basically basic. Basic. You're crazy, you're desperate. 894 00:52:33,320 --> 00:52:38,240 Speaker 1: No MAGA puns. No MAGA, no MAGA puns.
Don't try 895 00:52:38,280 --> 00:52:41,440 Speaker 1: to do what Trump does just without being a racist 896 00:52:41,440 --> 00:52:45,000 Speaker 1: piece of shit. Like, don't, don't just be like, okay, well, 897 00:52:45,040 --> 00:52:48,280 Speaker 1: they want somebody who's got a catchy acronym and hats 898 00:52:48,280 --> 00:52:50,200 Speaker 1: with slogans on them, and he yells at people and 899 00:52:50,200 --> 00:52:52,520 Speaker 1: he gives people nicknames, so what if I do that 900 00:52:52,560 --> 00:52:54,600 Speaker 1: but I don't hate Mexicans? Is that, is that going 901 00:52:54,640 --> 00:52:58,440 Speaker 1: to get me to be president? It's like, no, no, no, 902 00:52:59,480 --> 00:53:04,839 Speaker 1: those aren't the things. Especially the nickname thing, please don't 903 00:53:04,840 --> 00:53:06,799 Speaker 1: do that. And don't, like, do the back 904 00:53:06,840 --> 00:53:10,239 Speaker 1: and forth thing with him. Yeah, that's what I keep 905 00:53:10,280 --> 00:53:12,520 Speaker 1: seeing more and more and more of, like, the candidates, 906 00:53:12,600 --> 00:53:17,959 Speaker 1: like, tweeting at him and being like, you're... There's something, people 907 00:53:18,000 --> 00:53:22,479 Speaker 1: see a spike in numbers when Trump bullies them, and so 908 00:53:22,640 --> 00:53:25,640 Speaker 1: there is a thing where people start like, I'm going 909 00:53:25,680 --> 00:53:27,719 Speaker 1: to pick at him, and then I'm going to get 910 00:53:27,719 --> 00:53:30,080 Speaker 1: those likes, and I'm going to get him to give 911 00:53:30,120 --> 00:53:33,200 Speaker 1: me a nickname. Just stop engaging with that creep 912 00:53:33,920 --> 00:53:37,440 Speaker 1: in that way, especially. Um. There are ways to engage with 913 00:53:37,520 --> 00:53:40,040 Speaker 1: him, obviously, because you're gonna have to, because 914 00:53:40,080 --> 00:53:42,399 Speaker 1: you're gonna, you're trying to be the president, so...
915 00:53:42,840 --> 00:53:46,680 Speaker 1: But don't do it like that, please. There's, there's one 916 00:53:46,719 --> 00:53:50,520 Speaker 1: way I want people to engage with Trump on Twitter 917 00:53:50,600 --> 00:53:53,440 Speaker 1: if they're presidential candidates, and I will give my vote 918 00:53:53,640 --> 00:53:57,200 Speaker 1: to whoever does this first, even if it's Joseph Robinette 919 00:53:57,200 --> 00:54:00,400 Speaker 1: Biden, and that is: slide into the President's 920 00:54:00,560 --> 00:54:04,719 Speaker 1: DMs and send him a picture of your dick. Just, just 921 00:54:04,840 --> 00:54:08,920 Speaker 1: send him your... send the president your dick, presidential candidates. 922 00:54:09,360 --> 00:54:14,120 Speaker 1: That is... what? Robert, that is sexist. What should... listen, Warren... 923 00:54:14,160 --> 00:54:18,239 Speaker 1: I'm just kidding. Well, what should Elizabeth send? Her husband's dick? Yeah, absolutely. 924 00:54:18,520 --> 00:54:22,400 Speaker 1: Any dick. It doesn't have to be your dick. President's 925 00:54:22,440 --> 00:54:24,799 Speaker 1: DM: send him a dick and I will vote 926 00:54:24,840 --> 00:54:27,360 Speaker 1: for you. Prove it by posting the exchange on Twitter, 927 00:54:27,560 --> 00:54:29,800 Speaker 1: and I'll vote for you. You will have my vote, 928 00:54:30,080 --> 00:54:34,560 Speaker 1: Kamala Harris, you'll have my vote. Biden, you'll have my vote. 929 00:54:35,680 --> 00:54:41,400 Speaker 1: But still in? He'd still do it? Yeah, send the president 930 00:54:41,440 --> 00:54:48,960 Speaker 1: dick pics, all the heroes. Michael, Michael, Michael Bennett. Yeah, Bennett, 931 00:54:49,000 --> 00:54:51,399 Speaker 1: get your dick in there. Is he out? Here's, here's 932 00:54:51,440 --> 00:54:54,680 Speaker 1: a picture of my, my, my penis, Mr. President. It's 933 00:54:54,760 --> 00:54:59,399 Speaker 1: too bad he's probably out. I believe he's out.
That's... 934 00:55:00,560 --> 00:55:03,680 Speaker 1: what a wonderful suggestion. Then I would be happy 935 00:55:03,719 --> 00:55:07,600 Speaker 1: to vote for him. We've got to be more 936 00:55:07,719 --> 00:55:12,839 Speaker 1: unifying, everybody. Here's my idea: so here's my duke. All right, 937 00:55:13,280 --> 00:55:16,040 Speaker 1: this has been real fun. You can check us out 938 00:55:16,080 --> 00:55:19,720 Speaker 1: online, all the social media places, at Worst Year Pod, 939 00:55:19,960 --> 00:55:23,839 Speaker 1: that's the Instagrams, the Twitters, what have you. Eventually we'll 940 00:55:23,840 --> 00:55:27,920 Speaker 1: have a website, Worst Year Ever dot com. I love websites. 941 00:55:28,360 --> 00:55:31,160 Speaker 1: And then we're online too. You can find us on Twitter, 942 00:55:31,440 --> 00:55:33,560 Speaker 1: you know, that's true, with our names. I would 943 00:55:33,560 --> 00:55:36,960 Speaker 1: search our names on, uh, Twitter, they'll probably be 944 00:55:37,000 --> 00:55:40,759 Speaker 1: linked to this tweet, so you'll find us easy. There's 945 00:55:40,760 --> 00:55:44,239 Speaker 1: no excuses. You'll find us easy, and you can keep 946 00:55:44,280 --> 00:55:47,880 Speaker 1: track of our campaign to sexually harass a sitting president, 947 00:55:48,680 --> 00:55:51,040 Speaker 1: which I think is really going to set us apart 948 00:55:51,200 --> 00:55:54,560 Speaker 1: from the other podcasts about the election. There aren't any 949 00:55:54,600 --> 00:55:56,959 Speaker 1: other podcasts. There are no other podcasts, and if there were, 950 00:55:57,000 --> 00:56:01,319 Speaker 1: they definitely wouldn't support just, like, grabbing the, grabbing the 951 00:56:01,320 --> 00:56:10,359 Speaker 1: president's tush. Hashtag Grab the President's Tush. Honk honk, Mr. 952 00:56:10,400 --> 00:56:19,719 Speaker 1: President. Honka honk. Indeed. Everything so dull, everything so dull.
953 00:56:22,920 --> 00:56:28,440 Speaker 1: I tried. Daniel: Lovely. Worst Year Ever is a production 954 00:56:28,480 --> 00:56:31,360 Speaker 1: of I Heart Radio. For more podcasts from I Heart Radio, 955 00:56:31,440 --> 00:56:34,400 Speaker 1: visit the I Heart Radio app, Apple Podcasts, or wherever 956 00:56:34,440 --> 00:56:35,800 Speaker 1: you listen to your favorite shows.