1 00:00:01,360 --> 00:00:03,640 Speaker 1: This episode is brought to you by PNC Bank. 2 00:00:04,000 --> 00:00:06,760 Speaker 1: A lot of people think podcasts about work are boring, 3 00:00:07,000 --> 00:00:10,840 Speaker 1: and sure, they definitely can be, but understanding a professional's 4 00:00:10,920 --> 00:00:14,160 Speaker 1: routine shows us how they achieve their success little by little, 5 00:00:14,560 --> 00:00:18,200 Speaker 1: day after day. It's like banking with PNC Bank. 6 00:00:18,680 --> 00:00:21,480 Speaker 1: It might seem boring to save, plan, and make calculated 7 00:00:21,480 --> 00:00:24,599 Speaker 1: decisions with your bank, but keeping your money boring is 8 00:00:24,640 --> 00:00:27,960 Speaker 1: what helps you live a more happily fulfilled life. PNC 9 00:00:28,080 --> 00:00:32,879 Speaker 1: Bank: Brilliantly Boring since eighteen sixty-five. Brilliantly 10 00:00:32,960 --> 00:00:35,600 Speaker 1: Boring since eighteen sixty-five is a service mark of 11 00:00:35,600 --> 00:00:38,840 Speaker 1: The PNC Financial Services Group, Inc. PNC Bank, 12 00:00:39,280 --> 00:00:45,920 Speaker 1: National Association. Member FDIC. Earners, what's up? You ever walk 13 00:00:45,920 --> 00:00:49,000 Speaker 1: into a small business and everything just works? Like, the 14 00:00:49,080 --> 00:00:53,200 Speaker 1: checkout is fast, the receipts are digital, tipping is a breeze, 15 00:00:53,479 --> 00:00:55,640 Speaker 1: and you're out the door before the line even builds. 16 00:00:55,960 --> 00:01:00,240 Speaker 1: Odds are they're using Square. We love supporting businesses that 17 00:01:00,320 --> 00:01:02,959 Speaker 1: run on Square because it just feels seamless. Whether it's 18 00:01:03,000 --> 00:01:06,000 Speaker 1: a local coffee shop, a vendor at a pop-up market, 19 00:01:06,200 --> 00:01:09,040 Speaker 1: or even one of our merch partners, Square makes it 20 00:01:09,200 --> 00:01:12,559 Speaker 1: easy for them to take payments, manage inventory, and run 21 00:01:12,600 --> 00:01:16,600 Speaker 1: their business with confidence, all from one simple system. If 22 00:01:16,640 --> 00:01:19,480 Speaker 1: you're a business owner, or even just thinking about launching 23 00:01:19,520 --> 00:01:22,520 Speaker 1: something soon, Square is hands down one of the best 24 00:01:22,520 --> 00:01:25,680 Speaker 1: tools out there to help you start, run, and grow. 25 00:01:25,959 --> 00:01:28,959 Speaker 1: It's not just about payments, it's about giving you time 26 00:01:29,040 --> 00:01:32,000 Speaker 1: back so you can focus on what matters most. Ready 27 00:01:32,040 --> 00:01:35,399 Speaker 1: to see how Square can transform your business? Visit Square 28 00:01:35,480 --> 00:01:40,160 Speaker 1: dot com slash go slash eyl to learn more. That's 29 00:01:40,200 --> 00:01:45,640 Speaker 1: Square dot com slash go slash eyl. Don't wait, don't hesitate. 30 00:01:45,760 --> 00:01:48,160 Speaker 1: Let Square handle the back end so you can keep 31 00:01:48,160 --> 00:01:49,400 Speaker 1: pushing your vision forward. 32 00:01:51,600 --> 00:01:54,160 Speaker 2: Because when people think of AI, I think right now, 33 00:01:54,240 --> 00:01:57,080 Speaker 2: they're looking at it from how they can actually use it, 34 00:01:57,120 --> 00:02:00,640 Speaker 2: which is beneficial, but from an opportunity standpoint, right? 35 00:02:00,920 --> 00:02:05,440 Speaker 2: And there's multiple different opportunities inside of AI.
It's, it's everything, right? 36 00:02:05,520 --> 00:02:09,800 Speaker 2: So what opportunities should we be looking at inside of 37 00:02:09,800 --> 00:02:12,679 Speaker 2: AI, as opposed to, like, just using it for our 38 00:02:12,800 --> 00:02:14,240 Speaker 2: personal lives or businesses? 39 00:02:14,560 --> 00:02:16,120 Speaker 3: Well, part of the reason I sent that to you 40 00:02:16,200 --> 00:02:17,800 Speaker 3: is because I think we need to work that out. 41 00:02:18,680 --> 00:02:21,960 Speaker 3: I, you know, you're the business guys. You know, I'm 42 00:02:22,000 --> 00:02:24,640 Speaker 3: just like the political guy who just happened to stumble 43 00:02:24,720 --> 00:02:26,399 Speaker 3: upon where I am a couple of years ago and said, 44 00:02:26,400 --> 00:02:28,920 Speaker 3: holy crap, this is either going to put black people 45 00:02:28,960 --> 00:02:30,600 Speaker 3: at the top or at the bottom, depending on how 46 00:02:30,639 --> 00:02:35,440 Speaker 3: we respond. But there are different levels of it, all 47 00:02:35,440 --> 00:02:39,680 Speaker 3: the way from, uh, you know, it's probably too late 48 00:02:40,440 --> 00:02:43,960 Speaker 3: to get much of an allocation in some 49 00:02:44,080 --> 00:02:48,799 Speaker 3: of the bigger, the bigger models, like the OpenAIs, 50 00:02:48,960 --> 00:02:51,360 Speaker 3: like the big, big stuff. Like, that's almost now 51 00:02:51,720 --> 00:02:57,040 Speaker 3: plumbing or infrastructure for the whole new system. But there 52 00:02:57,040 --> 00:02:59,720 Speaker 3: are different layers and levels on top of that. There 53 00:02:59,400 --> 00:03:03,960 Speaker 3: are gonna be opportunities for... I'm not the person to 54 00:03:04,000 --> 00:03:06,000 Speaker 3: talk to about that, but everything I find, I'ma 55 00:03:06,080 --> 00:03:09,200 Speaker 3: send y'all, because, you know, you're kind of the megaphone 56 00:03:09,240 --> 00:03:13,040 Speaker 3: to start pointing people in this direction, and really nobody knows. 57 00:03:13,040 --> 00:03:14,480 Speaker 3: And it's a crazy thing. Even when I talk to, 58 00:03:14,480 --> 00:03:17,760 Speaker 3: you know, I talked to Ashton Kutcher, and, you know, 59 00:03:17,840 --> 00:03:20,079 Speaker 3: people see him as, like, some white Hollywood dude, 60 00:03:20,120 --> 00:03:22,240 Speaker 3: but he's actually been involved in tech for a long time, 61 00:03:23,080 --> 00:03:25,520 Speaker 3: and he actually cares about communities that don't have much, 62 00:03:25,560 --> 00:03:26,399 Speaker 3: cares about black folks. 63 00:03:26,560 --> 00:03:28,239 Speaker 4: He grew up like a poor white 64 00:03:28,040 --> 00:03:31,120 Speaker 3: dude eating government cheese, like, no joke. And so, 65 00:03:31,880 --> 00:03:35,040 Speaker 3: you know, he has a whole approach where he's 66 00:03:35,040 --> 00:03:37,800 Speaker 3: broken it down and he's seeing opportunities. He talks to 67 00:03:37,800 --> 00:03:39,560 Speaker 3: Will I Am a bunch. We're gonna have to create 68 00:03:39,600 --> 00:03:42,080 Speaker 3: a whole culture around, and a whole community around, how 69 00:03:42,080 --> 00:03:45,400 Speaker 3: we invest in this. But, you know, you guys are 70 00:03:45,400 --> 00:03:46,680 Speaker 3: gonna be smarter about that than I am. I'm just 71 00:03:46,720 --> 00:03:48,040 Speaker 3: gonna give you all the information I can find. 72 00:03:47,960 --> 00:03:50,040 Speaker 4: Smart man. Thank you. 73 00:03:51,720 --> 00:03:56,120 Speaker 5: So I'm interested.
As you said, you talked to 74 00:03:56,200 --> 00:03:58,400 Speaker 5: Will I Am, and it was a mind-blowing conversation. 75 00:03:58,520 --> 00:04:00,960 Speaker 5: You spoke to Ashton Kutcher. You're talking to people 76 00:04:01,000 --> 00:04:03,800 Speaker 5: outside the realm of politics, but inside the realm, right, 77 00:04:03,840 --> 00:04:06,720 Speaker 5: where, you know, 78 00:04:05,560 --> 00:04:08,720 Speaker 4: you built a career. I wonder what the temperature is 79 00:04:08,920 --> 00:04:09,760 Speaker 4: around AI. 80 00:04:09,880 --> 00:04:13,440 Speaker 5: Is it just from a regulation standpoint, from an informational standpoint, 81 00:04:13,680 --> 00:04:16,479 Speaker 5: from a revolutionary standpoint? How is it being viewed in 82 00:04:16,480 --> 00:04:20,919 Speaker 5: the political realm, or amongst people in DC and throughout the country? 83 00:04:21,480 --> 00:04:26,520 Speaker 3: I think they're mainly coming from a protective point of view, 84 00:04:26,520 --> 00:04:33,160 Speaker 3: in that there's a tension within the government over AI. On 85 00:04:33,200 --> 00:04:36,359 Speaker 3: the one hand, they want, you know, they're afraid that 86 00:04:36,440 --> 00:04:39,719 Speaker 3: this is going to be a Frankenstein technology that gets 87 00:04:39,720 --> 00:04:41,520 Speaker 3: out of hand and causes a bunch of problems. They 88 00:04:41,560 --> 00:04:44,560 Speaker 3: want to slow it down. At the same time, they 89 00:04:44,640 --> 00:04:48,839 Speaker 3: look over their shoulder at China, and China has an 90 00:04:48,880 --> 00:04:53,279 Speaker 3: advantage over us in that we've made these algorithms, but 91 00:04:53,440 --> 00:04:57,479 Speaker 3: for an AI, data is like oil, like data is 92 00:04:57,560 --> 00:05:00,440 Speaker 3: like gas. Data is what they need. So China is like 93 00:05:00,440 --> 00:05:03,800 Speaker 3: the Saudi Arabia of data, because they have a billion 94 00:05:03,839 --> 00:05:08,039 Speaker 3: two people and no privacy rights, so their 95 00:05:08,080 --> 00:05:10,720 Speaker 3: AIs can get faster and faster, better and better, 96 00:05:10,839 --> 00:05:11,800 Speaker 3: with no barriers. 97 00:05:12,120 --> 00:05:13,280 Speaker 4: So they're coming fast. 98 00:05:13,680 --> 00:05:18,240 Speaker 3: That scares our government, because if they win the AI war, 99 00:05:18,279 --> 00:05:20,679 Speaker 3: where their AIs are faster and smarter than ours, 100 00:05:20,720 --> 00:05:22,800 Speaker 3: they can win wars against us, they can do 101 00:05:22,800 --> 00:05:25,240 Speaker 3: a bunch of stuff. Yeah. And so, yeah, so this 102 00:05:25,400 --> 00:05:29,479 Speaker 3: tension between: do you protect Americans from a runaway 103 00:05:29,520 --> 00:05:33,200 Speaker 3: technology, or do you protect America from a runaway China? 104 00:05:33,760 --> 00:05:37,760 Speaker 3: That's the tension. And so for me, that's all above 105 00:05:37,800 --> 00:05:40,920 Speaker 3: my pay grade. That's grown folks' business. 106 00:05:41,520 --> 00:05:43,760 Speaker 3: I'm saying, regardless of what y'all do, I'm going to 107 00:05:43,839 --> 00:05:46,239 Speaker 3: make sure that black people are aware, informed, invested, 108 00:05:46,279 --> 00:05:49,040 Speaker 3: and empowered. And so, you know, I'm going to leave 109 00:05:49,080 --> 00:05:51,560 Speaker 3: it to some of the big... these bigger forces are 110 00:05:51,560 --> 00:05:54,560 Speaker 3: going to battle it out.
But no matter what, the 111 00:05:54,600 --> 00:05:58,560 Speaker 3: thing you know is that the capacities you need to 112 00:05:58,600 --> 00:06:02,080 Speaker 3: make your business go faster, the opportunities you have to 113 00:06:02,120 --> 00:06:05,839 Speaker 3: make money in a new trillion-dollar economy, are either here 114 00:06:05,960 --> 00:06:08,040 Speaker 3: or right around the corner. And there are going to be 115 00:06:08,080 --> 00:06:10,120 Speaker 3: people who are broke right now who are going to 116 00:06:10,160 --> 00:06:12,840 Speaker 3: be rich tomorrow because of AI. People who are rich 117 00:06:12,920 --> 00:06:15,560 Speaker 3: right now are gonna be broke tomorrow because of AI. And 118 00:06:16,320 --> 00:06:20,640 Speaker 3: our entrepreneur class? You know, I wouldn't bet against us. 119 00:06:20,920 --> 00:06:22,200 Speaker 3: I wouldn't bet against us. 120 00:06:22,800 --> 00:06:25,039 Speaker 6: Really quick, for context, can we say that this AI 121 00:06:25,120 --> 00:06:27,840 Speaker 6: boom is probably going to be one hundred or two 122 00:06:27,880 --> 00:06:30,840 Speaker 6: hundred times bigger than the dot-com boom of ninety 123 00:06:30,880 --> 00:06:33,680 Speaker 6: nine, two thousand? Easily. 124 00:06:33,920 --> 00:06:41,040 Speaker 3: Plus. Don't forget, there are three. AI is what everybody's paying 125 00:06:41,040 --> 00:06:44,360 Speaker 3: attention to, but there's two other technological revolutions coming right 126 00:06:44,400 --> 00:06:48,039 Speaker 3: behind it: biotech and quantum computing. 127 00:06:48,120 --> 00:06:49,960 Speaker 4: Quantum, it's going to be crazy. Yeah. 128 00:06:50,040 --> 00:06:54,000 Speaker 3: So now look, you take biotech, which means, you know, 129 00:06:54,279 --> 00:06:58,279 Speaker 3: you can basically start reprogramming humans at a biological level, 130 00:06:59,480 --> 00:07:02,360 Speaker 3: to where you have vaccines for cancers. I mean, you could 131 00:07:02,360 --> 00:07:05,800 Speaker 3: probably start literally creating Captain America super soldiers in about 132 00:07:05,839 --> 00:07:06,240 Speaker 3: ten years. 133 00:07:06,279 --> 00:07:07,719 Speaker 4: I mean, that's going to be crazy. 134 00:07:08,120 --> 00:07:12,440 Speaker 3: And then quantum computing means that almost everything you're doing 135 00:07:12,480 --> 00:07:14,360 Speaker 3: right now from a privacy point of view is going 136 00:07:14,440 --> 00:07:18,000 Speaker 3: to be null and void, because whereas it would take 137 00:07:18,000 --> 00:07:20,800 Speaker 3: a normal computer, make it up, one hundred years to 138 00:07:20,840 --> 00:07:23,040 Speaker 3: break your password, quantum 139 00:07:22,720 --> 00:07:24,320 Speaker 4: computing can do it in like two minutes. 140 00:07:24,880 --> 00:07:29,400 Speaker 3: So all security, all cybersecurity, is going to be reimagined. 141 00:07:29,880 --> 00:07:31,040 Speaker 4: And so there's opportunity there. 142 00:07:31,080 --> 00:07:37,120 Speaker 3: But you put the three together, AI, biotech, and quantum computing, 143 00:07:37,800 --> 00:07:42,000 Speaker 3: and you're talking about a completely different... I mean, it's 144 00:07:42,000 --> 00:07:44,800 Speaker 3: almost indescribable, the level of changes and the level of opportunities. 145 00:07:45,040 --> 00:07:47,680 Speaker 3: But what's going to happen is we're going to get scared. 146 00:07:48,240 --> 00:07:49,840 Speaker 3: Oh, this is going to happen, that's going to happen.
147 00:07:49,840 --> 00:07:51,680 Speaker 3: It's gonna discriminate against black people. You're going to 148 00:07:51,760 --> 00:07:54,239 Speaker 3: have racist algorithms. What's gonna happen to us? Blah blah blah. 149 00:07:54,240 --> 00:07:56,360 Speaker 3: What's going to happen to us is what we determine 150 00:07:56,640 --> 00:08:00,000 Speaker 3: is going to happen to us. And for us, white 151 00:08:00,080 --> 00:08:03,040 Speaker 3: folks are not in charge either. Nobody is. It's all new 152 00:08:03,120 --> 00:08:05,960 Speaker 3: to everybody. It's like everybody got the same Christmas package 153 00:08:06,000 --> 00:08:07,160 Speaker 3: and opened it up the same day. 154 00:08:07,880 --> 00:08:08,040 Speaker 4: You know. 155 00:08:08,320 --> 00:08:11,360 Speaker 3: Back when we had video games, everybody got the same video game, 156 00:08:11,400 --> 00:08:13,080 Speaker 3: and whoever was, like, playing the hardest over the 157 00:08:13,120 --> 00:08:16,000 Speaker 3: weekend was good. When the video game first arrives, nobody's 158 00:08:16,000 --> 00:08:19,400 Speaker 3: any good. This stuff is just arriving. Nobody's any good. 159 00:08:19,480 --> 00:08:20,840 Speaker 3: You have a, you have a few people right now 160 00:08:20,880 --> 00:08:23,000 Speaker 3: who are ahead of us, not that many. And so, 161 00:08:23,400 --> 00:08:25,680 Speaker 3: but I just want, I mean, the reason I'm so 162 00:08:25,680 --> 00:08:30,400 Speaker 3: excited about InvestFest is, until InvestFest, there was 163 00:08:30,480 --> 00:08:33,920 Speaker 3: never a place with enough of us, large enough, to 164 00:08:34,000 --> 00:08:36,480 Speaker 3: even make this conversation relevant. I mean, that's why Will 165 00:08:36,480 --> 00:08:39,120 Speaker 3: I Am is going. That's why I'm going. Because, you know, 166 00:08:39,480 --> 00:08:42,520 Speaker 3: there's an opportunity, if we all, if whatever, twenty thousand, 167 00:08:42,480 --> 00:08:44,440 Speaker 3: thirty thousand, many thousand people come to InvestFest, 168 00:08:44,440 --> 00:08:47,719 Speaker 3: if everybody leaves with that same mission, that this is 169 00:08:47,720 --> 00:08:52,000 Speaker 3: what we're gonna at least be focused on, we will 170 00:08:52,080 --> 00:08:53,360 Speaker 3: leap forward. First of all, 171 00:08:53,360 --> 00:08:55,280 Speaker 3: if you don't come to InvestFest, something's wrong with you. 172 00:08:55,280 --> 00:08:58,440 Speaker 3: You're not serious. Something's wrong with you, and it's serious. 173 00:08:58,480 --> 00:09:00,240 Speaker 3: I mean, you're not, you're not serious, whatever you said. 174 00:09:00,240 --> 00:09:02,640 Speaker 3: "Oh, I have all these dreams, a vision board." Take 175 00:09:02,679 --> 00:09:03,800 Speaker 3: the vision board down. 176 00:09:05,760 --> 00:09:06,880 Speaker 6: People change in the world. 177 00:09:07,240 --> 00:09:10,440 Speaker 4: Yeah. Context, can we be honest? 178 00:09:10,679 --> 00:09:14,800 Speaker 6: Sam Altman had Y Combinator, which was like an InvestFest 179 00:09:14,800 --> 00:09:18,520 Speaker 6: of startups, thank you, took all of the best ideas, 180 00:09:18,920 --> 00:09:22,320 Speaker 6: and now probably has the most important startup in human history. 181 00:09:23,520 --> 00:09:25,000 Speaker 4: I mean, not probably, definitely has it. 182 00:09:25,160 --> 00:09:28,000 Speaker 3: And, you know, and it has a whole network of people. 183 00:09:28,520 --> 00:09:32,760 Speaker 3: I mean, and it all comes down to individual people.
184 00:09:32,760 --> 00:09:35,679 Speaker 3: In fact, we have somebody on our team who used 185 00:09:35,679 --> 00:09:39,960 Speaker 3: to work with Sam Altman, a sister, uh, you know, Dom. 186 00:09:40,200 --> 00:09:40,480 Speaker 4: You know. 187 00:09:40,840 --> 00:09:44,720 Speaker 3: She's like, what are y'all doing? 188 00:09:47,559 --> 00:09:48,880 Speaker 4: What are y'all doing? 189 00:09:49,120 --> 00:09:51,320 Speaker 3: Like, this is not going to work at all. And 190 00:09:51,320 --> 00:09:53,480 Speaker 3: we're like, look, we're doing the best we can. I'm 191 00:09:53,520 --> 00:09:55,559 Speaker 3: on TV and stuff. I still come out the grassroots, 192 00:09:55,559 --> 00:09:57,880 Speaker 3: and so a lot of stuff I'm still learning about. 193 00:09:57,960 --> 00:10:01,200 Speaker 3: She's just like... I'm glad she left, like, you know, Sam's 194 00:10:01,200 --> 00:10:04,439 Speaker 3: team to come and help our team. But it is, 195 00:10:04,640 --> 00:10:07,160 Speaker 3: it's just going to be, if you, like... but I'm 196 00:10:07,160 --> 00:10:10,640 Speaker 3: going to tell you this: if you're serious about changing 197 00:10:10,679 --> 00:10:12,920 Speaker 3: your life, if you're serious about changing your company, if 198 00:10:12,920 --> 00:10:15,360 Speaker 3: you're serious about changing your community, your family, and you're 199 00:10:15,360 --> 00:10:18,800 Speaker 3: not at InvestFest, don't call me. Shut up. Because 200 00:10:19,160 --> 00:10:22,000 Speaker 3: there's no other place you're going to find the people 201 00:10:22,080 --> 00:10:24,240 Speaker 3: who can help you do it, the conversation that's going to 202 00:10:24,280 --> 00:10:25,920 Speaker 3: help you do it. The access and stuff that we're 203 00:10:25,920 --> 00:10:28,880 Speaker 3: talking about, it just doesn't exist anywhere else. And so 204 00:10:28,960 --> 00:10:30,840 Speaker 3: that's why, that's why we're so excited about being a 205 00:10:30,880 --> 00:10:33,160 Speaker 3: part of it, because otherwise we're in a cul-de- 206 00:10:33,240 --> 00:10:34,840 Speaker 3: sac, you know, it's just us and our little three 207 00:10:34,920 --> 00:10:36,400 Speaker 3: or four hundred people. Now we can come and be 208 00:10:36,440 --> 00:10:39,680 Speaker 3: a part of thousands, thousands of people. I mean this: 209 00:10:40,040 --> 00:10:43,880 Speaker 3: do not feel discouraged about the political situation or anything else. 210 00:10:44,240 --> 00:10:47,920 Speaker 3: Go where the hope is. This, this revolution, AI, that's 211 00:10:47,920 --> 00:10:49,640 Speaker 3: where the hope is. 212 00:10:49,679 --> 00:10:50,680 Speaker 4: There you have it, man. 213 00:10:50,760 --> 00:10:53,839 Speaker 2: I appreciate your time, appreciate you, brother. Thank you so much 214 00:10:54,760 --> 00:10:57,480 Speaker 2: for InvestFest, not only coming, but, like I said, 215 00:10:57,520 --> 00:11:01,480 Speaker 2: bringing your whole, you know, curriculum and... 216 00:11:01,440 --> 00:11:03,040 Speaker 4: Crew, everybody's going. 217 00:11:03,400 --> 00:11:05,319 Speaker 2: That's what I want people to fully understand too. It's 218 00:11:05,360 --> 00:11:08,640 Speaker 2: not just, like... Will I Am literally is coming from Europe. 219 00:11:09,240 --> 00:11:12,120 Speaker 2: He's on tour with the Black Eyed Peas. He's changed his 220 00:11:12,160 --> 00:11:14,760 Speaker 2: whole tour schedule to come from Europe, to go to Atlanta, 221 00:11:14,840 --> 00:11:17,200 Speaker 2: to go back to Europe.
Like, these are people that 222 00:11:17,360 --> 00:11:20,839 Speaker 2: don't have to speak. Will I Am doesn't have to. 223 00:11:21,160 --> 00:11:23,240 Speaker 2: Ninety percent of people don't even know that Will I 224 00:11:23,320 --> 00:11:26,280 Speaker 2: Am is even in technology. They just think of the 225 00:11:26,360 --> 00:11:29,560 Speaker 2: Black Eyed Peas and that he's a producer. If I told 226 00:11:29,600 --> 00:11:31,000 Speaker 2: you some of the stuff that Will I Am is 227 00:11:31,400 --> 00:11:35,360 Speaker 2: into, like with Mercedes, he could literally never have to 228 00:11:35,360 --> 00:11:37,319 Speaker 2: speak about anything for the rest of his life and 229 00:11:37,360 --> 00:11:40,400 Speaker 2: he'll be good. So it's like, this shows you they're 230 00:11:40,440 --> 00:11:44,600 Speaker 2: actually trying to help, right? They're trying to help, rearranging 231 00:11:44,600 --> 00:11:48,719 Speaker 2: their schedule, bringing the team, having programming. It's bigger than 232 00:11:48,800 --> 00:11:52,920 Speaker 2: just celebrities coming. This is actually very important information. 233 00:11:53,480 --> 00:11:56,640 Speaker 2: So A, you should go, but B, if you do go, 234 00:11:57,760 --> 00:12:00,640 Speaker 2: you should go to every panel. You have options, but 235 00:12:01,000 --> 00:12:05,240 Speaker 2: the artificial intelligence ones are mandatory. I would 236 00:12:05,400 --> 00:12:08,760 Speaker 2: definitely not miss any of those panels, from the Friday 237 00:12:08,880 --> 00:12:11,880 Speaker 2: all the way to Sunday. Make sure you're there, whether 238 00:12:11,880 --> 00:12:15,480 Speaker 2: it's the marketplace, a stage, whatever it is. 239 00:12:16,679 --> 00:12:19,560 Speaker 2: If you're serious about taking your life to the next level, 240 00:12:19,600 --> 00:12:22,839 Speaker 2: not just your career, your life, because this artificial intelligence stuff 241 00:12:22,880 --> 00:12:23,959 Speaker 2: is going to change life. 242 00:12:24,120 --> 00:12:27,800 Speaker 7: An illegal alien from Guatemala charged with raping a child 243 00:12:27,840 --> 00:12:31,640 Speaker 7: in Massachusetts. An MS thirteen gang member from El Salvador 244 00:12:31,880 --> 00:12:36,000 Speaker 7: accused of murdering a Texas man. A Venezuelan charged with 245 00:12:36,080 --> 00:12:39,960 Speaker 7: filming and selling child pornography in Michigan. These are just 246 00:12:40,080 --> 00:12:43,840 Speaker 7: some of the heinous migrant criminals caught because of President 247 00:12:43,880 --> 00:12:47,400 Speaker 7: Donald J. Trump's leadership. I'm Kristi Noem, the United States 248 00:12:47,480 --> 00:12:52,280 Speaker 7: Secretary of Homeland Security. Under President Trump, attempted illegal border 249 00:12:52,280 --> 00:12:55,840 Speaker 7: crossings are at the lowest levels ever recorded, and over 250 00:12:55,920 --> 00:12:59,120 Speaker 7: one hundred thousand illegal aliens have been arrested. If you 251 00:12:59,200 --> 00:13:03,040 Speaker 7: are here illegally, you're next. You will be fined nearly 252 00:13:03,120 --> 00:13:07,160 Speaker 7: one thousand dollars a day, imprisoned, and deported. You will 253 00:13:07,200 --> 00:13:10,840 Speaker 7: never return. But if you register using our CBP Home 254 00:13:10,880 --> 00:13:14,240 Speaker 7: app and leave now, you could be allowed to return legally. 255 00:13:14,640 --> 00:13:19,360 Speaker 7: Do what's right. Leave now.
Under President Trump, America's laws, 256 00:13:19,520 --> 00:13:21,960 Speaker 7: border, and families will be protected. 257 00:13:22,080 --> 00:13:24,200 Speaker 4: Sponsored by the United States Department of Homeland Security.