1 00:00:00,040 --> 00:00:02,719 Speaker 1: Welcome to Money Making Conversation Masterclass. This is your host, 2 00:00:03,200 --> 00:00:07,280 Speaker 1: Rashaan McDonald, the host of this weekly Money Making Conversation Masterclass show. 3 00:00:07,720 --> 00:00:10,920 Speaker 1: The interviews and information that this show provides are for everyone. 4 00:00:11,240 --> 00:00:14,120 Speaker 1: It's time to stop reading other people's success stories and 5 00:00:14,200 --> 00:00:16,720 Speaker 1: start living your own. If you want to be a 6 00:00:16,760 --> 00:00:20,919 Speaker 1: guest on my show, please visit our website, Moneymaking Conversations 7 00:00:21,400 --> 00:00:24,439 Speaker 1: dot com and click the Be a Guest button. Let's 8 00:00:24,480 --> 00:00:27,560 Speaker 1: get this show started with my guest. My guest founded 9 00:00:27,600 --> 00:00:32,479 Speaker 1: a company that helps churches and nonprofit organizations specifically to 10 00:00:32,640 --> 00:00:38,040 Speaker 1: understand technology and help them with artificial intelligence technology to 11 00:00:38,120 --> 00:00:41,480 Speaker 1: help them spread the gospel. Please welcome to Money Making 12 00:00:41,479 --> 00:00:44,879 Speaker 1: Conversation Masterclass, Gregory Richardson. How you doing, Gregory? 13 00:00:45,960 --> 00:00:48,879 Speaker 2: Fantastic, Rashaan, thank you for having me on your program. 14 00:00:48,920 --> 00:00:49,760 Speaker 2: Can you hear me well? 15 00:00:49,800 --> 00:00:50,239 Speaker 3: Pretty good. 16 00:00:50,360 --> 00:00:50,600 Speaker 2: Can you? 17 00:00:50,800 --> 00:00:52,600 Speaker 3: Where are you based at, Gregory? Where you based? 18 00:00:52,800 --> 00:00:53,440 Speaker 3: Where are you based? 19 00:00:53,840 --> 00:00:56,600 Speaker 2: I am... I live in Dallas, Texas, and I am 20 00:00:56,720 --> 00:00:59,080 Speaker 2: right now in my office in Dallas, Texas. But I 21 00:00:59,360 --> 00:01:01,520 Speaker 2: move around a lot.
So but right now you caught 22 00:01:01,560 --> 00:01:01,960 Speaker 2: me at home. 23 00:01:02,160 --> 00:01:04,720 Speaker 1: We're good, we're good. Well, I know I just mentioned that. 24 00:01:04,880 --> 00:01:09,080 Speaker 1: Let's get a little background on you. Is working in AI? 25 00:01:09,240 --> 00:01:09,399 Speaker 2: Is that? 26 00:01:09,640 --> 00:01:11,360 Speaker 3: Is that a full time role for you? 27 00:01:11,800 --> 00:01:14,120 Speaker 1: Or do you do other things? Because you seem like a 28 00:01:14,160 --> 00:01:15,560 Speaker 1: man who multitasks. 29 00:01:17,440 --> 00:01:22,160 Speaker 2: Huh, you spit a mouthful at me there. Let me 30 00:01:22,240 --> 00:01:25,199 Speaker 2: start at the end and say, 31 00:01:25,319 --> 00:01:31,720 Speaker 2: I firmly disagree with the notion of multitasking. In my 32 00:01:32,600 --> 00:01:35,800 Speaker 2: fairly limited experience... I'm only... I'm approaching sixty 33 00:01:35,880 --> 00:01:38,760 Speaker 2: years old, so you know, I'm not a million years 34 00:01:38,760 --> 00:01:41,640 Speaker 2: old or anything like that. But in my experience, when 35 00:01:41,720 --> 00:01:44,240 Speaker 2: I try to multitask, or when employees of mine try 36 00:01:44,240 --> 00:01:47,000 Speaker 2: to multitask, what they end up doing is not doing 37 00:01:47,040 --> 00:01:51,760 Speaker 2: any of the tasks really well. Humans are fundamentally designed 38 00:01:52,080 --> 00:01:56,440 Speaker 2: to single-purpose task. Obviously, you can chew and walk, 39 00:01:56,560 --> 00:01:58,480 Speaker 2: you know, chew gum and walk at the same time, 40 00:01:58,720 --> 00:02:02,120 Speaker 2: but I don't think your brain functions well doing two 41 00:02:02,160 --> 00:02:05,360 Speaker 2: separate tasks.
The context switching and other things that need 42 00:02:05,400 --> 00:02:08,040 Speaker 2: to happen when you're flipping and flopping between tasks end 43 00:02:08,160 --> 00:02:11,280 Speaker 2: up making you more inefficient. So that's one part of it. 44 00:02:11,360 --> 00:02:12,960 Speaker 2: I just like to always... 45 00:02:13,040 --> 00:02:15,880 Speaker 1: I like that, because I always portray myself 46 00:02:15,880 --> 00:02:17,040 Speaker 1: as a multitasker. 47 00:02:17,160 --> 00:02:17,760 Speaker 3: I really do. 48 00:02:17,960 --> 00:02:19,760 Speaker 1: That means I'm doing a lot of things, not saying 49 00:02:19,760 --> 00:02:23,360 Speaker 1: I'm doing them simultaneously. That means that I got a 50 00:02:23,360 --> 00:02:26,000 Speaker 1: couple of things happening. I got a couple of gigs, 51 00:02:26,000 --> 00:02:28,079 Speaker 1: I got a couple of deals. I got a couple 52 00:02:28,120 --> 00:02:31,679 Speaker 1: of things that my fire's burning. That don't mean they're on fire 53 00:02:31,720 --> 00:02:34,680 Speaker 1: at the same time. That means I'm motivated 54 00:02:34,680 --> 00:02:37,360 Speaker 1: to get these actions done. I will tell you this, Gregory, 55 00:02:37,560 --> 00:02:39,840 Speaker 1: I hear what you're saying. If you're trying to do 56 00:02:39,919 --> 00:02:45,040 Speaker 1: something, anybody, if a computer is trying to calculate something simultaneously, 57 00:02:45,440 --> 00:02:48,600 Speaker 1: or you're trying to load something on Wi-Fi simultaneously, 58 00:02:48,720 --> 00:02:52,440 Speaker 1: it's gonna drag. So you're absolutely correct. When you're trying 59 00:02:52,440 --> 00:02:57,040 Speaker 1: to multitask simultaneously, it will not be effective. 60 00:02:58,720 --> 00:03:01,799 Speaker 2: Oh, facts. Now, to get to the core of what 61 00:03:01,880 --> 00:03:04,480 Speaker 2: I believe you're asking, I'm confident you're asking, you 62 00:03:04,520 --> 00:03:08,160 Speaker 2: were spot on.
You detect that I am a multi-entrepreneur, 63 00:03:08,160 --> 00:03:12,280 Speaker 2: a serial entrepreneur. Over my life, I've run, owned, 64 00:03:12,320 --> 00:03:18,760 Speaker 2: operated, started multiple companies, from car rentals to restaurants to 65 00:03:19,160 --> 00:03:24,160 Speaker 2: you name it, to consulting companies, software companies, real estate companies. 66 00:03:24,280 --> 00:03:29,080 Speaker 2: Like, my wife and I, we do entrepreneurship really well. 67 00:03:29,320 --> 00:03:32,840 Speaker 2: In addition to that, I one hundred percent do have 68 00:03:32,880 --> 00:03:35,920 Speaker 2: a full time job also. So I am what you 69 00:03:36,000 --> 00:03:39,560 Speaker 2: call an ethical hacker, and for maybe the last forty 70 00:03:39,640 --> 00:03:44,960 Speaker 2: years or so, my career has been cybersecurity. How do 71 00:03:45,040 --> 00:03:53,520 Speaker 2: I help large, typically large organizations, the world's biggest banks, governments, companies, GM, Ford, 72 00:03:53,680 --> 00:03:57,440 Speaker 2: you name it, Nissan. I was recently in Tokyo with 73 00:03:57,560 --> 00:04:01,119 Speaker 2: Nissan helping them solve some of their cybersecurity problems. 74 00:04:01,600 --> 00:04:06,080 Speaker 2: That's my day job. Now, how that connects to AI... 75 00:04:06,440 --> 00:04:07,840 Speaker 3: Okay, before we go there, let's go back. 76 00:04:08,000 --> 00:04:11,200 Speaker 1: Let's go back to this cybersecurity, because that's huge. You know, 77 00:04:11,600 --> 00:04:14,840 Speaker 1: we have people doing cyber ransom. You know, you have 78 00:04:14,920 --> 00:04:19,360 Speaker 1: banks, hospitals, governments being held hostage by people. 79 00:04:19,680 --> 00:04:22,000 Speaker 1: Talk us through that whole process.
When you're talking 80 00:04:22,000 --> 00:04:26,120 Speaker 1: about cybersecurity and we're talking about people who still use one, two, three, 81 00:04:26,160 --> 00:04:27,159 Speaker 1: four as their password. 82 00:04:28,360 --> 00:04:33,919 Speaker 2: Oh yeah, there's so much there. One of the 83 00:04:33,960 --> 00:04:36,440 Speaker 2: things I like to kind of frame the conversation 84 00:04:36,600 --> 00:04:43,120 Speaker 2: with is cybersecurity because of the volume, the sheer magnitude 85 00:04:43,400 --> 00:04:46,320 Speaker 2: of the number of threats that we've been seeing, and 86 00:04:46,360 --> 00:04:50,360 Speaker 2: it's been growing like crazy. It's literally billions of threats 87 00:04:50,720 --> 00:04:56,480 Speaker 2: per hour, so it's, you know, it's more than huge. 88 00:04:56,760 --> 00:05:00,000 Speaker 2: Maybe for the last fifteen, twenty, thirty years, it's been 89 00:05:00,200 --> 00:05:03,359 Speaker 2: more than you could hire humans to fix, right? So 90 00:05:03,520 --> 00:05:07,200 Speaker 2: that's why we started leveraging AI to help us do 91 00:05:07,360 --> 00:05:11,200 Speaker 2: things like prediction and classification, and quickly looking at a 92 00:05:11,240 --> 00:05:13,440 Speaker 2: million files and saying which of these are bad files, 93 00:05:13,440 --> 00:05:15,920 Speaker 2: which of them look like the bad files 94 00:05:16,000 --> 00:05:18,920 Speaker 2: from yesterday that were viruses or malware or ransomware, et cetera, 95 00:05:18,960 --> 00:05:21,440 Speaker 2: et cetera. So that's a big part of it. The 96 00:05:21,520 --> 00:05:24,480 Speaker 2: other part of what we do in cybersecurity is called 97 00:05:24,560 --> 00:05:28,799 Speaker 2: security operations.
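The fast triage Gregory describes, checking a pile of files against what was known-bad yesterday, can be sketched in miniature with cryptographic hashes. This is a hedged illustration, not how any particular vendor's product works; real systems layer machine-learning classifiers and fuzzy matching on top of exact matching, and the file names and known-bad set below are made up:

```python
import hashlib

# Hypothetical set of SHA-256 digests of files already known to be malware.
KNOWN_BAD = {
    hashlib.sha256(b"ransomware-sample").hexdigest(),
}

def triage(files):
    """Return names of files whose contents exactly match a known-bad digest."""
    return [name for name, data in files.items()
            if hashlib.sha256(data).hexdigest() in KNOWN_BAD]

files = {"invoice.pdf": b"harmless", "update.exe": b"ransomware-sample"}
print(triage(files))  # prints ['update.exe']
```

Exact hash matching only catches byte-identical copies, which is exactly why the "look like yesterday's bad files" part needs prediction and classification rather than lookup.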
That's where you're engaging with the users and 98 00:05:29,000 --> 00:05:31,520 Speaker 2: working against things like you just mentioned: hey, you can't 99 00:05:31,560 --> 00:05:34,560 Speaker 2: set your password to one, two, three, four. 100 00:05:34,760 --> 00:05:37,640 Speaker 2: Most people get that by now. Even bigger: you don't 101 00:05:37,680 --> 00:05:40,719 Speaker 2: want to reuse the same password on your Gmail, on 102 00:05:40,760 --> 00:05:43,000 Speaker 2: your Facebook, on your Instagram, on your this, on your that, 103 00:05:43,080 --> 00:05:47,800 Speaker 2: on your banking. You really, ideally, should have separate 104 00:05:47,920 --> 00:05:51,600 Speaker 2: passwords for every single public facing service that you connect to. 105 00:05:52,040 --> 00:05:55,920 Speaker 2: And from my perspective, what I recommend to customers 106 00:05:55,960 --> 00:05:59,400 Speaker 2: and employees and whoever else: I don't even use passwords anymore. 107 00:05:59,480 --> 00:06:05,000 Speaker 2: I'm either doing biometrics, fingerprint, hand, facial recognition, or, 108 00:06:05,200 --> 00:06:07,600 Speaker 2: if I do have to type something in, I'm using 109 00:06:07,640 --> 00:06:09,880 Speaker 2: what we call a passphrase. So I'll give you a 110 00:06:09,920 --> 00:06:13,880 Speaker 2: real world example of that. Look, when computers started, you know, 111 00:06:14,640 --> 00:06:17,520 Speaker 2: or the internet connectivity started, a password could be eight 112 00:06:17,560 --> 00:06:20,520 Speaker 2: to ten characters long. There ain't a system in the world, 113 00:06:20,600 --> 00:06:24,240 Speaker 2: a reasonably modern system, where you have that constraint. So therefore, 114 00:06:24,440 --> 00:06:28,080 Speaker 2: why are we trying to squeeze a password into a 115 00:06:28,080 --> 00:06:30,600 Speaker 2: couple of characters? I'll give you a real world example.
116 00:06:30,920 --> 00:06:34,000 Speaker 2: My password on one of my social medias right now, 117 00:06:34,080 --> 00:06:38,040 Speaker 2: good luck trying to hack it, though, is Philippians, space, 118 00:06:38,360 --> 00:06:45,520 Speaker 2: two, colon, three, comma, consider others more important than yourselves, period. 119 00:06:46,279 --> 00:06:49,839 Speaker 2: That's literally a scripture quote with the comma, with the period, 120 00:06:49,880 --> 00:06:52,320 Speaker 2: with the spaces, with the capital letters. Philippians. 121 00:06:52,720 --> 00:06:55,440 Speaker 3: Password number, password. 122 00:06:56,400 --> 00:06:59,719 Speaker 2: That's one of my probably five hundred passwords that 123 00:06:59,720 --> 00:07:02,680 Speaker 2: I have. That's why I said I'm not afraid to 124 00:07:02,720 --> 00:07:05,440 Speaker 2: put it out there, because I also change my passwords constantly. 125 00:07:06,000 --> 00:07:06,960 Speaker 2: But yeah... 126 00:07:06,920 --> 00:07:08,400 Speaker 3: Let me ask you this. Let me ask you this, Greg, because 127 00:07:08,400 --> 00:07:08,960 Speaker 3: I love talking to you. 128 00:07:09,040 --> 00:07:11,640 Speaker 1: I love your energy, because I've never talked to a 129 00:07:11,680 --> 00:07:14,240 Speaker 1: guest like you that had the knowledge, because what we're 130 00:07:14,240 --> 00:07:17,800 Speaker 1: talking about is, you know, the simplicity, and how people 131 00:07:17,880 --> 00:07:20,800 Speaker 1: kind of really disrespect the process, and then when they 132 00:07:20,800 --> 00:07:24,480 Speaker 1: get scammed, then they don't know why. Now the two 133 00:07:24,600 --> 00:07:27,080 Speaker 1: factor, that's the big thing. I use two factor a lot. 134 00:07:27,240 --> 00:07:29,640 Speaker 1: What are your thoughts on two factor? Because I do 135 00:07:29,800 --> 00:07:32,680 Speaker 1: use passwords. I do have a password vault.
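Gregory's point about passphrase length beating password complexity can be made concrete with a back-of-the-envelope entropy calculation. A sketch, assuming uniformly random characters (a quoted scripture verse is far from random, so its effective strength is lower than this, which is one reason he also rotates passwords constantly):

```python
import math

def entropy_bits(alphabet_size, length):
    """Bits of entropy of a uniformly random string: length * log2(alphabet)."""
    return length * math.log2(alphabet_size)

# An old-style 8-character password over ~95 printable ASCII characters
short = entropy_bits(95, 8)    # ~53 bits: within reach of offline cracking rigs
# A ~50-character passphrase over the same alphabet
long_ = entropy_bits(95, 50)   # ~328 bits: far beyond any brute force

print(f"{short:.0f} bits vs {long_:.0f} bits")  # prints 53 bits vs 328 bits
```

Each extra character multiplies the attacker's search space by the alphabet size, which is why a long, memorable phrase with punctuation and capitals outperforms a short "complex" password.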
Because I 136 00:07:32,680 --> 00:07:35,440 Speaker 1: do have a ton of passwords, and I do change 137 00:07:35,600 --> 00:07:38,960 Speaker 1: every password for every account. The value of the password 138 00:07:39,040 --> 00:07:41,360 Speaker 1: vault allows me to do that. What do you say 139 00:07:41,360 --> 00:07:45,560 Speaker 1: when people, or these systems, recommend you use two factor, 140 00:07:46,080 --> 00:07:47,680 Speaker 1: you know, authentication? 141 00:07:48,360 --> 00:07:52,120 Speaker 2: One hundred percent agree with that. And obviously 142 00:07:52,120 --> 00:07:55,840 Speaker 2: we've just met, Rashaan, but you're absolutely doing 143 00:07:55,880 --> 00:07:58,800 Speaker 2: the things you need to do. A password vault, obviously 144 00:07:58,840 --> 00:08:00,480 Speaker 2: you want to make sure it's a password vault from 145 00:08:00,520 --> 00:08:04,320 Speaker 2: a very reputable company. I personally use a Google password vault. 146 00:08:04,400 --> 00:08:06,400 Speaker 2: I also use Google Authenticator. Or you could use a 147 00:08:06,440 --> 00:08:09,160 Speaker 2: Microsoft password vault, maybe that's a little bit less secure, 148 00:08:09,200 --> 00:08:13,040 Speaker 2: but use a password vault from a very reputable company, 149 00:08:13,280 --> 00:08:16,080 Speaker 2: not some little Mickey Mouse startup, so that if something goes wrong, 150 00:08:16,080 --> 00:08:18,040 Speaker 2: you could at least sue them and get, you know, 151 00:08:18,200 --> 00:08:20,640 Speaker 2: tens of thousands of dollars from them. That's step one, 152 00:08:20,640 --> 00:08:24,160 Speaker 2: and you're doing that. The 153 00:08:24,200 --> 00:08:27,240 Speaker 2: password vault allows you to use multiple passwords without having 154 00:08:27,280 --> 00:08:28,920 Speaker 2: to remember, or write down on a Post-it note, 155 00:08:28,920 --> 00:08:31,800 Speaker 2: what the password is, so excellent.
Step two, 156 00:08:32,160 --> 00:08:37,520 Speaker 2: you unequivocally want to use two factor authentication. Well, 157 00:08:37,520 --> 00:08:41,240 Speaker 2: two factor authentication is a second factor. So the 158 00:08:41,280 --> 00:08:44,120 Speaker 2: first factor could be your fingerprint or your password, or 159 00:08:44,160 --> 00:08:47,920 Speaker 2: your facial recognition or whatever. The second factor is it 160 00:08:48,160 --> 00:08:51,720 Speaker 2: sends something to something you have. So the first factor is 161 00:08:51,720 --> 00:08:54,360 Speaker 2: something you know or something you are: a fingerprint, 162 00:08:54,360 --> 00:08:57,160 Speaker 2: something you are, or a password, something you know. The second factor could be 163 00:08:57,160 --> 00:08:59,760 Speaker 2: something you have: your cell phone. So it'll send a 164 00:08:59,760 --> 00:09:02,280 Speaker 2: PIN to your cell phone saying, hey, Greg, if 165 00:09:02,320 --> 00:09:05,320 Speaker 2: you're really Greg, you're not only supposed to know the password, 166 00:09:05,360 --> 00:09:07,040 Speaker 2: but you should have your cell phone with you as well. 167 00:09:07,200 --> 00:09:08,839 Speaker 2: I'ma pop a code up on your cell phone, 168 00:09:08,880 --> 00:09:11,120 Speaker 2: enter that here, and then I'll know that 169 00:09:11,160 --> 00:09:15,600 Speaker 2: it's you. That's one example of a two factor. Absolutely 170 00:09:15,720 --> 00:09:19,720 Speaker 2: recommend that. My Facebook, for example, gets bombarded: 171 00:09:21,080 --> 00:09:22,920 Speaker 2: I guess you were trying to log on, Greg, or, 172 00:09:22,960 --> 00:09:25,040 Speaker 2: you know, it looks like you lost your password. I 173 00:09:25,080 --> 00:09:28,040 Speaker 2: don't even bother with them anymore, because I know I 174 00:09:28,040 --> 00:09:31,720 Speaker 2: have two factor turned on.
So even if someone manages 175 00:09:31,760 --> 00:09:34,719 Speaker 2: to somehow guess or finagle my password, two factor is 176 00:09:34,720 --> 00:09:35,880 Speaker 2: going to stop them from getting in. 177 00:09:36,080 --> 00:09:42,360 Speaker 1: Greg, because you just hit home with these Bank of 178 00:09:42,440 --> 00:09:47,280 Speaker 1: America type emails, or these Facebook emails, or these American 179 00:09:47,320 --> 00:09:49,840 Speaker 1: Express, that's what I get. They look like American Express, you 180 00:09:49,880 --> 00:09:52,319 Speaker 1: know exactly. You know, they look like a Bank of America. 181 00:09:52,360 --> 00:09:55,200 Speaker 1: They look like the Facebook, and the 182 00:09:55,280 --> 00:09:59,079 Speaker 1: message talks about you violated a copyright: if you 183 00:09:59,080 --> 00:10:01,800 Speaker 1: don't click this link, we're going to take your site down. 184 00:10:02,400 --> 00:10:08,240 Speaker 1: How does one, you know, not be misled 185 00:10:09,600 --> 00:10:14,040 Speaker 1: by this fear of clicking the wrong thing? Because they 186 00:10:14,080 --> 00:10:17,959 Speaker 1: always said the elderly community is victimized the most. 187 00:10:18,280 --> 00:10:21,079 Speaker 1: But talk to us about avoiding those types of scams. 188 00:10:23,640 --> 00:10:26,760 Speaker 2: Avoiding them is practically impossible. And I'll actually take a 189 00:10:26,760 --> 00:10:30,080 Speaker 2: step back, if I may. What has caused, or what 190 00:10:30,360 --> 00:10:34,959 Speaker 2: is causing, that drastic uptick that we've seen for the 191 00:10:35,040 --> 00:10:38,400 Speaker 2: last five, ten, fifteen years in those types of scams?
192 00:10:38,800 --> 00:10:44,400 Speaker 2: It is simple opportunism. In around twenty twelve... the FBI, 193 00:10:44,720 --> 00:10:47,320 Speaker 2: they release reports every year on the state of, you know, 194 00:10:47,400 --> 00:10:50,000 Speaker 2: crime and that kind of stuff. Somewhere around twenty ten, 195 00:10:50,040 --> 00:10:52,360 Speaker 2: eleven, twelve, I don't remember exactly when, for the 196 00:10:52,400 --> 00:10:55,240 Speaker 2: first time the FBI released a report called the State 197 00:10:55,280 --> 00:10:58,440 Speaker 2: of Cyber Crime Security. It wasn't the first time they 198 00:10:58,520 --> 00:11:00,439 Speaker 2: released the report, but what I'm about to say is the 199 00:11:00,480 --> 00:11:03,760 Speaker 2: first time they stated it. And in that report, somewhere 200 00:11:03,800 --> 00:11:07,199 Speaker 2: around twenty ten-ish, they stated for the first time 201 00:11:07,640 --> 00:11:13,560 Speaker 2: that revenue from criminal organizations that do ransomware and viruses 202 00:11:13,640 --> 00:11:18,640 Speaker 2: and spam and scams via the Internet surpassed, for the 203 00:11:18,720 --> 00:11:23,640 Speaker 2: first time in history, all revenue from all cocaine sales, 204 00:11:23,840 --> 00:11:29,440 Speaker 2: heroin sales, and marijuana sales globally. So when criminal organizations 205 00:11:29,520 --> 00:11:32,680 Speaker 2: start noticing, hey, I don't need to buy a submarine 206 00:11:32,679 --> 00:11:34,640 Speaker 2: and smuggle drugs and this and that and the other, 207 00:11:35,000 --> 00:11:37,319 Speaker 2: or I could keep doing that, but I can also 208 00:11:37,520 --> 00:11:40,040 Speaker 2: generate a lot of money over here by doing these 209 00:11:40,040 --> 00:11:43,959 Speaker 2: computer things, that is what triggers the attacks that we're 210 00:11:43,960 --> 00:11:49,080 Speaker 2: seeing now.
Criminals are notoriously opportunistic, and you have to 211 00:11:49,200 --> 00:11:51,680 Speaker 2: look at every email you get. And this answers your 212 00:11:51,760 --> 00:11:55,960 Speaker 2: question directly: ask, is someone trying to get something that's 213 00:11:56,040 --> 00:12:00,079 Speaker 2: mine? Beyond money, you also want to be cautious. Nowadays, the 214 00:12:00,080 --> 00:12:05,440 Speaker 2: biggest target is PII. It's called personally identifiable information. That's 215 00:12:05,480 --> 00:12:08,680 Speaker 2: the biggest target. Is someone trying to find out from 216 00:12:08,679 --> 00:12:11,160 Speaker 2: me my first name, last name, 217 00:12:11,240 --> 00:12:14,240 Speaker 2: date of birth, social security? 218 00:12:14,400 --> 00:12:17,319 Speaker 2: And if someone is trying to get that or any 219 00:12:17,400 --> 00:12:19,920 Speaker 2: other, and even just first name, last name, date of birth, 220 00:12:20,120 --> 00:12:22,920 Speaker 2: like, even if they're not asking for social, still be paranoid. 221 00:12:23,120 --> 00:12:26,319 Speaker 2: Is someone trying to get my username and my password? 222 00:12:26,520 --> 00:12:30,800 Speaker 2: Anytime you get emails requesting that type of information, you 223 00:12:30,880 --> 00:12:34,760 Speaker 2: need to pause, take a breath, and be very, very, 224 00:12:34,840 --> 00:12:37,520 Speaker 2: very, very cautious. What's the source of the email? You 225 00:12:37,600 --> 00:12:42,080 Speaker 2: pick up the phone and call the Amazon or Facebook 226 00:12:42,200 --> 00:12:45,520 Speaker 2: or whoever and verify: hey, did you send me this email? 227 00:12:45,679 --> 00:12:49,040 Speaker 2: Like, is my account at risk? That's how 228 00:12:49,080 --> 00:12:52,120 Speaker 2: you have to handle that, because those criminals 229 00:12:52,160 --> 00:12:54,720 Speaker 2: are opportunistic.
It's like parking a bunch of cars on 230 00:12:54,760 --> 00:12:57,559 Speaker 2: the street. Happens in the neighborhoods all over the world, 231 00:12:57,600 --> 00:13:00,599 Speaker 2: even fancy ones. Criminals will walk past and just 232 00:13:00,679 --> 00:13:03,520 Speaker 2: try door handles. Which car are they gonna rob first? The 233 00:13:03,520 --> 00:13:06,920 Speaker 2: one where the car's unlocked. They are opportunistic, so the 234 00:13:06,960 --> 00:13:08,760 Speaker 2: easiest target is the one that gets hit. 235 00:13:08,920 --> 00:13:09,240 Speaker 3: Wow. 236 00:13:09,280 --> 00:13:12,920 Speaker 2: And because of the volume of that, you know, it's 237 00:13:12,960 --> 00:13:16,680 Speaker 2: no longer, you know, Billy Bob or Nicky in 238 00:13:16,720 --> 00:13:20,840 Speaker 2: his grandmother's basement playing around on the internet doing some scams. 239 00:13:20,880 --> 00:13:24,800 Speaker 2: That's not what it is anymore. It is fundamentally well 240 00:13:25,960 --> 00:13:33,199 Speaker 2: funded and resourced criminal organizations that literally have office buildings. 241 00:13:33,440 --> 00:13:38,480 Speaker 2: They go to campuses to recruit smart computer science kids 242 00:13:39,160 --> 00:13:43,880 Speaker 2: to come into their criminal organization and create ransomware, create viruses, 243 00:13:43,960 --> 00:13:48,200 Speaker 2: and craft these sting operations, et cetera. So that's who 244 00:13:48,240 --> 00:13:50,560 Speaker 2: you're going against, these people. It's no longer, you know, 245 00:13:50,720 --> 00:13:54,440 Speaker 2: Susie in the basement. It is now an organization coming 246 00:13:54,480 --> 00:13:57,840 Speaker 2: after you. So you gotta push back. And one of 247 00:13:57,880 --> 00:14:01,000 Speaker 2: the known things with scams is they all try to 248 00:14:01,679 --> 00:14:04,040 Speaker 2: impose a sense of urgency.
You got to do this 249 00:14:04,160 --> 00:14:08,240 Speaker 2: right now, right? Anytime you see that, feel that, either 250 00:14:08,320 --> 00:14:09,080 Speaker 2: that, or they... 251 00:14:09,000 --> 00:14:12,960 Speaker 1: Threaten you with an ultimate penalty if you don't respond. 252 00:14:13,280 --> 00:14:15,280 Speaker 1: You know, and I will tell you this. I'm talking 253 00:14:15,320 --> 00:14:21,120 Speaker 1: to Gregory Richardson, AI expert, cybersecurity expert, a man who 254 00:14:21,200 --> 00:14:24,240 Speaker 1: loves his job. I'm just telling you, it's a level 255 00:14:24,280 --> 00:14:26,560 Speaker 1: of passion. How did you... and we're going to get 256 00:14:26,560 --> 00:14:28,920 Speaker 1: into the AI in the next break, but how did 257 00:14:28,960 --> 00:14:30,720 Speaker 1: you know this was for you? Like, I have a 258 00:14:30,720 --> 00:14:34,840 Speaker 1: degree in mathematics. My minor, my associate's, is in sociology. When 259 00:14:34,840 --> 00:14:37,880 Speaker 1: I graduated from college, I started out in computers. I 260 00:14:37,920 --> 00:14:41,800 Speaker 1: love the logic. How did you know, from a state 261 00:14:41,840 --> 00:14:45,600 Speaker 1: of passion, that this is where you knew your career 262 00:14:45,640 --> 00:14:47,880 Speaker 1: and your life would be, Gregory? 263 00:14:50,440 --> 00:14:54,360 Speaker 2: It's a very, very interesting story. I am in college. 264 00:14:54,680 --> 00:14:56,720 Speaker 2: I left at a fairly young age. I was 265 00:14:56,720 --> 00:15:00,000 Speaker 2: just turning sixteen, the late part of my sixteen-year-old 266 00:15:00,200 --> 00:15:02,960 Speaker 2: phase, when I left the Caribbean island that I 267 00:15:03,000 --> 00:15:04,840 Speaker 2: grew up on and my parents and all of my 268 00:15:04,840 --> 00:15:07,400 Speaker 2: family are from, and we moved.
Well, I was sent 269 00:15:07,560 --> 00:15:10,200 Speaker 2: actually to the US, the middle of the US, to 270 00:15:10,320 --> 00:15:14,160 Speaker 2: the Midwest, to college. Kind of stupid, you know how kids 271 00:15:14,160 --> 00:15:16,040 Speaker 2: are when they're stupid, think they know it all, et cetera. 272 00:15:16,360 --> 00:15:20,200 Speaker 2: But my major initially was civil engineering. People told me 273 00:15:20,240 --> 00:15:22,640 Speaker 2: a good job is, you know, civil engineering. You'll build 274 00:15:22,720 --> 00:15:26,480 Speaker 2: roads and buildings for governments, and just good job security. Now, 275 00:15:26,760 --> 00:15:30,920 Speaker 2: while that's my major, I am in my dorm room, 276 00:15:30,960 --> 00:15:33,840 Speaker 2: and this is the early eighties, right? So like there 277 00:15:33,880 --> 00:15:36,920 Speaker 2: is no Internet yet, there are no computer stores yet. 278 00:15:37,160 --> 00:15:42,520 Speaker 2: I'm in my dorm room building computers from scratch and 279 00:15:42,680 --> 00:15:48,800 Speaker 2: selling them on campus to students, and I'm literally investing 280 00:15:48,960 --> 00:15:51,240 Speaker 2: all of my... that was the joy that I had. 281 00:15:51,360 --> 00:15:54,720 Speaker 2: I hated my classes. I hated everything. And after about... 282 00:15:54,800 --> 00:15:58,360 Speaker 2: literally, I switched my major from civil engineering to 283 00:15:58,400 --> 00:16:00,960 Speaker 2: electrical engineering. Then I switched to math, then I 284 00:16:00,960 --> 00:16:02,840 Speaker 2: switched it to business. Like, I was all over. I 285 00:16:02,920 --> 00:16:03,400 Speaker 2: was stupid. 286 00:16:03,400 --> 00:16:06,200 Speaker 1: I was dumb. No, no, no, no, you were me, Gregory.
287 00:16:06,280 --> 00:16:10,200 Speaker 1: I started out in civil engineering, then I went to accounting, 288 00:16:10,560 --> 00:16:13,320 Speaker 1: then I went to chemical engineering, and then I wound 289 00:16:13,400 --> 00:16:16,760 Speaker 1: up getting my degree in mathematics. But hold that thought, Gregory, 290 00:16:16,760 --> 00:16:19,400 Speaker 1: you are fantastic. Please don't go anywhere. We're talking to 291 00:16:19,440 --> 00:16:26,120 Speaker 1: Gregory Richardson, cybersecurity expert, AI genius. Some have called him 292 00:16:26,120 --> 00:16:29,200 Speaker 1: AI genius because he is on fire right now. And 293 00:16:29,240 --> 00:16:31,560 Speaker 1: if you want to hear more, especially how he's using 294 00:16:31,960 --> 00:16:37,480 Speaker 1: and teaching nonprofit organizations and churches how to use AI 295 00:16:37,600 --> 00:16:40,120 Speaker 1: to spread the gospel, don't go nowhere. You're listening to 296 00:16:40,160 --> 00:16:43,360 Speaker 1: Money Making Conversation Masterclass, and I'm your host, 297 00:16:43,480 --> 00:16:44,400 Speaker 1: Rashaan McDonald. 298 00:16:44,880 --> 00:16:48,040 Speaker 4: Please don't go anywhere. We'll be right back with more 299 00:16:48,120 --> 00:16:58,520 Speaker 4: Money Making Conversations Masterclass. Welcome back to the Money Making 300 00:16:58,600 --> 00:17:04,280 Speaker 4: Conversations Masterclass, hosted by Rashaan McDonald. Money Making Conversations Masterclass 301 00:17:04,359 --> 00:17:08,960 Speaker 4: continues online at Moneymakingconversations dot com, and follow Money Making 302 00:17:09,000 --> 00:17:13,879 Speaker 4: Conversations Masterclass on Facebook, Twitter, and Instagram. 303 00:17:14,200 --> 00:17:17,680 Speaker 1: Gregory, the word passion is an understatement when I hear 304 00:17:17,760 --> 00:17:22,240 Speaker 1: you talk about cybersecurity.
And I say that because I'm 305 00:17:22,240 --> 00:17:24,600 Speaker 1: gonna tell you something. When I log on my computer, 306 00:17:25,200 --> 00:17:29,679 Speaker 1: when I click emails, I'm nervous and I'm not very comfortable. 307 00:17:30,040 --> 00:17:32,520 Speaker 1: And that's the world we live in. It's a world 308 00:17:32,760 --> 00:17:37,920 Speaker 1: of fast moving activity. But are you as uncomfortable as I 309 00:17:37,960 --> 00:17:42,160 Speaker 1: am as we navigate through this new world of technology 310 00:17:42,200 --> 00:17:44,560 Speaker 1: that's being put forth out there, that we have to 311 00:17:44,680 --> 00:17:46,400 Speaker 1: use in order to participate? 312 00:17:48,119 --> 00:17:51,280 Speaker 2: Great, great, great, great question, and I'll answer it with 313 00:17:51,359 --> 00:17:53,040 Speaker 2: a little, a tiny bit of background. 314 00:17:53,080 --> 00:17:53,439 Speaker 3: Okay. 315 00:17:53,600 --> 00:17:59,040 Speaker 2: I am a staunch believer in the concept that we 316 00:17:59,119 --> 00:18:02,720 Speaker 2: find in the Bible in Jeremiah chapter one, somewhere around 317 00:18:02,800 --> 00:18:07,560 Speaker 2: verse four or so, and that's Jeremiah being spoken to 318 00:18:07,640 --> 00:18:11,600 Speaker 2: by God. Jeremiah's a prophet, and God tells him: before 319 00:18:11,800 --> 00:18:17,760 Speaker 2: you were formed in your mother's womb, I intimately knew 320 00:18:17,760 --> 00:18:22,000 Speaker 2: what you were made for. I firmly believe in that. 321 00:18:22,000 --> 00:18:24,199 Speaker 2: That's why I say, jokingly, I was dumb as a bag of 322 00:18:24,200 --> 00:18:30,080 Speaker 2: bricks, because when I was twelve years old in high school, 323 00:18:30,920 --> 00:18:32,919 Speaker 2: a bunch of friends and I had a little public 324 00:18:32,960 --> 00:18:36,160 Speaker 2: access TV program where we taught adults how to use computers.
325 00:18:36,560 --> 00:18:38,920 Speaker 2: So it was very obvious to me that my passion 326 00:18:39,119 --> 00:18:42,639 Speaker 2: was technology from twelve years old, and when I go 327 00:18:42,680 --> 00:18:45,840 Speaker 2: to college, I'm doing everything but technology. It took two 328 00:18:45,880 --> 00:18:48,440 Speaker 2: and a half three years into college for a friend 329 00:18:48,520 --> 00:18:50,560 Speaker 2: happened to be a chaplain to walk he was on 330 00:18:50,640 --> 00:18:52,479 Speaker 2: my floor, to walk up to me and say, dude, 331 00:18:52,560 --> 00:18:55,200 Speaker 2: why are you not why are you not studying computers. 332 00:18:55,240 --> 00:18:57,800 Speaker 2: I'm like, it's not I'm electrical engineering or a business major. 333 00:18:57,920 --> 00:19:01,080 Speaker 2: He said, dude, all you're doing all day is sitting 334 00:19:01,119 --> 00:19:04,760 Speaker 2: here behind this computer doing computer stuff. Is obvious your 335 00:19:04,800 --> 00:19:07,320 Speaker 2: passion And that was kind of like an AHA moment 336 00:19:07,400 --> 00:19:11,920 Speaker 2: for me, and that is what fuels not only my success, 337 00:19:12,280 --> 00:19:16,760 Speaker 2: but my ability. And I say this to entrepreneurs, business 338 00:19:16,800 --> 00:19:20,000 Speaker 2: people people you know early in career or late in 339 00:19:20,080 --> 00:19:24,240 Speaker 2: career making changes if you can do what you're passionate about. 340 00:19:24,920 --> 00:19:26,800 Speaker 2: I'm not guarantee you and you're gonna make a million 341 00:19:26,840 --> 00:19:30,040 Speaker 2: dollars or whatever whatever. You might maybe not, but at 342 00:19:30,160 --> 00:19:35,160 Speaker 2: least you'll be able to function and be effective through 343 00:19:35,680 --> 00:19:39,199 Speaker 2: all adversities. There's been many rough periods in time. 
I 344 00:19:39,240 --> 00:19:43,880 Speaker 2: remember twenty, thirty years ago, cybersecurity was a terrible industry 345 00:19:43,880 --> 00:19:45,879 Speaker 2: to be in. No one was hiring. I'd go and 346 00:19:46,000 --> 00:19:49,040 Speaker 2: visit banks and businesses, especially in the Caribbean where I 347 00:19:49,040 --> 00:19:51,840 Speaker 2: was living at that time. Hey, how about your cybersecurity? No, 348 00:19:51,960 --> 00:19:55,119 Speaker 2: we don't need no cybersecurity. My IT guy handles that. 349 00:19:55,640 --> 00:19:58,600 Speaker 2: It took Target getting hacked, Home Depot right there in 350 00:19:58,600 --> 00:20:02,480 Speaker 2: Atlanta getting hacked. Suddenly the owners were like, oh my god, 351 00:20:02,520 --> 00:20:04,679 Speaker 2: we need to do something about cybersecurity, and then all 352 00:20:04,720 --> 00:20:07,080 Speaker 2: of a sudden it boomed. But prior to that, no 353 00:20:07,119 --> 00:20:12,440 Speaker 2: one cared about cybersecurity. What kept me passionate and invested 354 00:20:12,440 --> 00:20:15,360 Speaker 2: in cybersecurity was the fact that I was doing what 355 00:20:15,480 --> 00:20:21,000 Speaker 2: I was made for. I believe genuinely cybersecurity and technology were 356 00:20:21,080 --> 00:20:24,760 Speaker 2: stitched into my DNA by the person who created me. So 357 00:20:24,800 --> 00:20:27,480 Speaker 2: I'm just doing what's natural, and that's what you hear 358 00:20:27,520 --> 00:20:30,040 Speaker 2: in my voice. That's what you hear in the passion 359 00:20:30,200 --> 00:20:33,560 Speaker 2: when I answer questions. I tell my employees, coworkers, 360 00:20:33,560 --> 00:20:35,800 Speaker 2: people that work for me, whatever, all the time: you 361 00:20:35,840 --> 00:20:38,560 Speaker 2: wake me up at three in the morning, bro, I'll 362 00:20:38,560 --> 00:20:40,359 Speaker 2: answer questions. No problem. 363 00:20:40,800 --> 00:20:43,040 Speaker 3: I love it. I love it. I love it.
364 00:20:43,240 --> 00:20:46,760 Speaker 1: Hey Gregory, now let's transition to AI, because I don't 365 00:20:46,760 --> 00:20:48,760 Speaker 1: want to say I'm going to talk about something and 366 00:20:48,800 --> 00:20:50,640 Speaker 1: we don't talk about it. I think that would frustrate 367 00:20:50,720 --> 00:20:55,040 Speaker 1: our listeners. On AI: what concerns you about AI when 368 00:20:55,080 --> 00:20:59,360 Speaker 1: it comes to security and privacy? Because 369 00:20:59,520 --> 00:21:03,440 Speaker 1: everywhere you look, AI is being thrown into everybody's conversation. 370 00:21:03,560 --> 00:21:07,679 Speaker 1: Everybody knows the two letters AI, artificial intelligence. But there 371 00:21:07,840 --> 00:21:11,639 Speaker 1: are security concerns, there are privacy concerns. What do you think? 372 00:21:13,200 --> 00:21:15,719 Speaker 2: There are definitely security and privacy concerns. As a matter 373 00:21:15,760 --> 00:21:18,680 Speaker 2: of fact, is it okay if I give your 374 00:21:18,720 --> 00:21:20,600 Speaker 2: listeners a way for them to reach me? Can I 375 00:21:20,640 --> 00:21:21,880 Speaker 2: just quickly plug a website? 376 00:21:21,960 --> 00:21:24,440 Speaker 1: Absolutely, do your thing, because I think it is important 377 00:21:24,480 --> 00:21:27,880 Speaker 1: that people know how to reach you. 378 00:21:27,880 --> 00:21:31,280 Speaker 2: Exactly. Anyone that wants to reach me, connect, LinkedIn, social media, 379 00:21:31,320 --> 00:21:34,440 Speaker 2: all of it, buy my book, whatever, all of it. 380 00:21:34,800 --> 00:21:38,360 Speaker 2: Every way to get in touch with me is at www 381 00:21:38,600 --> 00:21:41,240 Speaker 2: dot, my name, Gregory, G R E G O R Y, 382 00:21:41,680 --> 00:21:46,120 Speaker 2: Richardson, R I C H A R D S O N, dot AI. GregoryRichardson 383 00:21:46,320 --> 00:21:49,280 Speaker 2: dot AI. You can get me there, get everything there. 384 00:21:49,520 --> 00:21:52,879 Speaker 2: I answer messages.
So that's out of the way. Let's continue. 385 00:21:53,800 --> 00:21:57,200 Speaker 2: Are there security concerns with AI? One hundred percent. But 386 00:21:57,320 --> 00:22:00,439 Speaker 2: it's more than that, and candidly, I think the opportunity 387 00:22:00,640 --> 00:22:04,600 Speaker 2: is bigger than the concerns in AI. I often use 388 00:22:04,840 --> 00:22:10,240 Speaker 2: the analogy of the Gutenberg printing press, the fourteen hundreds, fifteen hundreds, 389 00:22:10,280 --> 00:22:13,840 Speaker 2: somewhere around the fifteenth century. Gutenberg was a jeweler, 390 00:22:14,240 --> 00:22:17,040 Speaker 2: he worked with gold, and from working with fine things in gold, 391 00:22:17,040 --> 00:22:19,600 Speaker 2: he created this thing called the printing press. And the 392 00:22:19,640 --> 00:22:23,159 Speaker 2: first thing they started printing was gossip rags and, you know, 393 00:22:23,280 --> 00:22:27,359 Speaker 2: descriptions of, let's call them, lurid tales, you know, 394 00:22:27,440 --> 00:22:30,800 Speaker 2: so basically written-out pornography. Those were the first things 395 00:22:30,880 --> 00:22:33,200 Speaker 2: the press was used for, because even then they knew 396 00:22:33,240 --> 00:22:35,879 Speaker 2: if I print something salacious, people are gonna gobble it up. 397 00:22:36,160 --> 00:22:38,399 Speaker 2: If I say a woman was found in bed with 398 00:22:38,480 --> 00:22:42,000 Speaker 2: a cow, people are gonna buy that newspaper. So the 399 00:22:42,160 --> 00:22:45,320 Speaker 2: church in particular was very concerned, oh my god, this 400 00:22:45,320 --> 00:22:48,960 Speaker 2: thing is evil, and lots of the industries were very much, 401 00:22:49,040 --> 00:22:51,120 Speaker 2: this thing is not good, like we shouldn't be doing this.
402 00:22:51,560 --> 00:22:54,159 Speaker 2: Now keep in mind, if you look at 403 00:22:54,160 --> 00:22:57,720 Speaker 2: a historical timeline, Europe and most of the world right 404 00:22:57,760 --> 00:23:00,639 Speaker 2: then is smack dab in the middle of the Dark Ages. 405 00:23:01,400 --> 00:23:05,760 Speaker 2: The Black Plague is killing millions of people across Europe. That's 406 00:23:05,800 --> 00:23:10,240 Speaker 2: when this happened. Finally, some monk somewhere says, 407 00:23:10,280 --> 00:23:11,720 Speaker 2: you know what, I'm thinking I'm gonna use the printing 408 00:23:11,760 --> 00:23:15,120 Speaker 2: press for these Bible books that we've been writing out 409 00:23:15,119 --> 00:23:17,119 Speaker 2: by hand, and it takes us twenty years. We're going 410 00:23:17,160 --> 00:23:20,960 Speaker 2: to print some. Now, the world at this point has been 411 00:23:21,200 --> 00:23:23,280 Speaker 2: in the Dark Ages for more 412 00:23:23,320 --> 00:23:27,359 Speaker 2: than a thousand years. Fast forward fifty, five-zero, years later. The 413 00:23:27,440 --> 00:23:31,639 Speaker 2: printing press and the publication of Bibles, which to this 414 00:23:31,720 --> 00:23:35,919 Speaker 2: day is still the top-selling book, single-handedly hauled 415 00:23:36,000 --> 00:23:40,040 Speaker 2: the entire globe out of the Dark Ages, because someone 416 00:23:40,400 --> 00:23:43,840 Speaker 2: leaned into the new technology, the printing press, and used 417 00:23:43,880 --> 00:23:47,120 Speaker 2: it for something good. That's what AI is today. Yes, 418 00:23:47,200 --> 00:23:49,919 Speaker 2: can it do bad things? Hundred percent. Could it potentially? 419 00:23:50,400 --> 00:23:53,679 Speaker 2: You know, yes, all of the risks are true, but 420 00:23:53,800 --> 00:23:57,639 Speaker 2: I think the reward and the potential and the upside 421 00:23:58,040 --> 00:24:03,959 Speaker 2: is exponentially better.
Especially, I focus on nonprofits because of 422 00:24:04,000 --> 00:24:07,440 Speaker 2: my Christian worldview, but I also help, as I've said already, 423 00:24:07,640 --> 00:24:12,000 Speaker 2: large businesses and organizations, and businesses from small 424 00:24:12,080 --> 00:24:15,360 Speaker 2: to large right now are starting to leverage it. And that's 425 00:24:15,400 --> 00:24:18,000 Speaker 2: what my book is about, the six technology levers: how 426 00:24:18,040 --> 00:24:21,560 Speaker 2: to find the million-dollar problem in your business and 427 00:24:21,600 --> 00:24:26,120 Speaker 2: make a million-dollar solution using AI. That's what it's about. 428 00:24:26,720 --> 00:24:30,440 Speaker 2: Businesses are leveraging AI today, and we are in what 429 00:24:30,680 --> 00:24:34,119 Speaker 2: I call the age of the ephemeral app. You're going 430 00:24:34,200 --> 00:24:37,239 Speaker 2: to be able to build an app now without one 431 00:24:37,320 --> 00:24:39,680 Speaker 2: hundred thousand dollars of investment. You're going to be able 432 00:24:39,680 --> 00:24:41,680 Speaker 2: to go to an AI and say, hey, I need 433 00:24:41,720 --> 00:24:44,440 Speaker 2: an app that helps my son. And I literally did 434 00:24:44,440 --> 00:24:47,639 Speaker 2: this with my daughter, an app that helps my daughter prepare for 435 00:24:48,160 --> 00:24:52,919 Speaker 2: this semester of advanced mathematics. She's sixteen years old. 436 00:24:53,359 --> 00:24:56,399 Speaker 2: I gave it all of the documents, the syllabus, the 437 00:24:56,560 --> 00:24:59,920 Speaker 2: course materials from her math class. It generated weekly quizzes 438 00:25:00,359 --> 00:25:03,360 Speaker 2: for me in an app form that I can give 439 00:25:03,400 --> 00:25:05,600 Speaker 2: to my daughter or anyone else and say, hey, get 440 00:25:05,600 --> 00:25:07,240 Speaker 2: cracking. This is how you're going to prepare.
441 00:25:07,880 --> 00:25:11,720 Speaker 2: That's the age we're in. So as a business person, when 442 00:25:11,760 --> 00:25:14,119 Speaker 2: you start looking, or as an entrepreneur, you've got an 443 00:25:14,119 --> 00:25:16,919 Speaker 2: idea rummaging around in your head, you no longer 444 00:25:17,040 --> 00:25:19,680 Speaker 2: need to go and find a coder. You can simply 445 00:25:19,800 --> 00:25:22,480 Speaker 2: jump on these AI tools, and that's what my book 446 00:25:22,520 --> 00:25:25,119 Speaker 2: walks you through, and say, hey, this is the idea 447 00:25:25,200 --> 00:25:29,919 Speaker 2: I have. I've helped doctors, I've helped lawyers, I've helped florists, 448 00:25:30,240 --> 00:25:35,440 Speaker 2: I've helped accountants build custom apps in days that have 449 00:25:35,600 --> 00:25:39,520 Speaker 2: completely changed the landscape of their business. I'll give you, 450 00:25:39,640 --> 00:25:43,920 Speaker 2: if I may, a simple, simple example. A 451 00:25:44,000 --> 00:25:47,320 Speaker 2: business had a car rental fleet, forty cars deep, 452 00:25:47,400 --> 00:25:51,080 Speaker 2: going great, but they started saying, this car rental thing 453 00:25:51,160 --> 00:25:52,840 Speaker 2: might not be for me. I looked at their business 454 00:25:52,840 --> 00:25:56,280 Speaker 2: and said, you know what you're missing? You're losing a 455 00:25:56,280 --> 00:25:58,480 Speaker 2: lot of money on tolls. In Texas, every time you 456 00:25:58,560 --> 00:26:00,720 Speaker 2: touch the roads here, they're charging tolls. If you have 457 00:26:00,760 --> 00:26:03,119 Speaker 2: a car rental fleet and you're not tracking the tolls properly, 458 00:26:03,160 --> 00:26:06,440 Speaker 2: you lose it. I helped them build an app that 459 00:26:06,640 --> 00:26:10,160 Speaker 2: tracks the tolls and then bills the customer for it directly.
460 00:26:10,720 --> 00:26:13,240 Speaker 2: They ended up taking that app that they built for 461 00:26:13,359 --> 00:26:16,400 Speaker 2: their cars, and they sold it, leased it, at twenty-five, 462 00:26:16,520 --> 00:26:19,959 Speaker 2: thirty, forty dollars a month to every other car rental 463 00:26:20,000 --> 00:26:23,880 Speaker 2: agency in the area. That app ended up making them 464 00:26:23,920 --> 00:26:28,560 Speaker 2: a bigger income stream than the car rental agency itself. Wow, 465 00:26:29,200 --> 00:26:32,520 Speaker 2: that's the age we're in today. Your idea that's bubbling 466 00:26:32,600 --> 00:26:35,480 Speaker 2: around in your head might very well be a bigger 467 00:26:35,600 --> 00:26:38,560 Speaker 2: income stream than the thing that you're doing. You might 468 00:26:38,600 --> 00:26:41,560 Speaker 2: be a baker, but for the idea you have for tracking 469 00:26:41,640 --> 00:26:45,320 Speaker 2: recipes for all your cookies and distributing them, someone might pay 470 00:26:45,359 --> 00:26:48,160 Speaker 2: you fifteen, twenty dollars a month to access your database 471 00:26:48,320 --> 00:26:50,960 Speaker 2: of recipes, but they might not buy twenty dollars of 472 00:26:50,960 --> 00:26:54,119 Speaker 2: cookies every month. That's the age we are in today. 473 00:26:54,160 --> 00:26:57,119 Speaker 2: So when I look at it in that scope, AI 474 00:26:57,400 --> 00:27:01,680 Speaker 2: is significantly a bigger opportunity than it is a risk. 475 00:27:02,080 --> 00:27:06,840 Speaker 2: Now, are there risks? Cybersecurity risks, privacy risks, data leakage risks. 476 00:27:07,000 --> 00:27:09,639 Speaker 2: Could China be spying, or anywhere else be spying, on 477 00:27:09,800 --> 00:27:14,040 Speaker 2: us and trying to reap private information about us through AI? 478 00:27:14,160 --> 00:27:17,439 Speaker 2: One hundred percent. The sad part of that, though, is Facebook's 479 00:27:17,440 --> 00:27:21,600 Speaker 2: doing it too. Google's doing it too.
You kind of 480 00:27:21,640 --> 00:27:22,640 Speaker 2: can't avoid it. 481 00:27:22,840 --> 00:27:26,480 Speaker 3: Right. I tell you what, Gregory, very entertaining. 482 00:27:27,480 --> 00:27:29,600 Speaker 1: I know that I want to get more into the AI, 483 00:27:29,920 --> 00:27:33,240 Speaker 1: especially in the nonprofit, especially in the church and spreading 484 00:27:33,280 --> 00:27:37,320 Speaker 1: the gospel conversation. Can I invite you back on the show? 485 00:27:38,280 --> 00:27:42,280 Speaker 2: Of course, I would be honored, honored. I'll add this: 486 00:27:42,680 --> 00:27:47,320 Speaker 2: the reason I personally focus on churches is because churches 487 00:27:47,320 --> 00:27:50,320 Speaker 2: and nonprofits usually don't have the budgets of the 488 00:27:50,320 --> 00:27:53,720 Speaker 2: big banks and the businesses, so they need the efficiency 489 00:27:53,760 --> 00:27:56,119 Speaker 2: that AI can bring even more. 490 00:27:56,240 --> 00:27:58,600 Speaker 1: So give me a website before 491 00:27:58,600 --> 00:28:00,880 Speaker 1: we go, give me a website to reach out to you. 492 00:28:00,840 --> 00:28:08,119 Speaker 2: Www dot GregoryRichardson dot AI. GregoryRichardson dot AI. 493 00:28:08,320 --> 00:28:11,040 Speaker 2: Really appreciate this opportunity to be here and talk to 494 00:28:11,080 --> 00:28:14,320 Speaker 2: your audience. Man, this has been a ton of fun. 495 00:28:14,640 --> 00:28:17,200 Speaker 1: You're awesome. You're awesome, and Gregory, we will talk soon. 496 00:28:17,320 --> 00:28:17,960 Speaker 1: I appreciate you. 497 00:28:18,000 --> 00:28:18,200 Speaker 3: Man. 498 00:28:18,280 --> 00:28:20,119 Speaker 1: Thank you for filling out that Be a Guest form, 499 00:28:20,280 --> 00:28:22,320 Speaker 1: because that's how he submitted the information. 500 00:28:22,440 --> 00:28:23,400 Speaker 3: I contacted him.
501 00:28:23,480 --> 00:28:26,160 Speaker 1: He's on the show and guess what, the information he's 502 00:28:26,200 --> 00:28:28,639 Speaker 1: provided on my show has been amazing. 503 00:28:28,680 --> 00:28:31,160 Speaker 3: Thank you for coming on Money Making Conversation Masterclass. 504 00:28:31,880 --> 00:28:33,280 Speaker 2: Appreciate you, sir, have a great one. 505 00:28:33,400 --> 00:28:36,159 Speaker 1: We'll talk soon. Don't go nowhere. We'll be back with 506 00:28:36,200 --> 00:28:39,560 Speaker 1: more of Money Making Conversation Masterclass. This is Rashan McDonald. 507 00:28:39,600 --> 00:28:40,280 Speaker 1: I'm your host. 508 00:28:41,000 --> 00:28:44,240 Speaker 5: This has been another edition of Money Making Conversation Masterclass 509 00:28:44,320 --> 00:28:47,400 Speaker 5: hosted by me, Rashan McDonald. Thank you to our guests 510 00:28:47,480 --> 00:28:50,840 Speaker 5: on the show today, and thank you to our listening audience. Now, 511 00:28:50,920 --> 00:28:52,880 Speaker 5: if you want to listen to any episode or want 512 00:28:52,920 --> 00:28:56,760 Speaker 5: to be a guest on the show, visit MoneymakingConversations dot com. 513 00:28:56,880 --> 00:29:00,520 Speaker 5: Our social media handle is Money Making Conversation. Join us 514 00:29:00,560 --> 00:29:03,120 Speaker 5: next week, and remember to always lead with your gifts. 515 00:29:03,440 --> 00:29:04,040 Speaker 3: Keep winning. 516 00:33:52,280 --> 00:33:52,440 Speaker 2: Hi. 517 00:33:52,520 --> 00:33:55,200 Speaker 6: I am Rashan McDonald, host of the weekly Money Making 518 00:33:55,240 --> 00:33:59,080 Speaker 6: Conversation Masterclass show. The interviews and information that this show 519 00:33:59,120 --> 00:34:02,160 Speaker 6: provides are for everyone. It's time to stop reading other 520 00:34:02,200 --> 00:34:06,040 Speaker 6: people's success stories and start living your own.
This week 521 00:34:06,200 --> 00:34:10,759 Speaker 6: I will interview Gregory Richardson; he uses artificial intelligence technology 522 00:34:11,000 --> 00:34:14,360 Speaker 6: to help spread the gospel. Then I'm talking to Heather Younger. 523 00:34:14,680 --> 00:34:18,160 Speaker 6: She will tell you how to navigate workplace fears, job security, 524 00:34:18,400 --> 00:34:22,480 Speaker 6: performance anxiety, and change. You can only hear these interviews 525 00:34:22,600 --> 00:34:26,000 Speaker 6: on the Money Making Conversation Masterclass show. Keep winning. 526 00:34:26,200 --> 00:34:26,719 Speaker 3: There you go,