[00:00:00] Speaker 1: Hi, I am Rushion McDonald, your host of this weekly Money Making Conversations Masterclass show. The interviews and information that this show provides are for everyone. It's time to stop reading other people's success stories and start living your own. If you want to be a guest on my show, please visit our website, Moneymakingconversations dot com, and click the Be a Guest button to submit your information. My guest is the co-founder and CEO of BitMind, the world's first decentralized artificial intelligence, AI, as they say, deepfake detection system. Let me say that again: deepfake detection system. He's an authority on how people can protect themselves using free solutions that they create. Please welcome to Money Making Conversations Masterclass, Ken Jon Miyachi. How you doing?

[00:00:46] Speaker 2: Sir, doing well. Thanks for having me.

[00:00:49] Speaker 1: First of all, deepfake AI. I'd say over about the last five years, you know, I've been hearing a lot of it. I'm a writer, a Hollywood writer, so a major part of that strike was about AI, and a major portion of the conversation in the acting game is AI. What is your take on all that, before we get to your company? Because I'm sure that inspired you to do all this software creation to be able to detect AI and deepfakes.

[00:01:21] Speaker 2: Yeah, one hundred percent. So I think, as many people have experienced over the past, you know, two to three years, generative AI has really taken off, right. It started with ChatGPT, but then you're getting into images, video, audio, et cetera. And we went from, you know, maybe two years ago, that Will Smith eating pasta video that looked absolutely terrible, to now some extremely hyper-realistic images, video, et cetera. And so it's really affected almost every aspect of our life, but specifically kind of the creative, the news, the distribution fields, et cetera.
And so it's very difficult to kind of think about AI in the current constructs that we have right now, IP protection, image and likeness, et cetera, where, you know, now with the proliferation of these tools, you can just, hey, create, you know, a voice, an image, a video of someone, use it to promote your thing, or have them do whatever and have it fit in that world. So it's a very complicated space, but something that will definitely need to be figured out, especially as AI gets better and better.

[00:02:39] Speaker 1: Now, give me your background. What was your education level or training to go into this lane, before we get into more detail?

[00:02:52] Speaker 2: Yeah, so I studied computer science at the University of California, San Diego. I was really interested in, you know, kind of the traditional studies, but got really into the blockchain space. While I was attending my undergrad, I decided to start a company in the crypto space. And then after that I worked in more traditional AI at Amazon for a couple of years, think recommendation systems for buying products. And then after my time there, I worked at a layer-one blockchain called NEAR, and the founder of NEAR was actually one of the original authors on the paper Attention Is All You Need, which is kind of the fundamental building block of these LLMs; it's called the Transformer. I was exposed to a lot of this intersection of AI and crypto there, and then towards the end of twenty twenty three, I was, you know, doing a lot of research for them, looking at new projects, and felt like this protocol that we're building on right now, Bittensor, was extremely fascinating and interesting to start a company on, but then also that there were some really, you know, critical, important problems to tackle that were, like, existential, and I kind of made the jump to go back into the entrepreneurship game.
[00:04:08] Speaker 1: Now, Ken, I hear forty-hour week. I had a forty-hour week, I had a forty-hour week. Now you're an entrepreneur. Now, a lot of people, when they make that transition, there's a lot of thought process that goes into it, a lot of fear that goes into it. Talk me through those steps of going from Mr. Forty-Hour Week, Mr. Guaranteed Check, Mr. Health Benefits, to Mr. Independent.

[00:04:29] Speaker 2: Yeah, definitely, definitely an interesting jump. So I think, ever since I was young, I was always very motivated to go into entrepreneurship, start my own thing, have ownership, and I was kind of against working at, you know, a FAANG, a large tech company. But after my first endeavor in the entrepreneurship game, I was like, hey, I think maybe this could be a good opportunity to get experience, learn from other really amazing engineers, learn these processes. And I always had the itch to get back in. And so for me, it was more like, you know, I wanted to do it for a while, and I have one hundred percent belief in myself, so it actually wasn't too hard for me. It was more like, hey, this is an idea, an opportunity that I have extremely high conviction in. I didn't want to start something just to start something because I wanted to; I was kind of, you know, looking, waiting for an idea, and once it came about, I was one hundred percent all in. And then it's really about convincing kind of your loved ones and the people you're relying on, you know, my girlfriend, my family.
[00:05:39] Speaker 1: Stability. They see hype, they see stability, they see dreamer, they see stability. They go, hey, brother, are you crazy? Now, BitMind. Okay, see, that's where I'm at in this conversation. Okay, you start a company called BitMind with the idea you're going to make money. How do you do that? How does one, in your mind, Ken? Because I get the fact that if you have a product, you know... because this is really a lot of software design, and a lot of people can't see that. You know, if you design a car, it drives down the street. You make some bread, you can go eat it. You know, you open a restaurant, you can sit down and be served. Now, your company doesn't do that. How were you able, in your mind, to understand the profit and loss, or the P&L, on this opportunity?

[00:06:28] Speaker 2: Yeah, great, great question. So kind of as you mentioned, if you're in, like, you know, physical products or hardware engineering, as you mentioned, cars, there's a little more of an obvious P&L business model around this type of stuff. In software, it's definitely, I don't want to say a little more opaque, but there's a little bit more design space in how you design your, like, revenue model and your business model, et cetera. When I looked at this, right, there were a couple of different things. So the first one is that by participating in this network that I mentioned, Bittensor, which I decided to build the company on, if you perform well and you create essentially powerful AI models, powerful intelligence, then you get emissions from the network, and so you're earning cryptocurrency, which allows you to bootstrap and expand your business. But then the second piece is really, what is the business model around deepfake detection? So I thought the TAM was actually pretty big there, right. You have billions of dollars getting pumped into generative AI, and maybe at the time it was less than two hundred million, kind of, that was invested into deepfake detection or kind of like detection models.

[00:07:45] Speaker 1: So security, right? So exactly, you developed a security system.
[00:07:50] Speaker 2: Essentially, yes. So, like, cybersecurity for AI is definitely a good mental model to think about it. And what I thought is, because all of this money is going into generative AI, I felt like this was such an existential problem for, really, humanity. You know, what happens when you can't differentiate between what is real and what is not? There's already a huge deterioration of trust in traditional media, in the news that is being consumed. And what happens to just human communication when, you know, not only are you being skeptical and you don't know what you're looking at whenever you're browsing the internet, but, you know, you're not only questioning fake things, like, hey, is this fake; you'll see something AI generated and you're like, is it real? It goes both ways. And I felt like, because of that essential need, there was essentially a path to get distribution and consumption of this good. And then there's a lot of really interesting business models in the software space. Like, you know, if you get a lot of consumption, you could have a freemium-to-premium model. You could do kind of what Google does and incorporate ads. If you have a lot of users and consumption, you could create a SaaS API service. There were a lot of different avenues there to create a powerful and sustainable revenue and business model around this stuff. But the first step is always consumption. You need a lot of users. You need to make an impactful product that people want.

[00:09:17] Speaker 1: Okay, now, Ken, I'm trying to hang in there with you. Now, bills got to be paid, okay? And this is Money Making Conversations Masterclass land. Now, you say software, you say crypto. How are you generating revenue to be able to maintain an everyday lifestyle, keep the lights on, go out to lunch, keep gas in the car, or if you have an e-car, charge it up? How's that happening with this business model and this company you created called BitMind?
[00:09:45] Speaker 2: Yeah, so we essentially have three ways where we're, you know, funding our operational expenses and paying salaries, et cetera. So the first one is that we did raise venture funding, right. So that's the most basic one, and, you know, that was a big level of confidence for me diving into it. You know, I was going...

[00:10:05] Speaker 1: When you did the VC funding, did you have to give away a portion of your company, a percentage?

[00:10:09] Speaker 2: Correct, correct. Yes. And then the second one is kind of the two levers I mentioned. The first is essentially revenue from cryptocurrency emissions. So, right, by participating in this network, building and contributing AI models to this network, you earn cryptocurrency, which is called TAO. And then the third one is that we have a variety of products and services that essentially have subscription fees. And so we're generating, like, revenue through those three... I guess not revenue, but the second two, right, the cryptocurrency emissions and then the subscription revenues. And then we're kind of putting all of that actually back into the business. Right now we're trying to grow, we're trying to expand our market share, our mind share, and we're funding kind of our operational expenses off of the funding that we raised.
[00:11:01] Speaker 1: Now, yours is not, because you mentioned earlier, you know, ChatGPT, which is, I guess you could say, the form of AI that gets used a lot. You go to Google, you go to some of the platforms there and ask a question, and they're basically using ChatGPT to give you an answer. And so now, your video, your deepfake... you're the deepfake video guy. That's what I'm calling you in this particular conversation. Now, as I said in your intro, you know, the world's first decentralized artificial intelligence, or AI, deepfake detection system. You're an authority on how people can protect themselves. This is about security protection using free solutions. Let's discuss those free solutions, because the word free always catches everybody's ear. So let's go with the free solutions first, Ken, before we get to the subscriptions.

[00:11:58] Speaker 2: Okay, yeah, yeah. So maybe to draw a comparison to ChatGPT, right: if you put an image or a video into ChatGPT and you say, hey, is this AI generated or not, it can't really tell you; it will give you an answer on that, right. And so it's like, okay, can we help people really make a, you know, a determination, help educate users and keep them protected? And so we have two kind of core products right now that help users with this. The first one is a website called thedetector dot ai. You go to it, you can drag and drop an image or a video, and then it will give you essentially a classification of whether it's real or AI generated, with some confidence score. And the second one, we have a browser extension where, while you're scrolling Twitter, while you're scrolling any news website, any website at all, if you hover over any image or video, it will get processed and it will overlay, you know, basically a little green check mark or a little robot-looking thing, and say, hey, is this AI generated or is this real?
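To picture what that kind of check looks like programmatically, here is a small illustrative sketch in Python. The endpoint URL, field names, and response shape are hypothetical stand-ins invented for this example, not BitMind's published API; in practice the same check is done by dropping a file on thedetector dot ai or hovering with the browser extension.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical endpoint, used only for illustration. The real consumer tools are
# the thedetector.ai website and the BitMind browser extension, not this URL.
DETECT_URL = "https://detector.example.com/api/v1/detect"

def classify_image(path: str) -> None:
    """Upload an image and print a real-vs-AI verdict with a confidence score."""
    with open(path, "rb") as image_file:
        response = requests.post(DETECT_URL, files={"image": image_file}, timeout=30)
    response.raise_for_status()
    # Assumed response shape: {"label": "ai" | "real", "confidence": 0.0 to 1.0}
    result = response.json()
    badge = "robot icon (AI-generated)" if result["label"] == "ai" else "green check (real)"
    print(f"{path}: {badge}, confidence {result['confidence']:.0%}")

classify_image("suspicious_post.jpg")
```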
[00:13:01] Speaker 6: Please don't go anywhere. We'll be right back with more Money Making Conversations Masterclass. Welcome back to the Money Making Conversations Masterclass, hosted by Rushion McDonald.

[00:13:21] Speaker 1: Now, now we're coming down to the consumers, everyday people like me. I'm just on the internet. I guess I'm thinking about elections. I probably want to know, did Trump really say that, or did Biden say that, did Kamala Harris say that, or Jeffries, that's part of the conversation, or Putin, did he say that? Because AI fakes are international. So as a consumer, is the political arena the only area where I should be concerned about fake videos, or are there other arenas that I need to be opening my eyes up for, Ken, to come in and start at least using your free solutions?

[00:14:07] Speaker 2: Yes, there's definitely many other areas, but that is a very large one, right. One of the reasons why we started this was, like, hey, one, decentralized AI sounds very, very buzzworthy, but deepfakes, I think a lot of everyday consumers understand that. And this was started at the beginning of twenty twenty four, and really what was fascinating about twenty twenty four was that it was the first year, the first election cycle, in the age of AI, right. Maybe since Obama, you know, ever since two thousand and eight, there were some elections in the age of social media, but this was really the first one where AI was being proliferated throughout society. And the thesis was, hey, AI is really going to affect this upcoming election cycle, and I think we saw that play out. So it's not only the US elections; India had their elections, Taiwan had their elections, and actually in those countries, where the regulations are slightly different, there was a report that over fifty million dollars was invested in generating AI content in India, both for, like, malicious reasons against the opposing candidate and to make themselves look good. And then, right, for the US ones, it played out really interestingly. There were a couple of really important deepfake videos that came out. There was an accusation against Tim Walz, like, of sexual abuse, and that ended up being a deepfake, right. But a lot of it was actually used in satire and humor, like, you know, Elon posting Vice President Harris at the time in, you know, kind of propaganda garb and stuff like that. So I think that highly critical, you know, information is really important, but the other use cases are pretty broad, right.
I think one of the main ones that is really critical is kind of financial fraud and abuse. So there's been a lot of scams recently that have been popularized in the news, like Tom Hanks selling some health medicine, and it was using his likeness and his voice to do that. There was a scam of Joe Biden, not running, but essentially promoting a charity or a foundation that ended up being fake. And then a really big one, I believe it was in London, where this was actually audio, and it was very sophisticated, but they basically faked the voice of a large client, and they were able to get the bank to send over fifty million dollars to an unverified account. And so you have this huge financial sector, and then you have a variety of other things, but maybe the most tangible one is that right now there's a lot of e-commerce that's fake. So think Temu, think Alibaba, even think Amazon, right, people selling fake products. There was a really funny viral kind of story that came out: there was a large couch that looked like a gorilla, and, you know, they were selling it for like five hundred dollars and people were buying it, but, you know, it was AI generated and it wasn't that product. So I think there's honestly unlimited angles of just consumer protection where, hey, anything where there's an incentive to confuse or scam the user with AI, right, you can be protected, right, with verification.
[00:17:39] Speaker 1: You know, you get Bank of America coming in, American Express. You don't know if it's really American Express. It looks like American Express, and do you click it? You have to check that email and see if it's a Gmail or some Outlook-created American Express. I'm getting hit, like, I got an email the other day, it said "email," then "PayPal dot com." Okay, just because it had that PayPal in it, but then that "email" in front of it, I knew that was a fake. So we're being hit so many different times with this. Of course, you know, the generation that is most vulnerable to this is the very young and the very old in this process. And then on the business side, like you said, the banking industry, that was a fifty-million-dollar heist, basically, we can say. So how do you get BitMind in front of all this, Ken, to be able to say, hey, we're over here, we can help you, we can help you? Okay, we've got the software over here. How do you, from a marketing and branding standpoint, how do you cut through the clutter and market your brand?

[00:18:44] Speaker 2: Yeah. So from a marketing standpoint, very simply, we have the best deepfake detection solution in the world right now that's available to consumers. There's a variety of other competitors in the space, but most of them are offering, you know, enterprise, government, contract-related software. And what we're saying is, hey, we want to go direct to consumer, right. These scams are becoming more and more sophisticated, what you mentioned with the email stuff. So this is a field, right, it's called BEC, business email compromise scams; that is over a one-billion-dollar-per-year industry, only rising fast and getting more and more sophisticated with these AI tools. And now that these generative AI tools are so easy to access, the barrier to entry is extremely low. You don't need to be a sophisticated software engineer. You can, you know, go on a website and generate someone's voice, or go on a website and generate a video of someone, and they're just becoming more and more difficult to spot.
And so how BitMind gets ahead of it is basically that the tools we're building are saying, hey, anything that is touching you, you know, you utilize a tool to basically do an initial check on it. You know, when a phone call is coming in, can you verify the number? When you're looking at any news or content on Twitter, can it identify whether it's AI generated or not? You know, an enterprise solution in the future could be related to emails; basically, you can think of it even scraping your emails, and anything that is flagged as, you know, AI-looking or malicious-looking going to a certain folder or getting flagged in a certain way, but really trying to be proactive in informing the user of what is AI generated or not. And what we really believe in: there's a lot of discussion right now on regulation, and I'm definitely of the opinion that, like, I think that could be a slippery slope of kind of going down an anti-free-speech path. What you want to do is give consumers the tools, the education, to be able to make their decisions and protect them. If they don't want to view it at all, they want to be very secure, you know, you can do this; but if you want to be able to look at what you're viewing and make the determination for yourself whether it's AI generated or not, just have the information displayed to you so that you can make that decision.

[00:21:12] Speaker 1: Okay, cool. I'm talking to my guest, the co-founder and CEO of BitMind, the world's first decentralized artificial intelligence, AI, deepfake detection system. Now, where can we go? We talked about the free solutions, but I have not given anybody a website location or a number. Talk about how we can access these free solutions or these subscriptions.

[00:21:35] Speaker 2: Yeah, so the first website, go to thedetector dot ai. That is our web application, where you can drag and drop images and videos. One more time, repeat that: thedetector dot ai. And then the second one is, go to the Chrome Web Store.
You know, there's plenty of Chrome extensions out there, but go to the Chrome Web Store and just look up BitMind or AI detector, and we'll be the first one that pops up. We're featured on the Chrome Web Store right now. And that is the browser extension that will overlay information on your web page as you're scrolling the internet, looking at news, looking at Twitter, looking at your social websites, et cetera.

[00:22:16] Speaker 1: Now, you know, Ken, this is the fear factor. Now, you proclaimed clear: you're not IBM, you're not Google, you're not Apple, you're not some big company name, quote unquote, we trust. You know, we thought we trusted TikTok, and they told us it's spying on us. We can't even put it on our own government computers.

[00:22:37] Speaker 2: That is an absolutely great question, and another reason why we started this. So, right, if you have, maybe to start, if you have Google or Apple or any of these companies telling you, like, hey, this is AI or this is not, this is a huge conflict of interest for these companies. They're the ones generating the content. So how are you going to be the one generating the content and also telling us if it's AI or fake? And you already see, like, you know, these big companies being influenced, you know, in certain ways or another. And then, right, you mentioned TikTok as well. So what BitMind does, and this is the reason why it's decentralized, we talk about, you know, the first decentralized AI deepfake detection tool: the decentralized piece is really the necessary aspect that incorporates the trust, and it also improves the performance. So basically what we have running is a competition for AI developers around the world to create the best deepfake detection models, and they also get paid for doing this. And so when you use our applications, you're not getting the detection from just one single model.
You're actually getting a detection from over two hundred different models at any given time, and then what is being displayed to the end user is the aggregation, or the average, of all those models. So by decentralizing it and saying, hey, we're not going to just use one solution that could be overfitted or have biases, we're using hundreds of models at a time, so that that bias, that trust issue, if any single model is trying to, you know, do something malicious, it won't matter in this whole average, and you're just trusting this aggregated, decentralized competition of AI developers around the world that are creating these models.
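Mechanically, what he is describing is an ensemble: many independently built detectors each score the same piece of media, and the user sees the pooled result, so no single overfitted or malicious model can swing the answer much. Below is a minimal sketch of that averaging idea in Python; the model names, scores, and threshold are made-up illustrations, not BitMind's actual code or the Bittensor incentive mechanism.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Verdict:
    model_id: str          # which competing detector produced the score
    ai_probability: float  # 0.0 = confidently real, 1.0 = confidently AI-generated

def aggregate(verdicts: list[Verdict], threshold: float = 0.5) -> dict:
    """Pool many independent detector scores into one user-facing answer.

    Because the displayed result is an average over many models, a single
    biased or misbehaving model has little effect on the final label.
    """
    avg = mean(v.ai_probability for v in verdicts)
    return {
        "label": "AI-generated" if avg >= threshold else "real",
        "confidence": round(avg if avg >= threshold else 1 - avg, 3),
        "models_consulted": len(verdicts),
    }

# Hypothetical scores from three of the hundreds of competing models
sample = [
    Verdict("model-017", 0.91),
    Verdict("model-142", 0.88),
    Verdict("model-203", 0.12),  # an outlier or misbehaving model
]
print(aggregate(sample))
# {'label': 'AI-generated', 'confidence': 0.637, 'models_consulted': 3}
```

In the real network the number of models, their weighting, and their rewards are handled by the Bittensor protocol, but the trust argument made above rests on exactly this kind of averaging.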
[00:24:26] Speaker 1: Okay, Ken mentioned India early in the interview, and I didn't want to cut him off, but I did want to add this fact to really validate this story: they have a morning show in India, and it's hosted by a person that is completely AI generated. And that's why actors are nervous about AI, because if you can create a morning show person that is not even real, and this is how they benefit: it doesn't call in sick, they don't have to pay for the health benefits, it doesn't take vacation. And that is, from a business perspective, where people see the value of AI, because of the fact that it doesn't complain. But with the program in place, that programmer can be on vacation, that programmer can be in New York, that programmer can be in Haiti, that programmer can be in Japan. They're doing the work, or they just switch off to somebody else on another eight-hour shift. And if it becomes popular, because they say this AI-generated character is very popular, Ken, you don't have to pay that person ten million dollars a year.

[00:25:34] Speaker 2: I think, yeah, no, one hundred percent. Right, when you're using software, when you're using AI, right, we're talking about maybe protecting yourself from scams. So now you not only have, like, you know, traditionally you had a person or a team working on trying to develop these scams, but now you have AI working on it. This thing is twenty-four-seven and never sleeps, right. You know, it's always coming, always improving. And so, right, I think you need to give consumers, users, these tools, this education, to protect themselves better in this new world.

[00:26:04] Speaker 1: Well, I'll tell you something, brother, I get on my email every day, I click on stuff, and I had to educate myself, Ken. I had to go and have my staff, it's just real, it's just real, and finally Rushion stopped, and I had to learn the tricks. But of course, when you learn a trick, there's a better trick created, okay? And I'm just telling my listeners, the reason I brought Ken on the show is to say, here's free solutions that you can walk in and use, and if you need more advanced, from a consumer standpoint or a business standpoint, they have a subscription. As you said earlier, overlays, the easy detection: drag it over to your software, it goes in there; you put overlays on different software. I just know that this, we all know, is not going away, Ken, so we need to just strap up to the reality and deal with it so you can secure your money, so one day you don't open your bank account and all your money is gone. Am I right in that assessment?
[00:27:03] Speaker 2: Right, I think the first step is education. Right, if you don't know that there's a problem, if you're not aware of it, you can't solve it. But, right, the velocity and the improvement of generative AI, this is akin to cybersecurity, right: new security threats get created all the time, and the security needs to adapt. And this is the same thing. New AI will be generated at all times, and we're in the game of continuously keeping up, trying to stay ahead. But to do that, right, I think we're close, and very soon it will be impossible to differentiate with the human eye what is AI generated and what is not.

[00:27:44] Speaker 1: Absolutely, and that's why we need you, that's why we need your company to be at the front of it all. BitMind. Again, tell everybody how we can go there, see what you're doing on your website, and also download the Chrome app.

[00:28:02] Speaker 2: Go to thedetector dot ai. Repeat: thedetector dot ai. That's our web application. And then if you want to have the kind of interactive Chrome extension, go to the Chrome Web Store, type in AI detector or BitMind. It'll pop up and you can download it, and then, right, information will be overlaid on your screen, whether this image or video you're looking at is either AI generated or real.

[00:28:29] Speaker 1: Yeah, he has a James Bond-type name, Ken Jon Miyachi, but he's a superstar in the field of protection, especially cybersecurity, deepfakes. The reality is that it's coming. If you don't believe it, then go to his website, Google stuff. But again, if you Google it now, remember, they're in the business of deepfakes too. So again, let's go to a place, or find a home, that will be decentralized and give us an opportunity to see the truth. Thank you, Ken, for coming on Money Making Conversations Masterclass and allowing myself, the storyteller, to talk to a subject matter expert like you.

[00:29:04] Speaker 2: Rushion, thank you so much for having me, man, I appreciate it.

[00:29:06] Speaker 1: This has been another edition of Money Making Conversations Masterclass, hosted by me, Rushion McDonald. Thank you to our guest on the show today, and thank you to the audience for listening. Now, if you want to listen to any episode or want to be a guest on the show, visit Moneymakingconversations dot com. Our social media handle is Money Making Conversations. Join us next week, and remember to always lead with your gifts.
Keep winning.