Joe Weisenthal: Hello, and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.

Tracy Alloway: And I'm Tracy Alloway.

Joe Weisenthal: Tracy, do you remember, a few weeks ago or maybe a couple of months ago now, someone made a fake AI version of the Odd Lots podcast and replicated our voices?

Tracy Alloway: Yes, I think they used ChatGPT to do it. They basically asked ChatGPT to create a script for Odd Lots, and then they had another app replicate the voices.

Joe Weisenthal: It wasn't a someone, it was some other AI app. And it was pretty good. I mean, it clearly was not us.

Tracy Alloway: How much did it worry you, Joe?

Joe Weisenthal: I think I'm more of an AI doomer worrier than you are.

Speaker 3: Interesting.

Tracy Alloway: I mean, I have concerns, don't get me wrong. But what do you think is the worst-case scenario? Is it robots taking over the world?

Alex Blania: No.

Joe Weisenthal: I think that in a couple of years, the AI will do a really good job of making the Odd Lots podcast, and people will say, I don't really need to listen to Joe and Tracy anymore. The AI is... I don't know. But I feel like maybe that's all paranoia.

Tracy Alloway: No, I feel like there is reason for concern. I hope, and maybe I'm just being an optimist here, I hope we are a few years away from the Odd Lots replicant.

Joe Weisenthal: I know. I want to do this for like another thirty years, and I don't want to just have three or four years, or a few years.

Tracy Alloway: That's the optimistic scenario, is it? Doing this in thirty years' time?

Speaker 3: Okay?

Alex Blania: All right.

Tracy Alloway: AI. Lots of people are expressing some worries about it.

Joe Weisenthal: A million different kinds of worries.
Tracy Alloway: Yeah, and we see all these, you know, good ways that the technology is being used, but also insidious ways. As you pointed out, someone made an Odd Lots episode. I think that was in a very fun-spirited way, a nice way. But there are people who use this technology for more nefarious things, yeah.

Joe Weisenthal: I think there was a 60 Minutes segment recently where someone was able to impersonate someone's voice. You could imagine this scam where someone's kid calls and says, hey, I need money or something like that, and it sounds just like them, especially in a couple of years when the tech gets better. And it becomes a bit of a crisis in the sense of: do I know that a person is a person when they talk to me on a Zoom, when they talk to me on a podcast, when they talk to me on a phone call, et cetera?

Tracy Alloway: Well, even on Twitter, on a much more basic level, you have all these bots. And I don't know about you, but there seems to be a new Tracy Alloway with the L as an I fooling everyone almost every few weeks, and then you have to go through the whole bureaucracy of reporting it to Twitter and proving that you are in fact the original Tracy Alloway.

Joe Weisenthal: And not to get into too much of a tangent, but wasn't Elon going to solve this? Because he said, oh, I'm buying Twitter to solve the bots problem, and, as you say, it's not been solved.

Tracy Alloway: I think that's a whole other episode.

Joe Weisenthal: But even with really rudimentary, human-made fake Tracys and fake Joes, that's the most rudimentary thing, just taking our avatar and changing one of the L's in Tracy Alloway to an I or whatever. That's just gonna get worse.

Tracy Alloway: You know what's a really good business model? Creating a problem and then creating the solution to it.

Joe Weisenthal: Yes, that's exactly right.
Joe Weisenthal: So we have all of these AIs, and they're going to replicate us. They're going to look like us, they're going to talk like us. But the good news, in a sense, is that others are working on the problem of how you prove that you are who you are in a world in which AIs can replicate our voices and our faces and our expressions.

Tracy Alloway: Technology can save us from technology?

Joe Weisenthal: That is the thrust of this, right, exactly. So we are going to be speaking to someone working on the technology designed to address this concern, in part. And as you say, some of the people working on it are some of the same people behind some of the most cutting-edge AI research that's sort of blowing everyone's minds in both good and scary ways these days.

Tracy Alloway: So I'm just going to go ahead and lay out my priors here, which are that I find the topic of this podcast, what we're going to be discussing, sort of instinctively dystopian. This technology creeps me out a little bit. But that said, I don't know that much about it. I don't know much about how it works, and so I am very interested to hear the bull case on eyeball scanning.

Joe Weisenthal: So you just jumped right in. We're going to be talking eyeball scanning. We are going to be speaking with Alex Blania. He is the co-founder of Worldcoin, an entity that was set up also by Sam Altman, a name that everybody knows now because he's the co-founder, or the founder, of OpenAI, which powers ChatGPT and other tools. The other co-founder is Max Novendstern. We're going to be talking about how an eyeball scanner can allow us to theoretically prove we're human in this future AI world. So Alex, thank you so much for coming on Odd Lots.

Alex Blania: Thank you for having me. I'm excited to talk.
Joe Weisenthal: So I'm trying to think about the threat, how real is it? In a few years, do you think, not whether AI will be able to create a perfect replica of the podcast, but, let's say we're having this conversation, that I might really not know that I am talking to the real Alex Blania versus an AI that could perfectly replicate your voice and replicate your face. Is that two years away? Is that three years away? How soon is that coming?

Alex Blania: Well, I think partially we are already there. The technology is not completely open source yet, so you don't have widely deployed systems that can do that. Rather, right now you have a few teams that actually can do things like that, or would be able to. But this is going to change, because of course the progress is increasing. It started with DALL-E, and you had Stable Diffusion and many other things on the image generation side that became increasingly impressive, and I think AI art is one of the coolest things of the last year, to be honest. So that's on the image generation side of things, and then of course you have ChatGPT on the text generation side of things. So on Twitter, as you discussed in the introduction, I think you are already there: it could happen that you just chat with a neural network and have absolutely no idea that you're chatting with a neural network, and you might argue about Odd Lots for an hour and not realize it.

Tracy Alloway: I'm just thinking, like half the people I argue with on Twitter probably are.

Joe Weisenthal: Bots. Neural networks.

Tracy Alloway: Yeah. Well, maybe just to step back in time a little bit, I mean, Worldcoin has been around for a few years now, and to some degree, as Joe mentioned, you guys kind of anticipated this threat.
What was the sort of aha moment that made this all come together? How did this actually develop into a company, and what was the use case thesis when you were just setting out?

Alex Blania: Well, we honestly did not, we didn't start off to solve a threat. To give a brief history: we started the company a little bit over three years ago, and back then I was doing research in deep learning applied to quantum physics, so I basically predicted quantum systems with neural networks, which I think is a very promising field in physics. And Sam was already working on the overall idea together with Max, so I think they had already been on it for like six months. But of course Sam had a full-time job at OpenAI, and Max had a full-time job back then, so it was all in an early idea phase. Then I got an email from Max with an early paper describing the vision of Worldcoin and saying that I should come by and talk. And then, after a couple of interviews with Max and one with Sam, we spent a lot of time together and actually decided to proceed.

But to go back to the initial proposition, and I think that is very important: it's not about solving a threat. I mean, Sam was already running OpenAI, and this was three years ago, so back then it was not yet an accepted topic that AI is going to happen and it's going to be as powerful as it is today. It was a very niche topic, I think, although OpenAI of course was already a quite famous entity. But of course Sam was motivated by the idea of AI and what it's going to do to the world, and that it's going to be this very powerful technology.
And so the initial proposition of Worldcoin was really: what is the societal infrastructure that we can build that makes sure that AI is for the better of all of society versus a few? And there are many things that actually come together here. Proof of personhood is what you just described, the ability to prove that you're actually a human being on the internet, which is basically just a much more capable identity system. But it goes as far as: we think eventually UBI will become a thing, and it actually is a big chance if you have a very, very powerful technology. It's not about political redistribution, but you have a very centralizing technology and you want to share the upside with the world. And that was another motivation: Worldcoin can actually be the infrastructure to do that, because you have a very capable identity system and you have an economic system that connects everyone. So there were a couple of very ambitious reasons why we built something like that. Basically, creating and bootstrapping the largest identity and financial network that is actually open and privacy-preserving, versus relying on centralized government systems that might not work in large parts of the world, is a very important proposition for the coming decades.

Joe Weisenthal: So let's just jump right to it. If you're listening to this podcast, you might want to actually watch the video version, we're recording this on video as well, for people on just the audio. Here on the set, I was gonna wait a little while before unveiling it, but this, what I'm holding in my hand...

Tracy Alloway: Like we need extra music.

Joe Weisenthal: ...is a big orb. It feels about six pounds, I don't know. How heavy is this, Alex? I'm sure you know the exact weight.

Alex Blania: It's a little bit over one kilogram.
Joe Weisenthal: Oh okay, so I was wrong on that. So that's just about two and a half pounds. Okay, that's fine, about two and a half pounds. And it's an eyeball scanner. So here it is, and it's not working right now, or we're not scanning right now, so I can tilt this at our faces and it's not scanning. But let's talk about what this piece of hardware that I'm actually holding is. I'm going to set it on the table in front of me. It's an eyeball scanner. What does it do?

Alex Blania: I will start with the short answer and then the longer answer. The short answer is that it first ensures that whatever is interacting with the device is an actual human being. So, not a display, not a photograph of an eyeball, not more sophisticated optical attacks, because there are many things that you can do here. So that's the first step, checking that whatever we see is an actual human being. And then the second piece is it takes an image of my eye and calculates a unique embedding out of that picture. Everything happens on the device, and that embedding is then signed by the device, and that's the only thing that actually leaves the device and gets compared against all other users. And if that uniqueness check is successful, then I, having just verified with an orb, can later do a zero-knowledge proof that I am included in that set. And if you're not familiar with zero-knowledge proofs, what they allow you to do is prove something without actually revealing the underlying information. So what that enables us to do is build essentially a self-custodial identity network where the data control is with the user, and there's no centralized entity that actually has the information of who you are, where you're from, all of these things.
Rather, the only thing known is that this person is a unique human being and has verified with Worldcoin before, while the user can decide to attach more information to that. So that's kind of the short of it. I think we should talk much more about the why for a second, because that's not trivial. But yeah, that's it.

Tracy Alloway: Yeah, maybe just to help us get to the why, you could talk about what would be a practical use case for this technology. Because if I think about something like, you know, you mentioned UBI, universal basic income. If I think a government starts some sort of UBI program, then I would assume that they identify people by something like a Social Security number or whatever the equivalent might be in that country. So what does this do that those traditional systems of identity do not?

Alex Blania: Well, when you start with traditional identity, the really hard thing for governments is actually making sure everyone only has one passport, and that, with an identity, is called deduplication. It's a really hard problem, because all the systems, all the technologies you usually use in your life, let's say your iPhone's Face ID, what they actually do is just re-authenticate you. It realizes I'm the same person logging into that phone again, and that's a one-to-one comparison, which is fairly easy to do. What is really hard to do is making sure, okay, Alex did not sign up already, compared to a billion other people. And so that's why many governments actually have biometric systems in place, because otherwise their Social Security number systems would completely break. But that's not rolled out globally. In fact, a good mental model is that about fifty percent of the global population actually have a digitally verifiable identity. That's a huge number of people that just don't have it, right? And so I think the United States is pretty good, or at least decent.
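To make the one-to-one versus one-to-many distinction concrete, here is a minimal Python sketch of a deduplication check of the kind Alex describes. The binary iris codes, the Hamming-distance comparison, and the 0.32 threshold are illustrative assumptions for the sketch, not Worldcoin's actual model or thresholds; as Alex notes, the real system computes and signs an embedding on the orb and compares it against the set of previously enrolled users.

    # Illustrative sketch only: toy numbers, not Worldcoin's actual model or thresholds.
    import numpy as np

    def hamming_distance(a: np.ndarray, b: np.ndarray) -> float:
        """Fraction of differing bits between two binary iris codes."""
        return float(np.mean(a != b))

    MATCH_THRESHOLD = 0.32  # hypothetical: below this, two codes count as the same eye

    def reauthenticate(probe: np.ndarray, enrolled: np.ndarray) -> bool:
        """One-to-one check (the Face ID-style problem): is this the same person as before?"""
        return hamming_distance(probe, enrolled) < MATCH_THRESHOLD

    def is_unique(probe: np.ndarray, registry: list[np.ndarray]) -> bool:
        """One-to-many deduplication check: has this eye been enrolled anywhere in the set?"""
        return all(hamming_distance(probe, seen) >= MATCH_THRESHOLD for seen in registry)

    # Toy usage: 1024-bit random codes stand in for real iris embeddings.
    rng = np.random.default_rng(0)
    registry = [rng.integers(0, 2, 1024) for _ in range(5)]
    new_code = rng.integers(0, 2, 1024)
    if is_unique(new_code, registry):
        registry.append(new_code)  # enroll only if there is no prior match anywhere in the set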
Alex Blania: It's actually not a really good identity solution. Europe in parts is pretty capable. The strongest is India, although there are definitely privacy concerns there, because they don't use zero-knowledge proofs and things like that. But for example, India has a program called Aadhaar that also used iris recognition to roll out to the whole population, and it really accelerated the whole economy like crazy. So they had Aadhaar, and then they built UPI on top, which is a payment rail. And whenever you talk to people that are in that area of technology, everyone always brings up India as the most prominent example of what identity plus a strong financial rail can actually do to accelerate an economy.

And this is what Worldcoin is doing. Basically, Worldcoin is able to create a deduplication set, so it provides me with the ability to prove that I'm actually unique against everyone else. But then there's also World ID, which is an identity protocol on top of that. And let's start with the basics. The first thing I could do is log into Twitter and say I don't already have a Twitter account, without actually revealing my name. And then, thinking a little bit forward, you could even cryptographically sign every tweet you make, so that you lay a trace of, okay, this is actually Alex versus anyone else. And I think this is really where this is going to go, right, because this progress around AI is definitely exponential. So we will get to a place where you will have to authenticate pretty much everything you do on the internet. So that's the basics. But then with World ID, which is a protocol that lets you add verifiable credentials and many other things to an identity, you can actually bootstrap a whole identity system on top.
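The "cryptographically sign every tweet" idea Alex mentions is ordinary public-key signing: a key held only by the verified person signs each post, and anyone can check the post against the published public key. A minimal sketch using Ed25519 from the Python cryptography package follows; tying the public key to a World ID or profile is an assumption for illustration, not a Worldcoin API.

    # Illustrative sketch: ordinary Ed25519 signing, not a Worldcoin or Twitter API.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # The verified person generates a keypair once and publishes the public key
    # (for example, attached to their profile or, hypothetically, their World ID).
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    tweet = "This is actually Alex.".encode()
    signature = private_key.sign(tweet)  # created only by the holder of the private key

    # Anyone can verify the post against the published public key.
    try:
        public_key.verify(signature, tweet)
        print("signature valid: the key holder wrote this")
    except InvalidSignature:
        print("signature invalid: not from the key holder")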
Alex Blania: So that was like a quick rant. But no, it's good.

Tracy Alloway: So it's not that it's proving that you are, you know, a Joe Weisenthal or a Tracy Alloway. It's proving that there is only one of you, without necessarily the ID attached or the name attached.

Alex Blania: That is exactly it, that is the whole point. But then, what World ID lets you do, so if you're not into crypto, this whole idea of self-custody is very important, and I'm pretty sure it came up before. What it lets you do is: you basically verify at the orb and you receive your World ID. And then it might be that for some applications you actually want to prove that you're over eighteen, or that you're Joe Weisenthal. Then with the World ID protocol, which is literally just an open-source protocol that every developer, every company could implement, you as a user can decide to also attach additional information to your World ID, so: this is my name, or this is my KYC information. And then you can decide to reveal that to some applications, with zero-knowledge proofs as well, but no one other than you actually has that information.

Joe Weisenthal: So, God, I have like hundreds of questions, there are so many different ways to go. But one basic idea: you keep mentioning zero-knowledge proofs. Just to get the basic idea, and this is the shortest version: you know, I go to websites, sometimes government websites, and you enter your Social Security number. But the idea with a zero-knowledge proof is that I prove to the government that I know my Social Security number without telling the government what my Social Security number is.

Alex Blania: Well, I mean, that one is actually tricky, because in that case you would reveal the Social Security number.
A much better way to explain it is: a website asks you, are you on any kind of sanctions list?

Tracy Alloway: Right.

Alex Blania: Or are you over eighteen, right? Or are you actually a US citizen? These things you could prove with zero knowledge. Basically, a proof runs on your phone that you're not included in a sanctions list, let's say, and the only thing that comes out and ends up with the website is yes or no. The website does not know the Social Security number you checked against.

Joe Weisenthal: So then the other question I have, and the obvious thing is, look, scanning your eyeball is gonna make anyone anxious, and I'll be totally upfront, even I'm, you know, a little bit anxious about it. How do you establish that, unlike, say, Clear at the airport, which also uses an eyeball scan so that you can go through security, there's no centralized database of our irises? And how do you establish or prove to people that there's no database? Because, if my understanding is right, there has been reporting that at least at some point in the process of the development of Worldcoin, as you've grown as a startup, at one point or at least currently there was an iris database.

Alex Blania: Okay, so there are many things here. I'm going to start with explaining how we develop comfort around the idea and why it's necessary. So, first of all, I hated the idea of building the orb when we actually started working on Worldcoin. It was absolutely horrific, because we were four people sitting in a small apartment in San Francisco, and the idea of building hardware devices that take biometric data and enroll biometrics globally was a horrific idea back then.
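To picture the sanctions-list and over-eighteen examples a moment ago: the private attribute stays on the user's phone, and the relying website only ever receives a yes/no answer plus a proof it can check. The Python sketch below shows only that interaction shape; the proof itself is a stubbed placeholder, since a real deployment would use an actual zero-knowledge proving system, and all names here are illustrative rather than Worldcoin's API.

    # Sketch of the interaction shape only: the proof here is a stub, not real
    # zero-knowledge cryptography (a production system would use a zk-SNARK or
    # similar proving system). All names and fields are illustrative assumptions.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Claim:
        statement: str   # what is being asserted, e.g. "age >= 18"
        result: bool     # the only fact the website learns
        proof: bytes     # opaque blob a verifier could check; stubbed here

    def prove_over_18(birthdate: date, today: date) -> Claim:
        """Runs on the user's phone. The birthdate itself never leaves this function."""
        age = (today - birthdate).days // 365  # rough age, good enough for the sketch
        return Claim("age >= 18", age >= 18, b"zk-proof-placeholder")

    def website_check(claim: Claim) -> bool:
        """The relying website sees only the yes/no answer and the proof blob."""
        # A real system would verify the proof cryptographically here.
        return claim.result and claim.proof is not None

    claim = prove_over_18(date(1990, 5, 1), date.today())
    print(website_check(claim))  # True or False; the site never sees the birthdate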
Alex Blania: One, because it's very complicated and expensive. And then, yeah...

Tracy Alloway: I imagine pitching that one to VCs would be kind of tricky.

Alex Blania: Right. Oh, it was, it was. It was horrific. I mean, Worldcoin is also a crypto project, and, yeah, we started in the middle of a bear market. So I remember back then when we talked to the first investors, they're like, okay, other than Ethereum or Bitcoin nothing will exist, all of this is nonsense, what are you talking about, AI? No one cares about that. Also, what are you even doing? But again, neither me nor anyone else was really excited about building a hardware device. But really, we spent almost a year doing research on pretty much everything you could do to solve the problem, right? And solving the problem means: okay, issuing a privacy-preserving identity that works globally, not only in the United States, not only in Europe, what do we have to do? And there are basically three big answers to this. One is you use government KYC, which might be a sufficient answer for some parts of the world, but it was pretty shocking to realize for how few, right? So that very quickly got disqualified. Second is what people call web of trust, this whole idea that you just build reputation between people, and that also, in short, just didn't work. And then the third is biometrics and everything that surrounds that whole field. And within biometrics, we then built prototypes for pretty much everything you could imagine. We built palm scanners, we built kind of a Face ID-like implementation, we built fingerprint scanners, all these things. But the short of it is, to solve this deduplication problem, this uniqueness check, you need a lot of entropy, so, information about each user.
If you don't have that, your error rate explodes exponentially. So, simple example: if you would use Face ID to solve the same problem, after tens of millions of users you would have to reject everyone. You don't have a constant error rate, like five percent constantly rejected; you just hit a wall and your system breaks. So face doesn't work, and the same is true for fingerprints, and the only thing that really works, and is proven to work, is iris recognition. So that's why we ended up there. And then even with iris recognition, we had to build our own hardware, we had to build our own lens, which sounds absolutely ridiculous, but we kind of had to build this whole device from the ground up to make that happen. So much for the why.

And how do we think about this whole headline of comfort around this? There are a couple of things. One, actually build a privacy-preserving system that goes way further than what you're used to. And that's really what we did, I think, because there's no information about you that you actually reveal by using Worldcoin, and I think that's pretty remarkable. It's definitely counterintuitive, because you're like, okay, we use eyeball scanning, and yeah. So that's headline one. Headline two, and that's a journey for sure, is open sourcing everything. I think that's one of the core parts of the crypto ethos: you don't have to trust, you can verify. The hardware is completely open source already, the software in meaningful parts, the protocol completely, and everything will be open source in the coming months. So that's answer two. And then three: it actually is way less of a thing than you would think.
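Alex's point about error rates exploding can be made concrete with one line of arithmetic: if a single pairwise comparison has false-match probability p, a new user checked against N already-enrolled people collides with at least one of them with probability 1 - (1 - p)^N, which grows roughly like N times p. A small sketch with made-up rates, not measured numbers for Face ID, fingerprints, or the orb:

    # Back-of-the-envelope sketch of why entropy matters for deduplication.
    # The false-match rates below are hypothetical placeholders, not measured numbers.
    def p_any_false_match(pairwise_fmr: float, enrolled: int) -> float:
        """Chance a brand-new user falsely matches at least one of `enrolled` records."""
        return 1 - (1 - pairwise_fmr) ** enrolled

    N = 10_000_000
    print(p_any_false_match(1e-6, N))   # ~0.99995: nearly every new user wrongly rejected
    print(p_any_false_match(1e-12, N))  # ~1e-5: still workable at tens of millions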
Alex Blania: So we have now 1.7 million users, roughly six hundred thousand of whom use the app on a monthly basis. I travel the world very frequently, I talk to a lot of users on the ground and just try to understand: what's your feeling, what do you understand, what do you not understand? And, broadly speaking, this is way less of a concern, which obviously does not lower our bar, but it's the case.

Tracy Alloway: So I have two things on that, and I think they're sort of related. One, why does open source seem to be such a thing in AI in particular? And then secondly, what is the crypto role in this project? Because, you know, we mentioned Worldcoin. I can only imagine what it was like pitching to VCs in the midst of a crypto bear market, saying we want to scan everyone's eyeballs, and oh, by the way, there's also a crypto element to this as well, a token. Why have that component of it?

Alex Blania: I mean, at this point I'm actually not an AI expert anymore, because I think I'm basically two years out of the field. I went into crypto, which is an interesting journey also.

Tracy Alloway: Soon you'll be going back into AI, probably. That's what all the crypto people are doing now, right?

Alex Blania: Well, no...

Tracy Alloway: Joking.

Alex Blania: So why is open sourcing such a thing around AI specifically? It is that you have a technology that is probably as foundational as electricity. It's still, I think, incomprehensible to people what a big deal this is, right? This is one of the biggest paradigm shifts in technology that has ever happened. And there are different approaches for how to deal with this and all the underlying risks, basically two different lines of thought. One is you open source everything:
everyone should be able to verify and understand how everything is happening. And the other is, well, you actually kind of have to lock down these models, because if they end up in the wrong hands, then malicious actors have a lot of power. And OpenAI is definitely at the center of that discourse, because it started with open sourcing, and then the team realized throughout the years that this whole premise of open sourcing is probably a terrible idea. So that's why it's a discussion within AI. Within crypto, it is just that the whole thesis of crypto is that you build actual protocols, not companies. You build systems that are not dependent on a small group of people, that can be completely verified, and that can run over decades without being interrupted by anyone, and open source is just the core of this, together with decentralization. So within AI I actually do think it's a very overblown discussion; I think within crypto it's almost the core thesis.

Tracy Alloway: And so the crypto element with Worldcoin, as I understand it, is to insert a degree of, I guess, community control over that technology, right? There's some talk about the tokens coming with voting rights and things like that.

Alex Blania: There are so many things there. One is, I think, something in crypto that, if you're not deeply in the space, is hard to realize what a big deal it is: this idea that tokens build business models for networks, almost. So you don't have a company, but rather you have, let's say, Ethereum. Ethereum is a very powerful protocol, but it has kind of an underlying business model. It's not that Vitalik gets rich by that; there are fees on the system.
Well, he did, but because he was early, not because he still makes money now. It's not that the company is printing money and then Vitalik gets rich every month; rather, you have a token and you have a fee structure, and as the utility increases, the network can basically sustain itself. It can sustain its security, it can sustain its operation, and things like that. So you basically find a way to fund a decentralized operation and just let it sustain itself over time. And that is one of the big core things here: when Worldcoin works, it's really foundational infrastructure. I think, without being overconfident, it is a very big deal, because it's a very foundational piece of technology that just doesn't exist right now. That should not be in the hands of a few people; rather, it should be governed by a wider group of people, and it should actually be decentralized, so that it cannot break just because I'm in a bad mood, let's say. That would be bad.

Tracy Alloway: So instead of having a centralized actor who controls the orb, which is a sentence I didn't think I would say today, you have this decentralized group of investors slash stakeholders. Is that how it works?

Speaker 3: Yep, got it.

Alex Blania: This sparks a whole other part of the conversation, but basically how it works is: the orb is an open-source hardware device. Right now, the company that I'm the CEO of, Tools for Humanity, is the only one that produces these orbs. But this will change. If you fast forward two, three years into the future, you will have many companies that produce their own implementation of the orb, following the same standards, that connect to the Worldcoin protocol. So you basically...

Joe Weisenthal: Apple could produce an orb, theoretically.
Alex Blania: Apple, Microsoft, whatever, they could produce their own orb, follow their own standards, make it, design it the way they want, but it will connect to the protocol. And that will allow for just a very foundational protocol on the internet that everyone can use and that doesn't break. And the token distributes the governance and also the incentive mechanism, all around the same goal.

Joe Weisenthal: So you talked about the importance of, okay, if this is going to be this foundational infrastructure that everyone uses in some way to verify themselves, and also perhaps the rails for some universal basic income that we all need because, you know, AI is going to put half of us out of jobs in theory, then, okay, it better not be controlled, no offense, just by Alex Blania and Sam Altman, that'd be kind of weird. But how do we know that's not the case? And I want to go back to a question I asked specifically: in the past, have you had an iris database?

Alex Blania: So how it works is, and it's not a secret: when you verify with Worldcoin, let's go through the flow. Let's say you hear about Worldcoin, you're excited about it, whatever, you want to sign up. You download an app. Right now there's only one app that lets you do that, but soon that will also change. Right now it is called World App, and it's a non-custodial wallet. You click "verify now," a map pops up, hopefully with an orb near you. You go to an orb and you show your QR code. Thirty seconds later you receive your World ID, and you also receive Worldcoin on a rolling basis; every week you receive a small amount of Worldcoin. And in that flow, because we're in beta, we actually did not launch yet, right, the launch is something that's going to happen relatively soon, it's also why we're talking right now, why it's a very important moment in time for us.
622 00:32:57,960 --> 00:33:00,640 Speaker 5: There, you had two options, opt in or opt out. Opt out is
623 00:33:00,680 --> 00:33:04,320 Speaker 5: basically, we don't have any data of you. Opt in
624 00:33:04,400 --> 00:33:09,440 Speaker 5: means we basically have custody of your data, for the
625 00:33:09,520 --> 00:33:12,320 Speaker 5: reason that the neural networks that are on this device
626 00:33:12,400 --> 00:33:14,960 Speaker 5: right now are still in development, and so if you
627 00:33:15,440 --> 00:33:19,760 Speaker 5: opt out, it could happen that you have to reverify
628 00:33:19,800 --> 00:33:22,560 Speaker 5: in a year or so, because we basically just updated
629 00:33:22,560 --> 00:33:25,560 Speaker 5: the model as we went. But
630 00:33:26,040 --> 00:33:29,360 Speaker 5: where this is going, very clearly, by, and I
631 00:33:29,400 --> 00:33:32,000 Speaker 5: think actually at launch, it will be completely opt out.
632 00:33:32,880 --> 00:33:33,480 Speaker 4: This is just
633 00:33:35,360 --> 00:33:38,560 Speaker 5: the nature of being in beta and developing technology
634 00:33:38,560 --> 00:33:38,880 Speaker 5: as we go.
635 00:33:39,880 --> 00:33:41,719 Speaker 2: So I take the point that you're in beta at
636 00:33:41,720 --> 00:33:45,760 Speaker 2: the moment. But, you know, the arc of recent technological
637 00:33:45,800 --> 00:33:48,719 Speaker 2: innovation seems to be that people often come up with
638 00:33:48,760 --> 00:33:51,560 Speaker 2: something with the best of intentions, or in order to
639 00:33:51,680 --> 00:33:56,320 Speaker 2: solve an identified problem, and then it often gets used
640 00:33:56,480 --> 00:34:01,240 Speaker 2: in the worst possible way. So what's, like, the terrible
641 00:34:01,640 --> 00:34:05,280 Speaker 2: use case arc of this technology? Because one thing that
642 00:34:05,320 --> 00:34:08,040 Speaker 2: springs to mind, and, you know, we could talk about
643 00:34:08,120 --> 00:34:11,920 Speaker 2: robot invasions and, like, Terminator-style AI takeovers, but, like,
644 00:34:12,040 --> 00:34:15,920 Speaker 2: one more realistic thing that would spring to mind is, okay,
645 00:34:16,200 --> 00:34:18,799 Speaker 2: you set something like this up, and then as you
646 00:34:18,880 --> 00:34:21,520 Speaker 2: walk through a shopping mall, you know, there's some other
647 00:34:21,600 --> 00:34:25,359 Speaker 2: technology that's scanning your eyeballs, identifying who you are, and
648 00:34:25,440 --> 00:34:29,600 Speaker 2: maybe pitching targeted ads or something like that. Like, how
649 00:34:29,640 --> 00:34:35,160 Speaker 2: do you separate the identity verification from companies using that
650 00:34:35,760 --> 00:34:37,319 Speaker 2: for other purposes?
651 00:34:38,080 --> 00:34:42,839 Speaker 4: Yep. So we actually have quite an
652 00:34:44,160 --> 00:34:47,319 Speaker 5: in-depth process for extreme scenarios in the company right now, at
653 00:34:47,320 --> 00:34:49,960 Speaker 5: Tools for Humanity, and we will publish this as we go.
654 00:34:50,800 --> 00:34:54,400 Speaker 5: We just tried to come up with the worst,
655 00:34:54,800 --> 00:34:59,360 Speaker 5: weirdest scenarios, of, like, whatever, China is attacking the network with
656 00:34:59,400 --> 00:35:01,640 Speaker 5: billions or the like. What could actually happen?
657 00:35:01,680 --> 00:35:02,719 Speaker 4: What could go wrong? Right?
658 00:35:03,160 --> 00:35:07,799 Speaker 5: So the thing is, zero-knowledge proofs separate your
659 00:35:07,840 --> 00:35:13,200 Speaker 5: wallet from the kind of uniqueness check. The only
660 00:35:13,280 --> 00:35:16,680 Speaker 5: thing that could happen, and to be clear, in some
661 00:35:16,760 --> 00:35:19,040 Speaker 5: cases that actually, like, if you think ten years in
662 00:35:19,080 --> 00:35:21,000 Speaker 5: the future and you think, like, Worldcoin is this very powerful
663 00:35:21,040 --> 00:35:26,480 Speaker 5: technology, what could actually be bad is people could say, okay,
664 00:35:26,560 --> 00:35:29,640 Speaker 5: that individual is verified with Worldcoin, yes or no, right,
665 00:35:29,719 --> 00:35:32,240 Speaker 5: and that might, like, you could definitely think of scenarios
666 00:35:32,280 --> 00:35:36,040 Speaker 5: of, like, okay, China is banning Worldcoin because it's
667 00:35:36,200 --> 00:35:39,399 Speaker 5: literally the antithesis of what they're building with their
668 00:35:39,440 --> 00:35:40,040 Speaker 5: identity system,
669 00:35:39,800 --> 00:35:43,200 Speaker 4: because it's privacy preserving, and
670 00:35:43,280 --> 00:35:46,279 Speaker 5: it might actually be bad for you
671 00:35:46,360 --> 00:35:48,759 Speaker 5: to be verified with Worldcoin or whatever, right. So
672 00:35:48,800 --> 00:35:50,640 Speaker 5: there are some such scenarios here. But I think
673 00:35:50,680 --> 00:35:53,160 Speaker 5: that is pretty much the worst case scenario you're
674 00:35:53,160 --> 00:35:57,399 Speaker 5: looking at. Because otherwise, like, let's go even
675 00:35:57,400 --> 00:35:59,840 Speaker 5: to the far extreme, which, to be very clear, is
676 00:35:59,880 --> 00:36:00,880 Speaker 5: not what is the case.
677 00:36:01,400 --> 00:36:05,000 Speaker 4: Let's say there would be, like, a huge database
678 00:36:04,600 --> 00:36:08,360 Speaker 5: of biometric images of all users.
679 00:36:09,120 --> 00:36:12,440 Speaker 4: It's not connected to your actual account, so
680 00:36:12,840 --> 00:36:17,440 Speaker 5: you still have, there are multiple
681 00:36:17,880 --> 00:36:19,120 Speaker 5: layers of privacy in between
682 00:36:19,160 --> 00:36:21,160 Speaker 4: that make sure that nothing really bad is happening there.
683 00:36:22,520 --> 00:36:27,279 Speaker 1: Well, I'm gonna get into a slightly macabre arc of
684 00:36:27,320 --> 00:36:30,759 Speaker 1: this, or maybe a sci-fi angle, but since we're just
685 00:36:30,800 --> 00:36:34,320 Speaker 1: talking about bad things happening, I'm just gonna go there,
686 00:36:34,800 --> 00:36:39,800 Speaker 1: which is eyeball theft. Someone cutting out, uh, the eye of a
687 00:36:39,960 --> 00:36:44,320 Speaker 1: really important person who uses Worldcoin to verify themselves
688 00:36:44,360 --> 00:36:49,920 Speaker 1: on some network, a criminal attacker physically maiming someone, like,
689 00:36:50,800 --> 00:36:54,000 Speaker 1: I imagine these sorts of scenarios, or digging up
690 00:36:54,480 --> 00:36:57,759 Speaker 1: the dead for their eyeballs. These must come up
691 00:36:57,800 --> 00:37:01,120 Speaker 1: in your conversations about really extreme scenarios. Can you talk
692 00:37:01,160 --> 00:37:03,319 Speaker 1: about some of these sort of, like, weird, out-there
693 00:37:03,360 --> 00:37:04,400 Speaker 1: physical risks?
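Alex's point above, that zero-knowledge proofs separate the wallet from the uniqueness check so an application only learns "unique verified human: yes or no," is the kind of property a nullifier scheme gives you (Semaphore-style identity protocols are a common example, though the transcript does not specify Worldcoin's exact construction). The toy below shows only that data flow: it uses plain hashes, so unlike a real zero-knowledge proof it does not actually hide the identity secret, and every name in it is illustrative.

```python
# Toy illustration of "uniqueness without linkage": each app stores only
# per-app nullifiers, never the identity secret or the wallet address.
# This is NOT a zero-knowledge proof; a real system would prove knowledge of
# identity_secret without revealing it. All names are illustrative.
import hashlib


def h(*parts: str) -> str:
    """Hash helper standing in for the commitments a real protocol would use."""
    return hashlib.sha256("|".join(parts).encode()).hexdigest()


class App:
    """An application that wants at most one action per unique human."""

    def __init__(self, app_id: str) -> None:
        self.app_id = app_id
        self.seen_nullifiers: set[str] = set()

    def claim(self, nullifier: str) -> bool:
        if nullifier in self.seen_nullifiers:
            return False               # same human already claimed in this app
        self.seen_nullifiers.add(nullifier)
        return True


def make_nullifier(identity_secret: str, app_id: str) -> str:
    # Same human + same app gives the same nullifier (duplicates are caught),
    # while different apps get unrelated nullifiers; in a real zero-knowledge
    # construction neither the secret nor the wallet is revealed to the app.
    return h(identity_secret, app_id)


if __name__ == "__main__":
    social, airdrop = App("social"), App("airdrop")
    alice, bob = "alice-secret", "bob-secret"
    print(social.claim(make_nullifier(alice, "social")))    # True: first claim
    print(social.claim(make_nullifier(alice, "social")))    # False: duplicate human
    print(airdrop.claim(make_nullifier(alice, "airdrop")))  # True: different app, unlinkable
    print(social.claim(make_nullifier(bob, "social")))      # True: different human
```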
694 00:37:05,280 --> 00:37:07,319 Speaker 5: Yeah, all of these sign ups would not work,
695 00:37:07,400 --> 00:37:12,200 Speaker 5: because the Orb would realize that
696 00:37:12,320 --> 00:37:16,160 Speaker 5: these eyes are not alive and are not connected
697 00:37:16,200 --> 00:37:17,960 Speaker 5: to an actual person. This is the
698 00:37:18,239 --> 00:37:22,440 Speaker 2: moment on Odd Lots when we talk about degenerating eyeball tissue.
699 00:37:23,120 --> 00:37:24,680 Speaker 1: But I don't know...
700 00:37:24,600 --> 00:37:26,719 Speaker 4: everything is part of the discussion.
701 00:37:26,800 --> 00:37:29,000 Speaker 1: But no, no, no, I mean, I am serious. I'm
702 00:37:29,000 --> 00:37:31,240 Speaker 1: curious, really, like, what is a liveness
703 00:37:31,320 --> 00:37:33,040 Speaker 1: check like? And I guess it goes back to your
704 00:37:33,120 --> 00:37:36,440 Speaker 1: question of, like, why eyeballs, not thumbprints, or why not
705 00:37:36,480 --> 00:37:39,160 Speaker 1: fingerprints or face ID or your gait or something. Like,
706 00:37:39,239 --> 00:37:42,719 Speaker 1: talk to us about that verification. Why can't the eyeball
707 00:37:42,760 --> 00:37:45,520 Speaker 1: be replicated? Why wouldn't it work if not attached to a
708 00:37:45,560 --> 00:37:48,440 Speaker 1: live person? Like, these must come up.
709 00:37:48,480 --> 00:37:49,399 Speaker 1: This is what would freak me out.
710 00:37:50,719 --> 00:37:50,959 Speaker 4: Sure.
711 00:37:51,040 --> 00:37:54,520 Speaker 5: So, so the device has multiple sensors in front; there's,
712 00:37:54,520 --> 00:37:56,759 Speaker 5: like, a lot of compute on this device. There's,
713 00:37:56,800 --> 00:38:01,000 Speaker 5: like, a separate GPU and several neural networks that
714 00:38:01,320 --> 00:38:07,640 Speaker 5: basically run at all times, and what a large part
715 00:38:07,680 --> 00:38:09,919 Speaker 5: of these networks is doing is just to make sure
716 00:38:09,960 --> 00:38:14,040 Speaker 5: that whatever we see is an actual human being, alive,
717 00:38:14,160 --> 00:38:17,560 Speaker 5: no fraud attack, no kind of weird cases. And then
718 00:38:17,640 --> 00:38:20,439 Speaker 5: in front there's multiple sensors that make sure
719 00:38:20,480 --> 00:38:22,920 Speaker 5: that that's the case. And you could even go as
720 00:38:22,960 --> 00:38:25,680 Speaker 5: far as, like, an AI trying to attack the system,
721 00:38:25,760 --> 00:38:28,080 Speaker 5: because that can also happen in the future, right, so
722 00:38:28,160 --> 00:38:31,719 Speaker 5: all of that should not happen. And so what the
723 00:38:31,840 --> 00:38:34,880 Speaker 5: Orb does: it images multi-spectrally. So what that means
724 00:38:34,920 --> 00:38:39,520 Speaker 5: is across multiple wavelengths in the electromagnetic spectrum, all
725 00:38:39,560 --> 00:38:45,239 Speaker 5: at the same time. And this is just, like, you
726 00:38:45,239 --> 00:38:49,160 Speaker 5: can think of very sophisticated optical table attacks, like, I
727 00:38:49,239 --> 00:38:51,160 Speaker 5: did a lot of this physics when I was younger.
728 00:38:52,280 --> 00:38:57,279 Speaker 5: You can think of these things, but it's extremely expensive,
729 00:38:58,120 --> 00:39:00,480 Speaker 5: and I think even that will not be possible in the future.
730 00:39:00,520 --> 00:39:02,920 Speaker 5: So it's just really, really hard to fool this device.
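As a purely illustrative toy of the multi-spectral idea Alex describes (image across several wavelengths and check that the result is consistent with live tissue), here is a sketch. The band names, reference values, and tolerance are invented for illustration; nothing here reflects the Orb's actual sensors or neural networks, which the conversation does not detail.

```python
# Purely illustrative toy: combine readings from several wavelength bands into a
# liveness score by comparing the measured profile against a hypothetical
# live-tissue reference. Band names, reference values, and the tolerance are
# placeholders, not the Orb's real pipeline.
from dataclasses import dataclass

REFERENCE_PROFILE = {"visible": 0.35, "near_ir": 0.62, "thermal": 0.80}  # hypothetical
TOLERANCE = 0.15  # allowed deviation per band before a reading looks suspicious


@dataclass
class MultiSpectralReading:
    reflectance: dict[str, float]  # band name -> measured reflectance in [0, 1]


def liveness_score(reading: MultiSpectralReading) -> float:
    """Return a score in [0, 1]; higher means more consistent with live tissue."""
    deviations = []
    for band, expected in REFERENCE_PROFILE.items():
        measured = reading.reflectance.get(band)
        if measured is None:
            return 0.0  # a missing band is treated as a failed check
        deviations.append(min(abs(measured - expected) / TOLERANCE, 1.0))
    return 1.0 - sum(deviations) / len(deviations)


if __name__ == "__main__":
    live = MultiSpectralReading({"visible": 0.33, "near_ir": 0.60, "thermal": 0.78})
    photo = MultiSpectralReading({"visible": 0.34, "near_ir": 0.20, "thermal": 0.25})
    print(round(liveness_score(live), 2))   # close to 1.0, plausibly live tissue
    print(round(liveness_score(photo), 2))  # much lower, e.g. a printed photo or a screen
```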
731 00:39:04,840 --> 00:39:08,200 Speaker 5: I mean, of course, to be clear, the infrastructure is there,
732 00:39:08,200 --> 00:39:10,440 Speaker 5: but the neural networks we are still building up,
733 00:39:10,480 --> 00:39:12,880 Speaker 5: so right now we are actually aware of a couple
734 00:39:12,920 --> 00:39:15,960 Speaker 5: of attacks that can still happen, and that will still
735 00:39:16,000 --> 00:39:17,319 Speaker 5: be the case for the next year. Like, all of
736 00:39:17,320 --> 00:39:20,560 Speaker 5: this technology is not perfect yet, so it will still take time,
737 00:39:20,640 --> 00:39:22,960 Speaker 5: but at least the technology is there to make sure
738 00:39:22,960 --> 00:39:24,719 Speaker 5: that none of these attacks will ever go through.
739 00:39:26,080 --> 00:39:30,560 Speaker 2: So setting aside the worst case scenarios, maybe a slightly
740 00:39:30,960 --> 00:39:36,120 Speaker 2: easier question, but what is the path to adoption that
741 00:39:36,160 --> 00:39:39,319 Speaker 2: you see here? Because, like, there are a lot of
742 00:39:39,360 --> 00:39:42,000 Speaker 2: things, it feels like, to overcome. One is the creepiness
743 00:39:42,040 --> 00:39:45,239 Speaker 2: factor of having your eyeball scanned. But secondly, you know,
744 00:39:45,239 --> 00:39:49,200 Speaker 2: if you're touting this as a way to identify people
745 00:39:49,280 --> 00:39:52,080 Speaker 2: who might be in need of some sort of official
746 00:39:52,120 --> 00:39:55,760 Speaker 2: identification in countries that don't necessarily have those systems yet
747 00:39:56,000 --> 00:39:59,319 Speaker 2: or have subpar systems, I imagine, like, there has to
748 00:39:59,360 --> 00:40:01,960 Speaker 2: be a reason why they would want to have
749 00:40:02,000 --> 00:40:05,799 Speaker 2: this done. You know, in most places, in almost all
750 00:40:05,840 --> 00:40:08,320 Speaker 2: places in the world, no one is getting universal basic
751 00:40:08,400 --> 00:40:11,839 Speaker 2: income yet. So what is the path to adoption, and
752 00:40:11,880 --> 00:40:15,239 Speaker 2: what are the incentives for people to do this?
753 00:40:15,960 --> 00:40:16,280 Speaker 1: Right?
754 00:40:16,640 --> 00:40:20,600 Speaker 5: This is kind of the everything question. Like, of
755 00:40:20,640 --> 00:40:23,080 Speaker 5: course, if I would have the perfect
756 00:40:22,760 --> 00:40:26,480 Speaker 4: answer to this, I would tweet it out.
757 00:40:26,600 --> 00:40:33,560 Speaker 6: But that's a good answer to every question we've ever asked. Keep going.
758 00:40:34,040 --> 00:40:39,359 Speaker 6: So we're worried about being replaced. We don't need that anyway. Sorry,
759 00:40:39,440 --> 00:40:40,000 Speaker 6: keep going.
760 00:40:41,120 --> 00:40:45,160 Speaker 5: Okay. So there's, like, a couple of different arcs
761 00:40:45,200 --> 00:40:47,719 Speaker 5: to the response to this question, right. So one is,
762 00:40:48,400 --> 00:40:50,879 Speaker 5: when we launch, there's going to be a token
763 00:40:50,920 --> 00:40:54,120 Speaker 5: with a capped supply that will play a very important
764 00:40:54,160 --> 00:40:58,279 Speaker 5: role in that kind of the whole evolving nature of
765 00:40:58,360 --> 00:41:03,080 Speaker 5: this project.
766 00:41:03,600 --> 00:41:05,680 Speaker 5: And when you sign up, you basically receive, on a running basis, ownership in the network
767 00:41:05,719 --> 00:41:08,920 Speaker 5: through the token. And I think that's exciting for a
768 00:41:08,960 --> 00:41:09,520 Speaker 5: lot of reasons.
769 00:41:09,600 --> 00:41:10,960 Speaker 4: Like, one, it's a direct incentive.
770 00:41:10,960 --> 00:41:13,719 Speaker 5: It's a direct, like, you literally get money, basically, that
771 00:41:13,800 --> 00:41:16,239 Speaker 5: you could sell, or you could do anything with.
772 00:41:17,200 --> 00:41:21,279 Speaker 5: So it's already, like, a very small form of, it's
773 00:41:21,280 --> 00:41:23,520 Speaker 5: not basic income, because there's not enough money, but it's a
774 00:41:23,680 --> 00:41:26,360 Speaker 5: small form of UBI in some sense. I think that
775 00:41:26,400 --> 00:41:28,239 Speaker 5: will motivate a lot of people. And we already see
776 00:41:28,280 --> 00:41:31,160 Speaker 5: it's, like, moving a lot of people. Like, we
777 00:41:31,200 --> 00:41:34,680 Speaker 5: have the crazy, like, literally, right now is the craziest
778 00:41:34,680 --> 00:41:38,080 Speaker 5: time for me running this project, because it's blowing up,
779 00:41:38,160 --> 00:41:42,440 Speaker 5: like, all the dashboards completely go vertical, and we have,
780 00:41:42,520 --> 00:41:46,360 Speaker 5: like, people flying from Japan to Portugal to sign up
781 00:41:46,400 --> 00:41:50,600 Speaker 5: and whatever. Like, it's pretty, really, it's pretty crazy. So
782 00:41:50,680 --> 00:41:56,280 Speaker 5: that certainly does something. That's one. Two is, of course, integrations.
783 00:41:57,120 --> 00:42:01,719 Speaker 5: So making sure that kind of large technology companies or
784 00:42:02,719 --> 00:42:08,200 Speaker 5: services actually integrate with World ID. That's gonna, I mean,
785 00:42:08,200 --> 00:42:09,839 Speaker 5: it's gonna take a while, because you have a chicken and
786 00:42:09,920 --> 00:42:12,080 Speaker 5: egg problem. You need a lot of users for that
787 00:42:12,160 --> 00:42:14,920 Speaker 5: to be interesting to products, and
788 00:42:14,719 --> 00:42:16,319 Speaker 4: so on and so forth. It's just, like, a little bit
789 00:42:16,360 --> 00:42:18,880 Speaker 4: of a balancing act. I'm sure it's going to happen.
790 00:42:20,440 --> 00:42:22,560 Speaker 5: And once that is the case, once you have, like,
791 00:42:22,640 --> 00:42:28,040 Speaker 5: major integrations, then kind of an actual flywheel is starting,
792 00:42:28,160 --> 00:42:30,480 Speaker 5: because as a user you will be able to use
793 00:42:30,480 --> 00:42:32,160 Speaker 5: a lot of services that you're just not able to
794 00:42:32,239 --> 00:42:34,480 Speaker 5: use without it. And so that's kind of, that's the,
795 00:42:34,600 --> 00:42:35,920 Speaker 5: that's the short of it. And I have a lot
796 00:42:36,200 --> 00:42:40,359 Speaker 5: more specific answers that are much more localized as well.
797 00:42:40,440 --> 00:42:43,839 Speaker 5: So, for example, we are in Buenos Aires, and so
798 00:42:43,840 --> 00:42:49,400 Speaker 5: we will work on local integrations in Argentina. So it
799 00:42:49,440 --> 00:42:52,719 Speaker 5: will not start with Twitter or Meta. It will start
800 00:42:52,760 --> 00:42:58,520 Speaker 5: with, like, very localized applications. But yeah, that's where it goes.
801 00:43:00,160 --> 00:43:03,000 Speaker 1: Since this is Odd Lots, I have a really quick question.
802 00:43:03,120 --> 00:43:05,400 Speaker 1: You know, you built this... I'm going to pick it
803 00:43:05,480 --> 00:43:07,000 Speaker 1: up again because it's really kind of fun, the whole
804 00:43:07,239 --> 00:43:09,279 Speaker 1: Orb... you built this piece of hardware, yeah, this Orb.
805 00:43:09,800 --> 00:43:12,640 Speaker 1: Do you have any interesting supply chain issues in
806 00:43:12,719 --> 00:43:16,000 Speaker 1: terms of the optics, or the semiconductors? I mean,
807 00:43:16,040 --> 00:43:18,160 Speaker 1: and how much does it cost right now
808 00:43:18,160 --> 00:43:18,799 Speaker 1: to build an Orb?
809 00:43:21,800 --> 00:43:26,280 Speaker 5: So we had, especially during COVID, we had crazy supply
810 00:43:26,440 --> 00:43:30,640 Speaker 5: chain crunches. So it's, like, generally, like, Elon talked a
811 00:43:30,680 --> 00:43:33,640 Speaker 5: lot about this in the earlier days, but I definitely
812 00:43:33,680 --> 00:43:36,160 Speaker 5: also ran into this. It's like, you have a perfectly
813 00:43:36,280 --> 00:43:39,960 Speaker 5: running prototype, and then it just feels trivial to produce
814 00:43:40,000 --> 00:43:44,960 Speaker 5: this thing, but it's just not. It was absolutely horrific.
815 00:43:45,160 --> 00:43:46,839 Speaker 5: Like, we had this prototype, and then
816 00:43:46,840 --> 00:43:49,320 Speaker 5: building up a production line and really being able to
817 00:43:49,360 --> 00:43:51,640 Speaker 5: produce, like, thousands of them was a completely different
818 00:43:51,680 --> 00:43:53,719 Speaker 5: challenge that almost took, like, a year or two
819 00:43:53,960 --> 00:43:57,600 Speaker 5: to complete. Doctors locked us in, and then within COVID
820 00:43:57,600 --> 00:44:02,480 Speaker 5: we had crazy supply chain issues, but it's now mostly gone.
821 00:44:02,680 --> 00:44:05,719 Speaker 5: The kind of, the industry has completely recovered.
822 00:44:06,000 --> 00:44:08,279 Speaker 2: What was the hardest component to get?
823 00:44:11,960 --> 00:44:15,880 Speaker 4: Oh, it was death by a thousand cuts.
824 00:44:16,239 --> 00:44:24,040 Speaker 5: It was everything from the computing unit to random capacitors,
825 00:44:24,400 --> 00:44:29,239 Speaker 5: weird ones that we use. We have, like, the
826 00:44:29,320 --> 00:44:33,200 Speaker 5: actual lens right now is single-source supplied, which is
827 00:44:33,239 --> 00:44:37,280 Speaker 5: something that obviously has to change. But since we
828 00:44:37,280 --> 00:44:39,120 Speaker 5: had to custom-build this lens, there's only one company in
829 00:44:39,160 --> 00:44:41,920 Speaker 5: the world that builds that lens, so that was
830 00:44:42,160 --> 00:44:43,640 Speaker 5: a really tricky one to kind of ramp up
831 00:44:43,680 --> 00:44:46,879 Speaker 5: over time. How much does it cost right now? Right
832 00:44:46,920 --> 00:44:52,040 Speaker 5: now it's still, we're still pretty expensive. But honestly, it
833 00:44:52,080 --> 00:44:55,920 Speaker 5: doesn't really matter, because, so right now it's, like, four thousand,
834 00:44:56,280 --> 00:44:58,480 Speaker 5: four to five thousand dollars, depending on the exact batch.
835 00:44:59,080 --> 00:45:01,440 Speaker 5: It's coming down to be around one thousand five hundred
836 00:45:01,440 --> 00:45:04,120 Speaker 5: dollars at mass manufacturing, and I think in a couple
837 00:45:04,160 --> 00:45:06,600 Speaker 5: of years it's going to be below five hundred dollars.
838 00:45:06,760 --> 00:45:09,319 Speaker 1: We've got to put the Orb
839 00:45:09,400 --> 00:45:12,000 Speaker 1: into the Fed's inflation basket. We're getting so much better
840 00:45:12,040 --> 00:45:14,520 Speaker 1: at producing eyeball-scanning Orbs.
841 00:45:15,040 --> 00:45:19,520 Speaker 2: Yes, yes, healthcare and medical care massively inflated, but we
842 00:45:19,560 --> 00:45:22,919 Speaker 2: can all scan our eyeballs cheaper and cheaper. Excellent.
843 00:45:23,320 --> 00:45:26,440 Speaker 1: Alex Blania, this was such a fascinating conversation. It's like,
844 00:45:26,480 --> 00:45:28,640 Speaker 1: I mean, there's, like, a million more things I would
845 00:45:28,719 --> 00:45:30,279 Speaker 1: want to ask, and maybe we'll have you back in
846 00:45:30,280 --> 00:45:32,799 Speaker 1: a year and then in two years, when we're all
847 00:45:32,920 --> 00:45:35,279 Speaker 1: on UBI because we've been put out of work by
848 00:45:35,360 --> 00:45:38,319 Speaker 1: AI, and so, you know, whatever it is. But this
849 00:45:38,440 --> 00:45:40,680 Speaker 1: was, this was far and away our most sort of,
850 00:45:40,719 --> 00:45:45,360 Speaker 1: like, futuristic, sci-fi-ish conversation we've ever had, not
851 00:45:45,480 --> 00:45:47,800 Speaker 1: something I would have ever contemplated in the past. So
852 00:45:47,920 --> 00:45:50,400 Speaker 1: thank you so much for coming on Odd Lots. Amazing.
853 00:45:50,440 --> 00:45:53,319 Speaker 4: I take that as a compliment. Thanks, absolutely.
854 00:45:53,719 --> 00:46:08,560 Speaker 3: Yeah, that was really interesting, Tracy.
855 00:46:08,600 --> 00:46:09,880 Speaker 1: Are you going to get scanned?
856 00:46:11,600 --> 00:46:11,920 Speaker 3: You know what?
857 00:46:11,960 --> 00:46:13,960 Speaker 2: I think I'm gonna wait a little bit until there's
858 00:46:13,960 --> 00:46:17,520 Speaker 2: some visible upside for me, which is kind of
859 00:46:17,560 --> 00:46:19,799 Speaker 2: that last question that I asked Alex. Which, you know,
860 00:46:19,920 --> 00:46:25,280 Speaker 2: I am reasonably, I understand the use case of having
861 00:46:25,400 --> 00:46:29,520 Speaker 2: some sort of, like, wallet or portable identification that you
862 00:46:29,560 --> 00:46:33,920 Speaker 2: can take with you across the internet and, like, various outlets.
863 00:46:33,960 --> 00:46:35,399 Speaker 2: That kind of makes sense to me, and I think
864 00:46:35,400 --> 00:46:39,840 Speaker 2: we've spoken about that use case before. However, at this
865 00:46:39,960 --> 00:46:43,520 Speaker 2: point in time, I don't see a lot of upside.
866 00:46:43,760 --> 00:46:46,000 Speaker 2: Uh, you know, maybe it would be easier to file
867 00:46:46,120 --> 00:46:49,840 Speaker 2: some claims against Twitter for, like, yeah, bot copycats, but
868 00:46:50,200 --> 00:46:50,839 Speaker 2: it's not much.
869 00:46:51,000 --> 00:46:53,600 Speaker 1: Yeah, but to your point, though, or,
870 00:46:53,640 --> 00:46:57,840 Speaker 1: really, or, sorry, to Alex's point, like, so it's on Twitter,
871 00:46:58,040 --> 00:47:00,279 Speaker 1: like, do they want to integrate Worldcoin? Yeah,
872 00:47:00,480 --> 00:47:03,720 Speaker 1: which, I don't actually get the impression that, like, I'm,
873 00:47:03,840 --> 00:47:06,319 Speaker 1: it doesn't seem like any of the big networks, like,
874 00:47:06,360 --> 00:47:08,920 Speaker 1: are anywhere, like, that's on the roadmap of anyone, right?
875 00:47:09,000 --> 00:47:12,320 Speaker 2: Well, this is the other thing I'm thinking about, is, uh,
876 00:47:12,360 --> 00:47:15,319 Speaker 2: you know, part of the Worldcoin use case is
877 00:47:15,360 --> 00:47:19,840 Speaker 2: this decentralization aspect of it. But what we've seen time
878 00:47:19,880 --> 00:47:24,319 Speaker 2: and time again throughout history is that the world tends
879 00:47:24,440 --> 00:47:28,759 Speaker 2: towards middlemen and verification of some sort. So I can
880 00:47:28,800 --> 00:47:31,800 Speaker 2: imagine a future where maybe you do get your eyeball scanned,
881 00:47:32,040 --> 00:47:35,960 Speaker 2: but maybe it's not by Worldcoin or a decentralized entity.
882 00:47:36,160 --> 00:47:40,000 Speaker 2: Maybe it's Twitter itself, and, this is even more dystopian,
883 00:47:40,120 --> 00:47:42,640 Speaker 2: you know, like, Twitter keeps its own database of your eyeballs.
884 00:47:42,640 --> 00:47:44,440 Speaker 1: Well, that's the thing. Like, I use Clear, you know,
885 00:47:44,560 --> 00:47:46,600 Speaker 1: like, I already, like, scanned my eyeball at the airport,
886 00:47:46,680 --> 00:47:49,560 Speaker 1: and it's, like, one hundred dollars. That's a centralized database.
887 00:47:49,600 --> 00:47:52,000 Speaker 1: I save a few minutes. It's like, I've already, I
888 00:47:52,080 --> 00:47:55,040 Speaker 1: could, when it comes to privacy, I've already, like, basically
889 00:47:55,200 --> 00:47:56,960 Speaker 1: given up. I've already given it all up to save
890 00:47:57,000 --> 00:48:00,600 Speaker 1: about five minutes in the airport line. So I guess I've, like, yeah,
891 00:48:00,680 --> 00:48:04,680 Speaker 1: you know, I'll get scanned, why not. But I'm skeptical,
892 00:48:04,719 --> 00:48:07,080 Speaker 1: you know. Actually, honestly, it's not even, like, the privacy
893 00:48:07,200 --> 00:48:10,920 Speaker 1: or anything like that, or, like, the dystopian scenarios, so
894 00:48:11,040 --> 00:48:14,480 Speaker 1: much as, like, I don't really want a scenario in
895 00:48:14,560 --> 00:48:17,239 Speaker 1: which, like, computers put us all out of jobs and
896 00:48:17,280 --> 00:48:19,239 Speaker 1: we need the UBI. And I think we didn't really
897 00:48:19,280 --> 00:48:22,560 Speaker 1: get into, like, this scenario in the future, like, where
898 00:48:22,560 --> 00:48:25,719 Speaker 1: will the political impulse for UBI come from? It'll come
899 00:48:25,760 --> 00:48:29,200 Speaker 1: from an economy that's so lopsided in who reaps
900 00:48:29,239 --> 00:48:31,239 Speaker 1: the benefits of AI that we, like, have to have
901 00:48:31,320 --> 00:48:34,080 Speaker 1: it for political stability, which is, that's, like, the scenario
902 00:48:34,120 --> 00:48:35,880 Speaker 1: that, like, freaks me out more than anything.
903 00:48:35,719 --> 00:48:38,799 Speaker 2: Right, in the future where you really need Worldcoin,
904 00:48:39,120 --> 00:48:41,680 Speaker 2: like, the most insidious thing is that you need it
905 00:48:41,719 --> 00:48:42,920 Speaker 2: because no one has jobs.
906 00:48:43,040 --> 00:48:45,479 Speaker 1: That's right, right. Like, more than anything else, the scan
907 00:48:45,760 --> 00:48:48,600 Speaker 1: or whatever, it's this scenario in which, oh, I need
908 00:48:48,600 --> 00:48:50,840 Speaker 1: to get my UBI from, like, somewhere, and, like, they
909 00:48:50,840 --> 00:48:53,719 Speaker 1: tax Sam Altman so that we can all... Like, that's,
910 00:48:53,760 --> 00:48:56,160 Speaker 1: like, the part, more than the scan or the privacy
911 00:48:56,320 --> 00:48:58,440 Speaker 1: or the, you know, the digging up the dead, that
912 00:48:58,480 --> 00:48:59,160 Speaker 1: freaks me out.
913 00:48:59,280 --> 00:49:01,960 Speaker 2: Yeah, I'm telling you, a great business model: create the
914 00:49:02,000 --> 00:49:04,920 Speaker 2: problem and the solution. On that happy note, shall we
915 00:49:04,960 --> 00:49:05,279 Speaker 2: leave it there?
916 00:49:05,360 --> 00:49:06,680 Speaker 1: Let's leave it there, all right.
917 00:49:06,600 --> 00:49:09,360 Speaker 2: This has been another episode of the Odd Lots podcast.
918 00:49:09,400 --> 00:49:11,879 Speaker 2: I'm Tracy Alloway. You can follow me on Twitter at
919 00:49:11,920 --> 00:49:12,640 Speaker 2: Tracy Alloway.
920 00:49:12,760 --> 00:49:15,480 Speaker 1: And I'm Joe Wisenthal. You can follow me on Twitter
921 00:49:15,560 --> 00:49:19,280 Speaker 1: at The Stalwart. Follow our guest Alex Blania, as he said,
922 00:49:19,360 --> 00:49:21,160 Speaker 1: he might just tweet it out, he might just
923 00:49:21,200 --> 00:49:23,000 Speaker 1: tweet out the answer to all of these things. He's
924 00:49:23,080 --> 00:49:27,240 Speaker 1: at Alex Blania. And follow our producers on Twitter: Carmen
925 00:49:27,320 --> 00:49:31,359 Speaker 1: Rodriguez at Carmen Arman and Dashiell Bennett at Dashbot.
926 00:49:31,520 --> 00:49:34,080 Speaker 1: And follow all of the Bloomberg podcasts under the handle
927 00:49:34,200 --> 00:49:38,040 Speaker 1: at Podcasts. And for more Odd Lots content, go to Bloomberg
928 00:49:38,040 --> 00:49:40,799 Speaker 1: dot com slash oddlots, where we have a blog, we
929 00:49:40,880 --> 00:49:43,920 Speaker 1: have a transcript, and a newsletter that comes out every Friday.
930 00:49:44,000 --> 00:49:47,400 Speaker 1: And check out our discord. It's super fun, people chatting
931 00:49:47,400 --> 00:49:50,160 Speaker 1: about Odd Lots stuff with other listeners twenty-four seven.
932 00:49:50,440 --> 00:49:53,400 Speaker 1: Go to discord dot gg slash oddlots.
933 00:49:53,520 --> 00:49:56,760 Speaker 2: Yeah, the discord is very fun. You should also stream
934 00:49:56,880 --> 00:50:01,799 Speaker 2: Bloomberg originals on Apple TV, Samsung, Roku. Tune in at
935 00:50:01,880 --> 00:50:30,040 Speaker 2: ten pm