1 00:00:04,840 --> 00:00:08,440 Speaker 1: On this episode of Newsworld. This past March, I was 2 00:00:08,560 --> 00:00:11,840 Speaker 1: visiting Maxwell Air Force Base in Montgomery, Alabama, where I 3 00:00:11,840 --> 00:00:14,480 Speaker 1: was speaking at the Joint Flag Officer Warfighting Course. 4 00:00:15,120 --> 00:00:17,320 Speaker 1: While I was there, I asked to talk with an 5 00:00:17,360 --> 00:00:22,680 Speaker 1: expert in emerging technology, frankly, the metaverse and cyber operations 6 00:00:22,680 --> 00:00:26,400 Speaker 1: in particular, and the dean introduced me to my guest today, 7 00:00:26,760 --> 00:00:29,960 Speaker 1: doctor Joshua Sipper. I have to tell you he did 8 00:00:30,040 --> 00:00:33,440 Speaker 1: such an amazing job of breaking down the concept of 9 00:00:33,440 --> 00:00:37,600 Speaker 1: the metaverse, a concept I frankly don't fully understand, but he 10 00:00:37,680 --> 00:00:41,800 Speaker 1: made it both understandable and interesting. So he's here to 11 00:00:41,920 --> 00:00:44,800 Speaker 1: help us make sense of the cyber world and the metaverse. 12 00:00:45,240 --> 00:00:48,560 Speaker 1: And I'm really pleased to welcome my guest, doctor Joshua Sipper. 13 00:00:48,880 --> 00:00:52,680 Speaker 1: He is a professor of Cyber Warfare Studies at the 14 00:00:52,720 --> 00:00:55,360 Speaker 1: Air Force Cyber College and he's the author of the 15 00:00:55,400 --> 00:01:10,280 Speaker 1: new book The Cyber Meta-Reality: Beyond the Metaverse. Josh, 16 00:01:10,319 --> 00:01:13,160 Speaker 1: thank you for joining us today. Thank you so much, sir. 17 00:01:13,240 --> 00:01:15,080 Speaker 1: It's great to be here with you today. I thought 18 00:01:15,120 --> 00:01:17,520 Speaker 1: for our audience, we could start by talking a little 19 00:01:17,520 --> 00:01:20,280 Speaker 1: bit about your background, because you have twenty one years 20 00:01:20,319 --> 00:01:25,120 Speaker 1: of technical and intelligence experience.
Could you sort of outline 21 00:01:25,120 --> 00:01:27,479 Speaker 1: for us how you got here? Well, I started 22 00:01:27,480 --> 00:01:30,000 Speaker 1: out as many people do in the military. I was 23 00:01:30,000 --> 00:01:32,880 Speaker 1: in the Air Force for several years, and I was 24 00:01:33,000 --> 00:01:38,840 Speaker 1: placed in the intelligence career field, specifically electromagnetic spectrum signals intelligence, 25 00:01:38,880 --> 00:01:41,640 Speaker 1: so I dealt a lot with electromagnetic warfare and 26 00:01:41,760 --> 00:01:45,320 Speaker 1: intelligence, surveillance, reconnaissance. I worked on the U-2 program collecting 27 00:01:45,360 --> 00:01:48,520 Speaker 1: intelligence for several years, and then for Lockheed Martin I 28 00:01:48,640 --> 00:01:50,640 Speaker 1: did the same job for a few years, and then 29 00:01:50,680 --> 00:01:54,920 Speaker 1: switched over to cyber working for General Dynamics, and then 30 00:01:54,960 --> 00:01:56,800 Speaker 1: over to the LeMay Center where you and I 31 00:01:56,880 --> 00:01:59,720 Speaker 1: met not too long ago, and I worked there in 32 00:02:00,160 --> 00:02:05,680 Speaker 1: a cyber billet, doing application development and web administration, SharePoint administration, 33 00:02:06,040 --> 00:02:08,639 Speaker 1: a lot of other technical type things. And then most 34 00:02:08,680 --> 00:02:11,360 Speaker 1: recently I wound up as a professor at the Cyber College, 35 00:02:11,480 --> 00:02:15,120 Speaker 1: teaching there and just having a great time exploring the 36 00:02:15,160 --> 00:02:18,400 Speaker 1: world of cyber. I also have to point out that, 37 00:02:18,720 --> 00:02:21,600 Speaker 1: in addition to all of your technical knowledge, you're 38 00:02:21,639 --> 00:02:25,399 Speaker 1: a writer with several novels and nonfiction titles.
In particular, 39 00:02:25,760 --> 00:02:29,440 Speaker 1: your novels include Runaway Swimmer, The Tower of Quail, and 40 00:02:29,600 --> 00:02:33,520 Speaker 1: Following Durant. Is there a particular pattern to your novels? Well, 41 00:02:33,639 --> 00:02:37,440 Speaker 1: I started out with historical fiction. The first novel is 42 00:02:37,480 --> 00:02:40,120 Speaker 1: based off of a family story actually that my great 43 00:02:40,160 --> 00:02:43,200 Speaker 1: aunt told my father when he was growing up, of 44 00:02:43,560 --> 00:02:47,960 Speaker 1: my fifth great grandfather's brother. His name was George Washington Skipper, 45 00:02:48,000 --> 00:02:50,480 Speaker 1: the name evolved from Skipper to Sipper over the generations. 46 00:02:50,480 --> 00:02:54,040 Speaker 1: But in any case, he was apparently abducted by a 47 00:02:54,120 --> 00:02:57,560 Speaker 1: Cherokee tribe who lived in the same area as them, 48 00:02:57,639 --> 00:03:00,839 Speaker 1: and he was half Cherokee. So that kind of thing 49 00:03:00,919 --> 00:03:03,400 Speaker 1: happened fairly often back in the day, but nobody ever 50 00:03:03,480 --> 00:03:05,600 Speaker 1: knew what happened to him. So I decided, hey, I'll 51 00:03:05,639 --> 00:03:07,839 Speaker 1: write a story of what might have happened to him 52 00:03:07,880 --> 00:03:11,000 Speaker 1: and everything. From there, I've kind of delved into historical 53 00:03:11,040 --> 00:03:13,960 Speaker 1: fiction and some science fiction as well. So I just 54 00:03:14,080 --> 00:03:17,720 Speaker 1: enjoy exploring new worlds. I guess that's part of what 55 00:03:17,800 --> 00:03:20,200 Speaker 1: this current book is about, too, is exploring a new world. 56 00:03:20,720 --> 00:03:24,000 Speaker 1: My pattern is really just trying to delve into ideas 57 00:03:24,080 --> 00:03:29,880 Speaker 1: and concepts, philosophies, etc.
Behind these fascinating spaces that people 58 00:03:30,360 --> 00:03:32,799 Speaker 1: maybe don't understand, and try to help them understand them better. 59 00:03:33,400 --> 00:03:37,200 Speaker 1: So your new book is entitled The Cyber Meta-Reality: 60 00:03:37,680 --> 00:03:40,560 Speaker 1: Beyond the Metaverse. For those of us who don't quite 61 00:03:40,600 --> 00:03:45,880 Speaker 1: get all this, can you kind of break that down? Yes, sir. Well, so, really, 62 00:03:46,000 --> 00:03:49,320 Speaker 1: the whole idea of a meta-reality is predicated on 63 00:03:49,400 --> 00:03:53,080 Speaker 1: this concept that a gentleman named Roy Bhaskar developed back 64 00:03:53,120 --> 00:03:56,240 Speaker 1: in two thousand, and I'm gonna give you a mouthful 65 00:03:56,280 --> 00:03:58,240 Speaker 1: real quick, but then I'm gonna break it down for you. 66 00:03:58,320 --> 00:04:01,600 Speaker 1: But it's based off of this concept called transcendental dialectical 67 00:04:01,640 --> 00:04:05,160 Speaker 1: critical realism. Just repeat that slowly. That's such a mouthful 68 00:04:05,160 --> 00:04:08,520 Speaker 1: you could get a PhD just off that. It really 69 00:04:08,720 --> 00:04:14,840 Speaker 1: is transcendental dialectical critical realism, and that, shortened, would be 70 00:04:15,080 --> 00:04:18,800 Speaker 1: meta-reality. I knew if we had you on, Josh, 71 00:04:18,839 --> 00:04:22,080 Speaker 1: that it would take us into worlds none of us understand. 72 00:04:22,440 --> 00:04:25,800 Speaker 1: And you've already succeeded in one sentence.
Well, the way 73 00:04:25,800 --> 00:04:28,719 Speaker 1: he breaks it down: Roy Bhaskar sees 74 00:04:28,760 --> 00:04:31,480 Speaker 1: the world as a reality that can be studied at 75 00:04:31,520 --> 00:04:35,279 Speaker 1: a variety of space-time layers and levels, and that 76 00:04:35,480 --> 00:04:39,600 Speaker 1: nature itself is divided into different strata and domains, collectively 77 00:04:39,880 --> 00:04:44,160 Speaker 1: comprising a meta-reality, or a reality of and about reality. 78 00:04:44,279 --> 00:04:47,880 Speaker 1: So it's basically, if you've heard of the term the multiverse. 79 00:04:48,440 --> 00:04:50,279 Speaker 1: Some of us really aren't too familiar with what that 80 00:04:50,440 --> 00:04:53,719 Speaker 1: is either. It's the idea that we live in a 81 00:04:53,839 --> 00:04:58,480 Speaker 1: particular reality ourselves, but our reality was somehow generated out 82 00:04:58,560 --> 00:05:03,440 Speaker 1: of the space-time continuum that includes many other possible realities. 83 00:05:03,640 --> 00:05:07,560 Speaker 1: If I understand it, in physics there's now a theory 84 00:05:07,600 --> 00:05:13,760 Speaker 1: that there may be parallel, multiple universes that don't quite 85 00:05:13,800 --> 00:05:18,040 Speaker 1: connect with each other, but they literally exist in the 86 00:05:18,160 --> 00:05:22,359 Speaker 1: space-time continuum. Yes, and that's completely hypothetical, I mean, really, 87 00:05:22,400 --> 00:05:24,640 Speaker 1: when it comes down to it, since we could never 88 00:05:25,200 --> 00:05:30,280 Speaker 1: sense those other universes or experience them physically ourselves, we 89 00:05:30,279 --> 00:05:34,719 Speaker 1: can only hypothesize that they exist. However, in the cyber 90 00:05:35,240 --> 00:05:39,719 Speaker 1: reality that we live in, we do experience other people's realities, 91 00:05:39,839 --> 00:05:43,240 Speaker 1: other types of realities.
We can touch those, we can 92 00:05:43,320 --> 00:05:47,000 Speaker 1: delve into those, we can actually connect to these other 93 00:05:47,240 --> 00:05:50,599 Speaker 1: concepts and ideas and realities. When I started researching the 94 00:05:50,640 --> 00:05:53,599 Speaker 1: book and then writing it as well, I started to see, wow, 95 00:05:53,640 --> 00:05:56,280 Speaker 1: this would be a really interesting way to sort of 96 00:05:56,320 --> 00:06:00,400 Speaker 1: model what a multiverse would look like by just delving 97 00:06:00,440 --> 00:06:04,760 Speaker 1: into what we can actually intellectually and informationally see and touch, 98 00:06:05,160 --> 00:06:08,960 Speaker 1: so to speak, these other realities within cyberspace. And that 99 00:06:09,000 --> 00:06:12,239 Speaker 1: goes into the dark web and the deep web, into 100 00:06:12,440 --> 00:06:16,400 Speaker 1: places like China where their whole reality of cyberspace is 101 00:06:16,400 --> 00:06:19,640 Speaker 1: completely different from ours, and North Korea where their whole 102 00:06:20,000 --> 00:06:23,840 Speaker 1: idea of cyberspace is completely different from ours, and even Europe. 103 00:06:23,880 --> 00:06:26,760 Speaker 1: You know, all around the world we're interconnecting cultures and 104 00:06:26,839 --> 00:06:30,920 Speaker 1: realities in different and exciting fundamental ways. So in a sense, 105 00:06:30,960 --> 00:06:33,719 Speaker 1: are you also suggesting that the culture you bring to 106 00:06:33,839 --> 00:06:37,640 Speaker 1: it changes your experience of what it is, so there's 107 00:06:37,680 --> 00:06:41,640 Speaker 1: not necessarily objective reality, but there are a series of 108 00:06:41,680 --> 00:06:46,279 Speaker 1: experienced realities that are culturally dependent? Yes, I think that 109 00:06:46,279 --> 00:06:48,480 Speaker 1: that is some of it.
I also do believe that 110 00:06:48,520 --> 00:06:52,480 Speaker 1: there have to be unifying threads of objective reality. You know, 111 00:06:52,520 --> 00:06:54,960 Speaker 1: we're already trying to develop those kinds of threads of 112 00:06:54,960 --> 00:06:58,160 Speaker 1: objective reality in the way that we set up norms 113 00:06:58,360 --> 00:07:03,279 Speaker 1: and policy and security across cyberspace, wherever that may be. 114 00:07:03,640 --> 00:07:06,040 Speaker 1: In a lot of ways, cyberspace and the cyber meta 115 00:07:06,080 --> 00:07:08,760 Speaker 1: reality is still the Wild West. And that's another part 116 00:07:08,760 --> 00:07:10,360 Speaker 1: of the reason for this book, is to try to 117 00:07:10,480 --> 00:07:13,280 Speaker 1: establish some science behind all of this. Because, you know, 118 00:07:13,320 --> 00:07:17,480 Speaker 1: we have science that governs and has taxonomies and ontologies 119 00:07:17,520 --> 00:07:22,600 Speaker 1: and ontogenies for biology, for instance, and for astrophysics and 120 00:07:22,680 --> 00:07:25,840 Speaker 1: for other sciences. But when it comes to this new reality, 121 00:07:25,840 --> 00:07:28,520 Speaker 1: this new domain in which we are existing as human beings, 122 00:07:28,520 --> 00:07:33,400 Speaker 1: we don't really have that common thread, that common unifying reality. 123 00:07:33,760 --> 00:07:35,960 Speaker 1: And so I'm trying to draw out the fact that, yes, 124 00:07:36,000 --> 00:07:39,480 Speaker 1: we do exist very subjectively within this. But how can 125 00:07:39,520 --> 00:07:42,440 Speaker 1: we make this more objective? How can we apply science 126 00:07:42,480 --> 00:07:47,160 Speaker 1: and scientific knowledge and schemas to the cyber meta reality 127 00:07:47,200 --> 00:07:50,720 Speaker 1: that have not yet been applied to it? So let 128 00:07:50,800 --> 00:07:52,840 Speaker 1: me ask you just a couple of clarifying things.
If 129 00:07:52,880 --> 00:07:57,080 Speaker 1: I understand it, the concept of a meta reality or 130 00:07:57,120 --> 00:08:01,680 Speaker 1: a metaverse is basically a way of describing the totality 131 00:08:01,760 --> 00:08:06,000 Speaker 1: within which everything else exists. What is Zuckerberg trying to 132 00:08:06,040 --> 00:08:09,480 Speaker 1: do with his name change, which sort of moves in 133 00:08:09,520 --> 00:08:12,320 Speaker 1: this direction? But what is he trying to achieve? Yeah, 134 00:08:12,360 --> 00:08:14,560 Speaker 1: I think what he's trying to really do is draw 135 00:08:14,760 --> 00:08:18,680 Speaker 1: several different threads together into this unifying whole. Now, I 136 00:08:18,880 --> 00:08:22,320 Speaker 1: do want to say that the metaverse is really a 137 00:08:22,360 --> 00:08:24,720 Speaker 1: subset of what I'm talking about as the meta reality. 138 00:08:24,800 --> 00:08:27,640 Speaker 1: It's just one piece of a larger whole. But the 139 00:08:27,720 --> 00:08:31,840 Speaker 1: metaverse is really trying to bring together not just the 140 00:08:31,960 --> 00:08:35,840 Speaker 1: knowledge spaces and the experience spaces like virtual reality and augmented reality.
141 00:08:35,880 --> 00:08:38,559 Speaker 1: That's a big part of it, but also the commerce 142 00:08:38,640 --> 00:08:41,960 Speaker 1: piece of it. Because if you look at things like bitcoin, 143 00:08:42,240 --> 00:08:45,839 Speaker 1: things like non-fungible tokens, and we talked a bit about 144 00:08:45,880 --> 00:08:49,920 Speaker 1: this at Maxwell a few months ago, non-fungible tokens basically 145 00:08:49,960 --> 00:08:56,760 Speaker 1: are things like property and clothing and automobiles, houses in cyberspace, 146 00:08:56,960 --> 00:08:59,640 Speaker 1: and people are paying bitcoin, which is based off of 147 00:08:59,679 --> 00:09:04,479 Speaker 1: real currency, to purchase these non-fungible tokens, these houses 148 00:09:04,520 --> 00:09:09,240 Speaker 1: and cars and clothes and even food in cyberspace. In fact, 149 00:09:09,240 --> 00:09:11,320 Speaker 1: not too long ago, McDonald's bought all the rights to 150 00:09:11,320 --> 00:09:15,839 Speaker 1: become the official restaurant of the metaverse. And I found 151 00:09:15,840 --> 00:09:18,600 Speaker 1: that so fascinating because, let's say, for instance, in order 152 00:09:18,600 --> 00:09:21,240 Speaker 1: for you to continue to survive in the metaverse, you 153 00:09:21,280 --> 00:09:23,480 Speaker 1: have to eat. Well, in order to eat, you have 154 00:09:23,559 --> 00:09:27,680 Speaker 1: to have non-physical food.
I know this sounds crazy, 155 00:09:27,760 --> 00:09:30,079 Speaker 1: but the thing is, really, I think what Zuckerberg is 156 00:09:30,120 --> 00:09:32,160 Speaker 1: really getting at, and what McDonald's and all these other 157 00:09:32,160 --> 00:09:34,080 Speaker 1: corporations are really trying to get at is, hey, we 158 00:09:34,160 --> 00:09:37,760 Speaker 1: have human beings who are entering and living in this 159 00:09:37,880 --> 00:09:41,760 Speaker 1: new reality space more and more. How are we going 160 00:09:41,840 --> 00:09:45,160 Speaker 1: to take that and allow them to live their most 161 00:09:45,760 --> 00:09:51,120 Speaker 1: realistic life within this non-physical space? And that includes 162 00:09:51,160 --> 00:09:54,079 Speaker 1: food and includes clothes. In fact, this one site called 163 00:09:54,120 --> 00:09:57,320 Speaker 1: Decentraland, they use a currency that they create themselves 164 00:09:57,360 --> 00:10:03,520 Speaker 1: called MANA. They sell NFTs of property. They'll take a city, 165 00:10:03,640 --> 00:10:05,719 Speaker 1: or create a digital twin of a city like New York, 166 00:10:05,760 --> 00:10:08,800 Speaker 1: and then they'll divide it up into apartments and buildings, etc. 167 00:10:09,480 --> 00:10:12,520 Speaker 1: I think Snoop Dogg actually bought a large property in 168 00:10:13,040 --> 00:10:17,080 Speaker 1: virtual New York City in Decentraland as an NFT. So you 169 00:10:17,120 --> 00:10:21,800 Speaker 1: see these people who are investing in virtual real estate already, 170 00:10:21,920 --> 00:10:24,320 Speaker 1: and it's growing, it's growing. Hold on, let me slow 171 00:10:24,360 --> 00:10:26,760 Speaker 1: you down on behalf of those of us who don't 172 00:10:26,800 --> 00:10:32,520 Speaker 1: quite get it. So somebody buys Fifth Avenue in the metaverse, 173 00:10:33,480 --> 00:10:39,320 Speaker 1: what has he actually acquired? He's acquired a non-reproducible asset.
174 00:10:39,720 --> 00:10:44,920 Speaker 1: So whenever they produce this property, they make it where 175 00:10:44,960 --> 00:10:47,000 Speaker 1: you can, I don't know exactly all the fundamentals of 176 00:10:47,040 --> 00:10:48,400 Speaker 1: how they do this, but they make it where you 177 00:10:48,440 --> 00:10:50,280 Speaker 1: can't produce any more of it. I don't know if 178 00:10:50,280 --> 00:10:52,440 Speaker 1: it's a rights thing or legal or what it is, 179 00:10:52,480 --> 00:10:56,680 Speaker 1: but basically it's non-reproducible. And so whenever someone buys that 180 00:10:56,840 --> 00:10:59,600 Speaker 1: virtual property, let's say they want to divide up Fifth 181 00:10:59,600 --> 00:11:03,080 Speaker 1: Avenue and resell it for a profit in MANA or 182 00:11:03,160 --> 00:11:06,520 Speaker 1: bitcoin or ethereum, whatever it may be, they can do 183 00:11:06,559 --> 00:11:08,680 Speaker 1: that and they can actually make money off of that 184 00:11:08,800 --> 00:11:11,520 Speaker 1: virtual real estate in the future, and then people can 185 00:11:11,720 --> 00:11:15,360 Speaker 1: live, quote unquote, within that virtual real estate. For that 186 00:11:15,480 --> 00:11:18,760 Speaker 1: to happen, you have to find another cuckoo who wants 187 00:11:18,800 --> 00:11:22,400 Speaker 1: to commit and buy your property in a non-existent space. 188 00:11:23,400 --> 00:11:26,080 Speaker 1: This is worse than the old Florida land boom, when there 189 00:11:26,160 --> 00:11:28,160 Speaker 1: used to not be any land where the land boom was 190 00:11:28,360 --> 00:11:30,960 Speaker 1: because it was actually part of a marsh. I know, 191 00:11:31,080 --> 00:11:49,120 Speaker 1: it's crazy, it's crazy, but it's what's happening. In that context, 192 00:11:49,240 --> 00:11:53,080 Speaker 1: what are the national security implications?
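[Editor's note: For readers who want a concrete picture of the "non-reproducible" property Dr. Sipper describes, here is a minimal, illustrative Python sketch of a fixed-supply ownership registry, loosely in the spirit of ERC-721-style NFTs. The class, parcel names, and buyers are invented for illustration; no real platform's API is shown, and real NFTs enforce these rules on a blockchain rather than in a single program.]

```python
# Illustrative sketch only: a fixed-supply registry of unique tokens,
# loosely modeled on how ERC-721-style NFTs enforce scarcity and
# ownership. All names here are hypothetical, not a real platform's API.

class ParcelRegistry:
    def __init__(self, parcel_ids):
        # The full supply is fixed at creation; there is deliberately
        # no method to mint more parcels later ("non-reproducible").
        self._owners = {pid: None for pid in parcel_ids}

    def buy(self, parcel_id, buyer):
        # First sale of an unowned parcel.
        if parcel_id not in self._owners:
            raise KeyError("no such parcel; the supply is fixed")
        if self._owners[parcel_id] is not None:
            raise ValueError("already owned; must be resold by its owner")
        self._owners[parcel_id] = buyer

    def resell(self, parcel_id, seller, buyer):
        # Only the current owner can transfer the token.
        if self._owners.get(parcel_id) != seller:
            raise PermissionError("seller does not own this parcel")
        self._owners[parcel_id] = buyer

    def owner_of(self, parcel_id):
        return self._owners[parcel_id]

# Example: a hypothetical "virtual Fifth Avenue" divided into parcels.
registry = ParcelRegistry(["5th-ave-001", "5th-ave-002", "5th-ave-003"])
registry.buy("5th-ave-001", "alice")
registry.resell("5th-ave-001", "alice", "bob")
```

The point of the sketch is the absence of any mint method after construction: resale and subdivision profits depend entirely on the fixed supply plus a willing buyer, which is exactly the dynamic discussed above.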
So I think that 193 00:11:53,280 --> 00:11:57,600 Speaker 1: we're beginning to see a lot of movement within the 194 00:11:57,679 --> 00:12:01,280 Speaker 1: metaverse and within the meta reality as a whole as 195 00:12:01,280 --> 00:12:04,160 Speaker 1: far as where we keep our information and assets. And 196 00:12:04,200 --> 00:12:06,080 Speaker 1: I mean, this has been going on for decades now. 197 00:12:06,120 --> 00:12:08,400 Speaker 1: You know, a lot of people are storing, they're investing 198 00:12:08,960 --> 00:12:13,040 Speaker 1: their money, keeping their assets in these online spaces, and 199 00:12:13,080 --> 00:12:17,400 Speaker 1: of course the diplomatic, information, military, and economic pieces 200 00:12:17,440 --> 00:12:21,439 Speaker 1: of everything that we do from a national security perspective 201 00:12:21,880 --> 00:12:23,760 Speaker 1: are wrapped up in that. I mean, you think of 202 00:12:23,760 --> 00:12:26,360 Speaker 1: the OPM hack and what happened, you know, years ago, 203 00:12:26,520 --> 00:12:29,040 Speaker 1: with all of the Social Security numbers coming into the 204 00:12:29,040 --> 00:12:32,480 Speaker 1: clear to China, who hacked that database and brought that out. 205 00:12:32,880 --> 00:12:36,199 Speaker 1: Those are things that can be exploited, can be used 206 00:12:36,240 --> 00:12:41,080 Speaker 1: for future hacks, for future information, for espionage, for blackmail, 207 00:12:41,559 --> 00:12:44,720 Speaker 1: any number of different things. So the idea that we're 208 00:12:44,720 --> 00:12:48,560 Speaker 1: continuously sort of expanding this information space means that we 209 00:12:48,640 --> 00:12:52,200 Speaker 1: have a much larger footprint of information out there that 210 00:12:52,280 --> 00:12:56,720 Speaker 1: can be attacked, exploited, and used for nefarious purposes against 211 00:12:57,200 --> 00:13:00,480 Speaker 1: America and our allies.
Let me walk you through just 212 00:13:00,520 --> 00:13:03,240 Speaker 1: a little bit of history and have you sort of 213 00:13:03,280 --> 00:13:06,240 Speaker 1: either correct me or expand on it. It seems to 214 00:13:06,280 --> 00:13:11,640 Speaker 1: me that a really powerful case can be made that 215 00:13:11,880 --> 00:13:17,920 Speaker 1: the early information warfare capability of both breaking the Japanese 216 00:13:17,960 --> 00:13:23,559 Speaker 1: code and breaking the German Enigma system changed the probable 217 00:13:23,640 --> 00:13:26,679 Speaker 1: course of World War Two. Clearly, the Battle of Midway 218 00:13:26,800 --> 00:13:30,800 Speaker 1: was decisively shaped by the handful of people in a 219 00:13:30,880 --> 00:13:34,760 Speaker 1: basement who managed to break enough of the Japanese code. 220 00:13:35,160 --> 00:13:40,800 Speaker 1: To what extent were those two achievements early examples of 221 00:13:40,840 --> 00:13:44,480 Speaker 1: the power of what in a sense is an invisible 222 00:13:44,480 --> 00:13:49,840 Speaker 1: battlefield to decisively reshape the physical battlefield? I think they're 223 00:13:49,880 --> 00:13:53,040 Speaker 1: absolutely fundamental to how we see information and deal with 224 00:13:53,080 --> 00:13:56,840 Speaker 1: information today. The Imitation Game goes into Alan Turing 225 00:13:56,880 --> 00:14:00,920 Speaker 1: and his work in breaking the Enigma code, essentially developing 226 00:14:00,960 --> 00:14:04,200 Speaker 1: this machine called Christopher, using a machine to 227 00:14:04,400 --> 00:14:08,199 Speaker 1: break another machine. It was fascinating. But he worked with 228 00:14:08,320 --> 00:14:14,360 Speaker 1: another scientist, and these two gentlemen together developed essentially all computer language, 229 00:14:14,600 --> 00:14:20,200 Speaker 1: computer processing, binary gateways, transistors, all these different things.
That 230 00:14:20,320 --> 00:14:23,840 Speaker 1: was the beginning of what we see as the computer age. 231 00:14:23,920 --> 00:14:27,360 Speaker 1: Everything from there has built to what we see today. 232 00:14:27,520 --> 00:14:31,000 Speaker 1: So it was really the fundamental basis of what we 233 00:14:31,040 --> 00:14:34,400 Speaker 1: are dealing with in the cyber meta reality, the metaverse, and 234 00:14:34,480 --> 00:14:37,680 Speaker 1: so forth today. I never thought about it, but there 235 00:14:37,760 --> 00:14:42,240 Speaker 1: must have been a constant cultural struggle between the physical 236 00:14:42,320 --> 00:14:45,640 Speaker 1: warfare guys, who want to know how many extra battleships 237 00:14:45,640 --> 00:14:50,200 Speaker 1: can you get me, and the intellectual warfare guys, who 238 00:14:50,200 --> 00:14:52,320 Speaker 1: want to know how quickly can I target and kill 239 00:14:52,360 --> 00:14:54,600 Speaker 1: the ship you just got. I mean, in a way 240 00:14:54,640 --> 00:14:58,320 Speaker 1: they're really very, very different approaches, and in the early 241 00:14:58,400 --> 00:15:03,840 Speaker 1: days it was the intelligence guys who were the distinct minority. Today, 242 00:15:03,880 --> 00:15:08,320 Speaker 1: I think, there is probably a sophisticated understanding that intelligence ultimately 243 00:15:08,360 --> 00:15:12,600 Speaker 1: destroys physical assets. Well, and during that time, too, 244 00:15:12,960 --> 00:15:15,920 Speaker 1: there was a confluence of different technologies back during 245 00:15:15,920 --> 00:15:19,200 Speaker 1: World War Two. Not only were there computer systems, Christopher and 246 00:15:19,240 --> 00:15:23,000 Speaker 1: the Enigma, etc., but radar had a huge part in 247 00:15:23,240 --> 00:15:26,400 Speaker 1: helping to win World War Two as well.
You know, 248 00:15:26,560 --> 00:15:30,120 Speaker 1: having radar towers set up in England that could detect 249 00:15:30,240 --> 00:15:32,640 Speaker 1: the incoming bombers and things that were coming, and being able 250 00:15:32,680 --> 00:15:36,600 Speaker 1: to process that information. You had to have systems that 251 00:15:36,640 --> 00:15:39,760 Speaker 1: could process that information and identify if it was a 252 00:15:39,840 --> 00:15:43,480 Speaker 1: large flock of birds versus bombers, or whatever it may be, 253 00:15:43,560 --> 00:15:45,920 Speaker 1: that were coming, and also ships, you know, that were 254 00:15:45,960 --> 00:15:49,400 Speaker 1: coming across the Channel, and all these other sensory capabilities. 255 00:15:50,360 --> 00:15:52,920 Speaker 1: You raised a fascinating point. I don't know that 256 00:15:52,960 --> 00:15:56,640 Speaker 1: I've ever read a study of the Battle of Britain 257 00:15:57,320 --> 00:16:01,880 Speaker 1: as an information warfare strategy. In fact, it's clear 258 00:16:01,920 --> 00:16:04,280 Speaker 1: that the British had a dominance and had invested in 259 00:16:04,360 --> 00:16:08,000 Speaker 1: the thirties, and ironically, Churchill is one of the people 260 00:16:08,000 --> 00:16:11,360 Speaker 1: who's on the committee, even when he's in a distinct minority 261 00:16:11,360 --> 00:16:14,400 Speaker 1: and people think he's nuts in the mid thirties because 262 00:16:14,440 --> 00:16:18,520 Speaker 1: he keeps warning about Hitler in apocalyptic terms. Nonetheless, they 263 00:16:18,560 --> 00:16:21,520 Speaker 1: knew he was so smart that, having invented the tank 264 00:16:21,600 --> 00:16:24,200 Speaker 1: while he was the head of the Admiralty, they turned 265 00:16:24,200 --> 00:16:27,200 Speaker 1: around and said, why don't you serve on this committee?
266 00:16:27,200 --> 00:16:30,880 Speaker 1: And there was the investment in the combination, as you 267 00:16:30,960 --> 00:16:34,280 Speaker 1: point out correctly, not just of radar, but of the 268 00:16:34,320 --> 00:16:38,440 Speaker 1: information processing capabilities, which back then were mostly manual, but 269 00:16:38,560 --> 00:16:42,920 Speaker 1: which enabled the British to consistently move their resources in 270 00:16:42,960 --> 00:16:45,720 Speaker 1: a way that the Germans just could never quite match 271 00:16:45,840 --> 00:16:48,960 Speaker 1: or figure out. Exactly. Yeah, and being able to actually 272 00:16:49,040 --> 00:16:51,600 Speaker 1: draw all those pieces together, that's so much of what 273 00:16:51,600 --> 00:16:56,280 Speaker 1: we're dealing with today with information warfare, joint all-domain operations, 274 00:16:56,400 --> 00:16:59,120 Speaker 1: joint all-domain command and control, etc. You'll hear 275 00:16:59,160 --> 00:17:02,840 Speaker 1: many military buzzwords and acronyms coming across today, but 276 00:17:02,880 --> 00:17:05,480 Speaker 1: they're very important for us to be able to stay 277 00:17:05,520 --> 00:17:10,320 Speaker 1: preeminent within battlespaces, information battlespaces and physical battlespaces. 278 00:17:10,760 --> 00:17:14,400 Speaker 1: Can you comment briefly, obviously only on a non-secure basis, 279 00:17:14,880 --> 00:17:17,919 Speaker 1: about the impact of Elon Musk's capacity to move his 280 00:17:18,040 --> 00:17:22,679 Speaker 1: Starlink communications over Ukraine, and the degree to which it 281 00:17:22,760 --> 00:17:27,480 Speaker 1: gave Ukraine a kind of information dominance over the battlefield? Yes, 282 00:17:27,600 --> 00:17:30,880 Speaker 1: that was actually a very interesting item of discussion at 283 00:17:30,880 --> 00:17:33,840 Speaker 1: the Cyber College and at Air University, to see this 284 00:17:34,040 --> 00:17:38,399 Speaker 1: movement of a civilian network.
Really, you know, this is 285 00:17:38,440 --> 00:17:42,040 Speaker 1: not a military company, a defense company. It's almost like the 286 00:17:42,080 --> 00:17:46,400 Speaker 1: old concept of letters of marque back during the Revolutionary War 287 00:17:46,520 --> 00:17:50,040 Speaker 1: that were given to privateers to go and take cargo 288 00:17:50,200 --> 00:17:53,200 Speaker 1: from British vessels back during that time frame. Now 289 00:17:53,240 --> 00:17:58,240 Speaker 1: today we have this private company owner using those assets 290 00:17:58,320 --> 00:18:02,040 Speaker 1: to assist in a military operation. But it provided the 291 00:18:02,160 --> 00:18:07,159 Speaker 1: communications and satellite links that were needed for Ukraine to 292 00:18:07,240 --> 00:18:09,679 Speaker 1: continue the operations they needed to do. A lot of 293 00:18:09,720 --> 00:18:12,160 Speaker 1: what you and I talked about when we met earlier 294 00:18:12,200 --> 00:18:15,240 Speaker 1: in the year was about cyber operations. One of the 295 00:18:15,240 --> 00:18:17,840 Speaker 1: first questions you asked me was, hey, it seems like 296 00:18:17,880 --> 00:18:19,960 Speaker 1: there hasn't been a lot of cyber operations right off 297 00:18:20,000 --> 00:18:23,080 Speaker 1: the bat. Well, in hindsight, there have actually been quite a 298 00:18:23,080 --> 00:18:28,840 Speaker 1: lot of cyber operations in conjunction with electromagnetic warfare and space, 299 00:18:29,240 --> 00:18:32,320 Speaker 1: with Viasat being taken down, and now with Elon Musk 300 00:18:32,440 --> 00:18:36,719 Speaker 1: using Starlink to help with GPS and with targeting and 301 00:18:36,840 --> 00:18:39,720 Speaker 1: with intelligence and with all these different capabilities that he's 302 00:18:39,760 --> 00:18:42,119 Speaker 1: brought to bear through that technology. So it's really a 303 00:18:42,160 --> 00:18:47,399 Speaker 1: game changer.
I think the game changer is the relatively inexpensive, 304 00:18:48,400 --> 00:18:53,200 Speaker 1: various drones that have a long loiter, twenty five minute 305 00:18:53,240 --> 00:18:56,639 Speaker 1: to an hour, capability and that are pre-programmed, so you 306 00:18:56,680 --> 00:19:00,520 Speaker 1: can't break their communications, because it's all internal to the vehicle, 307 00:19:01,119 --> 00:19:04,200 Speaker 1: and they're programmed to say, does this look like a tank? Oh, 308 00:19:04,240 --> 00:19:06,320 Speaker 1: it does, I think I'll go kill it. So you're 309 00:19:06,320 --> 00:19:09,879 Speaker 1: getting an exchange rate, I'm guessing, of fifty thousand dollars 310 00:19:09,960 --> 00:19:12,479 Speaker 1: or less for the drone and three million or more 311 00:19:12,600 --> 00:19:15,160 Speaker 1: for the tank. I mean, I certainly did not think 312 00:19:15,640 --> 00:19:19,640 Speaker 1: you'd have that big a mismatch that we've seen in Ukraine. 313 00:19:19,840 --> 00:19:23,159 Speaker 1: Truly asymmetric warfare is what we're seeing, and that can 314 00:19:23,200 --> 00:19:25,639 Speaker 1: go both ways.
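[Editor's note: The cost-exchange arithmetic behind that "mismatch" can be made explicit. The round figures below are the host's own rough estimates from the conversation (fifty thousand dollars per drone, three million per tank), not official procurement costs; the ten-percent hit rate is an invented illustration.]

```python
# Rough cost-exchange arithmetic using the round figures from the
# conversation (host's guesses, not official costs).
drone_cost = 50_000       # dollars per expendable drone
tank_cost = 3_000_000     # dollars per tank destroyed

# Best case: one drone, one tank.
ratio = tank_cost / drone_cost            # 60x cost advantage

# Hypothetical worst case: only 1 drone in 10 finds its target.
effective_ratio = tank_cost / (10 * drone_cost)   # still 6x

print(f"One-for-one exchange: {ratio:.0f}x")
print(f"At a 10% hit rate:   {effective_ratio:.0f}x")
```

Even under the pessimistic assumed hit rate, the attacker destroys several dollars of armor per dollar of drone, which is what makes the exchange "truly asymmetric."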
It's one of those things that I 315 00:19:25,680 --> 00:19:29,000 Speaker 1: think the US is really concerned about, is that asymmetric 316 00:19:29,119 --> 00:19:33,239 Speaker 1: nature of warfare, because yes, you speak truly, if you 317 00:19:33,320 --> 00:19:37,399 Speaker 1: can invest very little money in a preprogrammed drone that 318 00:19:37,440 --> 00:19:42,160 Speaker 1: can drive itself and has the autonomous capability, artificial intelligence 319 00:19:42,160 --> 00:19:45,480 Speaker 1: and machine learning capabilities built in to be able to 320 00:19:45,520 --> 00:19:49,600 Speaker 1: identify and put bombs on target without having to use, 321 00:19:50,320 --> 00:19:53,919 Speaker 1: for instance, a driver on the ground, someone who is 322 00:19:54,000 --> 00:19:56,880 Speaker 1: using the electromagnetic spectrum to drive that drone and could 323 00:19:57,160 --> 00:20:00,840 Speaker 1: be jammed, GPS jammed or whatever it may be. That's 324 00:20:00,840 --> 00:20:04,160 Speaker 1: a big advance, but it also brings issues with it, 325 00:20:04,160 --> 00:20:07,920 Speaker 1: of course, because say, for instance, if that drone misidentified 326 00:20:07,960 --> 00:20:11,080 Speaker 1: a school or an orphanage or a 327 00:20:11,160 --> 00:20:13,840 Speaker 1: hospital or whatever it may be as the tank, then you have 328 00:20:13,880 --> 00:20:17,879 Speaker 1: a Geneva Conventions problem on your hands and lots of 329 00:20:17,920 --> 00:20:20,280 Speaker 1: other things that can go wrong there. So there are 330 00:20:20,320 --> 00:20:23,000 Speaker 1: at least two sides of it, definitely.
In fact, back 331 00:20:23,040 --> 00:20:26,920 Speaker 1: in I think it was twenty fourteen, the European Parliament established a 332 00:20:27,040 --> 00:20:30,720 Speaker 1: new policy on who you blame if an unmanned 333 00:20:30,720 --> 00:20:35,399 Speaker 1: autonomous drone does something like that, accidentally kills civilians or 334 00:20:35,400 --> 00:20:37,280 Speaker 1: something of that sort. So there are a lot of 335 00:20:37,280 --> 00:20:40,119 Speaker 1: people thinking about this from a legal standpoint as well, 336 00:20:40,200 --> 00:20:57,480 Speaker 1: which brings other challenges to the table. Beyond the 337 00:20:57,520 --> 00:21:03,399 Speaker 1: immediate battlefield, there's huge potential for asymmetric warfare involving 338 00:21:03,520 --> 00:21:07,280 Speaker 1: cyber capabilities. I mean, I've thought about, for example, if 339 00:21:07,280 --> 00:21:10,200 Speaker 1: the Chinese or North Koreans wanted to take down all 340 00:21:10,200 --> 00:21:14,480 Speaker 1: of our ATMs and the average American couldn't get cash, 341 00:21:14,600 --> 00:21:17,280 Speaker 1: the speed of our response as a culture would be 342 00:21:17,359 --> 00:21:20,679 Speaker 1: like hours. And a precursor of that, it strikes me, 343 00:21:21,320 --> 00:21:24,800 Speaker 1: is the number of people who engage in some kind 344 00:21:24,840 --> 00:21:27,200 Speaker 1: of ransom approach where they threaten to take over your 345 00:21:27,240 --> 00:21:30,359 Speaker 1: pipeline unless you give them enough money. How big a 346 00:21:30,480 --> 00:21:34,240 Speaker 1: threat is that in real time, and how difficult is it 347 00:21:34,320 --> 00:21:36,800 Speaker 1: going to be for us to overmatch it?
It's 348 00:21:36,840 --> 00:21:39,840 Speaker 1: so funny that you mentioned that, because just this past weekend, 349 00:21:39,840 --> 00:21:41,120 Speaker 1: so I've been on leave for a couple of weeks 350 00:21:41,320 --> 00:21:43,840 Speaker 1: to see my son graduate from high school. But during this time, 351 00:21:43,880 --> 00:21:46,440 Speaker 1: I'm a big gardener. So I went to Lowe's, and 352 00:21:46,600 --> 00:21:48,960 Speaker 1: nothing bad, I love Lowe's. Went to Lowe's to buy 353 00:21:49,119 --> 00:21:52,400 Speaker 1: some plants and some things to expand my system. Well, 354 00:21:52,400 --> 00:21:53,720 Speaker 1: I get up to the register and they're like, we 355 00:21:53,760 --> 00:21:56,600 Speaker 1: can't take any cards. Our entire card network is down 356 00:21:57,280 --> 00:22:00,560 Speaker 1: throughout the entire US and Canada. We're only taking cash 357 00:22:00,640 --> 00:22:03,000 Speaker 1: right now. And so I'm like, oh my gosh, I 358 00:22:03,080 --> 00:22:05,200 Speaker 1: have, you know, a basketful. I don't have any cash 359 00:22:05,200 --> 00:22:06,719 Speaker 1: on me because I'm used to using my card 360 00:22:06,760 --> 00:22:09,600 Speaker 1: all the time. So I drive down the highway 361 00:22:09,680 --> 00:22:12,199 Speaker 1: to my bank to go to the ATM. Guess what, 362 00:22:12,400 --> 00:22:16,080 Speaker 1: the ATM's out. So I go to Walmart across the street. 363 00:22:16,480 --> 00:22:18,040 Speaker 1: I go to a register. I'm like, hey, are your 364 00:22:18,080 --> 00:22:20,560 Speaker 1: card things working? Yes, they are? Oh, thank god. So 365 00:22:20,600 --> 00:22:22,480 Speaker 1: I was able to get cash at Walmart, go back 366 00:22:22,520 --> 00:22:24,159 Speaker 1: across the street to Lowe's and get my stuff. But 367 00:22:24,359 --> 00:22:27,160 Speaker 1: that's the thing, like you said, I mean, it's just amazing.
368 00:22:27,320 --> 00:22:29,240 Speaker 1: And the first thing that goes across my mind is, is 369 00:22:29,280 --> 00:22:33,200 Speaker 1: this a cyber attack? Is Russia punishing us by not 370 00:22:33,160 --> 00:22:34,800 Speaker 1: allowing us to buy our stuff at 371 00:22:34,800 --> 00:22:36,679 Speaker 1: Lowe's? Or, you know, I mean, that kind of thing 372 00:22:36,680 --> 00:22:39,439 Speaker 1: goes through your mind. But yeah, that and things like 373 00:22:39,560 --> 00:22:42,160 Speaker 1: the ransomware that you were referring to before. I actually 374 00:22:42,240 --> 00:22:45,080 Speaker 1: have a case study in the book about ransomware and 375 00:22:45,119 --> 00:22:47,400 Speaker 1: it goes into detail about what happened at a hospital 376 00:22:47,440 --> 00:22:50,760 Speaker 1: system in North Alabama a couple of years ago and 377 00:22:50,880 --> 00:22:53,119 Speaker 1: the legal ramifications that came out of that, and they 378 00:22:53,160 --> 00:22:55,560 Speaker 1: eventually paid the ransom. I think they paid about six 379 00:22:55,720 --> 00:22:58,119 Speaker 1: hundred thousand dollars, which is low by today's standards. I 380 00:22:58,119 --> 00:23:01,560 Speaker 1: think the latest is sixty million dollars that was 381 00:23:01,600 --> 00:23:05,400 Speaker 1: actually paid by a company down in Florida just about 382 00:23:05,440 --> 00:23:07,800 Speaker 1: six months ago or so. And isn't there a big 383 00:23:07,840 --> 00:23:12,240 Speaker 1: bias against reporting it, because the management doesn't want customers 384 00:23:12,240 --> 00:23:16,280 Speaker 1: to think they're insecure? Yes, absolutely. So those are the things 385 00:23:16,320 --> 00:23:18,480 Speaker 1: that get suppressed a lot of times, because they don't 386 00:23:18,520 --> 00:23:22,720 Speaker 1: want their reputation to be soiled by these types of attacks.
387 00:23:22,920 --> 00:23:25,399 Speaker 1: And it's very, very difficult to track down, through the 388 00:23:25,480 --> 00:23:29,200 Speaker 1: dark web and through blockchain, etc., the people who actually 389 00:23:29,280 --> 00:23:33,120 Speaker 1: do these ransomware attacks. There have been some successes. After 390 00:23:33,160 --> 00:23:36,680 Speaker 1: Colonial Pipeline, for instance, the Department of Justice was able 391 00:23:36,680 --> 00:23:40,520 Speaker 1: to track down the Russian hackers who actually perpetrated that 392 00:23:40,560 --> 00:23:45,000 Speaker 1: act and take back the bitcoin from them and shut 393 00:23:45,000 --> 00:23:47,840 Speaker 1: down their dark web presence as well. So there have 394 00:23:47,880 --> 00:23:51,280 Speaker 1: been some successes, but there are still a lot of problems 395 00:23:51,280 --> 00:23:54,240 Speaker 1: that are coming across as a result of ransomware. And 396 00:23:54,280 --> 00:23:57,639 Speaker 1: we've seen it during the Ukraine conflict as well, ransomware 397 00:23:57,720 --> 00:24:00,520 Speaker 1: that has helped to drive certain things during that conflict. 398 00:24:00,960 --> 00:24:04,960 Speaker 1: Have you looked at what I presume was the North 399 00:24:05,040 --> 00:24:09,040 Speaker 1: Korean takedown of Sony? Yes, yes, that's been a big 400 00:24:09,080 --> 00:24:12,040 Speaker 1: case study for us in the cyber world. I mean, 401 00:24:12,080 --> 00:24:14,320 Speaker 1: it struck me. First of all, it proves you 402 00:24:14,359 --> 00:24:17,600 Speaker 1: should not make a movie about a short, fat dictator 403 00:24:17,960 --> 00:24:19,919 Speaker 1: and not expect him to get really mad at you. 404 00:24:19,960 --> 00:24:22,199 Speaker 1: I mean, Kim must have seen part of the 405 00:24:22,240 --> 00:24:25,200 Speaker 1: movie and gone, that's it, we're getting those guys, we'll 406 00:24:25,240 --> 00:24:28,040 Speaker 1: teach them.
Yeah, and that's the thing, you know, just 407 00:24:28,200 --> 00:24:32,199 Speaker 1: even images, things like that. For instance, in China, pictures 408 00:24:32,200 --> 00:24:35,360 Speaker 1: of Winnie the Pooh, you can't have those anywhere, because 409 00:24:35,440 --> 00:24:39,440 Speaker 1: you know, the way President Xi tends to walk, in 410 00:24:39,560 --> 00:24:42,280 Speaker 1: a specific gait that looks like Winnie the Pooh, and 411 00:24:42,359 --> 00:24:45,560 Speaker 1: so he doesn't want his image tarnished. And so you 412 00:24:45,640 --> 00:24:50,000 Speaker 1: have dictators like Kim Jong Un who don't want their 413 00:24:50,080 --> 00:24:52,640 Speaker 1: image tarnished. And if you do that, then they're going 414 00:24:52,680 --> 00:24:58,680 Speaker 1: to retaliate in ways that seem completely just insane, like 415 00:24:58,720 --> 00:25:04,040 Speaker 1: the Sony hack. But here's a desperately poor country, where 416 00:25:04,040 --> 00:25:08,400 Speaker 1: the average height has actually shrunk because of malnutrition, heavily 417 00:25:08,440 --> 00:25:12,240 Speaker 1: invested in nuclear weapons and missiles, and yet somehow they've 418 00:25:12,240 --> 00:25:16,360 Speaker 1: managed to educate people enough to have apparently a very 419 00:25:16,400 --> 00:25:20,639 Speaker 1: successful and sophisticated cyber capability. What impressed me about the 420 00:25:20,680 --> 00:25:26,119 Speaker 1: Sony takedown was these guys were really good and they're formidable.
Yes, 421 00:25:26,359 --> 00:25:29,320 Speaker 1: and it's not just the cyber capabilities, but their ability 422 00:25:29,359 --> 00:25:32,800 Speaker 1: to gather intelligence, because before you can do any kind 423 00:25:32,800 --> 00:25:35,000 Speaker 1: of cyber operation, it takes a lot of preparation and 424 00:25:35,080 --> 00:25:37,440 Speaker 1: quite a lot of information, and you have to get 425 00:25:37,480 --> 00:25:39,399 Speaker 1: footholds and all those kinds of things. You have to 426 00:25:39,440 --> 00:25:41,199 Speaker 1: be able to find a way into the system; 427 00:25:41,240 --> 00:25:43,440 Speaker 1: you have to find ways 428 00:25:43,480 --> 00:25:46,440 Speaker 1: in to be able to bring it down. Thirty years ago, 429 00:25:46,520 --> 00:25:49,080 Speaker 1: I think I would have said to you that the 430 00:25:49,200 --> 00:25:53,359 Speaker 1: National Security Agency was clearly the preeminent system in the world, 431 00:25:53,960 --> 00:25:58,760 Speaker 1: that we could do both intelligence gathering and cyber operations 432 00:25:58,840 --> 00:26:01,640 Speaker 1: better than anybody else. I'm not sure today that there's 433 00:26:01,680 --> 00:26:06,359 Speaker 1: any great margin between the US, China, and Russia, and maybe 434 00:26:06,400 --> 00:26:10,240 Speaker 1: even Israel and North and South Korea. I mean, there's 435 00:26:10,240 --> 00:26:13,760 Speaker 1: a growing capability around the planet. Does that sort of 436 00:26:13,800 --> 00:26:17,600 Speaker 1: fit your threat analysis? Yes, we're very aware, and the 437 00:26:17,600 --> 00:26:20,920 Speaker 1: research that I have done bears out that the gap 438 00:26:21,200 --> 00:26:24,440 Speaker 1: is narrowing. It definitely is narrowing, and some of that 439 00:26:24,640 --> 00:26:30,240 Speaker 1: is partly due to the fact that cyber operations, cyber 440 00:26:30,359 --> 00:26:33,720 Speaker 1: war, has a fairly low bar of entry.
I mean, 441 00:26:33,720 --> 00:26:37,440 Speaker 1: if you can develop a server farm and you have 442 00:26:38,040 --> 00:26:40,840 Speaker 1: enough people with enough skill, they can do things like 443 00:26:41,320 --> 00:26:43,960 Speaker 1: distributed denial of service attacks, so they can take down 444 00:26:44,080 --> 00:26:47,200 Speaker 1: entire networks, websites, whatever they may be, or ransomware like 445 00:26:47,200 --> 00:26:49,600 Speaker 1: we talked about just a moment ago. Newt, you're not 446 00:26:49,600 --> 00:26:51,960 Speaker 1: going to believe this, but I'm going to tell you. You, sir, 447 00:26:52,359 --> 00:26:55,760 Speaker 1: could go today, if you wished, get on your laptop 448 00:26:55,800 --> 00:26:58,199 Speaker 1: after we finish talking. You could go download the 449 00:26:58,200 --> 00:27:00,960 Speaker 1: Tor Browser, the onion router. You could go out on 450 00:27:01,000 --> 00:27:03,000 Speaker 1: the dark web and you could type into DuckDuckGo 451 00:27:03,359 --> 00:27:08,159 Speaker 1: ransomware as a service and find a ransomware as a 452 00:27:08,200 --> 00:27:12,119 Speaker 1: service capability or tool that you, yourself, sir, could download, 453 00:27:12,720 --> 00:27:16,080 Speaker 1: and then you could ransomware a small government, local government, 454 00:27:16,160 --> 00:27:19,080 Speaker 1: or hospital with a piece of five hundred or a 455 00:27:19,119 --> 00:27:21,399 Speaker 1: thousand dollar software that you bought off the dark web, 456 00:27:21,760 --> 00:27:25,360 Speaker 1: and make yourself hundreds of thousands of dollars. You could 457 00:27:25,359 --> 00:27:27,800 Speaker 1: do that. That's a low bar of entry. Wouldn't you 458 00:27:27,840 --> 00:27:30,880 Speaker 1: say that anybody could really do that? And I don't 459 00:27:30,880 --> 00:27:33,320 Speaker 1: mean to insult your intellect.
I'm just saying that, you know, 460 00:27:33,359 --> 00:27:37,560 Speaker 1: as a non-technical person, that you or anyone out 461 00:27:37,600 --> 00:27:39,560 Speaker 1: there who wanted to do that could do that. 462 00:27:39,600 --> 00:27:41,800 Speaker 1: Now I'm not saying they should. Now I feel almost 463 00:27:41,840 --> 00:27:44,320 Speaker 1: challenged to do it, except if I do it, I 464 00:27:44,400 --> 00:27:46,920 Speaker 1: have a hunch I'll end up in jail. You would, 465 00:27:47,800 --> 00:27:49,679 Speaker 1: I think any of us would. But it's one 466 00:27:49,720 --> 00:27:52,280 Speaker 1: of those things. I'm just trying to highlight the fact 467 00:27:52,320 --> 00:27:56,639 Speaker 1: that you have these tools that are available to pretty 468 00:27:56,720 --> 00:27:59,600 Speaker 1: much anyone with an Internet connection who can go out 469 00:27:59,600 --> 00:28:02,120 Speaker 1: there and get them. Or you can go and download 470 00:28:02,160 --> 00:28:04,320 Speaker 1: Kali Linux and go on YouTube and learn all kinds 471 00:28:04,359 --> 00:28:07,080 Speaker 1: of different hacking skills in a matter of hours 472 00:28:07,119 --> 00:28:08,760 Speaker 1: that could allow you to do things that you never 473 00:28:08,800 --> 00:28:11,520 Speaker 1: thought you could do. It's amazing, it really is. How 474 00:28:11,600 --> 00:28:15,080 Speaker 1: much would you guess rogue countries like North Korea or 475 00:28:16,040 --> 00:28:21,840 Speaker 1: organizations like al Qaeda actually fund themselves through various kinds 476 00:28:21,840 --> 00:28:25,760 Speaker 1: of ransomware? That's actually a proven fact. We've seen that 477 00:28:25,840 --> 00:28:29,600 Speaker 1: quite a lot, that you have ransomware and other sources 478 00:28:29,640 --> 00:28:33,320 Speaker 1: of just stealing money from banks, etc. North Korea has 479 00:28:33,359 --> 00:28:34,960 Speaker 1: done that quite a lot.
They've done that to South 480 00:28:35,040 --> 00:28:38,240 Speaker 1: Korea on several occasions, stolen money from their banks directly 481 00:28:38,720 --> 00:28:42,360 Speaker 1: as well as ransomware. Al Qaeda in the past has 482 00:28:42,920 --> 00:28:46,560 Speaker 1: used ransomware and stolen money. Of course, Joint Task Force 483 00:28:46,640 --> 00:28:49,479 Speaker 1: ARES was a great effort to bring down the cyber 484 00:28:49,520 --> 00:28:52,600 Speaker 1: caliphate of al Qaeda and was very effective, and they've 485 00:28:52,640 --> 00:28:56,600 Speaker 1: been able to really suppress al Qaeda's cyber presence for 486 00:28:56,640 --> 00:29:00,240 Speaker 1: the last decade almost now. But that's a constant fight, 487 00:29:00,320 --> 00:29:04,120 Speaker 1: it's a constant battle, because you constantly have organizations like 488 00:29:04,200 --> 00:29:08,000 Speaker 1: al Qaeda, or North Korea or Iran who are trying 489 00:29:08,080 --> 00:29:10,280 Speaker 1: to do that kind of thing. In the last chapter 490 00:29:10,320 --> 00:29:13,360 Speaker 1: of your book, which is called The New World, you 491 00:29:13,400 --> 00:29:16,080 Speaker 1: have a very interesting sentence I want you to expand on. 492 00:29:16,560 --> 00:29:19,560 Speaker 1: I'm quoting you now: One of the most interesting aspects 493 00:29:19,560 --> 00:29:23,720 Speaker 1: of human interaction in and with the cyber meta reality 494 00:29:24,480 --> 00:29:27,280 Speaker 1: is the continued shaping of the two by each other. 495 00:29:28,640 --> 00:29:32,080 Speaker 1: So in a sense, we're shaping the meta reality, but 496 00:29:32,160 --> 00:29:34,520 Speaker 1: the meta reality is shaping us. Expand on that 497 00:29:34,600 --> 00:29:37,360 Speaker 1: for me. It's almost a version of what Nietzsche said: 498 00:29:37,440 --> 00:29:39,800 Speaker 1: you stare into the void and the void stares back 499 00:29:39,840 --> 00:29:43,120 Speaker 1: into you.
Well, as human beings, we are doing that, 500 00:29:43,200 --> 00:29:46,440 Speaker 1: so to speak. We're staring into this new world of cyberspace, 501 00:29:46,960 --> 00:29:49,880 Speaker 1: of the cyber meta reality, and we're exploring it, but 502 00:29:50,000 --> 00:29:51,760 Speaker 1: it is changing us. I mean, I don't think that 503 00:29:51,800 --> 00:29:57,000 Speaker 1: anyone could deny that social media has drastically changed the 504 00:29:57,040 --> 00:30:00,000 Speaker 1: way that we interface with each other as human beings, 505 00:29:59,680 --> 00:30:02,880 Speaker 1: and as a result, we've needed to make some changes 506 00:30:02,920 --> 00:30:06,720 Speaker 1: to social media to help refine that and make it 507 00:30:06,840 --> 00:30:10,320 Speaker 1: maybe a better space for us as humans to interact within, 508 00:30:10,440 --> 00:30:12,720 Speaker 1: to try to get a little away from the 509 00:30:12,760 --> 00:30:15,400 Speaker 1: echo chamber so much, to be able to have more 510 00:30:15,480 --> 00:30:18,920 Speaker 1: open conversations with each other and be able to cross 511 00:30:19,000 --> 00:30:23,360 Speaker 1: the divide that we have socially and politically and intellectually, 512 00:30:23,440 --> 00:30:26,680 Speaker 1: whatever it may be. I think that as we continue 513 00:30:26,760 --> 00:30:29,720 Speaker 1: into this new world that we're going to, it's going 514 00:30:29,760 --> 00:30:33,240 Speaker 1: to necessitate more and more of that kind of interplay, 515 00:30:33,320 --> 00:30:37,280 Speaker 1: that seesaw effect of us making the changes to the 516 00:30:37,320 --> 00:30:41,720 Speaker 1: technology and the technology affecting us as human beings, and 517 00:30:41,840 --> 00:30:45,000 Speaker 1: hopefully finding some sort of happy medium therein.
As you 518 00:30:45,080 --> 00:30:49,320 Speaker 1: look out, given all your knowledge and experience, say fifteen 519 00:30:49,400 --> 00:30:53,760 Speaker 1: years from now, how do you think things will have evolved? Well, 520 00:30:54,080 --> 00:30:58,120 Speaker 1: I see them going in a direction where it seems 521 00:30:58,160 --> 00:31:01,200 Speaker 1: like a lot of attention is being given to virtual 522 00:31:01,240 --> 00:31:04,560 Speaker 1: reality and augmented reality. For one thing, many people 523 00:31:04,600 --> 00:31:09,840 Speaker 1: are beginning to find more immersive ways to insert themselves 524 00:31:09,960 --> 00:31:14,360 Speaker 1: into things like the metaverse itself, one part of the 525 00:31:14,360 --> 00:31:19,520 Speaker 1: cyber meta reality, but also as we continue to reach 526 00:31:19,560 --> 00:31:22,560 Speaker 1: out to other people in different places. I'd like to 527 00:31:22,600 --> 00:31:26,680 Speaker 1: see, and I hope to see, changes in education. For instance, 528 00:31:27,040 --> 00:31:30,280 Speaker 1: one big part that I think is a positive part 529 00:31:30,440 --> 00:31:33,400 Speaker 1: of Zuckerberg's metaverse and the metaverse as a whole, and 530 00:31:33,440 --> 00:31:36,960 Speaker 1: this is from Google and Microsoft and Apple and Amazon, 531 00:31:37,000 --> 00:31:39,600 Speaker 1: they're all trying to come together to try to provide 532 00:31:39,960 --> 00:31:45,040 Speaker 1: education through the metaverse to people who are maybe underprivileged, 533 00:31:45,160 --> 00:31:48,560 Speaker 1: who maybe don't have the same access to resources as 534 00:31:48,640 --> 00:31:52,440 Speaker 1: some folks do in larger cities; for instance, maybe they're 535 00:31:52,480 --> 00:31:55,320 Speaker 1: in rural areas, or maybe they're in underprivileged areas, even 536 00:31:55,360 --> 00:31:58,840 Speaker 1: in other countries.
So being able to actually use that 537 00:31:58,920 --> 00:32:05,280 Speaker 1: to drive social change, cultural change, idealistic change, etc., nationally 538 00:32:05,320 --> 00:32:08,160 Speaker 1: and internationally. I see those things coming and I think 539 00:32:08,160 --> 00:32:12,000 Speaker 1: they're good things. That's tremendous. Well, listen, I'm really grateful 540 00:32:12,040 --> 00:32:13,720 Speaker 1: you took this kind of time. As I said, 541 00:32:14,320 --> 00:32:16,960 Speaker 1: when Claire and I were down visiting with you, it 542 00:32:17,040 --> 00:32:19,719 Speaker 1: was really breathtaking, the work you're doing and how you're 543 00:32:19,720 --> 00:32:23,000 Speaker 1: approaching it. I wanted to let our listeners have a 544 00:32:23,080 --> 00:32:26,440 Speaker 1: taste of the kind of thinking that's going on at 545 00:32:26,440 --> 00:32:28,600 Speaker 1: the cutting edge. I also want them to know that 546 00:32:28,600 --> 00:32:30,600 Speaker 1: we're gonna have a link to your new book, The 547 00:32:30,720 --> 00:32:33,960 Speaker 1: Cyber Meta Reality Beyond the Metaverse, on our show page 548 00:32:34,240 --> 00:32:36,680 Speaker 1: at newtsworld dot com. And I look forward, as I 549 00:32:36,760 --> 00:32:39,040 Speaker 1: said down there, to getting caught up with you again, 550 00:32:39,040 --> 00:32:42,560 Speaker 1: because this stuff all keeps evolving and it takes smart 551 00:32:42,560 --> 00:32:45,600 Speaker 1: guys like you to keep track, to educate guys like me. 552 00:32:46,280 --> 00:32:48,400 Speaker 1: Thank you so much, Newt, I really appreciate it, and 553 00:32:48,800 --> 00:32:54,240 Speaker 1: great talking to you. Always enjoy our conversations. Thank you 554 00:32:54,280 --> 00:32:57,400 Speaker 1: to my guest, doctor Joshua Sipper.
You can get a 555 00:32:57,440 --> 00:33:01,080 Speaker 1: link to buy his new book, The Cyber Meta Reality Beyond 556 00:33:01,120 --> 00:33:05,360 Speaker 1: the Metaverse, on our show page at newtsworld dot com. Newt's 557 00:33:05,400 --> 00:33:09,760 Speaker 1: World is produced by Gingrich three sixty and iHeartMedia. Our 558 00:33:09,800 --> 00:33:14,440 Speaker 1: executive producer is Garnsey Slump, our producer is Rebecca Howe, 559 00:33:14,720 --> 00:33:18,760 Speaker 1: and our researcher is Rachel Peterson. The artwork for the 560 00:33:18,760 --> 00:33:22,560 Speaker 1: show was created by Steve Penley. Special thanks to the 561 00:33:22,600 --> 00:33:26,120 Speaker 1: team at Gingrich three sixty. If you've been enjoying Newt's World, 562 00:33:26,360 --> 00:33:29,320 Speaker 1: I hope you'll go to Apple Podcasts and both rate 563 00:33:29,400 --> 00:33:32,440 Speaker 1: us with five stars and give us a review so 564 00:33:32,560 --> 00:33:36,240 Speaker 1: others can learn what it's all about. Right now, listeners 565 00:33:36,240 --> 00:33:39,240 Speaker 1: of Newt's World can sign up for my three free 566 00:33:39,280 --> 00:33:44,080 Speaker 1: weekly columns at Gingrich three sixty dot com slash newsletter. 567 00:33:44,600 --> 00:33:47,040 Speaker 1: I'm Newt Gingrich. This is Newt's World.