Get in touch with technology with TechStuff from HowStuffWorks.com.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer at HowStuffWorks and I love all things tech, although this particular topic maybe a little less than usual, because it gets pretty hairy. In May two thousand eighteen, news broke that more than a dozen Google employees had handed in their resignations over the company's involvement in a program called Project Maven. So what the heck is Project Maven, and why did those employees leave? And why have an estimated four thousand Google employees put their names on petitions to end the company's involvement with the project? It's time to dive into a really serious current topic.

Project Maven is a large technology project overseen by the United States Defense Department with a specific focus on bringing artificial intelligence, or AI, applications into military functions and campaigns. The argument for Project Maven was that AI as a field has been advancing for years, with particularly impressive advancements made in the last couple of years alone, and yet the military has lagged behind. As Air Force Lieutenant General Jack Shanahan wrote in the Bulletin of the Atomic Scientists back in November, quote, the US military still performs many activities in a style that would be familiar to the military of World War Two, end quote. The argument was that something needed to change to bring these processes into the twenty-first century. To that effect, in April two thousand seventeen, Robert Work, who was then the Deputy Secretary of the Defense Department, released a memo calling for the establishment of a new team called Project Maven. Well, that's the nickname. The other name for the team was the Algorithmic Warfare Cross-Functional Team.
The memo's opening paragraph says this, and I quote: As numerous studies have made clear, the Department of Defense must integrate artificial intelligence and machine learning more effectively across operations to maintain advantages over increasingly capable adversaries and competitors. Although we have taken tentative steps to explore the potential of artificial intelligence, big data, and deep learning, I remain convinced that we need to do much more and move much faster across DOD, that's the Department of Defense, to take advantage of recent and future advances in these critical areas. End quote.

On its face, this sounds pretty reasonable, or at least understandable. After all, AI has the potential to do enormous good or harm depending upon its design and implementation. To not pursue AI in the realm of military applications seems like it would be a bad idea. Other nations and militaries are certainly exploring such options, and the landscape of warfare continues to change and become more complex. Artificial or augmented intelligence would be really handy in such a world. Being able to work with sophisticated programs to identify targets, gather intelligence, and form strategies could potentially win a conflict, save lives, or it might even allow for a non-violent method to resolve a situation, which, in my mind, tends to be the best of all options.

The memo's third paragraph details what this group's focus will be, and I quote yet again: The AWCFT's first task is to field technology to augment or automate processing, exploitation, and dissemination, PED, for tactical unmanned aerial systems, UAS, and mid-altitude full motion video, FMV, in support of the Defeat-ISIS campaign. This will help to reduce the human factors burden of FMV analysis, increase actionable intelligence, and enhance military decision-making. The AWCFT will, number one, organize a data-labeling effort and develop, acquire, and/or modify algorithms to accomplish key tasks.
Number two, identify required computational resources and identify a path to fielding that infrastructure. And number three, integrate algorithmic-based technology with programs of record in ninety-day sprints. Now, remember, a sprint, as when we talked about agile frameworks, is essentially a period of time in which a project takes place; you're expected to do numerous updates throughout that period and have something that is implementable by the end of the sprint. In other words, the earliest task for this new project was to work on programs that would allow unmanned aerial systems, which we commonly refer to as drones, to analyze full motion video with object detection and classification in an effort to identify members of ISIS, areas of interest, and equipment and weapons that bad actors might have at their disposal. The memo continues on to talk about the use of machine learning and automation in efforts to improve intelligence, surveillance, and reconnaissance missions. The memo established May first, two thousand seventeen as the date of the first meeting of the AWCFT, or Project Maven, which would give reports directly to the administrator of the program, who would be Robert Work himself. Lieutenant General Shanahan was named director of the project, which originally had only six members on it, and this marked an aggressive strategy to implement these technologies in the realm of the combat theater, and to do so very quickly.

Now, that's not to say that the Department of Defense was a stranger to cutting-edge technology. Far from it. In nineteen fifty-eight, President Dwight D. Eisenhower formed the Advanced Research Projects Agency, or ARPA, which would change names to the Defense Advanced Research Projects Agency, or DARPA, in nineteen seventy-two. While the agency has swapped names back and forth a couple of times since then, its mission has remained the same. The office provides funding for various projects aimed to expand science and technology, generally with some thought given towards the possible military benefits.
Those projects have led to amazing things, including the Internet and the development of autonomous cars. But while those projects have had, and will continue to have, a major impact in both military and non-military uses, the focus wasn't narrow enough for the purposes of Project Maven. This project marked a change, one in which the military would be reaching out to experts in the various disciplines that comprise artificial intelligence, with the goal of improving military capabilities that could be implemented as soon as possible in a real-world combat theater setting. Obviously, this raised many questions about the process and implementation of technologies. The political climate was, to put it mildly, delicate. Many companies weren't eager to get involved in projects in the wake of Donald Trump's election as president, and various information leaks about governments and corporations had left many more companies a little cautious about getting involved in defense contracts. Part of the strategy to deal with this reluctance was a tight focus on a specific implementation of AI, that being full motion video analysis. The AI in Project Maven isn't meant to take any sort of military or offensive action. Instead, it's meant to sift through data. More on that in just a second, but first let's take a quick break to thank our sponsor.

Not only was the committee looking for rapid development, the project also had the goal of streamlining all the bureaucratic red tape most parties had to endure when applying for funding from the government. The contracting procedures with the government had a well-earned reputation for being laborious and slow. This alone discouraged many from applying to be part of government projects. Why would you go through the long approval process when you could work in the private sector and make money the entire time? So Project Maven also focused on reducing the pains of contracting with government agencies.
I thought that was kind of a clever part of the project, not just the idea of finding a way to get these technologies rapidly developed and deployed, but how to streamline the process on the front end to encourage more participants in the project. And then there's the desire for rapid deployment. Getting technology out in the field can be a long process on top of everything else. In short, all the stages of funding, developing, and implementing technology are traditionally so slow when it comes to government contracts that by the time you get the tech out into the real world, it's already obsolete. Project Maven aimed to change all that. The goal was to develop tools in iterations and allow the user community, that's their quote, to test them as they became available. Now, in this case, the user community happens to be the military.

Creating an AI that can analyze full motion video and look for specific things within that video was the whole purpose of the project, and it called for an artificial neural network. I've talked about these before, but let me give a quick rundown of what this is right now. Roughly speaking, an artificial neural network is a system of one or more computers in which units of calculation, called neurons, connect to one another through weighted values, called synapses, in an effort to process information in a way that's similar to how our brains work. As these neurons perform operations on data, they send the data through the synapses, which affects the data itself. This data eventually emerges as output, though the design of the neural network determines how many neurons it must pass through before this happens. The goal is to create a system that can actually learn once trained to do something. So let's say you've got an artificial neural network and you want to train it to recognize a specific image, and we'll say, for the purposes of this example, that the image is a cat.
You start to feed the artificial neural network a series of images, some of them cats and some of them of other things, and you design the network so it identifies an object as a cat by process of elimination. And if it does identify something as a cat and it's not a cat, it generates an error. That error then back-propagates through the whole network, and the system quote-unquote learns that that particular image did not represent a cat, and if it encounters that image again, it won't mistakenly identify the image as a cat. The same is true if it fails to identify a cat that is present in an image. Doing this millions of times will refine the system as it learns what is and is not a cat.
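To make that concrete, here's a minimal sketch in Python of the kind of training loop I just described. To be clear, this is my own illustration for the show, not code from Maven or from Google; the network is tiny, and random feature vectors stand in for actual cat pictures.

```python
# A toy one-hidden-layer network learning a binary "cat / not cat" label.
# Each wrong answer produces an error signal that back-propagates through
# the synapse weights -- the quote-unquote learning described above.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "images": 200 feature vectors, label 1 for cat, 0 for not-cat.
X = rng.normal(size=(200, 16))
y = (X @ rng.normal(size=16) > 0).astype(float)  # made-up ground truth

W1 = rng.normal(scale=0.5, size=(16, 8))  # input -> hidden synapses
W2 = rng.normal(scale=0.5, size=(8, 1))   # hidden -> output synapse

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(2000):
    # Forward pass: data flows neuron to neuron through weighted synapses.
    h = sigmoid(X @ W1)           # hidden neuron activations
    p = sigmoid(h @ W2).ravel()   # the network's guess: P(image is a cat)

    # Backward pass: the misclassification error flows back through the
    # network and nudges every weight a little in the right direction.
    dz2 = ((p - y) / len(X))[:, None]  # error at the output neuron
    dh = dz2 @ W2.T                    # error reaching the hidden layer
    W2 -= lr * (h.T @ dz2)
    W1 -= lr * (X.T @ (dh * h * (1 - h)))

print(f"training accuracy after 2000 passes: {((p > 0.5) == y).mean():.0%}")
```

In a real image classifier the inputs would be pixels and the network would be much deeper, but the learn-from-error loop is the same idea.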
Google actually did something similar to this several years ago. Google's research and development lab created an artificial neural network consisting of sixteen thousand computer processors with a billion connections within the system. They fed the system ten million YouTube video thumbnails selected at random, then they gave the system a list of twenty thousand items. The system began to recognize pictures of cats using a deep learning algorithm. And remember, an algorithm is just a set of instructions, directions that you follow in order to get to a specific result. The big breakthrough here was that this system was able to recognize the image of a cat without first being taught what a cat actually was. It learned through training itself by looking at this large set of data.
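The remarkable part is that nobody labeled those thumbnails. Here's a minimal sketch of that unsupervised idea in Python, again my own toy illustration rather than anything from Google's lab: a hand-rolled k-means pass discovers two natural groupings in unlabeled points, the same no-teacher principle at a vastly smaller scale.

```python
# Unsupervised learning in miniature: no labels, just data, and the
# system finds recurring structure on its own. Google's network learned
# far richer features from video thumbnails, but the principle matches.
import numpy as np

rng = np.random.default_rng(2)

# Unlabeled data secretly drawn from two hidden groups.
data = np.vstack([rng.normal(0, 1, (300, 2)), rng.normal(5, 1, (300, 2))])

centers = data[rng.choice(len(data), size=2, replace=False)]
for _ in range(20):
    # Assign each point to its nearest center, then move the centers.
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    assign = dists.argmin(axis=1)
    centers = np.array([data[assign == k].mean(axis=0) for k in range(2)])

print("discovered cluster centers (no labels used):")
print(centers.round(2))
```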
In a similar fashion, Project Maven wished to develop an artificial neural network that could identify potential ISIS activity. Lieutenant General Shanahan wrote about how the military has countless hours of footage gathered by unmanned aerial systems, or drones. A quick word about these. The ones he named specifically included the ScanEagle, the MQ-1C Gray Eagle, and the MQ-9 Reaper.

The Boeing Insitu ScanEagle is a low-altitude drone that is more than five feet long, or about one and a half meters, with a wingspan a little more than ten feet, or about three point one meters, wide. It can travel at ninety-two miles per hour, or about one hundred forty-eight kilometers per hour, at top speed, and it can stay in flight for more than twenty-four hours at a time. It has high-resolution imaging sensors on it, including thermal imagery sensors. The MQ-1C Gray Eagle is built by General Atomics. It's a medium-altitude drone and is an upgrade to the famous Predator drone. It's much larger than the ScanEagle, at twenty-eight feet, or eight point five three meters, long, with a wingspan of fifty-six feet, or seventeen meters, wide. It can travel at a hundred and two miles per hour, or about one hundred sixty-four kilometers per hour, at top speed, and, unlike the ScanEagle, it can be armed with stuff like bombs and missiles. The MQ-9 Reaper, sometimes called the Predator B, is even larger. It's more than thirty-six feet long, or about eleven meters, it's got a wingspan of sixty-six feet, or about twenty meters, and it can travel at three hundred miles per hour, or about four hundred eighty-three kilometers per hour, at top speed, and it can also carry a varied armory of weapons. So those are drones that could be used for offensive measures. While the Lieutenant General was talking only about using an artificial neural network to analyze video captured by devices like these, naming a couple of drones that are weaponized likely raised many eyebrows. But let me stick with what he was pitching back in November, just for his argument.

Shanahan specifically talked about how these drones were gathering thousands of hours of video intelligence, but sifting through that intelligence required even more time and many human analysts. Even then, you could only tackle a fraction of what was being gathered. This meant that most of the time you were reacting to something that had happened in the field. For example, if an improvised explosive detonated, you might scour through video footage of the area leading up to the detonation in an attempt to identify the persons responsible for it and then track their movements. It still takes an incredible amount of time and work to follow these things. But if you could automate the system, you could free up analysts to look at what Shanahan was referring to as higher-value analysis work. An AI system that could comb through hours of data and automatically classify and label things, categorizing them either as mundane and unimportant or as something to pay attention to and flag for human analysis, could conceivably make a huge difference and speed things up.
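Here's a minimal sketch in Python of that triage idea. Everything in it, the frame format, the threshold, the stub scoring function, is an assumption of mine for illustration; in a real system the scorer would be a trained computer-vision model.

```python
# Triage loop: an automated scorer combs through frames, and only frames
# that clear a threshold get queued for a human analyst.
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp_s: float
    pixels: bytes  # stand-in for real image data

def score_frame(frame: Frame) -> float:
    """Hypothetical detector: P(frame contains something of interest).
    A placeholder formula, not a real model."""
    return (frame.pixels[0] % 100) / 100.0

def triage(frames, threshold=0.8):
    """Split footage into 'flag for human review' and 'mundane'."""
    flagged, mundane = [], []
    for frame in frames:
        (flagged if score_frame(frame) >= threshold else mundane).append(frame)
    return flagged, mundane

# Fake footage: 10,000 frames at 25 frames per second.
footage = [Frame(t * 0.04, bytes([t % 256])) for t in range(10_000)]
flagged, mundane = triage(footage)
print(f"{len(flagged)} of {len(footage)} frames queued for an analyst")
```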
It wouldn't speed anything up at the very start, though. Humans labeled more than one hundred fifty thousand images to train Maven on data sets, with the goal of increasing that up to one million images by the end of January. This would give Maven a start at being able to identify various objects at different distances, resolutions, angles, and more. Because, remember, computer vision is a tricky thing. Teaching a computer to recognize an image of something is tricky enough even if you're just sticking to one kind of lighting and one orientation. I always use the example of a coffee mug. Let's say you've got a bright red coffee mug, and the handle's pointed toward the left with respect to your view of the image, and you teach a computer that this is a coffee mug. Well, what happens if you have it under different lighting, so it doesn't look like it's bright red, and maybe you've turned it so that the handle's facing the other way, and maybe the angle is a little bit different, so you're looking kind of down into the cup? Will the computer still be able to recognize that as a coffee mug? You have to train it. And then let's say that you change the color of the coffee mug. It's the exact same shape, but now it's blue instead of red. Will the computer recognize it? Let's say that you change the shape of the coffee mug, so now it's a different style of coffee mug from the previous one. We humans can pick up on this very quickly. You teach a human a couple of things about coffee mugs, and then they kind of get an innate grasp of it. You can show them all sorts of different sizes, shapes, colors, and lighting conditions, and they're going to recognize that as a coffee mug. The same is not necessarily true with computers, so training one is a laborious process. Now, once it is trained, it can go through data far faster than a human could, but you still have to teach it.
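One common way to attack exactly this coffee-mug problem is data augmentation, where you mechanically generate variants of each labeled image so the network sees it mirrored, rotated, brighter, and darker. Here's a minimal sketch; a random array stands in for a real photo, and the specific transforms are just illustrative choices of mine.

```python
# From one labelled photo, mechanically generate many training variants
# so the network sees the same mug under conditions never photographed.
import numpy as np

rng = np.random.default_rng(1)
mug = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in photo

def augment(img):
    variants = []
    for k in range(4):                      # 0, 90, 180, 270 degree turns
        turned = np.rot90(img, k)
        variants.append(turned)
        variants.append(np.fliplr(turned))  # handle pointing the other way
    out = []
    for v in variants:
        for gain in (0.6, 1.0, 1.4):        # dimmer / as-is / brighter lighting
            out.append(np.clip(v.astype(np.float32) * gain, 0, 255).astype(np.uint8))
    return out

training_set = augment(mug)
print(f"1 labelled photo -> {len(training_set)} training examples")
```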
Shanahan viewed this project as proof that a small, nimble team approach to getting the right parties involved worked, and that this in turn would spawn a new era of high-tech projects aimed at incorporating AI into other military operations. He even expressed a little caution about this era. So what exactly about all of this prompted so many at Google to protest the company's involvement? Well, I'll explain that in just a second, but first let's take another quick break to thank our sponsor.

By the end of twenty seventeen, technology developed for Project Maven was in use at various sites in the Middle East, and that's an incredible turnaround. Less than a year had gone by since the April memo had launched the project, and already AI algorithms were being trained to look for specific types of data within full motion video footage. Not only was it being used in the Middle East, the military was already starting to use it in other parts of the world, like Africa. The military stressed that this tech was meant to augment personnel's abilities in gathering and sifting through information. It was meant to flag data so that a human analyst could look it over. Nothing was automatically happening through this system. There was no intent to do anything beyond analyzing information the military was already gathering.

But that's still a pretty alarming revelation for many people, and over at Google, a company that was handling a lot of this information and a lot of these algorithms and working with the military to develop them, worries were growing that it would not stop at data analysis. And this brings us to the petition that thousands of Google employees signed. The Google petition opens with a pretty clear message, quote: We believe that Google should not be in the business of war. Therefore, we ask that Project Maven be cancelled, and that Google draft, publicize, and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology. End quote.

The petition also expressed skepticism about Project Maven's stated purpose. In the third paragraph, it reads, quote: Recently, Googlers voiced concerns about Maven internally. Diane Greene responded, assuring them that the technology will not operate or fly drones and will not be used to launch weapons. While this eliminates a narrow set of direct applications, the technology is being built for the military, and once it's delivered, it could easily be used to assist in these tasks. End quote. The petition states that beyond the ethical and moral implications of the project, it will do damage to the company, including hurting its ability to attract talent to work for Google, saying, in effect, that if we have a reputation for giving the military technology that helps them kill people, it's going to be really hard to convince new developers to come work for our company.

A Google spokesperson responded to the petition with a letter. It says, quote: Maven is a well-publicized Department of Defense project, and Google is working on one part of it, specifically scoped to be for non-offensive purposes and using open-source object recognition software available to any Google Cloud customer. The models are based on unclassified data only.
The technology is used to flag images for human review and is intended to save lives and save people from having to do highly tedious work. End quote. So the response here is saying that this technology, these algorithms, are already available, and they're open source. Anyone could take that source code and develop applications based upon it. So really, Google could take the contract and make some money off of it, or not take the contract, but their work still ends up being used for that purpose, because it's open source and anyone can use it.
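And just to the spokesperson's point that the capability is openly available: here's roughly what it looks like to run a pretrained, publicly downloadable object detector today, using the open-source torchvision library. This is my stand-in example, not the software Google supplied.

```python
# Running a pretrained, openly available object detector -- a stand-in
# for the kind of open-source object recognition the statement refers to.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")  # downloads public weights
model.eval()

# A stand-in video frame: 3-channel RGB tensor with values in [0, 1].
# A real pipeline would decode frames from actual footage here.
frame = torch.rand(3, 480, 640)

with torch.no_grad():
    detections = model([frame])[0]  # dict with 'boxes', 'labels', 'scores'

# Keep only confident hits -- the same flag-it-for-a-human idea as before.
confident = detections["scores"] > 0.8
print(f"{int(confident.sum())} confident detections in this frame")
```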
According to this spokesperson, the AI in Project Maven is only assisting humans by pulling up data that may be of interest while ignoring everything else, so it's not making any decisions on its own. But even Shanahan states in his blog post that there are ethical questions when it comes to incorporating AI that must be addressed. He urges the development of, quote, technological and organizational safeguards to ensure that Washington's military use of AI is consistent with national values, end quote. He also points out that AI could be vulnerable to different failure modes that could be disastrous, and that these too must be taken into consideration. So he's saying that we have to implement it responsibly, and we have to be aware of its failure points so that we can make sure that it's not vulnerable to them, because otherwise we're going to rely far too heavily on a dangerous technology that could end up causing irreparable harm if misused.

To further complicate matters, other groups have also urged Google to step away from military projects. In April two thousand eighteen, one year after Maven launched as a project, the Tech Workers Coalition, which is a group representing employees of major tech companies like Google, IBM, Microsoft, Amazon, and others, launched their own petition stating that tech should not be in the business of war. This petition says that military contracts break user trust, the user in this case being either the general public or, in the case of big companies like IBM, other companies. And the International Committee for Robot Arms Control, that's arms meaning autonomous weapons, not, you know, the arms of a robot, issued an open letter to Google urging the company to stop pursuing military contracts. More than ninety academics signed this letter. The letter paints a very grim picture of a possible use for Maven, that of identifying targets based upon probabilities arrived at by analyzing long-range surveillance footage, ultimately resulting in signature strikes and pattern-of-life strikes, meaning that Google would at least be somewhat complicit in targeted killings. The letter goes on to state that while the express purpose of Maven is purely for analysis, such tools could be turned toward automated target recognition in the future. Those of you who are Terminator fans might think of this as another step toward the mythical Skynet system, which would ultimately turn against humans and attempt to wipe us out, sending Arnold Schwarzenegger back in time. The letter ends with three requests: first, that Google terminate its Project Maven contract with the Department of Defense; second, that the company commit not to develop military technologies, nor to allow the personal data it collects to be used for military operations; and finally, that it pledge to neither participate in nor support the development, manufacture, trade, or use of autonomous weapons, and to support efforts to ban autonomous weapons.

So far, Google hasn't shown any signs of following those suggestions. In fact, it's been reported that the company is actively bidding on another Department of Defense project called the Joint Enterprise Defense Infrastructure, or JEDI. This project's goal is to create a suite of cloud services for the Department of Defense.
It's largely intended to reduce complexity in military data storage systems, which at the moment are fragmented across multiple branches and departments. The JEDI contract will go to a single vendor, which means one company stands to make a lot of money from the project. That's hard to walk away from, but I suppose that if thousands of your employees are protesting the move, it might warrant some soul-searching. Current wisdom states that Amazon is probably in the lead for that JEDI contract anyway, so it might behoove Google to consider listening to its employees, in an effort not to alienate its workforce and not to damage the company's reputation to the point where no one will come and work for it, or very few, and not necessarily the best and brightest.

I'm very conflicted about this. On the one hand, using artificial intelligence to augment people's abilities to do what they've already started doing makes sense to me. Using it to help people cut down on endless hours of work? I get that. On the other hand, if you do think about this as being a stepping stone towards using AI to actually target and potentially kill people, that is terrifying. It is terrifying to think of the amount of power that gives people, the removal of barriers to committing such actions. Because if you think about military actions, if you are in command, you have to think about the possibility of the loss of human life on your side, right? You have to consider that. You have to ask, how many people are we going to lose if we commit to this action, and is our commitment to that action justifiable based on how many people we think we're going to lose? Well, if you start using automated systems, then that number creeps down closer to zero for your side.
Right, if you're using automated systems to carry out your attacks, then you have fewer casualties on your side, and that might mean that you're more willing to enter into those situations, and thus more people do die; it's just that they're people on the other side. So you might, as military personnel, consider that a good thing, but others, like myself, might consider it pretty horrifying. So I totally understand why there are Google employees walking out of the company, resigning because they are unable to reconcile their philosophical beliefs with the moves that the company has made in the last year. I also understand the need to incorporate AI into military operations. So this is a very complicated topic. It's not something where it's so easy to say this is wrong and we shouldn't do it, because I also agree with Shanahan, who says other organizations out there, countries, militaries, and others, are already working on this, and so we need to do it too, just so that we don't end up falling behind. There cannot be an AI gap. So it becomes another arms race, which is something a lot of people have likened AI to in the past.

Anyway, that wraps up this discussion about Project Maven. If you guys have any suggestions for future topics I can tackle here on TechStuff, please send them to me. My email address is techstuff at howstuffworks dot com, or you can drop me a line on Twitter or Facebook. The handle for both of those is techstuff hsw. Remember, we have an Instagram account, you can follow us there, and you can watch me record live at twitch.tv/techstuff. Come on over there on a Wednesday or Friday. You're gonna see me record these shows live, and you'll see as I slowly lose my sanity as more and more people walk in and out of the door that's directly adjacent to this recording studio, necessitating that I re-record the thing I just said a second earlier.
If you want to see Jonathan go insane, go to twitch.tv/techstuff on a Wednesday or Friday, and you too can be part of the fun. I hope to see you there, and I'll talk to you again really soon.

For more on this and thousands of other topics, visit HowStuffWorks.com.