1 00:00:04,440 --> 00:00:12,360 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,400 --> 00:00:15,920 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:15,960 --> 00:00:19,799 Speaker 1: I'm an executive producer with iHeartRadio, and how the tech 4 00:00:19,920 --> 00:00:23,160 Speaker 1: are you? It's time for a Tech Stuff classic episode. 5 00:00:23,239 --> 00:00:27,880 Speaker 1: This episode is called Augmenting Your Reality and originally published 6 00:00:27,920 --> 00:00:35,199 Speaker 1: November twenty third, twenty sixteen. Obviously, augmented reality is still 7 00:00:35,280 --> 00:00:39,120 Speaker 1: a developing technology. I would argue that no one has 8 00:00:39,200 --> 00:00:44,240 Speaker 1: really created the killer hardware for augmented reality as of yet. 9 00:00:44,479 --> 00:00:47,400 Speaker 1: By the time you listen to this, it's entirely possible 10 00:00:47,479 --> 00:00:52,760 Speaker 1: that Apple has already unveiled its mixed reality headset, and 11 00:00:53,000 --> 00:00:57,000 Speaker 1: maybe Apple will be the company that really succeeds where 12 00:00:57,040 --> 00:01:01,720 Speaker 1: others just haven't yet. I don't know. I'm recording this 13 00:01:02,160 --> 00:01:04,880 Speaker 1: way back in May twenty twenty three, before Apple has 14 00:01:04,920 --> 00:01:07,960 Speaker 1: held that event, so when it comes round, I guess 15 00:01:08,000 --> 00:01:12,440 Speaker 1: I'll find out. Anyway, that's going down a long tangent. 16 00:01:12,560 --> 00:01:16,640 Speaker 1: Let's listen to this classic episode, Augmenting Your Reality, which 17 00:01:16,680 --> 00:01:22,120 Speaker 1: published November twenty third, twenty sixteen. So I thought I 18 00:01:22,120 --> 00:01:26,160 Speaker 1: would do a deeper dive, a bigger explanation about what 19 00:01:26,200 --> 00:01:29,680 Speaker 1: augmented reality is.
What it's all about, how it works, 20 00:01:30,120 --> 00:01:35,000 Speaker 1: and sort of the applications we might put AR 21 00:01:35,160 --> 00:01:39,040 Speaker 1: toward, things that, you know... what's it good for? Tons 22 00:01:39,080 --> 00:01:42,560 Speaker 1: of stuff, as it turns out. So the first thing 23 00:01:42,600 --> 00:01:46,119 Speaker 1: we should do is probably define some terms, because if 24 00:01:46,120 --> 00:01:49,840 Speaker 1: you haven't really looked into augmented reality and you aren't 25 00:01:49,880 --> 00:01:53,760 Speaker 1: familiar with AR, you might just be lost. I'm going 26 00:01:53,840 --> 00:01:56,760 Speaker 1: to define it all for you right now, because that's 27 00:01:56,800 --> 00:01:59,880 Speaker 1: the kind of stand up guy I am. Technically speaking, 28 00:02:00,040 --> 00:02:05,120 Speaker 1: augmented reality is using digital information to enhance or augment 29 00:02:05,760 --> 00:02:10,880 Speaker 1: an experience in our physical, real world. So the way 30 00:02:10,919 --> 00:02:15,040 Speaker 1: we usually see this implemented involves some sort of display 31 00:02:15,760 --> 00:02:18,320 Speaker 1: that has an image of the real world on it 32 00:02:18,360 --> 00:02:21,400 Speaker 1: and it overlays digital information on top of that image. 33 00:02:21,440 --> 00:02:25,400 Speaker 1: So think of like a camera's viewfinder, like an LCD 34 00:02:25,840 --> 00:02:29,400 Speaker 1: screen on a camera, and it actually labels the buildings 35 00:02:29,440 --> 00:02:31,840 Speaker 1: that are in view when you're out on the street 36 00:02:31,880 --> 00:02:35,160 Speaker 1: and you hold the camera up, or a smartphone, or 37 00:02:35,200 --> 00:02:38,800 Speaker 1: even a wearable device like a head mounted display that 38 00:02:38,840 --> 00:02:41,720 Speaker 1: you can look through so you can see the real world.
39 00:02:41,760 --> 00:02:44,320 Speaker 1: You're not just staring at a screen, or if you 40 00:02:44,360 --> 00:02:46,880 Speaker 1: are staring at a screen, you're staring at a video 41 00:02:47,000 --> 00:02:50,240 Speaker 1: feed that is provided by an external camera mounted just 42 00:02:50,320 --> 00:02:51,920 Speaker 1: on the other side of the screen, so it's like 43 00:02:51,960 --> 00:02:55,240 Speaker 1: you're looking through a display in the first place, but 44 00:02:55,320 --> 00:02:58,920 Speaker 1: then on top of that view you have this digital information. 45 00:02:59,520 --> 00:03:02,400 Speaker 1: That's the most common implementation we talk about, but it's 46 00:03:02,480 --> 00:03:05,080 Speaker 1: not the only one. Augmented reality does not have to 47 00:03:05,200 --> 00:03:10,280 Speaker 1: only be, or even involve, visual information at all. You 48 00:03:10,320 --> 00:03:14,440 Speaker 1: could have audio-only augmented reality, for example. But the 49 00:03:14,480 --> 00:03:17,920 Speaker 1: whole idea is that it's something that is created digitally 50 00:03:18,320 --> 00:03:21,800 Speaker 1: to enhance your experience in the real world. Now we 51 00:03:21,880 --> 00:03:27,720 Speaker 1: can contrast this with the concept of virtual reality. Virtual reality, 52 00:03:27,760 --> 00:03:31,320 Speaker 1: of course, is a term where you create an experience 53 00:03:31,480 --> 00:03:36,760 Speaker 1: completely through computer-generated means: a computer is making all 54 00:03:36,800 --> 00:03:40,800 Speaker 1: the things you see and hear, and maybe even beyond 55 00:03:40,880 --> 00:03:44,840 Speaker 1: that if you have really sophisticated setups, so you might 56 00:03:44,880 --> 00:03:48,040 Speaker 1: have some haptic feedback.
Haptic refers to your sense of touch, 57 00:03:48,440 --> 00:03:50,600 Speaker 1: so if you have haptic feedback, that means you're getting 58 00:03:51,040 --> 00:03:54,840 Speaker 1: information feedback through your sense of touch. A common example 59 00:03:54,880 --> 00:03:58,800 Speaker 1: of this is a rumble pack inside a game controller, 60 00:03:59,000 --> 00:04:01,520 Speaker 1: where you fire a gun in a first person shooter 61 00:04:02,000 --> 00:04:05,960 Speaker 1: and your controller rumbles as a result, letting you know 62 00:04:06,000 --> 00:04:11,760 Speaker 1: that you are, in fact, unleashing virtual destruction upon all 63 00:04:11,840 --> 00:04:15,360 Speaker 1: you survey. Well, the same thing can be true with 64 00:04:15,720 --> 00:04:19,839 Speaker 1: a virtual reality setup. So virtual reality is all about 65 00:04:19,839 --> 00:04:25,599 Speaker 1: constructing an artificial reality, a simulated reality. Augmented reality is 66 00:04:25,600 --> 00:04:28,880 Speaker 1: all about enhancing the one that we are actually in. 67 00:04:29,320 --> 00:04:34,240 Speaker 1: And then there's also mixed reality. Mixed reality is kind 68 00:04:34,279 --> 00:04:37,159 Speaker 1: of sort of in between the two. You might have 69 00:04:37,240 --> 00:04:42,360 Speaker 1: some physical objects within a room that are also mapped 70 00:04:42,400 --> 00:04:46,400 Speaker 1: to a virtual environment, and then you use something like 71 00:04:46,400 --> 00:04:49,960 Speaker 1: a head mounted display to enter the virtual environment. That's 72 00:04:49,960 --> 00:04:53,240 Speaker 1: what it looks like you're inside.
But you have physical objects 73 00:04:53,279 --> 00:04:55,640 Speaker 1: in the room around you that are also mapped to 74 00:04:55,680 --> 00:04:58,120 Speaker 1: the virtual world, meaning you could pick up this physical 75 00:04:58,160 --> 00:05:00,919 Speaker 1: object and you would see that reflected within the 76 00:05:01,000 --> 00:05:03,719 Speaker 1: virtual world, where you might pick up a sword and 77 00:05:03,800 --> 00:05:07,120 Speaker 1: shield or move a chair or something along those lines. 78 00:05:08,000 --> 00:05:12,080 Speaker 1: So augmented reality, virtual reality, and mixed reality are all 79 00:05:12,200 --> 00:05:16,160 Speaker 1: kind of interrelated, so much so that their histories also 80 00:05:16,520 --> 00:05:20,920 Speaker 1: are very much interrelated. And there are some people who try 81 00:05:20,920 --> 00:05:25,760 Speaker 1: to collect these different technologies, these different approaches, and put 82 00:05:25,760 --> 00:05:28,719 Speaker 1: them under a common umbrella, and they tend to use 83 00:05:28,760 --> 00:05:33,520 Speaker 1: the phrase alternate reality, which is unfortunate because that's also 84 00:05:33,880 --> 00:05:37,080 Speaker 1: AR. But alternate reality is kind of the umbrella for virtual, 85 00:05:37,400 --> 00:05:41,839 Speaker 1: augmented, and mixed reality. Now that kind of gives you 86 00:05:42,000 --> 00:05:45,480 Speaker 1: the definition of those basic terms, and it is important 87 00:05:45,480 --> 00:05:49,039 Speaker 1: to understand them because they're becoming more and more important today.
88 00:05:49,480 --> 00:05:52,640 Speaker 1: You are already probably aware of a lot of VR 89 00:05:52,760 --> 00:05:55,680 Speaker 1: headsets that are out there on the market, as well 90 00:05:55,720 --> 00:06:00,560 Speaker 1: as VR... well, they're kind of like cases that 91 00:06:00,640 --> 00:06:04,520 Speaker 1: you slide your smartphone into, so your smartphone becomes the 92 00:06:05,080 --> 00:06:08,599 Speaker 1: actual display on a VR headset. The headset itself is 93 00:06:08,600 --> 00:06:13,880 Speaker 1: more or less just a head mounted case for your phone. 94 00:06:13,960 --> 00:06:15,360 Speaker 1: We've seen a lot of those come out over the 95 00:06:15,440 --> 00:06:18,320 Speaker 1: last few years. We've also seen a lot of AR 96 00:06:18,600 --> 00:06:22,600 Speaker 1: applications come out, typically for things like iPads and smartphones, 97 00:06:22,600 --> 00:06:26,599 Speaker 1: but we've also seen some hardware come out for 98 00:06:26,680 --> 00:06:31,039 Speaker 1: wearable devices that falls into the augmented reality category, stuff 99 00:06:31,080 --> 00:06:33,440 Speaker 1: like Google Glass, which I'll talk about more a little 100 00:06:33,480 --> 00:06:37,640 Speaker 1: bit later in this episode. For augmented reality to work, 101 00:06:37,880 --> 00:06:42,800 Speaker 1: to get this enhanced experience of reality around you, there 102 00:06:42,839 --> 00:06:46,000 Speaker 1: are a lot of technological components that have to come 103 00:06:46,040 --> 00:06:50,480 Speaker 1: together so that you actually do get an experience that 104 00:06:50,720 --> 00:06:55,320 Speaker 1: is meaningful. You have to have technology that quote unquote 105 00:06:55,400 --> 00:06:58,560 Speaker 1: knows where you are and what you are looking at 106 00:06:58,720 --> 00:07:01,120 Speaker 1: or what you are close to in order to get 107 00:07:01,120 --> 00:07:04,680 Speaker 1: that augmented experience.
It wouldn't do me any good if 108 00:07:04,680 --> 00:07:08,760 Speaker 1: I put on an augmented reality headset, for example, and 109 00:07:08,920 --> 00:07:13,560 Speaker 1: stared at, let's say, a famous painting, and instead of 110 00:07:13,600 --> 00:07:16,840 Speaker 1: getting information about the famous painting, I see an exploded 111 00:07:16,960 --> 00:07:20,240 Speaker 1: view of a car engine. That would make no sense. 112 00:07:20,680 --> 00:07:23,800 Speaker 1: So you have to build in technologies in order for 113 00:07:23,920 --> 00:07:27,320 Speaker 1: the AR to understand what it is you're trying to 114 00:07:27,360 --> 00:07:31,440 Speaker 1: do and to augment that experience, which meant that we 115 00:07:31,480 --> 00:07:35,200 Speaker 1: had to wait a pretty good long time for the 116 00:07:35,320 --> 00:07:41,560 Speaker 1: various technologies that we use to create this relationship to 117 00:07:41,680 --> 00:07:45,560 Speaker 1: mature to a point where it was possible. Sometimes we 118 00:07:45,600 --> 00:07:47,600 Speaker 1: had technologies that would allow us to do it, but 119 00:07:47,800 --> 00:07:53,360 Speaker 1: it required tethering headsets to very large computers, which meant 120 00:07:53,360 --> 00:07:57,080 Speaker 1: that you didn't have really any mobility, and it really 121 00:07:57,160 --> 00:08:02,920 Speaker 1: limited the usefulness of the actual application. In other cases, 122 00:08:03,680 --> 00:08:06,800 Speaker 1: you could say things like head tracking technology was 123 00:08:06,840 --> 00:08:10,160 Speaker 1: absolutely necessary for AR to develop the way it did. 124 00:08:11,000 --> 00:08:15,160 Speaker 1: GPS technology as well.
Remember, it wasn't that long ago 125 00:08:15,640 --> 00:08:20,880 Speaker 1: that we ordinary mere mortals didn't have access to really 126 00:08:21,040 --> 00:08:25,960 Speaker 1: accurate GPS information. For a very long time, that was 127 00:08:26,080 --> 00:08:31,080 Speaker 1: purposefully made less accurate. It was a matter of national defense. 128 00:08:31,800 --> 00:08:34,480 Speaker 1: It wasn't until the nineties that you started to see 129 00:08:34,520 --> 00:08:40,760 Speaker 1: GPS become more accurate for the basic consumer. Way back 130 00:08:40,760 --> 00:08:43,960 Speaker 1: in the day, you might get accuracy of 131 00:08:44,040 --> 00:08:46,559 Speaker 1: around one hundred meters, which is not great if you're 132 00:08:46,559 --> 00:08:49,200 Speaker 1: looking for the next place to make your turn. If 133 00:08:49,280 --> 00:08:53,240 Speaker 1: it's one hundred meters away, that's pretty far. But now 134 00:08:53,240 --> 00:08:55,880 Speaker 1: it's within a few feet, so it's much better. That 135 00:08:56,000 --> 00:08:58,360 Speaker 1: sort of stuff all had to come together in order 136 00:08:58,400 --> 00:09:04,640 Speaker 1: for augmented reality to become viable. I almost said a reality, 137 00:09:04,679 --> 00:09:08,439 Speaker 1: but that just starts to sound redundant. At any rate, 138 00:09:08,520 --> 00:09:13,280 Speaker 1: let's talk about some of these technologies we really need, 139 00:09:13,320 --> 00:09:19,160 Speaker 1: things like gyroscopes and accelerometers. These help devices understand their orientation, 140 00:09:20,240 --> 00:09:23,840 Speaker 1: where they are with respect to something else. 141 00:09:23,880 --> 00:09:25,760 Speaker 1: For a smartphone, it might be: is it in 142 00:09:25,840 --> 00:09:29,680 Speaker 1: landscape mode or portrait mode?
But for a head mounted display, 143 00:09:30,080 --> 00:09:34,240 Speaker 1: it would help give the unit the information it needs 144 00:09:34,280 --> 00:09:36,480 Speaker 1: to know which way you're looking, like are you looking 145 00:09:36,520 --> 00:09:38,440 Speaker 1: to the east or to the west, that kind of 146 00:09:38,440 --> 00:09:45,760 Speaker 1: thing. Also compasses, obviously very important, GPS sensors, and image recognition software, 147 00:09:45,800 --> 00:09:48,400 Speaker 1: which has become really important so that when you are 148 00:09:48,440 --> 00:09:53,480 Speaker 1: looking at something, the system can actually identify what that is. 149 00:09:53,800 --> 00:09:56,439 Speaker 1: In some cases you can get around this. You can 150 00:09:56,480 --> 00:09:59,480 Speaker 1: design an AR system where, let's say, you make a 151 00:09:59,520 --> 00:10:05,440 Speaker 1: movie poster, and the AR application has the movie poster 152 00:10:05,640 --> 00:10:09,319 Speaker 1: animate in some way if you hold up a smartphone 153 00:10:09,320 --> 00:10:13,360 Speaker 1: that's running the appropriate app. So I'm just gonna take 154 00:10:13,720 --> 00:10:16,080 Speaker 1: a movie from my past that does not have an 155 00:10:16,400 --> 00:10:19,800 Speaker 1: AR movie poster associated with it, but one that I 156 00:10:19,840 --> 00:10:22,080 Speaker 1: can talk about as if it were a good example, 157 00:10:22,440 --> 00:10:24,640 Speaker 1: and that has to be Big Trouble in Little China, 158 00:10:25,840 --> 00:10:28,760 Speaker 1: universally declared the best movie that has ever been made.
159 00:10:29,040 --> 00:10:31,520 Speaker 1: So you've got your Big Trouble in Little China poster 160 00:10:31,679 --> 00:10:34,920 Speaker 1: up on the wall, and you hold up your smartphone 161 00:10:34,960 --> 00:10:38,160 Speaker 1: and you activate your Big Trouble in Little China movie 162 00:10:38,200 --> 00:10:42,360 Speaker 1: marketing app, and the camera on your phone detects 163 00:10:42,360 --> 00:10:45,439 Speaker 1: the poster. You know the poster's there. Well, the app 164 00:10:45,480 --> 00:10:49,040 Speaker 1: and the poster together are able to construct the augmented 165 00:10:49,120 --> 00:10:52,840 Speaker 1: experience, because there have been elements put into the poster 166 00:10:53,200 --> 00:10:56,000 Speaker 1: that the app is looking for, and once the app 167 00:10:56,040 --> 00:11:00,280 Speaker 1: identifies that, like, it sees maybe eight different points on 168 00:11:00,320 --> 00:11:03,199 Speaker 1: the poster, and because of the orientation of those points, 169 00:11:03,600 --> 00:11:06,320 Speaker 1: it knows what angle it's at, what height it's at 170 00:11:06,600 --> 00:11:10,280 Speaker 1: in relation to the phone, and can give you on 171 00:11:10,320 --> 00:11:14,840 Speaker 1: your display the augmented reality experience. In this case, it's 172 00:11:14,960 --> 00:11:19,360 Speaker 1: obviously Jack Burton and the Pork Chop Express eating a sandwich, because, 173 00:11:19,400 --> 00:11:23,320 Speaker 1: as we know, the most riveting scene in the movie 174 00:11:23,559 --> 00:11:27,160 Speaker 1: unfolds in this way. So that would be kind of 175 00:11:27,200 --> 00:11:29,679 Speaker 1: an augmented reality experience where you didn't have to worry 176 00:11:29,720 --> 00:11:33,760 Speaker 1: about every possible application out in the real world.
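[Editor's note, not from the episode: the poster idea can be made concrete with a little geometry. Once the app finds known reference points in the camera frame, it can recover the poster's angle and apparent size from where those points landed. Below is a minimal 2D sketch in Python; all coordinates and names are made up for illustration, and a real AR app would use many points and a full 3D pose solver such as OpenCV's solvePnP.]

```python
import math

# Hypothetical poster marker: two reference points in poster
# coordinates, in centimeters. A real app would use more points
# (the episode imagines eight) and a full 3D pose solver.
POSTER_POINTS = [(0.0, 0.0), (60.0, 0.0)]  # e.g. bottom corners, 60 cm apart

def estimate_pose_2d(detected):
    """Given where those same two points landed in the camera image
    (pixel coordinates), recover the poster's in-plane tilt angle in
    degrees and its apparent scale in pixels per centimeter."""
    (x0, y0), (x1, y1) = detected
    (u0, v0), (u1, v1) = POSTER_POINTS
    img_dx, img_dy = x1 - x0, y1 - y0
    ref_dx, ref_dy = u1 - u0, v1 - v0
    # Angle between the image-space vector and the poster-space vector.
    angle = math.degrees(math.atan2(img_dy, img_dx) - math.atan2(ref_dy, ref_dx))
    # Ratio of apparent length to true length gives the scale.
    scale = math.hypot(img_dx, img_dy) / math.hypot(ref_dx, ref_dy)
    return angle, scale

# Simulate seeing the poster tilted 30 degrees at 5 pixels per cm:
tilt = math.radians(30)
detected = [(100.0, 200.0),
            (100.0 + 300.0 * math.cos(tilt), 200.0 + 300.0 * math.sin(tilt))]
angle, scale = estimate_pose_2d(detected)
print(angle, scale)  # roughly 30.0 and 5.0
```

The same idea, done with more points in three dimensions, is how the app "knows what angle it's at, what height it's at in relation to the phone."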
You 177 00:11:33,840 --> 00:11:36,880 Speaker 1: made it for something very specific, which means in your 178 00:11:36,880 --> 00:11:41,120 Speaker 1: software you can have the camera look quote unquote for 179 00:11:41,440 --> 00:11:46,479 Speaker 1: these particular points of reference and thus create the augmented 180 00:11:46,559 --> 00:11:49,280 Speaker 1: experience in that way. If you want to take that 181 00:11:49,400 --> 00:11:51,360 Speaker 1: and move it to the real world, where you can 182 00:11:51,720 --> 00:11:55,240 Speaker 1: see augmented information about just the world around you, it 183 00:11:55,320 --> 00:11:58,840 Speaker 1: becomes way more complicated. You have to have very sophisticated 184 00:11:58,920 --> 00:12:03,840 Speaker 1: image recognition software so that the camera picks up the images, 185 00:12:04,080 --> 00:12:07,600 Speaker 1: the software processes the information, identifies what those images are, 186 00:12:07,960 --> 00:12:11,839 Speaker 1: and gives you the relevant information. So working with all 187 00:12:11,920 --> 00:12:15,440 Speaker 1: the sensors, augmented reality can make this a possibility. So 188 00:12:15,520 --> 00:12:19,160 Speaker 1: another example: let's say you're out on the street in Atlanta, 189 00:12:19,280 --> 00:12:22,760 Speaker 1: you're here in my hometown of Atlanta, Georgia, and you're looking 190 00:12:22,800 --> 00:12:24,520 Speaker 1: at a building and you wonder what it is, and 191 00:12:24,520 --> 00:12:26,480 Speaker 1: you hold up your phone and you've got your little 192 00:12:26,920 --> 00:12:29,960 Speaker 1: map app that allows you to look at a real 193 00:12:30,040 --> 00:12:33,360 Speaker 1: world setting and tells you information about it, and it 194 00:12:33,400 --> 00:12:35,880 Speaker 1: tells you it's the Georgia Aquarium.
Well, first of all, 195 00:12:35,880 --> 00:12:37,839 Speaker 1: you would probably know that already, because the signage there 196 00:12:37,880 --> 00:12:41,480 Speaker 1: is actually pretty good. But the point being that this 197 00:12:41,520 --> 00:12:45,160 Speaker 1: would be something that would tap into the GPS coordinates 198 00:12:45,200 --> 00:12:47,280 Speaker 1: on your phone, so it would know where your location 199 00:12:47,559 --> 00:12:50,120 Speaker 1: was and help narrow that down. The compass would tell 200 00:12:50,120 --> 00:12:55,240 Speaker 1: it what direction you are facing. With the camera, also, 201 00:12:55,480 --> 00:12:59,479 Speaker 1: you have some image recognition going on there. The accelerometer 202 00:12:59,559 --> 00:13:02,720 Speaker 1: tells it the orientation of the phone itself. All of this 203 00:13:02,840 --> 00:13:06,480 Speaker 1: data together would give the software the information needed for 204 00:13:06,559 --> 00:13:11,720 Speaker 1: it to display the label Georgia Aquarium on your phone. 205 00:13:11,760 --> 00:13:16,280 Speaker 1: And it all happens in an instant. That's pretty amazing. Typically, 206 00:13:16,280 --> 00:13:20,359 Speaker 1: you also have to have some other method to communicate 207 00:13:20,440 --> 00:13:24,439 Speaker 1: with a larger infrastructure, because we don't have the capability 208 00:13:24,520 --> 00:13:28,720 Speaker 1: of building an enormously powerful computer that has all this 209 00:13:28,800 --> 00:13:32,120 Speaker 1: real world information programmed into it and making it a 210 00:13:32,280 --> 00:13:36,600 Speaker 1: handheld or wearable device. So usually you have to pair 211 00:13:36,720 --> 00:13:40,480 Speaker 1: these devices with some other, larger infrastructure. Sometimes it's a 212 00:13:40,520 --> 00:13:44,480 Speaker 1: double handshake.
For example, with Google Glass, you would use 213 00:13:44,480 --> 00:13:47,920 Speaker 1: Bluetooth to connect Google Glass to a smartphone. Then the 214 00:13:47,960 --> 00:13:51,800 Speaker 1: smartphone would have the connection to the larger internet through 215 00:13:51,880 --> 00:14:00,800 Speaker 1: your smartphone's cell service provider. So while you're experiencing the 216 00:14:00,840 --> 00:14:04,400 Speaker 1: augmented reality through the Google Glass, it's actually communicating through 217 00:14:04,400 --> 00:14:06,800 Speaker 1: your phone to the infrastructure to get the data it 218 00:14:06,880 --> 00:14:10,280 Speaker 1: needs to show you the information it's showing you. Very 219 00:14:10,280 --> 00:14:12,800 Speaker 1: important elements. And all of these components, like I said, 220 00:14:12,920 --> 00:14:16,720 Speaker 1: came together more or less around the same time. Most 221 00:14:16,760 --> 00:14:20,160 Speaker 1: of them were being developed independently of each other, and 222 00:14:20,160 --> 00:14:23,040 Speaker 1: it's just that now we're seeing them all converge. That's 223 00:14:23,080 --> 00:14:26,960 Speaker 1: an old favorite word here at Tech Stuff: converge together 224 00:14:27,000 --> 00:14:29,920 Speaker 1: to create the augmented reality experience and make it possible. 225 00:14:30,760 --> 00:14:32,880 Speaker 1: So how did we get here? How did these different 226 00:14:32,880 --> 00:14:37,080 Speaker 1: elements develop? Well, there are a whole bunch of technology 227 00:14:37,120 --> 00:14:40,560 Speaker 1: pioneers who really created the foundation for augmented reality as 228 00:14:40,560 --> 00:14:43,560 Speaker 1: well as virtual reality and mixed reality. But one that 229 00:14:43,640 --> 00:14:46,120 Speaker 1: I think we really need to concentrate on first 230 00:14:46,240 --> 00:14:51,120 Speaker 1: is Ivan Sutherland.
Now, Sutherland was born in Hastings, Nebraska, 231 00:14:51,160 --> 00:14:54,200 Speaker 1: in nineteen thirty eight, and as a kid, he was 232 00:14:54,280 --> 00:15:00,520 Speaker 1: fascinated with mathematics, particularly geometry, and also with engineering. He 233 00:15:00,560 --> 00:15:03,920 Speaker 1: began to study and experiment with computers while he was 234 00:15:03,960 --> 00:15:06,560 Speaker 1: in school, and this was at a time when personal 235 00:15:06,560 --> 00:15:09,240 Speaker 1: computers weren't a thing. There were no personal computers at 236 00:15:09,240 --> 00:15:11,880 Speaker 1: this point. Computers were actually pretty rare, and they were huge, 237 00:15:12,360 --> 00:15:16,440 Speaker 1: and in fact, they often would rely upon physical media 238 00:15:16,520 --> 00:15:21,200 Speaker 1: formats like punch cards or paper tape to read a program. 239 00:15:22,000 --> 00:15:25,840 Speaker 1: So you didn't even have a disc, or certainly 240 00:15:25,840 --> 00:15:29,040 Speaker 1: nothing like a USB thumb drive or anything like that. 241 00:15:29,120 --> 00:15:32,360 Speaker 1: You actually had to put physical media into the machine 242 00:15:32,400 --> 00:15:36,080 Speaker 1: for it to read and then execute whatever program you 243 00:15:36,120 --> 00:15:40,480 Speaker 1: had designed for that device. He went to college at 244 00:15:40,520 --> 00:15:44,000 Speaker 1: what is now Carnegie Mellon University on a full scholarship. 245 00:15:44,200 --> 00:15:46,520 Speaker 1: He graduated with a Bachelor of Science degree. He would 246 00:15:46,560 --> 00:15:48,680 Speaker 1: then go on to earn a master's degree at Caltech 247 00:15:48,760 --> 00:15:53,360 Speaker 1: and a PhD in electrical engineering from MIT. And 248 00:15:53,560 --> 00:15:58,400 Speaker 1: actually, his doctoral thesis supervisor was Claude Shannon.
And we 249 00:15:58,520 --> 00:16:01,520 Speaker 1: talked about Claude Shannon back in the twenty fourteen episode 250 00:16:01,520 --> 00:16:05,720 Speaker 1: Who Is Claude Shannon. We recorded that not too long 251 00:16:05,800 --> 00:16:09,080 Speaker 1: after Shannon's passing. So if you want to hear a 252 00:16:09,120 --> 00:16:13,120 Speaker 1: really interesting story about a pioneer in computer science, you 253 00:16:13,120 --> 00:16:17,200 Speaker 1: should go check out that twenty fourteen episode. Back to Sutherland. 254 00:16:17,560 --> 00:16:22,040 Speaker 1: For his thesis, he created something called Sketchpad, and that 255 00:16:22,200 --> 00:16:25,880 Speaker 1: was really, by most accounts, the first computer graphical user 256 00:16:25,920 --> 00:16:29,920 Speaker 1: interface, or GUI. A graphical user interface means that 257 00:16:30,160 --> 00:16:36,440 Speaker 1: you interact with the computer through graphics representing various commands 258 00:16:36,520 --> 00:16:39,640 Speaker 1: on the computer. Windows and the Mac operating system are 259 00:16:39,680 --> 00:16:43,520 Speaker 1: both examples of graphical user interfaces, as is the interface 260 00:16:43,560 --> 00:16:46,400 Speaker 1: on your smartphone. If you have a smartphone where you 261 00:16:46,680 --> 00:16:51,520 Speaker 1: choose applications on a screen, that's a graphical user interface. Well, 262 00:16:51,560 --> 00:16:54,400 Speaker 1: Sutherland created what is largely considered to be the first 263 00:16:54,400 --> 00:16:58,080 Speaker 1: one of those. After college, he entered military service and 264 00:16:58,200 --> 00:17:01,920 Speaker 1: he was assigned to the National Security Agency. We have 265 00:17:02,040 --> 00:17:06,479 Speaker 1: great friends there, I assume. I'm sure they're listening, because 266 00:17:06,480 --> 00:17:09,960 Speaker 1: they're listening to everything, at any rate.
He entered the 267 00:17:10,080 --> 00:17:13,320 Speaker 1: NSA as an electrical engineer, and in nineteen sixty four 268 00:17:13,440 --> 00:17:17,200 Speaker 1: he replaced J. C. R. Licklider as the head of DARPA's 269 00:17:17,240 --> 00:17:23,360 Speaker 1: Information Processing Techniques Office, or IPTO. Also, back 270 00:17:23,400 --> 00:17:28,560 Speaker 1: then DARPA wasn't DARPA; it was just ARPA. So this 271 00:17:28,600 --> 00:17:30,320 Speaker 1: is the same group, by the way, that would end 272 00:17:30,440 --> 00:17:33,080 Speaker 1: up doing a lot of work that would form the 273 00:17:33,280 --> 00:17:36,199 Speaker 1: ARPANET a few years later, and the ARPANET was the 274 00:17:36,280 --> 00:17:40,200 Speaker 1: predecessor to the Internet in some ways. At least, the 275 00:17:40,320 --> 00:17:44,320 Speaker 1: ARPANET was what ended up being the building blocks for 276 00:17:44,720 --> 00:17:48,479 Speaker 1: the infrastructure that would become the Internet. Now, all of 277 00:17:48,480 --> 00:17:51,520 Speaker 1: that work happened after Sutherland had already departed the organization. 278 00:17:53,080 --> 00:17:57,320 Speaker 1: His work became a fundamental component of both virtual and 279 00:17:57,359 --> 00:18:00,760 Speaker 1: augmented reality. As I mentioned earlier, in nineteen sixty five, 280 00:18:01,160 --> 00:18:03,560 Speaker 1: he wrote a piece, an essay. It's very short, it's 281 00:18:03,640 --> 00:18:05,879 Speaker 1: a very easy read, and you can find it online. The 282 00:18:05,880 --> 00:18:09,199 Speaker 1: title of the essay is The Ultimate Display. And if 283 00:18:09,240 --> 00:18:13,280 Speaker 1: you ever do any research on virtual reality or augmented reality, 284 00:18:13,520 --> 00:18:16,440 Speaker 1: this essay is going to pop up in your research, 285 00:18:16,800 --> 00:18:19,040 Speaker 1: so go ahead and read it.
It's like two pages long, 286 00:18:19,119 --> 00:18:22,840 Speaker 1: so it goes very quickly. In that essay he talked 287 00:18:22,840 --> 00:18:29,080 Speaker 1: about several ideas, including the idealized display, the ultimate display, 288 00:18:29,240 --> 00:18:34,919 Speaker 1: something that would be the furthest you could go with 289 00:18:35,040 --> 00:18:39,000 Speaker 1: display technology. Now, keep in mind his time. 290 00:18:39,040 --> 00:18:41,320 Speaker 1: He's still alive, by the way, but at this time, 291 00:18:41,359 --> 00:18:45,879 Speaker 1: in the nineteen sixties, you know, things were just 292 00:18:46,040 --> 00:18:50,840 Speaker 1: restricted to monitors. You might have a light pen, but 293 00:18:50,960 --> 00:18:54,280 Speaker 1: usually you just used a keyboard. It was pretty 294 00:18:54,880 --> 00:18:57,919 Speaker 1: bare bones. But he said, let's push this as far 295 00:18:58,000 --> 00:19:02,320 Speaker 1: as we can imagine it. As his example, he thought of 296 00:19:02,359 --> 00:19:06,400 Speaker 1: a room that would be completely controlled by computers. Everything 297 00:19:06,400 --> 00:19:10,000 Speaker 1: you would experience within that room would be generated by 298 00:19:10,000 --> 00:19:14,919 Speaker 1: a computer. Everything you see, hear, smell, taste, and touch, 299 00:19:15,800 --> 00:19:18,320 Speaker 1: all of it generated by computers. The computer would even 300 00:19:18,400 --> 00:19:22,640 Speaker 1: be able to form physical objects out of pure matter itself. Now, 301 00:19:22,680 --> 00:19:26,320 Speaker 1: he wasn't suggesting that this would ever be a device 302 00:19:26,359 --> 00:19:28,399 Speaker 1: that we would actually be able to build. He was 303 00:19:28,440 --> 00:19:33,560 Speaker 1: just saying, what is the ultimate incarnation of display technology?
304 00:19:34,359 --> 00:19:37,159 Speaker 1: And if you read it, you realize, oh, this is 305 00:19:37,200 --> 00:19:39,879 Speaker 1: where the Star Trek: The Next Generation writers got their idea 306 00:19:39,920 --> 00:19:43,360 Speaker 1: for the Holodeck. But unlike Star Trek: The Next Generation, 307 00:19:43,640 --> 00:19:46,840 Speaker 1: the Ultimate Display would not go on the fritz every 308 00:19:46,840 --> 00:19:49,840 Speaker 1: other episode and try to kill the crew. It was 309 00:19:49,920 --> 00:19:54,720 Speaker 1: better than that. The Ultimate Display was sort of 310 00:19:55,840 --> 00:20:01,760 Speaker 1: foundational. Like, philosophically, it was foundational for virtual reality and augmented reality, 311 00:20:01,840 --> 00:20:05,520 Speaker 1: this idea of a very immersive experience where you, as 312 00:20:05,560 --> 00:20:12,400 Speaker 1: a user, are surrounded somehow by this computer generated experience. 313 00:20:12,880 --> 00:20:15,720 Speaker 1: And that's true both with augmented reality and virtual reality. 314 00:20:15,800 --> 00:20:20,600 Speaker 1: In augmented reality, the real world is still there, but 315 00:20:20,680 --> 00:20:24,960 Speaker 1: you get this enhanced experience that is completely computer generated. 316 00:20:26,200 --> 00:20:31,200 Speaker 1: So in nineteen sixty eight, Sutherland and a student named 317 00:20:31,280 --> 00:20:36,679 Speaker 1: Danny Cohen would create a VR/AR head mounted display, 318 00:20:36,960 --> 00:20:41,560 Speaker 1: or HMD, and they nicknamed it the Sword of Damocles. 319 00:20:42,800 --> 00:20:45,439 Speaker 1: Why? Because you had to suspend it from the ceiling. 320 00:20:45,480 --> 00:20:48,359 Speaker 1: It was too heavy to wear on your head. You 321 00:20:48,400 --> 00:20:53,520 Speaker 1: needed it to be nice and sturdy.
It included transparent lenses, 322 00:20:53,560 --> 00:20:57,400 Speaker 1: which meant you could overlay computer information on the lenses themselves, 323 00:20:57,640 --> 00:21:00,040 Speaker 1: and thus you could look through the lenses at the 324 00:21:00,080 --> 00:21:04,760 Speaker 1: real world and have these wireframe graphics on top of 325 00:21:04,840 --> 00:21:07,199 Speaker 1: what you were looking at. And it also had a 326 00:21:07,240 --> 00:21:11,240 Speaker 1: magnetic tracking system, meaning that it had sensors that could 327 00:21:11,240 --> 00:21:14,320 Speaker 1: detect magnetic fields, and as you turned your head or 328 00:21:14,480 --> 00:21:18,399 Speaker 1: you changed the inclination of your head, it would change 329 00:21:18,440 --> 00:21:21,600 Speaker 1: the magnetic field, and this would be relayed as a 330 00:21:21,680 --> 00:21:26,720 Speaker 1: command to the visual center, the actual lenses themselves, so 331 00:21:26,760 --> 00:21:30,040 Speaker 1: that the change would be reflected in what 332 00:21:30,080 --> 00:21:33,240 Speaker 1: you saw. So if you have a virtual environment and 333 00:21:33,280 --> 00:21:35,359 Speaker 1: you turn your head to the left, you want the 334 00:21:35,480 --> 00:21:38,040 Speaker 1: view within the virtual environment to go to the left too. 335 00:21:38,520 --> 00:21:41,480 Speaker 1: But without head tracking technology that's impossible. So this was 336 00:21:41,520 --> 00:21:44,400 Speaker 1: a very early example of head tracking technology, and again 337 00:21:44,440 --> 00:21:49,720 Speaker 1: it used magnets, magnetic fields, in order to do that. Obviously, 338 00:21:49,720 --> 00:21:53,720 Speaker 1: it's also really important for augmented reality.
Again, if the 339 00:21:54,160 --> 00:21:57,399 Speaker 1: AR system doesn't detect that you are looking around, then 340 00:21:57,480 --> 00:22:01,240 Speaker 1: you're not getting relevant information for the specific thing you 341 00:22:01,359 --> 00:22:05,439 Speaker 1: are looking at. Anyway, as I said, the graphics were 342 00:22:05,440 --> 00:22:09,400 Speaker 1: pretty primitive, they were wireframe drawings, but they still showed 343 00:22:09,520 --> 00:22:15,280 Speaker 1: that this was a viable approach to technology, using an HMD 344 00:22:15,440 --> 00:22:18,840 Speaker 1: for augmented or virtual reality use. Oh, and one other 345 00:22:18,880 --> 00:22:21,399 Speaker 1: note I should make. So a lot of people say 346 00:22:21,440 --> 00:22:24,160 Speaker 1: the Sword of Damocles was the first head mounted display, 347 00:22:24,840 --> 00:22:26,840 Speaker 1: and they say, you know, the first HMD 348 00:22:27,040 --> 00:22:30,479 Speaker 1: was made in nineteen sixty eight. I take issue with that. 349 00:22:31,359 --> 00:22:34,520 Speaker 1: I don't think of the Sword of Damocles as the 350 00:22:34,560 --> 00:22:37,800 Speaker 1: first head mounted display. That, to me, should go to 351 00:22:38,080 --> 00:22:44,840 Speaker 1: a different invention called the Headsight. Now, that 352 00:22:44,920 --> 00:22:48,800 Speaker 1: was developed by Philco, and unlike the Sword of Damocles, 353 00:22:49,400 --> 00:22:53,800 Speaker 1: it didn't create a virtual world. Instead, the Headsight 354 00:22:54,040 --> 00:22:58,479 Speaker 1: was sort of a remote viewfinder for a video camera.
355 00:23:00,000 --> 00:23:04,240 Speaker 1: Imagine that you've got a camera mounted on a mechanical 356 00:23:05,320 --> 00:23:09,199 Speaker 1: swiveling mount, so you can move it left and right, you 357 00:23:09,200 --> 00:23:13,000 Speaker 1: can change the orientation, the inclination, as well, and then 358 00:23:13,040 --> 00:23:15,879 Speaker 1: you have that mapped to a head mounted display, so 359 00:23:15,920 --> 00:23:18,200 Speaker 1: that if I put the display on and I look 360 00:23:18,280 --> 00:23:20,880 Speaker 1: to the left, the camera pans to the left. If 361 00:23:20,880 --> 00:23:22,440 Speaker 1: I look to the right, it pans to the right. 362 00:23:22,560 --> 00:23:26,000 Speaker 1: That sort of thing. It was meant to be a 363 00:23:26,040 --> 00:23:29,879 Speaker 1: way for people to operate a camera in a remote 364 00:23:29,960 --> 00:23:33,800 Speaker 1: location that might not be very friendly to a human 365 00:23:33,880 --> 00:23:37,639 Speaker 1: being standing there. For example, the exterior of an aircraft. 366 00:23:38,000 --> 00:23:40,080 Speaker 1: You could have a camera mounted on the outside of 367 00:23:40,080 --> 00:23:43,639 Speaker 1: your aircraft that would allow an engineer on the inside 368 00:23:44,080 --> 00:23:48,560 Speaker 1: to look around and maybe help a pilot land or 369 00:23:48,800 --> 00:23:52,360 Speaker 1: navigate in a dangerous situation, or just get an idea 370 00:23:52,520 --> 00:23:56,959 Speaker 1: of the status of the aircraft itself.
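That head-to-camera mapping is simple enough to sketch in a few lines of code. To be clear, this is a toy illustration of the idea, not Philco's actual design, and the pan and tilt limits here are invented for the example.

```python
# Toy sketch of the Headsight idea: helmet orientation drives a remote
# pan/tilt camera, clamped to the mount's mechanical limits.
# The limit values below are invented for illustration.

PAN_LIMIT_DEG = 90.0   # hypothetical left/right travel of the mount
TILT_LIMIT_DEG = 45.0  # hypothetical up/down travel of the mount

def clamp(angle, limit):
    """Keep a commanded angle within +/- limit degrees."""
    return max(-limit, min(limit, angle))

def head_to_camera(head_yaw_deg, head_pitch_deg):
    """Map head yaw/pitch directly onto camera pan/tilt commands."""
    return (clamp(head_yaw_deg, PAN_LIMIT_DEG),
            clamp(head_pitch_deg, TILT_LIMIT_DEG))

if __name__ == "__main__":
    # Look 30 degrees left and 10 degrees up: the camera follows exactly.
    print(head_to_camera(-30.0, 10.0))   # (-30.0, 10.0)
    # Turn past the mount's range: the command is clamped at the limit.
    print(head_to_camera(120.0, -60.0))  # (90.0, -45.0)
```

The clamping is the part that matters: a mechanical mount can only swivel so far, so the system has to cap what the wearer's head commands it to do.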
This was very 371 00:23:57,080 --> 00:24:01,560 Speaker 1: much a technology that was being pushed by the military, 372 00:24:02,560 --> 00:24:06,440 Speaker 1: an idea to create more military uses for this technology, 373 00:24:06,480 --> 00:24:11,040 Speaker 1: to make the military more competent, more adept at very 374 00:24:11,240 --> 00:24:17,160 Speaker 1: rapidly changing situations on the technology front. So the Headsight preceded 375 00:24:17,520 --> 00:24:20,320 Speaker 1: the Sword of Damocles by about seven years. It came 376 00:24:20,359 --> 00:24:23,000 Speaker 1: out around nineteen sixty one. But again, it wasn't a 377 00:24:23,080 --> 00:24:26,639 Speaker 1: virtual reality headset or an augmented reality headset. It was 378 00:24:26,720 --> 00:24:29,600 Speaker 1: kind of, like I said, a remote viewfinder. But 379 00:24:29,720 --> 00:24:33,360 Speaker 1: still, I consider that to be the earliest head mounted display, 380 00:24:33,560 --> 00:24:37,199 Speaker 1: not the Sword of Damocles. However, Sutherland would end up 381 00:24:37,200 --> 00:24:40,320 Speaker 1: going on to make lots of other contributions in computer 382 00:24:40,400 --> 00:24:44,720 Speaker 1: graphics, as well as the overall concepts that would guide 383 00:24:44,760 --> 00:24:48,680 Speaker 1: both virtual reality and augmented reality development over the next 384 00:24:48,680 --> 00:24:52,760 Speaker 1: several decades. But now it'll be time for me to 385 00:24:52,840 --> 00:24:55,639 Speaker 1: kind of move away from Sutherland and talk about some 386 00:24:55,720 --> 00:24:59,080 Speaker 1: other developments that were important in AR. And before I 387 00:24:59,119 --> 00:25:02,760 Speaker 1: get to that, let's take a quick break to thank 388 00:25:02,800 --> 00:25:15,879 Speaker 1: our sponsor. All right, we just left off with Ivan Sutherland.
389 00:25:15,960 --> 00:25:21,679 Speaker 1: Now let's talk about a different father of augmented reality, 390 00:25:22,160 --> 00:25:26,920 Speaker 1: Myron Krueger, or doctor Myron Krueger. In nineteen seventy four, 391 00:25:26,960 --> 00:25:31,119 Speaker 1: doctor Krueger created an augmented reality lab called Videoplace. 392 00:25:32,440 --> 00:25:36,920 Speaker 1: He was really into this idea of seeing the interaction 393 00:25:37,119 --> 00:25:41,560 Speaker 1: of technology and people in artistic ways. He really wanted 394 00:25:41,600 --> 00:25:47,640 Speaker 1: to explore artistic expressions using technology and people working together. 395 00:25:48,960 --> 00:25:51,600 Speaker 1: So he wanted to create an artificial reality environment that 396 00:25:51,680 --> 00:25:55,080 Speaker 1: didn't require the user to wear special equipment. You wouldn't 397 00:25:55,119 --> 00:25:57,560 Speaker 1: have to put on a head mounted display, or wear 398 00:25:57,920 --> 00:26:02,560 Speaker 1: special gloves, or use any kind of device to 399 00:26:02,640 --> 00:26:06,760 Speaker 1: control your actions, because that's a barrier between you and 400 00:26:06,800 --> 00:26:12,679 Speaker 1: the experience. Instead, his version consisted of a laboratory that 401 00:26:12,760 --> 00:26:17,920 Speaker 1: had several rooms all networked together, and each room had 402 00:26:18,119 --> 00:26:21,920 Speaker 1: a video camera in it and a projector and a screen. Now, 403 00:26:21,920 --> 00:26:24,679 Speaker 1: the video camera would pick up the motions of the 404 00:26:24,720 --> 00:26:28,960 Speaker 1: person inside the room, it would send information to the projector, 405 00:26:29,080 --> 00:26:33,200 Speaker 1: which would then project the person's silhouette on the screen.
406 00:26:33,480 --> 00:26:36,800 Speaker 1: And the silhouette was typically a really bright color, and 407 00:26:36,840 --> 00:26:39,600 Speaker 1: you could move around and your silhouette would move around, 408 00:26:39,600 --> 00:26:42,240 Speaker 1: so you almost became like a puppet master controlling your 409 00:26:42,280 --> 00:26:45,959 Speaker 1: own silhouette. But then he started to incorporate other things, 410 00:26:46,000 --> 00:26:49,919 Speaker 1: like other elements that were virtually on the screen. The 411 00:26:49,960 --> 00:26:52,760 Speaker 1: projector was projecting things that were on the screen but 412 00:26:52,880 --> 00:26:56,399 Speaker 1: not in the actual real room itself. So imagine a ball, 413 00:26:56,640 --> 00:26:59,399 Speaker 1: and a ball is being projected on the screen. Well, 414 00:26:59,440 --> 00:27:01,800 Speaker 1: you could move so that your silhouette would interact with 415 00:27:01,840 --> 00:27:04,920 Speaker 1: the ball and the ball would bounce away. That sort 416 00:27:04,960 --> 00:27:07,080 Speaker 1: of thing. So you would be able to interact with 417 00:27:07,160 --> 00:27:11,000 Speaker 1: virtual environments by moving around in a real physical space. 418 00:27:11,560 --> 00:27:17,400 Speaker 1: And while those objects weren't really there in front of you, 419 00:27:17,400 --> 00:27:19,920 Speaker 1: you could see the representation of them on the screen. 420 00:27:20,280 --> 00:27:22,680 Speaker 1: And this was really powerful stuff.
And remember I said 421 00:27:22,720 --> 00:27:25,160 Speaker 1: these rooms were all networked together, so you could actually 422 00:27:25,400 --> 00:27:28,199 Speaker 1: have a system where a person in one room and 423 00:27:28,240 --> 00:27:31,919 Speaker 1: a person in another room both have their silhouettes projected 424 00:27:32,160 --> 00:27:37,640 Speaker 1: together in their respective rooms on the screen, and your 425 00:27:37,880 --> 00:27:40,720 Speaker 1: silhouette would be one color, the other person's silhouette would 426 00:27:40,720 --> 00:27:43,840 Speaker 1: be a different color, and you could interact with one another. 427 00:27:44,560 --> 00:27:51,200 Speaker 1: And according to reports from this art experiment, they noticed 428 00:27:51,200 --> 00:27:54,800 Speaker 1: that whenever people would have their silhouettes cross one another, 429 00:27:55,240 --> 00:27:58,960 Speaker 1: they would actually recoil in their physical rooms. Keep in mind, 430 00:27:59,000 --> 00:28:01,560 Speaker 1: they're in different rooms, they're not in the same one together, 431 00:28:02,000 --> 00:28:05,040 Speaker 1: they would recoil as if they had made physical contact 432 00:28:05,160 --> 00:28:08,200 Speaker 1: or bumped into someone. So it showed that there was 433 00:28:08,240 --> 00:28:13,080 Speaker 1: a very powerful psychological element to this virtual presence. And 434 00:28:13,119 --> 00:28:16,800 Speaker 1: again, that psychological element plays a hugely important role in 435 00:28:17,080 --> 00:28:21,080 Speaker 1: VR and AR research and development, not just for creating products, 436 00:28:21,080 --> 00:28:25,360 Speaker 1: but just to understand how we process information and incorporate 437 00:28:25,400 --> 00:28:28,800 Speaker 1: it into our sense of reality. Not to get too 438 00:28:28,880 --> 00:28:33,320 Speaker 1: deep for you guys. So experimentation in the field continued 439 00:28:33,400 --> 00:28:36,600 Speaker 1: over the years.
In the early nineteen eighties, doctor Krueger 440 00:28:36,760 --> 00:28:40,520 Speaker 1: would write and publish a book about artificial realities. 441 00:28:41,160 --> 00:28:44,000 Speaker 1: But while the principles for augmented reality were established, the 442 00:28:44,080 --> 00:28:49,720 Speaker 1: technologies were still rather unwieldy. They were large, they weren't reliable, 443 00:28:50,040 --> 00:28:53,440 Speaker 1: and it would require several years of work to improve 444 00:28:53,520 --> 00:28:57,840 Speaker 1: those technologies, to create miniaturization strategies, to get the elements 445 00:28:57,880 --> 00:29:00,880 Speaker 1: down to a size that was more practical for that 446 00:29:01,040 --> 00:29:03,719 Speaker 1: sort of use, one that wouldn't require you to have 447 00:29:03,800 --> 00:29:07,840 Speaker 1: a head mounted display mounted to the ceiling. And all of 448 00:29:07,840 --> 00:29:10,880 Speaker 1: that took time, but you could tell that the ideas 449 00:29:11,160 --> 00:29:15,920 Speaker 1: underlying augmented and virtual reality were already in place. In 450 00:29:16,040 --> 00:29:20,720 Speaker 1: nineteen ninety there was a Boeing researcher named Tom Caudell 451 00:29:20,960 --> 00:29:26,240 Speaker 1: who coined the term augmented reality, and he was specifically 452 00:29:26,320 --> 00:29:29,360 Speaker 1: using it to talk about this approach of overlaying digital 453 00:29:29,360 --> 00:29:31,959 Speaker 1: information on top of our physical world to enhance it 454 00:29:32,080 --> 00:29:36,120 Speaker 1: in some way. Now, doctor Caudell earned a PhD in 455 00:29:36,160 --> 00:29:40,480 Speaker 1: physics and astronomy from the University of Arizona, and before 456 00:29:40,480 --> 00:29:44,080 Speaker 1: contributing the term augmented reality to the public lexicon, he 457 00:29:44,160 --> 00:29:48,160 Speaker 1: did extensive work in artificial intelligence research and development.
He 458 00:29:48,240 --> 00:29:50,640 Speaker 1: also became a professor in the fields of electrical and 459 00:29:50,680 --> 00:29:54,960 Speaker 1: computer engineering at the University of New Mexico. So when 460 00:29:54,960 --> 00:29:57,600 Speaker 1: he was working with Boeing, he used this phrase to 461 00:29:57,680 --> 00:30:00,760 Speaker 1: talk about a specific system he was working on, an augmented 462 00:30:00,800 --> 00:30:03,400 Speaker 1: reality system, and the whole purpose of this was to 463 00:30:03,600 --> 00:30:09,640 Speaker 1: help the people who were constructing airplanes lay cables properly. 464 00:30:10,320 --> 00:30:13,520 Speaker 1: The whole idea was to use this system so that 465 00:30:14,800 --> 00:30:17,800 Speaker 1: an electrician could see exactly where the cable needed to 466 00:30:17,840 --> 00:30:23,720 Speaker 1: go inside the partly constructed cabin of an aircraft, and 467 00:30:23,760 --> 00:30:26,520 Speaker 1: that way you could follow the directions that you see 468 00:30:26,560 --> 00:30:30,680 Speaker 1: through your display, lay the actual cable down where the 469 00:30:30,720 --> 00:30:34,480 Speaker 1: guide tells you to go, and then you would have 470 00:30:34,520 --> 00:30:40,320 Speaker 1: a properly wired airplane. And I'm sure, as we're all aware, 471 00:30:40,440 --> 00:30:46,600 Speaker 1: properly wired airplanes are good airplanes. Improperly wired airplanes are 472 00:30:46,640 --> 00:30:49,320 Speaker 1: not so good. So it was a very important system 473 00:30:49,360 --> 00:30:53,520 Speaker 1: to make this much smoother and faster, and it 474 00:30:53,600 --> 00:30:58,360 Speaker 1: meant that you didn't have to have as many experts 475 00:30:59,200 --> 00:31:02,360 Speaker 1: to guide the process.
You could actually have someone come 476 00:31:02,400 --> 00:31:06,160 Speaker 1: in who had never done this before and just follow 477 00:31:06,200 --> 00:31:11,120 Speaker 1: the directions through this augmented reality system and they 478 00:31:11,120 --> 00:31:17,080 Speaker 1: could wire the airplane properly. So, a really clever means of 479 00:31:17,200 --> 00:31:20,640 Speaker 1: using augmented reality. Also, we would end up seeing that 480 00:31:20,720 --> 00:31:23,880 Speaker 1: same sort of philosophy used again and again in the 481 00:31:23,920 --> 00:31:28,200 Speaker 1: future in more sophisticated types of technology, but it was 482 00:31:28,240 --> 00:31:32,840 Speaker 1: the exact same approach, the exact same idea underlying it. In 483 00:31:32,920 --> 00:31:36,840 Speaker 1: nineteen ninety two, Louis Rosenberg proposed a system that the 484 00:31:36,880 --> 00:31:40,080 Speaker 1: Air Force could use to allow someone to control devices 485 00:31:40,120 --> 00:31:43,800 Speaker 1: from a remote location, and that consisted of a video 486 00:31:43,880 --> 00:31:46,960 Speaker 1: camera which would provide the visual data to the user 487 00:31:47,680 --> 00:31:50,160 Speaker 1: through a head mounted display. They would wear the display 488 00:31:50,160 --> 00:31:52,560 Speaker 1: on their heads or they would look at a screen, 489 00:31:52,640 --> 00:31:56,080 Speaker 1: but typically they'd wear a display, and then they would 490 00:31:56,080 --> 00:32:00,000 Speaker 1: also wear an exoskeleton on their upper body that would 491 00:32:00,160 --> 00:32:05,320 Speaker 1: allow them to control some sort of robotic device, typically 492 00:32:05,400 --> 00:32:08,520 Speaker 1: robotic arms.
And usually the way this would work is 493 00:32:08,560 --> 00:32:11,640 Speaker 1: that the display was designed in such a way with 494 00:32:11,720 --> 00:32:15,040 Speaker 1: the video camera so that the view that the person 495 00:32:15,160 --> 00:32:18,480 Speaker 1: had made it look like the robot arms were 496 00:32:18,640 --> 00:32:22,240 Speaker 1: their actual arms, which required a little bit of trickery 497 00:32:22,480 --> 00:32:25,880 Speaker 1: on the part of Rosenberg. They had to fudge the 498 00:32:25,960 --> 00:32:29,120 Speaker 1: distances between the video camera and the robotic arms to 499 00:32:29,200 --> 00:32:33,440 Speaker 1: give this sort of feeling that the robot arms represented 500 00:32:33,480 --> 00:32:36,320 Speaker 1: your actual arms. So you move your arms inside the 501 00:32:36,320 --> 00:32:40,240 Speaker 1: exoskeleton and the robot arms would move as well at 502 00:32:40,240 --> 00:32:43,040 Speaker 1: their remote location. So it's kind of like a really 503 00:32:43,040 --> 00:32:48,920 Speaker 1: fancy remote control. Now imagine that the robot arms are 504 00:32:48,920 --> 00:32:54,240 Speaker 1: holding various tools. The suit would also provide haptic feedback, 505 00:32:54,320 --> 00:32:57,200 Speaker 1: that touch based feedback, to let a user know more 506 00:32:57,280 --> 00:32:59,880 Speaker 1: about what is going on when they're operating the arms. 507 00:33:00,320 --> 00:33:03,800 Speaker 1: So if you were to do something that would make 508 00:33:03,840 --> 00:33:07,640 Speaker 1: a robot arm encounter resistance, then you would feel 509 00:33:08,040 --> 00:33:11,200 Speaker 1: haptic feedback in the suit that would indicate, oh, you're 510 00:33:11,240 --> 00:33:13,960 Speaker 1: going beyond the parameters of where this robot arm is 511 00:33:14,000 --> 00:33:17,800 Speaker 1: capable of going.
So you learn very quickly where you 512 00:33:17,840 --> 00:33:21,160 Speaker 1: can operate within that suit and make sure that you 513 00:33:21,760 --> 00:33:26,160 Speaker 1: are not pushing it beyond its limits. You could also 514 00:33:27,520 --> 00:33:30,680 Speaker 1: end up using these tools to do various things in 515 00:33:30,720 --> 00:33:34,600 Speaker 1: this remote environment. Now, Rosenberg called the system Virtual Fixtures, 516 00:33:35,400 --> 00:33:37,880 Speaker 1: which meant that the user would see these virtual overlays 517 00:33:37,880 --> 00:33:40,120 Speaker 1: on top of a real environment that they were looking 518 00:33:40,160 --> 00:33:43,720 Speaker 1: at. So I'm going to give a very basic example 519 00:33:45,120 --> 00:33:47,440 Speaker 1: to illustrate this, because it's hard to imagine, it's hard 520 00:33:47,440 --> 00:33:49,800 Speaker 1: to get it across in words. But let's say you're 521 00:33:49,840 --> 00:33:52,440 Speaker 1: looking through a head mounted display and in front of 522 00:33:52,440 --> 00:33:58,480 Speaker 1: you is a board, a wooden board, and it's just a 523 00:33:58,480 --> 00:34:01,560 Speaker 1: regular wooden board. There's nothing painted on it or anything 524 00:34:01,560 --> 00:34:03,920 Speaker 1: in the real world, and it's in a room that's 525 00:34:04,240 --> 00:34:07,000 Speaker 1: across the building from you. You cannot see this with 526 00:34:07,080 --> 00:34:08,920 Speaker 1: your own eyes. You can only see it through the 527 00:34:08,960 --> 00:34:14,400 Speaker 1: video camera. The virtual fixture overlay might be a series 528 00:34:14,440 --> 00:34:17,440 Speaker 1: of circles, and the circles are things that you are 529 00:34:17,480 --> 00:34:19,720 Speaker 1: meant to cut out of the board using the robot 530 00:34:19,880 --> 00:34:24,000 Speaker 1: arms and a tool that's right there inside the physical environment, 531 00:34:24,040 --> 00:34:27,440 Speaker 1: across the building from you.
So you follow the patterns 532 00:34:27,560 --> 00:34:31,040 Speaker 1: that you see in this virtual overlay and you complete 533 00:34:31,080 --> 00:34:35,360 Speaker 1: the task. That's a very simple example, and this system 534 00:34:35,480 --> 00:34:38,080 Speaker 1: was meant to allow for that. That's what he would 535 00:34:38,120 --> 00:34:41,800 Speaker 1: call the virtual fixtures, these overlays that you would see 536 00:34:42,360 --> 00:34:45,440 Speaker 1: that would appear to be real, but actually were not 537 00:34:45,600 --> 00:34:51,240 Speaker 1: present in the physical environment itself. Now, also in nineteen 538 00:34:51,320 --> 00:34:54,640 Speaker 1: ninety two, a group of researchers at Columbia University were 539 00:34:54,680 --> 00:34:58,120 Speaker 1: proposing a system that they called the Knowledge based Augmented 540 00:34:58,160 --> 00:35:05,840 Speaker 1: Reality for Maintenance Assistance, aka KARMA. Cute. Their approach was 541 00:35:05,880 --> 00:35:08,839 Speaker 1: pretty novel. They pointed out that while augmented reality had 542 00:35:08,960 --> 00:35:12,520 Speaker 1: tremendous potential, it also had a really big barrier in 543 00:35:12,560 --> 00:35:16,320 Speaker 1: that it takes an enormous amount of time to design, 544 00:35:16,840 --> 00:35:21,160 Speaker 1: or animate, and implement these graphic overlays for AR applications. 545 00:35:21,200 --> 00:35:24,040 Speaker 1: So let's say you're in a room and you're looking 546 00:35:24,040 --> 00:35:26,840 Speaker 1: at different objects, and little labels are popping up for 547 00:35:26,880 --> 00:35:30,040 Speaker 1: each object. If you're having to do all that by hand, 548 00:35:30,040 --> 00:35:33,160 Speaker 1: it takes a huge amount of time. What they wanted 549 00:35:33,160 --> 00:35:37,839 Speaker 1: to do was create artificial intelligence systems, or at least 550 00:35:37,880 --> 00:35:44,560 Speaker 1: techniques, to generate graphics automatically on the fly.
So this 551 00:35:44,640 --> 00:35:48,480 Speaker 1: would be similar to using image recognition software, so that 552 00:35:48,520 --> 00:35:52,680 Speaker 1: if you look at a specific box, let's say, the 553 00:35:52,719 --> 00:35:55,280 Speaker 1: image recognition software might be able to map that box 554 00:35:55,320 --> 00:35:58,000 Speaker 1: to a specific product and thus give you an overlay 555 00:35:58,040 --> 00:36:01,759 Speaker 1: of information about the product that's inside that box. And 556 00:36:01,760 --> 00:36:03,480 Speaker 1: it would be able to do all this automatically. It 557 00:36:03,520 --> 00:36:07,840 Speaker 1: would not require a human programmer to go through and 558 00:36:08,000 --> 00:36:11,280 Speaker 1: look at every single product in every single type of box 559 00:36:11,480 --> 00:36:14,719 Speaker 1: and program all that out. That would be ridiculous, it 560 00:36:14,719 --> 00:36:17,600 Speaker 1: would take forever. So it was the work of this 561 00:36:17,640 --> 00:36:20,759 Speaker 1: group with KARMA that really started the ball rolling with 562 00:36:20,840 --> 00:36:24,920 Speaker 1: this AI approach to automatically fill in that information and 563 00:36:25,360 --> 00:36:31,480 Speaker 1: make AR a more practical experience. Around the same time, 564 00:36:31,680 --> 00:36:36,320 Speaker 1: between ninety two and ninety three, Loral Western Development Labs, 565 00:36:36,600 --> 00:36:39,279 Speaker 1: which was a defense contractor, began to work with the 566 00:36:39,360 --> 00:36:43,080 Speaker 1: US military to create AR systems for military vehicles. And 567 00:36:43,120 --> 00:36:47,000 Speaker 1: you can understand very quickly how AR would have enormous 568 00:36:47,040 --> 00:36:51,279 Speaker 1: potential for military applications.
And in fact, AR is very 569 00:36:51,320 --> 00:36:54,360 Speaker 1: commonly used in lots of different things like pilot helmets, 570 00:36:54,440 --> 00:37:01,360 Speaker 1: where it helps pilots keep track of targets and potential threats, 571 00:37:01,440 --> 00:37:05,040 Speaker 1: that kind of thing. But in this case, they were 572 00:37:05,080 --> 00:37:09,799 Speaker 1: really looking at creating an augmented reality system that would 573 00:37:09,800 --> 00:37:15,480 Speaker 1: create virtual opponents for people working in simulated wartime conditions, 574 00:37:15,960 --> 00:37:19,520 Speaker 1: so really a training program. Imagine that you're operating an 575 00:37:19,600 --> 00:37:23,680 Speaker 1: actual military vehicle like a tank, and you have a 576 00:37:24,000 --> 00:37:28,400 Speaker 1: view outside that is really an augmented reality system, so 577 00:37:28,440 --> 00:37:30,640 Speaker 1: you're actually looking at the real world around you. You 578 00:37:30,640 --> 00:37:33,200 Speaker 1: aren't just sitting in a simulator inside a building. You 579 00:37:33,239 --> 00:37:36,560 Speaker 1: are out there in the field controlling a real vehicle 580 00:37:36,640 --> 00:37:40,480 Speaker 1: moving around in real terrain, but you also see virtual 581 00:37:40,960 --> 00:37:45,399 Speaker 1: representations of enemies in that real terrain, and you can 582 00:37:45,520 --> 00:37:49,200 Speaker 1: practice maneuvers and firing on enemies, that sort of thing, 583 00:37:49,520 --> 00:37:53,359 Speaker 1: probably not using live ammunition at that point, but having 584 00:37:53,360 --> 00:37:56,239 Speaker 1: a more realistic simulation in a real environment, so that 585 00:37:56,280 --> 00:38:03,000 Speaker 1: you're not just trying to create a totally virtual scenario.
Anyway, 586 00:38:03,000 --> 00:38:05,480 Speaker 1: that work was done in ninety two and ninety three, but 587 00:38:05,960 --> 00:38:08,359 Speaker 1: the world wouldn't really learn about it at large until 588 00:38:08,360 --> 00:38:11,680 Speaker 1: about ninety nine, because that's the way the military works. 589 00:38:11,880 --> 00:38:14,080 Speaker 1: They're not so eager to talk about their stuff while 590 00:38:14,080 --> 00:38:18,240 Speaker 1: they're still doing it. Meanwhile, at the same time, artists 591 00:38:18,280 --> 00:38:21,719 Speaker 1: were continuing to explore the relationships between physical performers and 592 00:38:21,800 --> 00:38:24,880 Speaker 1: virtual elements. You remember I talked about doctor Krueger earlier. 593 00:38:24,960 --> 00:38:27,960 Speaker 1: Well, in nineteen ninety four, a different artist, Julie Martin, 594 00:38:28,000 --> 00:38:32,600 Speaker 1: would create a piece called Dancing in Cyberspace, and in 595 00:38:32,680 --> 00:38:36,240 Speaker 1: that piece, dancers on a physical space, or a physical 596 00:38:36,239 --> 00:38:39,960 Speaker 1: stage, were able to manipulate virtual objects, so an audience 597 00:38:39,960 --> 00:38:43,000 Speaker 1: would be able to see both the physical performance by 598 00:38:43,000 --> 00:38:48,160 Speaker 1: the dancers and the virtual reactions, the things that happened 599 00:38:48,160 --> 00:38:50,760 Speaker 1: within the virtual environment as a result of the dancers 600 00:38:50,800 --> 00:38:55,040 Speaker 1: moving around their physical space. Pretty neat. In nineteen ninety five, 601 00:38:55,880 --> 00:39:01,600 Speaker 1: two researchers, Rekimoto and Nagao, created the first real 602 00:39:01,760 --> 00:39:05,080 Speaker 1: handheld AR display. But it was a tethered display. It 603 00:39:05,160 --> 00:39:08,120 Speaker 1: wasn't free form. You couldn't just take it anywhere.
It 604 00:39:08,200 --> 00:39:11,560 Speaker 1: was called NaviCam, and you had to have a tether, 605 00:39:12,000 --> 00:39:15,840 Speaker 1: a cable, essentially connecting the NaviCam to a workstation. But it 606 00:39:15,880 --> 00:39:19,160 Speaker 1: had a forward facing camera and you could use a 607 00:39:19,239 --> 00:39:24,200 Speaker 1: video feed to go through this handheld device, through the 608 00:39:25,360 --> 00:39:28,760 Speaker 1: cable, to the workstation, and it could detect color coded 609 00:39:28,800 --> 00:39:31,920 Speaker 1: markers in the camera image and display information on a 610 00:39:32,000 --> 00:39:34,560 Speaker 1: video see through view. So you could get that augmented 611 00:39:34,640 --> 00:39:38,120 Speaker 1: reality experience. Obviously very limited, you know, you could not 612 00:39:38,280 --> 00:39:40,600 Speaker 1: just carry this around with you everywhere you go, but 613 00:39:40,680 --> 00:39:45,319 Speaker 1: it showed the ideas behind augmented reality could in fact 614 00:39:45,360 --> 00:39:48,480 Speaker 1: be realized in a handheld format. Now, it was just 615 00:39:48,520 --> 00:39:52,319 Speaker 1: a matter of getting those different components small enough to 616 00:39:52,360 --> 00:39:57,960 Speaker 1: all fit in a self contained mobile form factor. Now 617 00:39:57,960 --> 00:40:01,040 Speaker 1: in the late nineties, we started seeing televised sporting events 618 00:40:01,160 --> 00:40:05,120 Speaker 1: featuring augmented reality elements, or at least you did. I 619 00:40:05,200 --> 00:40:08,920 Speaker 1: don't watch sports ball. That's not entirely true, but I 620 00:40:08,960 --> 00:40:12,239 Speaker 1: don't watch football or hockey, American football or hockey, and 621 00:40:12,360 --> 00:40:16,000 Speaker 1: both of those were the sports that really got them 622 00:40:16,040 --> 00:40:19,760 Speaker 1: first. Okay, I'm going to backtrack.
I used to watch hockey, 623 00:40:20,360 --> 00:40:27,120 Speaker 1: but then Winnipeg stole the Atlanta Thrashers from me. Winnipeg. Okay, 624 00:40:27,160 --> 00:40:30,120 Speaker 1: getting back to hockey. So hockey had the FoxTrax system, 625 00:40:30,239 --> 00:40:33,440 Speaker 1: which Fox put into hockey games so that you could 626 00:40:33,520 --> 00:40:36,160 Speaker 1: easily follow the puck. Instead of trying to watch this 627 00:40:36,239 --> 00:40:39,359 Speaker 1: little bitty black disc spinning around, you got to watch 628 00:40:39,360 --> 00:40:45,960 Speaker 1: this very bright, highlighted, neon colored disc that everyone hated. 629 00:40:46,840 --> 00:40:50,160 Speaker 1: And after about two seasons, Fox stopped doing it and 630 00:40:50,239 --> 00:40:54,239 Speaker 1: people were happy, until the Thrashers moved away, and then 631 00:40:54,280 --> 00:40:59,080 Speaker 1: it was just miserable. American football would follow suit in 632 00:40:59,120 --> 00:41:02,560 Speaker 1: the late nineties and have the first down line introduced, 633 00:41:02,600 --> 00:41:07,360 Speaker 1: where they could, on live video, overlay the first down line. 634 00:41:07,800 --> 00:41:11,359 Speaker 1: Usually it's a bright yellow line that indicates how far 635 00:41:11,440 --> 00:41:13,920 Speaker 1: the offensive team needs to go. And by offensive, I 636 00:41:13,960 --> 00:41:16,120 Speaker 1: mean they're on the offensive. I don't mean they offend 637 00:41:16,160 --> 00:41:20,480 Speaker 1: my sensibilities. I'm not that against American football. But it 638 00:41:20,520 --> 00:41:22,000 Speaker 1: showed how far they would need to go in order 639 00:41:22,080 --> 00:41:25,200 Speaker 1: to establish a first down, which I am told is 640 00:41:25,200 --> 00:41:29,480 Speaker 1: something you want to do.
That would start to get 641 00:41:29,960 --> 00:41:33,040 Speaker 1: employed in nineteen ninety eight, and over time we would 642 00:41:33,040 --> 00:41:37,000 Speaker 1: see that increase, where eventually Skycam was able to even 643 00:41:37,320 --> 00:41:39,919 Speaker 1: use this system. At first, it wasn't. You could get 644 00:41:39,960 --> 00:41:42,440 Speaker 1: a Skycam view, but you couldn't do the overlay of 645 00:41:42,440 --> 00:41:47,000 Speaker 1: the first and ten line until later. Well, I've got 646 00:41:47,040 --> 00:41:50,640 Speaker 1: a lot more to say about augmented reality, but before 647 00:41:50,680 --> 00:42:04,040 Speaker 1: I do, let's take another quick break to thank our sponsor. Okay, 648 00:42:04,040 --> 00:42:08,120 Speaker 1: we're back. Let's skip ahead to nineteen ninety nine. I 649 00:42:08,120 --> 00:42:10,280 Speaker 1: guess it's not really skipping. I just talked about nineteen 650 00:42:10,360 --> 00:42:14,399 Speaker 1: ninety eight. Let's plod ahead to nineteen ninety nine. That's 651 00:42:14,440 --> 00:42:19,759 Speaker 1: when NASA's X thirty eight spacecraft was using an AR 652 00:42:19,920 --> 00:42:24,120 Speaker 1: system as part of its navigational tools, so people back 653 00:42:24,160 --> 00:42:27,879 Speaker 1: on Earth could look at a view from the spacecraft, 654 00:42:27,920 --> 00:42:32,279 Speaker 1: a camera mounted on the spacecraft, and on top of 655 00:42:32,280 --> 00:42:36,360 Speaker 1: that view they could overlay map data to help with navigation. 656 00:42:36,600 --> 00:42:38,560 Speaker 1: And all of that, of course, was controlled back here 657 00:42:38,640 --> 00:42:41,680 Speaker 1: on Earth. But it was sort of an experiment to 658 00:42:41,719 --> 00:42:46,000 Speaker 1: see how augmented reality could be incorporated into space exploration 659 00:42:46,120 --> 00:42:50,200 Speaker 1: missions in the future and make them more effective.
Also 660 00:42:50,280 --> 00:42:53,399 Speaker 1: in nineteen ninety nine, the Navy began work on the 661 00:42:53,440 --> 00:42:58,320 Speaker 1: Battlefield Augmented Reality System, or BARS, which is a wearable 662 00:42:58,600 --> 00:43:02,759 Speaker 1: AR system for soldiers. You've probably seen various implementations of 663 00:43:02,800 --> 00:43:06,680 Speaker 1: this over the years. It's obviously evolved since nineteen ninety nine. 664 00:43:07,440 --> 00:43:10,640 Speaker 1: It's one of those pieces of technology that some soldiers 665 00:43:11,040 --> 00:43:14,000 Speaker 1: took to, but a lot just felt that it created 666 00:43:14,120 --> 00:43:20,840 Speaker 1: unnecessary distractions. Technology and warfare is very very difficult because 667 00:43:21,360 --> 00:43:25,160 Speaker 1: there are times when we think, oh, more information is always better, 668 00:43:25,680 --> 00:43:28,720 Speaker 1: but in some cases that doesn't seem to hold true, 669 00:43:29,640 --> 00:43:33,200 Speaker 1: and for some people with these head mounted displays, or 670 00:43:33,280 --> 00:43:37,239 Speaker 1: really heads up displays, HUDs, that can sometimes be 671 00:43:37,320 --> 00:43:41,959 Speaker 1: the case. It depends on the implementation. In two thousand, Hirokazu 672 00:43:42,160 --> 00:43:48,000 Speaker 1: Kato created a software library called ARToolKit. Very 673 00:43:48,160 --> 00:43:51,920 Speaker 1: important software library. It was also open source, so anyone could 674 00:43:52,040 --> 00:43:56,080 Speaker 1: contribute to it, modify it, or put out a new 675 00:43:56,200 --> 00:43:59,800 Speaker 1: version, that sort of stuff. And it uses video tracking 676 00:43:59,800 --> 00:44:03,719 Speaker 1: to overlay computer graphics on a video camera feed, and 677 00:44:03,760 --> 00:44:07,200 Speaker 1: it's still a component for a lot of AR experiences today.
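For anyone reading along who wants to see the idea in code: the heart of what an ARToolKit-style system does, once it has found a marker in the camera frame, is project a virtual 3D point into 2D pixel coordinates so the graphics land in the right spot on the video feed. Here is a minimal, illustrative sketch of that pinhole-camera projection; the focal length, image size, and point position are made-up example values, and this is not ARToolKit's actual API.

```python
# A minimal sketch of the core math behind marker-based AR overlay:
# projecting a virtual 3D point (say, anchored to a detected marker)
# into 2D pixel coordinates via a pinhole camera model. All values
# here (focal length, image size, point position) are illustrative.

def project_point(point_3d, focal_px, cx, cy):
    """Project a 3D camera-space point onto the image plane.

    point_3d: (x, y, z) in camera coordinates, z > 0 (in front of camera)
    focal_px: focal length expressed in pixels
    cx, cy:   principal point (roughly the image center) in pixels
    """
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("point is behind the camera")
    u = focal_px * x / z + cx
    v = focal_px * y / z + cy
    return (u, v)

# Suppose marker detection placed a virtual character 0.5 m in front
# of the camera and 0.1 m to the right, on a 640x480 image with a
# 500-pixel focal length. The overlay gets drawn at the returned pixel.
u, v = project_point((0.1, 0.0, 0.5), focal_px=500, cx=320, cy=240)
print(u, v)  # → 420.0 240.0
```

As the camera moves, the tracker re-estimates the marker's pose every frame and re-runs this projection, which is why the graphics appear glued to the marker.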
678 00:44:07,440 --> 00:44:10,879 Speaker 1: Later on in the two thousands, this would be adapted 679 00:44:11,239 --> 00:44:14,800 Speaker 1: so that it could also be used in web experiences, 680 00:44:14,840 --> 00:44:20,600 Speaker 1: not just native experiences on specific devices, and we continued 681 00:44:20,600 --> 00:44:24,920 Speaker 1: to see AR built into new experiences, including smartphones and tablets. 682 00:44:24,960 --> 00:44:28,160 Speaker 1: By two thousand and four, some researchers in Germany were 683 00:44:28,160 --> 00:44:31,640 Speaker 1: creating AR apps that could take advantage of a smartphone's camera. 684 00:44:32,280 --> 00:44:35,680 Speaker 1: But two thousand and four's pretty early for smartphones. It 685 00:44:35,760 --> 00:44:39,160 Speaker 1: really would be a few years before this would 686 00:44:39,200 --> 00:44:42,680 Speaker 1: truly take off, because that's when Apple came out with 687 00:44:42,719 --> 00:44:45,719 Speaker 1: the iPhone in two thousand and seven. That was the 688 00:44:45,760 --> 00:44:50,400 Speaker 1: real revolution in smartphone technology. There had been smartphones before 689 00:44:50,480 --> 00:44:52,920 Speaker 1: the iPhone, don't get me wrong, and many of them 690 00:44:52,920 --> 00:44:57,360 Speaker 1: were really good, but the iPhone was something that caught 691 00:44:57,400 --> 00:45:02,080 Speaker 1: the public's attention and made smartphones sexy. And because of that, 692 00:45:02,239 --> 00:45:05,320 Speaker 1: there was a ton of money poured into the smartphone 693 00:45:05,320 --> 00:45:08,400 Speaker 1: industry, not just to Apple, but also to 694 00:45:08,480 --> 00:45:11,759 Speaker 1: other companies, like the companies that were offering Android smartphones.
695 00:45:12,760 --> 00:45:15,759 Speaker 1: But I think we can really thank Apple for all 696 00:45:15,800 --> 00:45:19,400 Speaker 1: of that happening in the first place, especially things like 697 00:45:19,400 --> 00:45:21,960 Speaker 1: the accelerometer that let you switch from portrait to 698 00:45:22,040 --> 00:45:25,040 Speaker 1: landscape mode. I remember everyone freaking out about that when 699 00:45:25,600 --> 00:45:28,040 Speaker 1: Steve Jobs showed it off in two thousand and seven 700 00:45:28,040 --> 00:45:32,080 Speaker 1: at Macworld and everyone thought, wow, this is amazing. Well, 701 00:45:32,120 --> 00:45:33,640 Speaker 1: we take it for granted now, but it was a 702 00:45:33,640 --> 00:45:38,560 Speaker 1: big deal then. So once that smartphone revolution happened, it 703 00:45:38,680 --> 00:45:43,880 Speaker 1: was a landslide victory for both augmented reality and virtual 704 00:45:43,920 --> 00:45:47,200 Speaker 1: reality research and development, because it meant that so much 705 00:45:47,239 --> 00:45:53,040 Speaker 1: money was being poured into creating newer, thinner, more capable 706 00:45:53,080 --> 00:45:58,160 Speaker 1: smartphones that we saw an explosion in technological development that 707 00:45:58,200 --> 00:46:03,520 Speaker 1: could also be used for virtual and augmented reality experiences. So, 708 00:46:03,640 --> 00:46:07,440 Speaker 1: for example, think of those sensors I talked about earlier, 709 00:46:07,520 --> 00:46:12,399 Speaker 1: accelerometers and gyroscopes, that sort of thing. Well, we saw 710 00:46:12,400 --> 00:46:14,600 Speaker 1: a lot of development in those spaces in order to 711 00:46:14,600 --> 00:46:17,759 Speaker 1: make smartphones better, and people who were working in AR 712 00:46:17,840 --> 00:46:21,200 Speaker 1: and VR experiences could take advantage of those same sensors, 713 00:46:21,520 --> 00:46:25,920 Speaker 1: either by creating apps specifically for smartphones.
Thus, you don't have 714 00:46:26,000 --> 00:46:29,440 Speaker 1: to build any other hardware, you just use existing hardware, 715 00:46:29,520 --> 00:46:32,600 Speaker 1: but that limits how you can use it, right? Because 716 00:46:32,640 --> 00:46:35,040 Speaker 1: you don't typically wear your smartphone directly in front of 717 00:46:35,040 --> 00:46:38,879 Speaker 1: your face. Or they could end up taking advantage of 718 00:46:38,920 --> 00:46:43,120 Speaker 1: those new, smaller sensors and incorporate them directly into brand 719 00:46:43,160 --> 00:46:46,640 Speaker 1: new hardware, like various types of wearables like Google Glass, 720 00:46:46,640 --> 00:46:48,440 Speaker 1: for example, but that would be a few more years. 721 00:46:50,040 --> 00:46:55,280 Speaker 1: In twenty eleven, Nintendo launched the Nintendo three DS, which 722 00:46:55,280 --> 00:46:59,720 Speaker 1: included a camera. It was a three D capable handheld 723 00:46:59,719 --> 00:47:04,800 Speaker 1: device and actually included a pair of forward facing cameras, 724 00:47:05,280 --> 00:47:07,759 Speaker 1: so you could take three D photos if you wanted to, 725 00:47:08,880 --> 00:47:12,680 Speaker 1: and it also had some AR software included with it. 726 00:47:13,120 --> 00:47:17,200 Speaker 1: You would get these special Nintendo cards, kind of like 727 00:47:17,280 --> 00:47:21,280 Speaker 1: playing cards, and if you were to point the camera 728 00:47:21,400 --> 00:47:24,759 Speaker 1: of the three DS at the card and look at 729 00:47:24,800 --> 00:47:28,440 Speaker 1: the screen, you would see a little virtual three dimensional 730 00:47:28,600 --> 00:47:31,520 Speaker 1: character pop up on the card. So Mario would be 731 00:47:31,520 --> 00:47:34,759 Speaker 1: an obvious example.
You put the Mario card down on 732 00:47:34,840 --> 00:47:37,239 Speaker 1: the table, you hold up the three DS, and you 733 00:47:37,280 --> 00:47:39,279 Speaker 1: aim the camera at the card, and you look at 734 00:47:39,280 --> 00:47:42,040 Speaker 1: the screen and there's Mario, and Mario appears to be 735 00:47:42,120 --> 00:47:45,799 Speaker 1: jumping around on your physical table. Now, obviously, if you 736 00:47:45,840 --> 00:47:50,440 Speaker 1: look off of the display, there's no Mario jumping around, 737 00:47:51,040 --> 00:47:53,600 Speaker 1: but on the display there he is, and it was 738 00:47:53,640 --> 00:47:57,040 Speaker 1: pretty cute. I remember being really impressed with this very 739 00:47:57,080 --> 00:48:00,720 Speaker 1: simple implementation of AR when we got our three DS, 740 00:48:01,640 --> 00:48:04,920 Speaker 1: and then I took our three DS apart, and then 741 00:48:04,920 --> 00:48:07,680 Speaker 1: I took pictures of it, and then I posted it 742 00:48:07,760 --> 00:48:12,759 Speaker 1: on Twitter and people got sad. It was a great day. 743 00:48:12,800 --> 00:48:16,640 Speaker 1: In twenty thirteen, Google introduced Google Glass. That was the 744 00:48:16,680 --> 00:48:20,000 Speaker 1: wearable that included a small display positioned just above the 745 00:48:20,040 --> 00:48:24,640 Speaker 1: right eye, so when you looked straight forward, you could tell 746 00:48:24,680 --> 00:48:28,799 Speaker 1: that there was something kind of above your natural eyeline, 747 00:48:29,360 --> 00:48:32,799 Speaker 1: but it didn't get in the way too much. To 748 00:48:32,840 --> 00:48:34,800 Speaker 1: look at the screen, you actually had to glance upward, 749 00:48:34,880 --> 00:48:37,440 Speaker 1: and then you could see what 750 00:48:37,560 --> 00:48:42,640 Speaker 1: was on the display. Google Glass had augmented reality features 751 00:48:42,680 --> 00:48:47,360 Speaker 1: like crazy. You could take video calls.
You could actually 752 00:48:48,040 --> 00:48:52,319 Speaker 1: use the glasses to not just take a video call, 753 00:48:52,360 --> 00:48:55,239 Speaker 1: but show the other person what you were looking at, 754 00:48:55,600 --> 00:48:58,560 Speaker 1: so they could see from your point of view. You 755 00:48:58,600 --> 00:49:01,480 Speaker 1: could also overlay directions, so if you're walking down the street, 756 00:49:01,880 --> 00:49:04,040 Speaker 1: you could glance up at the screen and it would 757 00:49:04,040 --> 00:49:05,719 Speaker 1: tell you if you need to keep going straight, or 758 00:49:05,719 --> 00:49:07,840 Speaker 1: turn left or turn right, that kind of thing. It 759 00:49:07,880 --> 00:49:10,920 Speaker 1: was really useful. I had a pair of these Google 760 00:49:10,920 --> 00:49:15,080 Speaker 1: Glass and I really liked the direction they were going in. 761 00:49:15,320 --> 00:49:18,440 Speaker 1: I felt that it wasn't a fully realized product at 762 00:49:18,440 --> 00:49:21,560 Speaker 1: the time, and eventually Google agreed, and after a couple 763 00:49:21,600 --> 00:49:24,600 Speaker 1: of years they took Google Glass off the market entirely, 764 00:49:24,680 --> 00:49:28,240 Speaker 1: and now you can't get them anymore. They were clever, 765 00:49:28,560 --> 00:49:33,279 Speaker 1: but they were expensive, and they had some limitations. And 766 00:49:33,840 --> 00:49:35,880 Speaker 1: like I was saying earlier, you know, it's hard to 767 00:49:35,880 --> 00:49:38,920 Speaker 1: build all the components you need into one headset. So 768 00:49:39,040 --> 00:49:43,200 Speaker 1: Google Glass would communicate via Bluetooth with your smartphone, and 769 00:49:43,239 --> 00:49:45,880 Speaker 1: your smartphone would act as the actual nexus point to 770 00:49:45,960 --> 00:49:50,239 Speaker 1: the Internet.
But it was a neat idea, and I 771 00:49:50,360 --> 00:49:53,440 Speaker 1: enjoyed getting to use them while I did, so I 772 00:49:53,560 --> 00:49:57,879 Speaker 1: keep hoping to see a return of that kind of technology, 773 00:49:58,200 --> 00:50:02,080 Speaker 1: but perhaps in a more mature and less expensive format. 774 00:50:03,560 --> 00:50:07,360 Speaker 1: Now we've also seen applications similar to the ones we 775 00:50:07,480 --> 00:50:10,480 Speaker 1: mentioned earlier, the ones that are meant to guide people 776 00:50:10,640 --> 00:50:14,440 Speaker 1: into laying out or repairing a system. We've seen that 777 00:50:14,480 --> 00:50:17,680 Speaker 1: in the car world. Not too long ago, there was 778 00:50:17,719 --> 00:50:21,880 Speaker 1: the MARTA system introduced by Volkswagen. MARTA makes me chuckle 779 00:50:22,000 --> 00:50:25,320 Speaker 1: because that's also the name of Atlanta's public transportation system, 780 00:50:25,440 --> 00:50:28,520 Speaker 1: But in this case, it stands for Mobile Augmented Reality 781 00:50:28,560 --> 00:50:33,000 Speaker 1: Technical Assistance, and it's specifically designed for mechanics who are 782 00:50:33,040 --> 00:50:36,640 Speaker 1: working on the XL one vehicle. So if you hold 783 00:50:36,719 --> 00:50:40,120 Speaker 1: up an iPad that has this app on it, and 784 00:50:40,239 --> 00:50:42,759 Speaker 1: the camera is pointed at an XL one and you 785 00:50:42,800 --> 00:50:46,040 Speaker 1: look at the display, you'll see information overlaid on top 786 00:50:46,080 --> 00:50:50,040 Speaker 1: of the car, including labels for all the different parts. 787 00:50:50,080 --> 00:50:52,319 Speaker 1: So let's say you're a mechanic and you have to 788 00:50:52,360 --> 00:50:56,480 Speaker 1: do a specific repair on this vehicle. You hold up 789 00:50:56,520 --> 00:50:59,480 Speaker 1: the iPad, you look through the display, and you see 790 00:50:59,520 --> 00:51:01,279 Speaker 1: exactly what you need to do. 
It gives you a 791 00:51:01,320 --> 00:51:04,200 Speaker 1: set of instructions. It shows you what you need to do. 792 00:51:04,239 --> 00:51:06,560 Speaker 1: It tells you where you need to stand based upon 793 00:51:06,680 --> 00:51:09,400 Speaker 1: the angle of the view. So if you hold it 794 00:51:09,480 --> 00:51:10,920 Speaker 1: up and it says no, you need to move about 795 00:51:11,040 --> 00:51:12,839 Speaker 1: a foot to the right, you can do that. Then 796 00:51:12,880 --> 00:51:14,640 Speaker 1: hold up the iPad again and it'll say, all right, 797 00:51:14,640 --> 00:51:17,400 Speaker 1: you're in the right spot. Make sure you loosen this 798 00:51:17,480 --> 00:51:20,920 Speaker 1: particular bolt first, that kind of thing. And it's meant 799 00:51:20,960 --> 00:51:25,480 Speaker 1: to be an interactive maintenance guide, or really a maintenance 800 00:51:25,520 --> 00:51:28,960 Speaker 1: and repair guide. This is one of those applications of 801 00:51:29,040 --> 00:51:33,560 Speaker 1: augmented reality that I think is a no brainer. To me, 802 00:51:33,640 --> 00:51:37,279 Speaker 1: it's a killer app. The idea of having an ability 803 00:51:37,400 --> 00:51:42,160 Speaker 1: to work with something you are not one hundred percent 804 00:51:42,239 --> 00:51:46,279 Speaker 1: familiar with, but you're able to leverage the expertise of 805 00:51:46,360 --> 00:51:48,640 Speaker 1: people who either designed it or built it, or just 806 00:51:48,719 --> 00:51:53,480 Speaker 1: fully understand it, and get guidance based on their expertise 807 00:51:54,080 --> 00:51:57,080 Speaker 1: in real time, so you're not having to go and 808 00:51:57,280 --> 00:52:01,800 Speaker 1: consult an article about it or watch a YouTube video. 809 00:52:02,520 --> 00:52:06,400 Speaker 1: You get step by step instructions overlaid on top of 810 00:52:06,440 --> 00:52:11,320 Speaker 1: your view of that thing.
To me, that's the most 811 00:52:11,360 --> 00:52:14,960 Speaker 1: compelling use of augmented reality from a practical standpoint. There 812 00:52:14,960 --> 00:52:16,719 Speaker 1: are a lot of other uses that I'll talk about 813 00:52:16,760 --> 00:52:19,279 Speaker 1: towards the end that I think are also really super cool. 814 00:52:19,520 --> 00:52:21,239 Speaker 1: So don't get me wrong, it's not the only one. 815 00:52:22,920 --> 00:52:25,560 Speaker 1: But let's move on to twenty fifteen. That was when 816 00:52:25,640 --> 00:52:29,520 Speaker 1: Microsoft would unveil the HoloLens, something I still want 817 00:52:29,560 --> 00:52:31,279 Speaker 1: to try out. I have not had a chance to 818 00:52:31,320 --> 00:52:34,440 Speaker 1: try a HoloLens yet. That is a headset capable 819 00:52:34,480 --> 00:52:37,600 Speaker 1: of advanced AR applications, everything from what I was just 820 00:52:37,640 --> 00:52:41,000 Speaker 1: talking about, giving you guidance, step by step instructions on 821 00:52:41,000 --> 00:52:43,680 Speaker 1: how to do like a repair job on, say, an 822 00:52:43,719 --> 00:52:48,319 Speaker 1: electrical outlet. You could even use a Skype system to 823 00:52:48,480 --> 00:52:52,200 Speaker 1: call an expert who can then view your point of 824 00:52:52,280 --> 00:52:56,799 Speaker 1: view and interact with that point of view. So let's 825 00:52:56,800 --> 00:53:01,200 Speaker 1: say I'm looking at the outlet. The expert electrician I'm 826 00:53:01,200 --> 00:53:05,000 Speaker 1: talking to can see what I see, and he or 827 00:53:05,040 --> 00:53:09,640 Speaker 1: she can also make notes on the display, which shows 828 00:53:09,719 --> 00:53:12,600 Speaker 1: up in my field of view.
So he or she 829 00:53:12,680 --> 00:53:15,640 Speaker 1: might circle a specific wire and say you need to 830 00:53:15,760 --> 00:53:18,040 Speaker 1: remove that one first, and I know 831 00:53:18,400 --> 00:53:20,200 Speaker 1: I need to do that one first because I can 832 00:53:20,239 --> 00:53:22,319 Speaker 1: see which one they are talking about. Or they might 833 00:53:22,360 --> 00:53:26,400 Speaker 1: circle another wire and say, no matter what you do, 834 00:53:26,400 --> 00:53:32,279 Speaker 1: don't cut this wire, or the toilet upstairs will explode 835 00:53:32,480 --> 00:53:36,120 Speaker 1: like Lethal Weapon two. And I won't do that because 836 00:53:36,840 --> 00:53:39,880 Speaker 1: you know that guy's like three days from retirement. So 837 00:53:41,480 --> 00:53:43,480 Speaker 1: I have a heart. But no, this is 838 00:53:43,520 --> 00:53:48,440 Speaker 1: a really neat idea, having this interactive ability to overlay 839 00:53:48,480 --> 00:53:51,799 Speaker 1: the information from the world, the digital world, onto your 840 00:53:51,800 --> 00:53:54,640 Speaker 1: physical world. And beyond that, the HoloLens has lots of 841 00:53:54,680 --> 00:53:58,319 Speaker 1: other functions. It's not just something to do, you know, 842 00:53:58,719 --> 00:54:01,400 Speaker 1: home repairs around the house. You can also use 843 00:54:01,440 --> 00:54:05,440 Speaker 1: it for entertainment purposes, like you could create a screen 844 00:54:05,719 --> 00:54:09,640 Speaker 1: that can show you video from various sources and you 845 00:54:09,680 --> 00:54:14,120 Speaker 1: can assign it a place on a wall in your environment.
846 00:54:14,200 --> 00:54:17,879 Speaker 1: Let's say that you're in your living room and you 847 00:54:18,000 --> 00:54:20,640 Speaker 1: just create a screen so you can watch Netflix and 848 00:54:20,680 --> 00:54:23,720 Speaker 1: you slap it on a wall, and it will stay 849 00:54:24,120 --> 00:54:27,040 Speaker 1: in that same position relative to your point of view. 850 00:54:27,320 --> 00:54:29,080 Speaker 1: So if you look to the left or right, the 851 00:54:29,120 --> 00:54:32,319 Speaker 1: screen stays where you put it, as if it were 852 00:54:32,360 --> 00:54:34,960 Speaker 1: physically there on your wall. But keep in mind it's 853 00:54:35,120 --> 00:54:38,080 Speaker 1: just a virtual screen, and when you look back to 854 00:54:38,120 --> 00:54:40,640 Speaker 1: that part of your wall, you'll see the virtual screen 855 00:54:40,680 --> 00:54:43,000 Speaker 1: there playing whatever it was that you wanted to watch. 856 00:54:44,320 --> 00:54:46,560 Speaker 1: I think that's a super cool idea. And they've also 857 00:54:46,600 --> 00:54:50,600 Speaker 1: shown off games like a game of Minecraft that uses 858 00:54:50,640 --> 00:54:53,600 Speaker 1: HoloLens, so you can actually view a Minecraft world 859 00:54:54,360 --> 00:54:57,480 Speaker 1: sitting, appearing to sit at any rate, on top of 860 00:54:57,520 --> 00:55:00,759 Speaker 1: a table, so you can walk around the table and 861 00:55:00,880 --> 00:55:04,439 Speaker 1: view the Minecraft world from various angles and play that way. 862 00:55:05,239 --> 00:55:08,319 Speaker 1: I think that's super neat. Don't know how compelling it is, 863 00:55:08,360 --> 00:55:11,320 Speaker 1: because again I haven't tried it myself, but I really 864 00:55:11,440 --> 00:55:16,720 Speaker 1: like the idea. This year, twenty sixteen, AR got another 865 00:55:16,760 --> 00:55:20,960 Speaker 1: big boost from a little game called Pokemon Go.
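A quick sketch of why that pinned Netflix screen appears to stay put: the system stores the screen's position in world coordinates, and every frame it re-expresses that position in the viewer's current head-relative frame before drawing. This toy 2D example (yaw only, made-up numbers, and emphatically not actual HoloLens code) shows the conversion.

```python
import math

# Toy illustration of a world-anchored overlay: the virtual screen's
# position is stored in WORLD coordinates, and each frame we convert
# it into the viewer's head-relative frame. 2D, yaw only, to keep the
# idea visible. (A sketch of the concept, not a HoloLens API.)

def world_to_head(world_xy, head_xy, head_yaw_rad):
    """Convert a world-space point into head-relative coordinates."""
    dx = world_xy[0] - head_xy[0]
    dy = world_xy[1] - head_xy[1]
    # Rotate the offset by the inverse of the head's yaw.
    c, s = math.cos(-head_yaw_rad), math.sin(-head_yaw_rad)
    return (dx * c - dy * s, dx * s + dy * c)

# A screen pinned 2 meters straight ahead of the world origin.
screen_world = (0.0, 2.0)

# Looking straight ahead: the screen renders dead center.
print(world_to_head(screen_world, (0.0, 0.0), 0.0))  # → (0.0, 2.0)

# Turn your head 90 degrees: the same world point now sits off to the
# side in head coordinates, so the renderer draws it there, and to the
# wearer the screen looks like it never moved off the wall.
x, y = world_to_head(screen_world, (0.0, 0.0), math.pi / 2)
```

Because the anchor lives in world space rather than screen space, looking away and back recovers the screen in the same spot, which is exactly the behavior described above.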
Although 866 00:55:21,120 --> 00:55:24,360 Speaker 1: I have to admit this was a really primitive, basic 867 00:55:24,520 --> 00:55:28,680 Speaker 1: implementation of augmented reality. Really, it was not much more 868 00:55:28,760 --> 00:55:31,200 Speaker 1: than just a... In fact, it was nothing more than 869 00:55:31,280 --> 00:55:34,439 Speaker 1: just an animated overlay that would exist on top of 870 00:55:34,480 --> 00:55:38,799 Speaker 1: the camera view of your device. So let's say 871 00:55:38,840 --> 00:55:41,879 Speaker 1: I'm holding up my smartphone and I'm trying to catch 872 00:55:41,880 --> 00:55:45,560 Speaker 1: a Jigglypuff and the Jigglypuff is currently bouncing up and 873 00:55:45,680 --> 00:55:48,799 Speaker 1: down on the sidewalk in front of me. That's about 874 00:55:48,800 --> 00:55:52,040 Speaker 1: as far as the actual augmented reality experience would go. 875 00:55:52,320 --> 00:55:56,840 Speaker 1: So very primitive. But because Pokemon Go became so popular 876 00:55:56,960 --> 00:56:01,799 Speaker 1: so quickly, it really pushed the concept of AR to 877 00:56:01,880 --> 00:56:05,680 Speaker 1: the front of the minds of people everywhere, including business 878 00:56:05,719 --> 00:56:09,080 Speaker 1: owners who immediately said, we need an augmented reality app. 879 00:56:09,880 --> 00:56:12,880 Speaker 1: Whether they actually needed one or not is beside the point. 880 00:56:13,280 --> 00:56:16,120 Speaker 1: A lot of people got into AR because of Pokemon Go, 881 00:56:17,480 --> 00:56:20,080 Speaker 1: for both good and bad. I always think that you 882 00:56:20,120 --> 00:56:22,319 Speaker 1: have to come up with the experience first. You have 883 00:56:22,400 --> 00:56:27,440 Speaker 1: to understand why you need to use a specific strategy 884 00:56:28,000 --> 00:56:33,040 Speaker 1: to create a specific experience, and then build it. Not, hey, 885 00:56:33,440 --> 00:56:38,040 Speaker 1: we need augmented reality.
Make something that's AR. To me, 886 00:56:38,120 --> 00:56:41,160 Speaker 1: that's the backwards way of going about it. But what 887 00:56:41,239 --> 00:56:44,480 Speaker 1: do I know? I'm not a programmer. So I'm sure the 888 00:56:44,480 --> 00:56:47,960 Speaker 1: programmers feel a similar way to me, but that's 889 00:56:48,040 --> 00:56:51,960 Speaker 1: just a guess. Now, the future of AR depends heavily 890 00:56:52,040 --> 00:56:55,080 Speaker 1: upon the applications we see and which ones end up 891 00:56:55,080 --> 00:56:58,439 Speaker 1: being successful and which ones aren't. Right now, I would 892 00:56:58,520 --> 00:57:01,160 Speaker 1: say that the best bet is to see more AR 893 00:57:01,360 --> 00:57:06,640 Speaker 1: features built into smartphones and tablets, maybe not necessarily built 894 00:57:06,640 --> 00:57:10,839 Speaker 1: into them, but have apps available that create AR experiences 895 00:57:10,880 --> 00:57:15,160 Speaker 1: for very specific contexts, like let's say it's a museum app. 896 00:57:15,680 --> 00:57:18,160 Speaker 1: You might download a museum app on your phone, and 897 00:57:18,200 --> 00:57:20,600 Speaker 1: when you go to the museum and you use your phone, 898 00:57:20,920 --> 00:57:23,920 Speaker 1: you can get more information about the paintings and sculptures 899 00:57:23,960 --> 00:57:26,880 Speaker 1: and other installations that you see in the museum. That's 900 00:57:26,920 --> 00:57:29,919 Speaker 1: an easy one to understand. But that same app isn't 901 00:57:29,960 --> 00:57:32,280 Speaker 1: going to be useful once you leave the museum, since you 902 00:57:32,360 --> 00:57:35,600 Speaker 1: no longer have the context that it is tied to.
903 00:57:36,440 --> 00:57:39,000 Speaker 1: I think that smartphones are probably going to be where 904 00:57:39,040 --> 00:57:41,800 Speaker 1: the greatest development is going to be in the near term, 905 00:57:42,200 --> 00:57:46,280 Speaker 1: because wearables are still really hard to do. We still 906 00:57:46,320 --> 00:57:49,680 Speaker 1: don't have a consumer version of the HoloLens out, available 907 00:57:49,720 --> 00:57:53,000 Speaker 1: for anyone to purchase, and it may never come out 908 00:57:53,040 --> 00:57:55,920 Speaker 1: as a consumer product. Microsoft hasn't shown a whole lot 909 00:57:55,960 --> 00:57:58,720 Speaker 1: of interest in making it a consumer product. Maybe that 910 00:57:58,800 --> 00:58:01,360 Speaker 1: will change, but at the moment, I wouldn't hold my 911 00:58:01,400 --> 00:58:06,080 Speaker 1: breath. So I would argue smartphones and tablets are pretty 912 00:58:06,160 --> 00:58:10,000 Speaker 1: much where it's at. Maybe some implementation with some existing 913 00:58:10,560 --> 00:58:14,000 Speaker 1: VR headsets which have external cameras mounted on them as well, 914 00:58:14,120 --> 00:58:17,760 Speaker 1: like forward facing cameras; you could build AR experiences there. 915 00:58:17,920 --> 00:58:20,440 Speaker 1: Then it gets a little weird because you're also, you know, 916 00:58:20,440 --> 00:58:23,000 Speaker 1: you're looking at a monitor, so you're looking at a 917 00:58:23,080 --> 00:58:25,520 Speaker 1: video feed of your surroundings, and on top of the 918 00:58:25,560 --> 00:58:27,880 Speaker 1: video feed you get the overlay.
Same thing is true 919 00:58:27,880 --> 00:58:30,880 Speaker 1: for your smartphones and tablets, by the way, but differentiate 920 00:58:30,960 --> 00:58:34,120 Speaker 1: that from the Google Glass implementation, where you're looking at 921 00:58:34,160 --> 00:58:37,080 Speaker 1: the actual physical world, not a video representation of it, 922 00:58:37,640 --> 00:58:41,000 Speaker 1: but the real world. And then, because the display itself 923 00:58:41,040 --> 00:58:43,880 Speaker 1: that you are looking through is transparent, you're looking at 924 00:58:43,880 --> 00:58:47,800 Speaker 1: a transparent overlay of digital information that gives you more 925 00:58:48,400 --> 00:58:51,400 Speaker 1: info about the world you are in. I hope you 926 00:58:51,480 --> 00:58:55,360 Speaker 1: enjoyed that classic episode from twenty sixteen, Augmenting your Reality. 927 00:58:55,800 --> 00:59:00,840 Speaker 1: I've had a lot of experience using various augmented reality apps, 928 00:59:01,480 --> 00:59:06,080 Speaker 1: and I've always found them really intriguing and interesting and 929 00:59:06,240 --> 00:59:10,760 Speaker 1: potentially really useful, but I haven't actually made use of 930 00:59:10,840 --> 00:59:16,439 Speaker 1: one that I felt was, you know, really necessary or 931 00:59:16,960 --> 00:59:19,640 Speaker 1: added a whole lot of value to whatever the experience was. 932 00:59:19,800 --> 00:59:24,360 Speaker 1: I could see the potential, but it just didn't quite click. 933 00:59:24,720 --> 00:59:27,360 Speaker 1: And it may very well be that's because I'm using 934 00:59:27,840 --> 00:59:31,880 Speaker 1: the wrong equipment and probably the wrong apps. But I 935 00:59:31,960 --> 00:59:37,480 Speaker 1: definitely see the potential for AR. I just haven't experienced 936 00:59:37,520 --> 00:59:41,000 Speaker 1: it being really transformational.
I hope that actually changes, because 937 00:59:41,000 --> 00:59:43,800 Speaker 1: I really do think this is a technology that could 938 00:59:43,800 --> 00:59:47,360 Speaker 1: potentially do a lot of good in a lot of 939 00:59:47,360 --> 00:59:51,479 Speaker 1: different applications. That's it for this classic episode. I hope 940 00:59:51,520 --> 00:59:53,880 Speaker 1: you are all well, and I'll talk to you again 941 00:59:54,640 --> 01:00:04,640 Speaker 1: really soon. Tech Stuff is an iHeartRadio production. For more 942 01:00:04,720 --> 01:00:09,440 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or 943 01:00:09,480 --> 01:00:11,400 Speaker 1: wherever you listen to your favorite shows.