Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? So I have an interesting topic today. An artist named Alicia Framus plans to do something unprecedented. She plans to get married to an artificially intelligent hologram, and that's sort of her words, not mine. And there is a lot to unpack there. I've got a lot of questions. Now, some of those questions are actually technical, right, some of them are ethical, some of them are just downright skeptical. So, for example, what does she mean by hologram? That is a term that really means a specific thing but frequently gets misused to mean other things. And then, who developed the AI, and how was it trained? And what does she specifically mean by AI? Are we talking strictly generative AI? So essentially, you know, a chatbot-type thing like, you know, OpenAI's ChatGPT. Is that what she means by AI? Does she mean something beyond just a generative AI? Is it AI that would have some form of control over the hologram? I mean, presumably that would happen, right? Like, the AI would be able to determine where the hologram could be directing its quote unquote attention, all that kind of stuff. Like, how is the AI supposed to sense her? What elements, what technical elements are there, so that the hologram AI has a quote unquote awareness of where she happens to be? You know, if you're with an actual other person, they can direct their attention toward you naturally, but a hologram can't do that on its own, right? You would have to pair it with something else, like a camera system, and have some computer vision elements to process that in order for the hologram to face the correct way. Like, there are a lot of technical questions that go along with this.
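To make that last question concrete, here is a minimal sketch, my own illustration and not anything Framus has described, of how a camera plus off-the-shelf computer vision could give a rendered figure a quote unquote awareness of where someone is. The field-of-view value and the avatar-control step are assumptions.

```python
# A minimal sketch (not Framus's actual system): use a webcam and
# OpenCV's bundled face detector to estimate where a person is, then
# turn that into a yaw angle a hypothetical avatar renderer could use.
import cv2

HORIZONTAL_FOV_DEG = 60.0  # assumed camera field of view

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def gaze_yaw(frame):
    """Yaw (degrees) to turn the avatar toward the largest face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # biggest face wins
    offset = ((x + w / 2) - frame.shape[1] / 2) / (frame.shape[1] / 2)
    return offset * (HORIZONTAL_FOV_DEG / 2)   # crude pinhole mapping

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("turn avatar by", gaze_yaw(frame), "degrees")
cap.release()
```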
Speaker 1: Also, to what extent, if any, will any agency or authority recognize this union? If she is to get married to a hologram, what does that mean? I don't think it actually means getting married in a legal sense. It is more of a symbolic thing, which is fine. This is an artist we're talking about, and in fact, the purpose of her art is largely to ask questions that we don't necessarily have the answers to. So this isn't just me being frustrated. I mean, some of it is frustrating, but the point of the whole exercise is to start asking questions like, what does this mean for the future? Because Alicia, on her website, or someone on her behalf, points out that in the future we are likely to see cases of people wanting to have actual relationships with artificial constructs. I mean, we've already seen examples of that before with AI boyfriends and girlfriends, to varying degrees of success, and so it's really just an extension of that. Now, keeping in mind that Framus has made a career out of exploring the depths of human experience, including loneliness, it's only fair that we acknowledge the artistic element of this event, which is what I was saying a second ago. I mean, you could call this just a stunt, and to some extent I could see that being a fair assessment. But you know, we also have to admit that AI is playing an increasingly large role in our lives, for good and for ill, and keeping in mind AI means lots of stuff, right? It's not just generative AI. We see AI being incorporated all over the place. In some cases it's not nearly as obvious, but it's certainly there. And like it or not, things like AI and robotics are going to play a part in physical relationships in the future as well.
Speaker 1: We humans are complicated creatures, and sometimes that means we might find it challenging or even impossible to find another human with whom we can share ourselves, and perhaps it might be easier to engineer a simulation of a person to fill those needs. This, of course, is not a groundbreaking idea. There are plenty of science fiction stories that have explored this concept thoroughly. There's the phenomenal science fiction film from twenty fourteen titled Ex Machina, in which a narcissistic tech billionaire brings a computer programmer to a remote mansion to judge whether a female-presenting robot has self-awareness and true intelligence. That story has elements of romantic feeling between a human and a robot. Or there's the equally phenomenal nineteen eighty-seven film Making Mister Right, in which John Malkovich plays both an antisocial human engineer and his robotic duplicate, who seems to actually be more human than his human creator. And I'm being a little cheeky when I say that Making Mister Right is equal to Ex Machina. I don't actually believe that, but it is a movie, and it does explore this idea from a romantic comedy perspective. Anyway, most of these science fiction stories at some point explore what it means to be human, or what intelligence actually means, whether an artificial construct can experience emotional connections, what effect this could have on humanity as a whole, and a host of other philosophical and ethical concepts. And based on what I've read, Alicia Framus is looking at the issue from a very human perspective: can we seek and receive emotional fulfillment from an AI creation? But before we get into all that, let's talk about the tech, because this is TechStuff. We need to talk about holograms and artificial intelligence in general. And this is going to be tricky, because in most of the articles I've come across, there's a distinct lack of technical information about Framus's project.
Speaker 1: There are plenty of photographs in which Framus appears with a ghostly image of a man. In fact, I can't help but walk away with the impression that Framus is engaged to a Force ghost from the Star Wars universe, because that's what it looks like. But these are clearly manufactured images, right? These are not actual pictures of her with a hologram. They're just pictures that go along with the concept of her artistic project. There's not really a roguish holographic gentleman posing with Framus in a courtyard, because that would be very hard to pull off. Any sort of three-dimensional projection would require a lot of equipment, and you couldn't just have some hologram spontaneously appear in any given location. That's not how they work. It's not like, you know, in Quantum Leap or Star Trek, or, you know, Red Dwarf, probably the best example. There's no way for a hologram to just be free-roaming, and of course it wouldn't be a hologram anyway, but we'll get into that. It's not really a hologram. So according to numerous pieces, and I suspect all of these pieces trace back to a single press release on the matter, because a lot of it seems to be pulling from Alicia Framus's own website, the figure that will accompany Alicia Framus at the wedding ceremony this summer in Rotterdam will be a quote unquote holographic sculpture. Now, that can mean lots of different things. It could mean that there's an actual physical figure, like a blank mannequin, and this hologram, quote unquote, will be projected or otherwise displayed on the figure, sort of like a three-D projection mapping thing. Because again, it's really hard for me to imagine a freestanding, ephemeral hologram made up of only light saying "I do." It's not that it would be impossible, but you would have to put in a whole lot of work to create the projectors. And maybe that is what's going to happen. I mean, this is a big artistic project, so maybe there will be the equipment.
Speaker 1: But to do this, you need a lot of control over your environment in order for it to have the right effect, because if you have too much ambient light, it's really going to mess things up. So let's talk about light and holograms. Now, this is a very complicated topic, and I want to say that, you know, back when I worked at HowStuffWorks dot com, our senior writer, our head writer, Tracy Wilson, who is now one of the co-hosts of Stuff You Missed in History Class, she nearly went bananas working on this article. She did a phenomenal job, by the way. No shade on Tracy. She did far better writing this article than I would have. And I say that because it's very challenging to break down a hologram into a simple concept that you can explain, because it's one of those things where it's like an onion. In order to understand the hologram, you peel away a layer and you say, well, we really need to talk about interference, and we need to talk about things like reflection and refraction. We need to talk about how light travels as a wave or a particle. Then you strip it away again. You start getting further and further down, and you realize, oh, my starting point is so far back that this article is going to last fifty pages, or this podcast will go four hours. We're not gonna do that. We're going to take a much more surface-level approach, because, one, it would take me a lot more time to really get my head wrapped around it. While I did take physics in school, and I loved physics, it has been like thirty years since I took those courses, and I would have to do a whole lot of research in order to knock the ring rust off my brain, and then I could maybe tackle it. So when we talk about holograms, it means a specific thing, and one way to really break it down is to start with how a normal photograph works, to contrast it with a hologram. So let's talk about photography.
Speaker 1: With photography, you've got your camera and you point your camera at something interesting. Let's say it's your favorite tech podcaster ordering a burrito at a food truck, something that happens on a fairly frequent basis. And here's what happens when you push the button to actually take the picture. Light from the scene, light reflecting off of your subject, in this case a tech podcaster ordering a burrito, reflects off of those objects and enters the camera through the lens, and the lens focuses the light to a point that's actually behind a shutter. So when the shutter's closed, no light can pass beyond that. But when you push the button, the shutter opens, and this shutter could be mechanical or it could be electronic. It doesn't really matter for our purposes. The point is that it allows light to come through. The light moves through an opening called the aperture, and then it passes on to hit a recording medium, which could be film in the old days, or it could be a digital sensor. Probably a digital sensor, unless you happen to like shooting on film, and maybe you do. It's pretty cool. I enjoy it. The medium makes a record of the light that entered from the scene, and what you end up with is a two-dimensional image of whatever it was you were taking a photo of, in this case a tech podcaster ordering a burrito. Now, as you're aware, this image is limited to a single angle, you know, the angle you were at when you took the photo. You can't turn the image around in the photo and see what you, I'm sorry, I mean your favorite tech podcaster, looks like on the opposite side. You can't flip it around and say, oh, and here's the backside of that photograph. You're just limited to that single perspective.
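To put that single-perspective limitation in code, here is a toy pinhole-camera projection, purely my own illustration: two points at different depths along the same ray land on the same spot in the image, and once the photo is taken, that lost depth can't be recovered by moving your head.

```python
# Toy pinhole projection: (X, Y, Z) -> (f*X/Z, f*Y/Z). Depth is discarded.
import numpy as np

def project(points_3d, focal_length=1.0):
    pts = np.asarray(points_3d, dtype=float)
    return focal_length * pts[:, :2] / pts[:, 2:3]

near_point = [1.0, 0.5, 2.0]   # burrito-range distance
far_point  = [2.0, 1.0, 4.0]   # same ray from the lens, twice as far

# Both land on the same 2D point; the photo can't tell them apart.
print(project([near_point, far_point]))   # [[0.5 0.25] [0.5 0.25]]
```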
Speaker 1: Now, let's go a step further and talk about three-D photography. One way to achieve a three-D effect would be to use two cameras, and these two cameras would be slightly offset from each other, similar to how your eyes are offset from each other. Right? You have one toward the left, one toward the right. So when you're looking at something, your brain is taking in visual information from two slightly different angles and then combining that into a single image in your brain. And the two cameras will take a picture at the same time from slightly different perspectives, kind of mimicking that physical effect. You then present this pair of images that are offset from one another, and you do it in such a way that the viewer will get a different image for each eye. And you can use various filters to do this. So in the old days, back in the mid-twentieth century, like the nineteen fifties, you might use a real cheap pair of three-D glasses where you have a red filter for one eye and a blue filter for the other eye. So let's say it's red for the right, blue for the left. One of the sets of images would be tinted blue, which would be blocked by the blue lens, so that eye, the left eye, would not see that set of images, but the right eye would. The other set of images would be tinted red, so your right eye, behind the red filter, would not see those images. They'd be filtered out by the lens in front of your eye. So your left eye would get one set of images, your right eye would get the other set of images, your brain would combine the two in your head, and voila, you have what appears to be a three-dimensional shape in front of you.
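Here is a minimal sketch of that color-filter trick in code, my own illustration using the common red/cyan packing: one eye's view goes into the red channel, the other eye's view into the green and blue channels, and the tinted lenses pull them back apart.

```python
# Pack two offset grayscale views into one anaglyph image.
import numpy as np

def make_anaglyph(left_gray, right_gray):
    """left_gray/right_gray: same-shape 2D uint8 arrays taken from
    two slightly offset viewpoints."""
    h, w = left_gray.shape
    out = np.zeros((h, w, 3), dtype=np.uint8)
    out[..., 0] = left_gray     # red channel   -> one eye's image
    out[..., 1] = right_gray    # green channel -> the other eye
    out[..., 2] = right_gray    # blue channel  -> the other eye
    return out

# Fake stereo pair: the "right" view is the "left" view shifted 4 pixels.
left = np.tile(np.linspace(0, 255, 64, dtype=np.uint8), (64, 1))
right = np.roll(left, 4, axis=1)
print(make_anaglyph(left, right).shape)   # (64, 64, 3)
```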
Speaker 1: Now, these days you're more likely to use a different method, like polarized light. You would have light polarized one way so that the left eye would see one set of images, and polarized a different way so the right eye would see the other set of images, and again your brain combines them. The brain part is the same; it's just how you end up filtering out one set versus another. So that's how most three-D films work these days. You can also have active three-D glasses. These have electronic shutters in them, and they're timed so that they turn off and on in time with the sets of images that are displayed on a screen. So the left eye gets one set, the right eye gets another set, and what's actually happening is your screen is alternating these sets, but doing so at a very, very high frequency that you wouldn't necessarily notice without the glasses on. Now, the point is, this approach creates a three-D effect, but you're still working under limitations, because while the image will appear to have depth to it, like it'll look like it's coming off the screen, or that the screen itself has depth, you can't walk around the image and view it from different angles. It can look like you might be able to do that, but if you were to get up and move to a different part of the room, you would not actually have a change in perspective. It would still be the same. It would still have depth, but it wouldn't have changed perspective at all. There's no parallax, and parallax is, quote, "the apparent displacement or the difference in apparent direction of an object as seen from two different points not on a straight line with the object," end quote. That's according to Merriam-Webster. So to achieve parallax, you would need a way to project a hologram, which we will talk about after we take this quick break.
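To hang a number on parallax before we move on: in an idealized two-camera rig (the values below are assumptions for illustration), the apparent shift of an object between the two views, the disparity, is larger the closer the object is, and depth falls out as Z = f × B / d.

```python
# Idealized stereo parallax: depth Z = f * B / d, where f is the focal
# length in pixels, B the camera/eye separation, and d the disparity
# (pixel shift of the object between the two views). Values assumed.
f_pixels = 800.0      # focal length
baseline_m = 0.065    # roughly the spacing between human eyes

for disparity_px in (40.0, 20.0, 10.0):
    depth_m = f_pixels * baseline_m / disparity_px
    print(f"disparity {disparity_px:5.1f} px -> depth {depth_m:.2f} m")
# Bigger shift means a nearer object. A 3D movie bakes one fixed
# disparity into the frames, which is why moving your seat changes nothing.
```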
Speaker 1: All right. To talk about holograms, we do have to understand a little bit about how light works: it travels in waves, but it also can travel as a particle. We know that because of quantum physics, but we're going to leave that for now. We're just going to talk about the waves. So waves of the same type can interact with each other, right? You see this in water. So here's a classic example, keeping in mind that water and light behave in different ways, but it's similar enough that we can kind of make this analogy. If you have a still pond and you've got two pebbles, and you drop those two pebbles where they are a couple of feet away from each other, when they make contact with the water, they obviously each cause ripples to radiate outward from the point of contact. And because you dropped these two pebbles fairly close to each other, the ripples from one are going to interact with the ripples of the other, and they will end up interfering with one another. Well, the same sort of thing happens with light. So, to create a traditional hologram on film, and that's one way to create a hologram, the basic approach is to use a laser and a beam splitter. You put whatever subject it is that you want to capture as a hologram, let's say it's a vintage nineteen eighties action figure of Mister T, in line with the path for part of this laser beam. So you direct this laser beam through the beam splitter, which, as its name implies, splits the beam into two identical lasers. One of those beams follows a path and is directed by mirrors to hit Mister T. This is the object beam. It's the object beam because it's the beam that actually interacts with the object. Some of the light reflects off of Mister T and hits your recording medium, our holographic film in this case. The film gets hit by this light and captures the information.
Speaker 1: Now the other beam, because remember we split it into two, the other beam goes down a different path from the first one, and mirrors redirect it so that this beam hits the recording medium directly. It does not interact with Mister T. It pities the fool. Instead, it just hits the holographic film. So this is what we call the reference beam, and the reference beam and the object beam interact at the medium. There's interference created by these two beams, and the result is that the recorded data on that film allows for the creation of the hologram. It contains the information. And after developing the film and illuminating it in a specific way, which depends upon your method of creating the hologram in the first place, you can create an image that appears to have three dimensions. Now, we're talking about film here, right? So we are still talking about a picture. The picture itself is effectively two-dimensional. Technically it has three dimensions, but it's thin, so you don't really think of it that way. It's like a piece of paper, right? But the image on this photograph would appear to have three dimensions. Like, you would stand in front of the photograph and you'd be able to lean to the left or lean to the right and see different angles of the imaged object. But you couldn't do a full three-hundred-and-sixty-degree walkaround of it, because again, it's a photograph, right? It's just like a regular picture. If you were to walk around it, you would just see the back of the photograph. It would just be a blank surface. But from the front, from the viewing angles, you could see different three-dimensional aspects of this object. And that's a really cool effect. That is a hologram.
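Here is a small numerical sketch of what the film is actually recording, my own illustration with assumed values: two coherent beams meeting at an angle lay down an interference fringe pattern, and it's that pattern, not an ordinary image, that encodes the hologram.

```python
# Two-beam interference across a strip of "film": intensity fringes.
import numpy as np

wavelength = 633e-9              # a helium-neon laser line (assumed)
k = 2 * np.pi / wavelength       # wavenumber
angle = np.deg2rad(30)           # angle between object and reference beams

x = np.linspace(0, 20e-6, 2000)              # 20 micrometers of film
reference = np.ones_like(x, dtype=complex)   # reference beam, head-on
obj = np.exp(1j * k * np.sin(angle) * x)     # tilted object beam

intensity = np.abs(reference + obj) ** 2     # what the film records
print(f"intensity swings {intensity.min():.1f} to {intensity.max():.1f}")
print(f"fringe spacing ~ {wavelength / np.sin(angle) * 1e6:.2f} micrometers")
# Micrometer-scale fringes: hence lasers, stable optics, and very
# fine-grained film rather than an ordinary camera sensor.
```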
Speaker 1: And you might say, well, that doesn't sound like a freestanding hologram, not the way that Alicia Framus is talking about. And it's not, because a lot of the stuff that gets billed as a hologram isn't really a hologram at all. It's something else, and it could be one of many things. Think about, like, the hologram of Tupac Shakur at the twenty twelve Coachella festival. Now, Tupac Shakur was killed in nineteen ninety six, so obviously a tragic moment, and this was like a triumphant moment, to see Tupac Shakur performing in front of a live crowd at twenty twelve's Coachella. And it was said to have been a hologram, like a holographic image of Shakur performing in front of the crowd. It wasn't a hologram. Instead, it was an example of a very old effect called the Pepper's ghost illusion. It actually traces its history back to the nineteenth century, and it takes its name from a guy named John Henry Pepper, who was an inventor in the United Kingdom. It involves a few different elements. So you have your audience, who happen to be in a very specific place, right? Like, you can't do Pepper's ghost in theater in the round; it wouldn't work. But if your audience is in a specific spot relative to the stage, you can do it. So you need your staging area, as in the actual place the audience is viewing. Then you need the hidden staging area that's outside of the audience's view. And then you have to have an angled, transparent, and reflective surface at a forty-five-degree angle, and that's important. You need to be able to see through the surface, because otherwise you're just looking at a screen, right? You need it to be reflective, or else light isn't going to be able to interact with it in a meaningful way. And it has to be angled in order to get the proper effect for the audience. So the angled surface, which typically is something like a pane of very, very clean glass, acts as a sort of reflective screen, and performers in the hidden room are very well lit. So again, this hidden room, the audience can't see it.
Speaker 1: It's off to the side, and it has a straight line to the reflective surface, but not to the audience's point of view. And it's very well lit, because it needs to be bright enough so that the glass, the angled glass, captures the reflection of the performers. The audience can then see the reflections, and because the reflections are on transparent glass, and because they can see through the glass, or whatever other material is being used, they get the effect of these figures being incorporated into the scene that's on the stage. Those figures can also appear insubstantial, and you get this ghost effect. So depending upon how you've lit the stage and the figures that are in the hidden room, they might look transparent, ghostly. You could even have other actors on stage, like live actors on stage within the audience's view, and they can appear to interact with the ghostly figures that are also present. So it's a pretty cool effect. If you've ever ridden the Haunted Mansion ride at Disneyland or Disney World, you've seen this effect in action. There's a large ballroom scene that features numerous ghosts, and these ghosts fade in and out of view in front of your eyes. They are actually reflections created by Pepper's ghost. Above and below your Doom Buggy track, there are staging areas with these figures in them, so they're above you and below you, and the areas they're in have lights that turn on and off. So when the lights come on, you can see the reflection of the figures. When the lights go off, the reflections disappear. So the effect you have is that these ghostly images manifest and then disappear in front of your very eyes. In a similar way, special effects folks can create the illusion of a person being at an event, even if they happen to have shuffled off this mortal coil, or maybe they're not even a real person. Maybe it's an animated character.
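For the classic glass version, the geometry is simple enough to sketch in a few lines of code, with a made-up layout: the audience sees the hidden performer's mirror image in the forty-five-degree pane, so the ghost appears to stand out on the stage.

```python
# Reflect the hidden performer's position across the 45-degree glass to
# find where the "ghost" appears to stand. Layout values are invented.
import numpy as np

def reflect(point, plane_point, plane_normal):
    """Mirror a 3D point across the plane (point, unit normal)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return point - 2 * np.dot(point - plane_point, n) * n

glass_point = np.array([0.0, 0.0, 0.0])      # pane passes through origin
glass_normal = np.array([0.0, 1.0, -1.0])    # tilted 45 degrees

performer = np.array([0.0, -2.0, 0.0])       # 2 m down, in the hidden pit
print("apparent ghost position:",
      reflect(performer, glass_point, glass_normal))
# -> [0. 0. -2.]: at stage height, 2 m beyond the glass, right where
# a live actor could appear to shake hands with it.
```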
Speaker 1: So rather than a pane of glass, the effects artists might use what's called a Pepper's scrim surface. This is a metallic gauze that you can see through. In fact, it's invisible to the naked eye, and it acts as a reflective screen for projections. And these projections typically are digital rather than a reflection of a physical object. So instead of having a hidden room where you've got a light that is allowing the reflection to hit this surface, you've got a projector, and you're just projecting onto the surface directly. That's how Tupac showed up at Coachella. It's also how various anime singers are able to appear at real-world concerts. And these are not holograms, right? It's not any different, honestly, from using just a projector and a screen, because like a projector and a screen, you can't walk around it. You can't view it from different angles and get that parallax effect. And, you know, if you did try to walk around it, you wouldn't have the experience of looking at a three-dimensional figure. But folks often will refer to these projections as holograms. I'm here to remind you that words mean things, and that when we misuse words, we just create confusion. Now, I don't know if what Framus intends is going to be a Pepper's ghost illusion. I don't know if, quote unquote, "he"... I don't know what pronoun to use. I don't know if it is going to be a Pepper's ghost illusion. It's possible she could use something like a volumetric display. It's unlikely, but she could. This is actually a pretty broad category of technologies, and in fact, I would need to do an episode just on volumetric displays to really tackle it properly. But generally speaking, they allow for the display of three-dimensional images without the audience needing to wear special glasses or headgear. So sometimes they call it, you know, glasses-free three-D. But the point is, there's still a display, right? There's still a physical display in play here.
Speaker 1: It's a medium upon which light reflects. And it could be that the volumetric display is actually in motion. It's spinning or otherwise moving rapidly, far faster than we can see. But the medium is moving in such a way as to reflect light at the specific locations it needs to in order to create the illusion of a three-dimensional image. So it's still a surface. And I say this only to point out that volumetric displays can be really impressive, but they're also stationary, right? They're not something, again, that you could just have as a free-roaming three-dimensional hologram wandering around your house. You would have to have very specific locations where the hologram would be able to appear. The same could be said for so-called solid light projections. These are also really cool, but again, you need to have the projectors to make them work, and they aren't gonna go anywhere. These are big, big pieces of equipment. They are able to project light in such a way that they can create the illusion of a three-dimensional object appearing within a physical space, and in order to do that, you have to have very specific control of the lighting environment, otherwise you're going to have issues. And you also have to have these physical projectors set up at specific locations in order to achieve the effect you want. And it's very limiting. It's a cool effect. I've seen videos of them, never seen one in person, by the way, I've just seen video of them in action. But they work for, like, a permanent exhibit, right? Like, if you were to go to a museum, a science museum or something that had one of them, you could see a really cool effect, but you wouldn't see that object moving around the whole space.
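Some back-of-envelope arithmetic, with assumed values, gives a feel for that "far faster than we can see" claim about a swept, spinning volumetric display:

```python
# Swept-volume display arithmetic: the surface must redraw the whole
# volume often enough to look steady. Both figures below are assumptions.
volume_refresh_hz = 30      # full-volume redraws per second
slices_per_rev = 360        # angular slices drawn per revolution

rpm = volume_refresh_hz * 60                    # one volume per revolution
slice_rate = volume_refresh_hz * slices_per_rev
print(f"{rpm} RPM, {slice_rate:,} slice updates per second")
# -> 1800 RPM and 10,800 slices/s, all to make an image that still
# can't step away from the spinning surface.
```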
Speaker 1: You would have to go through a lot of trouble to set up all the hardware you needed in order to create multiple locations that a quote unquote hologram could travel to. In order for you to have, like, a realistic interaction with a spouse in your home, you wouldn't be able to have your holographic spouse accompany you while you pop out to the shops to get your bits and bobs. That just wouldn't happen. And like I said, I could do a whole series of episodes on things like volumetric displays and solid light, but we still need to talk about the AI element in Framus's proposal. The holographic element is just one half of it, and again, we don't have enough information to really tackle it in a meaningful way, except to say that when she says hologram, she probably doesn't actually mean hologram. According to the pieces I read, the AI is supposed to draw from Framus's own romantic past, and she says it's supposed to incorporate elements and characteristics from people in her past relationships. Now, to achieve that, the engineers have to train the AI on people that Framus formerly had romantic relationships with, and to do that, they're going to turn largely to social platforms. Okay, we're going to take another quick break. When we come back, we're going to tackle what this means and the ethical implications that accompany it. But first, let's thank our sponsors. Okay, so Alicia Framus plans to train this AI on the social platforms of her past romantic relationships. This raises some pretty serious ethical questions, at least for me. Is it ethical to create an artificially intelligent construct based off of real people? Now, I don't know if her exes have given their consent to this, like if they said, oh, it's fine if you train this AI on my social platforms. I hope that they've given their consent, because otherwise this gets really creepy really fast. Let me just give you a simple example.
505 00:29:25,320 --> 00:29:31,120 Speaker 1: Let's say that there's a creepy person who has developed 506 00:29:31,280 --> 00:29:39,200 Speaker 1: an obsessive attraction toward someone else. And this creepy person, 507 00:29:39,360 --> 00:29:42,600 Speaker 1: for whatever reason, either feels that they have no shot 508 00:29:43,120 --> 00:29:47,240 Speaker 1: with this object of their desire, or they legitimately don't 509 00:29:47,240 --> 00:29:49,400 Speaker 1: have a shot. Maybe the object of their desires already 510 00:29:49,400 --> 00:29:52,200 Speaker 1: in a committed relationship and has no interest in pursuing 511 00:29:52,520 --> 00:29:56,959 Speaker 1: anything else. Whatever the case, so the creepy person has decided, 512 00:29:57,000 --> 00:29:58,880 Speaker 1: you know what I'll do is I'll just create an 513 00:29:59,000 --> 00:30:03,960 Speaker 1: AI version of this person, and I'm going to train 514 00:30:04,040 --> 00:30:07,520 Speaker 1: the AI on this person's social media and I'm going 515 00:30:07,560 --> 00:30:11,520 Speaker 1: to create a copy of them that gets pretty creepy, right, 516 00:30:11,600 --> 00:30:17,400 Speaker 1: Like it's removing consent, it's removing agency from the object 517 00:30:17,400 --> 00:30:21,040 Speaker 1: of desire. You could argue, well, ultimately the creepy person 518 00:30:21,120 --> 00:30:25,440 Speaker 1: is developed is completely committing their focus to a construct 519 00:30:25,480 --> 00:30:27,960 Speaker 1: and not the actual person. I'm not sure that that's 520 00:30:28,000 --> 00:30:33,440 Speaker 1: healthy either, but it just it gives off some uncomfortable vibes. 521 00:30:33,800 --> 00:30:36,160 Speaker 1: And also we have to keep in mind that the 522 00:30:36,240 --> 00:30:40,280 Speaker 1: person we present on social media, right, the version of 523 00:30:40,360 --> 00:30:43,320 Speaker 1: us that we present to the world often is not 524 00:30:43,520 --> 00:30:46,360 Speaker 1: a true reflection of who we are. Right, A lot 525 00:30:46,360 --> 00:30:50,479 Speaker 1: of folks will put forth their best self on social 526 00:30:50,560 --> 00:30:53,280 Speaker 1: media and the leave out stuff that perhaps is maybe 527 00:30:53,400 --> 00:30:58,640 Speaker 1: less complementary. And this is just a natural kind of 528 00:30:59,200 --> 00:31:01,200 Speaker 1: thing that a lot of let's go with. Right. We 529 00:31:01,320 --> 00:31:04,200 Speaker 1: get on these social platforms and we want to show 530 00:31:04,280 --> 00:31:08,920 Speaker 1: how exciting our day is, how amazing, this meal is, 531 00:31:09,240 --> 00:31:12,400 Speaker 1: this gorgeous location we happen to be at, But we're 532 00:31:12,440 --> 00:31:16,520 Speaker 1: not showing all the more mundane, humdrum stuff or the 533 00:31:16,520 --> 00:31:18,800 Speaker 1: negative stuff that's going on in our lives not nearly 534 00:31:18,840 --> 00:31:23,400 Speaker 1: to the same extent, So any AI trained version of 535 00:31:23,680 --> 00:31:27,160 Speaker 1: a person is likely to be a poor representation or 536 00:31:27,160 --> 00:31:31,920 Speaker 1: at least an incomplete representation of that person. So there's 537 00:31:31,920 --> 00:31:37,120 Speaker 1: some real gray areas here already about creating an AI 538 00:31:37,280 --> 00:31:39,440 Speaker 1: that is at least in part based on a person 539 00:31:39,560 --> 00:31:43,479 Speaker 1: or multiple people. 
Speaker 1: But anyway, for Alicia Framus's project, the AI is being called AILex, or Alex, or Ai-lex, or just Alex, I don't know. Although, arguably, this is not just the name of the AI. It's the name of the part-and-parcel package of the AI and the quote unquote holographic sculpture. Beyond the ethics, I really wonder how effective AILex is going to be as a simulation of a person, because we have seen some phenomenal strides in generative AI capabilities over the last few years, but we've also seen, time and again, how these technologies can run up against limitations. Right? Whether it's an AI construct going off the rails by making offensive content, like, I'm reminded of when Microsoft launched an AI tool that it took down in less than twenty-four hours because it started spouting off some terrible stuff, due to people specifically making it do that. Or we get cases of AI hallucinating, or confabulating if you prefer that term, which is when they start producing incorrect answers to queries. You know, rather than saying, "I don't know the answer to that," they just invent one based upon statistical probabilities of which words should follow in a sentence. We're all familiar with cases in which AI just makes things up, so one wonders how that could play out in the context of a relationship. Now, on the flip side, you could argue, yeah, sure, AI could make some really bad mistakes and start making stuff up when it doesn't really know what it's talking about. But then, human beings do that too. Being in a relationship with another human being is bound to include times when you are at odds with one another, whether it's a misunderstanding or a disagreement or an out-and-out fight. These things do happen. So you could argue, why should we hold AI to a higher standard than we would hold an actual human person? And I don't really have an answer to that.
I just know that, 573 00:33:40,840 --> 00:33:44,080 Speaker 1: you know, when we talk about AI, we know about 574 00:33:44,120 --> 00:33:48,560 Speaker 1: the shortcomings of AI, and so knowing that, and thinking 575 00:33:48,640 --> 00:33:52,400 Speaker 1: about how that could impact an implementation of AI in 576 00:33:52,440 --> 00:33:55,640 Speaker 1: which it's supposed to stand in for a romantic partner, 577 00:33:56,000 --> 00:34:00,000 Speaker 1: it raises these questions. Now, according to Alicia Framus's own website, 578 00:34:00,160 --> 00:34:04,480 Speaker 1: the artist is quote passionate about dedicating her soul and 579 00:34:04,520 --> 00:34:07,600 Speaker 1: body to science and art in order to develop deeper 580 00:34:07,640 --> 00:34:14,360 Speaker 1: relationships and assist individuals with congenital or acquired diseases, physical disabilities, 581 00:34:14,680 --> 00:34:19,120 Speaker 1: gender imbalances in certain countries, social requirements and stereotypes in 582 00:34:19,200 --> 00:34:22,760 Speaker 1: different cultures, as well as those who have experienced trauma 583 00:34:22,920 --> 00:34:28,520 Speaker 1: or suffer from agoraphobia, disfigurement, or fear end quote. Now, 584 00:34:28,719 --> 00:34:31,640 Speaker 1: put that way, this all sounds admirable. It really puts 585 00:34:31,719 --> 00:34:34,919 Speaker 1: this work into a different context, because this is one 586 00:34:34,960 --> 00:34:38,520 Speaker 1: that trends toward accessibility. Now, in general, I'm in favor 587 00:34:38,560 --> 00:34:42,520 Speaker 1: of technologies that grant people access to things and experiences 588 00:34:42,560 --> 00:34:47,239 Speaker 1: that they might otherwise not have or have difficulty accessing. 589 00:34:47,600 --> 00:34:50,719 Speaker 1: Increasing agency is a really noble goal, in my opinion. 590 00:34:51,000 --> 00:34:54,520 Speaker 1: But there are other questions that I have, and I 591 00:34:54,560 --> 00:34:58,359 Speaker 1: haven't seen answers to them yet. For example, where will 592 00:34:58,360 --> 00:35:00,840 Speaker 1: this AI be hosted? This is getting back to the 593 00:35:00,840 --> 00:35:04,440 Speaker 1: technical side. Where does the AI live? Not like within 594 00:35:04,480 --> 00:35:08,440 Speaker 1: her house, but literally, where is the AI living on 595 00:35:08,520 --> 00:35:11,440 Speaker 1: a computer? What physical machines are going to provide the 596 00:35:11,600 --> 00:35:15,440 Speaker 1: processing power for this AI? I presume it's going to 597 00:35:15,440 --> 00:35:19,160 Speaker 1: be cloud-based, as AI functionality really requires a hefty 598 00:35:19,160 --> 00:35:22,080 Speaker 1: amount of processing power. Now, it could be a strictly 599 00:35:22,120 --> 00:35:25,480 Speaker 1: local installation, which is entirely possible. That just doesn't seem 600 00:35:25,600 --> 00:35:29,239 Speaker 1: likely to me, but it is possible. Like, wherever she 601 00:35:29,320 --> 00:35:33,480 Speaker 1: lands on holding this exhibition and showing off these various performances, 602 00:35:33,760 --> 00:35:38,600 Speaker 1: it could be that the AI instance is living locally there. 603 00:35:39,160 --> 00:35:41,799 Speaker 1: But then how is the AI going to manipulate the 604 00:35:41,840 --> 00:35:46,759 Speaker 1: holographic element? How are these two different technological pieces linked?
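And just to sketch what that hosting question means in practice, since the project has published no technical details, here is a minimal, hypothetical example. It assumes a backend speaking the common OpenAI-compatible chat API; the URLs, key, and model name are placeholders, not anything from Framus's actual setup. The point is that, from the client's side, cloud versus local is little more than a change of address.

```python
import requests

# Placeholder endpoints: one hosted in the cloud, one running on a machine
# at the exhibition site. Both are assumed to speak the widely used
# OpenAI-compatible chat API; none of this is known about the project.
CLOUD_URL = "https://api.example-provider.com/v1/chat/completions"
LOCAL_URL = "http://localhost:8000/v1/chat/completions"

def ask(base_url, api_key, message):
    """Send one chat turn to whichever backend base_url points at."""
    response = requests.post(
        base_url,
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": "companion-model",  # placeholder model name
            "messages": [{"role": "user", "content": message}],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

# Swapping "lives in the cloud" for "lives in the gallery" is just a change
# of URL. The real trade-offs are latency, privacy, and uptime versus the
# hefty local hardware a capable model demands.
```

That only covers where the simulated personality would live, though; it says nothing about how it would drive the visual layer.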
605 00:35:47,120 --> 00:35:50,279 Speaker 1: Because again, generative AI doesn't just magically know how to 606 00:35:50,840 --> 00:35:56,160 Speaker 1: control a projection. Will the hologram, in whatever form it 607 00:35:56,239 --> 00:35:59,400 Speaker 1: takes, be a digital puppet? In other words, will it 608 00:35:59,520 --> 00:36:03,200 Speaker 1: be a digital construct whose mouth will move in 609 00:36:03,280 --> 00:36:07,160 Speaker 1: accordance with whatever sounds are being generated by the generative 610 00:36:07,200 --> 00:36:09,719 Speaker 1: AI part? The body of the hologram is not the 611 00:36:09,760 --> 00:36:13,000 Speaker 1: same thing as the AI-simulated personality. This is kind 612 00:36:13,000 --> 00:36:15,160 Speaker 1: of weird to think about. Like, there's not an easy 613 00:36:15,880 --> 00:36:18,279 Speaker 1: analogy we can make. I guess you could say, well, 614 00:36:18,360 --> 00:36:21,399 Speaker 1: imagine that your brain is in a jar, but you're 615 00:36:21,440 --> 00:36:24,719 Speaker 1: still able to control your otherwise brainless body as it 616 00:36:24,760 --> 00:36:28,200 Speaker 1: moves around. They are two separate things in that scenario; 617 00:36:28,840 --> 00:36:31,120 Speaker 1: that would be similar to what we're talking about here. 618 00:36:31,520 --> 00:36:34,040 Speaker 1: But of course that's not the way our bodies work. 619 00:36:34,040 --> 00:36:37,439 Speaker 1: Our brains are incorporated within our bodies. When 620 00:36:37,480 --> 00:36:40,600 Speaker 1: we think of ourselves, we don't break it down; we 621 00:36:40,640 --> 00:36:44,560 Speaker 1: don't typically compartmentalize whether we're talking about our 622 00:36:44,560 --> 00:36:47,759 Speaker 1: body or our brain. It's just us. But for 623 00:36:47,880 --> 00:36:51,480 Speaker 1: this AI construct, that's just not the case. They're separate 624 00:36:51,640 --> 00:36:55,200 Speaker 1: things that are linked together, but they are separate. That 625 00:36:55,320 --> 00:36:59,080 Speaker 1: being said, there are other ethical and social considerations that 626 00:36:59,120 --> 00:37:01,440 Speaker 1: we should take into account, and arguably this is 627 00:37:01,480 --> 00:37:04,840 Speaker 1: the very point of Alicia Framus's work. So, for example, 628 00:37:05,320 --> 00:37:08,200 Speaker 1: in some parts of the world, there are regions that 629 00:37:08,280 --> 00:37:12,840 Speaker 1: have an aging population, so you have a larger number 630 00:37:12,840 --> 00:37:15,960 Speaker 1: of older people and a smaller number of younger people, 631 00:37:16,440 --> 00:37:21,359 Speaker 1: and as those numbers for younger people get smaller, those 632 00:37:21,400 --> 00:37:24,000 Speaker 1: generations may start to find it challenging to find a 633 00:37:24,080 --> 00:37:30,440 Speaker 1: romantic partner. And it's possible that opening up this Pandora's 634 00:37:30,440 --> 00:37:36,720 Speaker 1: box of forming relationships with artificially intelligent beings could really 635 00:37:36,760 --> 00:37:40,719 Speaker 1: exacerbate problems in those regions, because people who might feel 636 00:37:40,719 --> 00:37:43,440 Speaker 1: lonely could turn to tech to fill that void, which 637 00:37:43,480 --> 00:37:47,480 Speaker 1: is understandable.
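Coming back to that digital puppet idea for a second, here is a toy sketch, entirely invented on my part, of the simplest way the mouth side of that link could work: take the speech audio the generative side produces, measure how loud it is frame by frame, and hand that number to the rendering side as a jaw-open value. Real lip-sync systems typically map phonemes to mouth shapes (visemes) instead, but even this crude version shows the separation, with one component emitting sound and a completely separate component animating a body to match.

```python
import numpy as np

SAMPLE_RATE = 16_000  # audio samples per second
FPS = 30              # video frames per second for the projected figure

def mouth_openness(audio):
    """Map speech audio to one mouth-open value in [0, 1] per video frame."""
    samples_per_frame = SAMPLE_RATE // FPS
    n_frames = len(audio) // samples_per_frame
    frames = audio[: n_frames * samples_per_frame].reshape(n_frames, -1)
    loudness = np.sqrt((frames ** 2).mean(axis=1))  # RMS energy per frame
    peak = loudness.max()
    return loudness / peak if peak > 0 else loudness

# Stand-in for generated speech: silence, a burst of noise, silence again.
rng = np.random.default_rng(0)
audio = np.concatenate([
    np.zeros(SAMPLE_RATE // 4),
    rng.normal(0.0, 0.5, SAMPLE_RATE // 2),
    np.zeros(SAMPLE_RATE // 4),
])

for value in mouth_openness(audio)[::5]:
    print("#" * int(value * 20))  # crude plot: wider bar = mouth more open
```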
But could turning to tech like that make it worse for the 638 00:37:47,560 --> 00:37:53,080 Speaker 1: society overall, if it removes their desire to go out 639 00:37:53,120 --> 00:37:57,520 Speaker 1: and find another human companion and thus perpetuate that society? 640 00:37:57,880 --> 00:38:01,920 Speaker 1: Could this cure for loneliness ironically speed up social collapse 641 00:38:02,040 --> 00:38:05,440 Speaker 1: in those regions? Now, this doesn't sound to me like 642 00:38:05,480 --> 00:38:08,799 Speaker 1: an outlandish hypothesis. I don't think it would be, you know, 643 00:38:08,920 --> 00:38:12,000 Speaker 1: instantaneous or anything. But I do think it could make 644 00:38:12,040 --> 00:38:15,359 Speaker 1: the problem more challenging. But I'm not ready to say 645 00:38:15,360 --> 00:38:18,239 Speaker 1: it's a foregone conclusion. It may be that people 646 00:38:18,239 --> 00:38:20,680 Speaker 1: would try this, and for maybe some small percentage it 647 00:38:20,719 --> 00:38:24,359 Speaker 1: would seem to really take hold, and maybe other 648 00:38:24,400 --> 00:38:26,879 Speaker 1: people would just say, no, this isn't what I need. 649 00:38:27,400 --> 00:38:31,000 Speaker 1: So that leads to another question. Can AI actually provide 650 00:38:31,640 --> 00:38:37,280 Speaker 1: the companionship that people desire? You've likely heard that research suggests 651 00:38:37,360 --> 00:38:40,920 Speaker 1: people who have meaningful social relationships, and married people 652 00:38:41,000 --> 00:38:45,479 Speaker 1: in particular, typically have longer life expectancies than those who 653 00:38:45,560 --> 00:38:50,520 Speaker 1: lack those meaningful social relationships. So can an AI or 654 00:38:50,640 --> 00:38:55,480 Speaker 1: robot provide the depth and meaning in a relationship that 655 00:38:55,520 --> 00:38:58,840 Speaker 1: would then lead to this benefit? Assuming that the research 656 00:38:58,960 --> 00:39:02,160 Speaker 1: is drawing an accurate conclusion, that is, because it's possible 657 00:39:02,160 --> 00:39:05,439 Speaker 1: that it's not. But assuming the research is accurate, would 658 00:39:05,520 --> 00:39:10,719 Speaker 1: AI actually help reduce feelings of depression and isolation? Would 659 00:39:10,760 --> 00:39:15,680 Speaker 1: it lead to healthier and longer lifespans? That's a question 660 00:39:15,800 --> 00:39:17,719 Speaker 1: we just don't know the answer to. Or would it 661 00:39:17,760 --> 00:39:21,440 Speaker 1: just seem that it would, fulfilling the need on a surface level 662 00:39:21,520 --> 00:39:26,040 Speaker 1: but ultimately not providing those benefits? We don't know. 663 00:39:26,840 --> 00:39:29,799 Speaker 1: Alicia Framus is likely exploring some of these questions, and 664 00:39:29,880 --> 00:39:32,680 Speaker 1: lots of other ones that I haven't even thought about, in 665 00:39:32,719 --> 00:39:36,359 Speaker 1: her work. That's likely the point of this. I would 666 00:39:36,400 --> 00:39:40,520 Speaker 1: never ask an artist to go into depth about the 667 00:39:40,560 --> 00:39:44,240 Speaker 1: point of their work unless that was something they wanted 668 00:39:44,280 --> 00:39:49,160 Speaker 1: to talk about, because often artists present things, and while 669 00:39:49,160 --> 00:39:52,960 Speaker 1: they might have their own intent, they're content to let 670 00:39:53,000 --> 00:39:57,359 Speaker 1: the world interpret their work however the world does.
She's 671 00:39:57,400 --> 00:40:00,600 Speaker 1: also looking at, you know, some other mundane questions, like, 672 00:40:00,760 --> 00:40:03,000 Speaker 1: is it possible to secure a mortgage for a home 673 00:40:03,040 --> 00:40:06,080 Speaker 1: with a holographic co-signer? Or can you design a 674 00:40:06,120 --> 00:40:09,920 Speaker 1: home that allows for a holographic occupant? So again, like 675 00:40:10,360 --> 00:40:13,800 Speaker 1: we talked about, all of these different ways of producing 676 00:40:13,880 --> 00:40:17,759 Speaker 1: quote unquote holograms that aren't really holograms require that you 677 00:40:17,800 --> 00:40:19,960 Speaker 1: have a lot of equipment, right? You have to have 678 00:40:19,960 --> 00:40:22,120 Speaker 1: a lot of equipment and a lot of special conditions 679 00:40:22,400 --> 00:40:27,160 Speaker 1: to allow for this manifestation, and that means like lots 680 00:40:27,200 --> 00:40:30,360 Speaker 1: of projectors and controlled lighting and all this kind of stuff. 681 00:40:30,680 --> 00:40:32,880 Speaker 1: Could you create a home that has enough of this 682 00:40:33,080 --> 00:40:37,640 Speaker 1: to simulate the experience of having an actual spouse wandering 683 00:40:37,640 --> 00:40:42,080 Speaker 1: through your house? Or is that impractical? I think it's impractical. 684 00:40:42,120 --> 00:40:46,600 Speaker 1: Maybe not impossible, but I certainly think it's impractical. And 685 00:40:46,640 --> 00:40:48,359 Speaker 1: then you could argue, well, if you have a house 686 00:40:48,360 --> 00:40:52,080 Speaker 1: that's fully outfitted with all this equipment, where a quote 687 00:40:52,160 --> 00:40:55,840 Speaker 1: unquote hologram could wander through the house freely, then you 688 00:40:55,880 --> 00:40:59,400 Speaker 1: could probably also just have like a whole bunch of 689 00:40:59,480 --> 00:41:02,759 Speaker 1: cloned versions of this person. You know, if you have 690 00:41:02,760 --> 00:41:05,520 Speaker 1: all these different projectors, all the projectors could be producing 691 00:41:05,560 --> 00:41:08,040 Speaker 1: a version of this person, and then you just have 692 00:41:08,880 --> 00:41:12,319 Speaker 1: clones of your spouse in every room. But you know, 693 00:41:12,560 --> 00:41:14,319 Speaker 1: I don't know, I don't know how they're doing it, 694 00:41:14,360 --> 00:41:18,440 Speaker 1: because there's no technical information available from what I can tell. 695 00:41:18,760 --> 00:41:20,359 Speaker 1: I suppose we're just going to learn a lot more 696 00:41:20,360 --> 00:41:23,200 Speaker 1: when Framus's project officially gets off the ground this summer. 697 00:41:23,360 --> 00:41:27,200 Speaker 1: That's when the set of performances will begin, and the 698 00:41:27,280 --> 00:41:31,120 Speaker 1: full set of performances is grouped under the title The 699 00:41:31,280 --> 00:41:34,480 Speaker 1: Hybrid Couple. So I will keep an eye out and 700 00:41:34,520 --> 00:41:38,160 Speaker 1: see about following up on this story as it unfolds. 701 00:41:38,480 --> 00:41:40,960 Speaker 1: I still have questions about how this is going to 702 00:41:41,000 --> 00:41:44,799 Speaker 1: work from a technical standpoint, but again, it's because of 703 00:41:44,840 --> 00:41:47,560 Speaker 1: a lack of information. I'm not saying that it's impossible 704 00:41:47,600 --> 00:41:51,760 Speaker 1: to pull off. I'm just curious what method they're using, 705 00:41:51,840 --> 00:41:54,680 Speaker 1: or methods.
Maybe they're using a collection of different ones. 706 00:41:54,760 --> 00:42:00,160 Speaker 1: I don't know. But it is an interesting project, and 707 00:42:00,239 --> 00:42:02,800 Speaker 1: one that I think has merit, because it is asking 708 00:42:02,880 --> 00:42:05,879 Speaker 1: questions that we are going to have to answer. Even 709 00:42:05,920 --> 00:42:08,319 Speaker 1: if you would argue this is far too early and 710 00:42:08,600 --> 00:42:12,680 Speaker 1: that this version is an unrealistic version of what we 711 00:42:12,719 --> 00:42:15,560 Speaker 1: can expect in the future, it still raises the questions 712 00:42:15,560 --> 00:42:19,440 Speaker 1: that we will have to answer, because this is something 713 00:42:19,480 --> 00:42:23,280 Speaker 1: that is going to be part of our future. People 714 00:42:23,480 --> 00:42:28,200 Speaker 1: have already developed feelings for AI chatbots. If you want 715 00:42:28,239 --> 00:42:32,759 Speaker 1: to see a really good video essay about this, with 716 00:42:33,239 --> 00:42:37,000 Speaker 1: maybe some content warnings in there, check out Sarah Z 717 00:42:37,719 --> 00:42:41,120 Speaker 1: (that's Sarah "Zee" for all my fellow Americans) on YouTube. 718 00:42:41,440 --> 00:42:44,120 Speaker 1: She has a piece called The Rise and Fall of 719 00:42:44,280 --> 00:42:47,759 Speaker 1: Replika, that's R E P L I K A, in which she 720 00:42:47,880 --> 00:42:54,960 Speaker 1: talks about an AI companion program that ended up 721 00:42:55,000 --> 00:42:59,320 Speaker 1: having some massive changes along the way for various reasons. 722 00:42:59,680 --> 00:43:03,279 Speaker 1: It's a good piece that kind of looks 723 00:43:03,320 --> 00:43:06,640 Speaker 1: at this from the AI standpoint, and I think it's 724 00:43:06,760 --> 00:43:12,080 Speaker 1: a valuable essay, one that is entertaining, interesting, and a 725 00:43:12,120 --> 00:43:15,600 Speaker 1: little disturbing at points, but it kind of explores the 726 00:43:15,640 --> 00:43:20,000 Speaker 1: same sort of area. I think Alicia Framus is looking 727 00:43:20,040 --> 00:43:25,120 Speaker 1: at similar things, although from perhaps a slightly less salacious 728 00:43:25,840 --> 00:43:30,239 Speaker 1: perspective than Replika had when it first launched. So check 729 00:43:30,320 --> 00:43:33,640 Speaker 1: that out if you like; Sarah Z does great video essays. 730 00:43:33,680 --> 00:43:37,279 Speaker 1: I don't know her, I've never interacted with her; I 731 00:43:37,440 --> 00:43:41,439 Speaker 1: just like her work, and I recommend checking that out. 732 00:43:41,440 --> 00:43:43,399 Speaker 1: It is an hour long, so it is a bit 733 00:43:43,440 --> 00:43:47,880 Speaker 1: of a commitment, but it's worth viewing. And yeah, I'm 734 00:43:48,040 --> 00:43:50,840 Speaker 1: interested to see how this plays out. Again, I still have 735 00:43:50,920 --> 00:43:55,799 Speaker 1: lots of questions, both technical and otherwise, but you could 736 00:43:55,920 --> 00:43:58,759 Speaker 1: argue that is the purpose of art, or one of 737 00:43:58,920 --> 00:44:03,400 Speaker 1: art's purposes. I don't want to pigeonhole why art exists 738 00:44:03,520 --> 00:44:06,560 Speaker 1: or what it's for, but I would say that raising 739 00:44:06,640 --> 00:44:11,000 Speaker 1: questions is definitely part of it, and if that's true, 740 00:44:11,040 --> 00:44:14,520 Speaker 1: then goal achieved. I have lots of questions, so we'll 741 00:44:14,520 --> 00:44:16,640 Speaker 1: come back to this.
But I think it is a 742 00:44:16,680 --> 00:44:22,080 Speaker 1: really interesting topic to contemplate. What is AI's role in 743 00:44:22,080 --> 00:44:24,960 Speaker 1: the future when it comes to things like personal relationships? 744 00:44:25,560 --> 00:44:29,960 Speaker 1: And is it ultimately something that could be beneficial, or 745 00:44:30,800 --> 00:44:33,600 Speaker 1: have a negative impact on us, or is it more 746 00:44:33,640 --> 00:44:37,719 Speaker 1: complicated than that? Is it a combination? I suspect it 747 00:44:37,800 --> 00:44:40,120 Speaker 1: is a combination of some sort. But there you go. 748 00:44:40,680 --> 00:44:44,880 Speaker 1: That is my take so far on Alicia Framus's project 749 00:44:45,280 --> 00:44:51,360 Speaker 1: to marry an AI quote unquote hologram. And we'll see 750 00:44:51,640 --> 00:44:55,000 Speaker 1: how this plays out this summer, summer of twenty twenty four, 751 00:44:55,040 --> 00:44:57,320 Speaker 1: in case some of you are listening from the future, 752 00:44:57,400 --> 00:45:01,400 Speaker 1: in which case maybe you're already, you know, attending a 753 00:45:01,400 --> 00:45:04,799 Speaker 1: holographic wedding. In which case, what do you get them? Where 754 00:45:04,800 --> 00:45:08,440 Speaker 1: do they register? Okay, that's it. I hope you are 755 00:45:08,560 --> 00:45:12,680 Speaker 1: all well, and I'll talk to you again really soon. 756 00:45:18,920 --> 00:45:23,560 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 757 00:45:23,880 --> 00:45:27,600 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 758 00:45:27,640 --> 00:45:28,680 Speaker 1: to your favorite shows.