Speaker 1: Technology should not be replacing life. There is value in letting people's online presence rest after they're gone, instead of manufacturing artificial activity. Real relationships and memories shouldn't be swapped out for any sort of automatic content. I know that I've been online before and I've pulled up Facebook and I've seen, you know, "Today is your dad's birthday." Of course, many of our listeners know my father passed away more than a decade ago, and when I see that, it hits hard, because he still has a Facebook profile that has never been shut down. Now you've got Meta introducing a new product: they've been granted a patent for an AI system that can simulate a user's social media activity even after they've died or stopped posting. So you could be gone, but your online presence continues on without you.

Speaker 2: So I guess this would be similar to an AI chatbot, but specifically designed to mimic somebody who has passed away, in their voice, the voice of the person who's passed away.

Speaker 1: I think it's basically a digital clone of yourself.

Speaker 2: Right. So somebody close to you passes away, you sign up for this service with Meta, they're able to access whatever digital files exist on the person who passed away, Meta feeds that into AI, and the AI essentially creates a digital chatbot of your loved one who's no longer with us, so that you can, I would assume, interact with them, ask them questions, and they would respond in their voice.

Speaker 1: It will continue posting content, it'll reply to comments, and it will interact with other people in their voice. And I guess this is kind of like, you've heard of when someone has a loved one who passes away, they save the last voicemail. They can pull that up and hear their voice again. Is it different than looking at a picture of them? It's a way to preserve their memory, although this seems more interactive.
Speaker 2: Well, you know, listening to an old voicemail that you've saved on your phone from a loved one that's passed away is one thing, and I get that, if that's what you want to do, because that's the real person who left that voicemail.

Speaker 1: This isn't an AI clone of that.

Speaker 2: This is an AI chatbot version of your former loved one. I think everybody's initial reaction to this, the vast majority of people, is going to be, this is creepy, this is not something we should be doing. But I can see that for certain individuals, not me, but for certain individuals who are having such a hard time with that loved one's passing, being able to interact with them, even though it's fake and even though it's AI... there's a certain percentage of the population that I'm going to guess would get comfort and some relief from their grief and longing and missing this person by having this cloned AI chatbot, even though it's not really them.

Speaker 1: It does really blur the lines, though, between memory and grief. This could affect how people process loss and how they interact with ghost online accounts. But this is just the next evolution, right? Because you used to have a picture, and then maybe a video, and then a voice message, and now their online presence continues on even though they don't. This is just the next step. But it does show that maybe there needs to be some rules. Maybe you have to give permission for this to happen before you pass. Like, is this something you have to consider for your will?

Speaker 2: Like, what are yours? "Jim, when I'm gone, don't make me AI"? I don't know. Could they make you an AI chatbot and maybe be able to do it against your will? That's a great question. You said a couple of things.
You know, you said this could change how people process grief, and it would be really interesting to get a therapist's perspective on that, because my guess is you're not processing your grief. You're masking or not accepting the loss of the loved one if you're turning them into an AI chatbot after they pass away. The other aspect of this is: what real rights do you retain for any content that you create on social media or in the digital space after you die? Because right now, if I take a picture of you and I post that on Facebook, I own the copyright to that image. I took that picture, I posted it on Facebook, I own the copyright to it. Technically, no one else should be able to share, repurpose, or do anything with that image. But if I'm posting on Facebook, and I'm posting videos of myself on Facebook, and I pass away, do I still retain the copyright? Who do those copyright rights go to? And what can be done with that? This is obviously going to open up a ton of moral, ethical, and legal challenges. And no surprise who's at the forefront of it: Facebook, Meta, Mark Zuckerberg. If there's something creepy that can be done on social media and with AI, you better believe Mark Zuckerberg is standing right at the front of the line, ready to do it.

Speaker 1: He's thinking, "Oh wait, there's an audience there, there's people, I can make money from this." Is that what they're doing in Lebanon with the Leap project? Are they making AI dead people?

Speaker 2: Oh, you're talking about the fact that Meta is building a massive data center in Lebanon with the Leap project. Yeah, I mean, absolutely. Those data centers are going to be used for all sorts of AI, and creating AIs like this is the next thing: creating an AI chatbot from your dead loved one. That's certainly going to be processed through data centers like those.
Another great excuse to, you know, protest and be against data centers coming to your community, if Meta is going to use them for something like this. This has just got creepy written all over it. I'm trying to be a capitalist here and say it's a free market, and if Meta wants to create a product that's legally available out there, then they should be able to create that product and let the market decide whether it's a viable product that survives in the marketplace or not. But man, this just creeps me out on every level.

Speaker 1: But if the chatbot of the deceased person does something nefarious, who is responsible in the end? Who gets held accountable?

Speaker 2: Well, that's a whole separate legal question, because who is responsible for the creation of AI? And again, we're seeing this play out in real time where people are using AI images. We saw this with Grok, Twitter's AI, which was having a problem where people were posting images of women who were fully clothed and saying, "Hey, Grok, put her in a string bikini," and Grok would create an image, a very realistic one, of that woman who was once fully clothed, now wearing a string bikini. Who is responsible when AI breaks the law, or does something that would hold a real human being liable?