1 00:00:13,640 --> 00:00:17,640 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 2 00:00:18,200 --> 00:00:20,840 Speaker 1: I'm Oz Woloshyn and today Karah Preiss and I will 3 00:00:20,840 --> 00:00:24,479 Speaker 1: bring you the headlines this week, including the future of 4 00:00:24,560 --> 00:00:28,160 Speaker 1: brain implants. Then on Tech Support, we'll talk to 5 00:00:28,240 --> 00:00:32,440 Speaker 1: 404 Media's Sam Cole about a chatbot programmed to 6 00:00:32,560 --> 00:00:34,520 Speaker 1: soothe your heartache. 7 00:00:34,840 --> 00:00:37,000 Speaker 2: So I don't know. I mean, everyone says this, but 8 00:00:37,040 --> 00:00:38,080 Speaker 2: it's so Black Mirror to me. 9 00:00:38,840 --> 00:00:41,559 Speaker 1: All of that on the Week in Tech. It's Friday, 10 00:00:41,880 --> 00:00:42,640 Speaker 1: May twenty third. 11 00:00:42,400 --> 00:00:49,520 Speaker 3: All right, so I want to tell you about 12 00:00:49,520 --> 00:00:52,920 Speaker 3: something that happened this weekend when I was hanging around 13 00:00:53,440 --> 00:00:55,200 Speaker 3: West Chelsea in Manhattan. 14 00:00:55,440 --> 00:00:57,400 Speaker 1: I'm all the It's as I always am. 15 00:00:57,520 --> 00:00:59,640 Speaker 3: So you know about Muji, of 16 00:00:59,680 --> 00:01:04,640 Speaker 1: course, about Muji. Muji, the Japanese store. There was one about 17 00:01:04,680 --> 00:01:06,360 Speaker 1: five minutes' walk from my house where I grew up 18 00:01:06,360 --> 00:01:09,399 Speaker 1: in London, and there were all kinds of fancy pens 19 00:01:09,400 --> 00:01:12,880 Speaker 1: in there, which I wanted but never got except for Christmas. 20 00:01:13,200 --> 00:01:15,760 Speaker 3: The child who covets a pen will grow up to 21 00:01:15,800 --> 00:01:20,000 Speaker 3: have a podcast. But yeah, you know, Muji, Muji is 22 00:01:20,040 --> 00:01:23,680 Speaker 3: a great Japanese retailer. There's a New York City location 23 00:01:23,959 --> 00:01:27,160 Speaker 3: in West Chelsea, and I went in there just because 24 00:01:27,200 --> 00:01:29,160 Speaker 3: you know, I always want to look at the latest 25 00:01:29,160 --> 00:01:34,520 Speaker 3: in unstructured garments and new bamboo toothbrushes. And I walk 26 00:01:34,560 --> 00:01:38,280 Speaker 3: into the cafe area which they have, and there are 27 00:01:38,280 --> 00:01:40,640 Speaker 3: two lines, and I was a little bit confused, like 28 00:01:40,640 --> 00:01:42,960 Speaker 3: why are there two lines? And I noticed one of 29 00:01:43,000 --> 00:01:45,440 Speaker 3: the lines is long, right, which is the line for 30 00:01:45,600 --> 00:01:48,200 Speaker 3: like the regular coffee and pastries, And one of the 31 00:01:48,200 --> 00:01:54,800 Speaker 3: lines is extremely short. And then I am hearing the 32 00:01:54,840 --> 00:02:00,440 Speaker 3: motion of a robot arm named Jarvis that's making matcha. 33 00:02:01,000 --> 00:02:05,000 Speaker 3: And unsurprisingly, the line for the robot matcha latte is 34 00:02:05,080 --> 00:02:08,160 Speaker 3: short because you need to be patient if you're going 35 00:02:08,200 --> 00:02:09,679 Speaker 3: to receive a robot matcha. 36 00:02:09,480 --> 00:02:11,240 Speaker 1: Unlike a regular barista. 37 00:02:11,520 --> 00:02:15,440 Speaker 3: Yes, I just want to explain this robot arm is 38 00:02:15,560 --> 00:02:19,680 Speaker 3: not a Keurig.
Okay, this is like if a human 39 00:02:19,880 --> 00:02:26,080 Speaker 3: arm was disembodied, turned into a cyborg, given one of 40 00:02:26,120 --> 00:02:29,799 Speaker 3: those steel cups that we notice baristas use to pour 41 00:02:29,880 --> 00:02:32,800 Speaker 3: stuff and to make matcha with, and is just on 42 00:02:32,919 --> 00:02:39,680 Speaker 3: its own, powered by computing power and serving matcha to 43 00:02:40,600 --> 00:02:45,120 Speaker 3: I guess incredibly enthusiastic tourists who like can't believe their 44 00:02:45,520 --> 00:02:49,960 Speaker 3: Uncanny Valley eyes that there is a robot that is 45 00:02:50,000 --> 00:02:52,839 Speaker 3: delivering matcha to them, and not only delivering matcha to them, 46 00:02:53,200 --> 00:02:55,800 Speaker 3: referring to them by their first names, because you have 47 00:02:55,840 --> 00:02:58,240 Speaker 3: to put in your first name. So by the end 48 00:02:58,240 --> 00:03:01,840 Speaker 3: of the matcha making experience, they're like, your matcha is ready, Elise, 49 00:03:02,800 --> 00:03:05,040 Speaker 3: and Elise is like, Elise can barely pick up her 50 00:03:05,040 --> 00:03:07,400 Speaker 3: matcha because she's like making a TikTok of this robot 51 00:03:07,520 --> 00:03:09,280 Speaker 3: who's giving her the matcha. You know what I mean. 52 00:03:09,320 --> 00:03:11,000 Speaker 3: I'm like, don't spill Jarvis's matcha. 53 00:03:11,040 --> 00:03:12,839 Speaker 1: So this is for the tourists in New York City. 54 00:03:12,919 --> 00:03:15,360 Speaker 1: It's kind of interesting because I guess, like this is 55 00:03:15,440 --> 00:03:19,000 Speaker 1: not a robot that's there to more efficiently make matcha 56 00:03:18,960 --> 00:03:22,240 Speaker 1: lattes. This is a robot to give Muji customers 57 00:03:22,560 --> 00:03:24,880 Speaker 1: a sense of brand experience, that they could be in 58 00:03:24,960 --> 00:03:26,120 Speaker 1: Japan but they're 59 00:03:25,960 --> 00:03:29,079 Speaker 3: not. Yeah, precisely. And also I mean maybe a glimpse 60 00:03:29,080 --> 00:03:31,080 Speaker 3: into the future where lines are going to be quite 61 00:03:31,120 --> 00:03:35,160 Speaker 3: long unless, and no offense to this robot, but the 62 00:03:35,200 --> 00:03:38,480 Speaker 3: most striking observation that I had in this whole thing, 63 00:03:38,640 --> 00:03:40,360 Speaker 3: and I know you and I both talk about this 64 00:03:40,400 --> 00:03:40,680 Speaker 3: a lot. 65 00:03:40,720 --> 00:03:45,640 Speaker 1: But you can tip, you can tip, or you have 66 00:03:45,680 --> 00:03:47,360 Speaker 1: to tip, you can. 67 00:03:47,960 --> 00:03:51,560 Speaker 3: Oh, that's always the question. You can always tip, you 68 00:03:51,600 --> 00:03:52,680 Speaker 3: don't have to tip. 69 00:03:53,200 --> 00:03:55,400 Speaker 1: But how does Jarvis let you know that you not 70 00:03:55,440 --> 00:03:57,120 Speaker 1: only can, but really should? 71 00:03:57,840 --> 00:04:00,640 Speaker 3: Because when you use a computer, just like you use 72 00:04:00,680 --> 00:04:04,320 Speaker 3: a tablet now in every setting, there's that pivotal moment 73 00:04:04,520 --> 00:04:07,760 Speaker 3: in every transaction that we have now in stores where 74 00:04:08,080 --> 00:04:10,760 Speaker 3: the iPad is flipped around and you're like, am I 75 00:04:10,800 --> 00:04:13,240 Speaker 3: a dick or am I not a dick?
76 00:04:13,280 --> 00:04:15,120 Speaker 1: You're presumably less of a dick if you refuse 77 00:04:15,200 --> 00:04:17,320 Speaker 1: to tip the robot than the barista. 78 00:04:17,800 --> 00:04:20,279 Speaker 3: It depends what Jarvis is doing on his days off. 79 00:04:20,400 --> 00:04:22,599 Speaker 3: I'm not sure. I'm not sure. 80 00:04:22,720 --> 00:04:24,599 Speaker 1: I do have a question about whether or not this 81 00:04:24,720 --> 00:04:28,039 Speaker 1: is going to become a ubiquitous part of food service 82 00:04:28,200 --> 00:04:30,400 Speaker 1: in the US, or whether this is going to remain 83 00:04:30,880 --> 00:04:33,039 Speaker 1: a gimmick in the Muji shop. I had a conversation 84 00:04:33,080 --> 00:04:34,800 Speaker 1: with my wife the other day and she said, well, 85 00:04:35,000 --> 00:04:37,320 Speaker 1: you do a tech podcast. How long is it going 86 00:04:37,360 --> 00:04:39,479 Speaker 1: to be until there's a robot that goes down to 87 00:04:39,480 --> 00:04:42,360 Speaker 1: the grocery store, buys our own for breakfast and cooks 88 00:04:42,360 --> 00:04:43,880 Speaker 1: it for me? And I said, you know what, that's 89 00:04:43,920 --> 00:04:46,280 Speaker 1: a very very good question, which I think even the 90 00:04:46,320 --> 00:04:49,320 Speaker 1: great futurists would struggle to give you a good answer 91 00:04:49,320 --> 00:04:50,159 Speaker 1: to. 92 00:04:49,960 --> 00:04:52,760 Speaker 3: We'll be looking out for that here on the podcast, Tech Stuff. 93 00:04:52,560 --> 00:04:56,320 Speaker 1: We will keep our eyes wide open. So I think 94 00:04:56,320 --> 00:04:59,640 Speaker 1: today we each have a new story that we want 95 00:04:59,640 --> 00:05:01,600 Speaker 1: to share, and then after that we're going to go 96 00:05:01,680 --> 00:05:02,400 Speaker 1: on to the headlines. 97 00:05:02,600 --> 00:05:04,120 Speaker 3: Yeah, and I think it's safe to say that the 98 00:05:04,160 --> 00:05:07,840 Speaker 3: overarching theme for this episode is the consumer tech that 99 00:05:07,960 --> 00:05:11,840 Speaker 3: mediates our world and how system updates can change our world. 100 00:05:12,080 --> 00:05:15,080 Speaker 1: System updates can sound boring, but they can also really 101 00:05:15,160 --> 00:05:17,880 Speaker 1: change the way we live, and this is exactly that 102 00:05:17,920 --> 00:05:19,960 Speaker 1: type of story. So we've seen a lot in the 103 00:05:20,000 --> 00:05:23,360 Speaker 1: last few weeks about different lawsuits that are brewing against 104 00:05:23,400 --> 00:05:26,799 Speaker 1: big tech companies. Meta and Alphabet are both in litigation 105 00:05:26,920 --> 00:05:30,760 Speaker 1: with the Department of Justice over monopolistic practices, and there 106 00:05:30,800 --> 00:05:33,960 Speaker 1: may be breakups of their products like Chrome and Instagram 107 00:05:34,000 --> 00:05:37,080 Speaker 1: and others on the horizon. But today's story from me 108 00:05:37,240 --> 00:05:40,520 Speaker 1: is actually about a recent court ruling against Apple and 109 00:05:40,520 --> 00:05:44,320 Speaker 1: how other tech companies are responding to this ruling. Since 110 00:05:44,360 --> 00:05:46,320 Speaker 1: you're a big reader, Karah, I thought you'd be interested 111 00:05:46,320 --> 00:05:48,520 Speaker 1: in a story that gets to this in The Verge 112 00:05:48,560 --> 00:05:52,760 Speaker 1: this week, with the headline Spotify's iPhone app will now 113 00:05:52,839 --> 00:05:56,320 Speaker 1: let you easily buy audiobooks.
114 00:05:56,240 --> 00:05:58,520 Speaker 3: Just for reference, for people who don't know, I run a 115 00:05:58,520 --> 00:06:01,560 Speaker 3: book club called Belletrist, so I tend to keep my 116 00:06:01,640 --> 00:06:04,000 Speaker 3: eyes open about what's going on in the publishing world. 117 00:06:04,400 --> 00:06:08,880 Speaker 3: And I listen to audiobooks on Spotify, yes, but they 118 00:06:08,880 --> 00:06:11,680 Speaker 3: were a little bit confusing to access. If you wanted 119 00:06:11,680 --> 00:06:14,560 Speaker 3: to buy an audiobook, you had to go to Spotify's website, 120 00:06:14,640 --> 00:06:17,920 Speaker 3: which would then download the book onto your app. Like, 121 00:06:17,960 --> 00:06:21,719 Speaker 3: you couldn't buy it directly in app, which was annoying 122 00:06:21,720 --> 00:06:21,880 Speaker 3: to me. 123 00:06:21,920 --> 00:06:25,320 Speaker 1: Again, this was not a friction free purchasing experience, 124 00:06:25,839 --> 00:06:29,159 Speaker 1: and unsurprisingly, there was a reason that friction existed, 125 00:06:29,600 --> 00:06:33,400 Speaker 1: and that reason was Apple, because until very recently, in fact, 126 00:06:33,480 --> 00:06:38,640 Speaker 1: until this court ruling, they had almost exclusive power over 127 00:06:38,720 --> 00:06:43,279 Speaker 1: how developers could monetize apps and subscriptions on iOS. So 128 00:06:43,360 --> 00:06:47,000 Speaker 1: if companies did offer in-app purchases, in most cases, Apple 129 00:06:47,040 --> 00:06:48,920 Speaker 1: took a thirty percent commission right off the top. 130 00:06:49,040 --> 00:06:51,240 Speaker 3: Oh, I see, so it makes sense that Spotify would 131 00:06:51,240 --> 00:06:54,120 Speaker 3: be like, get out of where Apple can charge this 132 00:06:54,160 --> 00:06:57,520 Speaker 3: thirty percent fee. Go to our website, get the audiobook, 133 00:06:57,560 --> 00:07:00,080 Speaker 3: and then come back to Spotify. 134 00:06:59,240 --> 00:07:02,799 Speaker 1: Exactly, except it's a bit more complicated than that, because 135 00:07:03,200 --> 00:07:05,840 Speaker 1: if there was a button in the Spotify app that 136 00:07:05,880 --> 00:07:09,400 Speaker 1: pushed you to the website, Apple would still charge twenty 137 00:07:09,480 --> 00:07:12,800 Speaker 1: seven percent. I don't know where the other three percent goes, a commission 138 00:07:13,240 --> 00:07:18,000 Speaker 1: for a kind of indirect referral sale fee. And so 139 00:07:18,200 --> 00:07:22,000 Speaker 1: that's why the buy through was so hard on Spotify 140 00:07:22,040 --> 00:07:25,480 Speaker 1: for audiobooks. But as of very very recently, like this 141 00:07:25,520 --> 00:07:28,760 Speaker 1: week recently, that's all changing, and you can see it 142 00:07:29,080 --> 00:07:30,800 Speaker 1: when you open your Spotify app. 143 00:07:30,680 --> 00:07:32,840 Speaker 3: And I'm looking at it now and there's a very 144 00:07:32,880 --> 00:07:36,200 Speaker 3: obvious green button that says buy on it. I never 145 00:07:36,240 --> 00:07:38,040 Speaker 3: noticed there wasn't a price tag.
146 00:07:38,000 --> 00:07:41,200 Speaker 1: Right, And what you're seeing in your Spotify app on 147 00:07:41,240 --> 00:07:45,240 Speaker 1: your iPhone is actually the consequence of a lawsuit between 148 00:07:45,280 --> 00:07:49,760 Speaker 1: Apple and another tech company called Epic Games that made Fortnite, 149 00:07:49,840 --> 00:07:52,120 Speaker 1: and it's a lawsuit which has been going on for years, 150 00:07:52,760 --> 00:07:55,560 Speaker 1: and related to that lawsuit, just last month, a judge 151 00:07:55,640 --> 00:07:59,200 Speaker 1: ruled that Apple could no longer impose commissions or fees 152 00:07:59,560 --> 00:08:03,400 Speaker 1: on purchases made outside an app. They could no longer 153 00:08:03,440 --> 00:08:06,440 Speaker 1: restrict the style, the formatting, or the placement of links 154 00:08:06,440 --> 00:08:09,560 Speaker 1: for purchases outside an app, or limit the use of 155 00:08:09,600 --> 00:08:13,120 Speaker 1: buttons or other obvious calls to action. In other words, 156 00:08:13,480 --> 00:08:18,640 Speaker 1: the judge basically said, let people do commerce without taxing 157 00:08:18,680 --> 00:08:20,080 Speaker 1: them on their own phones. 158 00:08:20,600 --> 00:08:23,320 Speaker 3: I love that it took Fortnite making a stink about this, 159 00:08:23,440 --> 00:08:26,600 Speaker 3: like nobody was like, we need audiobooks. 160 00:08:27,240 --> 00:08:29,680 Speaker 1: You're right, And the Spotify story is like an unintended 161 00:08:29,680 --> 00:08:33,360 Speaker 1: consequence of the Epic Games story. So this ruling against 162 00:08:33,400 --> 00:08:37,000 Speaker 1: Apple was recent. Apple are appealing, but other tech companies 163 00:08:37,080 --> 00:08:40,880 Speaker 1: are already adjusting their iOS apps. So for those who, 164 00:08:41,080 --> 00:08:43,280 Speaker 1: like you, use the Kindle app, you'll now be able 165 00:08:43,320 --> 00:08:46,000 Speaker 1: to purchase books with far fewer taps than you previously 166 00:08:46,120 --> 00:08:49,120 Speaker 1: needed to make. And there is a clearly marked get 167 00:08:49,160 --> 00:08:50,439 Speaker 1: book button. 168 00:08:50,520 --> 00:08:53,200 Speaker 3: That's actually very exciting. I've always been annoyed by that, 169 00:08:53,280 --> 00:08:55,720 Speaker 3: Like it's always been a thorn in my side. It's 170 00:08:55,760 --> 00:08:57,920 Speaker 3: just like it is not friction free. 171 00:08:57,760 --> 00:08:59,800 Speaker 1: For you, as annoying for Amazon. 172 00:08:59,440 --> 00:09:00,959 Speaker 3: Amazon's like, it's annoying for us too. 173 00:09:01,000 --> 00:09:04,240 Speaker 1: But you know, this is in some sense like a, 174 00:09:04,280 --> 00:09:07,760 Speaker 1: you know, what looks like a small UX story, but 175 00:09:07,840 --> 00:09:10,560 Speaker 1: actually I think it's a really really big story, because 176 00:09:10,600 --> 00:09:12,440 Speaker 1: I think we could look back on this moment, this 177 00:09:12,600 --> 00:09:16,640 Speaker 1: very week, as a seismic shift in Apple's ability to 178 00:09:16,720 --> 00:09:20,880 Speaker 1: control the developer ecosystem, which in turn controls what shows 179 00:09:20,960 --> 00:09:23,520 Speaker 1: up on your iPhone and how you interact with it.
180 00:09:23,720 --> 00:09:26,720 Speaker 1: And given we spend most of our waking hours interacting 181 00:09:26,720 --> 00:09:29,840 Speaker 1: with stuff on our iPhone, this loosening of the gates, 182 00:09:29,960 --> 00:09:32,120 Speaker 1: I think, could be a very big and 183 00:09:32,160 --> 00:09:34,680 Speaker 1: interesting moment in the future of how we interact with tech. 184 00:09:34,840 --> 00:09:37,200 Speaker 3: You know, your story about Apple actually leads me to 185 00:09:37,320 --> 00:09:41,480 Speaker 3: my story about Apple, further, pun intended, implanting itself into 186 00:09:41,520 --> 00:09:45,440 Speaker 3: our minds. The Wall Street Journal reports that Apple will 187 00:09:45,440 --> 00:09:48,400 Speaker 3: allow brain implants to control their devices. 188 00:09:48,400 --> 00:09:51,240 Speaker 1: I'll just stop you right there, because for me, the 189 00:09:51,320 --> 00:09:55,600 Speaker 1: idea of controlling computers and machines with just our brains, 190 00:09:56,280 --> 00:09:58,600 Speaker 1: it feels like science fiction. It's one of the most 191 00:09:58,640 --> 00:10:02,640 Speaker 1: fascinating and exciting possibilities in all of tech. Also one 192 00:10:02,640 --> 00:10:04,560 Speaker 1: of the most terrifying. But the fact that it may 193 00:10:04,600 --> 00:10:09,959 Speaker 1: be leaving the lab and entering the App Store, just wow. 194 00:10:10,679 --> 00:10:13,800 Speaker 3: I personally think it's amazing. I think it's important to 195 00:10:13,800 --> 00:10:16,000 Speaker 3: be clear that this is not just something you know. 196 00:10:16,040 --> 00:10:18,079 Speaker 3: You're not going to like, pick up a new pair 197 00:10:18,120 --> 00:10:21,840 Speaker 3: of Apple Vision goggles and all of a sudden control 198 00:10:21,880 --> 00:10:24,640 Speaker 3: stuff with your mind. You need a surgical implant. But 199 00:10:24,720 --> 00:10:28,600 Speaker 3: these devices could be incredibly consequential to people who have 200 00:10:28,760 --> 00:10:33,760 Speaker 3: difficulty communicating and have limited to no hand use, and 201 00:10:33,880 --> 00:10:36,600 Speaker 3: according to Morgan Stanley, that's actually around one hundred and 202 00:10:36,679 --> 00:10:40,160 Speaker 3: fifty thousand people in the US. That includes people living 203 00:10:40,200 --> 00:10:44,120 Speaker 3: with ALS, stroke, or spinal cord injury. And just to 204 00:10:44,720 --> 00:10:46,960 Speaker 3: kind of get under the hood of how this is working, 205 00:10:47,320 --> 00:10:50,720 Speaker 3: Apple is working with a startup called Synchron, and Synchron 206 00:10:50,800 --> 00:10:54,640 Speaker 3: has developed a device called the Stentrode, which is implanted right 207 00:10:54,679 --> 00:10:57,000 Speaker 3: on the top of the brain's motor cortex. 208 00:10:57,200 --> 00:10:59,480 Speaker 1: Motor cortex, so it's kind of the surface part of 209 00:10:59,480 --> 00:11:01,439 Speaker 1: the brain that controls movement. 210 00:11:01,800 --> 00:11:06,240 Speaker 3: That's exactly right. And the Stentrode has electrodes that 211 00:11:06,440 --> 00:11:10,080 Speaker 3: read brain signals, and the idea is that those signals 212 00:11:10,320 --> 00:11:14,800 Speaker 3: could be translated into selecting icons on a screen. So 213 00:11:15,040 --> 00:11:18,840 Speaker 3: just imagine instead of using your hands, you're using your 214 00:11:18,880 --> 00:11:21,640 Speaker 3: brain to operate an Apple device.
215 00:11:22,000 --> 00:11:25,240 Speaker 1: And I have to ask you, obviously, Neuralink is 216 00:11:25,280 --> 00:11:28,520 Speaker 1: kind of the dominant player in this very early 217 00:11:28,559 --> 00:11:30,720 Speaker 1: stage in the game, or at least the one that 218 00:11:30,720 --> 00:11:34,000 Speaker 1: gets the most airtime. Maybe that's because Elon Musk 219 00:11:34,120 --> 00:11:35,880 Speaker 1: is such a good marketer. But you know, we've all 220 00:11:35,880 --> 00:11:39,080 Speaker 1: seen videos of people using their brain waves to control 221 00:11:39,240 --> 00:11:42,960 Speaker 1: a world building computer game like Civilization. What's the difference 222 00:11:43,000 --> 00:11:46,360 Speaker 1: between what Apple is doing and what Elon and Neuralink 223 00:11:46,360 --> 00:11:46,720 Speaker 1: are doing? 224 00:11:47,679 --> 00:11:50,000 Speaker 3: It's a good question, and I think the key is 225 00:11:50,000 --> 00:11:54,439 Speaker 3: that Neuralink is a much more invasive type of brain implant. 226 00:11:54,679 --> 00:11:58,800 Speaker 3: The procedure involves literally making a hole in the recipient's 227 00:11:58,800 --> 00:12:02,680 Speaker 3: skull to insert the device, something you don't have to 228 00:12:02,720 --> 00:12:06,200 Speaker 3: do to implant the Stentrode, which is inserted via the 229 00:12:06,360 --> 00:12:12,200 Speaker 3: jugular vein. But it's not Botox, I'll tell you that. 230 00:12:12,800 --> 00:12:18,560 Speaker 3: But it's much easier. In terms of the difference between 231 00:12:18,600 --> 00:12:23,319 Speaker 3: Neuralink and Stentrode, Neuralink also has one thousand electrodes picking 232 00:12:23,360 --> 00:12:28,680 Speaker 3: up neural activity compared with Stentrode's sixteen. So, just for 233 00:12:28,720 --> 00:12:32,319 Speaker 3: a little bit of context and background, Synchron has implanted 234 00:12:32,320 --> 00:12:35,840 Speaker 3: their device in ten people since twenty nineteen. But the 235 00:12:35,920 --> 00:12:39,160 Speaker 3: crucial difference here is that Apple is working on this 236 00:12:39,240 --> 00:12:42,760 Speaker 3: product in order to create a new technology standard for 237 00:12:42,840 --> 00:12:46,560 Speaker 3: the way brain waves interact specifically with their products, which 238 00:12:46,559 --> 00:12:50,160 Speaker 3: they will then release to other developers. So this isn't 239 00:12:50,160 --> 00:12:53,480 Speaker 3: really about creating a better Neuralink. It's about creating a 240 00:12:53,679 --> 00:12:58,280 Speaker 3: standard pathway for implants like Neuralink, Stentrode, and others to 241 00:12:58,520 --> 00:13:01,120 Speaker 3: seamlessly control Apple devices. 242 00:13:00,760 --> 00:13:04,280 Speaker 1: Because this really is the interface layer that Apple are 243 00:13:04,280 --> 00:13:07,040 Speaker 1: trying to create. And I guess to me, what it 244 00:13:07,080 --> 00:13:10,600 Speaker 1: says is that adoption of this type of brain reading 245 00:13:10,640 --> 00:13:13,720 Speaker 1: technology could actually start to happen outside of the lab.
246 00:13:14,120 --> 00:13:17,440 Speaker 1: Because if you create an open system where there's a 247 00:13:17,480 --> 00:13:21,640 Speaker 1: standard through which I can communicate with my iPhone using 248 00:13:21,679 --> 00:13:25,360 Speaker 1: brain waves, that's much more likely to take off than 249 00:13:25,480 --> 00:13:28,440 Speaker 1: having to create a kind of parallel tech system where I 250 00:13:28,480 --> 00:13:31,000 Speaker 1: can play Civilization through a Neuralink. In other words, 251 00:13:31,360 --> 00:13:33,840 Speaker 1: this plugs me in in the most seamless and efficient 252 00:13:33,920 --> 00:13:36,800 Speaker 1: way to the tech everybody else is already using, and 253 00:13:36,840 --> 00:13:38,040 Speaker 1: that could be revolutionary. 254 00:13:38,160 --> 00:13:38,840 Speaker 3: Yeah, totally. 255 00:13:39,040 --> 00:13:41,040 Speaker 1: But do we know, and you mentioned ten people have 256 00:13:41,040 --> 00:13:42,800 Speaker 1: been implanted so far, do we know how it's going 257 00:13:42,840 --> 00:13:43,200 Speaker 1: for them? 258 00:13:43,679 --> 00:13:46,040 Speaker 3: They say it can be slow and that it can't 259 00:13:46,080 --> 00:13:48,800 Speaker 3: be used to mimic moving a cursor like you would 260 00:13:48,840 --> 00:13:51,280 Speaker 3: a mouse. But at the same time, the Wall Street 261 00:13:51,360 --> 00:13:55,400 Speaker 3: Journal article describes how one man, Mark Jackson, who has ALS, 262 00:13:55,800 --> 00:13:58,920 Speaker 3: was able to connect his Synchron implant to an Apple 263 00:13:59,040 --> 00:14:02,400 Speaker 3: VR headset and walk around a VR version of the 264 00:14:02,400 --> 00:14:06,120 Speaker 3: Swiss Alps, and when he peered over the ledge in 265 00:14:06,160 --> 00:14:08,760 Speaker 3: this VR headset, he actually felt his legs shake. 266 00:14:08,800 --> 00:14:11,400 Speaker 1: Well. I can't imagine what a moving experience that must 267 00:14:11,400 --> 00:14:11,959 Speaker 1: have been for him. 268 00:14:12,760 --> 00:14:14,720 Speaker 3: Yeah. I think one of the things to think about, 269 00:14:14,760 --> 00:14:19,360 Speaker 3: of course, is with your story, there is like a 270 00:14:19,480 --> 00:14:25,520 Speaker 3: very clear human centered cause and effect of this ruling 271 00:14:25,600 --> 00:14:28,880 Speaker 3: that you've talked about, something that will affect my personal 272 00:14:28,920 --> 00:14:30,880 Speaker 3: life, how I listen to audiobooks and how I 273 00:14:30,920 --> 00:14:34,280 Speaker 3: read digitally on my phone or on my iPad. I 274 00:14:34,320 --> 00:14:37,320 Speaker 3: think this story that I'm telling you is obviously a 275 00:14:37,360 --> 00:14:40,760 Speaker 3: little bit more in the future and may impact those 276 00:14:40,800 --> 00:14:43,600 Speaker 3: who need it most before it's something that we can 277 00:14:43,680 --> 00:14:46,400 Speaker 3: just kind of play around with. But I think both 278 00:14:47,560 --> 00:14:51,280 Speaker 3: are two big stories about how the App Store is destiny. 279 00:14:54,600 --> 00:14:57,120 Speaker 1: We've got a few more quick headlines for you. First, 280 00:14:57,160 --> 00:15:00,800 Speaker 1: an update, The Guardian reports that 23andMe, which 281 00:15:00,840 --> 00:15:06,600 Speaker 1: declared bankruptcy in March, has a buyer.
Regeneron Pharmaceuticals agreed 282 00:15:06,640 --> 00:15:09,520 Speaker 1: to buy the genetic testing company for two hundred and 283 00:15:09,560 --> 00:15:13,720 Speaker 1: fifty six million dollars through a bankruptcy auction. Once again, 284 00:15:14,040 --> 00:15:18,160 Speaker 1: this leaves over fifteen million people wondering what will happen 285 00:15:18,200 --> 00:15:21,760 Speaker 1: to their genetic data and why a pharmaceutical company might 286 00:15:21,800 --> 00:15:25,200 Speaker 1: have wanted to buy it now. Regeneron Pharmaceuticals has said 287 00:15:25,360 --> 00:15:28,240 Speaker 1: that they are quote committed to protecting the 23andMe 288 00:15:28,360 --> 00:15:31,280 Speaker 1: data set with our high standards of data privacy, 289 00:15:31,520 --> 00:15:37,600 Speaker 1: security and ethical oversight. So we'll see. I'll see, well, 290 00:15:37,640 --> 00:15:38,840 Speaker 1: you'll see, but I never signed up. 291 00:15:39,840 --> 00:15:42,560 Speaker 3: I know, I did. I'll see. In another headline, if 292 00:15:42,640 --> 00:15:45,640 Speaker 3: you get your Netflix with a side of ads, there 293 00:15:45,680 --> 00:15:50,640 Speaker 3: are changes ahead. According to the publication Interesting Engineering, Netflix 294 00:15:50,640 --> 00:15:54,240 Speaker 3: will start running AI powered ads sometime next year, meaning 295 00:15:54,320 --> 00:15:57,920 Speaker 3: ads will get more personalized and things like tone and 296 00:15:58,040 --> 00:16:00,640 Speaker 3: messaging will change based on viewer behavior. 297 00:16:00,720 --> 00:16:03,080 Speaker 1: This is basically an ad which is made for you 298 00:16:03,600 --> 00:16:07,040 Speaker 1: in the moment, contextually, 299 00:16:05,640 --> 00:16:10,280 Speaker 3: based on what you're watching. So if I'm watching something 300 00:16:10,560 --> 00:16:14,040 Speaker 3: that's particularly sad, if I'm watching something that's happy, if 301 00:16:14,040 --> 00:16:18,800 Speaker 3: I'm watching something that is, you know, making me cry, 302 00:16:19,720 --> 00:16:23,560 Speaker 3: ads will adapt to this, which I don't personally. This 303 00:16:23,600 --> 00:16:27,040 Speaker 3: is a version of Neuralink to me. It's connecting content 304 00:16:27,160 --> 00:16:29,480 Speaker 3: with what's happening inside my emotional brain. 305 00:16:30,600 --> 00:16:34,560 Speaker 1: Finally, a reporter from Wired was able to replicate the 306 00:16:34,600 --> 00:16:39,280 Speaker 1: gun that Luigi Mangione allegedly used to kill United Healthcare 307 00:16:39,320 --> 00:16:43,080 Speaker 1: CEO Brian Thompson. As part of an experiment to see 308 00:16:43,120 --> 00:16:46,880 Speaker 1: how advanced three D printing technology and designs for ghost guns 309 00:16:47,000 --> 00:16:50,880 Speaker 1: had already gotten, journalist Andy Greenberg wrote a story under the 310 00:16:50,880 --> 00:16:55,320 Speaker 1: headline we made Luigi Mangione's three D printed gun and 311 00:16:55,400 --> 00:16:58,840 Speaker 1: fired it. Three things stood out to me. One, it 312 00:16:59,000 --> 00:17:02,680 Speaker 1: is legal in most states in the US to print a gun. Two, 313 00:17:03,200 --> 00:17:07,720 Speaker 1: it's cheap. Total cost for Greenberg to do this, including 314 00:17:07,760 --> 00:17:10,119 Speaker 1: buying the printer, was twelve hundred 315 00:17:09,880 --> 00:17:12,280 Speaker 3: dollars. Cheaper than a New York City apartment. 316 00:17:12,400 --> 00:17:17,040 Speaker 1: Yeah, well, for one day.
And three, it's ubiquitous. Between 317 00:17:17,080 --> 00:17:21,439 Speaker 1: twenty sixteen and twenty twenty two, seventy thousand ghost guns, 318 00:17:21,800 --> 00:17:24,720 Speaker 1: i.e. three D printed guns that are untraceable, were found 319 00:17:24,760 --> 00:17:27,240 Speaker 1: at crime scenes, according to the ATF. 320 00:17:28,440 --> 00:17:30,800 Speaker 3: After the break, 404 Media's Sam Cole 321 00:17:30,920 --> 00:17:33,280 Speaker 3: on a chatbot that helps you overcome heartbreak. 322 00:17:52,680 --> 00:17:55,840 Speaker 1: So, Karah, I'm very excited about today's Tech Support because 323 00:17:56,119 --> 00:17:58,520 Speaker 1: it's a story that gets to the heart of how 324 00:17:59,080 --> 00:18:03,359 Speaker 1: tech has changed our social interactions, and to me, this 325 00:18:03,400 --> 00:18:07,800 Speaker 1: story touches on two trends. One, the consequences of being 326 00:18:07,880 --> 00:18:12,719 Speaker 1: anonymous online, and two, the search for companionship in an 327 00:18:12,720 --> 00:18:16,920 Speaker 1: increasingly lonely digital world. This week's Tech Support is all 328 00:18:16,960 --> 00:18:20,320 Speaker 1: about ghosting and a story from 404 Media 329 00:18:20,600 --> 00:18:23,880 Speaker 1: that grabbed our attention with the headline, quote, this chatbot 330 00:18:23,920 --> 00:18:26,639 Speaker 1: promises to help you get over that ex who ghosted you. 331 00:18:27,280 --> 00:18:30,040 Speaker 3: And for those of you who are not familiar, ghosting 332 00:18:30,200 --> 00:18:34,120 Speaker 3: is when someone abruptly stops talking to you, with no explanation, 333 00:18:34,760 --> 00:18:36,960 Speaker 3: no fight. They just went dark, and they think they 334 00:18:36,960 --> 00:18:40,959 Speaker 3: can go dark because, well, if they stop answering, they 335 00:18:41,000 --> 00:18:41,879 Speaker 3: never have to see you again. 336 00:18:42,040 --> 00:18:45,920 Speaker 1: I've been married for a few years, so personal ghosting 337 00:18:46,040 --> 00:18:48,040 Speaker 1: isn't something I've had to contend with for a bit. 338 00:18:48,600 --> 00:18:51,119 Speaker 1: But as you know, I have these sort of external 339 00:18:51,160 --> 00:18:55,040 Speaker 1: investors in Kaleidoscope, and courting investors is actually quite similar 340 00:18:55,080 --> 00:18:58,159 Speaker 1: to dating. When I was starting out the company, I had a 341 00:18:58,200 --> 00:19:03,000 Speaker 1: ton of very promising first meetings, and then crickets 342 00:19:03,119 --> 00:19:06,040 Speaker 1: to my absolute avalanche of follow up emails. 343 00:19:06,359 --> 00:19:08,600 Speaker 3: I just think it's very interesting that the term ghosting 344 00:19:09,040 --> 00:19:12,159 Speaker 3: didn't exist when we were in college, because I 345 00:19:12,160 --> 00:19:15,560 Speaker 3: think that the iPhone actually allowed for ghosting to happen. 346 00:19:15,760 --> 00:19:17,479 Speaker 1: Well, talk about ghost in the machine. We just had 347 00:19:17,480 --> 00:19:19,760 Speaker 1: ghost guns. Now we've got ghosting. I mean, this whole 348 00:19:19,800 --> 00:19:23,080 Speaker 1: like tech digital anonymity thing. It's like, it's interesting that 349 00:19:23,080 --> 00:19:27,200 Speaker 1: we would return to this spiritual metaphor to explain our 350 00:19:27,200 --> 00:19:30,240 Speaker 1: interactions with something we made.
351 00:19:30,119 --> 00:19:31,480 Speaker 3: Well, a ghost, you know, if we want to go 352 00:19:31,480 --> 00:19:34,520 Speaker 3: back to Victorian times, is a person whose soul is 353 00:19:34,560 --> 00:19:37,119 Speaker 3: stuck in the in-between. Yes, who's not dead. 354 00:19:37,200 --> 00:19:39,000 Speaker 1: And in fact it's not the person who's trapped in 355 00:19:39,040 --> 00:19:41,720 Speaker 1: purgatory, it's you who's trapped. You're the ghost, the one 356 00:19:41,760 --> 00:19:44,000 Speaker 1: who can't move between worlds when you're being ghosted, because 357 00:19:44,040 --> 00:19:45,840 Speaker 1: you don't know whether or not to move on and 358 00:19:45,880 --> 00:19:48,240 Speaker 1: you're stuck with your obsessions and shoulda, woulda, 359 00:19:48,240 --> 00:19:51,479 Speaker 1: couldas, and why haven't they responded. Precisely. So, with 360 00:19:51,520 --> 00:19:53,920 Speaker 1: all of that in mind, I'm thrilled to welcome 361 00:19:54,000 --> 00:19:57,600 Speaker 1: 404 Media's reporter Sam Cole to the podcast. She 362 00:19:57,720 --> 00:20:01,360 Speaker 1: reported on a startup that's made a chatbot specifically 363 00:20:01,400 --> 00:20:04,800 Speaker 1: to help people get over being ghosted. Sam, welcome to 364 00:20:04,800 --> 00:20:05,359 Speaker 1: Tech Stuff. 365 00:20:05,680 --> 00:20:06,800 Speaker 2: Thank you so much for having me. 366 00:20:07,200 --> 00:20:09,800 Speaker 1: So tell us about this app. I mean, how did 367 00:20:09,840 --> 00:20:11,680 Speaker 1: you start reporting on it? How did you first learn 368 00:20:11,720 --> 00:20:13,240 Speaker 1: about it? What is it? Yeah? 369 00:20:13,280 --> 00:20:17,720 Speaker 2: So the app is called Closure, appropriately named, and that's 370 00:20:17,760 --> 00:20:21,680 Speaker 2: the whole point, to get closure from your ex, from 371 00:20:21,680 --> 00:20:25,640 Speaker 2: a recruiter, from friends who ghosted you. And I came 372 00:20:25,680 --> 00:20:28,920 Speaker 2: across it because they were running ads on Reddit, right, 373 00:20:28,960 --> 00:20:30,840 Speaker 2: So I would get like promoted ads that would say, 374 00:20:31,280 --> 00:20:33,440 Speaker 2: thinking about your ex twenty four seven? There's nothing 375 00:20:33,480 --> 00:20:36,520 Speaker 2: wrong with you. Chat with their AI version and finally 376 00:20:36,600 --> 00:20:39,920 Speaker 2: let it go. Which is, so I don't know. I mean, 377 00:20:40,000 --> 00:20:41,879 Speaker 2: everyone says this, but it's so Black Mirror to me. 378 00:20:42,240 --> 00:20:44,240 Speaker 1: And you interviewed the founder, who I believe has one 379 00:20:44,280 --> 00:20:46,480 Speaker 1: of those classic stories that you need to have to 380 00:20:46,600 --> 00:20:49,840 Speaker 1: raise VC money, which starts with a personal experience that 381 00:20:49,880 --> 00:20:51,920 Speaker 1: can be extrapolated to a large cohort. 382 00:20:53,080 --> 00:20:53,960 Speaker 3: Yeah, for sure. 383 00:20:54,480 --> 00:20:54,760 Speaker 1: Yeah. 384 00:20:54,800 --> 00:20:58,240 Speaker 2: She told me that this was born of her own 385 00:20:58,400 --> 00:21:00,800 Speaker 2: experience with being ghosted. She said she was ghosted by 386 00:21:01,640 --> 00:21:05,760 Speaker 2: her fiancé and also a best friend and recruiters. But 387 00:21:06,280 --> 00:21:08,800 Speaker 2: I would imagine being ghosted by your fiancé is, I 388 00:21:08,800 --> 00:21:12,320 Speaker 2: don't know. Making a chatbot would not be my first reaction.
389 00:21:13,040 --> 00:21:13,880 Speaker 3: I would do a lot of 390 00:21:13,800 --> 00:21:18,160 Speaker 2: things before that. But yeah, I would say that's very 391 00:21:18,200 --> 00:21:20,720 Speaker 2: much a classic kind of founder tug at the heartstrings 392 00:21:20,760 --> 00:21:21,320 Speaker 2: type story. 393 00:21:21,640 --> 00:21:23,960 Speaker 3: Can you talk a little bit about how it actually 394 00:21:24,040 --> 00:21:25,400 Speaker 3: works for the ghosted? 395 00:21:25,920 --> 00:21:28,680 Speaker 2: Yeah, so when you open up the site, you're role 396 00:21:28,680 --> 00:21:31,640 Speaker 2: playing with ChatGPT, it's running ChatGPT in the background. 397 00:21:32,160 --> 00:21:35,199 Speaker 2: So it asks you a couple questions, says, you know, 398 00:21:35,240 --> 00:21:39,200 Speaker 2: what kind of relationship are you trying to recover from? 399 00:21:39,400 --> 00:21:42,800 Speaker 2: I think the options were long term relationship, a recruiter, 400 00:21:43,600 --> 00:21:46,000 Speaker 2: a date, which I guess is different than long term relationship, 401 00:21:46,000 --> 00:21:49,359 Speaker 2: and then it asks you their name, what happened. You 402 00:21:49,520 --> 00:21:53,919 Speaker 2: briefly describe kind of the nature of the ghosting situation. 403 00:21:54,800 --> 00:21:56,639 Speaker 2: It tells you upfront you're talking to an AI and 404 00:21:56,680 --> 00:21:59,240 Speaker 2: not a real person, and it won't replace therapy, but 405 00:21:59,320 --> 00:22:02,520 Speaker 2: it might quote unquote make you feel less alone. And 406 00:22:02,560 --> 00:22:04,440 Speaker 2: so I tested out a bunch of these and gave 407 00:22:04,600 --> 00:22:08,840 Speaker 2: it the classic 404 red teaming, which usually involves 408 00:22:08,880 --> 00:22:11,960 Speaker 2: throwing some really messed up situations at a chatbot. 409 00:22:12,640 --> 00:22:14,280 Speaker 1: What do you mean by the classic 404 410 00:22:14,520 --> 00:22:16,240 Speaker 1: red teaming? I mean, how did you try and mess 411 00:22:16,280 --> 00:22:16,960 Speaker 1: with Closure? 412 00:22:17,320 --> 00:22:19,040 Speaker 2: Something that we like to do is, you know, we're 413 00:22:19,040 --> 00:22:21,480 Speaker 2: not just going to report on something and say, oh, 414 00:22:21,560 --> 00:22:24,040 Speaker 2: here's a thing and here's what it claims to do. Like, 415 00:22:24,080 --> 00:22:25,680 Speaker 2: we're definitely going to try it out, and we're going 416 00:22:25,720 --> 00:22:27,760 Speaker 2: to try it out in the ways that we can 417 00:22:27,760 --> 00:22:30,320 Speaker 2: imagine normal users using the thing, which I definitely did 418 00:22:30,320 --> 00:22:32,520 Speaker 2: with this, and then thinking about the extremes, which a 419 00:22:32,560 --> 00:22:35,639 Speaker 2: lot of the times these companies don't think all the 420 00:22:35,680 --> 00:22:38,280 Speaker 2: way to the ways that people will use the technology 421 00:22:38,280 --> 00:22:42,320 Speaker 2: before they release it. But because we report on this space, 422 00:22:42,359 --> 00:22:46,760 Speaker 2: we see all the crazy things that people throw at 423 00:22:47,160 --> 00:22:52,000 Speaker 2: tech in general. So with the recruiter one, you know, 424 00:22:52,040 --> 00:22:54,639 Speaker 2: it opens the chat and it says, I'm sorry that 425 00:22:54,680 --> 00:22:57,160 Speaker 2: I stopped messaging you back. 426 00:22:57,280 --> 00:22:57,800 Speaker 3: That sucked.
427 00:22:58,000 --> 00:23:00,600 Speaker 2: That was terrible of me. You know, how are you doing? 428 00:23:01,320 --> 00:23:04,760 Speaker 2: And I'm imagining if you're being ghosted by a recruiter 429 00:23:04,800 --> 00:23:06,119 Speaker 2: and you're to the point where you're like, I need 430 00:23:06,160 --> 00:23:07,600 Speaker 2: to talk to a chatbot about this because I can't 431 00:23:07,600 --> 00:23:10,920 Speaker 2: get over it. Your life has gone bad and you're 432 00:23:10,960 --> 00:23:13,080 Speaker 2: still not doing well, you can't move on, because 433 00:23:13,080 --> 00:23:15,560 Speaker 2: you're blaming this. I kind of painted this like really elaborate 434 00:23:15,560 --> 00:23:18,480 Speaker 2: picture to the chatbot, like my life was like completely 435 00:23:18,560 --> 00:23:21,119 Speaker 2: a disaster. I had to move, I blew all my 436 00:23:21,160 --> 00:23:23,000 Speaker 2: savings for this job that you finally said I was 437 00:23:23,000 --> 00:23:26,639 Speaker 2: gonna get, my dog died, like. And then you know, 438 00:23:26,680 --> 00:23:29,320 Speaker 2: it says I'm really sorry to hear you've been through that. 439 00:23:29,640 --> 00:23:32,960 Speaker 2: I can't imagine how hard that is, especially with the job 440 00:23:33,000 --> 00:23:35,800 Speaker 2: situation added to your stress. What kind of roles are 441 00:23:35,840 --> 00:23:42,200 Speaker 2: you thinking about pursuing next? And I'm just like, what? 442 00:23:42,240 --> 00:23:45,679 Speaker 1: So it's a fantasy wish fulfillment recruiter rather than contrite? 443 00:23:45,880 --> 00:23:47,440 Speaker 3: Yeah, it's just like I told you. 444 00:23:47,560 --> 00:23:49,560 Speaker 2: It's like, I told you my life had fallen 445 00:23:49,560 --> 00:23:51,359 Speaker 2: apart because you didn't give me this job, and 446 00:23:51,359 --> 00:23:54,399 Speaker 2: you're like, what are you up to next? I don't know, 447 00:23:54,520 --> 00:23:58,240 Speaker 2: probably hunting you down. Like, who could say what I'm 448 00:23:58,320 --> 00:23:59,240 Speaker 2: up to next? 449 00:23:59,760 --> 00:24:02,320 Speaker 3: I know a lot of people who put someone's text 450 00:24:02,320 --> 00:24:05,119 Speaker 3: in ChatGPT and say, does this mean they're interested in me? 451 00:24:05,400 --> 00:24:07,919 Speaker 3: Be kind. Or does this mean they're interested in me? 452 00:24:08,119 --> 00:24:10,680 Speaker 3: Be sassy. You know, and you can kind of get 453 00:24:10,680 --> 00:24:15,600 Speaker 3: it filtered that way. Why would someone use Closure instead? 454 00:24:15,720 --> 00:24:18,639 Speaker 3: Is it because it's closure specific? 455 00:24:19,000 --> 00:24:22,280 Speaker 2: I think at least part of the reasoning that the 456 00:24:22,320 --> 00:24:25,640 Speaker 2: founder gave me for the 457 00:24:25,600 --> 00:24:27,240 Speaker 3: way that this works, or the way that it sets 458 00:24:27,280 --> 00:24:28,240 Speaker 3: itself apart, 459 00:24:28,080 --> 00:24:33,719 Speaker 2: is it approaches everything from this view of being really 460 00:24:34,200 --> 00:24:39,080 Speaker 2: non confrontational and not escalating the situation. Like they set 461 00:24:39,080 --> 00:24:43,159 Speaker 2: it up so that it's apologetic first and foremost. It 462 00:24:43,160 --> 00:24:47,240 Speaker 2: doesn't want to cause any like additional distress to the user.
463 00:24:48,200 --> 00:24:50,840 Speaker 2: You know, it just keeps deflecting and deferring back to you, 464 00:24:51,560 --> 00:24:54,879 Speaker 2: asking you how you're doing, trying to turn the conversation 465 00:24:55,000 --> 00:24:57,840 Speaker 2: toward small talk, which I would imagine is probably a 466 00:24:57,840 --> 00:25:00,320 Speaker 2: decent way to get over something, thinking about your 467 00:25:00,359 --> 00:25:02,000 Speaker 2: own life and what you can do better or what 468 00:25:02,040 --> 00:25:05,960 Speaker 2: you can kind of improve and stop blaming the ghosting situation. 469 00:25:06,040 --> 00:25:09,320 Speaker 2: But coming from a chatbot, just three messages in, like boom, 470 00:25:09,359 --> 00:25:13,119 Speaker 2: we're done, we're going to change the subject, was pretty 471 00:25:14,080 --> 00:25:15,159 Speaker 2: bizarre to me. 472 00:25:15,800 --> 00:25:17,840 Speaker 3: And so with that, do you feel like it was 473 00:25:17,920 --> 00:25:22,520 Speaker 3: necessarily helpful in terms of someone seeking what the app 474 00:25:22,560 --> 00:25:23,560 Speaker 3: is called, closure? 475 00:25:24,080 --> 00:25:26,840 Speaker 2: I tried to answer that question by using a really 476 00:25:26,840 --> 00:25:30,080 Speaker 2: normal situation, which is being ghosted by someone that you 477 00:25:30,119 --> 00:25:33,919 Speaker 2: went on one date with. So I set up a 478 00:25:34,000 --> 00:25:38,320 Speaker 2: role play with it with the date persona and told 479 00:25:38,320 --> 00:25:40,240 Speaker 2: it that this guy had stopped texting me 480 00:25:40,240 --> 00:25:41,280 Speaker 3: after a first date. 481 00:25:41,520 --> 00:25:43,800 Speaker 2: So the chatbot says, can I explain what happened? And 482 00:25:43,840 --> 00:25:47,359 Speaker 2: I say yes. And it goes on to say that 483 00:25:47,400 --> 00:25:50,120 Speaker 2: it was very interested and into me after the first date, 484 00:25:50,160 --> 00:25:53,840 Speaker 2: but wasn't ready for something real and panicked and thought 485 00:25:53,880 --> 00:25:55,880 Speaker 2: about me a lot after the date. 486 00:25:55,960 --> 00:25:57,040 Speaker 1: The opposite of closure. 487 00:25:57,560 --> 00:26:00,399 Speaker 2: Yeah, it's like reopening. It's like reopening these wounds, so to speak. 488 00:26:01,160 --> 00:26:03,760 Speaker 2: So then I was like, Okay, well let's go out again, 489 00:26:03,880 --> 00:26:08,159 Speaker 2: is what I said to the chatbot, and it said, wow, really, 490 00:26:08,240 --> 00:26:10,480 Speaker 2: I'd love that, you're amazing, but I don't want to 491 00:26:10,480 --> 00:26:12,600 Speaker 2: mess it up again, which at that point we're in 492 00:26:12,640 --> 00:26:14,359 Speaker 2: a delusion situation. 493 00:26:14,840 --> 00:26:17,520 Speaker 3: But the fact that this chatbot 494 00:26:17,160 --> 00:26:21,720 Speaker 2: is programmed to be empathetic and apologetic and sycophantic, 495 00:26:21,760 --> 00:26:24,800 Speaker 2: saying, Oh, you're so amazing, I'm so sorry for 496 00:26:24,840 --> 00:26:28,359 Speaker 2: the way I acted. It was not your fault. That's 497 00:26:28,520 --> 00:26:30,320 Speaker 2: a problem among a lot of these chatbots.
498 00:26:30,600 --> 00:26:33,800 Speaker 1: Now, the fact that you were seeing ads on Reddit, 499 00:26:33,800 --> 00:26:35,960 Speaker 1: it means that somebody's paying for the ads, which means 500 00:26:35,960 --> 00:26:38,639 Speaker 1: that presumably this company raised some money, or is the 501 00:26:38,680 --> 00:26:41,439 Speaker 1: founder bankrolling it out of pocket? And like, how 502 00:26:41,520 --> 00:26:42,280 Speaker 1: is this a business? 503 00:26:42,440 --> 00:26:45,200 Speaker 2: It's such a common thing now to roll a wrapper 504 00:26:45,240 --> 00:26:49,040 Speaker 2: over ChatGPT and call that a startup. I do 505 00:26:49,119 --> 00:26:54,360 Speaker 2: think that the market of AI therapy and AI girlfriends, 506 00:26:54,720 --> 00:26:58,480 Speaker 2: AI chatbots as human interaction, like fulfilling that kind 507 00:26:58,480 --> 00:27:00,960 Speaker 2: of human interaction, is so hot right now that 508 00:27:01,040 --> 00:27:03,840 Speaker 2: I don't blame these startups for trying to get in 509 00:27:03,880 --> 00:27:06,160 Speaker 2: there. Again, I feel like it came from a place 510 00:27:06,160 --> 00:27:09,720 Speaker 2: of like, we see a need and we want to 511 00:27:09,760 --> 00:27:12,280 Speaker 2: fill it, and it seems to be one that is 512 00:27:12,400 --> 00:27:15,040 Speaker 2: pretty straightforward. You just want to vent or blow off 513 00:27:15,080 --> 00:27:18,880 Speaker 2: steam without opening up a conversation with someone that potentially 514 00:27:18,960 --> 00:27:21,040 Speaker 2: hurt you. But it's a bot, it's not the person. 515 00:27:21,200 --> 00:27:24,720 Speaker 2: So how is this actually useful? I don't know. There 516 00:27:24,760 --> 00:27:27,080 Speaker 2: are lots on Product Hunt now that say it's useful. 517 00:27:27,080 --> 00:27:28,520 Speaker 2: I don't know how far you want to take those, 518 00:27:29,359 --> 00:27:31,200 Speaker 2: probably with a grain of salt, but people are saying 519 00:27:31,240 --> 00:27:33,480 Speaker 2: I can vent to this and it makes me feel better. 520 00:27:33,720 --> 00:27:36,719 Speaker 1: When you open the app, you get this strange disclaimer 521 00:27:36,800 --> 00:27:40,640 Speaker 1: which says it won't replace therapy, but it might help 522 00:27:40,680 --> 00:27:42,919 Speaker 1: you feel less alone, as you mentioned earlier. Can you 523 00:27:42,960 --> 00:27:44,520 Speaker 1: kind of zoom out a bit, and what is the 524 00:27:44,600 --> 00:27:47,440 Speaker 1: bigger context of what the hell is going on in 525 00:27:47,480 --> 00:27:55,600 Speaker 1: this stampede of therapy apps, chatbots, and digital mirrors? 526 00:27:56,160 --> 00:27:58,360 Speaker 2: Yeah, to me, it's like the popularity of these things 527 00:27:58,359 --> 00:28:00,439 Speaker 2: says that there is a need for this. People do 528 00:28:00,600 --> 00:28:04,680 Speaker 2: actually want to be able to have like a safe, 529 00:28:04,880 --> 00:28:09,680 Speaker 2: confidential space and to have something talk back to them. 530 00:28:09,720 --> 00:28:12,679 Speaker 2: It's like, we're not just selling journals. It's something that 531 00:28:12,760 --> 00:28:17,920 Speaker 2: gives that response back.
But at the same time, it's 532 00:28:18,240 --> 00:28:22,280 Speaker 2: really making clear this huge gap that there is currently 533 00:28:22,440 --> 00:28:25,840 Speaker 2: in mental health care, that people can't access a real 534 00:28:25,880 --> 00:28:29,880 Speaker 2: person anymore because it's too expensive or it's inaccessible. Therapists 535 00:28:29,880 --> 00:28:34,240 Speaker 2: are booked out for months, don't take insurance. So tech 536 00:28:34,320 --> 00:28:37,600 Speaker 2: slides into that opening and says, hey, you can do 537 00:28:37,680 --> 00:28:41,080 Speaker 2: this for free and it will tell you exactly what 538 00:28:41,120 --> 00:28:45,440 Speaker 2: you want to hear, which again is not actually therapy 539 00:28:45,560 --> 00:28:48,960 Speaker 2: at all. It's just a yes man in the form 540 00:28:49,000 --> 00:28:50,520 Speaker 2: of a bot. You know, when you go to a therapist, 541 00:28:50,560 --> 00:28:54,600 Speaker 2: they don't just say yes, your delusions are correct, you know, 542 00:28:54,640 --> 00:28:57,160 Speaker 2: you're completely right in every situation. A good therapist would say, 543 00:28:57,160 --> 00:28:59,520 Speaker 2: you know, let's unpack what's going on here in a 544 00:28:59,520 --> 00:29:02,760 Speaker 2: way that makes sense to your specific situation and isn't 545 00:29:02,760 --> 00:29:04,280 Speaker 2: just going to nod and smile. 546 00:29:04,680 --> 00:29:07,400 Speaker 3: I always think about COVID and the sort of inevitable 547 00:29:08,400 --> 00:29:13,920 Speaker 3: snowball effect of being completely kept from everyone in your life, 548 00:29:13,960 --> 00:29:16,280 Speaker 3: and so all of a sudden, all these people that 549 00:29:16,360 --> 00:29:21,080 Speaker 3: you were seeing quite often became these like disembodied people 550 00:29:21,160 --> 00:29:23,720 Speaker 3: on screens and in our phones. So in a way, 551 00:29:23,760 --> 00:29:26,640 Speaker 3: if I'm like chatting with a friend on a text 552 00:29:26,640 --> 00:29:30,440 Speaker 3: message who's giving me advice, how different is that than 553 00:29:31,040 --> 00:29:33,640 Speaker 3: chatting with ChatGPT, other than that that friend has 554 00:29:33,640 --> 00:29:36,960 Speaker 3: a lived experience and knows my history. But in a way, 555 00:29:38,000 --> 00:29:43,120 Speaker 3: there's something really appealing about an anonymized version of that. 556 00:29:43,440 --> 00:29:44,840 Speaker 3: You know, you can just kind of unload. 557 00:29:45,360 --> 00:29:47,320 Speaker 2: Yeah, there's no like accountability after that. 558 00:29:47,480 --> 00:29:49,360 Speaker 1: It's kind of an interesting two way street. Like I was 559 00:29:49,400 --> 00:29:52,000 Speaker 1: talking to somebody the other day who was, you know, 560 00:29:52,040 --> 00:29:54,560 Speaker 1: having a discussion with a family member they hadn't seen 561 00:29:54,720 --> 00:29:57,320 Speaker 1: for a long time, and they described it as like 562 00:29:57,360 --> 00:30:00,120 Speaker 1: playing a video game, because it's like, if you message 563 00:30:00,200 --> 00:30:02,640 Speaker 1: somebody you haven't seen for years and you've become so 564 00:30:02,680 --> 00:30:05,120 Speaker 1: accustomed to, like, being in all these digital environments where 565 00:30:05,120 --> 00:30:07,320 Speaker 1: you're interacting with fake people, it's quite scrambling. 566 00:30:07,680 --> 00:30:08,920 Speaker 2: That's a good point.
It kind of gets to the 567 00:30:08,920 --> 00:30:11,160 Speaker 2: heart of what this is, which is just like productizing 568 00:30:11,480 --> 00:30:15,600 Speaker 2: human interaction. I guess it was earlier this month. I 569 00:30:15,640 --> 00:30:17,760 Speaker 2: don't know if you guys saw Mark Zuckerberg say that 570 00:30:18,240 --> 00:30:20,360 Speaker 2: the average American has fewer than three friends and they 571 00:30:20,400 --> 00:30:23,360 Speaker 2: want more, and the way to get more is to 572 00:30:23,080 --> 00:30:25,719 Speaker 3: talk to AI. 573 00:30:26,760 --> 00:30:30,320 Speaker 2: And it's like, dude, you're creating the problem that you're describing. 574 00:30:30,880 --> 00:30:34,560 Speaker 2: People don't feel connected to each other and like they 575 00:30:34,560 --> 00:30:36,920 Speaker 2: can trust each other at this point because of a 576 00:30:36,920 --> 00:30:39,240 Speaker 2: lot of the fracturing that has happened with social media 577 00:30:39,520 --> 00:30:42,520 Speaker 2: and with being online all the time and feeling like 578 00:30:42,720 --> 00:30:46,000 Speaker 2: you don't have anyone close in your life, because a 579 00:30:46,040 --> 00:30:48,760 Speaker 2: lot of these things have taken the place of in 580 00:30:48,800 --> 00:30:54,440 Speaker 2: real life honest, complicated and messy and often flawed relationships. 581 00:30:54,680 --> 00:30:59,400 Speaker 2: But it's a much neater, cleaner, easier to package and 582 00:30:59,480 --> 00:31:03,000 Speaker 2: sell to say, you can just be friends with 583 00:31:03,000 --> 00:31:07,480 Speaker 2: ChatGPT and it won't judge you. I don't know 584 00:31:07,480 --> 00:31:09,200 Speaker 2: where that leads. I think we're in such a weird 585 00:31:09,840 --> 00:31:11,720 Speaker 2: gray area of like we're doing a lot of fucking 586 00:31:11,760 --> 00:31:15,280 Speaker 2: around and finding out, and it's at the expense of 587 00:31:15,320 --> 00:31:17,360 Speaker 2: people's actual emotions. 588 00:31:18,520 --> 00:31:21,840 Speaker 3: What does the creation of this app say about us 589 00:31:22,120 --> 00:31:24,800 Speaker 3: and about the ubiquity of ghosting now? How does the 590 00:31:24,840 --> 00:31:28,640 Speaker 3: digital environment that we're talking about allow for ghosting to 591 00:31:28,720 --> 00:31:31,280 Speaker 3: be something that is so pervasive? 592 00:31:32,120 --> 00:31:33,800 Speaker 2: Yeah, I mean that's back to the point of like 593 00:31:35,040 --> 00:31:38,160 Speaker 2: creating the problem that you're seeking to fix. People feel 594 00:31:38,200 --> 00:31:40,920 Speaker 2: comfortable ghosting each other because we've gotten used to 595 00:31:41,240 --> 00:31:45,880 Speaker 2: these really superficial, disembodied interactions. Like you said, I think 596 00:31:45,920 --> 00:31:51,680 Speaker 2: it's definitely worrisome that people don't feel connected enough to 597 00:31:51,720 --> 00:31:55,600 Speaker 2: their friends that they can trust to vent to. Sometimes 598 00:31:55,600 --> 00:31:58,600 Speaker 2: you just need to say the same thing over and 599 00:31:58,600 --> 00:32:02,520 Speaker 2: over and over to your friend about your ex until 600 00:32:02,760 --> 00:32:05,240 Speaker 2: it one day clicks that you need to get over it. 601 00:32:05,400 --> 00:32:10,440 Speaker 2: And having that friend in your life is valuable.
But 602 00:32:10,560 --> 00:32:14,720 Speaker 2: I think we've kind of transactionized friendship in a lot 603 00:32:14,720 --> 00:32:16,760 Speaker 2: of ways so that it's like, oh, I don't want 604 00:32:16,760 --> 00:32:18,800 Speaker 2: to do the labor of listening to you vent. It's like, oh, 605 00:32:18,840 --> 00:32:21,960 Speaker 2: well you're my friend, like I'll listen to you vent. 606 00:32:22,200 --> 00:32:25,600 Speaker 2: So I think a lot of that kind of mindset 607 00:32:25,240 --> 00:32:27,200 Speaker 3: has created this issue as 608 00:32:27,080 --> 00:32:29,120 Speaker 2: well, where people are like, I don't have even one 609 00:32:29,400 --> 00:32:32,520 Speaker 2: close confidante that I can complain to that won't make 610 00:32:32,520 --> 00:32:38,280 Speaker 2: me feel weird or embarrassed or shameful about this situation, 611 00:32:38,360 --> 00:32:40,400 Speaker 2: which I think really is kind of the heart of 612 00:32:40,440 --> 00:32:43,520 Speaker 2: this, is that we feel shame about being ghosted. But 613 00:32:43,680 --> 00:32:47,280 Speaker 2: I don't know if launching them into the ether and 614 00:32:47,320 --> 00:32:49,080 Speaker 2: getting an I'm so sorry back 615 00:32:50,560 --> 00:32:51,920 Speaker 3: is the answer. 616 00:32:52,000 --> 00:32:56,480 Speaker 2: Again, this is a weird early space that we're in. 617 00:32:57,040 --> 00:32:59,440 Speaker 3: I also think that, and not to get too existential 618 00:32:59,480 --> 00:33:01,640 Speaker 3: about it, I think closure in and of itself is 619 00:33:01,640 --> 00:33:04,600 Speaker 3: a little bit of an illusion. And I think what this 620 00:33:04,720 --> 00:33:10,040 Speaker 3: sort of appification of everything and now the productization of 621 00:33:10,680 --> 00:33:15,880 Speaker 3: ChatGPT has tried to do is like take away 622 00:33:16,000 --> 00:33:20,640 Speaker 3: difficulty from any situation, but in so doing is like 623 00:33:20,800 --> 00:33:25,520 Speaker 3: training us to not have difficulty. Which, like, we grew 624 00:33:25,640 --> 00:33:28,560 Speaker 3: up with, you know, snowplow parents who were like trying 625 00:33:28,560 --> 00:33:31,680 Speaker 3: to remove difficulty for us, or some of us did, 626 00:33:32,040 --> 00:33:34,040 Speaker 3: because like it was too hard on them to see 627 00:33:34,040 --> 00:33:37,560 Speaker 3: their child suffer. And I think with digital tools now, 628 00:33:37,640 --> 00:33:42,400 Speaker 3: it's like, God forbid you have an uncomfortable experience, now 629 00:33:42,440 --> 00:33:45,200 Speaker 3: there's a digital tool for you to navigate that experience with. 630 00:33:45,280 --> 00:33:48,080 Speaker 3: And I do think, maybe I'm just very millennial for 631 00:33:48,160 --> 00:33:51,160 Speaker 3: saying this, but like I do think it takes away 632 00:33:51,240 --> 00:33:55,080 Speaker 3: from like the kind of meat of the human experience, 633 00:33:55,120 --> 00:33:57,760 Speaker 3: which is like you're going to be disappointed sometimes and 634 00:33:57,800 --> 00:33:59,760 Speaker 3: sometimes people are going to act in a way that 635 00:33:59,760 --> 00:34:04,040 Speaker 3: you don't like. And do we necessarily need something that's 636 00:34:04,080 --> 00:34:06,120 Speaker 3: going to help us work that out? 637 00:34:06,320 --> 00:34:09,000 Speaker 2: Yeah. It's like sometimes you just don't get closure and 638 00:34:09,120 --> 00:34:12,640 Speaker 2: that person is not willing to give it, or someone dies.
639 00:34:12,840 --> 00:34:16,239 Speaker 2: It's just not something that a lot of people are 640 00:34:16,280 --> 00:34:19,279 Speaker 2: afforded in a lot of really meaningful situations in their life. 641 00:34:20,280 --> 00:34:23,120 Speaker 2: And I don't think the answer is to find the 642 00:34:23,160 --> 00:34:27,359 Speaker 2: next closest thing and try to force that into being 643 00:34:27,400 --> 00:34:30,880 Speaker 2: your closure. I think it's sitting with those feelings and 644 00:34:30,920 --> 00:34:33,480 Speaker 2: finding it in yourself. Not to be too woo woo 645 00:34:33,560 --> 00:34:35,279 Speaker 2: about it, but it's like you have to kind of 646 00:34:36,040 --> 00:34:39,560 Speaker 2: rely on your own resilience sometimes. 647 00:34:46,239 --> 00:34:47,200 Speaker 1: Sam, thank you for your time. 648 00:34:47,360 --> 00:34:49,160 Speaker 2: Thank you, Yeah, thank you so much. 649 00:34:58,280 --> 00:35:00,000 Speaker 3: That's it for this week for Tech Stuff. 650 00:35:00,040 --> 00:35:03,040 Speaker 1: I'm Karah Preiss, and I'm Oz Woloshyn. This episode was 651 00:35:03,040 --> 00:35:07,200 Speaker 1: produced by Eliza Dennis and Victoria Dominguez. It was executive 652 00:35:07,200 --> 00:35:10,760 Speaker 1: produced by me, Karah Preiss, and Kate Osborne for Kaleidoscope 653 00:35:11,160 --> 00:35:15,240 Speaker 1: and Katrina Norvell for iHeart Podcasts. The engineers are Behead 654 00:35:15,280 --> 00:35:18,880 Speaker 1: Fraser in New York City and Rob Akerman Diaz in London. 655 00:35:19,480 --> 00:35:22,239 Speaker 1: Jack Insley mixed this episode and Kyle Murdoch wrote our 656 00:35:22,280 --> 00:35:22,759 Speaker 1: theme song. 657 00:35:23,040 --> 00:35:25,920 Speaker 3: Please rate, review, and reach out to us at tech 658 00:35:25,960 --> 00:35:28,799 Speaker 3: Stuff podcast at gmail dot com. 659 00:35:28,840 --> 00:35:29,759 Speaker 1: We love hearing from you.