Speaker 1: Welcome to Tech Stuff. I'm Oz Woloshyn. Today is all about biometric data and how it's increasingly become a part of modern life. We unlock phones with our faces, send DNA samples to learn about our ancestry, track our workouts, and you can even pay for groceries now with the palm of your hand. But as people get more and more comfortable using their highly personal data to seamlessly and efficiently move through life, questions arise, like: what's being done with all this information? And are the benefits of using biometric technology worth the costs? Here to walk us through all our questions is Adam Clark Estes. He's a senior technology correspondent at Vox. Adam, welcome to Tech Stuff.
Speaker 2: Hey, thanks for having me.
Speaker 1: Thank you so much for coming. I love your newsletter, and I was struck by a recent piece you wrote that began with a visit to the doctor. Can you tell the story?
Speaker 2: Sure. So, my doctor is at NYU Langone, which is a huge healthcare system in New York City, and I walked in one day and there was a little Amazon-branded scanner next to the little kiosk where you check in. The idea was that I would put my palm over this sensor and it would scan it, and that's how it would verify my identity. I didn't do that. There was an option to opt out of it and go and visit the desk clerk, which is what the doctor's office doesn't want you to do, because it takes more time for them and for their staff to verify you, and the tech-based way of doing it is a lot quicker. But it also just raised a lot of questions. Like, I didn't know what I was giving up when I gave my palm print to Amazon. I didn't know if that would have anything to do with the ads I saw on Amazon.com. And then I talked to some privacy experts. I interviewed three or four people for this story, and ultimately the message was that it's really hard to say what is going to happen to your data long term.
Amazon has a lot of businesses. They have a lot of data about you already, and if they can add more, especially this biometric data, which, to be clear, gives proof that you were at a place physically, and that's kind of hard to do otherwise, because even just your phone pinging a cell tower doesn't prove that you were somewhere. But if your palm was scanned at a place, that place knows you were there, and if it's a doctor's office, you can kind of figure things out.
Speaker 1: A digital DNA, effectively.
Speaker 2: Yeah, right. And then, you know, I did talk to Amazon quite a lot about this. They really tried hard to make it clear to me that the data being collected for the Amazon One biometric system was not shared with other Amazon businesses. It's much less clear in the privacy policy, and I did talk to lawyers who basically argued that there's some legalese to work around here that gives Amazon some wiggle room. But that said, I frankly kind of believe the company that they're not doing anything else with this data right now; it's just hard to say what they'll do with it in the future.
Speaker 1: Did you ask them any questions at the doctor's office? Obviously, patients in medical settings are reasonably quite concerned about their privacy.
Speaker 2: I did. I talked to NYU Langone, and they assured me that Amazon was not collecting any health data about you. You know, the sort of interaction you'd have with your doctor in the doctor's office is covered by HIPAA laws, which are actually quite narrow and specific to your health data. But everything else, sort of like you being at the doctor's office, is not protected by HIPAA in any case. NYU Langone Health actually had palm scanners before Amazon. They never worked, but there's just something much more foreboding about seeing that Amazon logo greet you when you're at the doctor's office. So I ended up, at the end of this reporting experience, back at the doctor's office, and I still did not put my palm on the scanner.
Speaker 1: I remember when they switched, in airports, from metal detectors to, like, full body visualization scanners. There was a time when I resisted and asked for the pat-down, but after about a year of extremely unpleasant experiences, I was just like, you know what, let me just go through the machine that is now technically optional but practically mandatory. When you see something like that in a doctor's office, as a technology correspondent, what's your sense of its kind of inevitability?
Speaker 2: Inevitability is a great word, I think, especially with biometrics, because for the companies that are pushing this technology to us, there are so many upsides. There are so many upsides for Amazon to sell this service to businesses. So Amazon One is the name of the technology, and it's a new business within Amazon's AWS business. So, you know, it's like a Russian nesting doll of businesses. And Amazon says they're all separate, but still, you're going to start seeing these Amazon scanners pop up. They're already in Whole Foods, which Amazon owns. They're popping up in doctors' offices, like at NYU Langone Health, and you'll see them at stadiums. And it will be annoying enough for people to work around it that I think a lot of people will just figure that, whatever hang-ups they had about their privacy, it's easier to just go ahead and comply and have their palm scanned, and frankly, it'll save them some time. But it makes me feel uneasy.
Speaker 1: And what's their wider strategy? Why do they want to take ownership of this biometric scanning technology?
Speaker 2: There's a bit of a race, I think, to come up with a good way to verify our identities in a digital world. The New Yorker cartoon of a dog looking at a computer, and the caption is, "On the internet, nobody knows you're a dog." I love that one. It's very easy to mimic somebody else online and increasingly difficult to prove that you are who you are, even in the United States.
Kind of our gold standard identification method is our Social Security number, which is not very secure at all. It's just a string of numbers. So there are a lot of tech companies who want to get at this problem. Amazon's solution, I think, is this palm print, because it is a way of creating a unique identifier for you. So your palm is basically the key, and what the scanner does is kind of create a lock that it can fit into, so that the next time you scan your palm, if the data from that hand matches that kind of digital lock, it confirms that you are who you are, and you get into wherever you're going.
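Amazon hasn't published the internals of Amazon One, but the key-and-lock flow Estes describes matches the general enroll-then-verify pattern in biometrics: enrollment reduces a scan to a stored numeric template, and each later scan is reduced the same way and compared against that template with a similarity threshold. Here's a minimal sketch of that pattern, with a hypothetical `embed_palm` function standing in for a real palm-feature model and an illustrative threshold:

```python
import numpy as np

def embed_palm(scan: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a real palm-feature model: maps a raw
    scan to a fixed-length unit vector (the 'key')."""
    v = scan.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def enroll(scan: np.ndarray) -> np.ndarray:
    """Enrollment: store a numeric template (the 'lock'), not the raw image."""
    return embed_palm(scan)

def verify(template: np.ndarray, scan: np.ndarray, threshold: float = 0.95) -> bool:
    """Verification: a fresh scan matches if its vector is close enough
    to the stored template (cosine similarity above a tuned threshold)."""
    return float(np.dot(template, embed_palm(scan))) >= threshold

rng = np.random.default_rng(0)
palm = rng.random((64, 64))                                      # pretend first visit
template = enroll(palm)
print(verify(template, palm + rng.normal(0, 0.01, palm.shape)))  # same hand -> True
print(verify(template, rng.random((64, 64))))                    # different hand -> False
```

This framing also makes the permanence problem Estes raises in the next answer concrete: a leaked template can't be rotated the way a password can.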
Speaker 1: Why did you find this more concerning, or more worthy of writing about, than the thing we're all accustomed to, which is opening our iPhones with our faces? Like, what's the difference between what Apple are doing and what Amazon are doing in this respect?
Speaker 2: I'm a fan of Face ID, and I had been a fan of Touch ID, because I know that the processing that happens is happening on my device. That's Apple's big commitment to privacy: it's not sending that data up to the cloud. Whereas Amazon One is literally run by Amazon's cloud computing business, so all this data kind of has the chance to get out there. And the trouble with biometric data is that it's pretty permanent. I can't go out and get a new hand if the details of my palm print get out there. It's not like changing a password.
Speaker 1: I mean, when you think about the wider Amazon ecosystem, which is shopping, video, now verifying identity in a healthcare setting, what's the nightmare scenario for you here?
Speaker 2: The nightmare scenario for me online, at any time, is identity theft. There's so much of my information online; if someone could pretend they were me, they could really do anything that I wouldn't want done, like open a bank account or buy a new car. And when it comes to healthcare, it's the most private data I have, and I think that details about my health, now or in the future, could end up in the wrong hands, or even in the right hands, and could have a bad outcome for me. I don't necessarily want my health insurance company to know everything that's happening with me all the time. I don't necessarily want my employer to know what's happening. But at the same time, a lot of what has always kind of pushed me towards privacy online is a gut feeling that technology is increasingly encroaching upon our privacy and kind of changing the definition of what is publicly available information.
Speaker 1: Yeah, you also had an experience recently elsewhere in the Amazon ecosystem that had to do with health and being recommended prescription drugs.
Speaker 2: The experience I had with prescription drugs and Amazon is a great example of something that just felt wrong and uncomfortable. It was a feeling I didn't like having. I ordered groceries through Amazon Fresh and went to check out, and at the bottom it was recommending me a list of prescription drugs. And I had no idea how Amazon would know what kinds of prescription drugs I might be interested in, or might apply to me, or even why I might be prompted at that specific moment to buy them from Amazon. I talked to Amazon again about that, and Amazon told me that the system was working as it should. It had looked at what I purchased, and based on what other Amazon customers bought from Amazon, had recommended something it thought I would like. In this case, I'll be specific: I got the reduced-fat version of my coffee creamer, and so it recommended a statin to me, to lower my cholesterol. So I could see, kind of on the back end, how that would happen, but it's still just sort of an uncanny experience.
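Amazon's production recommender is proprietary, but the behavior described here, suggesting what other customers who bought your items also bought, is classic item-to-item collaborative filtering, which can surface a statin from purchase data alone, no health information involved. A toy sketch with invented basket data:

```python
from collections import Counter

# Hypothetical purchase histories (one basket per customer).
baskets = [
    {"reduced-fat creamer", "statin", "oatmeal"},
    {"reduced-fat creamer", "statin"},
    {"reduced-fat creamer", "coffee"},
    {"whole-milk creamer", "coffee"},
]

def recommend(my_basket: set, baskets: list, top_n: int = 3) -> list:
    """Count items co-purchased with anything in my basket, then
    suggest the most frequent ones I don't already have."""
    counts = Counter()
    for other in baskets:
        if my_basket & other:                 # basket overlaps with mine
            counts.update(other - my_basket)  # credit the items I lack
    return [item for item, _ in counts.most_common(top_n)]

print(recommend({"reduced-fat creamer"}, baskets))
# -> ['statin', 'oatmeal', 'coffee']  (statin co-occurs most often)
```

Scale the same co-occurrence logic up to millions of baskets and it's enough to pair reduced-fat creamer with cholesterol medication.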
Speaker 1: This is like if you go to the Walgreens, you know, around the corner, and you're reaching into the refrigerator to get a low-fat yogurt, and the pharmacist runs out and tries to sell you some statins, right? I mean, that would be pretty weird in real life.
Speaker 2: It would be, but now that you put it like that, that's sort of exactly what happened. Amazon has a newer pharmacy business that it wants people to sign up for, and if I'm reminded that I might want to take a statin, then maybe I would sign up and be like, oh yeah, it would be more convenient to get my prescriptions through Amazon, too. So I think it's marketing, and a lot of these recommendations are pushing you towards other Amazon businesses, to spend more time and money on Amazon, and that's why Amazon has a huge, multi-billion-dollar advertising business.
Speaker 1: What does all of this tell us about Amazon's wider ambitions in healthcare?
Speaker 2: We've known for some time that Amazon wants to be a healthcare company. And it's funny, ahead of this experiment, years ago, I actually wrote a whole story about whether Amazon wants to have an Amazon Prime for healthcare, and then kind of what I had been reporting on in the story ended up happening. Amazon bought a healthcare company. They rolled out Amazon One. Amazon launched and then expanded its pharmacy business, and Amazon is getting into telehealth as well, through the company that it bought, One Medical. It's very clear to me that Amazon wants to be in the healthcare business, I think because it's a good business to be in. Just like everybody needs groceries, everybody needs healthcare, and there's a lot of money to be made. And in this context, I also have to think about how much Amazon knows about us, from my own experience where it recommended a statin to me.
Amazon knows the groceries that I'm buying, the books that I'm reading, the clothes that I'm wearing a lot of the time. So all that data can be leveraged into a business that can help Amazon grow and make more money. So why wouldn't they want to be in the healthcare business?
Speaker 1: After the break: how much health tracking is too much health tracking? Stay with us.
Speaker 1: So there's this interesting push and pull. To your point about your palm print creating a new lock, when you're quite happy just to go and check in with a receptionist, like, that's not something you really want. On the other hand, wearables are a kind of voluntary self-conversion into data in order to optimize, and that's something you've written about a fair bit as well. And recently you decided to write a story about your experience of wearing basically every wearable that you could fit on your body, including glucose monitors, watches, rings. Tell us a bit about that story and what the experience of all of that data was.
Speaker 2: It all started at the beginning of this year at CES, which is the world's largest electronics show, in Las Vegas. And I've been to CES many times as a technology journalist, and usually it's a lot of TVs and some goofy robots, but I'm always looking for innovation, for new things, for something I hadn't seen before. And what I saw at CES this year was a continuous glucose monitor for people who don't have diabetes. These are the little sensors that you might see on the backs of people's arms. Historically, those have been to help people with diabetes manage their illness, but now a growing number of companies want to market them over the counter to people who are interested in fitness tracking. And what you're tracking is the glucose in your bloodstream. So I talked to a company there that was doing it, and then suddenly I started noticing all these other interesting health tracking products, things I hadn't seen before.
There are lots of companies selling smart rings, which have been around but are kind of newly popular. I saw and met with a company that actually has headphones that can track your brain waves.
Speaker 1: Wow.
Speaker 2: It's a company called Neurable, and they've figured out a way to take over-ear headphones, the ones that kind of cup around your ears, and embed electrodes in them, and it can pick up on the electricity that your brain produces. They have a lot of ambition for this technology, but right now the headphones that they're selling track your attention, basically. Like, you can kind of imagine a meter showing you how much you're paying attention. And the more I got distracted by the meter, the more my attention went down, because I wasn't paying attention to the thing I was supposed to be paying attention to. So I think that actually kind of put me over the edge in wanting to do this story: what can I track, and what's useful? Because health tracking is a huge and growing market, and a lot of people are spending a lot of money to try to make themselves healthier with the help of tech.
Speaker 1: So you wore wearables for many months, and lots of them. Can you list the wearables that you tried and what they were kind of marketed to do?
Speaker 2: I'll do my best. There were probably two dozen things that I tried. The Apple Watch was something I already had. The Whoop band was something I hadn't tried before, but I see a lot of people wearing it and they really like it. I wore an Oura Ring all the time, and I really liked it for sleep. Other smart rings I tried: there's one called the Superhuman Ring, one called the Luna Ring. I wore headphones that read my brain waves, and I wore several different CGMs. The companies that make the CGMs, the sensors themselves, are Abbott and Dexcom, and then there are the companies that make the software.
Abbott has its own, called Lingo; there's a company called Levels; and Oura also recently started doing glucose tracking. So I tried all those, and then there were other devices in the mix. Like, Withings has a number of health tech tools that I tried. The BPM Vision Plus is a fancy blood pressure cuff. They have a body scanner that I think is just called the Body Scan. And I even tried glasses that also worked as hearing aids. They didn't track my health per se, but they were kind of in the mix.
Speaker 1: Who are these companies marketing to? I mean, who are these products for?
Speaker 2: I think there are basically two groups of people that these wearable companies are trying to appeal to. One is the athlete. A lot of these wearable companies started out as kind of really hardcore fitness tracking. The Whoop band, I think, is particularly tuned towards people who really want to optimize their body's performance. The Superhuman Ring is another one that is really geared towards athletes. And the other group is just everybody else, who's just trying to gain some insight and be a little bit healthier. Maybe they want to feel better, sleep better, have more energy. Having more energy is a pretty common thing people want. But in general, I don't think that any of these devices or services really promise anything other than insight and information.
Speaker 1: What did you find useful about the experiment for yourself, and what was less helpful?
Speaker 2: It was at times awful to be in that experiment. It was self-imposed. I knew that I was getting myself into something that was probably going to end up being unpleasant, and it was also something that the average person should never try. I think that, when I maxed out, I would be wearing the headphones, two or three smart rings, wristbands on each wrist, and a continuous glucose monitor as well.
At the same time, there were other things that I couldn't really wear that I was still trying out. Like, I tried a scale that scanned my body. I tried a service that promised to give me actionable insights on my gut microbiome. And I did learn a lot about my health. I think the problem is that the way all these devices work is pretty similar. They have sensors that can pick up on things like your body temperature, but mostly they pick up on your heart rate and something called heart rate variability. As you may have guessed, heart rate variability is how much your heart rate varies at any given time. And I liked that. I just didn't know how much to count on how the algorithms were analyzing what I was doing. And the thing is, every device was a little bit different. There were smart rings that told me I was super fit, and there were smart rings that told me that I had work to do, so they're all tuned a little bit differently. But overall, I found that the thing I wanted most was a way to check in with myself and know if I was being active enough. And the other thing, and I didn't really expect this to be such a big factor going into the test, was that I really got into the sleep tracking side of it. It's something I hadn't tried before, but I still wear the Oura Ring when I sleep, if only to check the score when I wake up. And I realize it sounds silly to think that you need to check a score to see how well you slept, but I have an eighteen-month-old; I don't sleep super great. But to have an extra little bit of data about what I'm doing, when I can't track it at all, I thought was really helpful. And I am sleeping better. I'm going to bed earlier. I find myself listening to the software telling me what to do and feeling better as a result. And I think that if the goal for this was anything, it was to feel a little bit better and more confident about my health. And although it was a bumpy road over the last six months, I think I got to a good place.
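For reference, the heart rate variability these devices lean on is usually computed from the gaps between successive heartbeats (RR intervals). One common formulation, RMSSD, is the root mean square of successive differences; here is a minimal sketch with made-up interval data:

```python
import math

def rmssd(rr_ms: list) -> float:
    """RMSSD: root mean square of successive differences between RR
    intervals (milliseconds). Higher generally means more variability."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical beat-to-beat intervals from a resting wearable session.
steady = [800, 805, 798, 802, 799]    # ~75 bpm, little beat-to-beat variation
variable = [780, 850, 760, 880, 790]  # similar average rate, more variation

print(round(rmssd(steady), 1))    # small value (~5 ms)
print(round(rmssd(variable), 1))  # much larger value (~94 ms)
```

Each vendor then maps a number like this onto its own proprietary readiness or recovery score, which is part of why, as Estes found, two rings can look at the same body and disagree.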
Speaker 1: Two kind of bigger zoom-out questions. One is, we're living in this sort of MAHA time, Make America Healthy Again. And I think the one app that has truly, like, captured people's imagination this year is Yuka, right? You can scan products in the supermarket, in the grocery store, and it will tell you the ingredients and how healthy the ingredients are. It was a French app, and people are going crazy for it, and I think huge corporations are changing the formulas in their products in response to consumer demand based around this app. It's sort of this idea that the best way to be in control of your health is to have more knowledge and to basically be your own doctor, in some respects, right? So is there a wider cultural, sociopolitical trend into which all of this is fitting?
Speaker 2: There's absolutely a bigger picture here. And I didn't even know that I was going to be getting into the MAHA movement in this experiment, until it turned out that one of the companies I tested, which makes the software for continuous glucose monitors, was actually co-founded by Trump's pick for Surgeon General, Casey Means. The company is called Levels, and it is very much about giving you more information about what you're eating and how your body is responding to it. And it's hard to argue with giving people more information, but I do think that it's a slippery slope when you are not just empowering people with information but empowering them to take things into their own hands. And I did feel like some of these health tracking companies flirt with that idea that you can be your own doctor, and you can make the right healthcare decisions without needing to be involved in the healthcare system. There are a lot of criticisms of the US healthcare system that are really valid.
But I think that there are some in the MAHA movement who kind of think that people can make better decisions themselves with the right devices and technologies. And having spent a lot of time with these devices and technologies, I don't think that's the case for me. I still like talking to my doctor, and I still believe in traditional healthcare, but I do think that this tech can be a good supplement for a lot of people.
Speaker 1: I remember in my high school theology class we were introduced to this concept of the God of the gaps, which is basically: everything you couldn't explain with science, you could say, oh, that's God. And that was a kind of diminishing wedge of God's role in the world. It makes me think about the tech of the gaps. In other words, we're in a time where there's a mental health crisis, and so people are using large language models as therapists. We're in a time of tremendous chronic illness and very, very uneven access to healthcare, and so we see these fitness tracking apps emerge. There's a kind of patchwork tech solutionism which, at the margins, all these things can obviously be good, but in the aggregate they do point towards a wider absence, perhaps.
Speaker 2: That's absolutely true, and it's something that I addressed in my piece about this. There is a primary healthcare crisis in the US, and it's not just an access-to-healthcare crisis, which is its own crisis. Not enough people in the US have health insurance, not enough can get access to a doctor, and even if they do, their primary care doctor is probably overworked and can't spend more than ten minutes with them, maybe, in the whole year. I can see how it's hard to depend on that system, for most people. And technology isn't necessarily coming in to replace your doctor.
I think that it's coming to help make you feel better about what you're doing, maybe in the absence of a doctor. But I continue to kind of have a hard time toeing the line here, because I do think that these devices are really helpful, and I think that the technology is good. But from my own personal experience, I found that the more information I collected about myself and my body, the more anxious I felt about not knowing what it meant or what to do with it. And that's where I think the healthcare system has to come into play.
Speaker 1: Just to close, returning to the beginning of the conversation, about privacy, biometric data, palm scans at the doctor: how concerned were you, and should other consumers be, about this treasure trove of health data escaping? I mean, obviously you feel like it's your rings, your Whoop, your Apple Watch, your continuous glucose monitor, but we've seen time and again that the databases that store this information are permeable. How much of a concern is that for you, and how robust are the security protocols of any of these consumer health tracking companies?
Speaker 2: One thing I think is really important to point out here is that the data being collected by your Oura Ring, or your Whoop band, or your Apple Watch, is not covered by HIPAA. Again, HIPAA is a really narrow law, and it's very serious, and I think the data it covers is well protected, but it's, like, your messages between you and your doctor, and your test results. Those things are covered, but your Oura Ring data is not. We have seen major breaches. There was a huge Fitbit breach where a lot of people's data got out there. I think with biometric data, it's permanent, so once it is out there, if Amazon One got hacked and everybody's palm prints got out there, they'd be out there. We saw what happens when data kind of ends up in unexpected hands with the 23andMe bankruptcy, where a lot of people's DNA information was up for sale.
So I think that most people should assume that whatever data they're giving up when wearing one of these devices could get hacked and could end up on the dark web, being sold. But that's true of any data that you're giving up when you're surfing the web or doing anything with technology. I think that there are companies that I trust more with my data, who have made a stronger commitment to protecting it, like Apple. But I also think that there's a faith you have to have in using some of these products; otherwise you'll go crazy, or just not be able to use technology in general, if you're too scared about getting hacked or a breach.
Speaker 1: Adam, thank you so much. That's it for Tech Stuff; I'm Oz Woloshyn. This episode was produced by Eliza Dennis, Victoria Dominguez, and Adriana Tapia. It was executive produced by me, Karen Price, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode, and Kyle Murdoch wrote our theme song. Join us on Friday for The Week in Tech, when we'll run through all the headlines you may have missed. And please rate, review, and reach out to us at techstuffpodcast@gmail.com.