1 00:00:12,440 --> 00:00:16,320 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 2 00:00:16,600 --> 00:00:19,239 Speaker 1: I'm Oz Woloshyn, and today Kara Price and I will 3 00:00:19,239 --> 00:00:23,079 Speaker 1: bring you the headlines this week, including the selfies that 4 00:00:23,160 --> 00:00:26,799 Speaker 1: tell you how old you really are. Then, on Tech Support, 5 00:00:26,920 --> 00:00:30,120 Speaker 1: we'll talk to New York Magazine's James D. Walsh about 6 00:00:30,240 --> 00:00:32,680 Speaker 1: using AI to cheat your way through college. 7 00:00:33,120 --> 00:00:36,240 Speaker 2: And it quickly kind of dawned on me that everyone 8 00:00:36,360 --> 00:00:39,000 Speaker 2: is cheating. They may not be using the word cheating, 9 00:00:39,159 --> 00:00:42,199 Speaker 2: but they are cheating according to their honor code. 10 00:00:42,600 --> 00:00:46,360 Speaker 1: All of that on The Week in Tech. It's Friday, May sixteenth. 11 00:00:46,720 --> 00:00:51,600 Speaker 1: Hello Kara. Hi Oz. So, adjacently, when was the 12 00:00:51,680 --> 00:00:53,559 Speaker 1: last time you flew into Newark Airport? 13 00:00:53,720 --> 00:00:56,680 Speaker 3: You know that Newark is normally my secret weapon airport, 14 00:00:56,720 --> 00:00:59,520 Speaker 3: but I haven't flown into Newark for quite some time. 15 00:00:59,440 --> 00:01:02,000 Speaker 1: Especially internationally. Actually, it is much faster getting through 16 00:01:02,000 --> 00:01:04,959 Speaker 1: customs and border patrol. It is. However, it is 17 00:01:05,040 --> 00:01:06,640 Speaker 1: not a good time to be flying in and out 18 00:01:06,640 --> 00:01:08,400 Speaker 1: of New York right now. In the last few weeks, 19 00:01:08,440 --> 00:01:11,560 Speaker 1: there have been three telecommunication failures at the air traffic 20 00:01:11,600 --> 00:01:14,880 Speaker 1: control center that oversees the airport. The first outage late 21 00:01:14,959 --> 00:01:17,440 Speaker 1: last month lasted over a minute. 22 00:01:17,360 --> 00:01:19,160 Speaker 3: Like, this is the stuff that makes me not want 23 00:01:19,200 --> 00:01:20,360 Speaker 3: to ever get on an airplane. 24 00:01:20,440 --> 00:01:22,120 Speaker 1: Yeah, I mean, truly, think about it. This is 25 00:01:22,160 --> 00:01:25,959 Speaker 1: when air traffic control has no contact at all with 26 00:01:26,080 --> 00:01:28,480 Speaker 1: the planes in the sky, like none, none, and they're 27 00:01:28,520 --> 00:01:30,720 Speaker 1: just hoping for that minute, fingers crossed, they don't crash 28 00:01:30,760 --> 00:01:33,240 Speaker 1: into each other. And of course there are these compounding 29 00:01:33,240 --> 00:01:36,440 Speaker 1: delays afterwards, because the poor air traffic controllers get so 30 00:01:36,480 --> 00:01:38,360 Speaker 1: stressed by this, they have like PTSD. 31 00:01:38,840 --> 00:01:41,840 Speaker 3: Yeah, I've actually heard that air traffic control is one 32 00:01:41,840 --> 00:01:43,679 Speaker 3: of the more stressful jobs that you can have in 33 00:01:43,720 --> 00:01:44,039 Speaker 3: this kind of industry. 34 00:01:44,040 --> 00:01:46,039 Speaker 1: Especially when the systems are going dark and you 35 00:01:46,120 --> 00:01:49,360 Speaker 1: have dozens of planes in midair that you can't communicate with. 36 00:01:48,880 --> 00:01:51,280 Speaker 4: And nobody else wants the job, so you're overworked. 37 00:01:51,360 --> 00:01:51,920 Speaker 1: Absolutely.
38 00:01:52,000 --> 00:01:53,680 Speaker 4: But why is this a tech story? 39 00:01:53,800 --> 00:01:57,280 Speaker 1: Well, good question. The outages are being blamed in part 40 00:01:57,440 --> 00:02:00,640 Speaker 1: on systems that rely on old technologies. 41 00:02:00,120 --> 00:02:01,400 Speaker 4: Like what are we talking about? 42 00:02:01,400 --> 00:02:06,280 Speaker 1: Old, like, floppy disk old, my child. Yeah. Basically, well, 43 00:02:06,440 --> 00:02:09,640 Speaker 1: here's how a former air traffic controller described it. If 44 00:02:09,680 --> 00:02:12,080 Speaker 1: you look at the technology we're using, most of it 45 00:02:12,120 --> 00:02:14,360 Speaker 1: is from the late nineteen eighties to the nineteen nineties, 46 00:02:14,800 --> 00:02:17,920 Speaker 1: we still use floppy disks to update our information system. 47 00:02:18,360 --> 00:02:20,360 Speaker 1: We still have paper strips. I mean, this is not 48 00:02:20,360 --> 00:02:24,080 Speaker 1: the eighties or nineties. This is ancient Egypt. We 49 00:02:24,200 --> 00:02:26,720 Speaker 1: use paper strips that we walk around the tower cab 50 00:02:26,800 --> 00:02:29,639 Speaker 1: with, that each controller writes something on and then hands 51 00:02:29,680 --> 00:02:30,920 Speaker 1: to the next controller. 52 00:02:31,120 --> 00:02:33,079 Speaker 3: I was just thinking about, like, if Gen Alpha 53 00:02:33,160 --> 00:02:34,639 Speaker 3: is like, you know what, I'm only going to hand 54 00:02:34,639 --> 00:02:36,600 Speaker 3: in my homework on floppy disk now, because that's what 55 00:02:36,600 --> 00:02:38,160 Speaker 3: we used to do when I was growing up. Like, 56 00:02:38,160 --> 00:02:41,040 Speaker 3: you would upload your homework to a floppy disk and then 57 00:02:41,040 --> 00:02:43,520 Speaker 3: bring it to class and print it at school. 58 00:02:43,680 --> 00:02:45,800 Speaker 1: Well, I heard that in Japan there's a new trend 59 00:02:46,000 --> 00:02:48,520 Speaker 1: of fake digital cassette players. So it looks like a 60 00:02:48,560 --> 00:02:51,720 Speaker 1: cassette player, but it's actually an MP3 player. But anyway, 61 00:02:51,960 --> 00:02:53,840 Speaker 1: you know, I'm a big local news guy. 62 00:02:54,040 --> 00:02:54,840 Speaker 4: You are, you are. 63 00:02:55,040 --> 00:02:58,640 Speaker 1: NJ dot com reports that the technology at Newark is 64 00:02:58,720 --> 00:03:02,200 Speaker 1: so outdated that when parts need to be replaced, the 65 00:03:02,280 --> 00:03:05,680 Speaker 1: FAA has to source them from eBay. 66 00:03:06,000 --> 00:03:07,679 Speaker 4: You know, this is how Kim Kardashian needs to 67 00:03:07,680 --> 00:03:08,480 Speaker 4: buy her BlackBerrys. 68 00:03:08,520 --> 00:03:10,240 Speaker 1: She did, after they were discontinued. She 69 00:03:10,240 --> 00:03:13,000 Speaker 3: would go on eBay and buy like fifteen or twenty 70 00:03:13,080 --> 00:03:17,320 Speaker 3: of them. Smartly. But she's also not working for the FAA. 71 00:03:17,400 --> 00:03:19,239 Speaker 3: But in all seriousness, you know, the FAA is 72 00:03:19,280 --> 00:03:22,440 Speaker 3: a federal agency. So what is the Trump administration doing 73 00:03:22,440 --> 00:03:22,840 Speaker 3: about this?
74 00:03:23,240 --> 00:03:26,160 Speaker 1: Well, there is a three year plan to build big, 75 00:03:26,200 --> 00:03:30,800 Speaker 1: beautiful new traffic control systems with high speed network connections 76 00:03:30,800 --> 00:03:33,680 Speaker 1: and fiber and wireless connections, you know, but the US 77 00:03:33,760 --> 00:03:37,080 Speaker 1: obviously has real struggles with modernizing its infrastructure. I tried 78 00:03:37,120 --> 00:03:39,680 Speaker 1: to find out, in preparing for this episode, whether or 79 00:03:39,720 --> 00:03:43,040 Speaker 1: not they use floppy disks in Chinese airports. I couldn't 80 00:03:43,080 --> 00:03:45,920 Speaker 1: get an answer, but my guess would be no. 81 00:03:46,160 --> 00:03:48,400 Speaker 4: Is that because you don't know the Chinese words for floppy disk? 82 00:03:48,480 --> 00:03:50,160 Speaker 1: That's probably it. That must be what it is. And the 83 00:03:50,200 --> 00:03:52,480 Speaker 1: reason I was attracted to this story is because we rarely 84 00:03:52,600 --> 00:03:56,240 Speaker 1: think about technology in terms of, like, stuff from the 85 00:03:56,240 --> 00:03:59,960 Speaker 1: eighties that's still taped together and keeping us relatively safe 86 00:04:00,200 --> 00:04:01,440 Speaker 1: at thirty eight thousand feet. 87 00:04:01,760 --> 00:04:04,920 Speaker 3: Yes, but as is evidenced by my continuous use and 88 00:04:04,960 --> 00:04:08,600 Speaker 3: support of Apple's wired headphones, technology does not necessarily mean 89 00:04:08,640 --> 00:04:11,720 Speaker 3: the future or even the present. Technology is very much 90 00:04:12,200 --> 00:04:12,720 Speaker 3: the past. 91 00:04:13,120 --> 00:04:15,000 Speaker 1: The other thing that attracted me to this story was 92 00:04:15,040 --> 00:04:17,839 Speaker 1: that I got to experience some rare British pride for 93 00:04:17,880 --> 00:04:20,800 Speaker 1: the first time since the Spice Girls. I feel like 94 00:04:20,839 --> 00:04:24,880 Speaker 1: that's not true. So The Financial Times reported this week 95 00:04:24,880 --> 00:04:28,400 Speaker 1: that the British Airways CEO has hailed the quote game 96 00:04:28,480 --> 00:04:31,960 Speaker 1: changing effects of AI for cutting delays at the airline. 97 00:04:32,240 --> 00:04:35,920 Speaker 1: They've invested one hundred million pounds in quote operational resilience, 98 00:04:36,200 --> 00:04:39,279 Speaker 1: which includes using AI to suggest how to minimize passenger 99 00:04:39,320 --> 00:04:42,839 Speaker 1: disruptions, like when to delay flights, when to cancel them, 100 00:04:43,080 --> 00:04:47,080 Speaker 1: when to preemptively rebook passengers, and even which gates aircraft 101 00:04:47,160 --> 00:04:50,520 Speaker 1: should land at to help passengers make tight connections. In 102 00:04:50,560 --> 00:04:54,560 Speaker 1: the process, the national flag carrier has gone from one 103 00:04:54,600 --> 00:04:57,040 Speaker 1: of the most delayed airlines in the world to one 104 00:04:57,040 --> 00:04:57,520 Speaker 1: of the least. 105 00:04:57,720 --> 00:04:59,839 Speaker 3: Oh, good for them. Actually, I'd love to see some 106 00:05:00,040 --> 00:05:02,760 Speaker 3: national pride for you, because I also love to see 107 00:05:02,800 --> 00:05:06,400 Speaker 3: some support for those poor, poor schmucks at the back 108 00:05:06,400 --> 00:05:08,919 Speaker 3: of the airplane who are like, I have a connection 109 00:05:09,080 --> 00:05:11,919 Speaker 3: in seven seconds at Heathrow.
110 00:05:12,080 --> 00:05:14,320 Speaker 1: But there is something quite sort of reassuring about human 111 00:05:14,400 --> 00:05:16,760 Speaker 1: nature, that most people do feel enough empathy 112 00:05:16,839 --> 00:05:19,400 Speaker 1: for their fellow travelers that people do come together. 113 00:05:19,400 --> 00:05:22,080 Speaker 4: People go, go right ahead, go right ahead, unless you're pregnant, 114 00:05:22,160 --> 00:05:23,360 Speaker 4: in which case we're gonna block you. 115 00:05:23,920 --> 00:05:27,279 Speaker 1: So, to change landing gears slightly. Tech is obviously a 116 00:05:27,320 --> 00:05:30,239 Speaker 1: system that can govern outcomes for tens of thousands of people, 117 00:05:30,240 --> 00:05:32,080 Speaker 1: and that's where you see it showing up, for example, 118 00:05:32,080 --> 00:05:35,040 Speaker 1: in air traffic control or airline management. But the other 119 00:05:35,160 --> 00:05:38,080 Speaker 1: kind of side of tech that I find particularly fascinating 120 00:05:38,720 --> 00:05:43,600 Speaker 1: is how it's becoming a way to make us legible, 121 00:05:43,800 --> 00:05:46,880 Speaker 1: like our tech is making us readable to others, which 122 00:05:46,920 --> 00:05:51,080 Speaker 1: is kind of fascinating and creepy. Scientists at Mass General 123 00:05:51,120 --> 00:05:54,599 Speaker 1: Brigham in Boston have developed a new AI prediction tool 124 00:05:54,640 --> 00:05:58,640 Speaker 1: that can identify a person's biological age just by analyzing 125 00:05:58,640 --> 00:06:00,520 Speaker 1: a picture of their face. To 126 00:06:00,480 --> 00:06:05,640 Speaker 3: be clear, biological age and real age, like, I'm thirty five, 127 00:06:05,880 --> 00:06:07,839 Speaker 3: but I might have a different biological age. 128 00:06:07,920 --> 00:06:11,000 Speaker 1: It's basically how old you are at a cellular level, 129 00:06:11,040 --> 00:06:13,640 Speaker 1: based on the condition of your DNA, and it's different 130 00:06:13,640 --> 00:06:15,800 Speaker 1: from what scientists call chronological age. 131 00:06:16,560 --> 00:06:20,120 Speaker 3: So someone could be forty years old, that's their chronological age, 132 00:06:20,360 --> 00:06:23,080 Speaker 3: and have a biological age of thirty five, which I'm 133 00:06:23,080 --> 00:06:25,239 Speaker 3: assuming is, you know, a sign of good health. 134 00:06:25,360 --> 00:06:28,520 Speaker 4: Yeah, exactly, to have a lower biological age. 135 00:06:28,320 --> 00:06:31,520 Speaker 1: That's right. And so there's an app for this. Guess 136 00:06:31,520 --> 00:06:35,200 Speaker 1: what it's called. Dead or Not? Hot or Not? No, 137 00:06:35,279 --> 00:06:38,960 Speaker 1: it's called, it's called FaceAge. Ugh. And what's kind 138 00:06:38,960 --> 00:06:41,920 Speaker 1: of interesting is the way they trained it. So researchers 139 00:06:41,960 --> 00:06:45,600 Speaker 1: gave FaceAge thousands of publicly available pictures of people 140 00:06:45,839 --> 00:06:48,760 Speaker 1: over the age of sixty who were presumed to be healthy. 141 00:06:49,480 --> 00:06:52,600 Speaker 1: Then they gave it pictures of cancer patients who were 142 00:06:52,640 --> 00:06:56,960 Speaker 1: beginning radiotherapy treatment.
On average, they found that someone going 143 00:06:56,960 --> 00:06:59,960 Speaker 1: through these treatments has a biological age that's five years 144 00:07:00,200 --> 00:07:04,000 Speaker 1: older than their chronological age, and the older the biological 145 00:07:04,040 --> 00:07:07,560 Speaker 1: age is, according to FaceAge, the worse the survival outlook. 146 00:07:07,720 --> 00:07:09,760 Speaker 1: This is according to an article in the Washington Post. 147 00:07:09,920 --> 00:07:11,600 Speaker 4: And why would anyone want to know this? 148 00:07:12,200 --> 00:07:15,760 Speaker 1: Well, good question. It is not just curiosity. The Post 149 00:07:15,840 --> 00:07:17,880 Speaker 1: explained how the tech could actually be a life saving 150 00:07:17,920 --> 00:07:21,560 Speaker 1: tool, because it can be useful in predicting tolerance for 151 00:07:21,640 --> 00:07:25,440 Speaker 1: cancer treatments. This is something doctors are obviously constantly grappling with. 152 00:07:25,800 --> 00:07:27,720 Speaker 1: In one case in the article, a doctor had a 153 00:07:27,760 --> 00:07:30,640 Speaker 1: patient who was eighty six years old who'd received a 154 00:07:30,720 --> 00:07:34,240 Speaker 1: terminal lung cancer diagnosis, and the doctor was hesitating over 155 00:07:34,240 --> 00:07:37,000 Speaker 1: whether or not to recommend treatment because of the patient's 156 00:07:37,040 --> 00:07:40,560 Speaker 1: advanced age. But according to the doctor, quote, he looked 157 00:07:40,600 --> 00:07:42,760 Speaker 1: younger than eighty six to me, and based on the 158 00:07:42,760 --> 00:07:45,800 Speaker 1: eyeball test and a host of other factors, I decided 159 00:07:45,840 --> 00:07:49,440 Speaker 1: to treat him with aggressive radiation therapy. The patient survived 160 00:07:49,480 --> 00:07:50,920 Speaker 1: and is now ninety years old. 161 00:07:51,480 --> 00:07:54,520 Speaker 4: So just humor me here. What does this have to 162 00:07:54,560 --> 00:07:55,880 Speaker 4: do with FaceAge? 163 00:07:56,280 --> 00:07:59,480 Speaker 1: Well, it's kind of like the eyeball test in the 164 00:07:59,480 --> 00:08:02,400 Speaker 1: digital world, right. And the doctor actually went back 165 00:08:02,400 --> 00:08:05,239 Speaker 1: and scanned an old photo of his patient using Face 166 00:08:05,280 --> 00:08:09,720 Speaker 1: Age, and discovered that the app basically post facto endorsed 167 00:08:09,720 --> 00:08:12,880 Speaker 1: the assessment. The patient's biological age was ten years younger 168 00:08:12,920 --> 00:08:16,200 Speaker 1: than his chronological age. I.e., he was biologically seventy six 169 00:08:16,280 --> 00:08:19,280 Speaker 1: when he started treatment, and therefore was a good candidate. Obviously, 170 00:08:19,400 --> 00:08:22,040 Speaker 1: a person's face isn't the only indicator of their health, 171 00:08:22,480 --> 00:08:25,880 Speaker 1: and the tool is used alongside other clinical information, but 172 00:08:26,160 --> 00:08:27,760 Speaker 1: per the Post, it does do a better job of 173 00:08:27,800 --> 00:08:31,240 Speaker 1: predicting someone's chronological age than a doctor just using their 174 00:08:31,240 --> 00:08:32,679 Speaker 1: eyes alone, just the eyeball test.
175 00:08:33,160 --> 00:08:35,360 Speaker 3: You know, it makes me think about the old adage 176 00:08:35,640 --> 00:08:37,920 Speaker 3: that I like to use as it pertains to women 177 00:08:37,920 --> 00:08:39,400 Speaker 3: and men, which is that you can't judge a book 178 00:08:39,440 --> 00:08:42,120 Speaker 3: by its cover, and you have to wonder if plastic 179 00:08:42,160 --> 00:08:46,040 Speaker 3: surgery and Botox are as effective at tricking the AI 180 00:08:46,400 --> 00:08:47,640 Speaker 3: as they are the human eye. 181 00:08:47,800 --> 00:08:50,280 Speaker 1: Funny you mention it. You know, scientists are actually still studying 182 00:08:50,320 --> 00:08:54,280 Speaker 1: if lighting, surgery, makeup, or other factors can affect the 183 00:08:54,320 --> 00:08:57,760 Speaker 1: accuracy of the FaceAge reading. Although, interestingly, and this 184 00:08:57,880 --> 00:09:00,920 Speaker 1: is something that I find very encouraging as a man 185 00:09:01,400 --> 00:09:04,600 Speaker 1: experiencing the beginning of baldness, or perhaps the mid stage 186 00:09:04,600 --> 00:09:08,320 Speaker 1: of baldness, FaceAge does not overreact to the visual 187 00:09:08,320 --> 00:09:11,360 Speaker 1: cues of aging, like being bald or having gray hair, 188 00:09:11,480 --> 00:09:13,960 Speaker 1: in the way that humans do. But just taking a 189 00:09:13,960 --> 00:09:16,600 Speaker 1: step back, I think the implications of this story are 190 00:09:16,640 --> 00:09:19,920 Speaker 1: actually really, really big, because in the old days, it 191 00:09:19,960 --> 00:09:23,600 Speaker 1: would have taken a doctor decades of clinical experience to 192 00:09:23,640 --> 00:09:28,000 Speaker 1: develop their own sense of intuition about somebody's biological age 193 00:09:28,120 --> 00:09:30,760 Speaker 1: versus their chronological age. You know, they would have developed 194 00:09:30,880 --> 00:09:33,960 Speaker 1: the clinical experience and then used it to make a 195 00:09:34,240 --> 00:09:36,440 Speaker 1: judgment, a judgment that they probably couldn't have explained 196 00:09:36,440 --> 00:09:39,679 Speaker 1: to you themselves how they got to. But that knowledge 197 00:09:39,720 --> 00:09:42,840 Speaker 1: was captured within a community of people who were trained 198 00:09:43,120 --> 00:09:46,760 Speaker 1: and trusted to use that information for good, according to 199 00:09:46,760 --> 00:09:49,720 Speaker 1: the Hippocratic Oath. This app points in the direction of 200 00:09:49,760 --> 00:09:54,080 Speaker 1: a future where anybody will be able to essentially tap 201 00:09:54,160 --> 00:09:58,280 Speaker 1: into that intuition and take a photo of your face and 202 00:09:58,480 --> 00:10:02,240 Speaker 1: know your biological age, and know, you know, how much 203 00:10:02,360 --> 00:10:05,040 Speaker 1: longer you may have to live, with some degree of accuracy. 204 00:10:05,520 --> 00:10:09,400 Speaker 1: This isn't happening today, but it could happen soon. That can, 205 00:10:09,480 --> 00:10:11,520 Speaker 1: of course, be empowering if you want to take a 206 00:10:11,559 --> 00:10:14,040 Speaker 1: selfie and know what's going on and maybe make some 207 00:10:14,160 --> 00:10:17,319 Speaker 1: changes to your lifestyle, perhaps. But on the other hand, 208 00:10:18,040 --> 00:10:20,200 Speaker 1: bad actors, or actors who don't have your best 209 00:10:20,200 --> 00:10:25,600 Speaker 1: interests at heart,
other people, colleagues, bosses, health insurance companies, 210 00:10:26,120 --> 00:10:28,280 Speaker 1: should make us all, I think, deeply concerned. 211 00:10:28,480 --> 00:10:32,200 Speaker 3: Yeah, I mean, especially with insurance companies. It creates a 212 00:10:32,720 --> 00:10:34,040 Speaker 3: huge moral problem. 213 00:10:34,160 --> 00:10:34,760 Speaker 1: Absolutely. 214 00:10:35,200 --> 00:10:36,800 Speaker 3: So I want to tell you about a headline this 215 00:10:36,840 --> 00:10:39,880 Speaker 3: week that frightened me more than the realization of how 216 00:10:39,880 --> 00:10:40,320 Speaker 3: old I am 217 00:10:40,400 --> 00:10:41,480 Speaker 4: when I attended the Webbys. 218 00:10:41,720 --> 00:10:43,400 Speaker 1: Okay, what were you doing at the Webbys? 219 00:10:43,600 --> 00:10:45,400 Speaker 4: I was invited, I was invited. 220 00:10:46,880 --> 00:10:47,440 Speaker 1: Goes somewhere. 221 00:10:47,520 --> 00:10:50,040 Speaker 4: Yeah, let me tell you something. 222 00:10:50,080 --> 00:10:52,680 Speaker 3: You know how often, well, you might think differently, I 223 00:10:52,679 --> 00:10:54,559 Speaker 3: don't say no as much as I used to. At 224 00:10:54,600 --> 00:10:56,480 Speaker 3: my chronological age, I say no less. 225 00:10:56,559 --> 00:10:59,079 Speaker 1: Yeah, yeah, well, you've got to make the most, make the 226 00:10:59,080 --> 00:11:03,000 Speaker 3: best of it. Exactly. But I read a story this week. 227 00:11:03,160 --> 00:11:06,200 Speaker 3: Have you heard of ChatGPT-induced psychosis? 228 00:11:06,280 --> 00:11:10,199 Speaker 1: I have to confess I have not. 229 00:11:10,240 --> 00:11:13,200 Speaker 3: So, it is kind of vague, but I got it from a Rolling 230 00:11:13,400 --> 00:11:17,480 Speaker 3: Stone headline that read, people are losing loved ones to 231 00:11:17,679 --> 00:11:20,160 Speaker 3: AI-fueled spiritual fantasies. 232 00:11:20,440 --> 00:11:24,319 Speaker 1: So this is like AI kind of becoming a digital 233 00:11:24,600 --> 00:11:26,240 Speaker 1: cult leader or something like that. 234 00:11:26,400 --> 00:11:27,000 Speaker 4: Pretty much. 235 00:11:27,080 --> 00:11:31,079 Speaker 3: And it's putting stress on the relationships of the people 236 00:11:31,520 --> 00:11:34,160 Speaker 3: who have to deal with people who think they're accessing 237 00:11:34,600 --> 00:11:37,000 Speaker 3: this sort of rules to the universe 238 00:11:36,640 --> 00:11:37,439 Speaker 4: through ChatGPT. 239 00:11:37,840 --> 00:11:38,120 Speaker 1: Wow. 240 00:11:38,280 --> 00:11:39,439 Speaker 4: You know, several people 241 00:11:39,160 --> 00:11:41,640 Speaker 3: reached out to Rolling Stone about how chatbot use is 242 00:11:41,679 --> 00:11:44,160 Speaker 3: getting in the way of their relationships. People even said 243 00:11:44,160 --> 00:11:47,600 Speaker 3: that their partners were communicating with ChatGPT as if 244 00:11:47,600 --> 00:11:50,880 Speaker 3: it were a savior figure, and in some cases the 245 00:11:51,000 --> 00:11:53,800 Speaker 3: chatbot would say it was God, wow, or tell the 246 00:11:53,920 --> 00:11:55,200 Speaker 3: user they were God. 247 00:11:55,320 --> 00:11:57,320 Speaker 1: That's worse. That's much worse. 248 00:11:57,600 --> 00:12:00,520 Speaker 4: So here's the opening story of the article in Rolling Stone.
249 00:12:01,080 --> 00:12:04,360 Speaker 3: A couple's marriage is falling apart because the woman's husband 250 00:12:04,600 --> 00:12:08,080 Speaker 3: started to use ChatGPT obsessively, and it's not the 251 00:12:08,080 --> 00:12:12,720 Speaker 3: way you or I use ChatGPT. This is like someone 252 00:12:12,760 --> 00:12:16,040 Speaker 3: who's being radicalized on YouTube, except YouTube is talking back 253 00:12:16,080 --> 00:12:19,760 Speaker 3: to them. The husband was using it as a spiritual guide. 254 00:12:20,040 --> 00:12:24,199 Speaker 3: He was asking ChatGPT philosophical questions and getting increasingly 255 00:12:24,240 --> 00:12:27,640 Speaker 3: personal in his responses, revealing more and more of himself 256 00:12:27,679 --> 00:12:32,040 Speaker 3: along the way. And then this same person starts to 257 00:12:32,040 --> 00:12:35,160 Speaker 3: get paranoid about the government surveilling him, and he says 258 00:12:35,200 --> 00:12:38,960 Speaker 3: that AI helped him regain a repressed childhood memory of 259 00:12:39,000 --> 00:12:41,600 Speaker 3: a babysitter trying to drown him. 260 00:12:41,679 --> 00:12:43,439 Speaker 1: I mean, this is making my mind go in all 261 00:12:43,480 --> 00:12:45,520 Speaker 1: sorts of different ways. I mean, you know, you can 262 00:12:45,559 --> 00:12:47,800 Speaker 1: imagine, if you're spending all of your time talking to 263 00:12:47,880 --> 00:12:51,040 Speaker 1: ChatGPT and feeling so well understood, that it kind 264 00:12:51,080 --> 00:12:53,720 Speaker 1: of could exacerbate the feeling of, well, if ChatGPT can 265 00:12:53,760 --> 00:12:56,240 Speaker 1: understand me, why can't my husband or my wife? But 266 00:12:56,320 --> 00:12:59,079 Speaker 1: what you mentioned about the repressed childhood memory is also 267 00:12:59,160 --> 00:13:02,960 Speaker 1: very interesting to me, because I find that quite disturbing. Obviously, 268 00:13:03,120 --> 00:13:06,160 Speaker 1: we've talked a lot about AI-assisted therapy bots and 269 00:13:06,240 --> 00:13:09,400 Speaker 1: the promise of them, but at the point where Chat 270 00:13:09,480 --> 00:13:14,360 Speaker 1: GPT is, you know, helping people or convincing people that 271 00:13:14,400 --> 00:13:18,840 Speaker 1: they're recovering childhood memories without any training or without any guardrails, 272 00:13:19,280 --> 00:13:21,920 Speaker 1: I mean, that gets quite dystopian and concerning to me. 273 00:13:22,160 --> 00:13:27,079 Speaker 3: It takes quite a leap to believe that a chat 274 00:13:27,120 --> 00:13:31,040 Speaker 3: bot, or a search, a very sort of highfalutin 275 00:13:31,080 --> 00:13:35,040 Speaker 3: search tool, is something that is capable of allowing you 276 00:13:35,160 --> 00:13:38,920 Speaker 3: to recover repressed memory. It's one thing, I mean, repressed memory. 277 00:13:39,080 --> 00:13:42,439 Speaker 1: Hold on, but what about journaling? True, this is two 278 00:13:42,480 --> 00:13:43,120 Speaker 1: way journaling. 279 00:13:43,559 --> 00:13:46,319 Speaker 4: It's two way journaling, but on the other side, 280 00:13:46,240 --> 00:13:48,640 Speaker 1: you've got another side that is predisposed to encourage you 281 00:13:48,679 --> 00:13:50,640 Speaker 1: in whatever direction you're going. I think that's what's concerning 282 00:13:50,679 --> 00:13:51,080 Speaker 1: about it.
283 00:13:51,280 --> 00:13:55,640 Speaker 3: I also think it says more about where very seemingly 284 00:13:55,679 --> 00:14:00,560 Speaker 3: personalized technology fits into an increasingly godless world, which is 285 00:14:00,600 --> 00:14:05,320 Speaker 3: replacing religion with generative AI that seems friendly, more readily 286 00:14:05,400 --> 00:14:09,680 Speaker 3: available than your average guru therapist, and probably less judgmental 287 00:14:09,920 --> 00:14:10,640 Speaker 3: than your wife or 288 00:14:10,679 --> 00:14:12,720 Speaker 1: husband. Certainly more prone to tell you what you want 289 00:14:12,559 --> 00:14:15,920 Speaker 3: to hear. Definitely, and also just answering you whenever you want. 290 00:14:16,000 --> 00:14:18,000 Speaker 3: I mean, I think that's something, for free. 291 00:14:18,679 --> 00:14:21,400 Speaker 1: We know that chatbots tend to serve users things that 292 00:14:21,520 --> 00:14:24,600 Speaker 1: they know they'll like. And last month there was a story 293 00:14:24,640 --> 00:14:27,680 Speaker 1: about how OpenAI actually rolled back an update to 294 00:14:27,800 --> 00:14:31,440 Speaker 1: ChatGPT's 4o model because it was acting too 295 00:14:31,520 --> 00:14:34,880 Speaker 1: sycophantic towards users, like constantly telling them they were geniuses 296 00:14:35,000 --> 00:14:38,080 Speaker 1: or had amazing ideas, laying it on perhaps a bit 297 00:14:38,120 --> 00:14:39,920 Speaker 1: too thick, as my grandmother would have said. 298 00:14:40,880 --> 00:14:42,640 Speaker 3: You know, it's sort of like when I'm with my 299 00:14:42,720 --> 00:14:44,880 Speaker 3: mom on Mother's Day and she asks me to get 300 00:14:44,880 --> 00:14:46,760 Speaker 3: off my phone, and I'm like, be as interesting 301 00:14:46,800 --> 00:14:48,880 Speaker 3: as my phone, and I'll start paying attention to you 302 00:14:48,960 --> 00:14:49,480 Speaker 3: on Mother's Day. 303 00:14:49,640 --> 00:14:52,560 Speaker 1: I give her a Mother's Day exception. 304 00:14:52,520 --> 00:14:56,440 Speaker 3: Except, well, I mean, look, the phone is a very seductive tool. Yeah, 305 00:14:56,440 --> 00:14:59,200 Speaker 3: and it's an always on supercomputer that gives you, and 306 00:14:59,280 --> 00:15:02,680 Speaker 3: I quote from the article, the answers to the universe. 307 00:15:02,480 --> 00:15:04,840 Speaker 1: And of course, that kind of black box nature 308 00:15:05,360 --> 00:15:09,120 Speaker 1: obviously adds to the feeling of mysticism or spirituality. I mean, 309 00:15:09,160 --> 00:15:12,320 Speaker 1: I think you look back to Victorian times and mesmerism 310 00:15:12,480 --> 00:15:15,840 Speaker 1: and, you know, various quackery and stuff. The thing that 311 00:15:15,880 --> 00:15:18,520 Speaker 1: made people believe was not understanding how it worked. It 312 00:15:18,600 --> 00:15:20,840 Speaker 1: had to do with playing on that sense of the 313 00:15:20,920 --> 00:15:24,240 Speaker 1: unknown and filling it with meaning that was maybe not 314 00:15:24,240 --> 00:15:27,680 Speaker 1: appropriate meaning, and it feels like that's now happening on 315 00:15:27,760 --> 00:15:32,920 Speaker 1: a kind of cross societal and extremely technologically boosted scale. 316 00:15:33,120 --> 00:15:33,480 Speaker 4: Yeah. 317 00:15:33,520 --> 00:15:37,560 Speaker 3: And I think, conversely, it's why people shouldn't read too 318 00:15:37,600 --> 00:15:40,040 Speaker 3: much into the advice that ChatGPT gives them.
319 00:15:40,680 --> 00:15:42,400 Speaker 1: Yeah. I mean, I think that's something we all have 320 00:15:42,440 --> 00:15:45,880 Speaker 1: to remind ourselves of continually, because it's so tempting and 321 00:15:45,920 --> 00:15:53,720 Speaker 1: often it is so useful. We've got a couple more 322 00:15:53,720 --> 00:15:56,960 Speaker 1: headlines for you this week. Newly elected Pope Leo the 323 00:15:57,000 --> 00:16:01,200 Speaker 1: Fourteenth says that he takes a similar position on artificial intelligence 324 00:16:01,280 --> 00:16:05,440 Speaker 1: as his predecessor, Pope Francis. According to CNN, the new 325 00:16:05,520 --> 00:16:07,960 Speaker 1: Pope laid out the vision for his papacy, and he 326 00:16:08,040 --> 00:16:11,840 Speaker 1: identified AI as one of the most critical matters facing humanity. 327 00:16:12,280 --> 00:16:14,920 Speaker 1: He says that the development of AI, like the original 328 00:16:14,920 --> 00:16:19,320 Speaker 1: Industrial Revolution, would quote pose new challenges for the defense 329 00:16:19,480 --> 00:16:22,120 Speaker 1: of human dignity, justice, and labor. 330 00:16:22,880 --> 00:16:25,640 Speaker 3: So One Drop, which is my chosen rap name for 331 00:16:25,720 --> 00:16:30,360 Speaker 3: Elizabeth Holmes, the infamous founder of fraudulent blood testing company Theranos, 332 00:16:30,720 --> 00:16:34,040 Speaker 3: is in prison. Yep. But her partner is working on 333 00:16:34,080 --> 00:16:37,680 Speaker 3: a new venture that sounds a little bit familiar. The 334 00:16:37,680 --> 00:16:40,560 Speaker 3: New York Times reports that Holmes's partner, Billy Evans, is 335 00:16:40,640 --> 00:16:44,520 Speaker 3: raising money for Haemanthus, a blood testing startup that describes 336 00:16:44,560 --> 00:16:49,200 Speaker 3: itself as quote the future of diagnostics. And here's the kicker. 337 00:16:49,960 --> 00:16:53,840 Speaker 3: The device Billy Evans is showing to investors looks eerily 338 00:16:53,920 --> 00:16:59,920 Speaker 3: similar to the one hawked by Theranos. One woman's trash 339 00:17:00,200 --> 00:17:01,240 Speaker 3: is another man's treasure. 340 00:17:01,680 --> 00:17:03,320 Speaker 1: Yeah, I mean, I think you might need a blood 341 00:17:03,320 --> 00:17:05,639 Speaker 1: test yourself if you were, if you're queuing up to 342 00:17:05,640 --> 00:17:07,919 Speaker 1: invest in that one. Our friends at 404 343 00:17:08,000 --> 00:17:10,920 Speaker 1: Media wrote about an ad for Coca Cola that used 344 00:17:10,920 --> 00:17:14,479 Speaker 1: AI to scan books and, surprise, surprise, got some basic 345 00:17:14,520 --> 00:17:17,840 Speaker 1: facts wrong. Last month, the company released an ad campaign 346 00:17:17,840 --> 00:17:21,440 Speaker 1: which featured passages from classic literature that mentioned Coca Cola 347 00:17:21,480 --> 00:17:24,879 Speaker 1: by name. The problem is, the AI came up with some 348 00:17:24,920 --> 00:17:28,840 Speaker 1: examples that simply don't exist, including a sentence from a 349 00:17:28,840 --> 00:17:31,879 Speaker 1: book by British author J. G. Ballard featuring Coca Cola 350 00:17:32,320 --> 00:17:33,680 Speaker 1: that he never wrote.
351 00:17:34,040 --> 00:17:37,280 Speaker 3: Staying on the topic of AI ads, beloved actress Jamie 352 00:17:37,359 --> 00:17:41,199 Speaker 3: Lee Curtis asked Meta CEO Mark Zuckerberg to remove an 353 00:17:41,320 --> 00:17:45,240 Speaker 3: AI-generated commercial on Instagram that she claimed stole her 354 00:17:45,400 --> 00:17:49,000 Speaker 3: likeness, and it worked. According to the San Francisco Chronicle, 355 00:17:49,080 --> 00:17:52,240 Speaker 3: Curtis posted on Instagram saying, quote, it's come to this, 356 00:17:52,480 --> 00:17:56,439 Speaker 3: @Zuck, and then implored him to remove this quote 357 00:17:56,680 --> 00:17:59,200 Speaker 3: totally fake commercial from the internet. I can hear her 358 00:17:59,200 --> 00:18:02,639 Speaker 3: saying that. Curtis said she went through every proper channel 359 00:18:02,680 --> 00:18:04,480 Speaker 3: and even tried to DM Zuckerberg. 360 00:18:04,600 --> 00:18:06,240 Speaker 1: Is that one of the proper channels? It is. 361 00:18:06,680 --> 00:18:10,120 Speaker 3: I mean, if you're verified, he's verified, we're verified. Let's 362 00:18:10,160 --> 00:18:13,679 Speaker 3: get together. But she was unable to reach him, since he 363 00:18:13,720 --> 00:18:16,760 Speaker 3: does not follow her on Instagram. So the ad was 364 00:18:16,800 --> 00:18:19,800 Speaker 3: removed within two hours of Curtis's post, which, by the way, 365 00:18:20,040 --> 00:18:23,920 Speaker 3: was set to Aretha Franklin's song Integrity. Now, 366 00:18:24,160 --> 00:18:26,840 Speaker 3: can Jamie Lee Curtis get Zuckerberg to take down all 367 00:18:26,880 --> 00:18:29,080 Speaker 3: the photos of himself wearing a gold chain? 368 00:18:30,560 --> 00:18:33,480 Speaker 1: Methinks not. And we're going to take a quick break now, 369 00:18:33,520 --> 00:18:36,879 Speaker 1: but stick around, because cheating is on the rise and 370 00:18:37,000 --> 00:18:49,399 Speaker 1: college is getting a whole lot easier. Stay with us. 371 00:18:50,920 --> 00:18:53,360 Speaker 1: Welcome back to Tech Stuff. This week on Tech Support, 372 00:18:53,400 --> 00:18:55,399 Speaker 1: we want to dive deeper on a headline that we 373 00:18:55,520 --> 00:18:58,879 Speaker 1: touched on last week in New York Magazine: Everyone Is 374 00:18:58,960 --> 00:19:01,919 Speaker 1: Cheating Their Way Through College. The story stuck with me 375 00:19:02,080 --> 00:19:04,440 Speaker 1: because it's one of those examples of a story which 376 00:19:04,480 --> 00:19:07,440 Speaker 1: isn't just about a new technology changing the way we 377 00:19:07,560 --> 00:19:11,320 Speaker 1: do old things, in this case college assignments, or cheating 378 00:19:11,400 --> 00:19:14,399 Speaker 1: on college assignments. It's about tech posing a challenge to 379 00:19:14,480 --> 00:19:18,320 Speaker 1: our entire system of learning in fascinating ways, and with 380 00:19:18,400 --> 00:19:20,640 Speaker 1: consequences we can't begin to fathom. 381 00:19:21,000 --> 00:19:23,600 Speaker 4: I actually think we can fathom the consequences of this. 382 00:19:23,680 --> 00:19:26,560 Speaker 3: I think as we step into a future of entirely 383 00:19:26,640 --> 00:19:30,760 Speaker 3: friction-free existence, it's really interesting to see the ways people, 384 00:19:30,880 --> 00:19:35,200 Speaker 3: especially younger, more digitally native generations, skirt around the hard 385 00:19:35,240 --> 00:19:37,800 Speaker 3: parts of being a person.
I read this article and 386 00:19:37,960 --> 00:19:41,320 Speaker 3: was consistently asking myself the question, if given the opportunity 387 00:19:41,359 --> 00:19:44,359 Speaker 3: to use something that made homework way easier, wouldn't I 388 00:19:44,480 --> 00:19:44,840 Speaker 3: use it? 389 00:19:45,240 --> 00:19:45,439 Speaker 1: You know? 390 00:19:45,520 --> 00:19:49,320 Speaker 3: I literally this morning tried to get ChatGPT to 391 00:19:49,359 --> 00:19:52,600 Speaker 3: summarize an article for me. It was the best of times, 392 00:19:52,840 --> 00:19:54,320 Speaker 3: it was the worst of times. 393 00:19:54,480 --> 00:19:56,560 Speaker 1: Well, at least you've read your Tale of Two Cities 394 00:19:56,600 --> 00:20:01,159 Speaker 1: by Charles Dickens. On SparkNotes. Without further ado, joining us to 395 00:20:01,280 --> 00:20:06,080 Speaker 1: discuss how AI is roiling education is James Walsh, a 396 00:20:06,119 --> 00:20:10,159 Speaker 1: features writer at New York Magazine's Intelligencer. James, welcome to 397 00:20:10,160 --> 00:20:10,679 Speaker 1: Tech Stuff. 398 00:20:11,119 --> 00:20:12,359 Speaker 2: Thanks so much for having me. 399 00:20:12,800 --> 00:20:15,320 Speaker 1: When did you first start to get interested in how 400 00:20:15,520 --> 00:20:17,600 Speaker 1: AI is changing college education? 401 00:20:18,160 --> 00:20:22,280 Speaker 2: Well, it actually started a few months ago, and I 402 00:20:22,960 --> 00:20:26,360 Speaker 2: just started calling college students, talking to college students, and 403 00:20:26,400 --> 00:20:30,520 Speaker 2: it quickly kind of dawned on me that everyone is cheating. Right. 404 00:20:30,680 --> 00:20:33,200 Speaker 2: They may not be using the word cheating, but they 405 00:20:33,240 --> 00:20:35,720 Speaker 2: are cheating according to their honor code. 406 00:20:36,040 --> 00:20:38,399 Speaker 3: And you open the piece with the story of the 407 00:20:38,400 --> 00:20:42,800 Speaker 3: Columbia student Roy Lee. He gained notoriety for hacking coding 408 00:20:42,880 --> 00:20:47,200 Speaker 3: tests big tech uses to assess internship applications. Why did 409 00:20:47,240 --> 00:20:49,040 Speaker 3: you want to tell his story, and what is the 410 00:20:49,080 --> 00:20:50,960 Speaker 3: bigger takeaway from his story? 411 00:20:51,160 --> 00:20:51,440 Speaker 1: Sure. 412 00:20:51,480 --> 00:20:54,960 Speaker 2: I think Roy was fascinating to me because of a number 413 00:20:55,000 --> 00:20:59,439 Speaker 2: of things. First, in order to prepare for interviews with 414 00:20:59,640 --> 00:21:04,040 Speaker 2: big tech companies like Google, or any really big tech company, 415 00:21:04,080 --> 00:21:07,879 Speaker 2: he would work on LeetCode. It's this site 416 00:21:07,960 --> 00:21:12,480 Speaker 2: that trains developers how to do these kinds of puzzles 417 00:21:12,920 --> 00:21:16,199 Speaker 2: or riddles that he doesn't really think are applicable to 418 00:21:16,280 --> 00:21:19,000 Speaker 2: any kind of real world work. So he figured that 419 00:21:19,119 --> 00:21:22,399 Speaker 2: if he could develop something that would hide AI on 420 00:21:22,480 --> 00:21:27,280 Speaker 2: his browser during a remote job interview, he could hack 421 00:21:27,400 --> 00:21:30,320 Speaker 2: these interviews. And it's not really cheating to him 422 00:21:30,400 --> 00:21:32,960 Speaker 2: if it's hackable and there's a tool that can be 423 00:21:33,080 --> 00:21:36,200 Speaker 2: used to hack an assignment.
He was thinking that if 424 00:21:36,240 --> 00:21:38,199 Speaker 2: not now, then in the near future it won't be 425 00:21:38,240 --> 00:21:41,919 Speaker 2: considered cheating. And it's very much the same way he 426 00:21:42,000 --> 00:21:45,639 Speaker 2: approaches studies. It's transactional to him. He had no interest 427 00:21:45,720 --> 00:21:49,560 Speaker 2: in kind of furthering himself or learning new things about 428 00:21:49,640 --> 00:21:54,360 Speaker 2: himself or about the world. He's there for the networking opportunity, 429 00:21:54,359 --> 00:21:56,840 Speaker 2: and he told me he's there to meet, you know, 430 00:21:56,960 --> 00:22:00,239 Speaker 2: a co founder and a wife. Roy was singular in this, 431 00:22:00,359 --> 00:22:03,760 Speaker 2: but I think the idea that if it's hackable, why 432 00:22:03,760 --> 00:22:07,360 Speaker 2: am I learning this, was something that resonated throughout all 433 00:22:07,359 --> 00:22:09,880 Speaker 2: of my interviews. And that's not just, you know, sort 434 00:22:09,880 --> 00:22:12,960 Speaker 2: of a logic. It's also just kind of outside pressure 435 00:22:13,000 --> 00:22:15,960 Speaker 2: to excel, to get really good grades. If they feel 436 00:22:16,000 --> 00:22:17,920 Speaker 2: that pressure, they're going to use this tool. 437 00:22:18,119 --> 00:22:20,480 Speaker 3: Yeah, and that goes to, I guess, a larger question 438 00:22:20,560 --> 00:22:24,119 Speaker 3: I have, which is, you know, after reporting this article, 439 00:22:24,280 --> 00:22:27,800 Speaker 3: how common is it for college students to cheat using 440 00:22:28,320 --> 00:22:29,560 Speaker 3: just AI tools now? 441 00:22:29,840 --> 00:22:31,480 Speaker 4: Oh, I 442 00:22:31,520 --> 00:22:35,600 Speaker 2: think it's incredibly common. You know, our headline says everyone 443 00:22:35,720 --> 00:22:38,200 Speaker 2: is cheating, and I don't think that's, I don't think 444 00:22:38,240 --> 00:22:40,600 Speaker 2: that's far off. One of the fascinating parts of this 445 00:22:40,680 --> 00:22:43,480 Speaker 2: article to me was talking to students and not using 446 00:22:43,520 --> 00:22:45,679 Speaker 2: the cheat word and watching them kind of work through it. 447 00:22:46,040 --> 00:22:48,080 Speaker 2: One of the students I talked to, Wendy, you know, 448 00:22:48,280 --> 00:22:52,040 Speaker 2: started our conversation by saying, I am against cheating. There 449 00:22:52,160 --> 00:22:55,560 Speaker 2: is a student handbook, and I am against cheating. I'm 450 00:22:55,560 --> 00:22:59,600 Speaker 2: against plagiarism. I'm against copying and pasting from ChatGPT into 451 00:22:59,640 --> 00:23:02,879 Speaker 2: a document. And then she proceeded to tell me exactly 452 00:23:02,920 --> 00:23:05,320 Speaker 2: how she uses ChatGPT to write all of her papers. 453 00:23:06,000 --> 00:23:08,240 Speaker 1: Now, was she the one who was using ChatGPT 454 00:23:08,800 --> 00:23:12,440 Speaker 1: to write the paper about how different modes of education 455 00:23:13,000 --> 00:23:16,399 Speaker 1: affected students' cognitive development? And had she seen the 456 00:23:16,400 --> 00:23:19,680 Speaker 1: irony that she was using ChatGPT to write this paper? 457 00:23:19,720 --> 00:23:23,120 Speaker 1: And she basically hadn't. Exactly. Yeah, that is fine. 458 00:23:23,160 --> 00:23:26,200 Speaker 4: That's because ChatGPT hasn't taught her about irony, right. 459 00:23:26,359 --> 00:23:27,240 Speaker 2: Hasn't covered that yet.
460 00:23:27,240 --> 00:23:29,040 Speaker 1: Well, she hadn't had time to think about it herself, 461 00:23:29,080 --> 00:23:30,480 Speaker 1: but she offloaded all of the work. 462 00:23:30,520 --> 00:23:34,200 Speaker 2: I mean, it's remarkable. Listen, I'm coming clean. I peeked 463 00:23:34,200 --> 00:23:35,960 Speaker 2: at SparkNotes every once in a while when I 464 00:23:36,000 --> 00:23:39,880 Speaker 2: was in college, so, of course. But was SparkNotes 465 00:23:39,920 --> 00:23:42,520 Speaker 2: sort of like something that I relied on every single 466 00:23:42,560 --> 00:23:45,000 Speaker 2: time I got stuck? No. I think I had to 467 00:23:45,080 --> 00:23:47,000 Speaker 2: work through assignments a lot more often than I could 468 00:23:47,040 --> 00:23:47,920 Speaker 2: easily hack them. 469 00:23:48,200 --> 00:23:50,600 Speaker 3: The other thing that has changed a lot since, I 470 00:23:50,640 --> 00:23:53,000 Speaker 3: guess, we were students, because you just mentioned SparkNotes, 471 00:23:53,440 --> 00:23:57,520 Speaker 3: is, like, I was not contending with the allure of 472 00:23:57,560 --> 00:24:01,119 Speaker 3: anything but a Facebook wall. And one of the women 473 00:24:01,160 --> 00:24:04,240 Speaker 3: that you spoke to was talking about not so much 474 00:24:04,280 --> 00:24:06,879 Speaker 3: how hard school is, but how hard it is to 475 00:24:07,000 --> 00:24:13,720 Speaker 3: navigate all of the other digital distractions like TikTok, Snapchat, Instagram. 476 00:24:14,200 --> 00:24:17,639 Speaker 3: And I guess, just based on, you know, your reporting 477 00:24:17,680 --> 00:24:20,359 Speaker 3: for this article, to what extent does the use of AI, 478 00:24:20,400 --> 00:24:25,120 Speaker 3: whether we're calling it cheating or enhanced studying, to what 479 00:24:25,200 --> 00:24:30,439 Speaker 3: extent does the pre-existing digital landscape co-opt the 480 00:24:30,560 --> 00:24:34,439 Speaker 3: ability to actually participate in the thing that you or 481 00:24:34,480 --> 00:24:38,520 Speaker 3: your parents or student loans are being used to send 482 00:24:38,560 --> 00:24:39,320 Speaker 3: you to university for? 483 00:24:39,359 --> 00:24:42,240 Speaker 2: I don't know. Yeah, I mean, I think they're contending 484 00:24:42,320 --> 00:24:46,879 Speaker 2: with the swirl of, whether it's social media or just, 485 00:24:46,920 --> 00:24:50,800 Speaker 2: you know, the attention economy. The fact that ChatGPT dropped, 486 00:24:50,880 --> 00:24:53,960 Speaker 2: you know, at the end of twenty twenty two is 487 00:24:54,160 --> 00:24:57,399 Speaker 2: fascinating, because we just figured out social media in school. 488 00:24:58,040 --> 00:25:01,560 Speaker 2: We're finally taking that seriously, and suddenly it's like, oh, well, here, 489 00:25:01,680 --> 00:25:04,280 Speaker 2: we're going to offer the greatest cheating tool that has 490 00:25:04,320 --> 00:25:06,960 Speaker 2: ever been created to co-opt people's attention. I mean, 491 00:25:07,119 --> 00:25:09,840 Speaker 2: I think one of the fascinating parts of this to 492 00:25:09,880 --> 00:25:12,840 Speaker 2: me was the kind of introduction of these websites, Chegg 493 00:25:12,920 --> 00:25:14,639 Speaker 2: and Course Hero, which, you know, I didn't have 494 00:25:14,680 --> 00:25:16,919 Speaker 2: when I was an undergrad, but in a way, it 495 00:25:17,000 --> 00:25:20,320 Speaker 2: was like priming students to think it was okay to cheat.
496 00:25:20,880 --> 00:25:22,719 Speaker 1: These are websites where you could pay, like, an outside 497 00:25:22,800 --> 00:25:24,560 Speaker 1: service to do your work for you. 498 00:25:24,400 --> 00:25:26,560 Speaker 2: Right. And you know, a website like Chegg was employing 499 00:25:26,800 --> 00:25:31,280 Speaker 2: something like one hundred and fifty thousand experts, mostly in India, 500 00:25:31,320 --> 00:25:36,240 Speaker 2: who would provide answers to questions in thirty minutes or less. 501 00:25:36,600 --> 00:25:40,200 Speaker 2: And then ChatGPT comes and you just see Chegg's stock 502 00:25:40,240 --> 00:25:44,200 Speaker 2: price just tank, because it was like one cheating tool 503 00:25:44,640 --> 00:25:45,639 Speaker 2: replacing another. 504 00:25:46,200 --> 00:25:49,320 Speaker 1: Is there an awareness that, actually, as fun and as 505 00:25:49,560 --> 00:25:52,680 Speaker 1: thrilling as hacking is, there's a real long term 506 00:25:52,800 --> 00:25:55,560 Speaker 1: price being paid in terms of how your mind is developing? 507 00:25:55,680 --> 00:25:57,359 Speaker 2: Yeah, I mean, I think there's certainly awareness on the 508 00:25:57,359 --> 00:25:59,720 Speaker 2: part of the professors and the people who are concerned 509 00:25:59,720 --> 00:26:03,120 Speaker 2: about that. There is, I will say, an awareness among students. 510 00:26:03,480 --> 00:26:05,600 Speaker 2: A lot of students were willing to engage in this, 511 00:26:05,960 --> 00:26:08,159 Speaker 2: and I was surprised by how many of the students 512 00:26:08,160 --> 00:26:10,000 Speaker 2: this was the first time they were having this conversation, 513 00:26:10,359 --> 00:26:12,040 Speaker 2: but they were eager to talk to me about this. 514 00:26:13,080 --> 00:26:18,520 Speaker 3: How forward-thinking are academic institutions and educators about this, 515 00:26:18,760 --> 00:26:20,960 Speaker 3: like, on the other side of things? Because it's like, 516 00:26:21,160 --> 00:26:23,520 Speaker 3: cheating is a little bit in the eye of the 517 00:26:23,600 --> 00:26:26,280 Speaker 3: cheater, or is it in the eye of the place 518 00:26:26,400 --> 00:26:30,159 Speaker 3: that's being cheated? Yeah, I mean, that was such a 519 00:26:30,160 --> 00:26:31,399 Speaker 3: thing when I was younger. 520 00:26:31,160 --> 00:26:32,960 Speaker 1: Which is like, you're only cheating yourself. 521 00:26:33,240 --> 00:26:35,720 Speaker 3: Yeah, well, you're only cheating yourself, but, like, it was 522 00:26:35,760 --> 00:26:38,000 Speaker 3: also you could be kicked out of school. Right, right. 523 00:26:38,200 --> 00:26:40,960 Speaker 3: So to what extent are academic institutions, like, trying to 524 00:26:41,000 --> 00:26:41,639 Speaker 3: regulate this? 525 00:26:41,840 --> 00:26:44,920 Speaker 2: Yeah, the approach that most schools are taking is kind 526 00:26:44,920 --> 00:26:47,520 Speaker 2: of ad hoc. It's leaving it up to professors to 527 00:26:47,560 --> 00:26:51,359 Speaker 2: decide how they want to handle this. I'm kind of 528 00:26:51,359 --> 00:26:54,080 Speaker 2: sympathetic to that, because it's such a difficult thing to regulate.
529 00:26:54,160 --> 00:26:57,080 Speaker 2: How on earth do you tell students not to use something 530 00:26:57,200 --> 00:26:59,960 Speaker 2: that can help them and is so difficult to catch, 531 00:27:00,359 --> 00:27:03,479 Speaker 2: you know? And so professors say either, you know, use it, 532 00:27:03,880 --> 00:27:06,480 Speaker 2: don't use it, or if you do use it, please 533 00:27:06,520 --> 00:27:09,520 Speaker 2: cite it, please provide a receipt, you know, that shows 534 00:27:09,560 --> 00:27:11,760 Speaker 2: the conversation you're having with ChatGPT, so I can 535 00:27:11,800 --> 00:27:15,600 Speaker 2: watch kind of the gears turning. But again, it's really 536 00:27:15,640 --> 00:27:18,320 Speaker 2: hard to catch AI cheaters, you know. They have these 537 00:27:18,320 --> 00:27:23,240 Speaker 2: detection tools that really vary in their effectiveness, and even 538 00:27:23,320 --> 00:27:25,840 Speaker 2: if you are able to catch somebody using it just 539 00:27:25,880 --> 00:27:28,440 Speaker 2: by copying and pasting, you can't really catch somebody who's 540 00:27:28,520 --> 00:27:31,440 Speaker 2: just using it to generate ideas or generate topic sentences 541 00:27:31,480 --> 00:27:34,720 Speaker 2: and rewriting. And you can always launder AI text through 542 00:27:34,800 --> 00:27:38,159 Speaker 2: other AIs so that an AI detector can't really catch it. 543 00:27:38,640 --> 00:27:42,040 Speaker 2: So schools have quite the challenge in front of them. 544 00:27:42,560 --> 00:27:46,400 Speaker 2: And the challenge, I think, is convincing young students as 545 00:27:46,440 --> 00:27:49,440 Speaker 2: they come to their school why it's in their best 546 00:27:49,440 --> 00:27:50,840 Speaker 2: interests not to use AI. 547 00:27:51,200 --> 00:27:56,359 Speaker 1: Well, it's a fascinating moment for elite higher education institutions 548 00:27:56,400 --> 00:27:59,840 Speaker 1: in general, right? Obviously in the crosshairs of the 549 00:27:59,840 --> 00:28:03,119 Speaker 1: Trump administration. There was that David Brooks piece in 550 00:28:03,160 --> 00:28:06,000 Speaker 1: The Atlantic about three months ago about how kind of 551 00:28:06,200 --> 00:28:09,680 Speaker 1: elites and elite universities had failed and were starting 552 00:28:09,680 --> 00:28:11,880 Speaker 1: to pay the price, and people were wondering whether the 553 00:28:11,920 --> 00:28:14,400 Speaker 1: price tag of going was even worth it. So there's 554 00:28:14,440 --> 00:28:18,400 Speaker 1: this kind of, like, now tremendous new accelerant to those issues. 555 00:28:19,080 --> 00:28:21,800 Speaker 1: In terms of the neurological development side, did you speak 556 00:28:21,840 --> 00:28:25,639 Speaker 1: to any cognitive psychologists or neuroscientists about what this does? I 557 00:28:25,640 --> 00:28:28,560 Speaker 1: think I've heard the term cognitive offloading before, but, like, 558 00:28:28,880 --> 00:28:31,080 Speaker 1: what's this doing to brains? Sure. 559 00:28:31,119 --> 00:28:34,720 Speaker 2: I mean, I tried to dig into the research that's 560 00:28:34,760 --> 00:28:40,400 Speaker 2: out there on cognitive offloading, and there are a few 561 00:28:40,440 --> 00:28:42,600 Speaker 2: studies here and there that kind of show that, you know, 562 00:28:42,720 --> 00:28:49,640 Speaker 2: reliance on AI will reduce critical thinking. That's not necessarily surprising.
563 00:28:50,000 --> 00:28:52,800 Speaker 2: But I didn't want to lean too heavily on that 564 00:28:52,840 --> 00:28:56,600 Speaker 2: research, or delve into it too much, because it's so early, 565 00:28:56,720 --> 00:28:59,480 Speaker 2: and so I'd rather kind of let the students speak 566 00:28:59,520 --> 00:29:01,200 Speaker 2: for themselves. I mean, what's fascinating to me is how 567 00:29:01,240 --> 00:29:04,080 Speaker 2: quickly this is happening also, right? The sudden realization, I 568 00:29:04,120 --> 00:29:07,160 Speaker 2: was like, oh my god, half of college students have 569 00:29:07,280 --> 00:29:10,520 Speaker 2: never known college without access to this. I do think 570 00:29:11,160 --> 00:29:13,640 Speaker 2: sooner rather than later, we'll kind of have a better 571 00:29:13,760 --> 00:29:16,120 Speaker 2: understanding of what's happening to people's brains. 572 00:29:16,600 --> 00:29:18,840 Speaker 1: I thought that was what was particularly brilliant about your piece. 573 00:29:18,880 --> 00:29:21,280 Speaker 1: It was almost like a piece of anthropology in terms 574 00:29:21,320 --> 00:29:24,200 Speaker 1: of you got to hear students in their own words, 575 00:29:24,560 --> 00:29:28,800 Speaker 1: wrestling with this problem. There's something which is very tempting 576 00:29:28,840 --> 00:29:30,520 Speaker 1: in front of them that they know is bad, but 577 00:29:30,520 --> 00:29:32,080 Speaker 1: they don't know what else to do. And I thought 578 00:29:32,080 --> 00:29:33,760 Speaker 1: that the drama of that really came across. 579 00:29:33,920 --> 00:29:36,720 Speaker 3: I felt a little bit of nostalgia when I was 580 00:29:36,720 --> 00:29:43,480 Speaker 3: reading your piece for the forward thinking, vigilante, TI eighty 581 00:29:43,560 --> 00:29:47,959 Speaker 3: three hacking Kara that existed when I was in twelfth grade, 582 00:29:48,280 --> 00:29:50,960 Speaker 3: and I was just, you know, I'm past the point 583 00:29:51,040 --> 00:29:55,440 Speaker 3: of using ChatGPT for cognitive offloading to a certain extent, 584 00:29:55,520 --> 00:29:57,840 Speaker 3: because it doesn't feel native to me now, and I 585 00:29:58,000 --> 00:29:59,840 Speaker 3: just wonder if it's, I don't want to use the word worse, 586 00:30:00,320 --> 00:30:04,040 Speaker 3: but just more ubiquitous in terms of, you know, students 587 00:30:04,160 --> 00:30:08,760 Speaker 3: using it as a method of, you know, skirting 588 00:30:08,400 --> 00:30:11,600 Speaker 1: the opportunity to develop their own critical faculties. 589 00:30:11,680 --> 00:30:15,160 Speaker 3: Yeah, and also just to the point of, like, people 590 00:30:15,200 --> 00:30:17,240 Speaker 3: want to spend time doing other things. I think that's 591 00:30:17,240 --> 00:30:19,280 Speaker 3: always been true, right? Where it's like, you went to 592 00:30:19,320 --> 00:30:21,719 Speaker 3: college and you're like, well, I'd rather be hooking up 593 00:30:21,800 --> 00:30:24,480 Speaker 3: or partying or doing something else. Now it's like, I'd 594 00:30:24,560 --> 00:30:28,520 Speaker 3: rather be on TikTok, Snapchat and Instagram than do my homework, 595 00:30:28,640 --> 00:30:31,120 Speaker 3: you know. So I think that's the difference,
It's like, 596 00:30:31,200 --> 00:30:33,960 Speaker 3: it also is taking out a socialization piece, which 597 00:30:34,280 --> 00:30:36,600 Speaker 3: really was a part of going to school as well, 598 00:30:36,800 --> 00:30:38,800 Speaker 3: which was like, I'm going to talk to my peers 599 00:30:39,360 --> 00:30:41,680 Speaker 3: about what they're writing about. We're going to maybe sit 600 00:30:41,720 --> 00:30:43,600 Speaker 3: in the library together and confer. Now it's like 601 00:30:43,640 --> 00:30:47,360 Speaker 3: we're sitting around talking about how ChatGPT, you know, is 602 00:30:47,400 --> 00:30:49,120 Speaker 3: helping us write a Chaucer essay. 603 00:30:49,680 --> 00:30:51,840 Speaker 2: Sure, I mean, something that comes up time and time again 604 00:30:52,400 --> 00:30:54,480 Speaker 2: is, you know, the kind of one on one 605 00:30:54,680 --> 00:30:57,760 Speaker 2: learning that students can do with ChatGPT. They have 606 00:30:57,920 --> 00:31:01,760 Speaker 2: this brilliant TA at their fingertips at all times, and 607 00:31:02,400 --> 00:31:04,600 Speaker 2: over and over, talking to students, they're like, I do it. 608 00:31:04,600 --> 00:31:06,760 Speaker 2: It's instead of office hours, it's instead of office hours. I talked 609 00:31:06,800 --> 00:31:09,960 Speaker 2: to professors who said, like, our office hours have just tanked. 610 00:31:10,600 --> 00:31:14,280 Speaker 2: People aren't showing up, and of course something is lost, right? 611 00:31:14,360 --> 00:31:17,600 Speaker 2: And you know, I was kind of shy in college, 612 00:31:17,640 --> 00:31:21,280 Speaker 2: like I took some amount of, like, staring in the mirror, 613 00:31:21,320 --> 00:31:22,400 Speaker 2: being like, all right, you're going to show up to 614 00:31:22,400 --> 00:31:24,880 Speaker 2: office hours. And I think I got something out of that. 615 00:31:25,600 --> 00:31:28,240 Speaker 2: So I think that the loss of those interactions is 616 00:31:28,320 --> 00:31:30,360 Speaker 2: going to be measurable in some ways as well. 617 00:31:31,040 --> 00:31:33,840 Speaker 1: James, just to close, one of the most surprising moments 618 00:31:33,840 --> 00:31:36,800 Speaker 1: in your piece is a quote from Sam Altman, who 619 00:31:36,880 --> 00:31:40,440 Speaker 1: said before Congress, I worry that as models get better 620 00:31:40,480 --> 00:31:43,440 Speaker 1: and better, users can have sort of less and less 621 00:31:43,440 --> 00:31:47,840 Speaker 1: of their own discriminating process. Surprised to hear that from him. 622 00:31:48,040 --> 00:31:50,800 Speaker 1: I'm curious, did you reach out for traditional comment, and 623 00:31:50,840 --> 00:31:54,080 Speaker 1: how are the tech companies as a whole responding to 624 00:31:54,120 --> 00:31:55,560 Speaker 1: this phenomenon? Right. 625 00:31:56,080 --> 00:31:58,600 Speaker 2: Sam Altman said that in, I think, twenty twenty three. 626 00:31:59,160 --> 00:32:01,480 Speaker 2: We of course reached out to OpenAI for comment, 627 00:32:01,560 --> 00:32:05,560 Speaker 2: and they pointed us toward just their education platform. You know, 628 00:32:05,640 --> 00:32:10,160 Speaker 2: I think this is something that the platforms are very 629 00:32:10,160 --> 00:32:14,480 Speaker 2: aware of, and even in the context of education.
I 630 00:32:14,520 --> 00:32:19,400 Speaker 2: spoke to somebody from Anthropic about this on their education team, 631 00:32:19,800 --> 00:32:23,680 Speaker 2: and he said that they had expected students to be 632 00:32:23,800 --> 00:32:26,280 Speaker 2: some of the, you know, the earliest adopters, but they 633 00:32:26,280 --> 00:32:30,440 Speaker 2: were still shocked by how true that is and how 634 00:32:30,640 --> 00:32:35,240 Speaker 2: much adoption there is on college campuses. And 635 00:32:35,280 --> 00:32:39,000 Speaker 2: they are also, you know, concerned about the implications of that. 636 00:32:39,480 --> 00:32:43,400 Speaker 2: You know, OpenAI reportedly has its own watermark 637 00:32:43,680 --> 00:32:47,000 Speaker 2: that would effectively cut down on plagiarism, but has chosen 638 00:32:47,160 --> 00:32:49,400 Speaker 2: not to release it. So, you know, I'm really interested in 639 00:32:49,640 --> 00:32:52,480 Speaker 2: what those conversations inside the company are like as well. 640 00:32:52,960 --> 00:32:54,880 Speaker 3: I mean, the fact that OpenAI could use 641 00:32:54,920 --> 00:32:57,680 Speaker 3: a watermark but refuses to really shows me who 642 00:32:57,720 --> 00:32:58,920 Speaker 3: these companies are targeting. 643 00:32:59,520 --> 00:33:01,760 Speaker 2: I mean, one of the most dystopian things that happened 644 00:33:01,760 --> 00:33:05,120 Speaker 2: to me while we were closing this piece: I got, 645 00:33:05,160 --> 00:33:08,920 Speaker 2: like, the push alert about Google launching this AI chatbot 646 00:33:08,920 --> 00:33:12,000 Speaker 2: for children under thirteen, and it was just like more 647 00:33:12,040 --> 00:33:15,120 Speaker 2: evidence that all these platforms are in a race to 648 00:33:15,200 --> 00:33:19,239 Speaker 2: capture this kind of, like, loyalty among younger users. And 649 00:33:19,320 --> 00:33:21,640 Speaker 2: it just seems like a moment when we should kind 650 00:33:21,640 --> 00:33:22,800 Speaker 2: of all be putting the brakes on. 651 00:33:25,040 --> 00:33:25,360 Speaker 1: James. 652 00:33:25,360 --> 00:33:28,320 Speaker 4: Thank you, thank you, James, thank you very much for 653 00:33:28,320 --> 00:33:38,080 Speaker 4: having me. That's it for this week for Tech Stuff. 654 00:33:38,080 --> 00:33:39,120 Speaker 4: I'm Kara Price and 655 00:33:39,080 --> 00:33:42,280 Speaker 1: I'm Oz Woloshyn. This episode was produced by Eliza Dennis 656 00:33:42,280 --> 00:33:45,240 Speaker 1: and Victoria Dominguez. It was executive produced by me, 657 00:33:45,600 --> 00:33:48,960 Speaker 1: Kara Price, and Kate Osborne for Kaleidoscope and Katrina 658 00:33:49,040 --> 00:33:52,959 Speaker 1: Norvell for iHeart Podcasts. The engineer is Phite Fraser and 659 00:33:53,080 --> 00:33:56,360 Speaker 1: Kyle Murdock mixed this episode. He also wrote our theme song. 660 00:33:56,680 --> 00:33:59,240 Speaker 3: Join us next Wednesday for Tech Stuff: The Story, when 661 00:33:59,320 --> 00:34:01,880 Speaker 3: we will share an in-depth conversation with Sir David 662 00:34:01,920 --> 00:34:06,240 Speaker 3: Spiegelhalter to discuss all things risk in life, love, and 663 00:34:06,320 --> 00:34:07,440 Speaker 3: of course tech.
664 00:34:08,160 --> 00:34:10,719 Speaker 1: Please rate and review the show on Apple Podcasts or 665 00:34:10,719 --> 00:34:13,600 Speaker 1: Spotify or wherever you listen to your podcasts, and you 666 00:34:13,600 --> 00:34:15,879 Speaker 1: can also write to us at Tech Stuff podcast at 667 00:34:15,920 --> 00:34:18,360 Speaker 1: gmail dot com. We really like getting your feedback.