1 00:00:10,960 --> 00:00:12,960 Speaker 1: I hadn't done a pitch call out in about a year, 2 00:00:12,960 --> 00:00:15,120 Speaker 1: I guess, so I hadn't really been paying attention to 3 00:00:15,200 --> 00:00:17,759 Speaker 1: what that world was like. But this time I got, like, 4 00:00:17,960 --> 00:00:22,000 Speaker 1: way more pitches than ever, very quickly, from all over 5 00:00:22,040 --> 00:00:24,680 Speaker 1: the place. It was very different than the year before. 6 00:00:25,200 --> 00:00:28,280 Speaker 2: Nicholas Hune-Brown is the editor of The Local, which 7 00:00:28,360 --> 00:00:32,120 Speaker 2: is an independent online magazine that covers social issues in Toronto. 8 00:00:32,600 --> 00:00:35,360 Speaker 2: Last September, they put out an open call for freelance 9 00:00:35,440 --> 00:00:38,320 Speaker 2: journalists to pitch stories for their upcoming issue. 10 00:00:38,400 --> 00:00:41,760 Speaker 1: We were looking to assign stories about healthcare and money. 11 00:00:41,880 --> 00:00:45,839 Speaker 1: There's been some creeping privatization in Ontario where we live, 12 00:00:46,159 --> 00:00:48,600 Speaker 1: so we put out a call for pitches. I posted 13 00:00:48,600 --> 00:00:51,880 Speaker 1: something on Bluesky just asking freelancers to give me suggestions. 14 00:00:52,200 --> 00:00:54,920 Speaker 2: Nick got a lot of pitches, but among those, there 15 00:00:55,000 --> 00:00:57,560 Speaker 2: was one in particular that stood out. 16 00:00:57,800 --> 00:01:00,640 Speaker 1: This was a writer who called herself Victoria Goldie. She 17 00:01:00,720 --> 00:01:02,520 Speaker 1: said in the email that she'd written for a bunch 18 00:01:02,560 --> 00:01:05,160 Speaker 1: of these Canadian publications that do similar work to what 19 00:01:05,200 --> 00:01:07,319 Speaker 1: we do.
It was a well-written pitch, I thought, 20 00:01:07,480 --> 00:01:09,160 Speaker 1: and when I did a quick Google of her, she 21 00:01:09,240 --> 00:01:12,320 Speaker 1: had bylines in a bunch of, you know, reputable publications, 22 00:01:12,360 --> 00:01:15,319 Speaker 1: The Guardian, New York Magazine, places like that. You know, 23 00:01:15,360 --> 00:01:17,680 Speaker 1: when you see someone has written for all these other publications, 24 00:01:17,880 --> 00:01:20,399 Speaker 1: my first instinct was, like, that seems legit. You know, 25 00:01:20,440 --> 00:01:22,840 Speaker 1: these other publications are good publications. I'm not going to 26 00:01:22,840 --> 00:01:23,640 Speaker 1: be too suspicious. 27 00:01:24,080 --> 00:01:24,720 Speaker 3: Why would you be? 28 00:01:25,080 --> 00:01:28,840 Speaker 2: Yeah, but Victoria Goldie turned out not to be who 29 00:01:28,880 --> 00:01:31,760 Speaker 2: she said she was, and this sent Nick down a 30 00:01:31,800 --> 00:01:35,839 Speaker 2: really weird rabbit hole investigation. You've heard of fake news? Well, 31 00:01:36,040 --> 00:01:44,200 Speaker 2: welcome to a new scam: fake journalists. From Kaleidoscope and 32 00:01:44,280 --> 00:01:45,240 Speaker 2: iHeart Podcasts. 33 00:01:50,560 --> 00:01:56,560 Speaker 3: This is kill switch. I'm Dexter Thomas. 34 00:01:58,040 --> 00:02:35,359 Speaker 4: [theme music] 35 00:02:39,240 --> 00:02:41,440 Speaker 2: Back in September, when Nick first put out this call 36 00:02:41,480 --> 00:02:45,000 Speaker 2: for pitches, the magazine was deliberately trying to expand their 37 00:02:45,040 --> 00:02:48,399 Speaker 2: freelance writer pool. They were hoping to find new writers, 38 00:02:48,680 --> 00:02:51,600 Speaker 2: new voices, people with interesting stories to tell. 39 00:02:52,280 --> 00:02:53,880 Speaker 1: Part of our mandate is we want to work with 40 00:02:53,919 --> 00:02:56,560 Speaker 1: young writers.
We want to work with people from different 41 00:02:56,560 --> 00:02:59,720 Speaker 1: parts of the city, different backgrounds. We are actively seeking 42 00:02:59,760 --> 00:03:02,760 Speaker 1: to work with people who maybe don't have the clips, 43 00:03:02,880 --> 00:03:05,960 Speaker 1: are not as experienced, but are able to bring a 44 00:03:06,000 --> 00:03:07,880 Speaker 1: point of view and a story to us that we 45 00:03:07,919 --> 00:03:10,800 Speaker 1: won't find elsewhere. So, like, the pitch meant a whole lot. 46 00:03:10,840 --> 00:03:12,920 Speaker 1: Like if you could tell me an awesome story in 47 00:03:12,960 --> 00:03:15,080 Speaker 1: a pitch, if I could see your style in the pitch, 48 00:03:15,160 --> 00:03:17,440 Speaker 1: if I could see that you had, like, an idea 49 00:03:17,480 --> 00:03:19,880 Speaker 1: I couldn't find elsewhere, that would go a long way. 50 00:03:21,120 --> 00:03:24,320 Speaker 2: And Victoria Goldie's pitch hit all those things. It was 51 00:03:24,440 --> 00:03:27,359 Speaker 2: exactly what Nick was looking for. Her pitch was about 52 00:03:27,360 --> 00:03:31,400 Speaker 2: a debate in Canada about something called membership medicine. So 53 00:03:31,560 --> 00:03:34,080 Speaker 2: you probably know that Canada has free essential health care, 54 00:03:34,120 --> 00:03:38,040 Speaker 2: but membership medicine is basically when patients pay their doctors 55 00:03:38,080 --> 00:03:42,840 Speaker 2: directly for more premium services, and man, Victoria was on it. 56 00:03:42,960 --> 00:03:46,160 Speaker 2: In that initial email to Nick, she wrote, quote, the 57 00:03:46,240 --> 00:03:50,040 Speaker 2: story would track how these plans transform healthcare into something 58 00:03:50,080 --> 00:03:53,720 Speaker 2: resembling Netflix or Amazon Prime, and what this means for 59 00:03:53,800 --> 00:03:57,760 Speaker 2: a public system that has long prided itself on universality.
60 00:03:58,160 --> 00:04:00,120 Speaker 1: The pitch she sent indicated that she'd done a 61 00:04:00,120 --> 00:04:03,080 Speaker 1: lot of reporting already. She'd done some research. She said 62 00:04:03,080 --> 00:04:06,400 Speaker 1: she'd spoken with a couple prominent doctors here. She seemed 63 00:04:06,400 --> 00:04:08,400 Speaker 1: to have some, like, patients who were willing to talk 64 00:04:08,440 --> 00:04:11,880 Speaker 1: about their experiences with privatized medical care. It seemed like 65 00:04:11,880 --> 00:04:15,040 Speaker 1: a promising pitch, so we thought, you know, let's take 66 00:04:15,040 --> 00:04:15,800 Speaker 1: a shot on this person. 67 00:04:17,000 --> 00:04:20,240 Speaker 2: Just as a fellow freelancer, I appreciate Nick taking that 68 00:04:20,240 --> 00:04:23,440 Speaker 2: shot on Victoria, because I've been there. You're trying to 69 00:04:23,480 --> 00:04:25,920 Speaker 2: show an editor that you're the person to hire, so 70 00:04:26,040 --> 00:04:29,440 Speaker 2: you do all this extra work up front, like, look, 71 00:04:29,600 --> 00:04:32,120 Speaker 2: I've already done this interview, I've already got access to 72 00:04:32,160 --> 00:04:34,520 Speaker 2: the hospital. Just say the word, boss, and I will 73 00:04:34,520 --> 00:04:39,000 Speaker 2: write the best article just for you. So please understand, 74 00:04:39,080 --> 00:04:42,960 Speaker 2: right now, at this point, I am rooting for Victoria Goldie. 75 00:04:43,120 --> 00:04:46,000 Speaker 2: So in their next meeting, Nick and his editorial team 76 00:04:46,080 --> 00:04:48,880 Speaker 2: decide to move forward with Victoria. But then he looked 77 00:04:48,920 --> 00:04:53,320 Speaker 2: at Victoria's original email again and realized something that he 78 00:04:53,680 --> 00:04:54,920 Speaker 2: hadn't noticed before.
79 00:04:55,560 --> 00:04:57,560 Speaker 1: You know, I went back to her pitch, and some 80 00:04:57,680 --> 00:05:01,160 Speaker 1: other questions popped up in my mind. Like the biggest 81 00:05:01,160 --> 00:05:03,240 Speaker 1: one was: is she in Toronto? You know, when I 82 00:05:03,240 --> 00:05:04,880 Speaker 1: looked at her Guardian stuff, it was all about her 83 00:05:05,000 --> 00:05:09,040 Speaker 1: being in England. Once she'd done stuff for American magazines, 84 00:05:09,080 --> 00:05:11,479 Speaker 1: it seemed as if she was writing as an American. 85 00:05:11,720 --> 00:05:13,440 Speaker 1: So I wondered. You know, there are people who 86 00:05:13,440 --> 00:05:16,360 Speaker 1: are international. There's no reason that you couldn't be publishing 87 00:05:16,400 --> 00:05:19,000 Speaker 1: and reporting from different places. But yeah, it was a 88 00:05:19,040 --> 00:05:21,200 Speaker 1: question I had. And the second was that she'd done 89 00:05:21,279 --> 00:05:23,240 Speaker 1: so much pre-reporting for this. She'd done so much 90 00:05:23,279 --> 00:05:26,200 Speaker 1: work on this pitch. Within, like, you know, a week 91 00:05:26,279 --> 00:05:28,560 Speaker 1: or two of me sending out the call out, she'd 92 00:05:28,560 --> 00:05:31,200 Speaker 1: interviewed four people or something that were in that first pitch. 93 00:05:31,400 --> 00:05:33,680 Speaker 1: And I was a freelancer forever. I know that process, 94 00:05:33,720 --> 00:05:36,640 Speaker 1: and sometimes it's useful to do some reporting before you 95 00:05:36,640 --> 00:05:39,400 Speaker 1: get a story assigned. But, like, that much? That's a lot. That's 96 00:05:39,440 --> 00:05:41,240 Speaker 1: a risk, right? There's no guarantee that 97 00:05:41,240 --> 00:05:42,960 Speaker 1: you're going to land the story.
So then I thought, 98 00:05:43,000 --> 00:05:45,119 Speaker 1: you know, there's times where someone could be a student 99 00:05:45,160 --> 00:05:48,440 Speaker 1: journalist and then they're reusing interviews they've done before. There's 100 00:05:48,440 --> 00:05:52,040 Speaker 1: different reasons that could happen. But that made me question 101 00:05:52,160 --> 00:05:56,280 Speaker 1: some things, so I dug a little deeper. So then 102 00:05:56,320 --> 00:05:59,200 Speaker 1: I wanted to just see what stories she'd done for 103 00:05:59,240 --> 00:06:01,840 Speaker 1: Canadian publications. I'd seen the work that she'd done at 104 00:06:01,839 --> 00:06:04,480 Speaker 1: The Guardian, the work she'd done elsewhere, and in her 105 00:06:04,480 --> 00:06:06,520 Speaker 1: pitch she said she'd written for national papers like 106 00:06:06,560 --> 00:06:08,880 Speaker 1: the Globe and Mail, the Walrus, kind of 107 00:06:09,080 --> 00:06:11,159 Speaker 1: magazines that are like us. So I just googled her 108 00:06:11,279 --> 00:06:14,880 Speaker 1: name along with those publications, and nothing came up. This 109 00:06:15,120 --> 00:06:18,640 Speaker 2: is where Nick starts to get really suspicious. Could Victoria 110 00:06:18,760 --> 00:06:21,680 Speaker 2: just be lying about publishing in all these different places? 111 00:06:22,160 --> 00:06:25,320 Speaker 2: And if she was, what else could she be lying about? 112 00:06:25,920 --> 00:06:27,680 Speaker 2: He went back to look at her pitch to see 113 00:06:27,680 --> 00:06:30,640 Speaker 1: if there was any other explanation for this. Then I 114 00:06:30,680 --> 00:06:32,400 Speaker 1: was looking at all these quotes that she had in 115 00:06:32,440 --> 00:06:35,239 Speaker 1: her pitch, and some of them were from regular patients, 116 00:06:35,240 --> 00:06:38,040 Speaker 1: she said, and some were from some prominent doctors in Toronto.
117 00:06:38,120 --> 00:06:40,279 Speaker 1: And my colleague happens to know one of these doctors. 118 00:06:40,320 --> 00:06:41,720 Speaker 1: So he shot her an email and said, did you 119 00:06:41,760 --> 00:06:44,880 Speaker 1: speak to a journalist called Victoria Goldie? And this doctor, 120 00:06:44,960 --> 00:06:48,279 Speaker 1: doctor Daniel Martin, said, no, never heard of her. That's 121 00:06:48,880 --> 00:06:52,800 Speaker 1: very weird that I'm being quoted this way. So when 122 00:06:52,800 --> 00:06:55,560 Speaker 1: that happened, I kind of knew. I knew what was 123 00:06:55,600 --> 00:06:57,919 Speaker 1: going on, right? Like, I looked at the pitch again, 124 00:06:58,160 --> 00:07:00,760 Speaker 1: I read it with kind of different eyes, and it 125 00:07:00,760 --> 00:07:03,839 Speaker 1: felt to me as if someone had been using generative 126 00:07:03,880 --> 00:07:06,680 Speaker 1: AI to create a pitch with made-up quotes from 127 00:07:06,760 --> 00:07:10,000 Speaker 1: real people, and they had been making up some of 128 00:07:10,040 --> 00:07:14,120 Speaker 1: their bylines in Canadian publications. And when you reread it again, 129 00:07:14,200 --> 00:07:18,160 Speaker 1: it was like there were some ChatGPT-isms, you know, 130 00:07:18,480 --> 00:07:22,120 Speaker 1: there was some, like, formulaic writing: this pitch 131 00:07:22,160 --> 00:07:24,320 Speaker 1: is important because of X, it's timely because of Y, 132 00:07:24,320 --> 00:07:26,480 Speaker 1: it's perfect for your publication because of Z. It felt 133 00:07:26,480 --> 00:07:28,800 Speaker 1: a little... When I looked at it more closely, you 134 00:07:28,840 --> 00:07:31,440 Speaker 1: could see some things that felt a little inhuman. 135 00:07:31,560 --> 00:07:35,320 Speaker 2: I guess... Hmmm, see, that's tough, because I've definitely pitched 136 00:07:35,320 --> 00:07:39,520 Speaker 2: things where I've used language like that.
No, totally, like 137 00:07:39,560 --> 00:07:42,280 Speaker 2: that's what you do. Like, you tell them, hey, I'd like 138 00:07:42,400 --> 00:07:45,080 Speaker 2: to write this, and it's timely because this is happening. 139 00:07:45,080 --> 00:07:47,400 Speaker 2: It's the anniversary of whatever. We should revisit this. 140 00:07:47,720 --> 00:07:50,840 Speaker 1: Yeah, no, all the tells for ChatGPT or whatever. 141 00:07:50,960 --> 00:07:53,040 Speaker 1: Like people say looking for em dashes, and I'm so 142 00:07:53,080 --> 00:07:55,000 Speaker 1: resentful of that. I use em dashes all the time, 143 00:07:55,120 --> 00:07:58,520 Speaker 1: like, same. So I don't know, I don't have 144 00:07:58,560 --> 00:08:01,400 Speaker 1: a good way of spotting something. But the made-up 145 00:08:01,480 --> 00:08:04,320 Speaker 1: quotes and the kind of made-up bylines made me 146 00:08:04,320 --> 00:08:05,480 Speaker 1: think this is what was happening. 147 00:08:06,240 --> 00:08:09,400 Speaker 2: So okay, you're starting to think that this is AI. 148 00:08:09,960 --> 00:08:10,800 Speaker 2: What do you do next? 149 00:08:11,440 --> 00:08:13,560 Speaker 1: Then I just got really curious, right? Like, it's clear 150 00:08:13,600 --> 00:08:15,239 Speaker 1: that this was a fake pitch, but like, what about 151 00:08:15,240 --> 00:08:18,440 Speaker 1: all of those other bylines? Like, all those other stories? 152 00:08:18,520 --> 00:08:21,080 Speaker 1: There's dozens of them across the Internet that she's written. 153 00:08:21,200 --> 00:08:23,360 Speaker 1: Like if she was faking stuff in this pitch for 154 00:08:23,440 --> 00:08:26,400 Speaker 1: us so blatantly, what about those stories? 155 00:08:28,760 --> 00:08:32,040 Speaker 2: So Nick started looking at Victoria Goldie's previous body of 156 00:08:32,120 --> 00:08:36,120 Speaker 2: work and he kind of got obsessed, and that obsession 157 00:08:36,720 --> 00:08:41,640 Speaker 2: led him to some pretty weird discoveries.
That's after the break. 158 00:08:52,920 --> 00:08:55,680 Speaker 2: After Nick puts together that Victoria seems to have not 159 00:08:55,760 --> 00:08:59,240 Speaker 2: only ChatGPT'd her pitch, but also lied about writing 160 00:08:59,240 --> 00:09:02,480 Speaker 2: for other Canadian publications, he decides that he needs to 161 00:09:02,520 --> 00:09:05,880 Speaker 2: figure out how far this rabbit hole goes. He's pretty 162 00:09:05,960 --> 00:09:11,040 Speaker 2: sure he's being lied to, but he's still curious. You 163 00:09:11,080 --> 00:09:12,839 Speaker 2: didn't confront her immediately? 164 00:09:13,520 --> 00:09:15,800 Speaker 1: No, I wanted to know a little more. I 165 00:09:15,800 --> 00:09:17,959 Speaker 1: did send some follow-up emails. I said, like, did 166 00:09:17,960 --> 00:09:19,679 Speaker 1: you actually speak with these people? And she said yes, 167 00:09:19,720 --> 00:09:21,400 Speaker 1: I did. I said, are you based in Toronto? Because 168 00:09:21,400 --> 00:09:23,640 Speaker 1: we need local reporting. She said, yeah, absolutely, I'm based 169 00:09:23,640 --> 00:09:27,199 Speaker 1: in Toronto. So I kind of, like, understood the 170 00:09:27,280 --> 00:09:29,000 Speaker 1: level of deception that was going on. I guess I 171 00:09:29,080 --> 00:09:31,640 Speaker 1: understood that she was making stuff up. So when I 172 00:09:31,720 --> 00:09:35,080 Speaker 1: started looking into the work that she'd published, the first 173 00:09:35,080 --> 00:09:37,520 Speaker 1: one I looked at was this feature from the Journal 174 00:09:37,720 --> 00:09:40,280 Speaker 1: of the Law Society of Scotland, which is like an 175 00:09:40,320 --> 00:09:43,320 Speaker 1: in-depth feature that was about law firms disappearing from 176 00:09:43,960 --> 00:09:46,680 Speaker 1: the Scottish Highlands and, like, what were rural people doing?
177 00:09:46,720 --> 00:09:49,120 Speaker 1: And it quoted, like, members of parliament, it quoted a 178 00:09:49,120 --> 00:09:52,080 Speaker 1: bunch of lawyers, academics, and then a bunch of regular people. 179 00:09:52,200 --> 00:09:54,520 Speaker 1: So I was just wondering, like, is this a real article? 180 00:09:54,960 --> 00:09:56,760 Speaker 1: I couldn't find any of the regular people. You know, 181 00:09:56,760 --> 00:10:00,320 Speaker 1: when she begins an anecdote, like, Fiona, forty-two, a 182 00:10:00,320 --> 00:10:05,160 Speaker 1: school teacher in Glengarry or whatever, those people, you google them, 183 00:10:05,200 --> 00:10:07,400 Speaker 1: you can't find any of them. So then I emailed the 184 00:10:07,440 --> 00:10:09,480 Speaker 1: people that she spoke to who were real, like a 185 00:10:09,520 --> 00:10:12,600 Speaker 1: professor called Elaine Sutherland who she quoted in that piece. 186 00:10:12,640 --> 00:10:14,040 Speaker 1: And I just emailed and said, did you speak to 187 00:10:14,040 --> 00:10:17,800 Speaker 1: this writer? And she said no, absolutely not. But what's 188 00:10:17,840 --> 00:10:19,960 Speaker 1: weird is that this sounds like the kind of thing 189 00:10:19,960 --> 00:10:22,280 Speaker 1: I would say. Wow. And this is the response that 190 00:10:22,280 --> 00:10:24,959 Speaker 1: I was getting in a lot of places, and I 191 00:10:25,000 --> 00:10:26,880 Speaker 1: don't know, it's one of the eerier things. Right?
It's 192 00:10:26,880 --> 00:10:29,280 Speaker 1: not someone just making stuff up, but it is, I 193 00:10:29,320 --> 00:10:33,559 Speaker 1: believe, AI searching through the internet, thinking: who is the 194 00:10:33,679 --> 00:10:36,760 Speaker 1: likely person who can talk about Scottish law firms disappearing 195 00:10:36,760 --> 00:10:39,000 Speaker 1: from the Highlands? You find someone who maybe has spoken 196 00:10:39,040 --> 00:10:40,800 Speaker 1: to that in the past, or written a research paper 197 00:10:40,800 --> 00:10:43,160 Speaker 1: about it, or whatever it is, and then kind of 198 00:10:43,280 --> 00:10:46,960 Speaker 1: invents quotes that sound kind of like in the ballpark 199 00:10:47,000 --> 00:10:48,640 Speaker 1: of what you might say. So that was the first 200 00:10:48,679 --> 00:10:51,280 Speaker 1: piece that I thought, okay, this is a made-up story, 201 00:10:51,440 --> 00:10:53,320 Speaker 1: and then I just kept looking. There were more. There 202 00:10:53,360 --> 00:10:57,000 Speaker 1: was a story in Dwell that quoted like ten prominent 203 00:10:57,400 --> 00:11:00,800 Speaker 1: international designers and architects, and I just started sending emails 204 00:11:00,840 --> 00:11:03,360 Speaker 1: to all of them, and they had not spoken to 205 00:11:03,400 --> 00:11:03,840 Speaker 1: this person. 206 00:11:04,960 --> 00:11:08,679 Speaker 2: Victoria had published work in The Guardian, in Business Insider, 207 00:11:09,000 --> 00:11:13,600 Speaker 2: New York Magazine, Rolling Stone Africa, Vogue Philippines. I could 208 00:11:13,640 --> 00:11:16,800 Speaker 2: keep going here. This is a resume that almost any 209 00:11:16,840 --> 00:11:19,120 Speaker 2: freelance journalist would be really jealous of. 210 00:11:19,760 --> 00:11:21,800 Speaker 1: How many articles did she have out there? I think 211 00:11:21,840 --> 00:11:23,600 Speaker 1: over two dozen.
I think I've got a spreadsheet of 212 00:11:23,679 --> 00:11:26,160 Speaker 1: over two dozen articles. You made a spreadsheet? I got, 213 00:11:26,200 --> 00:11:29,080 Speaker 1: I got obsessed. I don't know, I was not gonna 214 00:11:29,080 --> 00:11:31,120 Speaker 1: write about this. We're a local publication that only writes 215 00:11:31,120 --> 00:11:32,800 Speaker 1: about Toronto, right? I was just doing this out of 216 00:11:32,880 --> 00:11:35,720 Speaker 1: curiosity to begin with. But I don't know, it just 217 00:11:35,720 --> 00:11:38,720 Speaker 1: seems so wild to me that this stuff was out there. 218 00:11:38,840 --> 00:11:42,200 Speaker 1: And maybe it sounds, like, overly... you know, people are 219 00:11:42,280 --> 00:11:45,120 Speaker 1: lying on the internet, but I found it shocking, I guess. 220 00:11:45,920 --> 00:11:48,400 Speaker 1: How dare somebody lie on the internet! Yeah, yeah, but 221 00:11:48,480 --> 00:11:50,800 Speaker 1: to lie on the Internet under The Guardian's name or whatever. 222 00:11:50,960 --> 00:11:52,640 Speaker 3: That is different. Yeah, that is different. 223 00:11:52,720 --> 00:11:55,280 Speaker 2: Yeah, getting on there and saying, hey, yo, I don't know, 224 00:11:55,320 --> 00:11:58,800 Speaker 2: somebody like lying on their dating profile. Yo, I'm six 225 00:11:58,840 --> 00:12:00,880 Speaker 2: foot two, and like, bro, you're five foot three. 226 00:12:00,920 --> 00:12:01,480 Speaker 1: Like, okay. 227 00:12:01,480 --> 00:12:04,480 Speaker 2: Maybe not a nice thing to do, but also very 228 00:12:04,480 --> 00:12:08,000 Speaker 2: different from fooling the public into thinking that something happened, 229 00:12:08,120 --> 00:12:10,320 Speaker 2: and some people said some things that they actually never did. 230 00:12:10,600 --> 00:12:11,240 Speaker 1: Yeah.
231 00:12:11,440 --> 00:12:14,480 Speaker 2: So one place that had published Victoria's work had actually 232 00:12:14,520 --> 00:12:17,280 Speaker 2: already taken some stuff down, but they didn't say it 233 00:12:17,320 --> 00:12:18,120 Speaker 2: was because of AI. 234 00:12:19,040 --> 00:12:22,280 Speaker 1: I emailed with editors of other publications and spoke to 235 00:12:22,320 --> 00:12:25,120 Speaker 1: a former editor at Pop Sugar, which had taken down 236 00:12:25,200 --> 00:12:28,520 Speaker 1: a number of articles that she'd written. So I went 237 00:12:28,559 --> 00:12:30,520 Speaker 1: online and it said they were taken down because there were 238 00:12:30,559 --> 00:12:33,840 Speaker 1: inconsistencies, or they didn't live up to Pop Sugar's standards. The 239 00:12:34,000 --> 00:12:37,000 Speaker 1: editor said that there were plagiarism issues with the stories. 240 00:12:37,880 --> 00:12:41,080 Speaker 2: So walk me through this. Like, what were you feeling 241 00:12:41,240 --> 00:12:42,000 Speaker 2: at this time? 242 00:12:42,840 --> 00:12:45,280 Speaker 1: I felt embarrassed to begin with. I think the pitch 243 00:12:45,280 --> 00:12:48,160 Speaker 1: call was, I said something about, like, the ways that 244 00:12:48,679 --> 00:12:51,720 Speaker 1: health and money collide in twenty twenty five in Ontario, 245 00:12:51,800 --> 00:12:53,920 Speaker 1: something like that, and it felt to me like the 246 00:12:53,920 --> 00:12:57,520 Speaker 1: pitches were some large language model taking that little prompt 247 00:12:57,600 --> 00:12:59,880 Speaker 1: and sending me back, like, what I wanted to hear. 248 00:13:00,040 --> 00:13:01,640 Speaker 1: And of course I was like, oh yeah, I'm into this, 249 00:13:01,640 --> 00:13:03,559 Speaker 1: this makes sense. This is all about how health and 250 00:13:03,600 --> 00:13:06,280 Speaker 1: money collide in twenty twenty five. Like I was.
Yeah, 251 00:13:06,320 --> 00:13:08,840 Speaker 1: I was embarrassed at being kind of, like, sent back 252 00:13:08,920 --> 00:13:11,760 Speaker 1: my own prompt in a way that felt like it 253 00:13:11,800 --> 00:13:14,079 Speaker 1: was just designed to appeal to me. And I got 254 00:13:14,080 --> 00:13:16,839 Speaker 1: fooled, right? As I moved through it, though, like, I 255 00:13:16,880 --> 00:13:20,360 Speaker 1: don't know, I began to feel, maybe it sounds overblown, 256 00:13:20,400 --> 00:13:22,840 Speaker 1: but like a sense of despair. Like at a certain point, 257 00:13:22,920 --> 00:13:25,560 Speaker 1: I'm tracking down all this stuff, she's telling lies all 258 00:13:25,600 --> 00:13:27,840 Speaker 1: over the place. I go back to my inbox, which 259 00:13:27,840 --> 00:13:30,120 Speaker 1: still has a million pitches, and I'm going through it 260 00:13:30,160 --> 00:13:33,280 Speaker 1: and I'm seeing the ones that look so synthetic, so 261 00:13:33,360 --> 00:13:36,280 Speaker 1: AI. And I started idly googling some of those authors, 262 00:13:36,280 --> 00:13:39,360 Speaker 1: and I see that they have pieces across the internet 263 00:13:39,360 --> 00:13:41,880 Speaker 1: as well. And I'm not going to do a deep 264 00:13:41,920 --> 00:13:44,640 Speaker 1: dive on every single freelancer that ends up in my inbox, 265 00:13:44,720 --> 00:13:46,960 Speaker 1: but like, how much of this stuff is out there? 266 00:13:47,120 --> 00:13:49,160 Speaker 1: Like, how much of... I don't know, 267 00:13:49,200 --> 00:13:49,640 Speaker 1: you know? 268 00:13:50,040 --> 00:13:51,959 Speaker 1: That was depressing to me. I spent my whole life 269 00:13:51,960 --> 00:13:55,520 Speaker 1: writing journalism, and this feeling of this just being the 270 00:13:55,559 --> 00:13:59,880 Speaker 1: tip of a pretty, like, awful iceberg was dispiriting, 271 00:14:00,080 --> 00:14:00,360 Speaker 1: I guess.
272 00:14:01,280 --> 00:14:05,520 Speaker 2: After investigating all of this past work, Nick needed some answers. 273 00:14:06,040 --> 00:14:10,520 Speaker 2: He finally decided that he was ready to confront Victoria. 274 00:14:10,600 --> 00:14:13,000 Speaker 1: I wanted to talk to her. I had, like, so 275 00:14:13,120 --> 00:14:17,640 Speaker 1: many questions about who this person really is. Like, 276 00:14:17,679 --> 00:14:20,680 Speaker 1: as I'm in this world, I'm reading their writing, right? 277 00:14:20,720 --> 00:14:22,920 Speaker 1: And some of it's first-person stuff, and 278 00:14:23,080 --> 00:14:26,000 Speaker 1: it's all obviously a mix of true and false, 279 00:14:26,240 --> 00:14:28,760 Speaker 1: or completely false. I'm trying to figure out where they are. 280 00:14:28,920 --> 00:14:31,120 Speaker 1: It feels like, from some other earlier writing and some 281 00:14:31,160 --> 00:14:34,040 Speaker 1: other social media, that they may be based in Nigeria, 282 00:14:34,120 --> 00:14:36,760 Speaker 1: or at least are from Nigeria. I had a ton 283 00:14:36,760 --> 00:14:39,160 Speaker 1: of questions for this person, so I wanted to get 284 00:14:39,200 --> 00:14:40,800 Speaker 1: them on the phone. So I did that, 285 00:14:40,840 --> 00:14:43,120 Speaker 1: emailed and said, let's chat about this piece. 286 00:14:43,480 --> 00:14:46,480 Speaker 2: Nick emails Victoria asking if he can talk more about 287 00:14:46,520 --> 00:14:50,360 Speaker 2: her article. From a freelancer's standpoint, an initial check-in 288 00:14:50,440 --> 00:14:53,600 Speaker 2: call is a pretty normal next step toward officially being 289 00:14:53,640 --> 00:14:58,080 Speaker 2: accepted and being paid. So Victoria agreed to a video 290 00:14:58,160 --> 00:15:00,280 Speaker 2: call and they set it up for later that week.
291 00:15:02,400 --> 00:15:04,760 Speaker 1: In the ten minutes before that was about to start, 292 00:15:04,960 --> 00:15:07,720 Speaker 1: they emailed and said, I think I'm having internet connection issues. 293 00:15:07,760 --> 00:15:10,280 Speaker 1: Let's do this over audio. And then I got them 294 00:15:10,280 --> 00:15:12,720 Speaker 1: over the phone. And it was a strange conversation. It 295 00:15:12,760 --> 00:15:14,800 Speaker 1: was a, it was a kind of surreal experience. 296 00:15:15,280 --> 00:15:15,920 Speaker 3: What happened? 297 00:15:16,560 --> 00:15:20,160 Speaker 1: So I guess the fantasy version of this conversation 298 00:15:20,200 --> 00:15:22,640 Speaker 1: that I had been trying to plan out was that I 299 00:15:22,680 --> 00:15:26,000 Speaker 1: would present them with some inconsistencies and they'd get bigger 300 00:15:26,040 --> 00:15:27,720 Speaker 1: and bigger, and at a certain point they would just have 301 00:15:27,760 --> 00:15:30,880 Speaker 1: to admit these lies, and then maybe we could actually 302 00:15:30,880 --> 00:15:32,280 Speaker 1: talk about who they were, and, you know, were they 303 00:15:32,280 --> 00:15:34,880 Speaker 1: a real journalist who had gotten in over their head? 304 00:15:34,880 --> 00:15:37,240 Speaker 1: Were they a scammer who had just, like, found some 305 00:15:37,280 --> 00:15:40,920 Speaker 1: easy marks with, like, overworked editors? That was the, like, 306 00:15:41,240 --> 00:15:44,000 Speaker 1: the unrealistic dream version of this conversation. How it actually 307 00:15:44,040 --> 00:15:47,000 Speaker 1: happened was I got on the phone, and someone very chipper, 308 00:15:47,040 --> 00:15:49,520 Speaker 1: who sounded to my ear like a young woman 309 00:15:49,560 --> 00:15:52,160 Speaker 1: with an African accent, was on the phone.
I asked 310 00:15:52,200 --> 00:15:54,600 Speaker 1: where she was based in Toronto, and she cheerfully said Bloor, 311 00:15:55,000 --> 00:15:56,840 Speaker 1: which is like if you asked, whereabouts do you live 312 00:15:56,880 --> 00:16:00,440 Speaker 1: in New York, and they just said Broadway. Right? Okay. 313 00:16:00,800 --> 00:16:03,320 Speaker 1: But you know, this person had googled the street 314 00:16:03,320 --> 00:16:05,520 Speaker 1: in Toronto, so I was, I was already, you know... 315 00:16:05,760 --> 00:16:08,560 Speaker 1: Well done. Yeah. And then I kind of just walked 316 00:16:08,560 --> 00:16:11,200 Speaker 1: them through some of the inconsistencies. And I didn't want 317 00:16:11,240 --> 00:16:13,360 Speaker 1: to spook them necessarily. I didn't want them to hang 318 00:16:13,440 --> 00:16:15,200 Speaker 1: up right away. So I asked them if they'd spoken 319 00:16:15,240 --> 00:16:17,520 Speaker 1: to the doctor in her pitch, and she said yes, she had. 320 00:16:17,720 --> 00:16:19,320 Speaker 1: I mentioned that, you know, we actually spoke to that 321 00:16:19,360 --> 00:16:21,800 Speaker 1: doctor, and she says she didn't speak to you. And 322 00:16:21,840 --> 00:16:23,800 Speaker 1: she said, oh, yes, of course, my personal assistant 323 00:16:23,800 --> 00:16:27,080 Speaker 1: actually spoke with her. I didn't press on the fact 324 00:16:27,080 --> 00:16:30,800 Speaker 1: that, like, freelance journalists don't have personal assistants. I kind of, right, yeah, 325 00:16:31,440 --> 00:16:33,760 Speaker 1: let that drift past. I asked why I couldn't find 326 00:16:33,760 --> 00:16:36,320 Speaker 1: the stories in the Walrus and the Globe and Mail, 327 00:16:36,640 --> 00:16:39,160 Speaker 1: and she said most of those were done online only. 328 00:16:39,280 --> 00:16:41,080 Speaker 1: I emailed the editors from the Globe and the Walrus 329 00:16:41,080 --> 00:16:42,920 Speaker 1: and they said they've never worked with her.
I asked 330 00:16:42,920 --> 00:16:44,840 Speaker 1: if she just recently moved to Toronto, because it seemed 331 00:16:44,880 --> 00:16:46,880 Speaker 1: like she was writing first-person stories from London and 332 00:16:47,200 --> 00:16:49,400 Speaker 1: other places, and she said, yeah, she just moved this year. 333 00:16:49,960 --> 00:16:51,840 Speaker 1: I asked her why the stories were gone from Pop 334 00:16:51,880 --> 00:16:54,280 Speaker 1: Sugar, and she said, oh, the editor moved on from 335 00:16:54,320 --> 00:16:56,840 Speaker 1: Pop Sugar, so they took down all the articles that 336 00:16:56,840 --> 00:16:59,600 Speaker 1: they'd worked on, she told me, which is not, 337 00:16:59,600 --> 00:17:02,400 Speaker 1: is not how that works. Every question she just sort of answered, 338 00:17:02,480 --> 00:17:05,840 Speaker 1: very cheerfully, very upbeat, didn't seem to be shaken by it. 339 00:17:06,000 --> 00:17:08,600 Speaker 1: She had quick responses to everything. I mean, implausible, but 340 00:17:09,200 --> 00:17:11,920 Speaker 1: quick and ready. It's almost impressive. And then I kind 341 00:17:11,920 --> 00:17:14,680 Speaker 1: of moved on to the bigger inconsistencies. 342 00:17:14,760 --> 00:17:14,879 Speaker 4: Right. 343 00:17:15,080 --> 00:17:18,119 Speaker 1: I started talking about that story in the Scottish Journal. 344 00:17:18,480 --> 00:17:21,600 Speaker 1: I said that she'd quoted this professor, Elaine Sutherland. I said 345 00:17:21,600 --> 00:17:23,919 Speaker 1: I'd actually spoken to Elaine Sutherland, and she said she 346 00:17:23,960 --> 00:17:27,200 Speaker 1: has never spoken to you. And I realized in the 347 00:17:27,240 --> 00:17:30,000 Speaker 1: pause afterwards that she was gone, that she'd 348 00:17:30,080 --> 00:17:35,119 Speaker 1: hung up. She's not responded to any email since. I've 349 00:17:35,200 --> 00:17:36,080 Speaker 1: not heard from her since.
350 00:17:37,640 --> 00:17:40,360 Speaker 2: And it looks like that last question did it for Victoria. 351 00:17:40,640 --> 00:17:44,240 Speaker 2: She knew the jig was up, but Nick still wanted 352 00:17:44,280 --> 00:17:49,360 Speaker 2: to find out: who is Victoria Goldie? We'll get into 353 00:17:49,400 --> 00:18:06,720 Speaker 2: that after the break. After that phone call, Victoria quietly 354 00:18:06,760 --> 00:18:08,440 Speaker 2: removed herself from the Internet. 355 00:18:08,920 --> 00:18:11,199 Speaker 1: In the days that followed, the X account that had 356 00:18:11,240 --> 00:18:15,160 Speaker 1: been posting some stuff under her name disappeared. Her Muck Rack page, 357 00:18:15,160 --> 00:18:17,760 Speaker 1: which is kind of a list of all of a journalist's 358 00:18:17,960 --> 00:18:22,679 Speaker 1: stories, went private. Her personal website went down. She 359 00:18:22,760 --> 00:18:26,280 Speaker 1: kind of disappeared after that phone call. And I say 360 00:18:26,359 --> 00:18:29,080 Speaker 1: she, but at this point I don't know 361 00:18:29,400 --> 00:18:33,720 Speaker 1: who, like, Victoria Goldie is. The evidence suggests that this 362 00:18:33,800 --> 00:18:37,159 Speaker 1: person is either from or still lives in Nigeria. I 363 00:18:37,200 --> 00:18:41,119 Speaker 1: don't think it's a whole farm of people working together 364 00:18:41,400 --> 00:18:44,480 Speaker 1: and sending out stuff under one byline, although it could 365 00:18:44,520 --> 00:18:46,679 Speaker 1: be that. Something that crossed my mind a lot: for 366 00:18:46,720 --> 00:18:49,520 Speaker 1: a while, I thought maybe there's no singular person attached 367 00:18:49,520 --> 00:18:51,320 Speaker 1: to this name at all. Maybe it's just, you know, 368 00:18:51,480 --> 00:18:53,600 Speaker 1: one of a dozen bylines that people are using to 369 00:18:54,040 --> 00:18:56,640 Speaker 1: get paid for some writing. I don't think that's the case.
370 00:18:56,680 --> 00:18:59,440 Speaker 1: One of the earliest stories I found under that byline 371 00:18:59,520 --> 00:19:01,880 Speaker 1: was from a couple months before ChatGPT came out, which 372 00:19:01,920 --> 00:19:04,600 Speaker 1: is kind of the demarcation line, I think, after which, 373 00:19:04,680 --> 00:19:07,320 Speaker 1: like, you can't trust that a sentence was written by 374 00:19:07,359 --> 00:19:09,920 Speaker 1: a human. And it felt different. Those 375 00:19:09,960 --> 00:19:13,800 Speaker 1: early stories, there's some grammatical mistakes, they're clumsier, they're 376 00:19:13,840 --> 00:19:16,919 Speaker 1: less smooth, less slick, but they seem to represent, like, 377 00:19:17,040 --> 00:19:19,720 Speaker 1: a real person talking about real things. 378 00:19:20,320 --> 00:19:23,160 Speaker 2: One article from that pre-ChatGPT era is from May 379 00:19:23,160 --> 00:19:26,360 Speaker 2: of twenty twenty two. It's a piece that Victoria Goldie 380 00:19:26,400 --> 00:19:29,439 Speaker 2: wrote for a website called Black Ballad with the title 381 00:19:29,720 --> 00:19:33,480 Speaker 2: How I'm Learning to Navigate Toxic Positivity on Social Media, 382 00:19:34,080 --> 00:19:37,479 Speaker 2: and there's this line where she writes, quote, there's this 383 00:19:37,600 --> 00:19:41,080 Speaker 2: immense pressure to be productive. Most people like me are 384 00:19:41,160 --> 00:19:43,680 Speaker 2: tired and trying to survive day to day. 385 00:19:44,520 --> 00:19:47,159 Speaker 1: So I think this is a real individual. This is 386 00:19:47,200 --> 00:19:50,240 Speaker 1: not their name necessarily, but I think this is one individual, 387 00:19:50,240 --> 00:19:52,320 Speaker 1: and you see a kind of continuity of interest throughout: 388 00:19:52,320 --> 00:19:56,040 Speaker 1: they're into K-dramas, they're into Afrobeats.
There's some continuity 389 00:19:56,080 --> 00:19:58,800 Speaker 1: which feels like a person there. I can't know 390 00:19:58,880 --> 00:20:01,040 Speaker 1: for sure, but I think this is an individual who, 391 00:20:01,119 --> 00:20:03,440 Speaker 1: at least in those early stories, is writing about going 392 00:20:03,480 --> 00:20:06,520 Speaker 1: online like the rest of us and it making her miserable, 393 00:20:06,800 --> 00:20:09,840 Speaker 1: seeing people who are presenting themselves as doing so well, 394 00:20:09,960 --> 00:20:12,920 Speaker 1: seeing all this hustle culture, and they're just struggling to 395 00:20:12,920 --> 00:20:13,800 Speaker 1: get by day to day. 396 00:20:15,160 --> 00:20:18,760 Speaker 2: So in the beginning she was writing about going online, 397 00:20:18,760 --> 00:20:22,520 Speaker 2: seeing hustle culture, getting kind of depressed. And that's what 398 00:20:22,560 --> 00:20:25,600 Speaker 2: the hustle culture bros are selling you right now: hey, 399 00:20:25,720 --> 00:20:27,639 Speaker 2: here's how to make a bunch of money by cranking 400 00:20:27,680 --> 00:20:32,360 Speaker 2: out content with AI. So kind of a can't beat them, 401 00:20:32,800 --> 00:20:33,720 Speaker 2: join them type thing. 402 00:20:34,359 --> 00:20:36,760 Speaker 1: Yeah. So in the end, like, I don't know who 403 00:20:36,760 --> 00:20:40,080 Speaker 1: this person is. I have some sympathy for them. 404 00:20:41,040 --> 00:20:43,639 Speaker 2: After all of this, Nick decided to make his investigation 405 00:20:43,760 --> 00:20:46,359 Speaker 2: public, and he published a piece in The Local, and 406 00:20:46,520 --> 00:20:49,080 Speaker 2: that's actually where I first read about it. The title 407 00:20:49,119 --> 00:20:52,800 Speaker 2: of his piece was Investigating a Possible Scammer in 408 00:20:52,880 --> 00:20:54,280 Speaker 2: Journalism's AI Era.
409 00:20:54,960 --> 00:20:57,679 Speaker 1: We published the piece, and it got a crazy response, but 410 00:20:57,720 --> 00:21:00,400 Speaker 1: I think the most, like, telling response of all: I got 411 00:21:00,480 --> 00:21:03,040 Speaker 1: so many emails from editors from around the world, 412 00:21:03,119 --> 00:21:05,720 Speaker 1: which was kind of mind blowing. Victoria Goldie had been 413 00:21:05,760 --> 00:21:11,600 Speaker 1: pitching, like, everywhere, everywhere, from Nature to an Oakland startup 414 00:21:11,640 --> 00:21:14,520 Speaker 1: that began, like, a few weeks ago doing book reviews. 415 00:21:14,960 --> 00:21:17,719 Speaker 1: Someone on the West Coast who was running this tiny 416 00:21:17,920 --> 00:21:21,119 Speaker 1: Indigenous-run publication said that they had just assigned a 417 00:21:21,160 --> 00:21:23,080 Speaker 1: story to her, and they just, oh my god, did 418 00:21:23,119 --> 00:21:24,600 Speaker 1: a quick Google of her, and they're like, thank you 419 00:21:24,640 --> 00:21:27,200 Speaker 1: so much, we were very close to sending the contract. 420 00:21:27,680 --> 00:21:30,879 Speaker 1: Someone else from this publication called Republic, they phoned me 421 00:21:30,960 --> 00:21:32,679 Speaker 1: up, they're in the middle of a second draft with her. 422 00:21:32,880 --> 00:21:35,960 Speaker 1: They were editing a story about people living in their 423 00:21:36,080 --> 00:21:39,520 Speaker 1: vans in the American Southwest that was based on Victoria's 424 00:21:39,560 --> 00:21:41,760 Speaker 1: own experience living in a van in the American Southwest. 425 00:21:41,840 --> 00:21:46,119 Speaker 1: The Financial Times. Like, around the world, this person 426 00:21:46,320 --> 00:21:49,240 Speaker 1: was really incredibly prolific. 427 00:21:51,520 --> 00:21:54,720 Speaker 2: Just the sheer volume of pitches that Victoria Goldie was 428 00:21:54,720 --> 00:21:57,320 Speaker 2: sending out might give us a hint at her motivations.
429 00:21:57,760 --> 00:22:01,600 Speaker 2: Journalism is famously not a very lucrative career, especially for 430 00:22:01,640 --> 00:22:05,200 Speaker 2: a freelance journalist. Ask me how I know. And maybe, 431 00:22:05,400 --> 00:22:09,520 Speaker 2: pre-AI, Victoria was having trouble finding work. But if 432 00:22:09,520 --> 00:22:12,359 Speaker 2: you can start writing for dozens of places at once, 433 00:22:12,960 --> 00:22:14,159 Speaker 2: that might start to add up. 434 00:22:14,920 --> 00:22:16,560 Speaker 1: I guess part of what's interesting to me is, I 435 00:22:16,560 --> 00:22:19,480 Speaker 1: think we were offering two thousand bucks for this story. 436 00:22:19,520 --> 00:22:22,280 Speaker 1: That's our rate. Dwell paid, you know, I think she 437 00:22:22,359 --> 00:22:24,680 Speaker 1: got fifteen hundred dollars or something for that 438 00:22:24,760 --> 00:22:26,960 Speaker 1: Dwell article. If you'd actually interviewed ten of the 439 00:22:27,000 --> 00:22:29,720 Speaker 1: top designers in the world, like, you know, that fifteen 440 00:22:29,760 --> 00:22:32,040 Speaker 1: hundred dollars, it doesn't go far. If you enter it into 441 00:22:32,119 --> 00:22:34,480 Speaker 1: ChatGPT and get this thing out five minutes later, 442 00:22:34,520 --> 00:22:36,600 Speaker 1: that's a pretty good return, right? 443 00:22:37,440 --> 00:22:39,640 Speaker 2: And for a long time she got away with it. 444 00:22:40,000 --> 00:22:43,760 Speaker 2: Victoria Goldie had dozens of articles across the Internet for 445 00:22:43,920 --> 00:22:47,119 Speaker 2: a bunch of really well respected publications. By the end 446 00:22:47,119 --> 00:22:50,760 Speaker 2: of Nick's investigation, four of her old articles had come down, 447 00:22:51,080 --> 00:22:53,480 Speaker 2: but as of this recording, there are still a few 448 00:22:53,520 --> 00:22:59,880 Speaker 2: out there. How do we change editorial processes?
I mean, 449 00:23:00,040 --> 00:23:02,760 Speaker 2: how has The Local changed their editorial process? Are there 450 00:23:02,760 --> 00:23:06,359 Speaker 2: different hoops that somebody has to jump through to submit an article? 451 00:23:06,560 --> 00:23:09,920 Speaker 1: I mean, yeah, it got a big response on the internet, 452 00:23:09,960 --> 00:23:11,840 Speaker 1: but it got a big response in our newsroom. Like, 453 00:23:11,880 --> 00:23:13,760 Speaker 1: we've been thinking about, how do we do what we 454 00:23:13,800 --> 00:23:16,320 Speaker 1: do going forward in this era? Like I said at 455 00:23:16,320 --> 00:23:18,080 Speaker 1: the beginning, we want to work with new writers, right? 456 00:23:18,119 --> 00:23:20,000 Speaker 1: Like, this is our whole thing. I owe my career to it. 457 00:23:20,119 --> 00:23:22,800 Speaker 1: Like, a pitch used to be this key that could 458 00:23:22,800 --> 00:23:24,479 Speaker 1: open the door even if you didn't know anybody at 459 00:23:24,480 --> 00:23:26,960 Speaker 1: a publication. If you could write an amazing pitch, that's 460 00:23:27,000 --> 00:23:29,840 Speaker 1: a way into places. A pitch now is not connected, 461 00:23:30,080 --> 00:23:32,400 Speaker 1: or is not necessarily connected, to any human being. Right? 462 00:23:32,960 --> 00:23:35,520 Speaker 1: So when we are trying to find new writers, what 463 00:23:35,520 --> 00:23:38,480 Speaker 1: do you do? It's made our jobs difficult. Like, we're 464 00:23:38,480 --> 00:23:41,040 Speaker 1: talking about, I'm going to do more coffees with young freelancers, 465 00:23:41,040 --> 00:23:43,359 Speaker 1: I'm going to phone their editors. But this is 466 00:23:43,400 --> 00:23:45,760 Speaker 1: all, like, this is all a bummer. This all takes 467 00:23:45,800 --> 00:23:48,919 Speaker 1: so much time, it takes so much money. Everyone's already overworked.
468 00:23:49,080 --> 00:23:51,200 Speaker 1: And I think the fear is that you're just gonna 469 00:23:51,200 --> 00:23:52,760 Speaker 1: work with people you already know. You're just going to 470 00:23:52,880 --> 00:23:55,919 Speaker 1: work with established people. You're never going to take a 471 00:23:56,000 --> 00:23:58,520 Speaker 1: chance on someone new, because did they write their pitch? 472 00:23:58,800 --> 00:23:59,960 Speaker 1: Did they write their previous story? 473 00:24:00,359 --> 00:24:01,719 Speaker 3: Like, you can't trust anything. 474 00:24:01,920 --> 00:24:07,040 Speaker 1: Yeah, yeah. So I am doing more phone calls earlier 475 00:24:07,080 --> 00:24:10,240 Speaker 1: in the process with writers. At least you get a 476 00:24:10,280 --> 00:24:12,840 Speaker 1: sense of a person. You know you're talking to a 477 00:24:12,880 --> 00:24:15,960 Speaker 1: real person. And we've beefed up our fact checking process, 478 00:24:16,160 --> 00:24:19,680 Speaker 1: so people send annotated drafts and we're checking through them. I 479 00:24:19,720 --> 00:24:22,200 Speaker 1: think if you had robust fact checking, like, this story 480 00:24:22,240 --> 00:24:25,639 Speaker 1: is not getting into Dwell magazine, right? The depressing thing 481 00:24:25,680 --> 00:24:29,399 Speaker 1: about this scammer, as opposed to, like, classic scammers like 482 00:24:29,800 --> 00:24:32,680 Speaker 1: Jayson Blair or, like, Stephen Glass, right, like, people making 483 00:24:32,760 --> 00:24:34,480 Speaker 1: up stuff in the past: they were doing that in 484 00:24:34,520 --> 00:24:36,960 Speaker 1: a world where journalism has a certain amount of power 485 00:24:37,000 --> 00:24:38,720 Speaker 1: and prestige, and they're doing it for that reason. This 486 00:24:38,880 --> 00:24:42,480 Speaker 1: is people who are taking advantage of an ecosystem that 487 00:24:42,600 --> 00:24:47,200 Speaker 1: is, like, already pretty broken and, yeah, overworked.
Yeah, 488 00:24:47,200 --> 00:24:49,840 Speaker 1: there's no fact checkers. I don't think an editor at 489 00:24:49,840 --> 00:24:52,720 Speaker 1: some of these places would have thought twice about some 490 00:24:52,840 --> 00:24:55,800 Speaker 1: of this stuff, even so. Yeah, to combat it, it 491 00:24:55,840 --> 00:24:58,080 Speaker 1: would take more money, it would take more resources, it would 492 00:24:58,080 --> 00:25:00,760 Speaker 1: take journalists spending more time on that work. And we're 493 00:25:00,800 --> 00:25:03,159 Speaker 1: already at a point where, like, we're going in the 494 00:25:03,200 --> 00:25:04,200 Speaker 1: opposite direction. 495 00:25:04,040 --> 00:25:07,280 Speaker 2: Right. Yeah, I mean, when you said robust fact checking, 496 00:25:07,480 --> 00:25:10,440 Speaker 2: any journalists listening to this are probably laughing, and/or 497 00:25:10,480 --> 00:25:13,440 Speaker 2: spit out their coffee, or crying into their coffee, probably, 498 00:25:14,119 --> 00:25:16,600 Speaker 2: because that's one of the things that gets 499 00:25:16,320 --> 00:25:17,880 Speaker 3: cut, you know. 500 00:25:17,920 --> 00:25:22,080 Speaker 2: It's like, do organizations have fact checkers? Yes, but maybe 501 00:25:22,080 --> 00:25:24,760 Speaker 2: they used to have five and now they have one, or 502 00:25:25,520 --> 00:25:27,840 Speaker 2: the editor maybe is being asked to handle all that 503 00:25:27,920 --> 00:25:31,320 Speaker 2: work, and so we've got to spend money to clean 504 00:25:31,359 --> 00:25:34,399 Speaker 2: it up. Yeah. And I mean, it's interesting, 505 00:25:34,440 --> 00:25:37,600 Speaker 2: because this isn't really necessarily about AI.
You know, I 506 00:25:37,960 --> 00:25:41,560 Speaker 2: hate to echo the AI boosters here, right, but this 507 00:25:41,600 --> 00:25:44,600 Speaker 2: is something that's often said: AI is a tool. And 508 00:25:44,640 --> 00:25:48,439 Speaker 2: this is somebody, and maybe multiple somebodies, who are 509 00:25:48,520 --> 00:25:51,640 Speaker 2: using just a tool that is more efficient. 510 00:25:51,720 --> 00:25:54,480 Speaker 1: Exactly. The scale would be impossible without it, the 511 00:25:54,480 --> 00:25:57,639 Speaker 1: efficiency. I would not have heard from dozens of editors 512 00:25:57,680 --> 00:26:00,720 Speaker 1: around the world who had been pitched stories that fit pretty 513 00:26:00,760 --> 00:26:03,920 Speaker 1: well into their niche. Like, we're kind of, we're way off. 514 00:26:04,280 --> 00:26:06,520 Speaker 1: That's only possible with the technology, right? But yeah, this 515 00:26:07,000 --> 00:26:09,760 Speaker 1: is, this is human beings using that technology to do 516 00:26:09,840 --> 00:26:11,359 Speaker 1: something that they've done for a while. 517 00:26:12,920 --> 00:26:15,600 Speaker 2: But there is something new that comes with these AI 518 00:26:15,720 --> 00:26:19,960 Speaker 2: models that have scraped everything ever written. It's changing people's 519 00:26:20,000 --> 00:26:23,879 Speaker 2: definitions of truth and lies in ways that journalists haven't 520 00:26:23,880 --> 00:26:26,200 Speaker 2: had to think about before. And I'm not just saying, 521 00:26:26,240 --> 00:26:29,080 Speaker 2: you know, oh, truth is important, lies are bad. I 522 00:26:29,200 --> 00:26:31,439 Speaker 2: just mean that it's getting weird.
523 00:26:33,320 --> 00:26:34,960 Speaker 1: So one of the responses I got from one of 524 00:26:35,000 --> 00:26:38,200 Speaker 1: these experts who I followed up with, who had quotes 525 00:26:38,240 --> 00:26:40,480 Speaker 1: made up by Victoria Goldie for one of these stories, 526 00:26:40,720 --> 00:26:43,040 Speaker 1: they said, that's not me, I don't remember speaking to 527 00:26:43,080 --> 00:26:45,840 Speaker 1: this writer, but that sounds like something I would say, 528 00:26:45,880 --> 00:26:48,199 Speaker 1: and I'm okay with that material being out there. And 529 00:26:48,240 --> 00:26:53,120 Speaker 1: that, like, wow, that took me aback. Like, that 530 00:26:53,200 --> 00:26:55,760 Speaker 1: indicates that we are part of a world where, like, 531 00:26:56,240 --> 00:26:57,640 Speaker 1: I think we already know that a lot of people 532 00:26:57,680 --> 00:27:00,280 Speaker 1: don't mind the slop, like, the slop is everywhere, 533 00:27:00,359 --> 00:27:03,600 Speaker 1: because people are fine with it if it backs up 534 00:27:04,440 --> 00:27:08,280 Speaker 1: some of their preconceived ideas or it seems close enough, right? 535 00:27:08,640 --> 00:27:13,080 Speaker 1: And the fact that individuals themselves are okay with being represented, 536 00:27:13,440 --> 00:27:15,359 Speaker 1: you know, it's close enough to what they said or 537 00:27:15,400 --> 00:27:17,639 Speaker 1: what they believe, that makes me think, like, we're 538 00:27:17,680 --> 00:27:20,520 Speaker 1: entering a whole new paradigm about, like, 539 00:27:20,560 --> 00:27:21,960 Speaker 1: what truth even is. 540 00:27:22,040 --> 00:27:28,600 Speaker 2: And that is, like, I don't know. All right, counterpoint: 541 00:27:28,800 --> 00:27:33,240 Speaker 2: what if it doesn't matter that we've got AI journalists?
Look, 542 00:27:33,520 --> 00:27:37,360 Speaker 2: you wanted an article on the kind of creeping privatization 543 00:27:38,000 --> 00:27:42,439 Speaker 2: of the healthcare system. You have a general idea of 544 00:27:42,440 --> 00:27:44,760 Speaker 2: what some of these people are going to say, and 545 00:27:45,320 --> 00:27:48,400 Speaker 2: for a lay reader who isn't really familiar with everything, 546 00:27:48,800 --> 00:27:50,760 Speaker 2: they need to be brought up to speed on the basics, right? 547 00:27:51,080 --> 00:27:54,360 Speaker 2: So what's the harm in having, you know, some chatbot 548 00:27:54,880 --> 00:27:57,240 Speaker 2: fill in the blanks for them and say some things 549 00:27:57,240 --> 00:28:00,760 Speaker 2: that are pretty plausible from a few experts? The AI 550 00:28:00,840 --> 00:28:03,760 Speaker 2: model is trained on everything they've said. It knows 551 00:28:03,760 --> 00:28:05,200 Speaker 2: what they're gonna say. 552 00:28:05,320 --> 00:28:07,800 Speaker 1: Yeah, I mean, this is the argument that kind of 553 00:28:07,880 --> 00:28:10,120 Speaker 1: terrifies me, right? Like, I think, to be... 554 00:28:10,080 --> 00:28:11,399 Speaker 3: Clear, I'm not advocating for this. 555 00:28:11,480 --> 00:28:12,800 Speaker 1: No, no, no, I know, I know, I know. I 556 00:28:13,080 --> 00:28:17,440 Speaker 1: hear you, you're taking a point of view. Like, I mean, 557 00:28:17,480 --> 00:28:19,679 Speaker 1: first of all, they don't know what individuals will say, right? 558 00:28:19,720 --> 00:28:22,320 Speaker 1: Everyone is surprising, and AI can only say what's 559 00:28:22,640 --> 00:28:26,480 Speaker 1: been said already. It's incapable of actually advancing a story, 560 00:28:26,480 --> 00:28:28,600 Speaker 1: and the news is about saying what is new, right? 561 00:28:28,640 --> 00:28:30,960 Speaker 1: And that is something that I think cannot be replaced 562 00:28:30,960 --> 00:28:34,159 Speaker 1: at all.
But more fundamentally, like, I don't know, we 563 00:28:34,400 --> 00:28:36,359 Speaker 1: just can't be cool with having stuff that is not 564 00:28:36,400 --> 00:28:39,120 Speaker 1: true presented as true because it's close enough. It's 565 00:28:39,240 --> 00:28:42,960 Speaker 1: like we're losing that grip on, you know, when someone's 566 00:28:43,040 --> 00:28:44,960 Speaker 1: quoted, they have to have said the thing. 567 00:28:45,160 --> 00:28:47,360 Speaker 1: Like, I feel like I'm losing my mind as we're 568 00:28:47,960 --> 00:28:51,720 Speaker 1: kind of moving away from actually recognizing that, like, when 569 00:28:51,720 --> 00:28:53,240 Speaker 1: you say stuff's true, it's got 570 00:28:53,280 --> 00:28:54,920 Speaker 1: to be true. Otherwise, what are we even? How can 571 00:28:54,960 --> 00:28:56,920 Speaker 1: we talk? How can we argue? How can we agree 572 00:28:56,960 --> 00:28:57,440 Speaker 1: on anything? 573 00:28:58,200 --> 00:29:00,960 Speaker 2: Yeah, I think we were in the post-truth 574 00:29:01,000 --> 00:29:03,400 Speaker 2: thing already. It's like post-post-truth. 575 00:29:03,640 --> 00:29:05,920 Speaker 1: Yeah, I think, like, university professors have been here for 576 00:29:05,920 --> 00:29:07,720 Speaker 1: a while already. Like, it's been a few years of 577 00:29:08,000 --> 00:29:10,360 Speaker 1: them dealing with this world. But for me personally, in, 578 00:29:10,400 --> 00:29:13,520 Speaker 1: like, my little corner of magazine journalism, I hadn't been 579 00:29:13,520 --> 00:29:15,040 Speaker 1: confronted with it until now. 580 00:29:15,160 --> 00:29:16,120 Speaker 3: You know, it's interesting you
581 00:29:16,040 --> 00:29:17,440 Speaker 2: bring that up, because I think a lot of people 582 00:29:17,480 --> 00:29:21,760 Speaker 2: are curious about how university professors or high school 583 00:29:21,800 --> 00:29:23,840 Speaker 2: teachers or whatever are dealing with the fact that, yeah, a 584 00:29:23,880 --> 00:29:26,280 Speaker 2: lot of their students are probably just typing a prompt 585 00:29:26,280 --> 00:29:28,880 Speaker 2: into ChatGPT and then, you know, cranking out a thousand 586 00:29:28,920 --> 00:29:31,920 Speaker 2: word essay. But that's a very controlled environment. You know, 587 00:29:32,080 --> 00:29:35,560 Speaker 2: it's a fish bowl. If somebody at the university down 588 00:29:35,640 --> 00:29:39,600 Speaker 2: the road ChatGPT-generates their history exam, that doesn't 589 00:29:39,640 --> 00:29:45,440 Speaker 2: affect me personally. But if somebody is ChatGPT-generating an 590 00:29:45,600 --> 00:29:49,960 Speaker 2: article about what the LAPD is doing or not doing, that 591 00:29:49,880 --> 00:29:50,600 Speaker 3: could affect me. 592 00:29:51,240 --> 00:29:54,200 Speaker 2: And we're already at a stage where people just straight up 593 00:29:54,240 --> 00:29:58,000 Speaker 2: don't trust journalism at all. And what do we do 594 00:29:58,080 --> 00:30:01,760 Speaker 2: if the writers are lying to the editors, 595 00:30:02,680 --> 00:30:03,959 Speaker 2: or the writers don't even exist? 596 00:30:05,040 --> 00:30:09,680 Speaker 1: Yeah, yeah. I have no answers. It's weighing on me. 597 00:30:12,480 --> 00:30:15,200 Speaker 2: As of right now, as I'm recording this, not all 598 00:30:15,240 --> 00:30:18,400 Speaker 2: of Victoria Goldie's work is gone. So among that work 599 00:30:18,440 --> 00:30:21,560 Speaker 2: are three articles that she wrote for Business Insider.
Two 600 00:30:21,560 --> 00:30:24,320 Speaker 2: of those were published just before ChatGPT dropped in 601 00:30:24,320 --> 00:30:26,720 Speaker 2: November of twenty twenty two, and one came out right 602 00:30:26,760 --> 00:30:30,280 Speaker 2: after. I dug into all of these articles. Each one 603 00:30:30,360 --> 00:30:32,800 Speaker 2: is an interview with one person who, as far as 604 00:30:32,840 --> 00:30:35,680 Speaker 2: I can tell, seems to actually exist. The articles are 605 00:30:35,800 --> 00:30:39,240 Speaker 2: kind of boring, but they seem legit. But there's also 606 00:30:39,480 --> 00:30:42,200 Speaker 2: this really interesting article that she wrote for The Guardian, 607 00:30:42,280 --> 00:30:44,600 Speaker 2: and this one only came out in September of twenty 608 00:30:44,640 --> 00:30:48,440 Speaker 2: twenty five. So Victoria Goldie writes about hanging out in 609 00:30:48,520 --> 00:30:52,040 Speaker 2: East London and going to raves and soccer games, mixing 610 00:30:52,120 --> 00:30:56,080 Speaker 2: with different people, learning about different cultures. And near the 611 00:30:56,280 --> 00:31:00,680 Speaker 2: end of this really personal essay, she writes, quote, the 612 00:31:00,720 --> 00:31:04,920 Speaker 2: future of our music is not written by algorithm. That 613 00:31:05,120 --> 00:31:09,000 Speaker 2: article has been taken down. So was that one real? 614 00:31:09,480 --> 00:31:12,480 Speaker 2: Is any of this real? How many of us even care? 615 00:31:15,320 --> 00:31:15,680 Speaker 4: All right. 616 00:31:15,840 --> 00:31:17,960 Speaker 2: I don't want to lose sight of how all this started. 617 00:31:18,360 --> 00:31:21,400 Speaker 2: Nick has finally put his investigation to rest, and The 618 00:31:21,440 --> 00:31:24,840 Speaker 2: Local's special issue on healthcare in Canada is now out and 619 00:31:24,880 --> 00:31:26,760 Speaker 2: it's ready for you to read.
And if you're 620 00:31:26,760 --> 00:31:29,200 Speaker 2: wondering if they're going to put something about Victoria Goldie 621 00:31:29,200 --> 00:31:30,640 Speaker 2: in there, I 622 00:31:30,560 --> 00:31:32,600 Speaker 1: don't think so. I don't think so. We're moving on, 623 00:31:32,720 --> 00:31:33,360 Speaker 1: we're moving on. 624 00:31:33,680 --> 00:31:35,760 Speaker 3: Moving on, moving on. I dig it. 625 00:31:38,840 --> 00:31:41,640 Speaker 2: Thank you so much for checking out another episode of 626 00:31:41,720 --> 00:31:43,480 Speaker 2: Kill Switch. If you want to talk to us, you 627 00:31:43,520 --> 00:31:47,120 Speaker 2: can email us at killswitch at kaleidoscope dot NYC, 628 00:31:47,640 --> 00:31:48,680 Speaker 2: or on Instagram 629 00:31:48,760 --> 00:31:50,360 Speaker 3: we're at killswitchpod. 630 00:31:50,760 --> 00:31:52,800 Speaker 2: And before you move on to doing something else, maybe 631 00:31:52,840 --> 00:31:55,000 Speaker 2: think of leaving the show a review. It helps 632 00:31:55,040 --> 00:31:57,400 Speaker 2: other people find the show, which in turn helps us 633 00:31:57,480 --> 00:32:00,360 Speaker 2: keep doing our thing. Kill Switch is hosted by me, 634 00:32:00,560 --> 00:32:04,520 Speaker 2: Dexter Thomas. It's produced by Shena Ozaki, Darluck Potts and 635 00:32:04,680 --> 00:32:08,080 Speaker 2: Julian Nutter. Our theme song is by me and Kyle Murdoch, 636 00:32:08,240 --> 00:32:11,920 Speaker 2: and Kyle also mixes the show. From Kaleidoscope, our executive 637 00:32:11,960 --> 00:32:17,200 Speaker 2: producers are Oz Woloshyn and Mangesh Hattikudur, and Kate Osborne. From iHeart, 638 00:32:17,240 --> 00:32:20,560 Speaker 2: our executive producers are Katrina Norvell and Nikki Ettore. 639 00:32:21,200 --> 00:32:24,840 Speaker 2: Catch you next week. 640 00:32:30,760 --> 00:32:36,600 Speaker 4: Goodbye