Speaker 1: One of the first examples that I saw was a really long memo with this style that we talk about as purple prose. It's like when people just use really flowery language that's elaborate and long.
Speaker 3: Kate Niederhoffer is a social psychologist and the vice president of BetterUp Labs, which recently conducted a study about a new phenomenon in the workplace: AI workslop.
Speaker 1: We ask people to share examples of workslop that they've received. They'll tell us, like, there's this weirdness, like there's something off about this, and feeling so confused and unsure what the content is really about. But then it's like, I don't know what to do about this. I don't know if I should just redo it myself, start from scratch, tell the person, ask the person. Sometimes there's a power dynamic that makes it really complicated to even engage, and so it leaves people sort of paralyzed. It's like, I don't know what this is, what it means, or what to do about it, and I still have to do the work.
Speaker 3: What Kate's study found is that, contrary to what AI companies have been promising, AI is creating more work for people. We're having to spend time reading emails written by ChatGPT, watching automatically generated PowerPoint presentations, and dealing with computer code that looks okay but doesn't actually run properly. Meanwhile, Mark Zuckerberg is telling his workers to use AI to work five times faster, and a bunch of companies are still laying off employees under the assumption that they can save money by just using AI to do that work.
Speaker 1: What we're doing is slashing the human and eliminating that whole productivity potential. I think it's backfiring, because you're having a well-being tax on the people who stay, who have lost their peers, who are taking on all of this work. It's just a compounding negative effect.
Speaker 3: This study went kind of viral and made some headlines, and if you just read the headlines, it can seem like the results prove that AI shouldn't be used in the workplace at all. But that wasn't really the point of the story. BetterUp's main thing is that they give advice to companies. Usually that advice isn't just given out for free in public. Companies pay money for this insight. So my interview with Kate was an opportunity to see what kind of conversations are actually happening behind the scenes at workplaces that are experimenting with AI, the kind of conversations that might also be happening where you work. Kaleidoscope and iHeart Podcasts.
Speaker 4: This is kill switch. I'm Dexter Thomas. I'm sorry. Goodbye.
Speaker 5: How would you define workslop?
Speaker 1: Workslop is AI generated text that looks like it completes a task, but upon further inspection reveals that it lacks context and all the cognitive effort that is really necessary to complete a task.
Speaker 3: So, in other words, it's when you get sent an email or a presentation that's very clearly written by ChatGPT, when there's a bunch of em dashes and bullet points, flowery language, phrases like "delve into," the stuff that at first glance kind of looks like a well thought out piece of work, but if you keep reading, it's obviously robotic.
Speaker 1: We see a lot of examples of emails, reports, code, a lot of, like, vibe coded code, even apps. So people talked about the creation of prototypes, for example, that are based on prompts, and so they're sort of brittle experiences that don't really have substance underneath them.
Speaker 3: Where did this phrase workslop come from?
Speaker 1: My colleague Jeff Hancock at the Stanford Social Media Lab. And we had been studying the ways that managers make decisions about delegating tasks to humans versus AI, and we knew about AI slop, you know, that already existed, and so we were thinking about, like, but what is it in the workplace and why is it so different?
Speaker 1: So Jeff came up with that first definition of AI generated content that fulfills the appearance of completing the task, but it doesn't actually have this, like, substantive cognitive work or the decision making that a task really requires. And so we went out with this definition and we started to study it.
Speaker 3: Kate and her team worked with Stanford Social Media Lab to study this. They surveyed over one thousand workers across fields from finance to medicine to government.
Speaker 1: We start by kind of, like, funneling them into it by saying, like, we have some questions about your experience with AI at work. We explain what AI is so we're all on the same page, and then we ask them how often do you use AI at work? And we ask if people in their company have required them to use AI. As you know, many people are mandated to use AI, so we want to get a sense of what their policy is. And then we start asking questions about how often they send or share work with colleagues that uses AI to help produce it. And then we say, how much of the AI generated work that you send to colleagues do you think is actually unhelpful, low effort, or low quality?
Speaker 2: And people readily admit to that.
Speaker 1: And then we say, well, how much of the AI generated work that you send to colleagues do you think is helpful and high quality? So, you know, we have to make sure that we're not being biased. And then we say our most important question, to give us the prevalence, is: in your job, have you received work content in the last month that you believe is AI generated, that looks like it completes a task at work, but is actually unhelpful, low quality, or seems like the sender didn't put in enough effort?
Speaker 3: So you asked that question, which is essentially your definition of workslop, without actually using it for them, so as not to bias them. How many people answered yes to that question?
Speaker 2: Forty percent.
Speaker 3: Forty percent in the last month. That's a really high number. I can only imagine that's going to go up.
Speaker 2: Yeah, I agree.
Speaker 1: It's something that we're really interested in tracking. We also wanted to know how frequently people experience it. So if you think that that's the prevalence, that forty percent of people have experienced it, then how much of the work that they receive do they think fits this description? And it's about fifteen percent. So fifteen percent of the work that you receive is AI generated work that looks like it completes the task but really doesn't.
Speaker 3: Okay, so this sounds a little annoying. But aside from the mild annoyance, or the deep annoyance, somebody might feel from getting an email that pretty clearly is written by ChatGPT, or seeing a presentation that was written by ChatGPT, what else are the effects we should think about here?
Speaker 1: So it's a lot of time wasted, that's the first thing. It has an emotional impact that's pretty strong: negative emotions like being annoyed, frustrated, even being confused. And then I think the most insidious impact of all is this interpersonal judgment and evaluation, that you think that the person who sent you the workslop is less capable, less creative, less trustworthy. It's like you're immediately casting judgment on your ability to work with this person over time and to collaborate effectively. And that's so important. That's the most important thing in the workplace, is that we believe that our colleagues are competent, capable people with shared goals that we can align with and do our best work.
Speaker 3: So the respondents to your study, they're telling you this: that when they get workslop from somebody, they don't trust that person anymore.
Speaker 1: They trust them less.
Speaker 3: Yeah. You know, so I've gotten pitches asking me to cover things.
Speaker 3: So it's, hey, Dexter, I've seen that you covered X. Here is this thing that I do, or here's this thing that my client does. And it was always in the same format. It's like a three paragraph format, and it would be a brief sentence telling me how cool I am, and then a couple of paragraphs telling me what their client's product or project was, and the third paragraph telling me why it's so interesting and giving me suggestions for how I could cover it. And I was seeing the same format over and over and over again. And that made me not only dislike the PR person, but dislike the person who I'm supposed to be covering. So I start to think, whoever this person is, I don't want to have anything to do with them, because they're having some PR person who's just running their stuff through perhaps ChatGPT, perhaps Claude, whatever, and so I start to, yeah, I start to have a pretty negative opinion.
Speaker 1: Totally. I think that's the first phase of it. It's like you recognize some linguistic cues that seem repetitive or templated, and like, to me, I experience it as so deceptive. There's something, first, that's empty about the text and hard to read. But then there's that second experience, which is so emotional. It's like, I'm confused, I'm frustrated, I'm annoyed, angry, I feel deceived. And then there's, like, that third process, that interpersonal process that kicks in, and you're like, I don't want to work with this person, I don't trust this person, I don't like this person, in your case. And when we saw those results in our study, that's when I was like, oh God, we're in for something bad here.
Speaker 3: The feeling of anger or frustration that you get when someone dumps a bunch of AI generated slop on your desk, this is what Kate would call the emotional tax of workslop.
Speaker 3: About half the people surveyed said that when they receive workslop from a colleague, their opinion of that colleague changed. At an interpersonal level, this is obviously a bad thing, but this is also the sort of thing that a company would want to avoid, because it translates into less money for the company.
Speaker 1: People are spending just under two hours on each instance of workslop, and so if you think about, you know, the forty percent of people who have experienced it, the rate with which they experience and send it, it can cost up to nine million dollars for an organization. So it's a very, very costly phenomenon right now, in addition to the cost of using these tools.
Speaker 3: The MIT Media Lab recently reported that despite thirty to forty billion dollars being invested in generative AI, ninety five percent of organizations are getting zero return on that investment. And on top of that, Kate's study suggests that it's actually even worse. A lot of businesses are losing money to workslop. So, two questions: why is workslop happening, and whose fault is it? The second one, you might think you know. The first one, that answer is a little deeper. That's after the break.
Speaker 3: So is there an age demographic that's more likely to be getting or receiving workslop in your study?
Speaker 1: I don't have a demographic breakdown right now. It's something that we're working on. We know from our previous research that older people trust AI more, and we know that trust in AI, especially when it's a sort of blind trust, can lead to this overreliance and can produce more workslop.
Speaker 3: So that right there is really interesting, because the tendency with any kind of new technology, with almost any new technology in the past, we've expected young people to be the early adopters, and I think societally we start to blame young people for all of the ills that that new technology brings. But what you're saying about trust in AI is really interesting.
Speaker 1: So I think there is something really interesting happening with age here. You know, agency, understanding how to use the tools in a discerning way, knowing when to use them, how to use them, that's a really good positive predictor of not producing workslop, and so it's really possible that maybe a more tech native person would have higher agency and be less likely to over rely on the tools. There's this really interesting distinction in the way that people use AI. We call it the pilot mindset, so people who are high agency, high optimism in the way that they use AI. And we see that pilots are not producing as much workslop as the inverse, which is passengers. Passengers are low agency, low optimism. They're either not using AI at all still, or they're using it to create a whole bunch of workslop.
Speaker 3: So this is something that people are telling you, because you also asked not only have you received workslop, but you're asking people fairly directly, are you generating AI work that maybe is unhelpful and just passing it off like it's your own work? And people are saying yes to this.
Speaker 1: Yeah. What's incredible is that even with the self report biases that exist, people are still admitting to producing, you know, just under twenty percent of their work as workslop.
Speaker 3: And I am going to assume that there are people who are doing this and they're just not willing to admit it.
Speaker 2: I completely agree.
Speaker 1: I will say that what we're starting to see is the biggest predictor of creating workslop is actually being kind of burned out. So people say that, like, the reason why they're generating it themselves, as authors of workslop, is because these tasks that they're using AI to do are one of many things that they're responsible for doing, so they have a really full plate. You know, there have been layoffs in the past few years, and so people have increased span of control.
Speaker 1: They're having to do the work of multiple people, and that makes it really tempting to try to get a tool to do work on your behalf. Also, I don't know if this feels true for you, but a lot of what we hear from our members and organizations is that, like, everything feels really urgent, and that's another predictor of producing workslop. It's just, the pressure to perform right now is really high, and people are really depleted and haven't recovered from COVID. It's really tempting, when you have a very powerful tool, to rely on it to help you out. It's just that people are doing that in a really blind way.
Speaker 3: I mean, I would go further than saying it's tempting. I would say it's the logical decision.
Speaker 1: Right.
Speaker 3: If you were at a job where you're watching people getting laid off, you think you might be next, and you're doing multiple people's jobs and you're not getting paid any more, why not use AI to get some of that stuff off of your plate? What we're talking about here is, fundamentally, I think, an empathy thing, which is: workslop may be just a sign that, yo, this person's overworked. That's what I should think, but my immediate knee jerk reaction might be, this person's lazy, they're a bad person, they don't respect me, right? And how do you even get around that?
Speaker 1: It's such a compassionate reframe on the situation, and maybe you just came up with a new intervention. We could try to do an experiment on that, of just, you know, when people are about to start judging people, maybe we could just introduce the idea that people are producing workslop because they're overworked, because everything feels urgent and important. And maybe just by telling people that, it can initiate a dialogue that's more about, hey, can I help you approach these tasks in a different way? Can I take something off of your plate?
Speaker 3: I love your optimism.
Speaker 2: My collaborator is Canadian. I can't help it.
Speaker 3: I mean, I mean, real, just real. Because maybe the workslop thing isn't necessarily a sign that everybody's using ChatGPT, or that ChatGPT is bad or Claude is bad. Maybe that's not really it. Maybe it's more of a deeper, fundamental, societal thing, which is that, yo, people are overworked.
Speaker 2: I think that's exactly right.
Speaker 1: Like, that has been the focus of our research for the last year, is about this crisis that we're in. We've seen this over the past ten years in our data, that people are decreasing in the amount of resilience and agility and these foundations of well-being that they have. We're at an all-time low. So people really lack motivation, optimism, agency, and we're just hungry for compassion, and we just need someone to refuel us so that we can be motivated and feel like we matter and our work matters again.
Speaker 3: Basically, the conditions that are causing the problem here, like, we are going to continue to create those conditions. The product is the output, which is the product, which is the output. This sounds like a spiral to me.
Speaker 2: A vicious cycle. I think there's a way out.
Speaker 1: We are thinking a lot about what type of interventions are possible to prevent workslop, and there are a few things that are possible.
Speaker 3: We'll get into that after the break. So Sam Altman said, I believe, paraphrasing him, that this year we may see AI basically, quote unquote, join the workforce. You've studied how actual employees feel about, basically, AI kind of joining the workforce. Where are we in terms of that right now?
Speaker 2: Actually, you may be surprised.
Speaker 1: I think that people are really excited about AI joining the workforce. When we ask people about this idea of managing a hybrid workforce of humans and agents, for the most part, they express positive emotions like excitement, optimism, confidence.
Speaker 3: Well, when you say people here, are we talking managers?
Speaker 3: Are we talking CEOs, bosses? Are we talking workers?
Speaker 1: Okay, that's a great question. Primarily, when we ask managers, they're more excited than individual contributors about AI joining the workforce. And what people are wanting is training. And so we have done a lot of work trying to understand what type of training people need in management skills to best manage AI, and for the most part, relational skills training, a much more human type of training, providing context, listening, empathy, is far more effective than task based training, where you're giving directions on how to prompt, how to specify style, how to evaluate the AI's output accurately. There are all these courses you can take right now in AI literacy. They're all about this task based training, hard skills to learn how to do this. We find that if you teach people to be really good managers, they're going to do a really good job managing the AI and have better results.
Speaker 3: So really, a big part of reducing workslop falls on the managers and the CEOs. They're the ones responsible for creating an environment where people can be comfortable with how AI is being used, or where people can be comfortable speaking up if something's wrong.
Speaker 1: One of the things that we're seeing is that it helps to have an environment that's psychologically safe, where people have the ability to ask questions and take risks in a way that feels safe, so they don't have to be worried about whether using the AI is demanded or appropriate or permissible, you know. So I think part of that is, like, can I try this out? Can I disclose that I've used it here? Can you give me some feedback on whether this quality is high enough?
Speaker 1: Also, training those mindsets, so the pilot mindset: teaching people to have these skills to be agentic about their usage of AI, and also optimistic, so curious and confident and willing to explore different ways to use it to be more creative, as opposed to feeling like you have to use it. And then I often like to say that AI is a multiplayer tool, a multiplayer game. And if we remind people that there is a human recipient on the other end of your work, and your goal is to collaborate with another human, and you think about the potential negative impact of producing thoughtless work that's AI generated, then I think even that psychoeducation and the salience of knowing that there are humans on the other side of the tool can be really helpful to having positive collaboration.
Speaker 3: Again, I think the study in some ways is kind of a warning. And basically, right now, I can totally understand why somebody would generate workslop, and it would be great if their bosses or their company would recognize the root cause of why they're using ChatGPT to write code or write a presentation. That would be amazing. Let's be realistic, that probably won't happen. And so your study finds forty percent of people have received workslop in the last month. I see that going up. Let's talk about five years in the future. What does that look like?
Speaker 1: Yeah, it's a really complicated moment that we're in right now, and I think the pressure to do more with less is increasing, and there are so many unknowns about the future of humans and the way that we work together and the tools that we use. And I don't mean to be Pollyannaish in thinking about, you know, the way that we can help and the way that we can train people. I do think there are deeper issues to bring, like, politicians and policymakers in. But I have to say that, like, we've been through this before.
Speaker 1: So think back to the early two thousand and tens, when social media came around and people had to, people meaning organizations and the people within them, had to learn to communicate and collaborate in a new way. You know, it reshaped the whole world of, just as an example, marketing and advertising. It was a totally enormous shift in the way that we do work, the way that we relate to each other, the way that we use tools to get our work done. And I think that gives me hope, that, you know, this is a really, really powerful tool. We can create a ton of value. We may need to slow down a little bit, but you do have to stay relevant with the tools that we all have access to right now, to test them out, to try them out, to figure out if, you know, in the same way that years ago we learned to code, maybe that's no longer as relevant, but I think you have to figure out how to use these tools.
Speaker 3: I mean, yeah, that's kind of the wild thing, is, I mean, I remember when a journalist lost their job, you know, somebody would get up on Twitter and say, hey, learn to code.
Speaker 1: Ha ha ha ha ha.
Speaker 4: Right.
Speaker 3: Ironically, some of those same people I think are probably now looking for a gig along with the rest of us, because entry level coding gigs, I mean, Claude and ChatGPT kind of have that on lock. It may be generating terrible code, but it may be, quote unquote, good enough, or, as you say, it may be workslop, and then somebody's got to come in and clean it up. But the company hasn't quite figured that out yet.
Speaker 1: Right. So maybe there's a new industry of people who have strong critical thinking skills and the ability to evaluate the quality of code and to detect these signals and to point that out. You know, I could see that being an unexpectedly useful ability.
Speaker 1: I noticed that, like, my kids are way better at detecting AI generated images whenever there are those games, you know, AI generated or not. I get them wrong every time, and my kids know every single time. Like, it's so obvious. So maybe there are some cues that, you know, younger demographics, people who have more hourly jobs, can develop to help them evaluate and work on their critical thinking skills, to help the rest of us prevent it.
Speaker 3: I mean, I'm starting to wonder if maybe the one prediction that we could make is that there might be some kind of new category of job created, which is basically workslop manager, like somebody who can catch the workslop and say, I know what that is, let me fix that workslop, Jim. Workslop janitor. You gotta be kidding me, man. You're right, I think some people are that right now.
Speaker 2: I think they are, for sure.
Speaker 3: It's a little bleak, but maybe it's necessary. Maybe this is where we're headed. Maybe not everybody figures out a way within their company to make it so the workslop isn't being generated, and so just understanding that this stuff is going to be generated, and then we got to have somebody clean it up before it gets put out. You know, Kate and I didn't realize this when we were talking, but it turns out that the phrasing of janitors for AI workslop is actually already a thing. I was just searching, and there is a site called code janitor that'll charge you a couple thousand dollars to clean up the AI generated code so that your app will stop crashing and actually work. This is a niche job that I think is probably going to spread out to other industries, maybe even the industry that you work in. To which you might say, well, wouldn't it just be better to have hired somebody who's good at their job to do it right the first time?
Speaker 3: Yes, probably. But if you're not the boss, that decision isn't always up to you.
Speaker 3: Thank you for checking out another episode of kill switch. If you want to email us, we're at kill switch at kaleidoscope dot NYC, or we're also on Instagram at kill switch pod. And if you like the show, which hopefully you do because you're all the way at the end, think about leaving us a review. It helps other people find the show, which helps us keep doing our thing. And once you've done that, did you know that kill switch is on YouTube? So if you like seeing stuff as well as hearing it, you can find the link for that in the show notes. kill switch is hosted by me, Dexter Thomas. It's produced by Sena Ozaki, Darluck Potts, and Julia Nutter. Alexanderveld also helped with production on this episode. Our theme song is by me and Kyle Murdoch, and Kyle also mixed the show. From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart, our executive producers are Katrina Norvell and Nikki Ettore. Oh, and one last thing. I don't want people to get the impression that workslop is just people turning bad work in to their bosses. The study shows that workslop also flows down the organizational ladder too. Kate, give me an example.
Speaker 1: There are some people who talk about receiving them from their CEO.
Speaker 2: Here's an example.
Speaker 1: The CEO of our company sent over some notes for fundraising strategy. It was just a whole lot of words saying nothing. The ChatGPT puke he sent over would have been so much more helpful as a paragraph written in his own words. The style of writing, the reliance on bullet points, the fact that our CEO readily admits using GPT for everything. He had just gotten back from vacation. He was asking GPT for parenting advice. So it's just, like, going on and on.
Speaker 3: I'm sorry, ChatGPT for parenting advice is, you know, I'm gonna leave that alone, because that's an entirely different episode.
Speaker 3: I don't want to touch that right now. Wow. I really hope that we do not need to make an episode about people using ChatGPT for parenting advice. Please do not make us make an episode about that. Anyway.
Speaker 4: Catch you on the next one. Goodbye.