1 00:00:01,080 --> 00:00:05,400 Speaker 1: The secret to using AI well might actually be teaching 2 00:00:05,440 --> 00:00:09,159 Speaker 1: it to shut up. In this bonus episode, I am 3 00:00:09,240 --> 00:00:12,400 Speaker 1: joined by Elan Lee, the brilliant mind behind games like 4 00:00:12,440 --> 00:00:15,720 Speaker 1: Exploding Kittens and Throw Throw Burrito. And while you might 5 00:00:15,800 --> 00:00:19,200 Speaker 1: expect someone like Elan to use AI to design wild 6 00:00:19,280 --> 00:00:23,720 Speaker 1: new worlds or characters, that's not what he does at all. Instead, 7 00:00:24,040 --> 00:00:27,600 Speaker 1: he's found a way to make AI his creative sparring partner, 8 00:00:27,880 --> 00:00:31,800 Speaker 1: one that never interrupts, never judges, and somehow helps him 9 00:00:31,840 --> 00:00:33,159 Speaker 1: think more clearly. 10 00:00:33,760 --> 00:00:38,960 Speaker 2: When you're working with AI, you don't let it answer 11 00:00:39,040 --> 00:00:42,120 Speaker 2: questions for you. You let it ask you questions. 12 00:00:42,360 --> 00:00:45,680 Speaker 1: In this bonus chat, you'll hear how Elan uses AI 13 00:00:45,840 --> 00:00:49,400 Speaker 1: to stress test his game ideas, solve tricky design problems, 14 00:00:49,479 --> 00:01:00,360 Speaker 1: and even guide his own creative walks. Welcome to How 15 00:01:00,480 --> 00:01:04,080 Speaker 1: I Work, a show about habits, rituals, and strategies for 16 00:01:04,200 --> 00:01:12,600 Speaker 1: optimizing your day. I'm your host, doctor Amantha Imber. I'm 17 00:01:12,720 --> 00:01:15,760 Speaker 1: curious, now that gen AI is such a part of 18 00:01:15,800 --> 00:01:18,640 Speaker 1: our work lives, and you know, really has been for 19 00:01:18,840 --> 00:01:23,680 Speaker 1: like three years now, how has that changed your creative process? 20 00:01:23,720 --> 00:01:26,319 Speaker 2: If at all. We don't use it as part of 21 00:01:26,400 --> 00:01:29,520 Speaker 2: the creative process. I have asked AI. I've said, hey, 22 00:01:30,080 --> 00:01:31,840 Speaker 2: I want to come up with a game that has 23 00:01:31,880 --> 00:01:35,160 Speaker 2: a deck of cards and it's themed around fish, and it 24 00:01:35,360 --> 00:01:37,399 Speaker 2: involves something that you throw up in the air and 25 00:01:37,440 --> 00:01:39,720 Speaker 2: have to catch, and the junk that it spits 26 00:01:39,800 --> 00:01:43,319 Speaker 2: back is the worst, most awful stuff imaginable. So we learned very quickly, you know, 27 00:01:43,400 --> 00:01:45,200 Speaker 2: we tried all those tests and 28 00:01:45,240 --> 00:01:47,640 Speaker 2: realized, like, okay, this is not what AI is for. 29 00:01:48,160 --> 00:01:50,400 Speaker 2: Then we tried it for art. We're like, okay, so 30 00:01:50,520 --> 00:01:53,040 Speaker 2: here's a final game. I don't have a good theme. 31 00:01:53,320 --> 00:01:55,320 Speaker 2: Come up with one and create all the art. And 32 00:01:55,400 --> 00:01:58,040 Speaker 2: it comes up with, well, you've seen it, it comes up 33 00:01:58,040 --> 00:02:01,800 Speaker 2: with AI art, and so that's useless. So now what 34 00:02:01,840 --> 00:02:05,680 Speaker 2: we use it for is kind of an augmentation process. 35 00:02:05,840 --> 00:02:09,520 Speaker 2: We'll say, like, here are instructions that we're ready to 36 00:02:09,520 --> 00:02:12,560 Speaker 2: send out to these four hundred families. However, once we 37 00:02:12,639 --> 00:02:15,320 Speaker 2: do that, we can't test with those four hundred families anymore.
38 00:02:15,360 --> 00:02:17,040 Speaker 2: So I want to make sure this has the best 39 00:02:17,080 --> 00:02:21,080 Speaker 2: chance of success. Dear AI, please read these instructions and 40 00:02:21,400 --> 00:02:24,160 Speaker 2: ask me any questions that will help with clarity, 41 00:02:24,480 --> 00:02:27,640 Speaker 2: and it will find edge cases that I never would 42 00:02:27,639 --> 00:02:30,480 Speaker 2: have come up with in one hundred years. And that's really, 43 00:02:30,520 --> 00:02:34,200 Speaker 2: really helpful because now I get the most out of 44 00:02:34,200 --> 00:02:37,360 Speaker 2: these families because I can only use them each once. Now, 45 00:02:37,440 --> 00:02:41,079 Speaker 2: anything that could possibly come up, within reason, AI has 46 00:02:41,160 --> 00:02:44,440 Speaker 2: usually flagged. We've modified the instructions and now we get 47 00:02:44,720 --> 00:02:48,040 Speaker 2: wonderful test results that speak to the quality of gameplay 48 00:02:48,080 --> 00:02:51,320 Speaker 2: and not to these weird edge cases that just distract 49 00:02:51,440 --> 00:02:54,520 Speaker 2: from gameplay. So we use it a lot there. We 50 00:02:54,680 --> 00:02:57,120 Speaker 2: use it a lot when we get into a bind 51 00:02:57,360 --> 00:03:01,440 Speaker 2: with points. Points in games are this whole weird ecosystem. 52 00:03:01,480 --> 00:03:04,320 Speaker 2: It's an economy. It's this miniature economy that exists within 53 00:03:04,360 --> 00:03:07,120 Speaker 2: the game, and if inflation gets too high, the game breaks, 54 00:03:07,120 --> 00:03:09,480 Speaker 2: and if a recession happens, the game breaks, and if 55 00:03:09,560 --> 00:03:12,280 Speaker 2: you run out of resources, the game breaks. Right. Points 56 00:03:12,320 --> 00:03:15,239 Speaker 2: are weird because they have to abide by all the 57 00:03:15,320 --> 00:03:18,160 Speaker 2: rules of economics, but in this very, very micro sense, 58 00:03:18,160 --> 00:03:21,880 Speaker 2: and it's different from game to game. So oftentimes we'll 59 00:03:21,919 --> 00:03:24,200 Speaker 2: find ourselves in a bind where it's like we hit 60 00:03:24,240 --> 00:03:27,320 Speaker 2: this thing where everyone had to get points, but nobody 61 00:03:27,400 --> 00:03:29,480 Speaker 2: won because there weren't enough points. And I don't want 62 00:03:29,480 --> 00:03:32,200 Speaker 2: to put infinite points in this game, so how 63 00:03:32,280 --> 00:03:35,320 Speaker 2: do we restrict this one little thing? AI is very 64 00:03:35,320 --> 00:03:40,240 Speaker 2: good at those problems, solving microeconomic issues. Turns out AI 65 00:03:40,400 --> 00:03:42,240 Speaker 2: is lovely at that in a way that I don't 66 00:03:42,240 --> 00:03:43,800 Speaker 2: want to study, I don't want to get good at. 67 00:03:43,960 --> 00:03:45,760 Speaker 2: I'm going to just throw the problem at AI. It 68 00:03:45,800 --> 00:03:47,880 Speaker 2: feels a lot like math. Let somebody else solve it. 69 00:03:47,960 --> 00:03:51,600 Speaker 1: I love those two examples. Are there any other ways 70 00:03:51,680 --> 00:03:55,040 Speaker 1: that you've personally found AI to be quite useful in 71 00:03:55,280 --> 00:03:56,040 Speaker 1: the role that you do? 72 00:03:56,440 --> 00:03:59,800 Speaker 2: Yeah.
I use AI every day for about two to 73 00:03:59,800 --> 00:04:03,560 Speaker 2: three hours a day, and what I do is primarily 74 00:04:03,600 --> 00:04:07,000 Speaker 2: I use ChatGPT and I'll put it on voice mode, 75 00:04:07,200 --> 00:04:09,680 Speaker 2: and I have this prompt that I used to have 76 00:04:09,720 --> 00:04:12,160 Speaker 2: to refeed it, but now it's learned to remember this prompt. 77 00:04:12,400 --> 00:04:15,920 Speaker 2: But the prompt basically says, I'm going to go for 78 00:04:16,000 --> 00:04:18,080 Speaker 2: a three hour walk and I want you to stay 79 00:04:18,080 --> 00:04:21,200 Speaker 2: on the whole time, and I'm going to describe just 80 00:04:21,240 --> 00:04:24,240 Speaker 2: what's going through my head. It might be a game design issue, 81 00:04:24,320 --> 00:04:26,719 Speaker 2: it might be a company policy issue, it might be 82 00:04:26,720 --> 00:04:28,920 Speaker 2: an HR issue. Whatever it is, here's what I'm thinking 83 00:04:28,920 --> 00:04:32,560 Speaker 2: about today. And one, every time I tell you to 84 00:04:32,600 --> 00:04:35,520 Speaker 2: take notes, just record the last thing I said, and 85 00:04:35,680 --> 00:04:38,599 Speaker 2: later on give me a transcript of just the parts 86 00:04:38,640 --> 00:04:41,599 Speaker 2: that I've said to record. And then two, don't be 87 00:04:41,680 --> 00:04:45,400 Speaker 2: scared of silence. Please, please, please do not fill silence. 88 00:04:45,480 --> 00:04:47,719 Speaker 2: If I stop talking, that doesn't mean you talk, 89 00:04:48,040 --> 00:04:51,040 Speaker 2: that means just hang out, I'm processing. And if I 90 00:04:51,080 --> 00:04:53,520 Speaker 2: ask you a question, please help me. And if I 91 00:04:53,560 --> 00:04:56,600 Speaker 2: ask you for other ideas for something, then please answer. 92 00:04:56,880 --> 00:04:59,279 Speaker 2: But your job is not to be the other half 93 00:04:59,320 --> 00:05:02,160 Speaker 2: of this conversation. Your job is to let me think 94 00:05:02,320 --> 00:05:05,800 Speaker 2: and facilitate that thinking process. And I do that every 95 00:05:05,839 --> 00:05:09,560 Speaker 2: single morning, and it's glorious. It's such a wonderful practice 96 00:05:09,560 --> 00:05:12,279 Speaker 2: because I come back with this note file that it 97 00:05:12,360 --> 00:05:16,320 Speaker 2: generates of exactly how I want to handle the problem 98 00:05:16,400 --> 00:05:17,120 Speaker 2: that I was working on. 99 00:05:18,400 --> 00:05:20,559 Speaker 1: I want to know more about this. Like, with every 100 00:05:20,600 --> 00:05:22,400 Speaker 1: walk that you're going on in the morning, are you 101 00:05:22,480 --> 00:05:26,640 Speaker 1: starting with a predefined problem or is it sometimes fuzzier 102 00:05:26,640 --> 00:05:26,920 Speaker 1: than that? 103 00:05:27,120 --> 00:05:29,239 Speaker 2: Yeah, I used to think I had to. I would 104 00:05:29,279 --> 00:05:32,880 Speaker 2: list out exactly the problem, but I found that as 105 00:05:33,400 --> 00:05:36,960 Speaker 2: the engine gets better and better, I can come to 106 00:05:37,000 --> 00:05:40,760 Speaker 2: it with some pretty nebulous ideas and sometimes it will 107 00:05:40,760 --> 00:05:44,440 Speaker 2: guide me through brainstorming. Again, these are some modifications I 108 00:05:44,520 --> 00:05:46,760 Speaker 2: made to the prompt, where I'll say, like, I'm looking 109 00:05:46,800 --> 00:05:49,640 Speaker 2: for a new game idea.
I'm vaguely interested by X, Y, 110 00:05:49,720 --> 00:05:54,279 Speaker 2: and Z, so help me figure out what I'm trying 111 00:05:54,320 --> 00:05:57,640 Speaker 2: to figure out. Don't answer it for me, just ask 112 00:05:57,760 --> 00:06:00,640 Speaker 2: me leading questions that will help me work through my 113 00:06:00,680 --> 00:06:03,640 Speaker 2: own design process. And it's quite good at that. I'm 114 00:06:03,680 --> 00:06:06,800 Speaker 2: able to approach it with much less data than I 115 00:06:06,839 --> 00:06:08,560 Speaker 2: thought I had to and still come out with a 116 00:06:08,560 --> 00:06:09,120 Speaker 2: great result. 117 00:06:09,240 --> 00:06:12,400 Speaker 1: That's really interesting how you don't ask it for input. 118 00:06:12,440 --> 00:06:15,120 Speaker 1: You're just using it as someone to question you and 119 00:06:15,200 --> 00:06:16,960 Speaker 1: extract information from yourself. 120 00:06:17,520 --> 00:06:18,040 Speaker 1: If you were 121 00:06:17,960 --> 00:06:22,400 Speaker 1: giving advice to someone who was looking for help being 122 00:06:22,440 --> 00:06:25,159 Speaker 1: more creative, improving their own creative process, and they're not 123 00:06:25,200 --> 00:06:27,760 Speaker 1: you and they're not math, like, what advice would you 124 00:06:27,760 --> 00:06:30,560 Speaker 1: give them on how to work with AI? 125 00:06:31,400 --> 00:06:35,480 Speaker 2: First, you have to fill your own head with raw materials. 126 00:06:35,720 --> 00:06:38,200 Speaker 2: You don't have to remember everything, but like you need 127 00:06:38,240 --> 00:06:40,479 Speaker 2: to go watch all the movies, and you need to 128 00:06:40,480 --> 00:06:42,400 Speaker 2: listen to all the podcasts, and you need to read 129 00:06:42,480 --> 00:06:44,440 Speaker 2: all the books and watch all the TV shows and 130 00:06:44,440 --> 00:06:47,320 Speaker 2: play all the games. You don't have to have 131 00:06:47,400 --> 00:06:49,760 Speaker 2: instant recall of these things, but you do need a 132 00:06:49,800 --> 00:06:53,160 Speaker 2: lot of raw data in your head kicking around in 133 00:06:53,200 --> 00:07:00,520 Speaker 2: there somewhere. And then when you're working with AI, don't 134 00:07:00,920 --> 00:07:04,240 Speaker 2: let it answer questions for you. You let it ask 135 00:07:04,400 --> 00:07:07,800 Speaker 2: you questions that all of that raw material might be 136 00:07:07,880 --> 00:07:11,560 Speaker 2: the answer for, and in particular, you train it to 137 00:07:11,800 --> 00:07:15,760 Speaker 2: encourage you to combine things together. I feel like all 138 00:07:15,960 --> 00:07:19,600 Speaker 2: my best ideas are really the combination of other people's ideas, 139 00:07:19,840 --> 00:07:23,400 Speaker 2: just combined in new and interesting ways. And if you 140 00:07:23,440 --> 00:07:28,400 Speaker 2: can get AI to help you focus on those individual 141 00:07:28,480 --> 00:07:30,240 Speaker 2: nuggets that are kicking around in your head because you 142 00:07:30,280 --> 00:07:32,800 Speaker 2: filled your head with all those things, and then encourage 143 00:07:32,840 --> 00:07:36,320 Speaker 2: you to start combining them together, not how to combine them, 144 00:07:36,480 --> 00:07:39,200 Speaker 2: just, hey, what if you mix those two ideas together? 145 00:07:40,480 --> 00:07:44,000 Speaker 2: Suddenly you're brainstorming in a whole new capacity.
146 00:07:44,160 --> 00:07:47,320 Speaker 1: What I love about Elan's approach is that he's flipped 147 00:07:47,360 --> 00:07:50,720 Speaker 1: the script on AI. Instead of asking it to come 148 00:07:50,800 --> 00:07:54,320 Speaker 1: up with ideas, he uses it to ask him better questions, 149 00:07:54,880 --> 00:07:57,480 Speaker 1: to help him see what's already in his head and 150 00:07:57,600 --> 00:08:01,000 Speaker 1: connect the dots in new ways. So the next time 151 00:08:01,000 --> 00:08:05,600 Speaker 1: you're feeling stuck, try this: open ChatGPT and ask 152 00:08:05,640 --> 00:08:08,360 Speaker 1: it to interview you about your problem instead of solving 153 00:08:08,400 --> 00:08:12,360 Speaker 1: it for you. You might be surprised by what surfaces. 154 00:08:12,960 --> 00:08:16,000 Speaker 1: And if you haven't yet heard my full conversation with Elan, 155 00:08:16,120 --> 00:08:20,160 Speaker 1: I'd highly recommend it, where we dive into creativity, failure, 156 00:08:20,320 --> 00:08:23,040 Speaker 1: and what it really takes to make great ideas happen. 157 00:08:23,360 --> 00:08:25,320 Speaker 1: There is a link to that in the show notes. 158 00:08:25,480 --> 00:08:28,200 Speaker 1: If you like today's show, make sure you hit follow 159 00:08:28,320 --> 00:08:31,800 Speaker 1: on your podcast app to be alerted when new episodes drop. 160 00:08:32,360 --> 00:08:34,880 Speaker 1: How I Work was recorded on the traditional land of 161 00:08:34,880 --> 00:08:37,080 Speaker 1: the Wurundjeri people, part of the Kulin Nation.