Speaker 1 [00:00:01]: What if every morning, instead of drowning in a sea of newsletters, tweets, and podcasts, you had the day's biggest breakthroughs already sorted for you? That's what Morgan Brown, Vice President of Product and Growth at Dropbox, built for himself. By the time he sits down with his coffee, an AI workflow has already scanned the landscape of research papers, Substacks, podcasts, and even YouTube. It spits out just three can't-miss signals, ranked from world changing to incremental, and explains what they actually mean for his role. In today's Quick Win, Morgan shares the simple steps he used to design that system: starting with one prompt, refining it, and running it daily. By the end, you'll know how to set up your own version to cut through the noise and actually stay ahead.

Speaker 1 [00:01:00]: Welcome to How I Work, a show about habits, rituals, and strategies for optimizing your day. I'm your host, Dr Amantha Imber.

Speaker 1 [00:01:08]: Still staying on the topic of, I guess, reducing or eliminating some of the grunt work through AI, I'd love to know, whether they're more on the automation or the agentic AI side of things, what some of the other tasks you've been able to automate in interesting ways would be.

Speaker 2 [00:01:32]: I've really done a lot of automation around trying to understand what's changing around me. So I have a daily workflow that reviews all the new Substack newsletters, important tweets across a large set of AI and ML accounts on X, and arXiv, which is the scientific paper repository, for new ML and artificial intelligence papers. It scans the whole landscape of Spotify podcasts, and it basically takes all of the day's information about AI and ML, reviews it all, categorizes it into world changing, interesting, incremental, and noise, then summarizes the top three, quote unquote, world-changing potentials, and summarizes them for me in the context of my role at Dash, where it talks about, hey, things you might consider as the head of product for Dash.
So, for example, Anthropic the other day just announced that Claude can now handle a million-token context window, which effectively means that for many reasonably sized software codebases, it can reason about the entire codebase, which is a pretty big and important milestone for software development. And the other one is, I automate my personal email. I can't imagine your inbox, Amantha, but mine gets hammered all the time. So I built a little agent using Claude Code to read my personal Gmail every day, categorize it into newsletters, spam, and priority emails, summarize the newsletters for me, flag the messages that I want to respond to, and draft responses for those emails. So really trying to manage my inbound email automatically as well.

[An illustrative sketch of this kind of email agent appears at the end of this transcript.]

Speaker 1 [00:03:19]: Wow, okay, I love both those examples. I want to dig a bit deeper into how you went about creating this automation that summarizes all the AI news of the day. I mean, that sounds so powerful. Can I buy that from you, Morgan? Are you selling it?

Speaker 2 [00:03:37]: You can link it in the show notes.

Speaker 1 [00:03:39]: Wow, amazing, handed over. Yeah, wow. Talk me through the process of how you put that together.

Speaker 2 [00:03:45]: First of all, I realized that there were a few key sources of information that I was constantly referring to personally. So, you know, whenever I had a free moment, I would go to arXiv and browse the actual published papers, because that's the real source of truth for the cutting edge. I would see a lot of popular posts on X about new research breakthroughs; I follow some of the key people, like the Sam Altmans of the world and the frontier labs. And then there are also several great Substack newsletters around AI that I read as much as I can. But I just realized there was no way for me to stay on top of that volume consistently, and I just started with a prompt.
I told ChatGPT, I'd like to create a prompt to scan the entire AI/ML space for important signals around research and development and industry news that pertain to my role at Dropbox. And I said, for X, I want to look across these accounts and accounts like them. So I gave it a seed list of accounts, but asked it to expand where it could. I gave it specific newsletters that I read on Substack, but again asked it to consider adjacent ones. And then I gave it the specific categories on arXiv where those papers are housed and asked it to consider relevant ones there. And then I said, I'd like to really filter for high signal; I don't want this to be a laundry list of everything that happened, because that wouldn't actually help me consume it any better. So I said, help me come up with a scale to rank the innovations and the news coming out that day, and categorize things: these are can't-miss, foundational, important shifts; these are incremental but important; and then give me some synthesis around, hey, when you put these signals together, what does it mean for my role? I fed all of that in, basically just talking to ChatGPT. I like to use the audio input to stream-of-consciousness my thoughts into it. Once I did that, I asked it to structure it into a prompt. I reviewed the prompt, and there were some things I didn't like, so I went back and forth with it; I think you should refine it much like you would the copy for a website or, you know, the script for something. And then once I got the prompt to a good start, I said, okay, let's run that for today and give me the output. And I got the output and I was like, oh, this is a little too long, it could be a little shorter; it looks like we're underrepresenting, we're missing YouTube.
Let's try to add YouTube in, and so on and so forth. Then I finally got it dialed in, and I said, great, set this as a daily task, every day at five a.m. Pacific, run it. And then while I'm having coffee, I just read through it.

Speaker 1 [00:06:24]: What I love about Morgan's approach is that it didn't start with coding or fancy tools. It started with a simple question: what sources do I keep checking, and how could I pull them together? So the next time you feel buried under articles or updates, try listing your key sources, feeding them into a prompt, and asking AI to filter and rank what matters most. [Illustrative sketches of that kind of prompt, and of the email agent Morgan mentioned, follow below.] And of course, you'll find the full episode with Morgan linked in the show notes. If you like today's show, make sure you hit follow on your podcast app to be alerted when new episodes drop. How I Work was recorded on the traditional land of the Wurundjeri people, part of the Kulin Nation.
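For a starting point, here is a rough sketch of the kind of daily digest prompt Morgan describes. The sources, the world changing / interesting / incremental / noise ranking scale, the top-three summary, the role-specific synthesis, and the five a.m. Pacific schedule all come from his description above; the exact wording, the bracketed placeholders, and the example arXiv categories (cs.AI, cs.LG, cs.CL) are illustrative assumptions, not his actual prompt.

    Every day at 5:00 a.m. Pacific, scan the past 24 hours of AI/ML news from these sources:
    - arXiv, in categories such as cs.AI, cs.LG, and cs.CL, plus any other relevant ones
    - X posts from this seed list of accounts, expanding to similar accounts: [your seed list]
    - These Substack newsletters, plus adjacent ones: [the newsletters you read]
    - Spotify podcasts and YouTube channels covering AI/ML
    Rank every item as world changing, interesting, incremental, or noise.
    Report only the top three world-changing items, each with a short summary.
    Finish with a synthesis: taken together, what do these signals mean for my role as [your role] at [your company]?

As Morgan describes, you would then iterate on length and coverage in conversation with ChatGPT and ask it to save the refined version as a scheduled daily task.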
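As for the email agent Morgan mentions, below is a minimal sketch of the sort of script a tool like Claude Code might produce, assuming Python, the official Gmail API client (google-api-python-client), and the Anthropic SDK. The triage outputs (newsletters, spam, priority, newsletter summaries, flagged messages, draft replies) come from his description; the prompt wording, file names, and model alias are assumptions for illustration, not his actual implementation.

from anthropic import Anthropic
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/gmail.readonly"]  # read-only is enough for triage

def fetch_recent_messages(service, max_results=50):
    # Pull sender, subject, and snippet for the last day's mail.
    resp = service.users().messages().list(
        userId="me", q="newer_than:1d", maxResults=max_results).execute()
    messages = []
    for ref in resp.get("messages", []):
        msg = service.users().messages().get(
            userId="me", id=ref["id"], format="metadata",
            metadataHeaders=["From", "Subject"]).execute()
        headers = {h["name"]: h["value"] for h in msg["payload"]["headers"]}
        messages.append((headers.get("From", ""), headers.get("Subject", ""),
                         msg.get("snippet", "")))
    return messages

def triage(client, messages):
    # Ask Claude to sort mail into newsletters / spam / priority, summarize the
    # newsletters, flag what needs a reply, and draft those replies.
    listing = "\n".join(f"{i + 1}. From: {sender} | Subject: {subject} | {snippet}"
                        for i, (sender, subject, snippet) in enumerate(messages))
    prompt = ("Categorize each email below as newsletter, spam, or priority. "
              "Summarize the newsletters in a few bullets, flag the messages I "
              "should respond to, and draft a short reply for each of those.\n\n"
              + listing)
    reply = client.messages.create(
        model="claude-3-5-sonnet-latest",  # substitute whichever Claude model you use
        max_tokens=2000,
        messages=[{"role": "user", "content": prompt}])
    return reply.content[0].text

if __name__ == "__main__":
    creds = Credentials.from_authorized_user_file("token.json", SCOPES)  # standard Gmail OAuth token
    gmail = build("gmail", "v1", credentials=creds)
    print(triage(Anthropic(), fetch_recent_messages(gmail)))  # morning digest to stdout

This sketch only reads headers and snippets and prints the digest to the terminal; automatically saving or sending the drafted replies would need a broader scope such as gmail.compose and more care.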