WEBVTT - UL NO. 447: Sam Curry on Bug Bounty Careers, Slack Data Exfil, The Work Lie

0:00:01.129 --> 0:00:04.460
<v S1>Welcome to Unsupervised Learning, a security, AI and meaning focused

0:00:04.460 --> 0:00:07.310
<v S1>podcast that looks at how best to thrive as humans

0:00:07.310 --> 0:00:11.540
<v S1>in a post AI world. It combines original ideas, analysis,

0:00:11.570 --> 0:00:14.780
<v S1>and mental models to bring not just the news, but

0:00:14.780 --> 0:00:22.340
<v S1>why it matters and how to respond. All right, welcome

0:00:22.340 --> 0:00:26.060
<v S1>to unsupervised learning. This is Daniel Miessler. All right. What

0:00:26.060 --> 0:00:29.480
<v S1>have we got here? Lots of content this week. Super

0:00:29.480 --> 0:00:32.840
<v S1>excited for this episode. Uh, did an all text episode

0:00:32.840 --> 0:00:34.940
<v S1>this time, which is kind of a callback to how

0:00:34.940 --> 0:00:37.520
<v S1>I used to do it. Got upcoming speaking at the

0:00:37.520 --> 0:00:43.400
<v S1>Snyk conference in October, Cyberstorm in Switzerland and Blackhat in Riyadh.

0:00:43.430 --> 0:00:47.000
<v S1>One tool in AI that you should be trying out,

0:00:47.000 --> 0:00:50.870
<v S1>that everyone is talking about, including Karpathy and a bunch

0:00:50.870 --> 0:00:54.800
<v S1>of other people, is called cursor AI. So it's cursor.com

0:00:54.800 --> 0:00:57.230
<v S1>is the domain. I thought it was cursor AI, but

0:00:57.230 --> 0:01:00.730
<v S1>it is not. But the big feature appears to be

0:01:00.730 --> 0:01:04.630
<v S1>that it basically it looks and feels like VSCode, but

0:01:04.630 --> 0:01:08.259
<v S1>what you do is you upload your entire repository into it.

0:01:08.290 --> 0:01:11.710
<v S1>I guess I guess technically VSCode could also see your

0:01:11.740 --> 0:01:14.259
<v S1>all your code if it's all in there as well.

0:01:14.260 --> 0:01:17.110
<v S1>But I think what cursor is supposedly doing well is

0:01:17.110 --> 0:01:19.900
<v S1>it's taking all of that content and kind of using

0:01:19.900 --> 0:01:22.900
<v S1>it in context to understand it better, not just the

0:01:22.900 --> 0:01:25.240
<v S1>current file that you're in. So I believe that's the

0:01:25.240 --> 0:01:28.360
<v S1>big feature. If I'm wrong about that, somebody correct me. Okay.

0:01:28.390 --> 0:01:33.280
<v S1>My work, a couple of massive episodes or essays that

0:01:33.280 --> 0:01:35.620
<v S1>I put out this week, and I actually sent them

0:01:35.620 --> 0:01:38.470
<v S1>out directly, which I only do for things that I

0:01:38.470 --> 0:01:43.000
<v S1>think are really decent and also evergreen. So one of

0:01:43.000 --> 0:01:46.600
<v S1>them is called We Were Lied To About Work or

0:01:46.600 --> 0:01:48.610
<v S1>the Real Problem with the job market. I had two

0:01:48.610 --> 0:01:52.000
<v S1>different titles actually, but it's basically why layoffs, hiring the

0:01:52.000 --> 0:01:54.850
<v S1>job market, and work in general just really sucks right now.

0:01:54.850 --> 0:01:56.559
<v S1>And I would say it's probably one of my top

0:01:56.590 --> 0:02:00.970
<v S1>20 essays ever. So highly, highly recommend that one. And

0:02:00.970 --> 0:02:03.850
<v S1>I've got a new way to explain AI and specifically

0:02:03.880 --> 0:02:06.880
<v S1>Llms to people, and I think this one is short enough.

0:02:06.880 --> 0:02:08.920
<v S1>In fact, I'm just going to give the highlights of it.

0:02:08.919 --> 0:02:11.950
<v S1>So let me take you into this one. So here

0:02:11.980 --> 0:02:16.510
<v S1>is the basic concept five levels of LLM understanding. So

0:02:16.510 --> 0:02:19.630
<v S1>the first level at the bottom is it's just predicting

0:02:19.660 --> 0:02:23.440
<v S1>text like the next text token in a in a

0:02:23.440 --> 0:02:27.429
<v S1>text sequence. That's all it's doing. It's not magical. Right.

0:02:27.430 --> 0:02:30.010
<v S1>And this is kind of the most common argument for

0:02:30.010 --> 0:02:33.580
<v S1>why llms and AI is not all that special or

0:02:33.580 --> 0:02:37.180
<v S1>specifically AI based on Llms. People are just like, look,

0:02:37.210 --> 0:02:40.030
<v S1>it's just next token prediction. No big deal. If you

0:02:40.030 --> 0:02:43.090
<v S1>pull away from that one level, it's like, look, it's

0:02:43.090 --> 0:02:46.360
<v S1>just predicting the next item in a sequence, okay? So

0:02:46.360 --> 0:02:50.919
<v S1>it's just next token prediction okay. One abstraction away from there.

0:02:50.919 --> 0:02:54.880
<v S1>And this all comes from this thing from Eliezer Yudkowsky

0:02:54.910 --> 0:02:59.429
<v S1>who had this, this great kind of little statement on

0:02:59.430 --> 0:03:02.340
<v S1>X that made me think about this, but essentially what

0:03:02.340 --> 0:03:06.090
<v S1>it is, is for this level. Level three, it's predicting

0:03:06.090 --> 0:03:09.839
<v S1>the next token in the description of an answer. Okay.

0:03:09.870 --> 0:03:16.110
<v S1>So what Yudkowsky said was any predicting the next token

0:03:16.110 --> 0:03:20.850
<v S1>is isomorphic with predicting the next token of an answer. Okay.

0:03:20.880 --> 0:03:23.220
<v S1>And that is really, really powerful. And it wasn't the

0:03:23.220 --> 0:03:25.740
<v S1>exact quote, but that's essentially it. If you go one

0:03:25.740 --> 0:03:31.350
<v S1>level above that okay, it's predicting answers to insanely difficult questions.

0:03:31.350 --> 0:03:34.590
<v S1>So so here are the four levels so far. It's

0:03:34.590 --> 0:03:37.920
<v S1>predicting the next token in a piece of text. It's

0:03:37.920 --> 0:03:41.670
<v S1>predicting the next token third level. It's predicting the next

0:03:41.670 --> 0:03:46.080
<v S1>token of an answer. Second level it's predicting answers. And

0:03:46.080 --> 0:03:49.770
<v S1>the top level it seems to know everything okay. And

0:03:49.770 --> 0:03:53.100
<v S1>this all comes from this. Here it is right here

0:03:53.100 --> 0:03:57.390
<v S1>it just predicts the next token literally any well posed

0:03:57.410 --> 0:04:01.340
<v S1>problem is isomorphic to predict the next token of the answer.

0:04:01.340 --> 0:04:05.030
<v S1>And literally anyone with a grasp of undergraduate comp sci

0:04:05.060 --> 0:04:07.820
<v S1>is supposed to see that without being told. I don't

0:04:07.850 --> 0:04:10.160
<v S1>agree with that last part. This part here, I don't

0:04:10.190 --> 0:04:13.070
<v S1>agree with that. I don't think that's accurate. In fact,

0:04:13.070 --> 0:04:16.400
<v S1>I think the vast majority of people that's very much

0:04:16.400 --> 0:04:18.979
<v S1>not the case. Very much not the case. I would

0:04:18.980 --> 0:04:23.300
<v S1>say 95%. So I would say I disagree on multiple

0:04:23.300 --> 0:04:26.360
<v S1>levels for that second piece. The part that I like

0:04:26.390 --> 0:04:32.060
<v S1>is this here, which I highlighted and separated on purpose,

0:04:32.060 --> 0:04:36.950
<v S1>literally any well posed problem is isomorphic to predict the

0:04:36.950 --> 0:04:42.530
<v S1>next token of the answer. That is extraordinary. That is brilliant.

0:04:42.529 --> 0:04:46.310
<v S1>That is the answer to the it's just next token

0:04:46.310 --> 0:04:50.240
<v S1>prediction argument. That is, the immediate counter of that well

0:04:50.270 --> 0:04:54.500
<v S1>posed problem equals prediction of next token of answer. So

0:04:54.500 --> 0:04:58.010
<v S1>that's that one. And basically this essay is a full

0:04:58.040 --> 0:05:02.630
<v S1>breakdown of that. So it's a standalone evergreen essay about

0:05:02.630 --> 0:05:05.660
<v S1>why this is a powerful concept and why you shouldn't

0:05:05.660 --> 0:05:09.800
<v S1>trust that next token argument. Okay. Security. CrowdStrike did their

0:05:09.980 --> 0:05:15.680
<v S1>2024 report talking about how North Koreans have infiltrated over

0:05:15.680 --> 0:05:19.339
<v S1>100 US based companies in all sorts of important places

0:05:19.339 --> 0:05:22.250
<v S1>like aerospace, defense, retail and tech. They didn't mention much

0:05:22.250 --> 0:05:25.490
<v S1>about Blue Friday. Not sure why that was. State linked

0:05:25.490 --> 0:05:29.930
<v S1>Chinese entities are using cloud services from Amazon and other

0:05:29.930 --> 0:05:34.940
<v S1>competitors of Amazon to access advanced US chips and AI capabilities,

0:05:34.940 --> 0:05:38.000
<v S1>so basically they can't get the chips themselves. But you

0:05:38.000 --> 0:05:41.239
<v S1>can go to AWS. Anybody can, or most people can.

0:05:41.270 --> 0:05:44.029
<v S1>You go to AWS, you sign up and you just

0:05:44.029 --> 0:05:48.740
<v S1>attach Nvidia chips to that instance and you start doing workloads.

0:05:48.740 --> 0:05:51.890
<v S1>And so this is what China is doing. They're like, well,

0:05:51.920 --> 0:05:54.080
<v S1>we can't get the chips, but we can use a

0:05:54.080 --> 0:05:57.679
<v S1>service that leverages the chips. So I'm sure Amazon and

0:05:57.680 --> 0:06:00.590
<v S1>the government are in conversations about that trying to get

0:06:00.589 --> 0:06:04.190
<v S1>that fixed. Cisco has patched multiple vulnerabilities, including a high

0:06:04.190 --> 0:06:08.600
<v S1>severity bug in its Unified Communications Manager product. Thanks to

0:06:08.630 --> 0:06:12.860
<v S1>Threatlocker for sponsoring. Two US lawmakers are urging the Commerce

0:06:12.860 --> 0:06:18.260
<v S1>Department to investigate cybersecurity risks associated with TP-Link routers, citing

0:06:18.260 --> 0:06:22.219
<v S1>vulnerabilities and potential data sharing with the Chinese government. So

0:06:22.220 --> 0:06:25.820
<v S1>kind of a Huawei situation. A mini Huawei Quarks lab

0:06:25.820 --> 0:06:28.970
<v S1>found a major back door in RFID cards made by

0:06:28.970 --> 0:06:34.310
<v S1>Shanghai Fudan Microelectronics, one of China's top chip manufacturers. I

0:06:34.310 --> 0:06:37.339
<v S1>don't think this is a China influence story. I think

0:06:37.339 --> 0:06:40.489
<v S1>this is just vulnerabilities and chips, and the impact here

0:06:40.490 --> 0:06:43.730
<v S1>is the fact that it's smart cards for like office

0:06:43.730 --> 0:06:46.729
<v S1>stores and hotel rooms and whatever. And it's a big

0:06:46.730 --> 0:06:50.780
<v S1>company that does this. And lots of RFID chips and

0:06:50.779 --> 0:06:54.020
<v S1>cards come from China. So just an impact due to

0:06:54.050 --> 0:06:56.440
<v S1>the size of the market type situation. What is this

0:06:56.440 --> 0:07:03.370
<v S1>bouncing now? Now you're seeing my Diablo four chat messages.

0:07:03.370 --> 0:07:06.039
<v S1>It's really important stuff I'm working on in Diablo four.

0:07:06.070 --> 0:07:08.440
<v S1>All right. What are we talking about here? Yeah. Thanks

0:07:08.440 --> 0:07:12.460
<v S1>to defender five for sponsoring. Next one. Researchers found a

0:07:12.460 --> 0:07:15.670
<v S1>way to exfiltrate data from Slack's AI by using indirect

0:07:15.670 --> 0:07:18.910
<v S1>prompt injection. US Navy is rolling out Starlink on its

0:07:18.940 --> 0:07:23.380
<v S1>warships to provide high speed, reliable internet connections, improving operations

0:07:23.380 --> 0:07:27.790
<v S1>and crew morale. AI and Tech Anthropic has published the

0:07:27.790 --> 0:07:31.720
<v S1>system prompts for its latest AI models, including opus, Sonnet

0:07:31.720 --> 0:07:34.840
<v S1>and Haiku. AGI bot is a Chinese company and they

0:07:34.840 --> 0:07:38.920
<v S1>just unveiled unveiled a fleet of advanced humanoid robots to

0:07:38.950 --> 0:07:43.150
<v S1>compete directly with Optimus, which is the one from Tesla,

0:07:43.150 --> 0:07:46.900
<v S1>and they're designed for tasks ranging from household chores to

0:07:46.930 --> 0:07:50.410
<v S1>industrial operations. And they're going to start shipping supposedly by

0:07:50.410 --> 0:07:53.590
<v S1>the end of this year. So like immediately. And Optimus

0:07:53.620 --> 0:07:57.960
<v S1>is nowhere near ready for that kind of timeline. So

0:07:57.960 --> 0:08:02.910
<v S1>I'm basically anti-Chinese imports for both robotaxis and humanoid robots,

0:08:02.910 --> 0:08:07.740
<v S1>because China is too far ahead and they're too cheap

0:08:07.740 --> 0:08:10.800
<v S1>and I would say just too good. So I don't

0:08:10.800 --> 0:08:12.960
<v S1>want to give them a head start. And I don't

0:08:12.960 --> 0:08:16.740
<v S1>like being anti-competitive against any sort of country. I don't

0:08:16.740 --> 0:08:19.530
<v S1>like slowing pressure from the outside, but if this were

0:08:19.530 --> 0:08:22.560
<v S1>India or Ireland, I would actually be okay with them

0:08:22.560 --> 0:08:25.350
<v S1>applying pressure to the US but not China because they're

0:08:25.350 --> 0:08:28.830
<v S1>too obviously a malicious actor who actually just like, wants

0:08:28.830 --> 0:08:31.950
<v S1>to crush the United States in like all aspects, including

0:08:31.950 --> 0:08:34.920
<v S1>the United States going away as an economic power. So

0:08:34.920 --> 0:08:37.859
<v S1>it's not like friendly competition. So I think we should

0:08:37.890 --> 0:08:41.280
<v S1>actually just either tax the hell out of them or

0:08:41.280 --> 0:08:44.010
<v S1>not allow them to function until until we have proper

0:08:44.010 --> 0:08:47.280
<v S1>footing and like, can compete properly. And speaking of that,

0:08:47.280 --> 0:08:51.030
<v S1>Tesla is hiring people to train its Optimus humanoid robot

0:08:51.030 --> 0:08:55.100
<v S1>by wearing a motion capture suit and mimicking the different actions.

0:08:55.100 --> 0:08:57.679
<v S1>You get like $48 an hour, but you got to

0:08:57.710 --> 0:09:01.160
<v S1>walk over seven hours a day carrying £30 while wearing

0:09:01.160 --> 0:09:04.340
<v S1>a VR headset. That's a tough job. Waymo is looking

0:09:04.340 --> 0:09:08.150
<v S1>to launch a subscription service called Waymo Teen, so this

0:09:08.150 --> 0:09:11.900
<v S1>is basically to help parents not have to shuffle kids around.

0:09:11.929 --> 0:09:15.020
<v S1>Although I'm not sure, depending on the age of the teen,

0:09:15.020 --> 0:09:17.840
<v S1>should they have their own car, I'm not sure. But anyway,

0:09:17.900 --> 0:09:23.780
<v S1>cool idea. I scientists developed by the University of British Columbia,

0:09:23.809 --> 0:09:30.110
<v S1>Oxford and Sakana AI is creating its own machine learning experiments. Okay,

0:09:30.140 --> 0:09:34.309
<v S1>let's back up. An AI scientist is creating its own

0:09:34.309 --> 0:09:37.849
<v S1>machine learning experiments and running them autonomously. I think this

0:09:37.850 --> 0:09:40.880
<v S1>is where most innovation will come from. AI not just

0:09:40.880 --> 0:09:44.179
<v S1>implementing tasks, but in doing new research. And I talked

0:09:44.179 --> 0:09:48.260
<v S1>about it in a post there. Victor Miller, a mayoral

0:09:48.260 --> 0:09:51.350
<v S1>candidate in Wyoming's capital city, has vowed to let its

0:09:51.350 --> 0:09:56.679
<v S1>customized ChatGPT named Vic Virtual Integrated Citizen help run the

0:09:56.679 --> 0:09:59.950
<v S1>local government. I'm actually working on how to articulate a

0:09:59.950 --> 0:10:03.910
<v S1>political platform for any level of office using substrate. Basically,

0:10:03.910 --> 0:10:06.400
<v S1>define exactly what you want to do, how it branches

0:10:06.400 --> 0:10:10.510
<v S1>out with problem strategies, most importantly KPIs and promises. So

0:10:10.510 --> 0:10:12.670
<v S1>you could literally say, like I talked about it in

0:10:12.670 --> 0:10:15.610
<v S1>the substrate video, you could say, look, here's how I'm

0:10:15.610 --> 0:10:18.850
<v S1>measuring myself. Here are my assumptions. Here's what I believe

0:10:18.850 --> 0:10:23.080
<v S1>the problems are. Here are my strategies for attacking those problems.

0:10:23.080 --> 0:10:27.070
<v S1>And here are my specific projects that I'm going to

0:10:27.070 --> 0:10:30.400
<v S1>do to implement those strategies. And here's how much it's

0:10:30.400 --> 0:10:34.330
<v S1>going to cost. Here's how I'm measuring myself. And here's

0:10:34.330 --> 0:10:37.840
<v S1>the promise if I don't move these numbers by X amount,

0:10:37.840 --> 0:10:39.970
<v S1>you should fire me in two years or in four

0:10:39.970 --> 0:10:42.790
<v S1>years or whatever the term is. So I think this

0:10:42.790 --> 0:10:48.130
<v S1>is where leadership is heading. Transparent descriptions of vision, strategy, KPIs,

0:10:48.130 --> 0:10:54.280
<v S1>and promises. Sean Ammirati Arathi, a professor at Carnegie Mellon,

0:10:54.309 --> 0:10:58.210
<v S1>noticed a massive up leveling of progress in his entrepreneurship

0:10:58.210 --> 0:11:01.570
<v S1>class this year, thanks to generative AI tools like ChatGPT,

0:11:01.900 --> 0:11:06.520
<v S1>GitHub Copilot, and Flow wise, AI. So they basically the

0:11:06.520 --> 0:11:09.970
<v S1>students use these tools for marketing, coding, product development, and

0:11:09.970 --> 0:11:14.079
<v S1>recruiting early customers. This is what I've been talking about

0:11:14.080 --> 0:11:17.500
<v S1>with AI automation. If you were competing with a 95

0:11:17.530 --> 0:11:20.530
<v S1>out of 100 person before and they were a 95

0:11:20.559 --> 0:11:23.410
<v S1>because they went to CMU, well, now you're competing with

0:11:23.410 --> 0:11:26.350
<v S1>a 130 out of 100 because they went to CMU

0:11:26.350 --> 0:11:29.230
<v S1>and they're using AI for everything. And for me, in

0:11:29.230 --> 0:11:32.050
<v S1>my own life, I read better articles because of AI.

0:11:32.080 --> 0:11:35.110
<v S1>I get better ideas because of AI. Therefore I build

0:11:35.110 --> 0:11:37.990
<v S1>better stuff because of AI. And this is all feeding

0:11:37.990 --> 0:11:41.200
<v S1>on itself, just okay. And this all feeds itself and

0:11:41.200 --> 0:11:44.020
<v S1>just makes that whole top of funnel even better. And

0:11:44.020 --> 0:11:47.980
<v S1>I believe that your options are to upgrade or lose. Basically,

0:11:48.010 --> 0:11:51.540
<v S1>GM is cutting over 1000 software engineers to streamline its

0:11:51.540 --> 0:11:57.300
<v S1>software services organization, streamlining by cutting out 1000 devs. The

0:11:57.300 --> 0:12:00.630
<v S1>way I see this is they're actually just taking everyone

0:12:00.630 --> 0:12:03.210
<v S1>out because they're like, this is a huge waste of time.

0:12:03.210 --> 0:12:06.270
<v S1>We hired a bunch of duds. Let's start from scratch

0:12:06.270 --> 0:12:09.179
<v S1>and only hire the best possible people who are probably

0:12:09.179 --> 0:12:12.300
<v S1>also massively augmented with AI. Yeah, so I got a

0:12:12.300 --> 0:12:15.540
<v S1>whole bunch of other content killer cult members. This is

0:12:15.540 --> 0:12:19.650
<v S1>my new, um, way of framing. This whole thing is like,

0:12:19.650 --> 0:12:23.070
<v S1>killer cult members is what companies are looking for. It's

0:12:23.070 --> 0:12:25.980
<v S1>kind of obvious that this is what startups are looking for,

0:12:25.980 --> 0:12:28.140
<v S1>but I think corporations are going to look for this

0:12:28.140 --> 0:12:31.380
<v S1>as well. So people talk about, oh, toxic work culture.

0:12:31.410 --> 0:12:35.280
<v S1>Guess what? Companies only want employees who have a toxic

0:12:35.280 --> 0:12:38.610
<v S1>work culture. They want people who show up and are like,

0:12:38.610 --> 0:12:42.180
<v S1>I am religiously dedicated to this. I will sleep under

0:12:42.179 --> 0:12:44.550
<v S1>my desk. I will work as much time as I

0:12:44.550 --> 0:12:48.030
<v S1>need to. I am fully dedicated to this mission. I

0:12:48.059 --> 0:12:51.140
<v S1>wear all the swag in public. I talk about this.

0:12:51.140 --> 0:12:53.630
<v S1>I think about this all day long. That is what

0:12:53.630 --> 0:12:56.689
<v S1>people want. That is what hiring managers want. That is

0:12:56.690 --> 0:13:02.449
<v S1>what corporations want. Because that behavior, especially if you're on site,

0:13:02.450 --> 0:13:05.750
<v S1>which everyone is pulling everyone back to be on prem now,

0:13:05.750 --> 0:13:08.780
<v S1>if you are surrounded by other cult members with that

0:13:08.780 --> 0:13:11.900
<v S1>sort of energy and keep in mind, this also has

0:13:11.900 --> 0:13:16.100
<v S1>toxic aspects to it. There are massive downsides to this

0:13:16.100 --> 0:13:20.420
<v S1>type of culture, but the downsides are not to the company. Really.

0:13:20.420 --> 0:13:24.860
<v S1>The downsides are mostly to the employees themselves as a

0:13:24.860 --> 0:13:28.160
<v S1>trade off for doing this much risk to get much,

0:13:28.309 --> 0:13:32.090
<v S1>a lot of a reward, potentially in equity or pay

0:13:32.120 --> 0:13:36.620
<v S1>or whatever. So this culture is good for companies and

0:13:36.620 --> 0:13:39.140
<v S1>that's why they are fostering it. That's why they're saying

0:13:39.140 --> 0:13:42.590
<v S1>you must be on site. So for example, OpenAI OpenAI

0:13:42.620 --> 0:13:44.630
<v S1>requires you to be on site. You have to work

0:13:44.630 --> 0:13:48.290
<v S1>on site. There are some exceptions, very few. But in

0:13:48.290 --> 0:13:52.339
<v S1>general they make a cool office. In a cool city,

0:13:52.340 --> 0:13:55.160
<v S1>they require the best. They hire the best talent. They

0:13:55.160 --> 0:13:59.449
<v S1>basically tell you, or it's naturally implied that you must

0:13:59.450 --> 0:14:02.180
<v S1>be one of these killer cult members. And then that

0:14:02.179 --> 0:14:06.710
<v S1>is that energy which is synergistic with itself, builds and

0:14:06.710 --> 0:14:09.950
<v S1>builds and builds and it produces the best products that

0:14:09.950 --> 0:14:12.860
<v S1>people want to buy, and they get massive valuations. And

0:14:12.890 --> 0:14:18.110
<v S1>that's capitalism and that's what works. What doesn't work is, okay,

0:14:18.110 --> 0:14:21.980
<v S1>let's not do that. Let's have a company culture that's

0:14:21.980 --> 0:14:25.880
<v S1>good for employees, and let's make sure there's work life balance.

0:14:25.880 --> 0:14:29.180
<v S1>And let's make sure, like all these different policies and

0:14:29.180 --> 0:14:32.720
<v S1>it ends up with a lot of people not working,

0:14:32.720 --> 0:14:36.140
<v S1>not doing good work. It ends up with A-players hiring

0:14:36.170 --> 0:14:40.070
<v S1>B players. B players hire C players. You end up

0:14:40.070 --> 0:14:43.130
<v S1>with a bunch of C players, a few B's and

0:14:43.280 --> 0:14:46.790
<v S1>and some D players. And after a while, the corporation

0:14:46.790 --> 0:14:52.090
<v S1>looks at their entire headcount, spend their entire human resources spend,

0:14:52.090 --> 0:14:54.370
<v S1>and they're like, I'm not getting value from this. This

0:14:54.370 --> 0:14:59.140
<v S1>is not worth it. And they just fire everyone. And

0:14:59.140 --> 0:15:02.110
<v S1>so when I see somebody as firing a thousand software

0:15:02.110 --> 0:15:05.170
<v S1>engineers to streamline the software, I don't know exactly what's

0:15:05.170 --> 0:15:08.200
<v S1>happening here. I'm arguing that what I'm saying might be

0:15:08.230 --> 0:15:11.890
<v S1>happening here is definitely happening all over the place. And

0:15:11.890 --> 0:15:14.170
<v S1>what they'll do is they'll go to zero. Then they'll

0:15:14.170 --> 0:15:18.220
<v S1>find a players who are killer cult members and only

0:15:18.220 --> 0:15:20.860
<v S1>hire those from now on. And here's the crazy part.

0:15:20.890 --> 0:15:25.060
<v S1>100 of those people might be worth 1000 or 10,000

0:15:25.090 --> 0:15:28.630
<v S1>C players who are interested in work life balance. Again,

0:15:28.630 --> 0:15:32.050
<v S1>I'm not saying anything about like the total value of

0:15:32.050 --> 0:15:35.350
<v S1>on society of work life balance or all of that.

0:15:35.350 --> 0:15:39.040
<v S1>I've got completely separate ideas about that. Actually, in the

0:15:39.040 --> 0:15:41.500
<v S1>essay We were Lied To About Work, you can see

0:15:41.500 --> 0:15:44.200
<v S1>what I really feel about that whole thing. The point is,

0:15:44.200 --> 0:15:46.690
<v S1>this is what companies are looking for. If you want

0:15:46.720 --> 0:15:49.330
<v S1>to get hired, you have the answer. Meta is using

0:15:49.360 --> 0:15:52.870
<v S1>AI to streamline system reliability investigations with a new root

0:15:52.870 --> 0:15:57.010
<v S1>cause analysis system. System combines heuristic based retrieval and large

0:15:57.010 --> 0:16:02.230
<v S1>language model based ranking, achieving 42% accuracy in identifying root

0:16:02.230 --> 0:16:06.310
<v S1>causes at the investigation start. I didn't look to see

0:16:06.340 --> 0:16:09.250
<v S1>or it wasn't in there, how that compares to humans

0:16:09.250 --> 0:16:12.760
<v S1>and how fast, because that's the trick. Accuracy and speed

0:16:12.760 --> 0:16:15.940
<v S1>and of course, cost. AI companies are shifting focus from

0:16:15.940 --> 0:16:20.410
<v S1>creating godlike AI to building practical products. Who knew? So

0:16:20.410 --> 0:16:22.150
<v S1>I don't think this is a bubble pop. I think

0:16:22.150 --> 0:16:25.240
<v S1>it's a natural maturity of brand new tech that just

0:16:25.240 --> 0:16:28.120
<v S1>came out. And because people are still figuring this stuff

0:16:28.120 --> 0:16:31.120
<v S1>out and it's basically day one, like AI hasn't even

0:16:31.120 --> 0:16:34.330
<v S1>gotten good yet, hasn't even started to get good. Canada

0:16:34.330 --> 0:16:40.690
<v S1>is slapping 100% import tariffs on Chinese electric vehicles starting

0:16:40.720 --> 0:16:44.290
<v S1>October 1st. We were just talking about that, former Google

0:16:44.320 --> 0:16:48.330
<v S1>CEO Eric Schmidt predicts rapid advancements in AI, the potential

0:16:48.330 --> 0:16:52.530
<v S1>to create significant apps like TikTok competitors in minutes within

0:16:52.530 --> 0:16:55.140
<v S1>the next few years. I know what he's saying, but

0:16:55.140 --> 0:16:57.840
<v S1>there's a difference between being able to run an app

0:16:57.840 --> 0:17:01.590
<v S1>at scale versus creating it. Of course, he knows that

0:17:01.590 --> 0:17:04.950
<v S1>better than I do or most people. But important point

0:17:04.950 --> 0:17:09.149
<v S1>to make. Claude 3.5 can now create icalendar files from images.

0:17:09.150 --> 0:17:12.750
<v S1>And this guy Greg's ramblings shows how you can use

0:17:12.750 --> 0:17:16.440
<v S1>this feature to generate calendar entries by snapping a photo

0:17:16.440 --> 0:17:20.940
<v S1>of a schedule or event flyer. AWS CEO Adam Selipsky

0:17:20.970 --> 0:17:25.110
<v S1>predicts that within the next 24 months, most developers might

0:17:25.109 --> 0:17:28.890
<v S1>not be coding anymore due to AI advancements. He says

0:17:28.890 --> 0:17:33.300
<v S1>the real skill shift will be towards innovation and understanding

0:17:33.300 --> 0:17:39.030
<v S1>customer needs, rather than writing code. 100% agree. Although most

0:17:39.030 --> 0:17:42.480
<v S1>developers in 24 months. This is the type of thing

0:17:42.480 --> 0:17:45.920
<v S1>where it happens way slower than you think and then

0:17:45.950 --> 0:17:48.800
<v S1>way faster than you think. At the same time, most

0:17:48.800 --> 0:17:52.520
<v S1>in 24 months will not be coding anymore. That is

0:17:52.520 --> 0:17:56.750
<v S1>wildly off, wildly off. But people who are like, oh,

0:17:56.780 --> 0:17:58.730
<v S1>it's going to take forever. It's going to take, you know,

0:17:58.760 --> 0:18:01.939
<v S1>three years, five years, seven years, ten years, 15 years.

0:18:01.940 --> 0:18:05.659
<v S1>They are also wildly off. Chinese companies have ramped up

0:18:05.660 --> 0:18:11.659
<v S1>imports of chip production equipment. $26 billion in the first

0:18:11.660 --> 0:18:15.530
<v S1>seven months of 24 on chip production equipment. They need

0:18:15.530 --> 0:18:20.240
<v S1>to equip 18 new fabs expected to start operations in 24.

0:18:20.570 --> 0:18:23.120
<v S1>There they are all in on this because they see

0:18:23.119 --> 0:18:26.359
<v S1>the world turning against them and shutting them down. So

0:18:26.359 --> 0:18:29.570
<v S1>they are all in on this. Question is, can they actually,

0:18:29.570 --> 0:18:33.560
<v S1>even with all the fabs and the equipment and you know,

0:18:33.590 --> 0:18:36.500
<v S1>the factories, do they have the know how to actually

0:18:36.500 --> 0:18:40.580
<v S1>do what TSMC is doing? My current understanding from current

0:18:40.609 --> 0:18:44.300
<v S1>knowledge combined with like the book chip. Wars know they

0:18:44.300 --> 0:18:47.230
<v S1>don't have the knowledge. So even with all that stuff,

0:18:47.230 --> 0:18:50.170
<v S1>it's not going to be enough humans. Cisco's laying off 7%

0:18:50.170 --> 0:18:54.040
<v S1>of its workforce, around 6000 employees, as it pivots towards

0:18:54.070 --> 0:18:58.209
<v S1>AI and cybersecurity. McKinsey's new study reveals that business leaders

0:18:58.210 --> 0:19:01.810
<v S1>are missing the mark on why employees are quitting. They

0:19:01.840 --> 0:19:06.970
<v S1>say companies are focusing on transactional perks like compensation and flexibility,

0:19:06.970 --> 0:19:12.909
<v S1>but employees are actually seeking meaning. Belonging, holistic care and

0:19:12.910 --> 0:19:17.230
<v S1>appreciation at work couldn't have been better timed with this

0:19:17.230 --> 0:19:22.660
<v S1>week's essay. 24 brain samples collected in early 2424 brain

0:19:22.660 --> 0:19:29.440
<v S1>samples in 2024 measured an average of 0.5% plastic by weight.

0:19:29.440 --> 0:19:33.490
<v S1>So a brain is multiple pounds. Is it £3 or £2?

0:19:33.490 --> 0:19:37.030
<v S1>I can't remember 3 or £4, something like that. It's heavy.

0:19:37.060 --> 0:19:39.430
<v S1>You can feel this thing right. You can feel your head.

0:19:39.430 --> 0:19:43.930
<v S1>It's heavy. Half a percent. Okay. Let's just type in here.

0:19:43.960 --> 0:19:46.120
<v S1>What's the half a percent of the weight of an

0:19:46.119 --> 0:19:49.990
<v S1>average brain? 1400 grams. Half a percent of that would

0:19:49.990 --> 0:19:56.230
<v S1>be seven grams. Okay, I put 25g of coffee to

0:19:56.260 --> 0:20:00.070
<v S1>brew when I brew coffee, so I roughly feel like

0:20:00.070 --> 0:20:02.950
<v S1>I know that's about a third, about a third of

0:20:02.950 --> 0:20:05.020
<v S1>the amount of coffee that I drink in the morning,

0:20:05.020 --> 0:20:08.260
<v S1>which when you brew that thing, I mean, it's significant.

0:20:08.260 --> 0:20:13.750
<v S1>This is a extremely non-trivial amount of plastic sitting inside

0:20:13.750 --> 0:20:16.480
<v S1>the brain. And there's a lot of speculation right now

0:20:16.480 --> 0:20:20.080
<v S1>which I don't put too much weight into it because,

0:20:20.109 --> 0:20:22.420
<v S1>you know, how we have these health scares and like, oh,

0:20:22.450 --> 0:20:25.690
<v S1>this is dangerous and that's dangerous or whatever, but seven

0:20:25.690 --> 0:20:31.510
<v S1>grams of plastic, seven grams of plastic sitting inside of

0:20:31.510 --> 0:20:34.869
<v S1>a brain. I feel like it can't be good unless

0:20:34.869 --> 0:20:38.199
<v S1>it's like alien plastic that is like nanobots or something.

0:20:38.200 --> 0:20:41.260
<v S1>But no, this is regular plastic. The question is, where

0:20:41.290 --> 0:20:43.210
<v S1>is it coming from? How is it getting in there?

0:20:43.210 --> 0:20:45.660
<v S1>Is it in all our foods? Is it in? Is

0:20:45.660 --> 0:20:48.600
<v S1>it from drinking bottles? I'm actually super concerned about this

0:20:48.600 --> 0:20:51.510
<v S1>because I drink energy drinks. That's why I switched to

0:20:51.540 --> 0:20:53.609
<v S1>these because I think they have less. But I saw

0:20:53.609 --> 0:20:58.140
<v S1>a crazy report about my favorite not energy drinks, protein drinks,

0:20:58.140 --> 0:21:01.740
<v S1>my favorite protein drink. It was the core power one,

0:21:01.740 --> 0:21:04.560
<v S1>and it was like off the charts in the amount

0:21:04.590 --> 0:21:07.350
<v S1>of plastic, according to this one lab that ran it.

0:21:07.380 --> 0:21:10.050
<v S1>Now who knows, maybe that lab was run by the

0:21:10.050 --> 0:21:12.930
<v S1>counter product, which I went out and bought, by the way,

0:21:12.930 --> 0:21:16.800
<v S1>which is Muscle Milk. Anyway, I'm concerned about this, but

0:21:16.800 --> 0:21:19.170
<v S1>I'm also cautious because you don't know if it's a

0:21:19.170 --> 0:21:23.130
<v S1>scare turn. That plastic could be completely inert and doesn't

0:21:23.130 --> 0:21:26.580
<v S1>even matter. And the thing that's causing all this cancer

0:21:26.580 --> 0:21:30.630
<v S1>and drop in testosterone and all this stuff is actually

0:21:30.630 --> 0:21:32.880
<v S1>something else could be our sun exposure, which I think

0:21:32.880 --> 0:21:36.120
<v S1>is a later. Yeah, it's two stories down. So anyway,

0:21:36.119 --> 0:21:39.450
<v S1>lots of plastic in our brains. Not sure what that

0:21:39.450 --> 0:21:43.820
<v S1>means yet. Gallup has released its 2023 Global Emotions Report,

0:21:43.820 --> 0:21:48.139
<v S1>which measures the world's emotional temperature through Positive Experience Index.

0:21:48.140 --> 0:21:50.959
<v S1>I'm opening this one because it is cool. Look at this.

0:21:50.990 --> 0:21:55.159
<v S1>Experienced anger. Look at this. You got these country breakdowns.

0:21:55.160 --> 0:21:57.650
<v S1>This is really cool. Then you got map views. And

0:21:57.650 --> 0:22:00.050
<v S1>look at this. The map view. You can click on

0:22:00.050 --> 0:22:04.340
<v S1>sadness stress worry pain. Look at that enjoyment. Okay. So

0:22:04.369 --> 0:22:10.460
<v S1>dark is better right? Um. China. Yes. Yeah. Dark. Yep. Okay,

0:22:10.490 --> 0:22:14.720
<v S1>so let's look at a light one. Yes, 100% dark

0:22:14.720 --> 0:22:19.369
<v S1>is better because that's Afghanistan. Two thirds. No. Oh, man,

0:22:19.400 --> 0:22:24.379
<v S1>that's so depressing. Afghanistan has two thirds. No enjoyment. However,

0:22:24.380 --> 0:22:28.040
<v S1>the question was asked another one with that similar color turkey.

0:22:28.070 --> 0:22:31.310
<v S1>I'm pretty sure that's turkey. Yeah that's Turkey. Uh, Ukraine.

0:22:31.310 --> 0:22:35.570
<v S1>About half, for obvious reasons. What's this one? Morocco half.

0:22:35.600 --> 0:22:40.340
<v S1>What's this one? Tunisia half. Roughly super happy over here.

0:22:40.340 --> 0:22:45.950
<v S1>Somalia Why is Somalia so happy? See, I love these visualizations. Uzbekistan.

0:22:45.950 --> 0:22:48.889
<v S1>Very happy. What have we got over here? Norway is happy.

0:22:48.920 --> 0:22:51.439
<v S1>Keep forgetting. Norway is the far left one. What is

0:22:51.440 --> 0:22:54.980
<v S1>this one? Over here. Estonia. Very happy. Iceland could have

0:22:54.980 --> 0:23:00.050
<v S1>guessed that one. Mexico! Kicking total ass over here. Ireland seems. Yeah.

0:23:00.080 --> 0:23:04.850
<v S1>Really happy. UK? Not so much. What are these? Over here. Indonesia. Indonesia.

0:23:04.880 --> 0:23:10.850
<v S1>Very happy. Malaysia. Very happy anyway. Really cool stuff. Russian Federation.

0:23:10.880 --> 0:23:14.000
<v S1>Pretty low score, actually. I think this is the actual

0:23:14.000 --> 0:23:18.199
<v S1>worst one. Afghanistan, which is totally explainable. I mean, so

0:23:18.200 --> 0:23:20.780
<v S1>this is one experience I've been having is I take

0:23:20.780 --> 0:23:23.149
<v S1>a decent number of Ubers and I always talk to

0:23:23.180 --> 0:23:26.750
<v S1>the person in the Bay area. My chances of getting

0:23:26.750 --> 0:23:31.190
<v S1>someone from Afghanistan are like 80, 90%, and I talk

0:23:31.190 --> 0:23:33.800
<v S1>to every single one of them for the entire duration

0:23:33.800 --> 0:23:38.270
<v S1>of the trip. Usually they are interpreters, don't say translator,

0:23:38.270 --> 0:23:41.800
<v S1>they are interpreters. And oftentimes they worked for the US government,

0:23:41.800 --> 0:23:43.900
<v S1>which is how they got their visa to come over

0:23:43.900 --> 0:23:47.710
<v S1>here and what they have back home if they brought

0:23:47.710 --> 0:23:51.100
<v S1>their family. Most likely they didn't bring most of their family.

0:23:51.100 --> 0:23:54.340
<v S1>Their family is in danger because they are here and

0:23:54.340 --> 0:23:58.000
<v S1>it is going from bad to worse every single week.

0:23:58.030 --> 0:24:01.960
<v S1>It just is so, so depressing. Anyway, I'm on a

0:24:01.960 --> 0:24:05.770
<v S1>tangent here, but talk to your Uber drivers. Experience life

0:24:05.770 --> 0:24:11.080
<v S1>through other people's eyes. Okay. Um. Data from surveys conducted

0:24:11.080 --> 0:24:14.530
<v S1>in 142 countries. Mix of telephone, face to face, and

0:24:14.530 --> 0:24:18.400
<v S1>some web stuff. About a thousand respondents per country. So

0:24:18.400 --> 0:24:20.950
<v S1>that's not great, but I'm sure they're doing it scientifically,

0:24:20.950 --> 0:24:25.270
<v S1>so it's a decent sample. Non-smokers who avoided the sun

0:24:25.300 --> 0:24:28.870
<v S1>had a life expectancy similar to smokers who got the

0:24:28.869 --> 0:24:33.639
<v S1>most sun. And this is nearly 30,000 Swedish women over

0:24:33.640 --> 0:24:37.960
<v S1>20 years, by the way, Scandinavia likes to smoke. And Germany,

0:24:37.960 --> 0:24:40.169
<v S1>this is the worst thing about Europe and there are

0:24:40.170 --> 0:24:42.720
<v S1>many things competing for that right now. But one of

0:24:42.720 --> 0:24:45.240
<v S1>the worst things about Europe is smoking. I go there

0:24:45.240 --> 0:24:48.960
<v S1>and I'm just like, what is going on? Yeah, Switzerland.

0:24:48.990 --> 0:24:50.820
<v S1>I'm about to go back to Switzerland and everyone's going

0:24:50.850 --> 0:24:54.630
<v S1>to be smoking inside the restaurant. They're smoking inside the restaurant.

0:24:54.630 --> 0:24:56.820
<v S1>I'm just like, what is? I thought you guys were

0:24:56.820 --> 0:25:00.300
<v S1>the advanced group. Okay, the research suggests that avoiding sun

0:25:00.300 --> 0:25:04.919
<v S1>is as risky as smoking. So this needs more research, obviously.

0:25:04.920 --> 0:25:07.860
<v S1>But like I said, damn. For me, I get sun

0:25:07.859 --> 0:25:10.680
<v S1>in the morning. I got decent amount of sun this morning.

0:25:10.680 --> 0:25:14.550
<v S1>Massive boost for me. I put on the Waking Up app,

0:25:14.580 --> 0:25:17.550
<v S1>listen to Sam do a ten minute thing. I do

0:25:17.550 --> 0:25:19.860
<v S1>a ten minute walk out there, I do some breathing.

0:25:19.859 --> 0:25:22.170
<v S1>That's how I start my day. When I'm on a routine,

0:25:22.170 --> 0:25:27.270
<v S1>which I should be and often not enough. But I

0:25:27.270 --> 0:25:29.850
<v S1>did today and yesterday and the day before. All right.

0:25:29.880 --> 0:25:34.080
<v S1>Stanford researchers have found that blocking the whatever blocking this

0:25:34.080 --> 0:25:38.129
<v S1>pathway in the brain can reverse the metabolic disruptions caused

0:25:38.130 --> 0:25:43.770
<v S1>by Alzheimer's disease, providing cognitive improving cognitive functions in mice.

0:25:43.770 --> 0:25:46.920
<v S1>I'm starting to feel like we're about to make massive

0:25:46.920 --> 0:25:51.090
<v S1>progress on both Alzheimer's and cancer. And honestly, it's making

0:25:51.090 --> 0:25:52.980
<v S1>me want to invest in like, the top three drug

0:25:52.980 --> 0:25:56.250
<v S1>companies I think I'm already in, which one I'm in,

0:25:56.250 --> 0:25:58.620
<v S1>the one that does wegovy. I can't remember the name

0:25:58.619 --> 0:26:01.710
<v S1>of that one. I'll think of it. It's right there.

0:26:01.710 --> 0:26:04.440
<v S1>It's right there. Anyway, I'm in that one, but I'm

0:26:04.440 --> 0:26:07.139
<v S1>not in the Lilly one. I think they're called Eli Lilly,

0:26:07.140 --> 0:26:10.080
<v S1>but they're rebranding to Lilly. So I figure if I

0:26:10.080 --> 0:26:13.470
<v S1>get into the two top competitors, one or both of

0:26:13.470 --> 0:26:16.859
<v S1>them is going to do something with Alzheimer's and cancer,

0:26:16.859 --> 0:26:21.899
<v S1>and we're already doing it with obesity. Like, that's a trifecta.

0:26:21.930 --> 0:26:25.560
<v S1>All we need now is like balding and aging. Aging

0:26:25.560 --> 0:26:27.600
<v S1>is the big one. Aging and cancer I would say,

0:26:27.630 --> 0:26:31.800
<v S1>are the big ones. And then like obesity and hair loss,

0:26:31.800 --> 0:26:36.300
<v S1>that's amazing. So not investment advice, but I'm damn sure

0:26:36.330 --> 0:26:40.760
<v S1>getting in all right. Air purifiers and two Helsinki daycare

0:26:40.760 --> 0:26:45.440
<v S1>centers reduced sick kids days by 30%. And I don't

0:26:45.440 --> 0:26:48.590
<v S1>think this is Covid or flu or anything. I think

0:26:48.590 --> 0:26:51.290
<v S1>they're just talking about like, all cause based on the

0:26:51.290 --> 0:26:55.129
<v S1>the study parts that I saw, University of Missouri scientists

0:26:55.130 --> 0:26:59.240
<v S1>have developed a liquid based solution that removes over 98%

0:26:59.240 --> 0:27:02.930
<v S1>of nanoplastics from water. It uses water repelling solvents to

0:27:02.960 --> 0:27:07.189
<v S1>absorb the particles, which are easily separated and removed. I

0:27:07.190 --> 0:27:09.110
<v S1>assume you just run it through a filter and those

0:27:09.109 --> 0:27:12.500
<v S1>things will be big globs, and they get stopped behind

0:27:12.500 --> 0:27:14.990
<v S1>by the filter. I expect to see lots more of this.

0:27:15.020 --> 0:27:17.930
<v S1>Can't wait for the Huberman episode on this, because I

0:27:17.930 --> 0:27:20.930
<v S1>wanted to be able to do this like cheaply at home.

0:27:20.930 --> 0:27:25.070
<v S1>Eli Lilly's weight loss drug Tripeptide, found in Zepp bound

0:27:25.100 --> 0:27:29.359
<v S1>and Manjaro, reduced the risk of developing type two diabetes

0:27:29.359 --> 0:27:36.649
<v S1>by 94% in obese or overweight adults with pre-diabetes, 94%.

0:27:36.650 --> 0:27:39.790
<v S1>And Apple Podcasts is losing ground to YouTube and Spotify,

0:27:39.790 --> 0:27:46.600
<v S1>so recent study put YouTube at 31%, Spotify at 21%,

0:27:46.600 --> 0:27:50.050
<v S1>and Apple Podcasts at 12. I don't do Spotify, I

0:27:50.050 --> 0:27:54.699
<v S1>do YouTube and Apple Podcasts. But all right. Ideas thought

0:27:54.700 --> 0:27:58.360
<v S1>of a cool idea for fabric, Telos and substrate. Maintain

0:27:58.359 --> 0:28:01.210
<v S1>a list of everything I've been really, really wrong about,

0:28:01.210 --> 0:28:03.429
<v S1>which I'm already building that list, and then write a

0:28:03.430 --> 0:28:06.070
<v S1>fabric pattern that looks at that list. By the way,

0:28:06.070 --> 0:28:07.600
<v S1>I'm just going to have it all in my same

0:28:07.630 --> 0:28:10.600
<v S1>Telos file as I talked about in the augmented course,

0:28:10.600 --> 0:28:15.700
<v S1>but basically just have this list listed as all my

0:28:15.700 --> 0:28:20.200
<v S1>biggest mistakes, cognitive errors or whatever listed in there. Then

0:28:20.200 --> 0:28:23.290
<v S1>I also have another section inside that same telos file,

0:28:23.290 --> 0:28:25.869
<v S1>which is like all my current beliefs, my model of

0:28:25.869 --> 0:28:29.470
<v S1>the world. And then the fabric pattern basically says evaluate

0:28:29.470 --> 0:28:33.160
<v S1>all my current beliefs, look at all the previous mistakes

0:28:33.160 --> 0:28:36.670
<v S1>that I've made and look for patterns. Look for what

0:28:36.670 --> 0:28:40.390
<v S1>in my current beliefs might be broken in a similar

0:28:40.390 --> 0:28:43.030
<v S1>way as the way I was wrong about the other things.

0:28:43.030 --> 0:28:45.670
<v S1>First of all, it's going to help me diagnose why

0:28:45.670 --> 0:28:47.979
<v S1>I was wrong. Is it because I have a bias

0:28:47.980 --> 0:28:50.590
<v S1>towards this one thing? It's like, oh, you're so pro

0:28:50.590 --> 0:28:54.070
<v S1>I that you are wrong about this thing because you

0:28:54.070 --> 0:28:57.790
<v S1>thought technology was a solution. So turns out your bias

0:28:57.820 --> 0:29:00.670
<v S1>is thinking that tech can solve too many things in

0:29:00.670 --> 0:29:03.970
<v S1>the world. Good to know. Like I'm already actively defending

0:29:03.970 --> 0:29:07.270
<v S1>against that bias because I know it's there doesn't mean

0:29:07.270 --> 0:29:11.470
<v S1>I'm properly or adequately defending against that bias. Point is,

0:29:11.470 --> 0:29:13.150
<v S1>I want to see the bias. I want to see

0:29:13.150 --> 0:29:15.310
<v S1>it call me out on it, and look for other

0:29:15.310 --> 0:29:18.370
<v S1>evidence that my current beliefs are broken because of that problem.

0:29:18.370 --> 0:29:22.450
<v S1>Discovery Foof I uses fluff and I yeah, that's why

0:29:22.450 --> 0:29:25.720
<v S1>it's called that. To find more web hacking targets by

0:29:25.720 --> 0:29:30.700
<v S1>Joseph Thacker, go fuzzy recursively looks at JavaScript files and

0:29:30.700 --> 0:29:34.510
<v S1>finds endpoints that can be tested. Analyze interviewer techniques is

0:29:34.510 --> 0:29:37.660
<v S1>a new fabric pattern that will capture the je NE

0:29:37.660 --> 0:29:40.030
<v S1>sais quoi. By the way, this is spelled wrong. I

0:29:40.030 --> 0:29:42.550
<v S1>don't know why I didn't just use AI and spell

0:29:42.550 --> 0:29:46.480
<v S1>this correctly. I always spell this wrong. N e is

0:29:46.510 --> 0:29:49.690
<v S1>n a I s and I think the other one

0:29:49.690 --> 0:29:52.090
<v S1>might even be wrong too, but I think the quote

0:29:52.120 --> 0:29:55.660
<v S1>is right. But anyway, it's n I s I believe

0:29:55.690 --> 0:29:58.600
<v S1>is how you spell that. Anyway, someone French gave me

0:29:58.600 --> 0:30:02.290
<v S1>the proper spelling again, more I should be part of this. Um,

0:30:02.680 --> 0:30:05.590
<v S1>back to my bias of I being the problem, but

0:30:05.590 --> 0:30:10.780
<v S1>I've been using this on Dwarkesh and Tyler Cowen content

0:30:10.780 --> 0:30:14.680
<v S1>analyze interviewer techniques. It basically finds out, it figures out

0:30:14.680 --> 0:30:17.470
<v S1>and tells you why they're such a good interviewer. Harness

0:30:17.500 --> 0:30:19.420
<v S1>is a quick tool I put together to test the

0:30:19.420 --> 0:30:22.480
<v S1>efficacy of one prompt versus another. It runs both against

0:30:22.480 --> 0:30:25.450
<v S1>an input and then scores the output according to a

0:30:25.450 --> 0:30:28.989
<v S1>third objective prompt that rates how well they followed the

0:30:28.990 --> 0:30:33.880
<v S1>plot and actually executed the instructions of what the prompts

0:30:33.880 --> 0:30:37.340
<v S1>were trying to do in the first place. So super useful.

0:30:37.370 --> 0:30:40.700
<v S1>Stayed in time are the same thing by Hillel Wayne.

0:30:40.700 --> 0:30:43.190
<v S1>Don't Force yourself to become a bug bounty Hunter by

0:30:43.190 --> 0:30:47.540
<v S1>Sam Curry 67 years of RadioShack catalogs have been scanned

0:30:47.540 --> 0:30:51.290
<v S1>and are now available online. MD RSS is a go

0:30:51.320 --> 0:30:54.470
<v S1>based tool that converts markdown files to RSS feeds. You

0:30:54.470 --> 0:30:57.380
<v S1>can write articles in a local folder, and it automatically

0:30:57.380 --> 0:31:01.340
<v S1>formats them into an RSS compliant XML file. Super cool.

0:31:01.340 --> 0:31:05.480
<v S1>No hello, no quick call, no meetings without an agenda.

0:31:05.510 --> 0:31:08.870
<v S1>You already know that's good. Roger Penrose's book, The Emperor's

0:31:08.900 --> 0:31:12.560
<v S1>New Mind explores the relationship between the human mind and computers,

0:31:12.560 --> 0:31:16.880
<v S1>arguing that human consciousness cannot be replicated by machines. I

0:31:16.880 --> 0:31:18.740
<v S1>have the opposite view, which is why I'm going to

0:31:18.740 --> 0:31:21.650
<v S1>read this book collection of free public APIs that are

0:31:21.650 --> 0:31:24.020
<v S1>tested daily and the recommendation of the week. Take the

0:31:24.020 --> 0:31:26.930
<v S1>time to read this week's main essay. We've been lied

0:31:26.960 --> 0:31:28.970
<v S1>to about work, but more than just read it. Think

0:31:28.970 --> 0:31:31.040
<v S1>about what it means. If I am right, think about

0:31:31.040 --> 0:31:33.290
<v S1>what that means for you and your career, but also

0:31:33.290 --> 0:31:35.630
<v S1>all the young people you know and care about. So

0:31:35.630 --> 0:31:37.670
<v S1>I didn't talk about the solution in the piece, but

0:31:37.670 --> 0:31:40.460
<v S1>it's essentially human 3.0, and I'm going to be talking

0:31:40.460 --> 0:31:43.310
<v S1>a lot more about that, but start thinking about it now.

0:31:43.340 --> 0:31:46.640
<v S1>Definitely recommend that you read this piece. And the aphorism

0:31:46.640 --> 0:31:49.070
<v S1>of the week to fear love is to fear life,

0:31:49.070 --> 0:31:52.970
<v S1>and those who fear life are already three parts dead

0:31:53.000 --> 0:31:55.430
<v S1>to fear. Love is to fear life, and those who

0:31:55.430 --> 0:32:01.280
<v S1>fear life are already three parts dead. Bertrand Russell. Unsupervised

0:32:01.280 --> 0:32:03.800
<v S1>learning is produced and edited by Daniel Miessler on a

0:32:03.800 --> 0:32:08.660
<v S1>Neumann U87 AI microphone using Hindenburg. Intro and outro music

0:32:08.660 --> 0:32:11.750
<v S1>is by zombie with a Y. And to get the

0:32:11.750 --> 0:32:13.820
<v S1>text and links from this episode, sign up for the

0:32:13.820 --> 0:32:19.459
<v S1>newsletter version of the show at Daniel miessler.com/newsletter. We'll see

0:32:19.460 --> 0:32:20.240
<v S1>you next time.