Speaker 1: Hey, everybody, welcome to It Could Happen Here. I am Robert Evans, and this is the show where we talk about how everything is kind of falling apart and how we might put it back together again in a way that works better than it did before, or do something different entirely. Anyway, whatever. It's a show about the future and about the messed-up present. And as a result of that, one of the things we talk about a lot is self-sufficiency. We've had a number of episodes covering the value of things like replacing your lawn with food, guerrilla gardening, that sort of stuff. And one of the critiques we get is people saying, well, you know, that's never going to work on a large scale, it's never going to replace industrial agriculture or whatever. And that's perfectly true.
But the point we're going for here, and why we encourage these kinds of resilience-building activities, is that they do improve the ability of communities to resist when they need to resist, and they also provide opportunities for people to reimagine their relationship to, for example, the food supply chain, or their relationship to their community and the kinds of things communities can provide for each other rather than having them shipped in by Amazon. And when we start talking about that, about improving community resiliency for things like, you know, a general strike or even potentially more radical stuff, one of the big issues any community has to confront is not just food but medicine. I do, and I'm sure a lot of other people do, have friends who cannot survive without medications that are very reliant upon existing supply chains, and to some extent even the stability of the government: getting your insulin, getting your medication for whatever kind of disease you have that needs constant medication.
There's a bunch of different reasons why people are reliant upon medical supply lines and upon the pharmaceutical industry, and when we talk about building more resilient communities, that's one of the big hurdles to jump. Well, today my guest is someone who is working on bridging some of these problems. His name is Michael Laufer, and he is the founder of an organization called the Four Thieves Vinegar Collective. They are biohackers, and they are working on cracking certain pharmaceutical medications to allow individuals, with resources that are generally available to people who are not rich or pharmaceutical companies, to produce life-saving medications. The number one thing you would have heard of from Four Thieves is the EpiPencil, which we'll talk about in a bit. But first, Michael, thank you for coming on the show.

Thanks so much for having me. It's exciting to be able to chat and talk with you and all the people surrounding you who are trying to just unfuck things a little bit.
Yeah, yeah. Most of the conversation I want to have today is of the unfucking-of-things variety, but I do think we should start with a little technical talk first. Can you give people an idea of what kinds of medications you and other people in the collective have figured out how to produce, and what kind of resources an individual needs to be able to do some of this stuff?

Sure. So, from a technical perspective, most of the things we focus on are what's called small-molecule chemistry. To describe that blanketly: if you can draw the molecule on a cocktail napkin, it probably qualifies as a small molecule. If it's one of those things where the diagram for the molecule you're approaching has big colored ribbons and stuff, that's a biologic, and that's a whole different set of problems. Now, the main forays that we've had have been surrounding access to abortion, access to HIV medications, access to hepatitis C medications, and access to drug-overdose-reversal medications.
So that's been sort of our main focus, but there have been a handful of others. What we tend to look for is: where are there things where there's a great need and there's a huge barrier? And you see those two together a lot, because the three main barriers that tend to pop up between somebody and access to the medication they need are price, legality, or lack of infrastructure. And typically the weirdness that comes up mostly surrounds price, because of intellectual property laws and the marginalization of people who suffer from particular ailments, or seem to suffer predominantly from particular ailments. And so if you're poor, and you're in a class of people that is seen as something not to be cared about because they're not a strong voter base, then the ability to move access away from those people, put in more barriers, and raise prices becomes easier to defend. So the first drug we focused on was an antiparasitic. Toxoplasmosis is a parasite that's pretty innocuous for most people.

That's the one you get from cats, right? Or is that not gondii?

It is.
It is the one you get from cats, and it's a really fascinating parasite, too, if you ever dig into the behavioral biology of it.

I probably have it. Yeah, I have three cats. I definitely have it.

So it's not a big deal for those people, but it is if you have a massively compromised immune system, especially people with HIV or advanced stages of cancer. That's why it got labeled as sort of an HIV drug. It's not; it's an antiparasitic, but it's used almost exclusively by people who are in advanced stages of cancer, people with fairly compromised immune systems from HIV or something else, and then pregnant women. And it's not that big a deal if you have access to the medication: you can merely take it, and it eradicates the parasite from the body. The thing is, it's a short course of treatment. You take, I think, four doses the first time around, and then one dose each day subsequently for something like ten days.
And that's not a big deal when each dose is about thirteen and a half dollars. But then Martin Shkreli jacked the price up to seven hundred fifty dollars a pill, and so we were like, well, this is ridiculous. So that was the first one we went after. Then, of course, access to abortion drugs; that's a big one, and pretty topical lately. We released a video maybe three months ago on how you can make your own abortion pills without too much fuss.

This would be mifepristone, right?

Mifepristone and misoprostol. You can do it with just misoprostol, or you can do it in combination. When you do it with just miso, you have about an eighty-five percent chance of it working, and if you have both, that bumps the success rate up further.

And when you're doing this, and we'll talk a little bit about the hardware, but what is the reagent that you have for this?
Because I know that's been a big part of some of the discussions: how do you get the things you make the medicines from, which is easier for some than for others.

Sure. There are a couple of different ways you can go about that. The interesting but more difficult way, of course, is to do the chemistry from scratch, where, like you say, you get access to the reagents, you do some chemistry, and you end up with the active pharmaceutical ingredient, which we lovingly refer to as the API, and then you package it somehow into a tablet or a pill or some other means of ingress into the body. The instructions that we distributed skip the difficult part, because misoprostol is an ulcer medication. So, for instance, if you have access to Mexico or are in Mexico, it's kind of not a big deal, because as an ulcer medication it's over the counter, and you can just go in and say, oh, you know, my grandmother can't get out of bed, she needs this ulcer medication, I need just a little bit of it to get her through the weekend.
And then, no problem. Not so easy in places where it's a little more controlled, like the US. However, one amazing trick when looking for access to medicines that are generally blocked from people the existing power structure tries to disenfranchise is to look and see if the same medicine is used for other classes of person, or being, that the infrastructure does care about. So, interestingly, you look at ulcer medication and you say, well, who else has ulcers? For people who might be thought of as important people, it doesn't really come up; there are other ulcer medications that are a little bit better. However, there are a lot of really wealthy people in the United States, and really wealthy people tend to keep horses, and, interestingly, some ungodly percentage of domesticated horses have ulcers. Now, why that is I'm not entirely clear about, but my own theory is that it has something to do with taking a gigantic wild animal and putting it into a very small box for most of its life.
Yeah, it doesn't seem like the thing horses evolved to do.

Yeah. So, that said, people who are horse owners typically have to treat them constantly for ulcers, and the best thing for that is misoprostol. And so you can get misoprostol powder in a tub from places that, yeah, a feed store or something.

Yeah, I go to a feed store every week. I'm sure I could buy a bucket of this shit.

Probably. So it comes in tubs, and the other thing that's great about it coming in a tub is that it's already in with a buffer. Part of the thing about misoprostol is that the dosage is in micrograms, and that's very hard to weigh unless you have a really high-precision scale. Even your good drug dealers generally don't have a scale that can do that.

Right.

But the magic is, this is in a tub with a bunch of inert powder, and it's already mixed up to be homogeneous. And so what you can do is a little bit of back-of-the-envelope arithmetic: you can measure out much larger quantities, know how much active ingredient you have, and then pack that into a tablet.
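[The back-of-the-envelope arithmetic described here is ordinary dilution math: a homogeneous mix lets you weigh a large, easy-to-measure mass of bulk powder instead of a microgram quantity of active ingredient. A minimal sketch, where the concentration and dose figures are entirely hypothetical placeholders invented for illustration, not numbers from the episode and not dosing guidance:]

```python
# Dilution arithmetic for a homogeneous, pre-buffered powder.
# All figures below are hypothetical placeholders, not dosing guidance.

def grams_to_weigh(target_dose_ug: float, active_ug_per_g: float) -> float:
    """Grams of bulk powder that contain the target dose of active ingredient."""
    return target_dose_ug / active_ug_per_g

# Suppose (hypothetically) the tub is labeled as containing 2,000 micrograms
# of active ingredient per gram of powder, and the target dose is 200 micrograms.
powder_g = grams_to_weigh(200, 2000)
print(powder_g)  # 0.1 g, a mass an ordinary milligram scale can handle
```

[The point of the trick is visible in the output: the microgram-scale dose becomes a tenth-of-a-gram weighing task, which is why the inert buffer matters as much as the active ingredient.]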
Yeah. I mean, that makes so much sense. And it's also like, you have kind of the dark-side and light-side versions: it's kind of the light-side version of all of those people buying up ivermectin for nonsense. It's like, well, no, there are reasons to buy livestock medication especially. I mean, I have a lot of friends who took fish antibiotics back in the day, and this is using that approach in a much more rigorous way, to provide people with something that will be getting increasingly difficult to access in a lot of parts of the country. It's just such a smart way of approaching it, I think.

Yeah. And one of the things that becomes philosophically a bit sticky, when you end up talking about the importance of independent management of one's own health and decision-making not coming from above, is this difficult moment I've had of having to cop to the reality that you're building mechanisms to empower people to have access, to make decisions about managing their own health.
Part of that entails realizing that it will also lead to a lot of people making what I might think are bad decisions, but the important thing is that it doesn't matter what I think. People should not be controlled by other people, and if they make bad decisions, that sucks, and hopefully we can help with that, but without backtracking, without getting some sort of retrograde about offering more access even if people misuse that access and mismanage their own health. Mismanagement of health happens no matter what, right? It happens constantly, and people will ignore things that seem like bigger problems and not get them addressed. And so I have to sort of retreat into this idea that more access to more tools is better, and that's just the way of it.

And yeah, I mean, the access isn't the problem with ivermectin. The problem isn't that people have access to ivermectin and so are taking it in a way that is harmful to them.
The problem is that people have been blinded by disinformation and so are making a horrible healthcare decision. The fact that they have access to a veterinary medication is fine, right?

Exactly. And it's interesting that you say that, because I have a friend at Doctors Without Borders, and they are starting a couple of pretty strong programs to try and combat misinformation, because, just from a metrics standpoint, they look for what's killing the greatest number of people at the greatest rate in the worst way, and currently the thing that's killing the most people in the worst way at the greatest rate is misinformation.

So yeah, that's really the great danger. And one of the things I found really interesting about what you all have been doing, because obviously the question of how to fight misinformation in the medical sphere is a much larger conversation without simple answers.
When it comes to a question like, oh hey, this pharmaceutical company jacked up the price of this necessary medication for a lot of people who have HIV, the solution to that is simple: you find a way for them to get it without paying hundreds and hundreds of dollars per dose. Some of the work you all have done is with very mass-needed products, like the mifepristone, like the EpiPencil, where there are large numbers of people who need it. But one of the things I think is really cool is that y'all are also working on hacking medications that are very niche: very, very few people have this particular disease, and so the medication costs as much as a fully loaded Toyota Tacoma, you know, or sometimes far worse than that.
Because of this Orphan Drug Act that got passed in the US, and equivalents that exist in other places, you have all of these allowances that are granted to people who "invent", and I put that in air quotes because really they just purchase the rights to it, these orphan drugs. When you talk about controls, this is kind of the most tragic instance of the entire thing, because what's happening is you've got somebody who has a very rare disease, and in many cases the medication is the difference between somebody who just cannot function, who is dealing with their life kind of moment to moment and is mostly cared for, and somebody who, with access to a particular medication, can go through life in a fairly normal sort of way, where they don't need to be in assisted living, where they can do sort of basic things for themselves. And that seems so much more predatory. I mean, it's important, of course, you know, to look at things with that macro lens as well and say what can do a lot of good for a lot of people.
But then the sort of micro ethical lens needs to come out from time to time and say, all right, well, here's something that only affects a few hundred thousand people across the world. But these are people who could just go through life normally if only they had access to a little bit of medicine, and the only reason they don't is because of misplaced avarice. Or, well, all avarice is misplaced.

Yeah. And you're providing individuals, or a way for people to help individuals, who have this problem and who couldn't possibly afford this because they don't have health care or something, a way to deal with these illnesses. And oftentimes even people who are insured don't get the medication they need, or don't get it at an affordable rate, because it's not seen as critical.

Yeah, it's like, oh, well, there's a solution that's not as good but is much less expensive, so that's the only thing we're going to cover.
And so yeah, and you're saying, well, it should be your decision whether or not this is something you want to treat this way. And this is a way, if you, you know, have access, for you to, as you've been saying, take your health care and your ability to get medication into your own hands, and produce the things that you need without needing to beg an insurance company or set up a GoFundMe or whatever.

Yeah, those GoFundMes break my heart so much. Especially when people say, oh, look how great, somebody got the money they needed, and I say, look, I am happy that people get healthcare, but this should be entirely unnecessary, and the fact that it comes up at all is criminal.

Yeah, we can, as a species, produce this shit for less than the cost of, like, a lamp, you know. Like, why isn't this available?
And that's what I think is kind of so powerful about what y'all are doing: so often we get stuck in the horror of how bad health care is, of how fucked up the pharmaceutical industry is, and then we get our relief from that in these stories of people, like, crowdfunding so they can get their medication. And what you're saying is, what's actually much more inspiring than that is people just finding ways to make what they need. Again, the kind of most popular, popular is the wrong word, the most press y'all have received, I think, is for the EpiPencil. An EpiPen is a device that is used when people are going into anaphylactic shock, which is when they have an allergic reaction that will, generally, kill them if untreated. You inject it into your muscles, or, rather, the EpiPen does the injecting; you just kind of put it in place. And it is a life-saving medication when people need it; it's the choice between that and death. And they are very expensive.
There is a company that owns the 330 00:20:49,560 --> 00:20:52,600 Speaker 1: patent because of how the EpiPen actually does the injecting. 331 00:20:52,600 --> 00:20:56,120 Speaker 1: The actual medicine is very cheap and very easy to make, 332 00:20:56,200 --> 00:20:58,960 Speaker 1: but it's unbelievably expensive, and people die as a result 333 00:20:59,040 --> 00:21:03,280 Speaker 1: of lack of access. UM, and you've provided a 334 00:21:03,280 --> 00:21:05,320 Speaker 1: way, using this thing called a bio 335 00:21:05,400 --> 00:21:08,400 Speaker 1: lab. You've developed plans that people can build 336 00:21:08,440 --> 00:21:11,080 Speaker 1: for themselves in order to make this, and also, 337 00:21:11,160 --> 00:21:15,040 Speaker 1: using a three D printer, you can make UM an EpiPencil, 338 00:21:15,119 --> 00:21:17,680 Speaker 1: which is a little less kind of a more analog version, 339 00:21:17,720 --> 00:21:20,120 Speaker 1: I think, I guess you'd say. Uh, no, it's it's 340 00:21:20,240 --> 00:21:22,960 Speaker 1: it's equivalent. It's equivalent. It works the same way. The 341 00:21:23,040 --> 00:21:27,520 Speaker 1: things that are different about it, UM, are critical. 342 00:21:27,600 --> 00:21:29,520 Speaker 1: The first one that you mentioned, of course, is that 343 00:21:29,600 --> 00:21:32,440 Speaker 1: you can you can build it for a little over 344 00:21:32,560 --> 00:21:36,520 Speaker 1: thirty dollars US, and you can reload it for about 345 00:21:36,560 --> 00:21:42,199 Speaker 1: three dollars, unlike the EpiPen, which is I think it's 346 00:21:42,200 --> 00:21:47,600 Speaker 1: about fifty dollars for, yeah, UM, and that might be 347 00:21:47,680 --> 00:21:53,680 Speaker 1: for a pair, but even so. UM.
But the 348 00:21:53,720 --> 00:21:59,560 Speaker 1: other two critical differences are that EpiPens are single use, 349 00:22:01,640 --> 00:22:06,520 Speaker 1: so you can't test whether it's faulty or not until 350 00:22:06,600 --> 00:22:12,159 Speaker 1: you use it. And there have been a lot of failures. 351 00:22:12,400 --> 00:22:15,400 Speaker 1: In fact, there was a big EpiPen recall a bunch 352 00:22:15,440 --> 00:22:19,639 Speaker 1: of years ago, and there were just these tragic, tragic stories. 353 00:22:19,880 --> 00:22:23,520 Speaker 1: Some guy had to watch his little kid die. He 354 00:22:23,720 --> 00:22:27,080 Speaker 1: had had a pair of EpiPens. The kid went into shock. 355 00:22:27,640 --> 00:22:30,400 Speaker 1: He used it, the thing failed. He brought out the other one. 356 00:22:30,480 --> 00:22:32,479 Speaker 1: The other one failed, and they're in the air and 357 00:22:32,520 --> 00:22:34,960 Speaker 1: you can't land in fifteen minutes, and the little kid died, 358 00:22:35,040 --> 00:22:37,600 Speaker 1: which just and I'm sure there are dozens and dozens 359 00:22:37,640 --> 00:22:39,280 Speaker 1: of stories like that. That just happens to be one 360 00:22:39,280 --> 00:22:43,000 Speaker 1: of the ones I know. So one of the things 361 00:22:43,040 --> 00:22:45,800 Speaker 1: that's great about the EpiPencil is, because you're putting it together 362 00:22:45,840 --> 00:22:50,000 Speaker 1: yourself and it only takes four parts, you can test it. 363 00:22:50,080 --> 00:22:52,800 Speaker 1: You can make sure that it works as many times 364 00:22:52,800 --> 00:22:55,000 Speaker 1: as you need to. You can do a dry run with saline 365 00:22:55,000 --> 00:22:57,440 Speaker 1: and just double check that it does what it's supposed 366 00:22:57,480 --> 00:23:05,600 Speaker 1: to, um, and so it's safer.
So the fact that 367 00:23:05,640 --> 00:23:09,600 Speaker 1: you can control it yourself, you can reload it, and 369 00:23:09,680 --> 00:23:14,240 Speaker 1: you can test it. All these things fix a lot 370 00:23:14,280 --> 00:23:19,000 Speaker 1: of these immediate problems that come with it, and it still 371 00:23:19,119 --> 00:23:22,359 Speaker 1: has the benefit that everybody wants from the EpiPen, which 372 00:23:22,440 --> 00:23:27,840 Speaker 1: is that it doesn't require, um, you know, measurement or 373 00:23:29,440 --> 00:23:33,560 Speaker 1: like knowing how deep to press the needle before you 374 00:23:33,600 --> 00:23:36,040 Speaker 1: depress the plunger. All of that happens automatically, and it 375 00:23:36,080 --> 00:23:39,520 Speaker 1: happens very quickly. UM. And yeah, as you say, 376 00:23:39,560 --> 00:23:42,960 Speaker 1: we've got a lot of press for that, because essentially 377 00:23:43,000 --> 00:23:46,080 Speaker 1: it was good timing. We released at the same time that 378 00:23:46,200 --> 00:23:49,160 Speaker 1: Heather Bresch was lying to Congress about why they had 379 00:23:49,680 --> 00:23:54,280 Speaker 1: raised the price on the EpiPens, and so it 380 00:23:54,320 --> 00:23:58,360 Speaker 1: was in the public eye. Yeah, and and that's that's 381 00:23:58,400 --> 00:24:01,560 Speaker 1: a huge one, being able to produce that, because that 382 00:24:01,720 --> 00:24:03,880 Speaker 1: is I mean, there's a tremendous number of people who 383 00:24:03,880 --> 00:24:08,760 Speaker 1: rely on, uh, EpiPens, um, and and I think the 384 00:24:08,760 --> 00:24:13,960 Speaker 1: potential of that project is staggering UM.
And there's some 384 00:24:14,040 --> 00:24:16,320 Speaker 1: there's some you know, we when we talk about kind 385 00:24:16,359 --> 00:24:18,320 Speaker 1: of the different people who are who are working on 386 00:24:18,400 --> 00:24:20,359 Speaker 1: similar problems to you, there's also a group of people 387 00:24:20,359 --> 00:24:23,760 Speaker 1: who are working on UM cracking insulin, being able to 388 00:24:23,760 --> 00:24:26,960 Speaker 1: produce insulin UM. And yeah, the Open Insulin project is 389 00:24:27,000 --> 00:24:31,560 Speaker 1: an amazing group of people, incredibly important. They're yeah, they're 390 00:24:31,560 --> 00:24:37,960 Speaker 1: working on probably the largest scale public health crisis. 391 00:24:38,040 --> 00:24:41,080 Speaker 1: I mean in terms of queries that we get. I 392 00:24:41,119 --> 00:24:43,760 Speaker 1: think we get people asking about insulin more than anything else. 393 00:24:44,080 --> 00:24:47,760 Speaker 1: And I always say, oh, yeah, there are very, very bright 394 00:24:47,800 --> 00:24:49,680 Speaker 1: people already working on this. Go talk to the Open 395 00:24:49,720 --> 00:25:02,800 Speaker 1: Insulin folks, um, and and they're just amazing. I want to 396 00:25:02,800 --> 00:25:04,720 Speaker 1: move on because I want to talk about kind of 397 00:25:04,760 --> 00:25:08,199 Speaker 1: the more um philosophical dimensions of some of this. But 398 00:25:08,280 --> 00:25:11,239 Speaker 1: before we get into that, i'd like to so, like, 399 00:25:11,320 --> 00:25:12,679 Speaker 1: you know, one of the things you and I've been 400 00:25:12,680 --> 00:25:14,760 Speaker 1: talking about a little bit behind the scenes is I 401 00:25:14,760 --> 00:25:18,080 Speaker 1: am not a technically savvy person, but I'm I want 402 00:25:18,080 --> 00:25:20,600 Speaker 1: to try and i'd like to be able to like 403 00:25:20,760 --> 00:25:23,320 Speaker 1: produce an EpiPencil.
I want to like understand this it 404 00:25:23,480 --> 00:25:26,960 Speaker 1: kind of and and potentially be able to contribute UM 405 00:25:27,000 --> 00:25:30,520 Speaker 1: in a more direct sense, in part because I'm curious, 406 00:25:30,560 --> 00:25:33,919 Speaker 1: like how how doable actually is this? For I consider 407 00:25:33,960 --> 00:25:35,879 Speaker 1: myself a pretty normal person when it comes to like 408 00:25:35,960 --> 00:25:39,399 Speaker 1: technical understanding, right, Like, I'm reasonably handy, but I'm not 409 00:25:40,080 --> 00:25:42,159 Speaker 1: I'm not a chemist. I'm not a I'm not a 410 00:25:43,000 --> 00:25:45,240 Speaker 1: I haven't really I have no I have no prior 411 00:25:45,320 --> 00:25:48,399 Speaker 1: experience three D printing or anything like that, what is 412 00:25:49,000 --> 00:25:52,240 Speaker 1: what is required in terms of financial investment, and what 413 00:25:52,359 --> 00:25:54,440 Speaker 1: is kind of your general estimate in terms of time 414 00:25:54,520 --> 00:25:58,600 Speaker 1: to get up to you know, a kind of the 415 00:25:58,720 --> 00:26:01,560 Speaker 1: level where you can start learning how to do some 416 00:26:01,600 --> 00:26:05,119 Speaker 1: of this stuff. I think the barrier to entry is 417 00:26:05,119 --> 00:26:10,320 Speaker 1: pretty low, depending on how you want to start. As 418 00:26:10,320 --> 00:26:14,959 Speaker 1: I said, there are different avenues to doing it. You can, 419 00:26:15,119 --> 00:26:17,600 Speaker 1: of course one of the one of the greatest hacks. 420 00:26:17,680 --> 00:26:20,400 Speaker 1: If if anybody listening this doesn't pick up anything else, 421 00:26:20,840 --> 00:26:24,080 Speaker 1: here's the best hack in terms of getting access to medication. 
422 00:26:24,720 --> 00:26:27,399 Speaker 1: You have a medication you don't have access to for 423 00:26:27,440 --> 00:26:32,119 Speaker 1: whatever reason, assuming it comes in a capsule form, you 424 00:26:32,160 --> 00:26:35,960 Speaker 1: can merely go to a chemical supplier, purchase the active 425 00:26:36,200 --> 00:26:40,240 Speaker 1: pharmaceutical ingredient, weigh it out, put it into a capsule, and 426 00:26:40,280 --> 00:26:43,919 Speaker 1: you've made your medication. That's a very simple thing, you know. 427 00:26:44,040 --> 00:26:46,240 Speaker 1: That takes nothing more than being able to read a 428 00:26:46,280 --> 00:26:49,679 Speaker 1: scale and scooping powder into little, you know, capsules. 429 00:26:50,359 --> 00:26:54,600 Speaker 1: The next step up, there are things that you can do. 430 00:26:54,640 --> 00:26:56,160 Speaker 1: They're a little more involved. If you want to build 431 00:26:56,160 --> 00:27:00,240 Speaker 1: an EpiPencil, again, this is three or four parts, 432 00:27:00,280 --> 00:27:04,040 Speaker 1: depending on how you count. You take the needle from 433 00:27:04,359 --> 00:27:07,399 Speaker 1: one syringe needle set and you put it onto a 434 00:27:07,440 --> 00:27:09,680 Speaker 1: different syringe needle set, and then you put it into 435 00:27:09,680 --> 00:27:14,639 Speaker 1: this auto injector that's designed for needle-phobic diabetics. You 436 00:27:14,760 --> 00:27:18,240 Speaker 1: load it with the epinephrine and you close 437 00:27:18,320 --> 00:27:22,560 Speaker 1: it up and you're done. Then if you want to 438 00:27:22,640 --> 00:27:25,479 Speaker 1: step into this a little bit further, if something is 439 00:27:25,960 --> 00:27:29,119 Speaker 1: so barriered for whatever reason that you can't get the 440 00:27:29,160 --> 00:27:33,720 Speaker 1: actual ingredient, then you might start messing around with our 441 00:27:33,760 --> 00:27:37,840 Speaker 1: micro lab.
The micro lab, I would say, probably takes 442 00:27:38,080 --> 00:27:43,200 Speaker 1: around a hundred dollars US to build, um, but 443 00:27:43,400 --> 00:27:47,960 Speaker 1: it's not super technical. Our latest version doesn't require any soldering. 444 00:27:48,119 --> 00:27:51,560 Speaker 1: Everything snaps together, which is really nice. You can plug 445 00:27:51,600 --> 00:27:54,960 Speaker 1: everything in UM. All the wires are just screw terminals, 446 00:27:55,000 --> 00:27:59,480 Speaker 1: which is really convenient. UM And it takes some time, 447 00:28:00,200 --> 00:28:05,320 Speaker 1: and you do have to load some code. But we're 448 00:28:05,320 --> 00:28:10,520 Speaker 1: looking to release a new set of documentation in 449 00:28:10,560 --> 00:28:14,199 Speaker 1: the summer that will be very, very stripped down: 450 00:28:14,640 --> 00:28:17,080 Speaker 1: here's your bill of materials. You can order all of 451 00:28:17,119 --> 00:28:21,360 Speaker 1: this stuff. Here's how you can put the disk image 452 00:28:21,520 --> 00:28:24,480 Speaker 1: onto the SD card that you put in, and you 453 00:28:24,520 --> 00:28:31,320 Speaker 1: just start it and it will wake up and work independently. UM. 454 00:28:31,359 --> 00:28:35,520 Speaker 1: We had a video of our head hardware guy actually 455 00:28:35,600 --> 00:28:40,480 Speaker 1: building the micro lab from just parts that were sitting 456 00:28:40,680 --> 00:28:43,080 Speaker 1: laid out on a table, and I think all told 457 00:28:43,080 --> 00:28:46,680 Speaker 1: it took him about forty five minutes, maybe a little 458 00:28:46,720 --> 00:28:50,080 Speaker 1: bit longer. But again, like granted, this guy's a hardware 459 00:28:50,160 --> 00:28:54,600 Speaker 1: specialist and he, you know, designed it, so for somebody 460 00:28:54,600 --> 00:28:57,320 Speaker 1: who's not done it before, it might take an afternoon, but 461 00:28:57,520 --> 00:29:01,520 Speaker 1: it's not.
It's not a prohibitively long or involved 462 00:29:01,560 --> 00:29:04,760 Speaker 1: project that, you know, would take you weeks to put together, 463 00:29:04,960 --> 00:29:12,600 Speaker 1: or require any specialized understanding of, you know, biomedical engineering or 464 00:29:12,600 --> 00:29:16,360 Speaker 1: anything like that. Now, UM, I kind of want to 465 00:29:16,360 --> 00:29:18,000 Speaker 1: move on at this point because I think that gives people 466 00:29:18,400 --> 00:29:20,760 Speaker 1: an idea of what's actually necessary, and they can go 467 00:29:20,840 --> 00:29:23,600 Speaker 1: to y'all's website, UM, or look up the plans you have 468 00:29:23,640 --> 00:29:25,840 Speaker 1: on GitHub if they want to kind of 469 00:29:25,880 --> 00:29:29,200 Speaker 1: look at what's what's involved. And it's um, some of 470 00:29:29,240 --> 00:29:31,400 Speaker 1: it seems a little daunting to me, like look looking 471 00:29:31,440 --> 00:29:33,960 Speaker 1: at the construction of the bio lab. But that's 472 00:29:33,960 --> 00:29:36,040 Speaker 1: that's going to be a project that I'll be engaging 473 00:29:36,040 --> 00:29:37,640 Speaker 1: in over the next couple of weeks, so we'll keep 474 00:29:37,640 --> 00:29:41,040 Speaker 1: people updated on how I do there. UM, I want 475 00:29:41,040 --> 00:29:44,520 Speaker 1: to move on to talk, Michael, about what you see 476 00:29:44,560 --> 00:29:50,320 Speaker 1: as kind of the I don't know, the the the 477 00:29:50,480 --> 00:29:53,880 Speaker 1: potential from kind of a revolutionary perspective, from a perspective 478 00:29:53,880 --> 00:29:58,360 Speaker 1: of actually building dual power, of this project. And obviously 479 00:29:58,400 --> 00:30:00,240 Speaker 1: you are in what I think would be called 480 00:30:00,280 --> 00:30:03,600 Speaker 1: the early stages of this idea of kind of democratizing 481 00:30:03,600 --> 00:30:09,720 Speaker 1: and decentralizing the production of life saving medications. UM.
Although 482 00:30:09,760 --> 00:30:11,400 Speaker 1: I guess you could argue in some ways it's kind 483 00:30:11,400 --> 00:30:14,120 Speaker 1: of a return to more traditional attitudes about health care 484 00:30:14,120 --> 00:30:17,840 Speaker 1: in a lot of ways. Yeah, there's a cyclic nature there, 485 00:30:18,200 --> 00:30:21,800 Speaker 1: and in the sort of zen mind, beginner's mind, we'd 486 00:30:21,800 --> 00:30:24,880 Speaker 1: like to think that the revolution is always in its 487 00:30:24,920 --> 00:30:31,120 Speaker 1: beginning stages, right. Um. That is to say, over the past decade, 488 00:30:31,200 --> 00:30:37,120 Speaker 1: roughly, looking at trying to find ways to give people 489 00:30:37,600 --> 00:30:44,120 Speaker 1: more independent access that doesn't require infrastructure to medicines and 490 00:30:44,160 --> 00:30:54,680 Speaker 1: medical technologies, the the hope really is to create a 491 00:30:54,800 --> 00:31:01,440 Speaker 1: certain amount of cultural shift. I remember at one point a 492 00:31:01,480 --> 00:31:05,440 Speaker 1: friend of mine, who, as a business school graduate, asked 493 00:31:05,440 --> 00:31:09,240 Speaker 1: me a very sort of like business school type question 494 00:31:09,360 --> 00:31:15,400 Speaker 1: where he said, how would you measure the success of your project? Um, 495 00:31:15,440 --> 00:31:19,640 Speaker 1: and I said, well, we cease to exist as an organization, 496 00:31:20,280 --> 00:31:21,640 Speaker 1: and he kind of had this moment of like, what 497 00:31:21,720 --> 00:31:25,760 Speaker 1: do you mean, shouldn't we be pushing this? Right, the 498 00:31:25,880 --> 00:31:30,160 Speaker 1: idea is that eventually the concept of managing your own health 499 00:31:30,280 --> 00:31:36,960 Speaker 1: is sufficiently normalized that it's not something that has to 500 00:31:36,960 --> 00:31:40,200 Speaker 1: be explained between people.
But somebody says, oh, yeah, I 501 00:31:40,320 --> 00:31:47,120 Speaker 1: just I just did that up in my micro lab, um. 502 00:31:47,400 --> 00:31:50,680 Speaker 1: In the same way that when you look at the 503 00:31:50,760 --> 00:31:57,240 Speaker 1: shift that happened between, oh, you know, the mid eighties 504 00:31:57,320 --> 00:32:03,280 Speaker 1: and the mid nineties, where computers went from this strange, scary 505 00:32:03,360 --> 00:32:07,640 Speaker 1: thing that was, you know, only accessible or usable 506 00:32:07,680 --> 00:32:11,440 Speaker 1: by people who were very specialized, to something that, you know, 507 00:32:11,560 --> 00:32:13,800 Speaker 1: everybody knew about and everybody kind of had and everybody 508 00:32:13,880 --> 00:32:16,280 Speaker 1: sort of used. And the same sort of thing that 509 00:32:16,400 --> 00:32:22,280 Speaker 1: happened between the period of time, I don't know, maybe 510 00:32:22,760 --> 00:32:27,000 Speaker 1: ten twelve years ago and now with with three D printing, 511 00:32:27,040 --> 00:32:30,960 Speaker 1: where like stereolithography and rapid prototyping was again the 512 00:32:31,040 --> 00:32:34,440 Speaker 1: specialized thing that a bunch of people who were essentially 513 00:32:34,480 --> 00:32:38,720 Speaker 1: in the machine tool industry had started to spearhead. And 514 00:32:38,760 --> 00:32:41,200 Speaker 1: now you say, three D printing, everybody knows what it means.
515 00:32:41,800 --> 00:32:46,040 Speaker 1: In the same sort of way, I'd very much like 516 00:32:46,120 --> 00:32:52,960 Speaker 1: to see a cultural shift where, when somebody is unwell, 517 00:32:54,320 --> 00:32:58,240 Speaker 1: when discussions between people happen, that instead of the 518 00:32:58,600 --> 00:33:04,080 Speaker 1: have you had that looked at, you might 519 00:33:04,160 --> 00:33:09,520 Speaker 1: instead hear from somebody saying, well, have you read up 520 00:33:09,560 --> 00:33:13,880 Speaker 1: on that? You know, to see people actually engaged in 521 00:33:13,920 --> 00:33:19,400 Speaker 1: their own health and not going through this very typical 522 00:33:19,520 --> 00:33:22,840 Speaker 1: process of outsourcing responsibility. Now, it's not to say that 523 00:33:22,960 --> 00:33:27,200 Speaker 1: like experts aren't good people with whom to consult, right, Yeah, 524 00:33:27,280 --> 00:33:31,320 Speaker 1: we're not talking about replacing the idea of medical professionals 525 00:33:31,320 --> 00:33:34,360 Speaker 1: who can help you understand your health and diagnose 526 00:33:34,440 --> 00:33:37,760 Speaker 1: and stuff like that. Yeah, but there is again this drastic 527 00:33:37,840 --> 00:33:43,720 Speaker 1: difference between going to a doctor and essentially just like 528 00:33:43,920 --> 00:33:46,640 Speaker 1: throwing the problem on their desk and saying fix it, 529 00:33:47,040 --> 00:33:50,560 Speaker 1: call me when it's over, versus going to a doctor 530 00:33:50,560 --> 00:33:54,960 Speaker 1: and saying, hey, I'd like to talk about this, I'd 531 00:33:55,000 --> 00:33:58,200 Speaker 1: like to know more about what's wrong here, and I'd 532 00:33:58,240 --> 00:34:04,240 Speaker 1: like to discuss what the options are and what 533 00:34:04,840 --> 00:34:09,920 Speaker 1: seems best. Um, that would be great on a lot 534 00:34:09,920 --> 00:34:15,480 Speaker 1: of levels.
And and then these questions of access to 535 00:34:15,520 --> 00:34:21,240 Speaker 1: medication then become even more relevant, because when you're talking 536 00:34:21,560 --> 00:34:24,319 Speaker 1: with a doctor and the doctor says, okay, well we 537 00:34:24,320 --> 00:34:27,719 Speaker 1: could try this therapy, but your insurance won't pay for it, 538 00:34:27,760 --> 00:34:30,799 Speaker 1: it's three hundred dollars, you can say, all right, well, 539 00:34:30,880 --> 00:34:32,799 Speaker 1: let's just do a little thought experiment, and if that 540 00:34:32,880 --> 00:34:35,000 Speaker 1: fell from a truck, what would I do with it? 541 00:34:35,480 --> 00:34:37,759 Speaker 1: And then maybe you can go home and say, you know, 542 00:34:38,200 --> 00:34:40,000 Speaker 1: I'll call you and let you know how it goes. 543 00:34:42,239 --> 00:34:47,600 Speaker 1: That's that's really my my grand hope. And there 544 00:34:47,640 --> 00:34:49,600 Speaker 1: are so many different ways that that can play out. 545 00:34:50,440 --> 00:34:53,600 Speaker 1: In fact, I'll tell you a hilarious story in regards 546 00:34:53,640 --> 00:34:57,160 Speaker 1: to this, which was, I guess it was when 547 00:34:57,160 --> 00:35:01,920 Speaker 1: we presented at HOPE. UM, I called Martin Shkreli's cell 548 00:35:01,960 --> 00:35:05,319 Speaker 1: phone from stage, uh, to try and ask him what 549 00:35:05,360 --> 00:35:09,319 Speaker 1: he thought about what we were doing, given that I 550 00:35:09,400 --> 00:35:12,320 Speaker 1: was handing his drug out for free, UM, and showing 551 00:35:12,320 --> 00:35:15,319 Speaker 1: people how to make it. And he didn't answer the 552 00:35:15,320 --> 00:35:17,000 Speaker 1: phone when I called him then, but he called me 553 00:35:17,040 --> 00:35:22,360 Speaker 1: back a few hours later, which was really hilarious.
We 554 00:35:22,480 --> 00:35:25,480 Speaker 1: actually chatted for a while, and the guy is, I mean, 555 00:35:25,560 --> 00:35:30,280 Speaker 1: a little detached from reality, but he's he's no dummy, UM. 556 00:35:31,040 --> 00:35:33,359 Speaker 1: And when I sort of described what we were trying 557 00:35:33,400 --> 00:35:38,640 Speaker 1: to do with the micro lab, he had some interesting insights, 558 00:35:38,640 --> 00:35:40,600 Speaker 1: and he said, yeah, you know, one way I can 559 00:35:40,640 --> 00:35:45,480 Speaker 1: imagine that working really well is if somebody with a 560 00:35:45,560 --> 00:35:52,760 Speaker 1: little more knowledge of pharmaceutical medicine were to maybe build 561 00:35:52,760 --> 00:35:56,160 Speaker 1: one of these and serve a small community. I think 562 00:35:56,239 --> 00:35:59,719 Speaker 1: that could be very efficient. And I was like, that's 563 00:35:59,719 --> 00:36:05,160 Speaker 1: a good thought, you chiseling bastard. Um. Yeah, I 564 00:36:05,160 --> 00:36:07,560 Speaker 1: mean, there's a degree to which that's that's kind of 565 00:36:07,560 --> 00:36:10,759 Speaker 1: how I see the most realistic potential. This is not 566 00:36:10,840 --> 00:36:13,319 Speaker 1: every individual making all of their medicine, but kind of 567 00:36:13,360 --> 00:36:16,160 Speaker 1: like, you know, we had during the fires last year, 568 00:36:16,200 --> 00:36:18,720 Speaker 1: when when our local and state governments during the heatwave 569 00:36:18,800 --> 00:36:22,160 Speaker 1: this year like completely shit the bed. We had different 570 00:36:22,239 --> 00:36:25,080 Speaker 1: mutual aid collectives do things like, we are providing people 571 00:36:25,080 --> 00:36:27,400 Speaker 1: with, like, oh it's a blizzard, we're providing people with firewood. 572 00:36:27,680 --> 00:36:30,520 Speaker 1: We are providing people with cooling stations because of the heat.
573 00:36:30,560 --> 00:36:32,480 Speaker 1: You know, we are providing people with they've just fled 574 00:36:32,480 --> 00:36:35,320 Speaker 1: their houses. We have kits that have food and basic 575 00:36:35,400 --> 00:36:38,160 Speaker 1: necessities so they can get through mutual aid collectives that 576 00:36:38,200 --> 00:36:40,120 Speaker 1: are like, well, we are making we specialize and we 577 00:36:40,120 --> 00:36:42,799 Speaker 1: can produce this and this and this medication like these 578 00:36:42,800 --> 00:36:44,680 Speaker 1: three and we have and here's the information you can 579 00:36:44,719 --> 00:36:46,759 Speaker 1: find online about our process. You know that we know 580 00:36:46,840 --> 00:36:48,719 Speaker 1: what we're doing, and if you need these things, you 581 00:36:48,800 --> 00:36:50,480 Speaker 1: let us know and we we get them to you. 582 00:36:50,760 --> 00:36:52,960 Speaker 1: And here's different ways in which people can volunteer if 583 00:36:53,000 --> 00:36:55,120 Speaker 1: you want to help engage in this mutual aid process, 584 00:36:55,120 --> 00:36:58,440 Speaker 1: even if you're not someone who's going to be doing 585 00:36:58,480 --> 00:37:00,480 Speaker 1: a lot of the technical stuff. 
We need people to 586 00:37:00,520 --> 00:37:02,160 Speaker 1: go pick up parts, we need people to do this, 587 00:37:02,200 --> 00:37:07,000 Speaker 1: and you can help us here, you know. And I think, yeah, yeah, 588 00:37:07,080 --> 00:37:08,759 Speaker 1: And I think in a similar way, right, a lot 589 00:37:08,760 --> 00:37:11,399 Speaker 1: of that sort of thing is already happening in other 590 00:37:11,560 --> 00:37:14,160 Speaker 1: realms, right, where it's the sort of thing where you 591 00:37:14,360 --> 00:37:17,279 Speaker 1: you might be building something, or you you see some 592 00:37:17,320 --> 00:37:19,719 Speaker 1: project on GitHub or whatever, and there are these 593 00:37:19,880 --> 00:37:22,759 Speaker 1: STL files, and you go, oh, gosh, well, I don't 594 00:37:22,760 --> 00:37:24,600 Speaker 1: know how to do that, but oh right, x y 595 00:37:24,680 --> 00:37:27,520 Speaker 1: z down the street has a three D printer. I'll 596 00:37:27,520 --> 00:37:30,280 Speaker 1: go ask her. She's really good at making these things. 597 00:37:30,719 --> 00:37:32,799 Speaker 1: And you say, hey, look, I have this thing. Would 598 00:37:32,800 --> 00:37:35,920 Speaker 1: this be difficult to print? And with their experience, they 599 00:37:35,960 --> 00:37:38,439 Speaker 1: can kind of look at it like, you know, 600 00:37:38,440 --> 00:37:41,040 Speaker 1: that shouldn't be too hard. Um, you know, I I 601 00:37:41,080 --> 00:37:42,719 Speaker 1: have some time this weekend, maybe I can make that 602 00:37:42,800 --> 00:37:46,080 Speaker 1: for you. And in the same way, you say, hey, 603 00:37:46,160 --> 00:37:48,920 Speaker 1: it looks like I seem to have this rare infection 604 00:37:49,600 --> 00:37:54,080 Speaker 1: from whatever whatever, or I have this odd condition.
Um, 605 00:37:54,120 --> 00:37:57,080 Speaker 1: I wanted to try this medication because it might be 606 00:37:57,120 --> 00:38:00,360 Speaker 1: really helpful, but it's not legal in this country. 607 00:38:00,640 --> 00:38:02,920 Speaker 1: Do you think you can put this together? Again, you know, 608 00:38:02,960 --> 00:38:05,520 Speaker 1: you call somebody, and whoever's on the other line says, 609 00:38:05,560 --> 00:38:09,120 Speaker 1: oh yeah, I have a micro lab. I can try 610 00:38:09,120 --> 00:38:11,080 Speaker 1: and put a program together for that and see if 611 00:38:11,080 --> 00:38:13,480 Speaker 1: I can make it for you. That sort of thing, 612 00:38:13,520 --> 00:38:19,160 Speaker 1: I think, is a potentially really positive avenue for that 613 00:38:19,280 --> 00:38:22,920 Speaker 1: sort of thing to proliferate, and again eventually to have 614 00:38:24,120 --> 00:38:31,200 Speaker 1: a cultural shift where the idea of medicine and medical 615 00:38:31,280 --> 00:38:36,800 Speaker 1: technology is not something that comes down from above, 616 00:38:36,920 --> 00:38:42,239 Speaker 1: from some authority, but instead is something that's managed by 617 00:38:42,719 --> 00:38:46,480 Speaker 1: people who are part of your community, who you already trust. 618 00:38:46,760 --> 00:38:49,240 Speaker 1: I mean, that's why going to a doctor is so scary. 619 00:38:49,320 --> 00:38:52,399 Speaker 1: They seem to be the arbiter of your fate. They're 620 00:38:52,400 --> 00:38:56,120 Speaker 1: going to tell you whether you're well or not, and 621 00:38:56,120 --> 00:39:01,880 Speaker 1: that is just the truth. Much better to 622 00:39:02,040 --> 00:39:05,040 Speaker 1: have it where people are making up their own minds 623 00:39:05,239 --> 00:39:09,640 Speaker 1: based on learning about their own health and consulting with 624 00:39:09,680 --> 00:39:14,280 Speaker 1: people who can give them perspective. Um.
And if there's 625 00:39:14,440 --> 00:39:17,880 Speaker 1: more of that, and if it's closer to the person 626 00:39:18,000 --> 00:39:22,360 Speaker 1: who's actually suffering, that I think will be on the 627 00:39:22,400 --> 00:39:26,879 Speaker 1: whole much better. Yeah. And this gets 628 00:39:26,920 --> 00:39:28,839 Speaker 1: tangled up in a lot of the more toxic things 629 00:39:28,880 --> 00:39:31,120 Speaker 1: we've seen this year, but it's this this understanding that 630 00:39:33,160 --> 00:39:37,319 Speaker 1: with any given problem, if individuals trying to solve that 631 00:39:37,360 --> 00:39:40,760 Speaker 1: problem have more autonomy, and part of autonomy is knowledge, 632 00:39:41,680 --> 00:39:45,799 Speaker 1: that's nearly always better. Um. The problem, of course, is 633 00:39:45,840 --> 00:39:47,680 Speaker 1: that, like, we get into this situation we are 634 00:39:47,719 --> 00:39:50,920 Speaker 1: now in, where some people 635 00:39:50,960 --> 00:39:54,160 Speaker 1: use, I want to take control of my health 636 00:39:54,200 --> 00:39:57,319 Speaker 1: care, to, you know, do stuff that's nonsense. And that 637 00:39:57,400 --> 00:39:59,680 Speaker 1: brings us back to the question of, like, yeah, 638 00:39:59,760 --> 00:40:02,239 Speaker 1: the quality of the information that you're 639 00:40:02,280 --> 00:40:05,200 Speaker 1: getting is very important, right? Because if 640 00:40:05,239 --> 00:40:08,160 Speaker 1: your research is some YouTube video that has 641 00:40:08,160 --> 00:40:10,960 Speaker 1: convinced you that you need to, you know, take this 642 00:40:10,960 --> 00:40:13,480 Speaker 1: horse paste or something, then yeah, that's not good.
643 00:40:13,520 --> 00:40:17,280 Speaker 1: But that doesn't change the fact that, like with food, 644 00:40:17,440 --> 00:40:20,799 Speaker 1: like with with everything that you need to survive, the 645 00:40:20,920 --> 00:40:26,600 Speaker 1: more of a role you have in understanding it, deciding 646 00:40:26,760 --> 00:40:28,759 Speaker 1: what to do with it, understanding where it comes from 647 00:40:28,800 --> 00:40:32,600 Speaker 1: and how it is produced, um, not 648 00:40:32,680 --> 00:40:34,759 Speaker 1: only is that, I think, more satisfying as a human, 649 00:40:34,800 --> 00:40:42,200 Speaker 1: but it's it's also critical to your well being. UM. 650 00:40:42,239 --> 00:40:46,120 Speaker 1: It's critical on two levels, right, on two levels, 651 00:40:46,160 --> 00:40:52,560 Speaker 1: because when your health is taken from you, 652 00:40:54,560 --> 00:40:57,080 Speaker 1: even if it doesn't deprive you of life, it deprives you 653 00:40:57,200 --> 00:41:02,880 Speaker 1: of participating in any of the acts that make life meaningful. 654 00:41:03,719 --> 00:41:06,680 Speaker 1: And part of that key thing that makes life meaningful 655 00:41:06,920 --> 00:41:14,120 Speaker 1: is having a participatory role in the things that decide 656 00:41:14,160 --> 00:41:17,919 Speaker 1: the trajectory of your life. And so when you go 657 00:41:18,560 --> 00:41:24,320 Speaker 1: to the lengths of managing your own health, two things happen. 658 00:41:24,840 --> 00:41:29,160 Speaker 1: First off, your health improves, assuming you've made good decisions 659 00:41:29,160 --> 00:41:34,160 Speaker 1: and get lucky. But second, you're also having a participatory 660 00:41:34,320 --> 00:41:39,680 Speaker 1: role in your life, and that makes life more meaningful.
661 00:41:40,320 --> 00:41:44,600 Speaker 1: And beyond just kind of the self-actualization benefits, 662 00:41:44,719 --> 00:41:48,000 Speaker 1: from a perspective of actually enabling people to participate 663 00:41:48,520 --> 00:41:52,080 Speaker 1: in the movement for radical change in our society, one 664 00:41:52,200 --> 00:41:54,839 Speaker 1: necessary element of any of the kind of 665 00:41:54,880 --> 00:41:59,520 Speaker 1: things that we need is a belief in your own 666 00:41:59,560 --> 00:42:03,720 Speaker 1: agency and power. And also a freedom 667 00:42:03,760 --> 00:42:07,080 Speaker 1: from the kind of fear that comes from feeling helpless. 668 00:42:07,120 --> 00:42:10,160 Speaker 1: And there is, I think, probably no feeling worse in 669 00:42:10,280 --> 00:42:15,280 Speaker 1: the world than feeling completely helpless about a treatable medical problem. 670 00:42:15,320 --> 00:42:17,120 Speaker 1: I mean, it's one thing, I just went through this with my mom, 671 00:42:17,160 --> 00:42:19,040 Speaker 1: when you get a disease where there's just nothing 672 00:42:19,040 --> 00:42:21,319 Speaker 1: that science can do, right? Like, yeah, you've got 673 00:42:21,360 --> 00:42:24,279 Speaker 1: this cancer, and there ain't shit anybody has for you. 674 00:42:24,280 --> 00:42:28,560 Speaker 1: You know, that's one kind of horrible, but I think 675 00:42:28,600 --> 00:42:31,879 Speaker 1: it's a lot less terrible than, I have this 676 00:42:31,960 --> 00:42:34,399 Speaker 1: thing that we can deal with, but I either can't 677 00:42:34,400 --> 00:42:35,879 Speaker 1: afford it, or I don't know that I'll be able 678 00:42:35,920 --> 00:42:38,080 Speaker 1: to afford it.
I lost my 679 00:42:38,239 --> 00:42:41,480 Speaker 1: job and my healthcare in twenty seventeen, and so did a 680 00:42:41,520 --> 00:42:43,319 Speaker 1: person who was on my healthcare with me that I 681 00:42:43,320 --> 00:42:48,000 Speaker 1: love very much. And I got, you know, hired 682 00:42:48,040 --> 00:42:50,480 Speaker 1: here, and healthcare, a couple of years later. And it 683 00:42:50,560 --> 00:42:55,279 Speaker 1: happened that, a month before I started my 684 00:42:55,360 --> 00:42:57,920 Speaker 1: healthcare at this new job, this person who was 685 00:42:57,920 --> 00:43:00,879 Speaker 1: on my healthcare with me was diagnosed with a brain 686 00:43:00,960 --> 00:43:03,160 Speaker 1: tumor, thankfully not a cancerous one, but one 687 00:43:03,200 --> 00:43:05,080 Speaker 1: that they had to take medication for that would have 688 00:43:05,120 --> 00:43:10,400 Speaker 1: bankrupted us, you know, without insurance. 689 00:43:10,560 --> 00:43:14,200 Speaker 1: And thankfully it worked out fine; the timing worked out okay. 690 00:43:14,239 --> 00:43:16,000 Speaker 1: But there's not a week that goes by that I 691 00:43:16,040 --> 00:43:20,000 Speaker 1: don't think about it. It is something that makes 692 00:43:20,000 --> 00:43:23,040 Speaker 1: you less willing to take risks, less willing to participate 693 00:43:23,239 --> 00:43:26,160 Speaker 1: in things, because you have in the back 694 00:43:26,200 --> 00:43:28,239 Speaker 1: of your head, well, I have to 695 00:43:28,320 --> 00:43:30,120 Speaker 1: keep this job, I have to keep this insurance, I 696 00:43:30,160 --> 00:43:34,400 Speaker 1: have to... Yeah. And that's another thing that 697 00:43:34,440 --> 00:43:39,040 Speaker 1: I find so heartbreaking.
There's so many people that I've 698 00:43:39,520 --> 00:43:45,560 Speaker 1: met, totally outside of my activism, who lament about 699 00:43:45,560 --> 00:43:48,080 Speaker 1: working a job that they hate, and I say, gosh, 700 00:43:48,080 --> 00:43:52,640 Speaker 1: you know, I mean, would you consider just bailing on 701 00:43:52,680 --> 00:43:55,600 Speaker 1: it and looking for something else and trying something else? 702 00:43:55,680 --> 00:44:04,080 Speaker 1: And they have this total paralysis of saying, but if 703 00:44:04,080 --> 00:44:08,400 Speaker 1: I quit my job, I won't have healthcare. Mhm. And 704 00:44:08,400 --> 00:44:13,200 Speaker 1: mind you, these were people who were 705 00:44:13,960 --> 00:44:17,960 Speaker 1: incredibly healthy. These were not people who had any regular 706 00:44:18,520 --> 00:44:22,239 Speaker 1: visits to healthcare. They're just scared that if something comes up, 707 00:44:22,600 --> 00:44:25,279 Speaker 1: they want people to handle it. And it's a 708 00:44:25,480 --> 00:44:29,480 Speaker 1: perfectly well-grounded fear. But as you point out, what 709 00:44:29,560 --> 00:44:34,240 Speaker 1: this does is it works as this sort of shadow 710 00:44:34,960 --> 00:44:42,480 Speaker 1: oppressive mechanism to keep people from exploring, trying things, as 711 00:44:42,480 --> 00:44:46,200 Speaker 1: you say, taking risks, or just doing things that 712 00:44:47,239 --> 00:44:54,000 Speaker 1: don't involve an optimization toward a stable state, 713 00:44:54,120 --> 00:44:56,520 Speaker 1: maybe just like, yeah, maybe I'll start a small business, 714 00:44:56,520 --> 00:44:58,480 Speaker 1: and yeah, it probably will fail, but that will be 715 00:44:58,520 --> 00:45:03,280 Speaker 1: a cool adventure.
And most people, you know, so many people, 716 00:45:03,560 --> 00:45:08,680 Speaker 1: maybe not most, but many, many people, get just 717 00:45:08,920 --> 00:45:13,120 Speaker 1: terrified into this state of inertial paralysis. Yeah, and it 718 00:45:13,160 --> 00:45:16,400 Speaker 1: contributes to people being afraid to take to the streets 719 00:45:16,440 --> 00:45:19,000 Speaker 1: to protest the police, because maybe they get arrested, and 720 00:45:19,040 --> 00:45:21,400 Speaker 1: maybe they get fired, and then, you know, maybe their 721 00:45:21,440 --> 00:45:25,200 Speaker 1: kid can't afford their... Like, there's a thousand ways. I think, honestly, 722 00:45:25,239 --> 00:45:28,680 Speaker 1: the fear of losing your healthcare is in some 723 00:45:28,760 --> 00:45:31,960 Speaker 1: ways a greater counterrevolutionary force than any law 724 00:45:32,040 --> 00:45:34,759 Speaker 1: enforcement agency could hope to be, because the fear is 725 00:45:34,800 --> 00:45:39,560 Speaker 1: so much more immediate to so many people. Nobody talks 726 00:45:39,640 --> 00:45:42,520 Speaker 1: about that. And thank you so much for mentioning it, 727 00:45:42,560 --> 00:45:45,439 Speaker 1: because it's something that, oftentimes, I try to bring 728 00:45:45,520 --> 00:45:49,200 Speaker 1: up when I'm discussing things in public forums, and 729 00:45:49,200 --> 00:45:51,520 Speaker 1: oftentimes people kind of raise an eyebrow at me 730 00:45:51,560 --> 00:45:53,759 Speaker 1: and be like, what's the big deal? And I'm like, no, no, 731 00:45:54,080 --> 00:45:57,800 Speaker 1: like, you look two layers deep. There's something that's really 732 00:45:57,840 --> 00:46:06,160 Speaker 1: working against people being able to exercise protest, and 733 00:46:06,200 --> 00:46:10,360 Speaker 1: it's this really silent, terrifying force that seems 734 00:46:10,360 --> 00:46:15,960 Speaker 1: to underlie everything.
And if you could alleviate that, if 735 00:46:16,000 --> 00:46:18,719 Speaker 1: you can get to the point where people are like, yeah, 736 00:46:18,719 --> 00:46:21,160 Speaker 1: the hell with it, you know, I don't need a 737 00:46:21,360 --> 00:46:25,400 Speaker 1: job to take care of me, then all of a sudden, 738 00:46:25,719 --> 00:46:31,919 Speaker 1: so many possibilities just blossom in the mind. Yeah. 739 00:46:31,960 --> 00:46:33,960 Speaker 1: Say, if you're a parent who 740 00:46:33,960 --> 00:46:38,320 Speaker 1: has a child who's, you know, insulin dependent, 741 00:46:38,480 --> 00:46:41,560 Speaker 1: there's not a lot of difference in my mind between 742 00:46:41,640 --> 00:46:44,560 Speaker 1: someone holding a gun to your 743 00:46:44,560 --> 00:46:47,200 Speaker 1: head and your boss being able to fire you and 744 00:46:47,239 --> 00:46:49,839 Speaker 1: take away your kid's access to that insulin. There's 745 00:46:49,880 --> 00:46:54,439 Speaker 1: not a tremendous moral difference to me. I'd say 746 00:46:54,480 --> 00:46:57,720 Speaker 1: with a gun to your head, you're actually more likely 747 00:46:57,760 --> 00:47:02,080 Speaker 1: to survive, for it's less inevitable. You could talk 748 00:47:02,160 --> 00:47:05,520 Speaker 1: your way out of that, and yeah, whatever, but there 749 00:47:05,560 --> 00:47:07,520 Speaker 1: are any number of things that might go wrong there. 750 00:47:07,600 --> 00:47:10,400 Speaker 1: But if somebody takes away your insulin, that's the 751 00:47:10,680 --> 00:47:14,000 Speaker 1: end of the story. Yeah, I guess the more salient 752 00:47:14,080 --> 00:47:16,840 Speaker 1: point than the comparison is this: they're both acts of violence, 753 00:47:17,239 --> 00:47:19,600 Speaker 1: and in every way that's meaningful, I think they're both 754 00:47:19,600 --> 00:47:23,399 Speaker 1: acts of violence.
When I rail 755 00:47:23,719 --> 00:47:28,080 Speaker 1: against intellectual property as a concept, and intellectual property law, 756 00:47:28,719 --> 00:47:31,400 Speaker 1: the example that I give is: if somebody 757 00:47:31,400 --> 00:47:34,759 Speaker 1: were dying and you knew how to save them, would 758 00:47:34,800 --> 00:47:39,399 Speaker 1: you ever not tell them how, and just watch them die? Say, 759 00:47:39,719 --> 00:47:41,960 Speaker 1: oh no, that idea belongs to me, and I'm not 760 00:47:41,960 --> 00:47:45,960 Speaker 1: going to share unless you pay me? No human 761 00:47:46,120 --> 00:47:50,319 Speaker 1: being that I think I've ever heard of would do that. 762 00:47:51,080 --> 00:47:57,440 Speaker 1: And yet this happens every day, because we've sort of 763 00:47:57,520 --> 00:48:02,680 Speaker 1: carried these questions of copyright into patents, and despite the 764 00:48:02,680 --> 00:48:04,719 Speaker 1: fact that they are hundreds of years old and not 765 00:48:05,080 --> 00:48:10,279 Speaker 1: applicable anymore, assuming they were ever applicable, people just 766 00:48:10,400 --> 00:48:13,239 Speaker 1: die, because people say, oh, well, we can make more 767 00:48:13,239 --> 00:48:25,239 Speaker 1: money if we do it this way. There's a fascinating 768 00:48:26,280 --> 00:48:29,160 Speaker 1: thing going on there when you really drill into that idea, 769 00:48:29,160 --> 00:48:31,120 Speaker 1: because I suspect
there are a lot of people who 770 00:48:31,160 --> 00:48:36,520 Speaker 1: are integral in propping up this system, 771 00:48:36,560 --> 00:48:39,040 Speaker 1: both of kind of medical intellectual property and of just 772 00:48:39,120 --> 00:48:41,439 Speaker 1: like the pharmaceutical industry the way that it works, people 773 00:48:41,480 --> 00:48:44,239 Speaker 1: in politics, huge numbers of people who are integral in 774 00:48:44,760 --> 00:48:48,480 Speaker 1: some facet of keeping that going, who also, were they 775 00:48:48,520 --> 00:48:51,439 Speaker 1: to see an individual in immediate medical distress, would never 776 00:48:51,600 --> 00:48:55,160 Speaker 1: think of, like, trying to get their debit card number or 777 00:48:55,160 --> 00:48:58,640 Speaker 1: whatever, asking them; they would, without thinking, attempt to help, because 778 00:48:58,680 --> 00:49:00,800 Speaker 1: that's what people do. And I mean, this is 779 00:49:00,800 --> 00:49:02,440 Speaker 1: where we get into kind of some of these more 780 00:49:03,080 --> 00:49:07,640 Speaker 1: philosophical anarchist ideas about what hierarchy does and what these 781 00:49:07,680 --> 00:49:11,520 Speaker 1: structures do, because structures enable people to participate in evil 782 00:49:11,520 --> 00:49:15,200 Speaker 1: that they never would as an individual. Yeah, there are 783 00:49:15,239 --> 00:49:19,759 Speaker 1: many easy routes that pop up 784 00:49:20,440 --> 00:49:25,280 Speaker 1: that allow people, or force, I should say, force people 785 00:49:25,360 --> 00:49:29,040 Speaker 1: to be displaced from their humanity in that sort of 786 00:49:29,080 --> 00:49:33,080 Speaker 1: way, where yes, of course, you help somebody up 787 00:49:33,080 --> 00:49:36,640 Speaker 1: off of subway tracks if they've fallen. Yes, of course, 788 00:49:36,719 --> 00:49:40,239 Speaker 1: if somebody were drowning, you'd drag them out and save them.
789 00:49:40,640 --> 00:49:44,600 Speaker 1: And yet, just because it's a degree removed and it's 790 00:49:44,840 --> 00:49:49,560 Speaker 1: mediated by an agency, suddenly it's so easy to forget 791 00:49:49,680 --> 00:49:54,680 Speaker 1: and ignore and be sort of complicit. Yeah. And 792 00:49:54,719 --> 00:49:58,000 Speaker 1: just to go back around to what Four Thieves 793 00:49:58,080 --> 00:50:00,319 Speaker 1: is doing and what y'all are doing, it's one 794 00:50:00,360 --> 00:50:04,600 Speaker 1: of the few projects going on right now that fits 795 00:50:04,760 --> 00:50:08,800 Speaker 1: what my idealistic nineteen-year-old brain thought the Internet 796 00:50:08,840 --> 00:50:10,840 Speaker 1: would be, sixteen, fifteen years ago, 797 00:50:10,920 --> 00:50:14,000 Speaker 1: when things were newer and a little less... We were like, oh, 798 00:50:14,120 --> 00:50:16,120 Speaker 1: one of these days, this kind 799 00:50:16,120 --> 00:50:19,919 Speaker 1: of shit's gonna happen. And that is, I think, 800 00:50:19,960 --> 00:50:23,000 Speaker 1: I mean, that's not without value from, again, a 801 00:50:23,040 --> 00:50:26,600 Speaker 1: revolutionary perspective, the fact that it is pretty rad. 802 00:50:26,640 --> 00:50:30,200 Speaker 1: Well, I mean, I will not deny the fact 803 00:50:30,200 --> 00:50:33,160 Speaker 1: that it feels good, you know. I think that 804 00:50:34,880 --> 00:50:36,799 Speaker 1: we all grew up with that sort 805 00:50:36,800 --> 00:50:42,000 Speaker 1: of hope and belief that we were gonna open these 806 00:50:42,120 --> 00:50:44,040 Speaker 1: new doors, and there were going to be these new 807 00:50:44,080 --> 00:50:48,080 Speaker 1: possibilities, and things that we had been reading about in 808 00:50:48,120 --> 00:50:52,400 Speaker 1: science fiction
were going to become real. And there's 809 00:50:52,680 --> 00:50:57,640 Speaker 1: a great satisfaction in not just witnessing your childhood 810 00:50:57,760 --> 00:51:00,600 Speaker 1: dreams become realities, but actually, you know, having a hand 811 00:51:00,640 --> 00:51:04,600 Speaker 1: in it. There's something quite satisfying about that, 812 00:51:04,640 --> 00:51:08,879 Speaker 1: I will admit. Well, I think that's 813 00:51:08,920 --> 00:51:11,640 Speaker 1: a pretty good point to close out on today. I 814 00:51:11,640 --> 00:51:13,600 Speaker 1: don't need to take up too much more of your 815 00:51:13,600 --> 00:51:16,360 Speaker 1: time right now, Michael. But as I told people, 816 00:51:16,400 --> 00:51:19,160 Speaker 1: I'm gonna be trying to get into 817 00:51:19,200 --> 00:51:21,799 Speaker 1: some of this, because I find it just both fascinating 818 00:51:21,840 --> 00:51:27,680 Speaker 1: and incredibly hopeful. In a world where it seems like 819 00:51:27,719 --> 00:51:31,120 Speaker 1: there are constantly forces conspiring to strip people of their 820 00:51:31,160 --> 00:51:34,240 Speaker 1: ability to take control of critical aspects of their lives, 821 00:51:34,680 --> 00:51:37,560 Speaker 1: you and your colleagues in this are trying to 822 00:51:37,600 --> 00:51:43,680 Speaker 1: give people opportunities to take some power back for themselves, 823 00:51:43,760 --> 00:51:47,960 Speaker 1: and I just think that's pretty dope. Thank you 824 00:51:48,080 --> 00:51:51,080 Speaker 1: so much. Yeah, and to your listeners:
If there are 825 00:51:51,080 --> 00:51:52,920 Speaker 1: people out there who like what we're doing and you 826 00:51:52,920 --> 00:51:57,239 Speaker 1: want to support the project, please go find somebody who 827 00:51:57,320 --> 00:52:00,279 Speaker 1: needs your help but doesn't deserve it, and then help 828 00:52:00,320 --> 00:52:05,360 Speaker 1: them anyway. Yeah. Yeah, that's always a good thing to do. Michael, 829 00:52:05,400 --> 00:52:08,200 Speaker 1: anything else you want to 830 00:52:08,280 --> 00:52:10,320 Speaker 1: kind of plug? This is normally the section where people 831 00:52:10,320 --> 00:52:13,239 Speaker 1: plug websites or projects or anything. You got anything in 832 00:52:13,239 --> 00:52:17,719 Speaker 1: particular you want to throw out there right now? Uh, sure. 833 00:52:17,400 --> 00:52:20,200 Speaker 1: We're hoping to do a bunch of big releases in 834 00:52:20,239 --> 00:52:23,799 Speaker 1: the summer, so look for those. In the meantime, 835 00:52:24,960 --> 00:52:28,759 Speaker 1: we're always looking for help. So if you're out there 836 00:52:29,040 --> 00:52:32,879 Speaker 1: and you'd like to assist with the project, 837 00:52:33,040 --> 00:52:35,560 Speaker 1: please get in touch. There's a contact us page on 838 00:52:35,600 --> 00:52:38,560 Speaker 1: the website. And by the way, you do not have 839 00:52:38,640 --> 00:52:41,480 Speaker 1: to be a technical person. Currently, we're 840 00:52:41,480 --> 00:52:44,480 Speaker 1: looking for writers. We have a lot of documentation that 841 00:52:44,520 --> 00:52:46,919 Speaker 1: we need to do. So if you're out there and 842 00:52:47,040 --> 00:52:53,799 Speaker 1: you have, you know, a background in language, that 843 00:52:53,800 --> 00:52:56,560 Speaker 1: would be great. And if you're somebody who feels that 844 00:52:56,600 --> 00:53:00,160 Speaker 1: you're entirely without skills, please get in touch.
We have 845 00:53:00,239 --> 00:53:03,239 Speaker 1: any number of endless small tasks that just need to 846 00:53:03,239 --> 00:53:05,640 Speaker 1: be taken care of, because we don't have enough people. 847 00:53:06,280 --> 00:53:09,400 Speaker 1: So if you'd like to participate, we'd love to have you. 848 00:53:09,640 --> 00:53:13,640 Speaker 1: Please get in touch, and in the meantime, keep each 849 00:53:13,680 --> 00:53:17,320 Speaker 1: other healthy, keep each other... Thank you so much, Michael. 850 00:53:18,000 --> 00:53:23,520 Speaker 1: Thanks so much for having me. It Could Happen Here 851 00:53:23,520 --> 00:53:26,200 Speaker 1: is a production of Cool Zone Media. For more podcasts 852 00:53:26,200 --> 00:53:28,800 Speaker 1: from Cool Zone Media, visit our website, coolzonemedia 853 00:53:28,880 --> 00:53:30,680 Speaker 1: dot com, or check us out on the iHeart 854 00:53:30,719 --> 00:53:33,799 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to podcasts. 855 00:53:34,320 --> 00:53:36,440 Speaker 1: You can find sources for It Could Happen Here, updated 856 00:53:36,520 --> 00:53:39,959 Speaker 1: monthly, at cool zone media dot com slash sources. Thanks 857 00:53:40,000 --> 00:53:40,560 Speaker 1: for listening.