Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? Yesterday, which was October 4, 2022, for those of you listening from the future, I finally got around to watching the Folding Ideas YouTube video titled "Line Goes Up: The Problem with NFTs," and I cannot recommend that video enough. It is a phenomenal piece of work, and it's substantial. It is longer than two hours. I think it's like two hours and eighteen minutes of information about NFTs, cryptocurrency, blockchain, there's a bit about Web3 in there, and more. Host Dan Olson presents a meticulously researched and well-thought-out takedown of all of these things. And I imagine he could, and maybe will, do something similar with metaverse concepts, which tie into a lot of these concepts as well. And I think it's really darn amazing when you consider that his channel is supposed to be about, quote, "deconstructing the craft of visual narrative," end quote, and a lot of his videos are about things like how editing can make or break a movie, and how certain narratives become hopelessly muddled because of bad editing, and that sort of stuff. Really interesting to see him branch out into this. He's using his incredible ability to put together substantial arguments to focus on things beyond narrative storytelling, and those videos, by the way, are also great. I've learned a lot by watching his videos. Anyway, Dan, I would say, is even more opposed to cryptocurrencies and NFTs than I am. And if you are a frequent listener of TechStuff, you already know that I'm not exactly a hype man for NFTs and cryptocurrency. Now, I couldn't possibly cover the material more thoroughly than he has, so I'm not gonna go through and regurgitate the points he made.
He makes them very well, and like I said, the video is fantastic. If you are at all interested in learning more about cryptocurrencies, blockchain, NFTs, all that kind of stuff, it's well worth a watch: "Line Goes Up: The Problem with NFTs." So instead, what I want to talk about is related, and it's something that he touches on in the video, which is critical thinking within the context of tech evangelism in general.

So let's start off with the origin of the word evangelism, because that's a tricky one already. It's rooted in religion. Specifically, it's rooted in Christianity. The concept of evangelism is to spread the Gospel, with the end goal of this effort being to bring more people into the fold of the religion. Now, in more recent times, some folks use the word evangelist to describe an enthusiast who speaks about, or writes about, or, you know, whatever, communicates in some way their chosen focus of devotion in an effort to get more people into it. So you could be an evangelist for lots of different stuff, not just religion, although that again is where the word has its origin. You could be an evangelist for a sports team and try and get more people into watching a sport you like. Now, to be clear, being an evangelist isn't necessarily a bad thing. I'm not trying to suggest that being enthusiastic and communicative about stuff is a bad thing. If it were, I wouldn't be the host of a podcast, because that's essentially what podcasting is. You could be an evangelist for a cause. For example, you could be spending time and effort to get more support for a worthy cause, and you might believe wholeheartedly in it, and any support you help build contributes to that cause. That's a good one. But being an evangelist can definitely be a bad thing as well.
The tools that someone uses to attract people to support a good cause are the same tools that can be used to attract people to supporting something less altruistic, or even predatory. Scam artists depend upon this. They use an evangelical approach pretty regularly, although they frequently target things like people's greed or fear in order to do it. And even folks who are not scam artists, who might actually believe in something because of their own investment, may use these tools to get other people on board, probably because they've already invested so much into this particular technology.

So crypto is a good example of this. Now, I will not go so far as to say that crypto is a scam. Dan Olson might go that far, and, you know, he makes a very strong argument, but I would say that cryptocurrency is just not a very good system. It's definitely not a good system for replacing currency. It does not do a good job of that. I think that's pretty clear. It's really not that good of a system for building wealth, either, unless you were wealthy to begin with, or you just happened to get in on the ground floor, back when there was no expectation for it to go crazy, and you just happened to make a ton of money that way. But for most people, it's not a good way to build wealth. But if you wander into crypto forums, you're gonna see a lot of folks championing the philosophy of cryptocurrency in general and then debating with one another which cryptocurrencies are great and which ones are garbage. And boy howdy, there's an awful lot of disagreement on that topic. I bet you can guess what the defining factor is: whichever cryptocurrency you've invested in, that one gosh darn well better be a great one. And that's the thing, right? A big driver of cryptocurrency value is demand for that cryptocurrency. If there's greater demand for the currency, then you typically see the currency's price go up.
And it's a little more complicated and sophisticated than that, but that's kind of at the heart of things, at least for cryptocurrencies that are not directly tied to some fiat currency. So if folks get scared about a cryptocurrency, demand typically goes down and the value will go down too. And the more scared they get, the more it goes down. It becomes kind of a self-fulfilling prophecy, and it works in both directions, not just when things go down.

So let's take a scenario. Okay, let's say that you have sunk tens of thousands of dollars, your life's savings, into a particular cryptocurrency. And let's say that you bought into this cryptocurrency when each unit of the currency was like five bucks, so five real-world dollars to one of these cryptocurrency units. Now, when this unit first launched, before you got into it, its value was at fractions of a penny per unit. So if someone got into it initially, they would have seen their investment grow to incredible amounts by the time it got to five dollars. So if you had bought it when it first launched, you would be wealthy beyond measure. But you didn't. You're in at five dollars; in fact, that's when you bought in. Now we're a little further along, and let's say that right now the value has actually dropped down to three dollars. You hope this is just a momentary dip and that the cryptocurrency is going to increase in value again, maybe increasing at that exponential speed it did when it first got huge, but there's no way to really know if that's going to happen. Well, you would have a pretty strong incentive to talk positively about your investment. For one thing, there's the dreaded sunk cost fallacy. This is the tendency to stick with a strategy because you've already heavily invested in that strategy, and you stick with it even if it becomes obvious that you would be better off if you jumped ship.
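Just to put rough numbers on that scenario, here's a quick back-of-the-envelope sketch in Python. The dollar amounts are the made-up figures from this hypothetical, and the one percent exit fee is an assumption, not real market data.

```python
# Hypothetical figures from the scenario above; the 1% exit fee is an assumption.
invested_dollars = 10_000   # what you put in
buy_price = 5.00            # dollars per unit when you bought in
current_price = 3.00        # dollars per unit now
exit_fee_rate = 0.01        # assumed 1% transaction fee to cash out

units = invested_dollars / buy_price
paper_value = units * current_price
exit_value = paper_value * (1 - exit_fee_rate)
loss_pct = (invested_dollars - paper_value) / invested_dollars * 100

print(f"Units held: {units:,.0f}")
print(f"Paper value now: ${paper_value:,.2f} ({loss_pct:.0f}% below what you paid)")
print(f"Walk-away value after fees: ${exit_value:,.2f}")
```

Run that and you get the forty percent drop I'm about to describe: the ten thousand dollars is now worth six thousand on paper, and a little less than that once you pay to get out.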
By the way, while we talk about sunk costs a lot in financial matters, I'd argue the same thing is true with, like, emotional investment. If you've ever been in a relationship, whether it's a romantic relationship or a friendship, where you knew, you kind of knew, you would both be better off if you split up, but you'd already invested so much emotional capital into the relationship that you couldn't bring yourself to do it, that's very similar. That's essentially the sunk cost fallacy.

So if you invested tens of thousands of dollars into this cryptocurrency and then the cryptocurrency drops in value by forty percent, well, that means you've lost a big chunk of your investment, right? Your investment is down. And while you could get out with the sixty percent remaining, minus any transaction fees, that doesn't really feel good, right? It doesn't feel good to get out of the system with significantly less money than when you went into it, especially when you know that folks who jumped onto this cryptocurrency wagon before you did made out like bandits. People who bought it when it was fractions of a penny per unit made hundreds of thousands of dollars. Don't you deserve that money too? Don't you deserve to be a bandit? Huh, Smokey? So that's one incentive: you have invested this money, so you have an incentive to stick with it and try to get it to turn around. Maybe you actually, really believe in this cryptocurrency. Maybe it's not just a "here's an opportunity, I'm going to jump on it," but that you actually believe in whatever the cryptocurrency stands for. There are people who do that, but there are a lot more who are just kind of opportunistic. So you might start to feel sick because you've lost so much money on your investment. Maybe you go into the crypto forums and you're really cheerleading the currency.
You're trying to get other people excited about it, and if others do get excited, they might buy into the currency, and maybe that will drive up the value, which means those lost investment dollars would start to filter back. If you could build enough momentum, you might push the value further than the five bucks it was at when you bought it, and then it's to the moon, baby. You'll be rich in no time.

But here's the thing. This kind of operation is frighteningly similar to scams like pyramid schemes or multi-level marketing schemes. Now, I use the word scams, and I use the word schemes, because in my mind that's kind of what they are. But MLM businesses, technically, if they're actually selling a real thing, are viewed as a legitimate form of business. I just find them really similar to scams, to the point where I'm very uncomfortable with them. Now, if you aren't familiar with what MLM schemes are, I'll fill you in. Essentially, you make money in one of two ways, or in both ways. One is that you get recruited into an MLM. Someone convinces you that you should join this business, and then it becomes your job to sell something to people. Let's say it's cosmetics, because a lot of MLM companies are in the cosmetics business. So you start to sell cosmetics to folks, maybe mostly through word of mouth, like selling it to friends and family, and you get a portion of each sale you make. Now, you might actually have to spend your own money to get your supply, though obviously you're buying supplies at a lower cost than what you sell them for to other folks. So that's method number one of how to make money. Method number two is that you recruit other people to join the company. You essentially become their sponsor. So you are now sitting upstream of those recruits, who are downstream from you, and now you get a little piece of every sale they make.
When they make a sale to someone else, you get a little portion of that sale. Upstream from you is your sponsor, the person who recruited you into the MLM. They get a little piece of every sale you make. And so on and so forth, until you get to the very top of the chain, where a precious few are raking in lots of these little slices of the pie from all the recruits who are downstream. These are the people who are actually getting wealthy off this, because they're getting a cut of every transaction, and even if that cut is teeny tiny, it adds up as the organization grows. And that means that for the folks at the very top, there's this huge incentive to push for more recruits. Now, you do still need to sell stuff, or at least have recruits buying supplies from you, otherwise you don't have revenue coming in. But the more active people you have in the system, the more money you can make. So a huge part of your business strategy comes in the form of recruitment strategies.

Similarly, with crypto, folks who have large holdings in the currency would love for more people to come on board, because it can boost the value of the individual unit, and it can make their own substantial investment grow significantly. It's also not nearly as easy to extract yourself from a cryptocurrency market as you might think. That means that once you've bought in, you might not feel like getting out is a good risk. You might feel like, if I get out now, I'm gonna lose out on future growth. Or, I might have to spend so much in transaction fees that I'm really cutting into whatever profit I made. Or, with certain cryptocurrencies like Bitcoin, the transaction process takes long enough that it's possible you'll actually see a significant loss of wealth from the moment you start a transaction to the moment it finally concludes.
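Going back to that MLM structure for a second, here's a tiny, deliberately simplified sketch of the upstream-cut arrangement I just described. The names, prices, and the five percent sponsor cut are all made up; real compensation plans are far more convoluted.

```python
# Toy model of the MLM structure described above. Every name, price, and rate
# here is invented purely for illustration.
UPSTREAM_CUT = 0.05  # assumed 5% of each sale passed to every sponsor upstream

# Who recruited whom (None marks the top of the chain).
sponsors = {
    "founder": None,
    "alice": "founder", "bob": "founder",
    "carol": "alice", "dave": "alice", "erin": "bob", "frank": "bob",
}
earnings = {name: 0.0 for name in sponsors}

def record_sale(seller, amount):
    """Credit the seller's gross sales, then pass a cut to everyone upstream."""
    earnings[seller] += amount
    sponsor = sponsors[seller]
    while sponsor is not None:
        earnings[sponsor] += amount * UPSTREAM_CUT
        sponsor = sponsors[sponsor]

# Only the newest recruits do the actual selling: fifty $20 sales each.
for seller in ("carol", "dave", "erin", "frank"):
    for _ in range(50):
        record_sale(seller, 20.0)

for name, total in earnings.items():
    print(f"{name:>8}: ${total:,.2f}")
```

The sellers' totals are gross sales before they pay for their own supply; the founder's total is pure upside, and it grows with every recruit added downstream, which is exactly the incentive I was just talking about.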
Okay, we've got some more to talk about when it comes to critical thinking and tech evangelism, but first let's take a quick break.

All right, we're back. Now, one of the things that you'll hear in cryptocurrency forums, whenever a given cryptocurrency is having a rough go of it, is you'll have a group of folks telling everyone to HODL, or H-O-D-L, hold on for dear life. Essentially, what they are saying is, hey, don't dump your coins now, just hunker down. You can wait this out. Things will be fine, they'll get better, and then, to the moon. And that sounds like good advice, right? But it's really reinforcing that sunk cost fallacy. There's no guarantee that things will get better. Maybe they will. Maybe they'll get better than they've ever been before. We've seen that with Bitcoin several times, where the value grew to incredible heights, crashed, and then grew even more. If it does that again, then that means we're gonna see Bitcoin go over sixty thousand dollars per unit, if it repeats that cycle. The problem is, there's no guarantee that cycle will repeat. There's nothing inherent about cryptocurrency that makes it reliable for this sort of cycle. It could happen, but maybe it won't, and so that sunk cost fallacy really starts to weigh on you. A lot of people will give this advice because they have their own significant investment in that virtual coin, so they don't want to see people jumping ship, because every time that happens, there's the potential for the coin's value to decline even further, and that would hurt their own investment. So they have an incentive to keep people on that currency and try to convince more people to join in.

There's also the phrase "buy the dip," meaning when cryptocurrencies go low, that's when people should be buying more, because that's when they're going to see the biggest return when the cryptocurrency recovers its value.
If it recovers its value, that is. Then yeah, sure. And it may be that, you know, again, you bought it at five dollars, it's now at three dollars, so you're telling people, hey, buy now, because it's gonna go up. You're kind of trying to create a self-fulfilling prophecy to make that happen.

So my point here is, it's always good advice to think critically about tech evangelist claims. And I know I've picked on cryptocurrency a lot so far in this episode, but it's really true for pretty much all technologies, particularly ones that are early on in the hype cycle. That might be the metaverse. In fact, I would say the metaverse is a really good one to think critically about. VR, AR, both of those are good to think critically about. I'm not saying they're bad technologies, but that you have to critically evaluate claims about the technologies. The same is true for AI, or really any given tech product. Now, I am not saying that tech is worthless or anything close to that. I'm saying that you need to examine the motivations of tech evangelists to understand where they're coming from, what is incentivizing them to evangelize this particular technology, and to keep that in mind when you start to evaluate their claims. It may turn out that their claims are perfectly valid, but it may turn out that they're somewhat biased, or really biased. So if someone comes up to you and says, hey, the metaverse is gonna be huge, look at how many companies are getting into the metaverse, well, that's true. There are a ton of companies that are all trying to get into something related to the metaverse. But you might want to ask questions like, how is the person who's telling you this connected to the metaverse? What is their investment in it? Maybe they're just an enthusiast, so it's an emotional investment, not a financial investment, and it might just be that this is a person who's been swept up in a lot of hype.
Honestly, that's very easy to have happen. I've had it happen to me lots of times. I am not immune to it. I think I'm getting slightly better at recognizing it early on and then adjusting expectations, but I'm not perfect by any means. Or maybe it's that this person runs a virtual real estate business, and they're kind of hoping you're gonna plop down some real-world cash to buy some virtual plots of land that may or may not ever be connected to a notable virtual landscape in the metaverse, assuming we ever get something like that. Then, you know, maybe it would pay off in the long run, but there's no guarantee. And again, we're still in a space where metaverse doesn't actually mean anything yet. There are too many conflicting ideas and definitions and approaches, so buying into something specific right now is really taking a long shot. Like, if you're gonna do that, you might as well just be ready to say goodbye to that money. Maybe it pays off, which would be fantastic, but there's a real good chance it won't. So, just like if you go into a casino, where the house always wins in the end, you need to just sit there and say, all right, this is money I am comfortable losing. I am not going to suffer if this money is gone forever. I might as well just assume that it is gone forever, and if it turns out otherwise, that will be a huge bonus.

Or maybe a tech evangelist might tout a particularly interesting aspect of a technology, but they might fail to mention the whole story, or maybe they don't even know the whole story. For this, I'm gonna touch on something that Dan Olson talks about in that video I mentioned at the top of the episode. With NFTs, there's one particular benefit you'll hear about a lot.
This is a benefit that's often marketed toward artists: the person who mints an NFT, which again stands for non-fungible token, can get a piece of every future transaction involving that NFT. So let's say you are an artist and you create a work of digital art, and you mint an NFT to represent this piece of art, which, by the way, exists independently of the NFT. The NFT itself isn't the art. It's a smart contract that represents the ownership of that art, and by ownership I mean of that instance of that art. It gets really granular. So you sell this NFT to someone for ten whole dollars. They now have a token that says they own that instance of your digital art. Good. But then let's say that there is this crazy NFT boom, right? People just go nuts speculating on NFTs. You've already sold your NFT for ten dollars. The person you sold it to then goes on and resells that NFT, but they sell it for ten thousand dollars. In the real world, if this were to happen, if you were an artist and you sold your painting for ten bucks, and then later on the person you sold it to was able to offload it for ten grand, you probably wouldn't see any of that money, right? Because you already sold the painting. You're out of luck. The best you can hope for is that there would be a jump in interest in your work, and that future sales you make would net you more revenue. But with certain NFT marketplaces, like whichever marketplace you initially minted the NFT in, the creator can get a portion, a royalty, as it were, of each future sale of that token. So let's say you sell your NFT to buyer one. Buyer one sells to buyer two, and some of that sale price will actually go to you, the person who minted the token in the first place.
Buyer two sells the same NFT to buyer three, and again, you, who minted the token, get a little cut of that. It's ongoing passive revenue, which is the dream. You hear about that a lot online: ways of creating ongoing passive revenue. That, by the way, is a phrase that should always raise red flags and prompt you to think critically about what is being sold to you. But that's another topic for another time.

Here's the thing about the NFT model. Because, I mean, that part is true: if you have this NFT in this marketplace, yeah, you can keep on getting little cuts of each sale. Except it only applies to whichever NFT marketplace you minted the token in. The royalty system isn't built into the NFTs themselves. It's built into the marketplaces where the NFTs are bought and sold. So nothing stops someone from taking an NFT out of one marketplace and then putting it up for sale on another marketplace. So if buyer one purchases the NFT for ten bucks from you, but then they move the NFT to a different marketplace and they sell it for ten thousand dollars, well, then you're not getting any passive revenue, because it's no longer on the marketplace where you minted it. And the royalty system doesn't work across marketplaces yet. I say yet, because it potentially could in the future. It just doesn't now. You might think, okay, but how often does that scenario pop up? I mean, usually it's just going to stay in the same marketplace, right? Well, there's another issue at play here, and that's transaction fees. This is how marketplaces make revenue. They put a transaction fee on any transaction that happens within that marketplace. It's the same sort of thing as credit card transaction fees, right?
This means that someone has to pay a little bit extra for the transaction to actually happen, so when buyer two buys from buyer one, this transaction fee also has to be covered. So a lot of buyers would prefer to do private transactions directly between digital wallets and just avoid marketplaces entirely. That way the transaction is just the actual transaction; there's no fee on top of it. You don't have to spend extra money just to cover a fee. And when you switch to private transactions, the person who minted the token is left out of the whole process. No royalties, in other words. So while the royalty idea is great, and it addresses an issue that digital creators encounter all the time, the execution is actually not that great, and it doesn't really solve the problem. But it sure does come across as a cool selling point for NFTs. And if you're someone who is dependent upon making money by selling NFTs, or by driving up the value of NFTs so that your own purchases aren't a sunk cost, you are likely to talk up this feature even if it doesn't, you know, work so well. Now, all that being said, there are folks who are constantly working on evolving the smart contracts within NFTs, so it might be possible that one day this will be fixed, it will be addressed, the royalty system will be more robust, and NFTs will fulfill the promise that people are making right now. It's just that right now, they don't do that, right? Not in every case, not as long as anyone can take something off of one marketplace and put it on another. So I would say, don't count on the problem being addressed right away, and keep that in mind, so that you don't just fall into a system that doesn't work as well as what people are promising.
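Here's a small sketch of that distinction, just to make it concrete. The marketplace behavior, the ten percent royalty, and the two and a half percent fee are all invented numbers; the point is simply that the royalty logic lives in the marketplace's own sale process, so a wallet-to-wallet sale never triggers it.

```python
# Toy illustration: the creator royalty is enforced by marketplace code,
# not by the token itself. All rates and prices here are invented.
ROYALTY_RATE = 0.10      # assumed 10% creator royalty on the minting marketplace
MARKETPLACE_FEE = 0.025  # assumed 2.5% marketplace transaction fee

def marketplace_resale(price):
    """A resale inside the minting marketplace: creator and marketplace take cuts."""
    royalty = price * ROYALTY_RATE
    fee = price * MARKETPLACE_FEE
    seller_gets = price - royalty - fee
    return royalty, seller_gets

def private_resale(price):
    """A wallet-to-wallet resale outside any marketplace: no royalty, no fee."""
    return 0.0, price

for label, sale in (("In-marketplace resale", marketplace_resale),
                    ("Private wallet-to-wallet resale", private_resale)):
    creator_cut, seller_cut = sale(10_000)
    print(f"{label}: creator gets ${creator_cut:,.2f}, reseller keeps ${seller_cut:,.2f}")
```

Same ten-thousand-dollar resale both times; the creator sees a thousand dollars in one case and nothing in the other.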
All right, let's talk about a different scenario in which critical thinking comes in really handy, and this is one that I totally failed at. Ultimately, I did come around, obviously, because I'm talking about it now, so I have awareness about it. But initially, I did not consider the big picture properly. I did not think critically when it came to this topic. And I'm talking about autonomous cars. Around a decade ago, I was really, really hyped for autonomous cars. I was convinced it was right around the corner: we were going to have driverless cars, that no one would ever have to drive again, that traffic accidents would become a thing of the past, that we would have these incredibly efficient systems all across the world. And when I was thinking about the discrete components, I could see why I was all swept up in the idea. But over time the veil fell from my eyes, and I realized that this is a much harder challenge than I gave it credit for when I was first thinking about it. All right, I'll talk about that whole process after we come back from this quick break.

Okay, let's talk about how I got swept up in the hype of autonomous cars. My thinking was, well, of course the computer would be better at driving a car than a person would. I was thinking about driving a car strictly in the sense of reaction times and perception. Right? A person has limited perception. You can only see a certain field of view, and that's just whichever direction you happen to be facing. You can only hear so much, right? There's an entire world around you that you cannot directly perceive all the time. Anything that's outside of your peripheral vision, you can't directly see, and you have to rely on things like mirrors to compensate for that when you're driving. That's why you have side mirrors and rearview mirrors. You use the mirrors to help compensate for the fact that you don't have eyes in the back of your head.
Even if you are a mom. I'm onto you, moms. I know you don't have eyes in the back of your head. Now, these mirrors have blind spots, right? Just because of the way the car is designed. So even with the mirrors, and even checking mirrors frequently and being alert, you're going to have flawed perception. You cannot have perfect perception. It's just impossible. You can be really, really careful, but even the most careful driver is going to have blind spots. But a car with a sophisticated set of sensors? Well, in my mind I was like, well, it could have 360-degree vision all around the vehicle at all times. In fact, it could even have vision above the vehicle, well beyond what we humans can do, and that would be phenomenal. I mean, the car would be able to detect every possible obstacle all around it, constantly. Then, when it comes to reaction times, you know, we have a problem with latency between perceiving something and being able to act on it. It all depends upon the sensory input we get. If it's by touch, we actually are able to react much faster than if it's visual. I think by touch it's like 150 milliseconds. Visual stimuli, it's like 250 milliseconds, a quarter of a second. Now, a quarter of a second is fast, don't get me wrong. But if you happen to be traveling in a vehicle that's going seventy miles per hour, then a quarter-second lag between seeing something and being able to react can potentially mean the difference between avoiding an accident or being in one. Computers, obviously, can react much more quickly than humans can. So of course, in my mind, an autonomous car would be better at avoiding accidents than humans. Right? That just makes sense. They're able to see everything we can't. They're able to react faster than we can.
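To put that quarter of a second in perspective, here's the quick arithmetic in Python, using the rough reaction times I just mentioned rather than precise measurements:

```python
# Roughly how far a car travels during a driver's reaction lag.
# Reaction times are the approximate figures mentioned above.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_mph = 70
visual_reaction_s = 0.25   # ~250 ms for a visual stimulus
touch_reaction_s = 0.15    # ~150 ms for a touch stimulus

speed_fps = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR   # ~102.7 feet per second
print(f"At {speed_mph} mph you cover {speed_fps:.1f} feet every second.")
print(f"Visual lag (~250 ms): {speed_fps * visual_reaction_s:.1f} feet before you even begin to react.")
print(f"Touch lag (~150 ms):  {speed_fps * touch_reaction_s:.1f} feet.")
```

That's over 25 feet of travel at highway speed before a human driver even starts to respond to something they see, which is why the raw numbers made computer control look like such an obvious win to me.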
That was my thinking. But I didn't take into consideration that the decision-making process is a really, really complicated one, and that perception is more than just seeing. Perception is more than just "I recognize there's something in front of me." It's recognizing the nature of that thing in front of you and then being able to decide what to do about it. So we humans, we can make decisions really quickly. We're good at that, generally speaking. It's not always the right decision, but we're really good at making them. We're also pretty good at adapting to new situations. We can draw on experience from situations that might only be remotely similar, and use that to guide us to a course of action. We're good at discerning the difference between a threat, such as, say, a piece of a car's bumper lying across a lane of traffic, and something that's not a threat, like some leaves on the road. But computers do not have this native capability, and so one huge challenge with autonomous cars is training systems to recognize situations where emergency measures should be taken versus ones where things should just proceed as normal. If a car applies the brakes suddenly because it thinks that a floating plastic bag ahead of it is a solid obstacle, the car might cause an accident because it stops short when it shouldn't. Similarly, if a car fails to detect that a semi truck is crossing lanes of traffic, tragedy can strike. We know this because we've got examples of autonomous vehicles, or at least vehicles operating in advanced driver-assist modes, where the vehicle failed to detect a semi truck that was crossing lanes of traffic, possibly because the system misinterpreted the side of the truck as the horizon, and as a result, the car crashed into the truck and the driver in the car died.
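As a loose illustration of why that trade-off is so hard, here's a deliberately naive sketch. Real perception stacks are nothing like this, and every object, score, and threshold below is invented; the point is just that a single "should I brake?" threshold trades phantom braking against missed obstacles.

```python
# Deliberately naive sketch of the "brake or not?" decision described above.
# Every object, score, and threshold is invented purely for illustration.
detections = [
    {"label": "plastic bag",   "solidity_score": 0.30, "actually_solid": False},
    {"label": "bumper debris", "solidity_score": 0.55, "actually_solid": True},
    {"label": "semi trailer",  "solidity_score": 0.45, "actually_solid": True},  # say, mistaken for sky/horizon
]

def should_brake(obj, threshold):
    return obj["solidity_score"] >= threshold

for threshold in (0.25, 0.50):
    print(f"\nBrake threshold = {threshold}")
    for obj in detections:
        braking = should_brake(obj, threshold)
        if braking and not obj["actually_solid"]:
            outcome = "phantom braking (stops short for nothing)"
        elif not braking and obj["actually_solid"]:
            outcome = "missed obstacle (potential crash)"
        else:
            outcome = "correct call"
        print(f"  {obj['label']:<14} brake={braking!s:<5} -> {outcome}")
```

Lower the threshold and the car stops for plastic bags; raise it and it sails past the truck. Real systems are vastly more sophisticated than a single score, but the underlying tension is the same.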
And as time has gone on, my initial enthusiasm for driverless cars has been tempered by the reality of how complicated this challenge really is. I've learned to be a bit more skeptical about autonomous car systems. And it's not that I think we aren't going to get there. I think that we will, eventually. But I now realize that my initial perception of the nature of the challenge was way too narrow in scope. I was being naive. I was thinking of the typical situations that you might find yourself in when you're driving down your average road. But the truth of the matter is that driving conditions vary greatly from place to place and at different times of year, and unusual events can happen at any time. Now, that unusual event might be something really dangerous, like mudslides or rockslides, that kind of thing, or it might be something more benign, like some leaves being blown across the road. But while we humans can interpret those things very quickly and then react appropriately, computer systems don't magically have that ability. That was just something I did not consider. Now, I definitely use critical thinking when I consider driverless cars today, and I recognize that there are some really sophisticated systems out there that, within restrictive parameters, work incredibly well. But they still need those restrictive parameters. They're not capable of operating in every region, in every condition, with incredible accuracy and reliability. But back in those early days, I was really a tech evangelist. I loved the idea of driverless cars virtually eliminating car accidents. I mean, that means tens of thousands of people would not die from traffic accidents each year if we had reliable driverless cars. Hundreds of thousands of people would not be affected by the sudden loss of a loved one, and then the benefits ripple out in ways we can't even imagine: those people continue to be able to contribute to society.
We don't have 576 00:35:34,200 --> 00:35:41,360 Speaker 1: these, uh, financial hits that come with the 577 00:35:41,880 --> 00:35:44,120 Speaker 1: fact that we have these traffic accidents. You have all 578 00:35:44,120 --> 00:35:48,279 Speaker 1: these different things that are great if you're able 579 00:35:48,320 --> 00:35:51,040 Speaker 1: to eliminate accidents. So of course you would want that 580 00:35:51,120 --> 00:35:55,120 Speaker 1: kind of future. I mean, that's a future that 581 00:35:55,440 --> 00:35:59,600 Speaker 1: I think everybody would long for. But as beautiful as 582 00:35:59,640 --> 00:36:02,160 Speaker 1: the potential is, that doesn't necessarily make it any 583 00:36:02,200 --> 00:36:05,200 Speaker 1: closer to reality. Just because we want something to be 584 00:36:05,280 --> 00:36:09,240 Speaker 1: true doesn't mean it is true, or that it's even 585 00:36:09,280 --> 00:36:12,800 Speaker 1: close to becoming true. In fact, that's when we really 586 00:36:12,840 --> 00:36:16,160 Speaker 1: need to use critical thinking to make sure that the 587 00:36:16,200 --> 00:36:20,120 Speaker 1: thing we want is in fact possible and not just 588 00:36:20,239 --> 00:36:25,719 Speaker 1: some diversion. Like, just because you want that cryptocurrency to 589 00:36:25,719 --> 00:36:27,560 Speaker 1: go up doesn't mean it's going to go up. Just 590 00:36:27,600 --> 00:36:32,040 Speaker 1: because you want to buy an n f t and 591 00:36:32,040 --> 00:36:33,759 Speaker 1: make a huge amount of money doesn't mean that's 592 00:36:33,760 --> 00:36:38,480 Speaker 1: gonna work either. Just because you want a web system 593 00:36:38,520 --> 00:36:43,080 Speaker 1: that is divorced from the massive companies that currently dominate 594 00:36:43,120 --> 00:36:49,279 Speaker 1: the web, Google, Facebook, Amazon, Microsoft, Apple, doesn't mean that 595 00:36:49,360 --> 00:36:55,759 Speaker 1: the new system won't have its own pillars of centralized authority. Right. 596 00:36:55,920 --> 00:36:58,680 Speaker 1: Just because you change one thing doesn't mean that it's 597 00:36:58,719 --> 00:37:04,480 Speaker 1: actually better. It may just be different. Anyway, if you're 598 00:37:04,480 --> 00:37:07,239 Speaker 1: someone who loves tech and gadgets and whatnot, I think 599 00:37:07,239 --> 00:37:10,440 Speaker 1: that's awesome. I mean, I still do love tech. I 600 00:37:10,520 --> 00:37:14,879 Speaker 1: love gadgets. I love learning about technological systems and how 601 00:37:14,920 --> 00:37:17,760 Speaker 1: they work and sometimes how they don't work in quirky ways. 602 00:37:18,360 --> 00:37:20,520 Speaker 1: I don't think we should just give up on tech. 603 00:37:20,880 --> 00:37:24,480 Speaker 1: I don't think we should be unenthusiastic about technology. I 604 00:37:24,520 --> 00:37:28,000 Speaker 1: don't think we should just automatically shut down if we 605 00:37:28,080 --> 00:37:31,680 Speaker 1: hear someone talk a technology up. I don't think that's, 606 00:37:31,960 --> 00:37:34,160 Speaker 1: you know, the right course of action either. If someone 607 00:37:34,200 --> 00:37:36,960 Speaker 1: comes up and says, hey, I really love this new phone, 608 00:37:37,360 --> 00:37:40,319 Speaker 1: you don't just automatically say, ah, shill, and turn 609 00:37:40,360 --> 00:37:43,440 Speaker 1: around and walk away. But you should engage in critical thinking.
610 00:37:44,080 --> 00:37:48,960 Speaker 1: Ask yourself and ask others questions about the various claims 611 00:37:49,080 --> 00:37:53,080 Speaker 1: and promises, look into them, really examine them, find out 612 00:37:53,160 --> 00:37:56,520 Speaker 1: if the things that are being promised are even possible. 613 00:37:56,600 --> 00:38:00,040 Speaker 1: Like Theranos should have taught us all, this is 614 00:38:00,080 --> 00:38:02,200 Speaker 1: something we need to do on a regular basis. That 615 00:38:02,239 --> 00:38:06,040 Speaker 1: company reached the levels it did because people failed to 616 00:38:06,120 --> 00:38:10,840 Speaker 1: ask realistic, critical questions, and a lot of people lost 617 00:38:10,880 --> 00:38:13,560 Speaker 1: a lot of money because of it, and some people 618 00:38:13,560 --> 00:38:16,879 Speaker 1: are going to jail because of it. If the tech 619 00:38:16,920 --> 00:38:20,120 Speaker 1: holds up to whatever level of scrutiny you've given it 620 00:38:20,200 --> 00:38:22,680 Speaker 1: when you started to ask these tough questions, and all 621 00:38:22,719 --> 00:38:26,080 Speaker 1: the answers are pointing to great things, holy cats, you 622 00:38:26,120 --> 00:38:30,080 Speaker 1: are onto something. Hold on with both hands. That's awesome. 623 00:38:31,200 --> 00:38:34,600 Speaker 1: Maybe it doesn't hold up entirely, maybe it fails in 624 00:38:34,640 --> 00:38:37,160 Speaker 1: a couple of respects, but really that could just mean 625 00:38:37,200 --> 00:38:40,839 Speaker 1: that your expectations are now more realistic. Maybe you still 626 00:38:40,840 --> 00:38:43,399 Speaker 1: want to adopt the technology, but now you know what 627 00:38:43,440 --> 00:38:46,600 Speaker 1: its limitations are, and so you're not disappointed when you 628 00:38:46,680 --> 00:38:49,120 Speaker 1: run up against them, because you're aware of them. You 629 00:38:49,200 --> 00:38:52,080 Speaker 1: asked the right questions, and you're not expecting 630 00:38:52,080 --> 00:38:55,440 Speaker 1: it to do something that it cannot do. Or maybe 631 00:38:56,040 --> 00:38:59,680 Speaker 1: you find out the thing you thought sounded cool really 632 00:39:00,040 --> 00:39:02,719 Speaker 1: has nothing going for it, and you avoid stepping into 633 00:39:02,760 --> 00:39:07,279 Speaker 1: a trap. Whatever the outcome, you benefit from the use 634 00:39:07,480 --> 00:39:11,160 Speaker 1: of critical thinking. I really do think critical thinking is 635 00:39:11,160 --> 00:39:14,080 Speaker 1: something that needs to be taught formally, and I think 636 00:39:14,080 --> 00:39:16,919 Speaker 1: it needs to be taught early. I did not really 637 00:39:17,000 --> 00:39:21,719 Speaker 1: encounter it until I was well into high school and 638 00:39:22,000 --> 00:39:25,839 Speaker 1: really more into college, and I feel like I would 639 00:39:25,840 --> 00:39:31,440 Speaker 1: have benefited from that perspective much, much earlier on than 640 00:39:31,520 --> 00:39:36,000 Speaker 1: when I was exposed to it, because, well, especially as 641 00:39:36,040 --> 00:39:39,320 Speaker 1: a teenager, I was resistant to ideas that I didn't 642 00:39:39,360 --> 00:39:43,960 Speaker 1: already have because, you know, that's awesome. Uh. I'm not 643 00:39:44,000 --> 00:39:46,560 Speaker 1: proud of that, but it was a phase, and I 644 00:39:46,560 --> 00:39:48,200 Speaker 1: think a lot of people go through that.
So if 645 00:39:48,200 --> 00:39:50,600 Speaker 1: it had been introduced earlier, I'm not saying I still 646 00:39:50,640 --> 00:39:53,400 Speaker 1: wouldn't have been a little jerk as a teenager, but 647 00:39:53,480 --> 00:39:57,239 Speaker 1: maybe I would have been a critically thinking one. Um, anyway, 648 00:39:57,560 --> 00:39:59,319 Speaker 1: I know I just did a rerun where I talked 649 00:39:59,320 --> 00:40:02,800 Speaker 1: about critical thinking, and I know that people can get 650 00:40:02,840 --> 00:40:07,840 Speaker 1: a little tired of this topic, but when I see 651 00:40:08,000 --> 00:40:14,080 Speaker 1: examples pop up over and over that really reinforce the 652 00:40:14,120 --> 00:40:18,080 Speaker 1: idea that folks are failing to think critically, I feel 653 00:40:18,080 --> 00:40:20,600 Speaker 1: like I need to talk about it again, just to 654 00:40:20,680 --> 00:40:24,920 Speaker 1: reinforce that and to remind people. And it's okay if 655 00:40:24,960 --> 00:40:28,200 Speaker 1: occasionally you fall short. I still fall short all the time. 656 00:40:28,880 --> 00:40:31,200 Speaker 1: But it's good to try and keep it in mind, 657 00:40:31,280 --> 00:40:35,240 Speaker 1: because it might mean you avoid calamity in the future, 658 00:40:36,080 --> 00:40:38,960 Speaker 1: or it might mean you actually find the next big thing. 659 00:40:40,040 --> 00:40:44,040 Speaker 1: But if you don't think critically, that becomes way more 660 00:40:44,280 --> 00:40:48,239 Speaker 1: a matter of luck than of actual determination. All right, 661 00:40:48,280 --> 00:40:50,719 Speaker 1: that's it for this episode. If you have suggestions for 662 00:40:50,800 --> 00:40:52,400 Speaker 1: future topics, there are a couple of ways you can 663 00:40:52,400 --> 00:40:54,920 Speaker 1: get in touch. One is you can go to the 664 00:40:55,000 --> 00:40:58,280 Speaker 1: I Heart Radio app. You can navigate over to Tech Stuff. 665 00:40:58,920 --> 00:41:01,040 Speaker 1: The app is free to use; it's free to download. 666 00:41:01,120 --> 00:41:02,799 Speaker 1: Just go to the Tech Stuff part. You know, use the 667 00:41:02,840 --> 00:41:04,719 Speaker 1: little search engine. Go over to Tech Stuff. There's a 668 00:41:04,719 --> 00:41:07,279 Speaker 1: little microphone icon; you click on that and you can leave 669 00:41:07,280 --> 00:41:10,239 Speaker 1: a voice message up to thirty seconds in length. And 670 00:41:10,680 --> 00:41:12,360 Speaker 1: at least for the time being, you can still go 671 00:41:12,400 --> 00:41:15,520 Speaker 1: to Twitter and send me a message at Tech Stuff H S W. 672 00:41:16,360 --> 00:41:20,040 Speaker 1: I'll be talking tomorrow about Elon Musk again and Twitter, 673 00:41:20,440 --> 00:41:24,200 Speaker 1: because that story has changed again. In fact, it changed yesterday, 674 00:41:24,200 --> 00:41:27,120 Speaker 1: but it changed after I had already filed my podcast, 675 00:41:28,320 --> 00:41:31,000 Speaker 1: so that's the thing. Again, we'll talk about that tomorrow, 676 00:41:31,360 --> 00:41:33,279 Speaker 1: but anyway, yeah, get in touch with me, let me 677 00:41:33,320 --> 00:41:35,040 Speaker 1: know what you would like, and I'll talk to you again 678 00:41:35,719 --> 00:41:44,480 Speaker 1: really soon. Tech Stuff is an I Heart Radio production.
679 00:41:44,719 --> 00:41:47,560 Speaker 1: For more podcasts from I Heart Radio, visit the I 680 00:41:47,640 --> 00:41:50,880 Speaker 1: Heart Radio app, Apple Podcasts, or wherever you listen to 681 00:41:50,920 --> 00:41:51,840 Speaker 1: your favorite shows.