Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech. Though I may just need to come up with a new tagline, y'all, because these episodes are really going to take it out of me, because it's time to look back on some of the biggest tech stories of the past year. Now, there is no question that 2021 was another rough one in a series of rough years. The COVID-19 pandemic and the emergence of variants like Delta and Omicron have meant that the tough conditions we found ourselves in during 2020 have continued. And so that's going to weave in and out of a lot of the stories we will be covering over the next couple of episodes. And a lot happened, so there will be several episodes. Also, a lot of this is going to extend beyond tech in some ways, but tech played a critical part in these stories.
Speaker 1: So one really big story has been how tech companies like Apple, Google, and Meta/Facebook, plus others, have announced, postponed, and in some cases indefinitely delayed a return to the office. We've seen companies make these announcements several times throughout 2021, only to have to backtrack, like sometimes a few days or a few weeks or a month or two later, as things totally failed to get better. Maybe all the optimism in the tech sector is a little bit on the naive side. I think a lot of people start thinking, hey, just because we can seemingly do anything with tech means that we can seemingly overcome any obstacle. That isn't necessarily true; see also Theranos. But lots of companies originally planned a return to the office, or at least a partial return in which corporate employees were expected to come in at least a few days out of each week. And a lot of that started in the late summer or early fall of 2021. But then the Delta variant of COVID-19 changed things, as Delta proved to be more easily transmissible than other variants of COVID.
Speaker 1: And then you started to hear plans of late fall or early winter being the time to return, and a lot of those have changed yet again. At the moment, plans generally look at a January 2022 return. Google had planned to come back to the office on January 10, but recently executives sent out an email indicating that employees would not be required to comply and that the company will reassess those plans early in the new year. Meta/Facebook is aiming for a return to office by January 31, but has also offered a deferral program to employees, who can choose to push back their return date to between three and five months later. So we might be looking at the middle of 2022, assuming things are better by then, which, based on the last year, it's tough to make those kinds of assumptions. Apple, which had aimed to get employees back in offices in early February of next year, has now released a statement that all return plans are currently on hold and that the return date is yet to be determined.
Speaker 1: The pandemic has really driven home the fact that making firm plans is just really hard to do. The rise of new variants requires changes in game plans. Meanwhile, employees are becoming accustomed to being able to work from wherever they happen to be, whether that's at home or on the road or whatever. So we're seeing a trend of employees expressing reluctance to go back to an office for lots of different reasons, ranging from convenience to health and safety to promoting a different approach to, you know, work-life balance. One other big bonus to working remotely is that employees don't necessarily have to live within commuting distance of their office. So for employees who work at companies that are located in really expensive markets like the San Francisco area, this means they could potentially move away from San Francisco and still work at that company; they're just doing so remotely. And they can end up living in a place that has a much lower cost of living, and that can mean that they can afford a larger home or more creature comforts, or put stuff toward savings.
Speaker 1: Plus they might be dealing with a lot less hassle, like city traffic. Some employees have pushed back pretty hard against any move to return to offices, at least pushing back against making it mandatory. There have been numerous discussions about whether being in an office is actually that important to, you know, getting work done. There are some studies that seem to suggest that we actually get more work done at home, but that things like innovation and forming bonds with coworkers really suffer when we are not in the same place at the same time. So there are these conflicting thoughts about what it means to do work, and productivity, and the benefit of doing work at home versus in the office. These are big conversations that are continuing to move forward, not just in tech, but we're seeing it play out a lot in the tech sphere in particular.
Speaker 1: Now, whether we're shifting toward a permanent approach to a hybrid work strategy, where people at least work partly from home or remotely, that's hard to say right now. But it certainly seems as though a lot of companies are going to have trouble convincing employees to come back to the office on a regular basis. It may turn out that companies that don't offer a hybrid or work-from-home approach will find it more difficult to attract and retain talent. And that's a big deal in the tech sector, right? Like, if it turns out that company A offers employees the chance to work from wherever they happen to be and still get all of their compensation and benefits, and company B doesn't, you're going to see migrations from company B to company A. I mean, that's just kind of how it works in the tech sector in particular. And then we also have to look at another side of this: all that empty office space.
Speaker 1: Apple infamously spent around five billion dollars, that's billion with a B, and eight years of work to build their quote-unquote spaceship campus, which opened in 2017. It's got like seven cafes and a 100,000-square-foot fitness center; it's got a 1,000-person theater, tons of office space. So understandably, having it mostly empty has got to be a source of frustration and pain for Apple executives. Also, it's really hard to keep an eye on what's going on when your employees aren't, like, forced to be right there at the office. It's also really hard to intimidate employees with your presence if you can't be there. So I'm not saying that that's necessarily Apple's modus operandi, but I'm not not saying it either. So we've seen office occupancy across the United States drop to, you know, somewhere around 25 percent of what it was pre-pandemic for most of the last two years. Now, in some cities like San Francisco, that occupancy rate is actually lower; it's below 15 percent. And yet there's still a trend in construction, so there's still a trend of building out new office space that's still happening.
Speaker 1: So this raises the question: who the heck is going to be in all those office spaces moving forward? We haven't seen companies abandon their lease agreements en masse. It's not like companies are saying, well, no need for us to have offices anymore, we're all going to work from home, so let's get rid of those. That hasn't really happened, at least not on a big scale. We have seen some companies start to sublease some of their space, so we might see some companies start to kind of slim down as far as office space goes. They might reduce the amount of office space that they maintain, especially if they go to a hybrid approach where not all employees will be in the office every day, so people might end up using shared employee spaces, and thus you don't need as many individual desks as you would if everybody were there every day. That's a possibility, but it's still too early to really say that that's the future of work.
Speaker 1: I expect we're going to see a lot more innovation and products that target the hybrid or work-from-home models in 2022. We've already seen some of those emerge, and we've also seen companies like Slack and Zoom pivot a little bit to target that use case more. So not necessarily developing a lot of products specifically geared toward that, but rather, hey, this tool is also good if you aren't in the office, that kind of thing. So I wouldn't be surprised to see more of that introduced next year. Another pandemic issue that started in 2020 and is going strong today is the disruption of the supply chain. Now, this also goes well beyond just tech; we're seeing it affect everything. Heck, I see it at restaurants, where they say, hey, we may not have everything that's on the menu because of supply chain issues. So supply chains for pretty much every sector are affected by this, but the tech space has been hit really hard by it. Now, we've all heard about the semiconductor shortage, which has affected everything from video game consoles to vehicles.
Speaker 1: And shortages mean that it's really hard to find stuff like, say, a PS5. It also means that car dealerships have less inventory than what they typically have, and what they do have ends up being way more expensive. It also means that some car manufacturers are having to scale back on features and options that they had introduced a few years ago, because now they just don't have the chips to continue offering those features. You know, BMW has done that with a couple of its features, where things that you could have purchased a couple of years ago are now no longer available, because the company simply doesn't have the chips to make those components. So rather than go without a car at all, the new ones just have fewer features than older ones. That's a trend most companies don't like to follow. Companies like Intel are working hard to expand production for semiconductors and microchips.
Speaker 1: They're planning out new manufacturing facilities all around the world, including within the United States, but those plans are going to take a few years to come to fruition, and in the meantime, we've still got this shortage to deal with. Intel's CEO has expressed doubt that the shortage will end before 2023 at the earliest, so we've got another year of this, at least according to the CEO of Intel. There are other leaders in tech who have slightly more optimistic prognoses, but I'm not sure that those are realistic. I hope they are; I'm just not feeling it. And even after building out new facilities, we still have issues with the supply chain. So one of the really big problems with the pandemic has been how port and shipping operations have been affected.
Speaker 1: The pandemic has pulled back the curtain on how much we depend upon a relatively small number of professionals to keep the supply chain moving. And when that gets disrupted, you have delays and bottlenecks and sort of domino effects, and stuff starts to pile up at one port while waiting to get loaded onto a ship, and then the ship ends up getting in a bottleneck waiting to dock at another port that's already backed up. It becomes this sort of cascading effect across the entire industry. And of course, the supply chain issue wasn't helped when, in March of this year, the cargo vessel named Ever Given got stuck in the Suez Canal. So for about a week, six days actually, the vessel was blocking a key passage for cargo ships. And by a key passage, I mean nearly a third of all container ship traffic passes through the Suez Canal. Now, the Suez Canal Authority took possession of the ship once it was finally dislodged, and they held it for several months, finally releasing it in July after demanding nearly one billion dollars in fines. That amount was later reduced to around 550 million dollars.
Speaker 1: And, you know, if you've ever been stuck in bumper-to-bumper traffic on a highway and you finally get to a section that mysteriously clears up, and there's no obvious cause for what made the traffic jam happen in the first place, then you've had some experience with bottleneck issues. They can and do clear up, but it's not instantaneous, and even as one element gets some breathing room, others are lagging behind. So we're likely to continue feeling the effects of the supply chain disruption for some time, and not just in semiconductors. This also, by the way, affects pricing, which means we consumers start to really feel the pinch, as I'm sure all of you out there listening have seen in some way or another, whether it's in food prices, coffee prices, fuel prices, electronics, all of these things. We're all kind of feeling it in different ways. On a related note, one trend we saw increase in 2021 was in bot activity. Specifically, bots.
Speaker 1: That is, these automated scripts that are designed to purchase certain goods, like video game consoles, you know, high-value goods that are in high demand. Now, the whole purpose of these bots is just to corner a market on whatever high-end good they're focusing on, and then resell those goods at crazy markups on sites like eBay. So gamers out there who are desperate for a new console might find that the only option available to them is to go and pay through the nose at, you know, eBay or some other resale site. Essentially, it's scalping. It's the same approach that you see with people who scalp tickets. So the practice inspired several US politicians, all of whom are Democrats, to reintroduce a piece of legislation called the Stopping Grinch Bots Act. This would make it illegal to use automated tools to bypass online retail security measures in the effort to purchase and resell products and services. It would simply become against the law, and it would give authority to the Federal Trade Commission, or FTC, to enforce this law.
Speaker 1: Lawmakers introduced a similar measure in 2018, but it stalled out in committees and never really went to a vote. And there's also a Stop the Grinch Act, proposed by the senator from Utah. That one actually is looking to alleviate some of the congestion within the supply chain, you know, the stuff that we were talking about a moment ago, at least within the United States, to help alleviate some of those issues. Of course, you know, it wouldn't necessarily fix stuff overseas, which is part of the problem. We're talking about global problems, so local solutions are kind of only a band-aid, but it would still be, you know, a step toward that. However, that's a different act. So it gets a bit confusing, because there's a Stopping Grinch Bots Act and a Stop the Grinch Act, so they have similar names, but they are two different things. Something else that didn't originate in 2021 but cranked into a higher gear was the spread of misinformation. Now, that's going to be a running theme in a lot of this episode.
Speaker 1: But before I get into all of that, let's take a quick break. All right, misinformation. There's a lot to cover here, but when it comes to misinformation in 2021, I mean, I kind of feel like Marlon Brando in The Wild One when he's asked, like, what are you rebelling against, and he's like, what have you got? Well, in this case, it would be, you know, what do you have to say about misinformation? And, well, how much time do you have? Because we could literally do a week's worth of episodes about misinformation just in 2021. But I'll try and keep it somewhat succinct, you know, I'll try to keep all of that in this episode. I'm sure some of it will spill out as I talk about other stories later this week. Okay, a lot of the misinformation is focused on the pandemic, from folks proclaiming that they knew with absolute certainty that the origins of the virus were from this one place or this other place, without actually providing any evidence for that. You know, they're just claiming it.
Speaker 1: Or there were those who were proclaiming vaccines to be harmful, or even a tool that governments were using to try and control or track populations. And then there were just the outright denials that there's even a pandemic going on in the first place, that COVID itself is a hoax. The fact that several high-profile deniers have subsequently died after contracting COVID doesn't seem to squash the spread of that dangerous misinformation, those lies. Which, again, crazy, right? You have people who are saying COVID's not real, then they die of COVID, but the people that they've told COVID's not real are still saying COVID's not real. Listen, I also wish COVID weren't real. But you know, if wishes were horses, beggars would ride. And that's just not the case. As for the tech angle, well, as we all know, a lot of these messages spread via social networking platforms like Facebook, and we're going to talk so much about Facebook within the context of 2021.
Speaker 1: Volumes have been written recently about how Facebook's algorithm has played a part in elevating messages that spread harmful misinformation, and how that ends up reinforcing falsehoods. It amplifies the message, and it convinces people to do stupid stuff like denounce vaccines or defy mask orders and more. And I figure any reasonable person has to be left wondering how many people have died because of those kinds of messages. And keep in mind, the folks who die are not necessarily the same ones who have bought into the lie, right? The folks who die might be friends or family, or loved ones or coworkers, or even just strangers who just happened to be in the wrong place at the wrong time, under the wrong circumstances, and they caught COVID as a result. This is why I keep harping on this. I know I sound like a broken record, but I'm not going to stop. I don't want anyone to die.
Speaker 1: But even if I were super cynical, if I were one of those people saying this is Darwinian, right, the idea that the people who deny it are the ones who are dying off, and, well, it's a problem that takes care of itself, I can't think that way, because that's just writing people's lives off, and that's fundamentally wrong to me. I cannot be that person. But even if I were, I would still have to acknowledge that it's not just the ignorant and the misled who are suffering here due to misinformation campaigns. It's everyone. Now, some of the messaging spread in 2021 claimed that vaccines could do anything from magnetize you to introduce some sort of tracking device into your body. It didn't matter that folks pointed out that those were lies, like they would show that, no, there's no such thing happening with magnets, that's not happening; or that all you have to do is own a smartphone, and that device alone is tracking everywhere you go and everything you do. There's no need to inject a tracker; you're already a willing participant in an ecosystem that's tracking you all the time.
None of that matters 338 00:21:06,800 --> 00:21:10,520 Speaker 1: when it comes to stopping misinformation. The best way to 339 00:21:10,520 --> 00:21:14,159 Speaker 1: stop misinformation, really, besides, you know, sitting people down and 340 00:21:14,200 --> 00:21:17,199 Speaker 1: having one-on-ones and really getting to it, is 341 00:21:17,280 --> 00:21:21,919 Speaker 1: not allowing for that amplification and that rapid spread in 342 00:21:21,960 --> 00:21:27,320 Speaker 1: the first place. All right, but not all the 343 00:21:27,359 --> 00:21:30,960 Speaker 1: misinformation had to do just with COVID. We will touch 344 00:21:31,000 --> 00:21:34,119 Speaker 1: back on that. There was something else that happened in 345 00:21:34,880 --> 00:21:38,720 Speaker 1: twenty twenty-one where misinformation played a pivotal role, and let's get 346 00:21:38,760 --> 00:21:41,040 Speaker 1: to it. I mean, it feels like it happened a 347 00:21:41,080 --> 00:21:45,040 Speaker 1: billion years ago to me. But it was on January sixth, 348 00:21:45,359 --> 00:21:51,840 Speaker 1: twenty twenty-one, when there was a nearly successful insurrection in 349 00:21:51,920 --> 00:21:54,920 Speaker 1: the United States. That is the date when a group 350 00:21:55,000 --> 00:21:58,920 Speaker 1: of violent individuals stormed the US Capitol with the goal 351 00:21:59,040 --> 00:22:03,919 Speaker 1: of interfering with the ratification of the twenty twenty election. Now, 352 00:22:04,880 --> 00:22:07,360 Speaker 1: Congress was in the process of what is usually a 353 00:22:07,400 --> 00:22:12,359 Speaker 1: pretty ceremonial event where the states present and ratify their 354 00:22:12,400 --> 00:22:16,480 Speaker 1: electoral results and the winner of the most recent election 355 00:22:16,840 --> 00:22:19,760 Speaker 1: is made official.
Everyone knows who the winner is already, 356 00:22:19,800 --> 00:22:22,760 Speaker 1: by the way, because everyone already knows the outcome of 357 00:22:22,800 --> 00:22:27,320 Speaker 1: the election, but this is the process that officially recognizes 358 00:22:27,720 --> 00:22:33,480 Speaker 1: the winner. The rioters on January sixth, encouraged by then 359 00:22:33,680 --> 00:22:37,320 Speaker 1: President Donald Trump, were on a mission to disrupt this 360 00:22:37,760 --> 00:22:41,040 Speaker 1: process at any cost, with some going so far as 361 00:22:41,119 --> 00:22:44,240 Speaker 1: to advocate for the capture and execution of the then 362 00:22:44,359 --> 00:22:49,040 Speaker 1: Vice President Mike Pence, because it was Pence's ceremonial duty 363 00:22:49,119 --> 00:22:52,879 Speaker 1: to recognize the results, and he was not going to 364 00:22:53,040 --> 00:22:58,160 Speaker 1: overturn the recognized results despite what Trump wanted him to do. 365 00:22:58,560 --> 00:23:03,640 Speaker 1: So his followers, Trump's followers, were saying, okay, let's kill 366 00:23:03,720 --> 00:23:10,840 Speaker 1: the vice president. A truly horrendous series of crazy, irrational 367 00:23:10,960 --> 00:23:15,480 Speaker 1: thoughts, right? Now, y'all, this story is so outrageous that 368 00:23:15,600 --> 00:23:18,280 Speaker 1: it is hard to get my mind wrapped around it. 369 00:23:18,359 --> 00:23:21,840 Speaker 1: And perhaps even more outrageous is how so much of 370 00:23:21,880 --> 00:23:24,920 Speaker 1: the United States has just kind of accepted this as, 371 00:23:25,040 --> 00:23:28,439 Speaker 1: you know, just a thing that happened. It's bonkers to me. 372 00:23:28,720 --> 00:23:31,800 Speaker 1: But let's get back to the tech.
Investigators looking into 373 00:23:31,800 --> 00:23:35,439 Speaker 1: the January sixth riots suspected that a lot of planning 374 00:23:35,520 --> 00:23:38,679 Speaker 1: for the event was carried out on various social network 375 00:23:38,760 --> 00:23:44,600 Speaker 1: platforms like Facebook, Twitter, and, at the time, Parler, or par-lay. 376 00:23:44,640 --> 00:23:48,200 Speaker 1: I'm gonna call it Parler. This story continues to play 377 00:23:48,200 --> 00:23:51,240 Speaker 1: out today. By the way, there was a finding 378 00:23:51,720 --> 00:23:55,320 Speaker 1: the FBI released in August of this year that said 379 00:23:55,600 --> 00:23:57,919 Speaker 1: that they found very little evidence that there was a 380 00:23:57,920 --> 00:24:01,480 Speaker 1: lot of planning going on on social networks at the 381 00:24:01,560 --> 00:24:04,720 Speaker 1: time leading up to the January sixth riots. But then 382 00:24:04,760 --> 00:24:07,159 Speaker 1: subsequently there have been others who have come forward and 383 00:24:07,160 --> 00:24:10,560 Speaker 1: said no, we totally used social networks to help plan 384 00:24:10,720 --> 00:24:13,920 Speaker 1: this stuff out. So it's a story that has gone 385 00:24:13,960 --> 00:24:18,960 Speaker 1: back and forth many times throughout the year. But there are 386 00:24:18,960 --> 00:24:21,040 Speaker 1: still people who are calling for platforms to be 387 00:24:21,080 --> 00:24:25,919 Speaker 1: held accountable for amplifying misinformation campaigns that led to civil 388 00:24:26,040 --> 00:24:30,480 Speaker 1: unrest and violence, including the deaths of five people. Some 389 00:24:30,680 --> 00:24:34,280 Speaker 1: have been held more to account than others. Parler, 390 00:24:34,359 --> 00:24:36,760 Speaker 1: for instance, had a pretty rough go of it early 391 00:24:36,840 --> 00:24:39,639 Speaker 1: in the year.
The site, which promoted itself as a 392 00:24:39,680 --> 00:24:42,680 Speaker 1: place where freedom of speech is valued above everything else, 393 00:24:43,240 --> 00:24:45,600 Speaker 1: was mostly known as being the home for far right 394 00:24:45,720 --> 00:24:51,880 Speaker 1: conservatives and conspiracy theorists. Parler's site depended upon Amazon Web 395 00:24:52,000 --> 00:24:56,399 Speaker 1: Services for hosting, and in the wake of the 396 00:24:56,480 --> 00:25:00,399 Speaker 1: January sixth insurrection, Google and Apple both removed the 397 00:25:00,440 --> 00:25:03,640 Speaker 1: Parler app from their respective stores, and a short time 398 00:25:03,760 --> 00:25:08,040 Speaker 1: later Amazon removed Parler from AWS and the site 399 00:25:08,080 --> 00:25:13,680 Speaker 1: went dark. Parler subsequently saw its CEO, John Matze, fired 400 00:25:13,760 --> 00:25:17,800 Speaker 1: and replaced first by Mark Meckler and then by George Farmer. 401 00:25:18,240 --> 00:25:21,280 Speaker 1: The site returned to service in mid February, having moved 402 00:25:21,280 --> 00:25:24,560 Speaker 1: from AWS to a company called Epik, although it had 403 00:25:24,600 --> 00:25:30,320 Speaker 1: lost all posts that had been sent before the takedown, 404 00:25:30,800 --> 00:25:35,280 Speaker 1: so nothing from the earlier era of Parler still existed. 405 00:25:36,400 --> 00:25:38,920 Speaker 1: The site is still around today, though reportedly not 406 00:25:39,000 --> 00:25:41,119 Speaker 1: nearly as popular as it was leading up to the 407 00:25:41,119 --> 00:25:44,359 Speaker 1: twenty twenty election. A lot of people just migrated to 408 00:25:44,400 --> 00:25:48,520 Speaker 1: other platforms instead.
As for other sites, such as, you know, 409 00:25:49,040 --> 00:25:53,280 Speaker 1: Facebook and Twitter, and YouTube too, I 410 00:25:53,280 --> 00:25:57,600 Speaker 1: should say, much of twenty twenty-one has featured investigations into 411 00:25:57,680 --> 00:26:01,440 Speaker 1: and statements from those various sites and services regarding to 412 00:26:01,520 --> 00:26:05,120 Speaker 1: what extent they might or might not have played a 413 00:26:05,240 --> 00:26:08,679 Speaker 1: part in the January sixth attack on the Capitol. And 414 00:26:08,720 --> 00:26:11,280 Speaker 1: I guess that's as good a place for us to 415 00:26:11,359 --> 00:26:16,680 Speaker 1: start talking about Facebook in twenty twenty-one as any, because good golly, 416 00:26:16,720 --> 00:26:18,880 Speaker 1: is there a lot to talk about. In fact, there's 417 00:26:18,920 --> 00:26:21,320 Speaker 1: so much to talk about that I suspect it's going 418 00:26:21,359 --> 00:26:24,920 Speaker 1: to take up probably the rest of this episode. So 419 00:26:25,840 --> 00:26:28,720 Speaker 1: this is also going to involve some jumping around, because 420 00:26:29,440 --> 00:26:31,840 Speaker 1: I tried originally to kind of sketch this out in 421 00:26:31,880 --> 00:26:36,439 Speaker 1: a strictly chronological retelling of Facebook's year, but that was 422 00:26:36,480 --> 00:26:40,240 Speaker 1: too messy, it was too tangled up.
And that's largely 423 00:26:40,359 --> 00:26:44,840 Speaker 1: because a ton of the stuff that really applies to 424 00:26:44,960 --> 00:26:48,960 Speaker 1: the year that Facebook has had actually goes back several years, 425 00:26:49,000 --> 00:26:52,840 Speaker 1: but it only became public knowledge late this year when 426 00:26:52,960 --> 00:26:56,119 Speaker 1: Frances Haugen, who was a former product manager at Facebook, 427 00:26:56,600 --> 00:27:01,000 Speaker 1: came forward with thousands of internal Facebook documents and shared 428 00:27:01,080 --> 00:27:05,560 Speaker 1: them with authorities and with journalists. And those documents prompted 429 00:27:05,680 --> 00:27:09,120 Speaker 1: hundreds of questions that lots of different people want answered, 430 00:27:09,640 --> 00:27:13,240 Speaker 1: and Facebook, or as the company is now known, Meta, 431 00:27:13,400 --> 00:27:17,960 Speaker 1: has slowly kind of been answering them, to varying degrees 432 00:27:18,000 --> 00:27:22,720 Speaker 1: of satisfaction. Some of the revelations tied directly into 433 00:27:22,760 --> 00:27:26,520 Speaker 1: that January sixth event. For example, leading up to the 434 00:27:26,640 --> 00:27:31,479 Speaker 1: twenty twenty election, Facebook had an internal group within the company 435 00:27:31,520 --> 00:27:34,919 Speaker 1: called the Civic Integrity Group, and the purpose of this 436 00:27:35,000 --> 00:27:40,240 Speaker 1: group was to monitor issues like misinformation relating to elections 437 00:27:40,240 --> 00:27:43,879 Speaker 1: and politics. Now, you could probably make a pretty decent 438 00:27:43,960 --> 00:27:48,200 Speaker 1: argument that this group was not capable of handling that task. Now, 439 00:27:48,840 --> 00:27:52,320 Speaker 1: I don't mean that the people who worked within the 440 00:27:52,359 --> 00:27:57,160 Speaker 1: Civic Integrity Group were inept or anything like that.
Rather, 441 00:27:57,240 --> 00:28:00,639 Speaker 1: what I mean is that the scope of the problem 442 00:28:00,680 --> 00:28:03,520 Speaker 1: I think was well beyond their capabilities. It's just such 443 00:28:03,560 --> 00:28:07,520 Speaker 1: a huge issue. Also, you had problems with Facebook executives 444 00:28:07,800 --> 00:28:10,919 Speaker 1: who were kind of counteracting some of the suggestions that 445 00:28:10,960 --> 00:28:16,920 Speaker 1: people had to make changes that could potentially reduce Facebook's 446 00:28:17,560 --> 00:28:22,520 Speaker 1: role in amplifying misinformation. But setting all that aside, 447 00:28:22,600 --> 00:28:26,040 Speaker 1: Haugen told a U.S. Senate subcommittee that Facebook dissolved 448 00:28:26,160 --> 00:28:31,320 Speaker 1: the Civic Integrity team shortly after the election. Now, Facebook 449 00:28:31,320 --> 00:28:34,800 Speaker 1: reps have since said that they didn't dissolve the Civic 450 00:28:34,840 --> 00:28:38,840 Speaker 1: Integrity team. Instead, they merged it with a larger Central 451 00:28:39,080 --> 00:28:43,959 Speaker 1: Integrity team. So, they say, Civic Integrity didn't really go away, 452 00:28:44,400 --> 00:28:47,120 Speaker 1: it just became part of something bigger than itself, and 453 00:28:47,160 --> 00:28:50,000 Speaker 1: the idea was to apply the experience that the Civic 454 00:28:50,040 --> 00:28:55,200 Speaker 1: Integrity team had to a larger scope. However, the concern 455 00:28:55,320 --> 00:28:59,320 Speaker 1: that US lawmakers have is that with this move, Facebook was 456 00:28:59,720 --> 00:29:03,200 Speaker 1: even less prepared to deal with the surge in rhetoric 457 00:29:03,600 --> 00:29:07,520 Speaker 1: that preceded the January sixth riots earlier this year.
So, 458 00:29:07,560 --> 00:29:11,080 Speaker 1: in other words, the timing of the decision to dissolve, 459 00:29:11,200 --> 00:29:15,640 Speaker 1: or rather merge, Civic Integrity with another group may have 460 00:29:15,720 --> 00:29:19,040 Speaker 1: been a contributing factor, in that there weren't as many eyeballs 461 00:29:19,080 --> 00:29:22,280 Speaker 1: on the issue at a critical time as there should 462 00:29:22,320 --> 00:29:27,400 Speaker 1: have been. Perhaps the spread of election misinformation, you know, 463 00:29:27,480 --> 00:29:32,120 Speaker 1: the whole Stop the Steal nonsense, drove more people towards extremism, 464 00:29:32,160 --> 00:29:34,840 Speaker 1: which in turn added fuel to the folks who ultimately 465 00:29:34,920 --> 00:29:40,000 Speaker 1: stormed the Capitol on January sixth. Now, I'm writing this 466 00:29:40,120 --> 00:29:43,720 Speaker 1: and recording this on December twentieth, two thousand twenty-one, 467 00:29:43,880 --> 00:29:48,200 Speaker 1: and it was just today that Engadget published a piece 468 00:29:48,280 --> 00:29:52,560 Speaker 1: about this very thing. So nearly a full year has 469 00:29:52,600 --> 00:29:57,040 Speaker 1: passed since that riot, and we're still, you know, writing 470 00:29:57,040 --> 00:30:02,240 Speaker 1: about discoveries made about the lead-up to the riot. Igor 471 00:30:02,480 --> 00:30:06,320 Speaker 1: Bonifacic, and I apologize for butchering your name, Igor, he 472 00:30:06,360 --> 00:30:08,560 Speaker 1: wrote the piece, and he rightly points out that while 473 00:30:08,640 --> 00:30:13,040 Speaker 1: there's a lot of outrage being expressed about this, there's 474 00:30:13,080 --> 00:30:17,240 Speaker 1: not a lot of actual, you know, action being taken 475 00:30:17,320 --> 00:30:20,000 Speaker 1: about it.
And part of that is because, while 476 00:30:20,080 --> 00:30:24,720 Speaker 1: both Democrats and Republicans have an axe to grind with Facebook, 477 00:30:25,200 --> 00:30:29,040 Speaker 1: the two parties fundamentally disagree on the nature of that 478 00:30:29,160 --> 00:30:33,440 Speaker 1: axe and how to grind it. I'll explain more after 479 00:30:33,520 --> 00:30:44,480 Speaker 1: we take this quick break. All right, I've left off 480 00:30:44,520 --> 00:30:48,480 Speaker 1: talking about how there's a general feeling that Facebook needs 481 00:30:48,480 --> 00:30:52,040 Speaker 1: to be called to task, but there's a fundamental difference 482 00:30:52,080 --> 00:30:56,640 Speaker 1: in ideology over why and how. So there's another ongoing 483 00:30:56,720 --> 00:31:01,320 Speaker 1: story that really was playing out in twenty twenty-one, and it's 484 00:31:01,360 --> 00:31:04,040 Speaker 1: one that's not really supported by facts, but it's still 485 00:31:04,440 --> 00:31:07,720 Speaker 1: important because there were consequences. And it's that there's 486 00:31:07,880 --> 00:31:11,680 Speaker 1: this narrative that Facebook and some other social networks have 487 00:31:11,840 --> 00:31:18,480 Speaker 1: a bias against conservative voices and therefore have a tendency 488 00:31:18,600 --> 00:31:22,920 Speaker 1: to suppress those voices, that these entities are working to 489 00:31:23,080 --> 00:31:27,680 Speaker 1: silence the conservative perspective.
Those who argue this like to 490 00:31:27,720 --> 00:31:30,640 Speaker 1: point out stuff like, you know, how Facebook, Twitter, and 491 00:31:30,720 --> 00:31:35,760 Speaker 1: YouTube eventually banned Donald Trump's accounts with those services, mostly 492 00:31:35,800 --> 00:31:38,360 Speaker 1: in the wake of the January sixth riots and the 493 00:31:38,400 --> 00:31:43,800 Speaker 1: perpetuation of election fraud claims, claims that are wholly unsupported 494 00:31:44,120 --> 00:31:49,760 Speaker 1: by actual evidence. See, the platforms, under intense pressure, began 495 00:31:49,880 --> 00:31:55,280 Speaker 1: flagging posts that contained unsupported allegations and they tagged them 496 00:31:55,320 --> 00:32:00,400 Speaker 1: as misinformation, alerting people, hey, the claims being 497 00:32:00,440 --> 00:32:03,280 Speaker 1: made in this post are not supported by facts. That's 498 00:32:03,360 --> 00:32:06,760 Speaker 1: essentially what they were doing, saying, you know, essentially, 499 00:32:06,880 --> 00:32:10,320 Speaker 1: don't necessarily believe what you're reading here. Now that had 500 00:32:10,320 --> 00:32:14,320 Speaker 1: prompted some conservatives to argue that the platforms have a bias. 501 00:32:15,040 --> 00:32:17,240 Speaker 1: But you know, just a word on this. See, if 502 00:32:17,280 --> 00:32:21,040 Speaker 1: your claim is not supported by actual evidence, it doesn't 503 00:32:21,080 --> 00:32:26,400 Speaker 1: matter how loudly you want to shout it. It's still misinformation. 504 00:32:26,800 --> 00:32:30,479 Speaker 1: And if a site has a policy against the spread 505 00:32:30,520 --> 00:32:34,320 Speaker 1: of misinformation, that means you are violating the policy by 506 00:32:34,320 --> 00:32:36,600 Speaker 1: trying to spread it. And you know, just because you 507 00:32:36,640 --> 00:32:40,800 Speaker 1: want something to be true doesn't make it true.
Trust me, 508 00:32:41,040 --> 00:32:44,960 Speaker 1: I know this because otherwise I would be a bazillionaire, 509 00:32:45,360 --> 00:32:50,440 Speaker 1: because I want that to be true. Anyway, the narrative, 510 00:32:50,880 --> 00:32:54,240 Speaker 1: you know, that social networks are biased against conservatives and 511 00:32:54,280 --> 00:32:58,720 Speaker 1: are actively suppressing conservative speech, has played into other 512 00:32:58,840 --> 00:33:03,160 Speaker 1: stories with Facebook. Some of those internal documents revealed 513 00:33:03,200 --> 00:33:08,400 Speaker 1: that Facebook executives discouraged teams from clamping down on misinformation 514 00:33:08,440 --> 00:33:14,080 Speaker 1: campaigns or making tweaks to Facebook's algorithm to avoid amplifying 515 00:33:14,120 --> 00:33:17,360 Speaker 1: those claims, because they were concerned that if they made 516 00:33:17,360 --> 00:33:20,560 Speaker 1: those changes, it would add fuel to the fire and 517 00:33:20,600 --> 00:33:24,840 Speaker 1: give conservatives more quote unquote proof that the platform had 518 00:33:24,920 --> 00:33:29,960 Speaker 1: an anti conservative bias. Never mind that some of these 519 00:33:30,000 --> 00:33:33,960 Speaker 1: posts were getting amplified by the algorithm, such as posts 520 00:33:34,000 --> 00:33:38,600 Speaker 1: that included hateful rhetoric targeting vulnerable populations ranging from ethnic 521 00:33:38,640 --> 00:33:43,760 Speaker 1: minorities to the transgender community. These were all violating Facebook's 522 00:33:43,800 --> 00:33:48,240 Speaker 1: own rules, right? Facebook has actual rules about this. The 523 00:33:48,360 --> 00:33:54,560 Speaker 1: posts were definitely and definitively violating those rules, and yet 524 00:33:54,600 --> 00:33:58,000 Speaker 1: they were being amplified by Facebook's algorithm because they were 525 00:33:58,080 --> 00:34:02,040 Speaker 1: driving engagement.
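The engagement-driven amplification dynamic described above can be sketched in a few lines of code. To be clear, this is a toy illustration, not Facebook's actual ranking system: the post names, engagement numbers, and scoring formula are all made up for the example. The point it shows is simple: a feed ranked purely on engagement signals pushes the most provocative post to the top, with no regard for whether that post breaks content rules.

```python
# Hypothetical sketch of engagement-based feed ranking. A ranker that
# scores only on engagement will surface the highest-engagement post
# regardless of what the content actually is.

def rank_feed(posts):
    """Sort posts by a naive engagement score: reactions + comments + shares."""
    def engagement(post):
        return post["reactions"] + post["comments"] + post["shares"]
    return sorted(posts, key=engagement, reverse=True)

# Made-up posts: the rule-breaking outrage post has the most engagement.
posts = [
    {"id": "vacation-photos", "reactions": 120, "comments": 10, "shares": 2},
    {"id": "outrage-bait", "reactions": 900, "comments": 450, "shares": 300},
    {"id": "local-news", "reactions": 80, "comments": 15, "shares": 5},
]

feed = rank_feed(posts)
print([post["id"] for post in feed])
# -> ['outrage-bait', 'vacation-photos', 'local-news']
```

Nothing in that scoring function knows or cares whether "outrage-bait" violates a policy; that check would have to be a separate, deliberate step, which is exactly the tension the leaked documents describe.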
So the desire to avoid looking like a 526 00:34:02,080 --> 00:34:07,080 Speaker 1: biased platform in the executives' minds outweighed any desire to 527 00:34:07,200 --> 00:34:10,839 Speaker 1: avoid spreading harm. And yeah, this is borne out by 528 00:34:10,880 --> 00:34:14,480 Speaker 1: some of those internal documents that Haugen provided, so we've 529 00:34:14,480 --> 00:34:19,279 Speaker 1: actually seen the emails in which the story plays out. 530 00:34:20,200 --> 00:34:24,000 Speaker 1: But wait, there's more. One of the early stories prompted 531 00:34:24,040 --> 00:34:28,080 Speaker 1: by Haugen's whistleblowing activities focused on Instagram and the 532 00:34:28,120 --> 00:34:33,000 Speaker 1: possibility that Instagram could be harming people, particularly young women, 533 00:34:33,520 --> 00:34:36,839 Speaker 1: and that, worse, folks at Instagram are fully aware of 534 00:34:36,880 --> 00:34:40,920 Speaker 1: this and yet continue to make things worse. And the 535 00:34:40,960 --> 00:34:43,759 Speaker 1: news broke right around the time that we started hearing 536 00:34:43,760 --> 00:34:47,000 Speaker 1: about a plan that would see an Instagram app targeting 537 00:34:47,160 --> 00:34:50,600 Speaker 1: young users, like those younger than the age of thirteen. 538 00:34:51,160 --> 00:34:54,920 Speaker 1: Not a great look. So this internal document cited some 539 00:34:55,040 --> 00:34:58,600 Speaker 1: research that, you know, appeared to link a decline in 540 00:34:58,719 --> 00:35:02,680 Speaker 1: some users' self-image or their mental health to their 541 00:35:02,840 --> 00:35:05,360 Speaker 1: use of Instagram, you know, saying that there appears to 542 00:35:05,400 --> 00:35:09,640 Speaker 1: be some sort of link between the two.
Now, immediately, 543 00:35:09,760 --> 00:35:13,000 Speaker 1: lawmakers in the US showed concern, and I think we 544 00:35:13,080 --> 00:35:17,520 Speaker 1: all pretty much intuitively feel that overuse of platforms like 545 00:35:17,560 --> 00:35:22,680 Speaker 1: Instagram is probably harmful because the platform really promotes an 546 00:35:22,760 --> 00:35:26,759 Speaker 1: unrealistic view of people's lives. Right? If you spend any 547 00:35:26,800 --> 00:35:29,640 Speaker 1: time on Instagram and you start scrolling through it, you're 548 00:35:29,719 --> 00:35:33,759 Speaker 1: likely to see posts of really attractive people appearing to 549 00:35:33,840 --> 00:35:38,319 Speaker 1: live out fantasy lives in incredible settings, and it's hard 550 00:35:38,360 --> 00:35:42,560 Speaker 1: not to feel inferior by comparison. But this internal research 551 00:35:42,920 --> 00:35:46,880 Speaker 1: seemed to show that this wasn't a bug in Instagram, 552 00:35:46,880 --> 00:35:50,520 Speaker 1: it's a feature, and that Instagram, far from being ignorant 553 00:35:50,520 --> 00:35:53,760 Speaker 1: of this effect, was well aware and was still pursuing 554 00:35:53,800 --> 00:35:57,000 Speaker 1: efforts to expand its user base to even younger people. 555 00:35:57,880 --> 00:36:02,680 Speaker 1: Now, Facebook slash Instagram would try to refute this, saying 556 00:36:02,719 --> 00:36:06,160 Speaker 1: that the research only represents a tiny amount of information 557 00:36:06,560 --> 00:36:10,360 Speaker 1: and that these interpretations are wrong, and also that the 558 00:36:10,400 --> 00:36:14,080 Speaker 1: studies involved a very small sample size.
But that also 559 00:36:14,160 --> 00:36:17,520 Speaker 1: prompts discussions about, you know, how the companies under 560 00:36:17,560 --> 00:36:21,160 Speaker 1: Meta operate with a lack of transparency, and that saying 561 00:36:21,520 --> 00:36:24,400 Speaker 1: this is just a tiny bit of information isn't really 562 00:36:24,440 --> 00:36:27,399 Speaker 1: a good argument, because the company hasn't bothered to show 563 00:36:27,400 --> 00:36:31,440 Speaker 1: anything of the big picture that could actually contradict the 564 00:36:31,480 --> 00:36:35,560 Speaker 1: conclusions people have made based upon the quote unquote limited 565 00:36:35,719 --> 00:36:40,680 Speaker 1: information that they've had access to. So, you know, 566 00:36:40,680 --> 00:36:43,279 Speaker 1: it's kind of like when you're a kid 567 00:36:43,320 --> 00:36:45,759 Speaker 1: and you're playing a game like cops and robbers and 568 00:36:45,800 --> 00:36:47,680 Speaker 1: you say, bang, I shot you, and the other kid 569 00:36:47,760 --> 00:36:51,279 Speaker 1: says, no, you didn't. That's kind of what Facebook is saying. 570 00:36:51,280 --> 00:36:52,880 Speaker 1: You know, someone says the 571 00:36:52,920 --> 00:36:57,439 Speaker 1: study shows it's harmful, and they say, no, it doesn't. Well, 572 00:36:57,600 --> 00:37:00,719 Speaker 1: that's not really a good refutation, I would say. I 573 00:37:00,719 --> 00:37:03,200 Speaker 1: could probably do a very full episode or even a 574 00:37:03,239 --> 00:37:06,640 Speaker 1: series of episodes about the awful stories linked to Facebook 575 00:37:06,719 --> 00:37:09,960 Speaker 1: and Meta for this year, but instead I'm going to 576 00:37:10,080 --> 00:37:12,640 Speaker 1: limit myself to a few highlights.
So keep in mind, 577 00:37:13,239 --> 00:37:15,359 Speaker 1: I'm going to cover some more stuff about 578 00:37:15,360 --> 00:37:18,440 Speaker 1: Facebook before we close out this episode. But it's a 579 00:37:18,520 --> 00:37:21,400 Speaker 1: drop in the bucket compared to all the different stories 580 00:37:21,440 --> 00:37:24,279 Speaker 1: that came out this past year about that company. So 581 00:37:25,560 --> 00:37:29,080 Speaker 1: there were concerns about misinformation that obviously extend well beyond 582 00:37:29,200 --> 00:37:33,480 Speaker 1: US politics. I mentioned COVID nineteen earlier. There were 583 00:37:33,520 --> 00:37:37,840 Speaker 1: a lot of concerns about Facebook's role in spreading and 584 00:37:37,880 --> 00:37:42,200 Speaker 1: amplifying misinformation about COVID nineteen. In fact, some people went 585 00:37:42,239 --> 00:37:45,920 Speaker 1: so far as to say that Facebook's role 586 00:37:46,040 --> 00:37:49,040 Speaker 1: was actually leading to people getting sick and dying. I mean, 587 00:37:49,160 --> 00:37:53,040 Speaker 1: the United States President Joe Biden said as much. He 588 00:37:53,120 --> 00:37:57,040 Speaker 1: said the platform was quote killing people end quote. Now, 589 00:37:57,120 --> 00:38:00,799 Speaker 1: later he backtracked a little bit, backpedaled and clarified 590 00:38:00,920 --> 00:38:03,800 Speaker 1: himself, and said what he meant was that the misinformation 591 00:38:03,840 --> 00:38:08,560 Speaker 1: is what is killing people, not Facebook itself. However, 592 00:38:08,600 --> 00:38:14,120 Speaker 1: Facebook's algorithm promotes misinformation because misinformation drives engagement, and as 593 00:38:14,160 --> 00:38:18,960 Speaker 1: I've said many, many times, that is the end goal 594 00:38:19,160 --> 00:38:23,640 Speaker 1: of Facebook.
It's keeping as many folks on Facebook for 595 00:38:23,880 --> 00:38:26,959 Speaker 1: as long as possible in order to serve as many 596 00:38:27,000 --> 00:38:30,719 Speaker 1: ads as possible. It doesn't really care what the content is, 597 00:38:30,760 --> 00:38:35,080 Speaker 1: as long as it keeps people there. Well, then, tomato, tomato, right? 598 00:38:35,200 --> 00:38:38,040 Speaker 1: I mean, if the goal is just keeping people there, 599 00:38:38,080 --> 00:38:40,600 Speaker 1: and if the thing that keeps people there is misinformation, 600 00:38:41,400 --> 00:38:43,680 Speaker 1: and if the misinformation is what is leading to people 601 00:38:43,680 --> 00:38:46,040 Speaker 1: getting sick and dying, I don't think it's that big 602 00:38:46,080 --> 00:38:48,880 Speaker 1: of a stretch to actually say something like, hey, Facebook, 603 00:38:48,920 --> 00:38:53,640 Speaker 1: you're killing people, because, you know, the semantics of that don't really 604 00:38:53,680 --> 00:38:59,000 Speaker 1: matter. Ultimately, Facebook's playing 605 00:38:59,000 --> 00:39:02,400 Speaker 1: a part in the amplification and distribution of misinformation, and 606 00:39:02,440 --> 00:39:08,759 Speaker 1: that misinformation is killing people. That's where the logic concludes, right? Well, 607 00:39:08,800 --> 00:39:13,200 Speaker 1: then there's the story of the Rohingya in Myanmar. So 608 00:39:13,239 --> 00:39:16,040 Speaker 1: the Rohingya people are Muslim and they are also the 609 00:39:16,080 --> 00:39:19,560 Speaker 1: target of a genocidal campaign in Myanmar and have been 610 00:39:19,640 --> 00:39:24,239 Speaker 1: for a few years.
Anti-Rohingya posts flourished on Facebook 611 00:39:24,280 --> 00:39:28,200 Speaker 1: in Myanmar, some of which were posted by the 612 00:39:28,239 --> 00:39:33,280 Speaker 1: government itself, and hate speech circulated frequently on the platform, 613 00:39:33,360 --> 00:39:36,399 Speaker 1: and it fueled the government's campaign to wipe out an 614 00:39:36,600 --> 00:39:40,839 Speaker 1: entire people. Now, some survivors have brought a pair of 615 00:39:40,840 --> 00:39:46,600 Speaker 1: class action lawsuits against Meta slash Facebook, claiming that Facebook's 616 00:39:46,640 --> 00:39:51,240 Speaker 1: actions exacerbated a deadly situation in Myanmar, and they are seeking 617 00:39:51,280 --> 00:39:55,280 Speaker 1: a hundred fifty billion dollars in damages as a result. 618 00:39:55,320 --> 00:39:58,640 Speaker 1: Now that's a fairly new story, so it hasn't played 619 00:39:58,680 --> 00:40:02,120 Speaker 1: out yet. As for Facebook the company itself, as I've mentioned 620 00:40:02,120 --> 00:40:05,640 Speaker 1: a couple of times here, it officially rebranded as Meta 621 00:40:05,920 --> 00:40:08,920 Speaker 1: in October, in reference to the concept of a metaverse. 622 00:40:09,360 --> 00:40:11,920 Speaker 1: But I've covered that fairly recently, so we're not going 623 00:40:11,960 --> 00:40:14,400 Speaker 1: to dwell on it here except to say that the 624 00:40:14,440 --> 00:40:19,080 Speaker 1: metaverse as folks usually think of it is probably pretty 625 00:40:19,120 --> 00:40:22,640 Speaker 1: far off. The tech just ain't there yet, folks. And 626 00:40:22,760 --> 00:40:26,480 Speaker 1: the rebranding prompted a lot of cynics to suggest that 627 00:40:26,560 --> 00:40:28,920 Speaker 1: the timing was more about Facebook trying to make it 628 00:40:29,000 --> 00:40:32,600 Speaker 1: harder for folks to criticize the company, or to link 629 00:40:32,719 --> 00:40:37,120 Speaker 1: Meta the company to Facebook the platform.
I don't necessarily 630 00:40:37,160 --> 00:40:39,680 Speaker 1: think that's the case. If it is the case, I 631 00:40:39,680 --> 00:40:43,840 Speaker 1: don't think it's working very well. And discussions continue around 632 00:40:43,880 --> 00:40:47,200 Speaker 1: the world about the possibility of breaking Meta up into 633 00:40:47,239 --> 00:40:51,239 Speaker 1: smaller companies, to make Meta divest itself of things like 634 00:40:51,400 --> 00:40:55,560 Speaker 1: Instagram and WhatsApp. Meta slash Facebook has been in the 635 00:40:55,560 --> 00:41:00,880 Speaker 1: spotlight during several talks about anti competitive practices. We haven't 636 00:41:00,880 --> 00:41:04,320 Speaker 1: seen much action on that front, and there are plenty 637 00:41:04,520 --> 00:41:08,279 Speaker 1: of folks who are skeptical that if Meta ever did 638 00:41:08,360 --> 00:41:11,800 Speaker 1: break up, which is already unlikely according to these people, 639 00:41:12,239 --> 00:41:15,200 Speaker 1: that if it did break up, it wouldn't change very much. 640 00:41:16,280 --> 00:41:19,600 Speaker 1: That's entirely possible. It's outside my realm of expertise, so 641 00:41:19,640 --> 00:41:23,719 Speaker 1: I can't really comment on it. But yeah, that's 642 00:41:23,840 --> 00:41:26,279 Speaker 1: kind of what I wanted to wrap up with 643 00:41:26,440 --> 00:41:30,920 Speaker 1: Facebook in twenty twenty-one. Again, there's a lot more to say. 644 00:41:31,280 --> 00:41:35,279 Speaker 1: Those internal documents, they go so far and so deep, 645 00:41:35,320 --> 00:41:38,120 Speaker 1: and, you know, people like me 646 00:41:38,280 --> 00:41:43,280 Speaker 1: have still only seen a fraction of what is actually there.
Goodness 647 00:41:43,320 --> 00:41:46,560 Speaker 1: knows what else is there that could lead to 648 00:41:46,760 --> 00:41:51,040 Speaker 1: some pretty tough conversations around Facebook and its role in 649 00:41:51,160 --> 00:41:56,880 Speaker 1: various social ills and the amplification of those. All right, 650 00:41:56,960 --> 00:41:58,640 Speaker 1: you know what, I've got a few minutes. Let's go 651 00:41:58,680 --> 00:42:02,320 Speaker 1: ahead and talk about one other thing before we conclude, 652 00:42:02,360 --> 00:42:05,200 Speaker 1: and that'll be NFTs. I mean, can 653 00:42:05,239 --> 00:42:08,480 Speaker 1: I interest you in an NFT? That's 654 00:42:08,520 --> 00:42:11,480 Speaker 1: what we also refer to as a non-fungible token, 655 00:42:11,840 --> 00:42:14,200 Speaker 1: and I'll tell you, it's the product of the future. 656 00:42:14,719 --> 00:42:17,920 Speaker 1: And by product, I mean it's digital code linked 657 00:42:17,920 --> 00:42:21,000 Speaker 1: to a digital wallet, and arguably it doesn't mean much else. 658 00:42:21,200 --> 00:42:23,239 Speaker 1: But never mind that. Look at this shiny 659 00:42:23,440 --> 00:42:26,560 Speaker 1: NFT and buy, y'all. So yeah, NFTs 660 00:42:27,239 --> 00:42:31,440 Speaker 1: were huge in 2021, or at least they got a 661 00:42:31,560 --> 00:42:35,319 Speaker 1: ton of attention early in 2021. Now, to be clear, 662 00:42:35,719 --> 00:42:38,520 Speaker 1: NFTs predate 2021. It's not like they 663 00:42:38,560 --> 00:42:40,919 Speaker 1: were suddenly invented this year. But I think it's safe 664 00:42:40,960 --> 00:42:43,840 Speaker 1: to say that a lot of us, including myself, hadn't 665 00:42:43,880 --> 00:42:47,400 Speaker 1: really heard about them before this year. So let's just 666 00:42:47,440 --> 00:42:49,239 Speaker 1: get to it. What the heck is an 667 00:42:49,280 --> 00:42:52,839 Speaker 1: NFT and what's all the fuss about?
Well, I like 668 00:42:52,920 --> 00:42:55,279 Speaker 1: to call NFTs the equivalent of a 669 00:42:55,360 --> 00:43:01,160 Speaker 1: digital receipt. It's proof of ownership over some digital product. 670 00:43:01,719 --> 00:43:04,600 Speaker 1: That product could be an MP3 file, it could 671 00:43:04,600 --> 00:43:07,520 Speaker 1: be a video, it could be a piece of digital artwork. 672 00:43:07,920 --> 00:43:10,160 Speaker 1: It could even be a tweet or just a line 673 00:43:10,200 --> 00:43:13,479 Speaker 1: of code. There are things that are hard to quote 674 00:43:13,560 --> 00:43:17,600 Speaker 1: unquote own in the traditional sense in the digital world. 675 00:43:17,719 --> 00:43:21,200 Speaker 1: Like, you can't go out and purchase the original All 676 00:43:21,239 --> 00:43:24,560 Speaker 1: Your Base Are Belong To Us meme and then hang 677 00:43:24,640 --> 00:43:26,759 Speaker 1: it in your den, for example. It's not like a 678 00:43:26,800 --> 00:43:31,840 Speaker 1: physical piece of art where there's one single original work 679 00:43:31,880 --> 00:43:36,080 Speaker 1: and then lots of copies. It's a digital thing, right? 680 00:43:36,200 --> 00:43:39,280 Speaker 1: Folks can replicate and alter it to their hearts' content, 681 00:43:39,840 --> 00:43:44,280 Speaker 1: and there's no way to differentiate a copy from the original. However, 682 00:43:45,160 --> 00:43:49,040 Speaker 1: an NFT can represent a specific instance of 683 00:43:49,280 --> 00:43:53,279 Speaker 1: a piece of digital information, and it can show that 684 00:43:53,280 --> 00:43:57,800 Speaker 1: that instance belongs to someone specific. So NFTs 685 00:43:58,040 --> 00:44:00,759 Speaker 1: are these digital receipts, and they exist on 686 00:44:00,880 --> 00:44:06,319 Speaker 1: top of the Ethereum blockchain, the blockchain behind the cryptocurrency Ethereum.
Now that 687 00:44:06,360 --> 00:44:09,960 Speaker 1: means each transaction of an NFT is part 688 00:44:10,080 --> 00:44:13,880 Speaker 1: of the blockchain record. So just by referencing the ledger 689 00:44:14,080 --> 00:44:17,960 Speaker 1: for Ethereum, you can see who owns what. So I 690 00:44:18,000 --> 00:44:20,279 Speaker 1: can actually look at that ledger and say, oh, hey, look, 691 00:44:20,400 --> 00:44:23,000 Speaker 1: Josh Clark owns the NFT for the original 692 00:44:23,040 --> 00:44:26,800 Speaker 1: All Your Base meme. He doesn't; that's just an example, 693 00:44:27,000 --> 00:44:29,120 Speaker 1: or at least he doesn't to my knowledge. But that's 694 00:44:29,120 --> 00:44:33,280 Speaker 1: just an example. But what does that mean in practical terms? 695 00:44:34,239 --> 00:44:37,680 Speaker 1: That is harder to say, because while you've got proof 696 00:44:37,960 --> 00:44:42,240 Speaker 1: that you have ownership of that particular piece of digital work, 697 00:44:42,760 --> 00:44:45,600 Speaker 1: it doesn't change the fact that the digital item is, 698 00:44:46,280 --> 00:44:51,600 Speaker 1: you know, digital, and thus replicable and such. There's proof 699 00:44:51,640 --> 00:44:54,520 Speaker 1: that you own it, and you can always sell or 700 00:44:54,640 --> 00:44:58,279 Speaker 1: trade that proof of ownership to someone else, and that 701 00:44:58,360 --> 00:45:01,680 Speaker 1: transaction will then enter the blockchain ledger to show that 702 00:45:01,760 --> 00:45:05,799 Speaker 1: it has in fact changed hands. And there's been, you know, 703 00:45:06,239 --> 00:45:08,359 Speaker 1: a lot of that. There's also a lot of speculation 704 00:45:08,400 --> 00:45:10,600 Speaker 1: around NFTs, the idea that these are 705 00:45:10,640 --> 00:45:13,960 Speaker 1: things that will increase in value.
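[Editor's note: the "digital receipt" idea described above can be sketched in a few lines of code. This is a toy, in-memory model, not how Ethereum or the ERC-721 standard actually work; the class and token names here are invented for illustration. The point is just the mechanism: every mint and transfer appends to an append-only transaction log, and anyone can read current ownership back out of that ledger.]

```python
from dataclasses import dataclass, field

@dataclass
class NFTLedger:
    """Toy model of an NFT ledger: an append-only log plus an ownership map."""
    transactions: list = field(default_factory=list)  # append-only "chain"
    owners: dict = field(default_factory=dict)        # token_id -> current owner

    def mint(self, token_id: str, owner: str) -> None:
        # Create a brand-new token and record the mint in the log.
        if token_id in self.owners:
            raise ValueError(f"token {token_id} already exists")
        self.owners[token_id] = owner
        self.transactions.append(("mint", token_id, None, owner))

    def transfer(self, token_id: str, seller: str, buyer: str) -> None:
        # Only the current owner can sell; the trade enters the ledger.
        if self.owners.get(token_id) != seller:
            raise ValueError("seller does not own this token")
        self.owners[token_id] = buyer
        self.transactions.append(("transfer", token_id, seller, buyer))

    def owner_of(self, token_id: str) -> str:
        # Anyone can reference the ledger to see who owns what.
        return self.owners[token_id]

ledger = NFTLedger()
ledger.mint("all-your-base-meme", "alice")
ledger.transfer("all-your-base-meme", "alice", "bob")
print(ledger.owner_of("all-your-base-meme"))  # bob
print(len(ledger.transactions))               # 2
```

Note what the ledger does and doesn't do: it proves who holds the receipt, but nothing here stops anyone from copying the underlying digital file itself, which is exactly the practical limitation discussed above.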
So we've seen a 706 00:45:13,960 --> 00:45:17,520 Speaker 1: lot of investor speculation, but a lot of those values 707 00:45:17,600 --> 00:45:21,680 Speaker 1: haven't really held over time. There's also this thought that 708 00:45:21,840 --> 00:45:23,880 Speaker 1: NFTs are going to become more important 709 00:45:23,880 --> 00:45:26,600 Speaker 1: in the future with a metaverse. The idea here is 710 00:45:26,640 --> 00:45:29,400 Speaker 1: that you could purchase NFTs that represent 711 00:45:29,480 --> 00:45:34,000 Speaker 1: ownership of certain digital goods, and that, in theory, these 712 00:45:34,040 --> 00:45:38,040 Speaker 1: digital goods will be able to exist across multiple virtual 713 00:45:38,239 --> 00:45:42,080 Speaker 1: environments within the metaverse. So if you bought, say, a 714 00:45:42,160 --> 00:45:46,000 Speaker 1: really cool avatar design in one virtual environment, and you 715 00:45:46,000 --> 00:45:48,000 Speaker 1: have the NFT for it, you could in 716 00:45:48,040 --> 00:45:52,520 Speaker 1: theory port that over to other digital environments. So to 717 00:45:52,600 --> 00:45:56,000 Speaker 1: use a video game analogy, imagine you're playing a game 718 00:45:56,080 --> 00:45:59,680 Speaker 1: like Warzone and you purchase a skin in the 719 00:45:59,719 --> 00:46:01,920 Speaker 1: game to make your character look like, I don't know, 720 00:46:02,320 --> 00:46:04,640 Speaker 1: Ghostface, the killer in the Scream movies, because that 721 00:46:04,680 --> 00:46:07,400 Speaker 1: skin actually exists in Warzone. Warzone, if 722 00:46:07,400 --> 00:46:12,000 Speaker 1: you're not familiar, is a first-person shooter game. And so, yeah, 723 00:46:12,040 --> 00:46:14,560 Speaker 1: you can buy a skin that makes you look like 724 00:46:14,920 --> 00:46:18,839 Speaker 1: the killer from the Scream movies.
Then you decide, hey, 725 00:46:18,840 --> 00:46:21,399 Speaker 1: I want to start playing Minecraft, but I also would 726 00:46:21,480 --> 00:46:24,280 Speaker 1: love to use the skin I bought in Warzone 727 00:46:25,160 --> 00:46:28,160 Speaker 1: in Minecraft, so that if you see me in Minecraft, 728 00:46:28,360 --> 00:46:30,160 Speaker 1: I'm still wearing the same skin that I would be 729 00:46:30,160 --> 00:46:32,920 Speaker 1: wearing in Warzone. So no matter what game I'm in, 730 00:46:33,360 --> 00:46:36,200 Speaker 1: you can tell it's me because I'm wearing that skin. However, 731 00:46:36,719 --> 00:46:38,640 Speaker 1: you know we bought the skin in Warzone. That 732 00:46:38,719 --> 00:46:41,359 Speaker 1: means that, you know, we bought it in a game 733 00:46:41,360 --> 00:46:43,880 Speaker 1: that has no connection to Minecraft, so you can't just 734 00:46:44,000 --> 00:46:47,879 Speaker 1: port it over, right? Well, the metaverse, if it ever 735 00:46:47,920 --> 00:46:51,320 Speaker 1: actually comes to pass, is likely to consist of multiple 736 00:46:51,440 --> 00:46:55,960 Speaker 1: virtual environments that are created and maintained by different companies 737 00:46:55,960 --> 00:47:00,040 Speaker 1: and groups. Navigating from world to world will require a 738 00:47:00,040 --> 00:47:03,120 Speaker 1: lot of platform support, so the idea is that 739 00:47:03,160 --> 00:47:06,240 Speaker 1: NFTs could play a small part in this, providing 740 00:47:06,280 --> 00:47:09,360 Speaker 1: proof that a person has the right to use certain 741 00:47:09,400 --> 00:47:13,160 Speaker 1: designs as they navigate throughout the metaverse and pass from 742 00:47:13,239 --> 00:47:17,600 Speaker 1: environment to environment. However, that doesn't mean it's actually going 743 00:47:17,680 --> 00:47:21,520 Speaker 1: to happen.
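[Editor's note: a minimal sketch of the cross-world idea described above. Everything here is hypothetical: the shared ledger, the `can_equip` function, and the game names are invented for illustration, and no real game interoperates this way today. The sketch only shows that if multiple virtual environments agreed to trust one shared ledger, each could run the same ownership check before letting a player equip a skin.]

```python
# Hypothetical shared ledger mapping skin token IDs to their current owners.
SKIN_LEDGER = {
    "ghostface-skin": "jonathan",
}

def can_equip(player: str, skin_token: str) -> bool:
    """Any participating environment runs the same ownership check
    against the shared ledger before rendering the skin."""
    return SKIN_LEDGER.get(skin_token) == player

# Two different environments, one shared proof of ownership.
for environment in ["Warzone", "Minecraft"]:
    allowed = can_equip("jonathan", "ghostface-skin")
    print(f"{environment}: equip allowed = {allowed}")
```

The ownership check is the easy part; each environment would still need its own art assets and rendering pipeline for the skin, which is where the real cooperation problem lies.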
I mean, imagine if you could port over 744 00:47:21,560 --> 00:47:24,440 Speaker 1: your skin from Warzone to Minecraft; let's say that 745 00:47:24,520 --> 00:47:29,720 Speaker 1: there actually was that capability. Warzone uses fairly realistic graphics, 746 00:47:29,719 --> 00:47:32,960 Speaker 1: like the people in Warzone look like people. Minecraft 747 00:47:33,080 --> 00:47:36,480 Speaker 1: is very cartoony and blocky, so a Warzone skin 748 00:47:37,160 --> 00:47:41,520 Speaker 1: would look very different when realized in Minecraft, and you 749 00:47:41,520 --> 00:47:43,000 Speaker 1: would have to have some way to deal with that 750 00:47:43,160 --> 00:47:47,000 Speaker 1: to actually make it work within the two very different environments. 751 00:47:47,040 --> 00:47:49,200 Speaker 1: So what I'm saying is that while 752 00:47:49,360 --> 00:47:51,960 Speaker 1: NFTs could theoretically play a part in the metaverse, it 753 00:47:52,000 --> 00:47:55,920 Speaker 1: will require a ton of cooperation behind the scenes. So 754 00:47:56,000 --> 00:48:00,560 Speaker 1: it's not like it's a foregone conclusion.
Like, even 755 00:48:00,600 --> 00:48:02,759 Speaker 1: as companies like Nike get into the NFT 756 00:48:02,960 --> 00:48:06,400 Speaker 1: game and they start offering up virtual versions of Nikes 757 00:48:06,840 --> 00:48:09,840 Speaker 1: as NFTs, it doesn't mean that, should 758 00:48:09,840 --> 00:48:12,839 Speaker 1: the metaverse actually become a thing, you'll be able 759 00:48:12,880 --> 00:48:16,320 Speaker 1: to purchase a specific style of Nikes as an 760 00:48:16,400 --> 00:48:20,320 Speaker 1: NFT and wear it throughout all of the metaverse, unless 761 00:48:20,360 --> 00:48:23,279 Speaker 1: it's something that's all self-contained by one company, in 762 00:48:23,320 --> 00:48:26,000 Speaker 1: which case I would argue it's not really a metaverse 763 00:48:26,040 --> 00:48:30,600 Speaker 1: at all. But that's an episode for another time. Anyway, 764 00:48:30,920 --> 00:48:32,719 Speaker 1: the NFT craze really kind of hit a 765 00:48:32,719 --> 00:48:35,680 Speaker 1: peak in the early part of 2021, and of course 766 00:48:35,680 --> 00:48:39,000 Speaker 1: you still hear about them now, but I feel like 767 00:48:39,680 --> 00:48:43,160 Speaker 1: a lot of folks, apart from the evangelists who are 768 00:48:43,160 --> 00:48:45,080 Speaker 1: hoping to get rich quick off NFTs, 769 00:48:45,320 --> 00:48:48,160 Speaker 1: have kind of cooled off of them for now. Like, 770 00:48:48,280 --> 00:48:51,759 Speaker 1: people don't seem to be quite as jazzed about it 771 00:48:51,800 --> 00:48:55,160 Speaker 1: as when they were really being pushed in, say, like 772 00:48:55,320 --> 00:48:59,480 Speaker 1: March of this year. Okay, we've gone on pretty long. 773 00:48:59,520 --> 00:49:02,080 Speaker 1: We're gonna wrap this up now.
We are going to 774 00:49:02,239 --> 00:49:04,279 Speaker 1: continue to do a few more episodes about some of 775 00:49:04,320 --> 00:49:07,120 Speaker 1: the tech stories of 2021, because obviously a lot of 776 00:49:07,160 --> 00:49:11,440 Speaker 1: other stuff has happened. So when we return later this week, 777 00:49:11,880 --> 00:49:15,680 Speaker 1: we will keep on going until my psyche cracks under 778 00:49:15,719 --> 00:49:17,960 Speaker 1: all the things that have happened over the last year. 779 00:49:18,160 --> 00:49:23,200 Speaker 1: Like I said, years these days feel like they've lasted 780 00:49:23,200 --> 00:49:26,239 Speaker 1: an eternity and at the same time like they're over 781 00:49:26,280 --> 00:49:31,640 Speaker 1: in an instant. It's a weird paradox where I go from, 782 00:49:31,719 --> 00:49:34,160 Speaker 1: oh gosh, that happened this year? I could have sworn 783 00:49:34,200 --> 00:49:37,680 Speaker 1: that was like five years ago, to, oh wow, I 784 00:49:37,719 --> 00:49:42,560 Speaker 1: can't believe the year's over already. Time is funny, y'all. 785 00:49:42,640 --> 00:49:45,480 Speaker 1: All right, well, if you have suggestions for topics I 786 00:49:45,480 --> 00:49:48,960 Speaker 1: should cover in future episodes of Tech Stuff, please let 787 00:49:49,000 --> 00:49:51,120 Speaker 1: me know. Reach out to me on Twitter. It's the 788 00:49:51,120 --> 00:49:53,280 Speaker 1: best place to find me. The handle for the show 789 00:49:53,400 --> 00:49:56,640 Speaker 1: is TechStuffHSW, and I'll talk to 790 00:49:56,719 --> 00:50:06,359 Speaker 1: you again really soon. Tech Stuff is an iHeart 791 00:50:06,440 --> 00:50:10,160 Speaker 1: Radio production. For more podcasts from iHeartRadio, visit 792 00:50:10,200 --> 00:50:13,319 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 793 00:50:13,360 --> 00:50:14,720 Speaker 1: listen to your favorite shows.