Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio with a love of all things tech, and this is the tech news for Thursday, May 2021. If I sound a little different, it's because I'm actually in the studio today, though I'm using my old equipment that I use at home. But yeah, I'm in a different recording environment, just in case it sounds a little odd. Let's get into the news.

Speaker 1: Just as in Tuesday's episode, we will start today with some stories about global politics and global tech companies. As I mentioned earlier this week, big tech companies are really feeling pressure points from lots of different sources due to global politics and policies. Now, sometimes that pressure comes from, you know, a government or a group of governments, and sometimes it comes from advocacy groups. This particular story is about the latter. Numerous groups have urged Google to pull out of a cloud computing project in Saudi Arabia. The concern is that Google's efforts would help enable a regime that has committed human rights violations. So, in other words, Google might make it easier for the Saudi government to hurt people. Those groups include Human Rights Watch and Amnesty International. And you might remember, back in 2018, the terrible story of the Saudi journalist Jamal Khashoggi, who was murdered at the Saudi consulate in Istanbul, Turkey. The CIA concluded that the Crown Prince, Mohammed bin Salman, ordered that assassination.
Speaker 1: That particular event drew a great deal of attention towards Saudi Arabia and the Crown Prince, and while it was by no means the only horrific act attributable to the Saudi government, it is one that stood out. The rights organizations argue that Google's cloud product could allow the government to spy on its own people as well as people visiting Saudi Arabia, it could allow the government to limit freedom of expression, and it could have a negative impact on due process. Human Rights Watch published a response from Google that said the company had submitted its cloud project to an independent human rights organization in order to identify possible points where that tool could be abused, and that Google then quote "took steps to address matters identified as part of that review," end quote.

Speaker 1: All right, I'm gonna chime in here now. Usually you could say that tools are neither good nor bad; it's all in how we use them. So a hammer can be a useful construction tool, or it could be used as a weapon. And while Google might identify obvious points where they need to adjust their approach to try and prevent abuse, the fact is that if someone gets hold of a tool and they want to use it destructively, they typically will find a way to do that. And usually it's a way that other folks just didn't anticipate, because they're looking at it from a preventive standpoint as opposed to figuring out how you would use this to actually attack someone. So my point is that I'm not sure there is a way for Google to see this project through while also maintaining, you know, a spotless image with regard to any abuses the Saudis might commit. It feels a bit too close to them trying to assume plausible deniability, to me. But as of this recording, it appears that the project is still a go; it's still moving ahead.

Speaker 1: Moving over to Iran, the country's government has banned cryptocurrency mining for the next few months in anticipation of increased demand for electricity.
Speaker 1: Iran has issues with its electrical grid and has seen it fail in the past due to high demand, so they've had blackouts that have hit entire regions when energy consumption was much higher than normal. The ban on bitcoin mining and other cryptocurrency mining has already begun, and it's going to last until September 22. Now, as I've covered several times, cryptocurrencies that use a proof-of-work approach to mining, which includes bitcoin, suck up a lot of electricity. Now, that's not necessarily by design. In the old days, a bitcoin was worth practically nothing. You might have heard the story about the guy who used thousands of bitcoins to order a pizza, for example. Back in those days, the return on investment in bitcoin mining was really low, so it didn't make sense to have a big mining operation; you would spend more money running it than you would get back in the proceeds from mining. So a relatively small number of computers were actively mining on the bitcoin system. But as interest in the cryptocurrency grew, more people wanted to get in on it, and while demand increased, the supply remained, you know, pretty steady. Every ten minutes some more bitcoins are released, but it's not a huge amount. Well, that just meant that the value of the currency grew, and as the value went up, more people began to jump on board to try and mine cryptocurrency, because now mining was becoming profitable, and this led to the equivalent of a gold rush. Today, massive computer networks attempt to be the first to mine the next block, for the bitcoins that are released when you verify a block of transactions. These computers are running on super fast processors; typically they have state-of-the-art graphics processing units, or GPUs. That in turn has had a big impact on the video game industry, because it becomes really hard to get the best graphics card for your gaming rig when all the ding dang bitcoin miners are grabbing them.
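To make the proof-of-work idea concrete, here is a minimal Python sketch of the guessing game miners play. It is an illustration only, not Bitcoin's actual implementation (real mining double-hashes an 80-byte block header against a far harder target); the function name and the difficulty value here are invented for the example.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> tuple[int, str]:
    """Try nonces until the block's SHA-256 hash starts with
    `difficulty` zero hex digits. A toy stand-in for proof of work."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1  # every failed guess is electricity spent for nothing

nonce, digest = mine("batch of pending transactions", difficulty=4)
print(nonce, digest)
```

Each extra leading zero multiplies the expected number of guesses by sixteen, which is why the arms race between miners translates so directly into electricity consumption.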
Speaker 1: Add to that the energy you need to keep operations cool enough so that your computers don't overheat, and you end up consuming more electricity in bitcoin mining per year than the annual amount of electricity used by some countries, like Argentina. This has been a big point in conversations about fossil fuels and climate change, as not all bitcoin mining operations are dependent upon renewable energy sources, but it's also a strain on utility infrastructure, as is the case with Iran. And according to the analytics firm Elliptic, around four and a half percent of all bitcoin mining worldwide happens in Iran. With China cracking down on cryptocurrency for various reasons, which we covered on Tuesday, and now Iran putting the brakes on operations for several months, this could mean that the cryptocurrency market will still have a few more bumps in the road for, you know, the near future.

Speaker 1: Now let us pop over to the UK, where things are changing with regard to Uber. For the first time, Uber has recognized the legitimacy of a union formed by UK workers who drive for the company. The union, GMB, will be able to stand as a representative for UK drivers and negotiate deals with regard to stuff like earnings and benefits and pensions. This is a huge shift. Traditionally, Uber has maintained that Uber drivers are not Uber employees; they're not Uber workers. Rather, Uber says, these people are freelancers. They're contract workers who work with Uber but not for Uber, and so, argues Uber, the company really only has obligations to provide benefits and such to corporate employees, the people who work in Uber's offices. But several lawsuits have gradually kind of reshaped this issue, and the UK Supreme Court ruled that no, you know what, actually, Uber drivers should be considered workers, and they are thus entitled to those considerations.
Speaker 1: So now, in the UK, the GMB and Uber will meet every quarter to discuss driver issues and concerns, and Uber and GMB will also meet to talk about Uber's national living wage guarantee and what that looks like in the UK, as well as stuff like pensions. Drivers will have to sign up to join the union, because membership is not automatic. Also, while this is potentially great news for UK Uber drivers, the company has not made any similar moves in other countries. There are a lot of groups advocating for change, but I mean, Uber actively campaigned against that in California just recently. So so far, the UK is the only spot where we're really seeing this kind of thing, Uber working with, you know, a union and allowing its drivers to be recognized this way. It's also good to remember that Uber is a company that continues to lose money on a quarterly basis. In 2020, they averaged a loss of about a billion dollars per quarter; actually, when it was all said and done, they lost about 6.7 billion dollars in 2020. Now, that was bad, but still better than 2019, when they lost 8.51 billion dollars. So yeah, Uber's future is one that's really up in the air. The company keeps saying that, you know, they are on track for profitability. In fact, up until the end of last year, there was hope that they were going to be able to post a profit in 2020, but that did not happen. So maybe this year; we'll have to wait and see.

Speaker 1: European rights organizations, including Privacy International, have filed complaints against a company called Clearview AI. That company specializes in facial recognition technology. The complaints say that the company scrapes image data from various websites and services, and that this practice violates European privacy laws. Clearview AI has said that it has a database of more than three billion facial images, and the company frequently scans content from sites and services like Facebook, Instagram, LinkedIn, and YouTube.
Speaker 1: The company then organizes that database and sells access to it to various parties, like other private companies or law enforcement agencies. So the advocacy groups say that this practice goes far beyond what users expect when they post on these websites, and that this leads to not just an invasion-of-privacy concern but the potential for harassment and wrongful accusations and more. This isn't the first time groups have taken aim at Clearview AI; there are other inquiries and investigations going on in Australia, Canada, and the United States. Also, hey, if you're curious, you can actually go to Clearview's website and use tools to request any data that the company might have on you. You can also tell Clearview AI if you want to have your face omitted from clients' searches in the future. And if a lot of people did that, it would really undermine the value of Clearview's product. I'm not telling you to do it; I'm just saying it's an option.

Speaker 1: On Tuesday, after our last tech news episode went live, which is always the way, VMware disclosed a security vulnerability in its vCenter Server tool, and simultaneously issued a patch to address that vulnerability. This tool lets administrators manage computer virtualization, and let me give you a quick word on that in case you are unfamiliar with the term. Virtualization allows you to virtually divide up the assets of a physical computer as if it were more than one physical computer. Now, you might use virtualization so that your computer runs as if it were a totally different type of computer running on a different operating system. That's one version of virtualization, where you're not really dividing the machine up so much as having your computer kind of mimic a different type of machine by way of this virtual environment. Or you could use it to partition a computer so that different partitions have no contact with one another; they're siloed. This is important for the purposes of data security and that kind of thing.
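As a rough picture of that second, partitioning style of virtualization, here is a toy Python model of carving one physical host into isolated slices. It only sketches the bookkeeping idea; real hypervisors (VMware's ESXi, for example) enforce this with hardware support, and every name and number below is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class PhysicalHost:
    """One physical machine whose CPU and RAM get divided up."""
    cpus: int
    ram_gb: int
    vms: list = field(default_factory=list)

    def allocate_vm(self, name: str, cpus: int, ram_gb: int) -> None:
        used_cpus = sum(vm["cpus"] for vm in self.vms)
        used_ram = sum(vm["ram_gb"] for vm in self.vms)
        if used_cpus + cpus > self.cpus or used_ram + ram_gb > self.ram_gb:
            raise RuntimeError("not enough physical resources left")
        # Each VM only ever sees its own slice; the silos don't touch.
        self.vms.append({"name": name, "cpus": cpus, "ram_gb": ram_gb})

host = PhysicalHost(cpus=16, ram_gb=64)
host.allocate_vm("web-frontend", cpus=4, ram_gb=16)
host.allocate_vm("database", cpus=8, ram_gb=32)
```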
Speaker 1: Well, anyway, VMware says the vulnerability could mean that hackers would have the opportunity to inject malware into servers running this kind of software if there is a port on the server that connects out to the Internet, and a lot of servers do, you know; a lot of them connect out to the Internet in some way or another. So how bad is this vulnerability? Well, according to security analysts, if you were to rank vulnerabilities on a scale of one, being not very serious, to ten, being catastrophic, this one is a 9.8. It is a critical vulnerability. Anyone who works on servers and uses this software, that is, the vCenter Server tool, should immediately get the patch and install it on those servers, or restrict access to those servers, you know, disconnect them from the Internet until they can be patched. It is that big of a deal.

Speaker 1: Instagram has been testing a feature in several countries for the past two years that it is now rolling out worldwide, and it's an option that users can select that lets them hide the number of likes and views that they get on any given post. For some on social media, likes and views are kind of like currency. For influencers, they're a way to show relevance and potentially land sponsorship deals. But judging your own value by how many likes or views you did or didn't get on a post isn't necessarily healthy. I mean, I know I have fixated on that a few times, and it's always been a bad idea. I mostly do it over on Twitter, where I'll make what I think was a great tweet, and then I get, you know, kind of grouchy when there aren't that many likes for it, or shares, or anything. Now, that's just dumb of me, to do that to myself. Now, on Instagram, people can choose to hide those numbers and just use the app to share photos without, you know, thinking about whether the thing they posted was the best picture at the best time or whatever.
Speaker 1: And I think for a lot of users, that's actually really helpful. Now, obviously, for influencers, it's probably not something they're going to use very often, even though I think a few folks in the influencer category might benefit from taking a break from quantifying their own worth based on how people interact with their content. But hey, I'm just a grumpy old dude who shakes his fist at passing clouds, right? I'm an old content creator who thinks sometimes it's good to kind of distance yourself from that, because otherwise it can lead you down a pretty dark mental pathway. If you would like to activate this feature, you can log into Instagram, go to Settings, go to the Posts section, and you should see an option to hide like and view counts. Boom, sorted. People will still be able to like and view your stuff, obviously, but you won't have that number in plain sight, you know, mocking you or nagging you to do better, which, for me, I mean, that sounds ideal. It's not for everyone, but for people like me, I think it's a great feature. Well, I've got a little bit more news to go, but first let's take a quick break.

Speaker 1: We're back. And hey, do you use Twitter, like, a lot? And do you rely on Twitter to do stuff like make connections, sell your brand, or try to get Neil Gaiman to notice you? Well, if you're like me, the answer to those things is yeah, and you probably feel a little sad about it, like I do. Anyway, Twitter has a new service coming aimed just at people like you and me. It's their premium service, so yes, that does mean you will have to pay to get access to these new features, features like a bookmarking feature and an undo-tweets feature, which sounds like it's the closest thing we're ever going to get to an edit button. And, you know, there might be other features as well, but we don't know, because all this news comes from an app researcher named Jane Manchun Wong.
Speaker 1: She came across this information and shared it. Twitter itself has not yet commented on this planned service, but apparently, at least right now, it has the name Twitter Blue, and the subscription is a low 2 dollars and 99 cents per month. This, by the way, is a separate offering from the already announced subscription-based service that Twitter is going to offer up, which will allow, you know, Twitter users to set up a paid-for tier of access for their followers. So this would be like, if I set this up, then followers who really liked me could subscribe to this exclusive feed and get, you know, special content that I'm only posting to that version of my Twitter, and they're doing it for a recurring fee, and Twitter takes a share of that subscription price. That's something you're not likely to see me use, because, let's face it, I'm not important enough, and I would spend way too much time just trying to figure out what I could say in those, like, 280 characters that would be worth paying for. But these efforts show how Twitter is looking to diversify its revenue sources. The company has had issues attracting new users; that number has kind of plateaued, which means it's tough to grow revenue, and generally speaking, in modern-day business, that's a bad thing. It's not good enough to make money; you need to make more money than you made last year.

Speaker 1: Facebook has announced that the social media platform will begin to use its algorithm to counteract misinformation, which, you know, I think is a nice change of pace, since that algorithm so far has really helped elevate and propagate misinformation. The company says that now it will both label posts appearing to contain misinformation and deprioritize those posts from appearing in the feeds of other people, but it will also lower the visibility of the users who are sharing those posts.
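Facebook has not published the mechanics of this demotion, so purely as a hypothetical sketch of what a repeat-offender rule could look like, imagine something along these lines (every threshold and penalty value here is invented, not Facebook's actual algorithm):

```python
def feed_reach(flagged_shares: int, threshold: int = 3,
               penalty: float = 0.5) -> float:
    """Toy demotion rule: once an account has shared more than
    `threshold` posts that fact checkers flagged, every one of its
    posts reaches a shrinking fraction of feeds."""
    if flagged_shares <= threshold:
        return 1.0  # normal distribution in feeds
    # Reach halves for each flagged share past the threshold.
    return penalty ** (flagged_shares - threshold)

print(feed_reach(2))  # 1.0   -> business as usual
print(feed_reach(6))  # 0.125 -> the glass cage of sadness
```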
Speaker 1: So if I start sharing posts that have been found to contain outright lies and misinformation, what should happen is that, from that point forward, if I've done it enough, all of my posts, not just the ones containing misinformation, will start to show up in fewer feeds. It would be kind of like if I found myself in a glass cage of sadness that's soundproofed, and meanwhile there's a big old party going on all around me. I could yell at the top of my lungs about what's going on, or, you know, I could try and spread misinformation from inside my glass cage, or I could even just say I'm thinking of getting pizza tonight, and the folks around me wouldn't hear it, because Facebook is disincentivizing the spread of misinformation. They're deprioritizing those messages so they don't pop up for most people. And I actually feel like this tactic, assuming the system works, is a pretty good one. It helps cut down on the echo chamber effect, and it means that the people who try and spread misinformation purposefully are going to find their influence diminishing as a result. Now, the way this will work in practice is that Facebook will keep tabs on people who have repeatedly shared posts that Facebook fact checkers have flagged as containing misinformation. So if you do that enough times, you'll find yourself essentially demoted. Do I think this is going to solve the problem of misinformation spreading across Facebook? Probably not. A lot more has to happen for that to be a thing, which might require a massive change in how advertisers work with Facebook in order for that to come about. But I think it might be a step in the right direction, possibly. We'll have to see how it actually plays out when it's put into use.

Speaker 1: Okay, let's go back to England, shall we? Because our penultimate story is one that's all about fusion.
Speaker 1: Stars are giant nuclear fusion reactors in which elements fuse together and, in the process, release enormous amounts of energy. Or, as the somewhat apocryphal song goes, the sun is a mass of incandescent gas, a gigantic nuclear furnace, where hydrogen is built into helium at a temperature of millions of degrees. That's somewhat misleading, but still catchy. So with the sun, you're talking about four hydrogen atoms that fuse together to form a helium atom. Well, scientists and engineers have been trying to make a sustainable, controllable, usable fusion reactor here on Earth for years, and if they succeed, this could lead to an energy revolution. So remember when I was talking about concerns about how much electricity cryptocurrency requires? Well, if we had working fusion reactors, that wouldn't be as big of a concern; as long as the actual infrastructure of power lines and transformers, you know, all the physical stuff that conveys electricity, as long as that could hold up to the demand, we'd be pretty good. Nuclear fusion, unlike nuclear fission, does not generate dangerous radioactive waste, but nuclear fusion does have some big challenges. For example, to get fusion reactions, you're typically working with systems that have to generate enormous amounts of heat and pressure; we're talking about conditions that can be even more intense than what you would find at the core of the sun. That requires a lot of energy to do, which means that many prototype fusion reactors have encountered the problem that it can take more energy to generate the fusion reaction than you get out of the other side of it. Another challenge, one that some scientists in the UK think they've really overcome, has to do with exhaust gases, that incredibly hot helium gas that gets generated as a result of fusion. Now, these gases are more than a hundred million degrees Celsius. I mean, they are insanely hot; hotter-than-the-surface-of-the-sun hot.
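To put a number on the energy side of that, here is the standard mass-defect arithmetic for the sun's net reaction of four hydrogen atoms fusing into one helium atom, worked as a short Python calculation (the atomic masses are standard published values):

```python
# E = (mass defect) * c^2, expressed in atomic mass units and MeV.
U_TO_MEV = 931.494   # energy equivalent of one atomic mass unit, in MeV
H1_MASS = 1.007825   # hydrogen-1 atomic mass, in u
HE4_MASS = 4.002602  # helium-4 atomic mass, in u

mass_defect = 4 * H1_MASS - HE4_MASS    # ~0.0287 u goes missing
energy_mev = mass_defect * U_TO_MEV     # ~26.7 MeV released per helium atom
fraction = mass_defect / (4 * H1_MASS)  # ~0.7% of the fuel's mass

print(f"{energy_mev:.1f} MeV released, {fraction:.1%} of the input mass")
```

Only about 0.7 percent of the fuel's mass becomes energy, and that's still enough to keep a star shining, which is why a working reactor would be such a big deal.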
Speaker 1: Gases that hot are also going to cause a lot of damage to parts of the reactor as they exhaust out. When they come into contact with the sides of the reactor, they can start melting tiles, even tiles that are made out of very melt-resistant materials like tungsten. So making fusion reactions efficient is one area of research, and we've seen a lot of progress there; finding a way to cool those exhaust gases faster, so that they cause less wear and tear, is another big one. The exhaust system the scientists developed has a cool name, which, I mean, I guess is fitting because it's cooling gas: it is the Super-X divertor. It diverts the helium generated by fusion reactions down a long pathway to cool, and the cooling happens pretty fast; I mean, the energy loss is quick. And because the helium is traveling down sort of a straightaway, it's not banging into the tiled sides of the reactor structure. This is a great step, but we're still looking at a long-term implementation plan for fusion reactors. The estimate is that we're not going to see a commercially viable fusion reactor for at least another twenty years. So while they are promising, and if they work out they will transform the world, we still have a lot of work to do to address our energy needs in the meantime.

Speaker 1: Finally, let's talk about our perception of time. Now, we have all experienced the fact that we perceive time passing at different rates depending on what's going on; our perception isn't a constant. We're not like a reliable clock that is just ticking down each second with incredible precision. So when you're in the middle of doing something that you don't really want to do, time just drags on. When you're enjoying yourself, time zooms by. I mean, we've got that phrase, right? Time flies when you're having fun.
Speaker 1: But now researchers have seen, through some experiments, that in certain virtual reality experiences, a user's perception of time slows down, and the researchers believe that this is a fundamental aspect of the experience of VR, one that's not related to whether or not you're enjoying yourself while you're in the virtual experience; rather, it's inherent to the virtual reality experience itself. So, in other words, while you are in VR, you'll perceive that a lot less time has passed for you than what the clock would tell you when you take the headset off. That lying clock. So here's how they conducted the experiment. Test subjects were to complete a maze, you know, a virtual maze, like a computer game version of a maze. There was a VR version of this maze, and there was another one that used just a classic computer display. The subjects were supposed to try and complete the maze, and they were supposed to tell administrators when they felt that five minutes had gone by. So, you know, they're working on the maze, and when they think, oh, that felt like about five minutes, they'd speak up. And there was a reminder for them to actually do this that would flash on screen every eight seconds. However, the subjects were not told the frequency at which this reminder would flash, because if they had been, they could have used that as a way to measure the passing of time. So they just knew that this message kept popping up; they didn't know that there was, you know, an eight-second delay between messages. Anyway, at the end of it, the scientists said that those who were using the VR version on average suggested that five minutes had passed at around the six-and-a-half-minute mark. Or, the way they put it was that, quote, "an average of 28.5 percent more real time passed for participants who played the VR game than for those in the control group, with no difference in perceived duration," end quote.
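To unpack that quote with some quick arithmetic: if 28.5 percent more clock time passed for the same perceived duration, then a perceived five minutes works out to roughly six and a half real minutes, matching the figure above.

```python
perceived_minutes = 5.0
extra_real_time = 0.285  # 28.5% more real time for the VR group

real_minutes = perceived_minutes * (1 + extra_real_time)
print(f"{real_minutes:.2f} real minutes")  # 6.43, about six and a half
```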
Speaker 1: Now, there have been other studies kind of related to this issue, and some of them have conflicting results; that is, you know, they don't align with this experiment's results. So this is by no means the definitive answer as to whether or not we perceive time to pass more slowly in VR than in real life. Also, researchers have noted that if you're talking about extended play sessions, like half an hour or more, this effect has been seen not just in VR but in all sorts of gameplay modes, like traditional computer gaming and otherwise. So, in other words, it does look like we get back to that "time flies when you're having fun" saying, and that there really is some element of truth to it, in the sense that our perception of time is different while we're having a good time, and we think, oh, that was just five minutes, and then we look up and realize we've been playing Civilization for four days straight. Just one more turn.

Speaker 1: All right, that wraps up this episode of TechStuff. If you have suggestions for topics I should cover in future episodes, reach out to me. The best place to do that is over on Twitter. You won't even need to subscribe, because I'm not going to do a subscription-based thing; you won't even need the premium version of Twitter. Just regular old Twitter will do. And the handle we use is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.