Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech, and this is the tech news episode for Tuesday, March twenty-third, twenty twenty-one. Now, last week we had an episode that really went into how various companies and organizations are tracking you. But don't worry, we're only gonna do that a little bit in today's episode. So, which app do you think is the most invasive? TikTok? Nope, not that one. YouTube? Nope. Facebook? You're getting warmer. But according to the cloud storage company pCloud, the actual answer is Instagram, so we're still in the Facebook family. That makes sense, right? That tracks. The company came to this conclusion after reviewing updated app privacy labels. See, not that long ago, Apple updated its privacy policy, and it now requires companies to more thoroughly list out the ways in which those companies collect and use data through their apps. And this is one of the reasons why it took Google a long time to update the company's apps on iOS, because it should come as a surprise to no one that Google is collecting a lot of user information. Well, the same appears to be true of Instagram, only more so. According to pCloud, Instagram collects nearly eighty percent of users' personal data, including stuff like search history, location, and financial information (what bank do you use, where do you shop, that kind of thing), plus who your contacts are. And Instagram shares that information with various third parties, who presumably are paying a decent price for that level of access. Now, this is how companies like Facebook (which, by the way, was in second place behind Instagram) can market you to various advertisers. The more these companies know about each user, the more they can target that user with specific ads.
They can match that user up with advertisers, and being able to go to advertisers with the message, "Hey, our app is going to put your ads in front of the people who are most likely to act on those ads," that's a powerful selling point. We are well beyond the old days, where you might, as part of your marketing strategy, rely on putting up a billboard in a prominent location in town and hope you get as many eyeballs as possible. Now we have companies identifying which eyeballs are the most valuable to any given client and then sending those ads that way. Now, what's the moral of this story? Well, it's that we should all be aware of how apps are collecting our information, of how we are providing data to these apps. If we're okay with that, no worries, right? I mean, this is a personal thing. But if we're not okay with that, we need to consider whether those apps are really something we want to use, because there's not really an easy way for us to go in and cherry-pick which points of data can and cannot be used by any given app. I mean, just doing that alone would become a full-time job. Now, in the interest of full disclosure, I have Instagram on my phone, so I say this as someone who is both aware that the app is collecting a lot of data and still using that app. I'm one of those people. But I am also very, very boring, so my hope is that Facebook is not getting very much money at all for my information, because, come on, I'm lame.

According to the Korea Herald, the giant tech company LG is now considering just shutting down its smartphone division entirely. You might remember that earlier this year, LG announced that it was looking into the possibility of selling off its smartphone division to some other company and just getting out of the smartphone game. Now, apparently, no suitable parties have made an offer to LG's liking, so the company may just shut down that division entirely and then try to cut its losses.
And LG has been experiencing losses through its smartphone division. LG is the third-largest smartphone maker behind Apple and Samsung, at least according to market share figures from Counterpoint Research; depending on which analysts you look at, you get different numbers for these things. Anyway, despite being a big player in the smartphone space, the division has been operating at a loss for several years in a row. In fact, according to the website GizChina, LG has lost four point four three billion dollars total and has posted a loss at the end of every single one of the past twenty-three consecutive quarters. This is a pretty big deal. LG had even made a pretty big splash earlier this year at CES with the reveal of a rollable smartphone, a smartphone that can actually change screen sizes dynamically because it uses a flexible OLED display that can unroll as it expands. It also shows that the smartphone business is a really tough one to be in. Development costs are really high, it's a huge challenge to stand out when you've got so many different smartphone companies and models on the market all competing for the same customers, and it's a challenge to price units so that they are competitive in such a dense field. So, not a huge surprise, but it is sort of the end of an era, with LG getting out after playing such a big part in smartphones.

Atlas VPN released a report stating that in twenty twenty, there was a one thousand ninety-two percent increase in the development of malware aimed at the Mac operating system. Now, percentages are tricky things, right, because a percentage doesn't actually tell you the total numbers.
If only a hundred instances of macOS malware had popped up in twenty nineteen, a one thousand ninety-two percent increase would just mean around twelve hundred of them in twenty twenty, and that number would be dwarfed by the number of incidents you would expect for Windows-based operating systems. But in this case, the report found six hundred seventy-four thousand, two hundred seventy-three new samples of malware for macOS. Still, that's nothing. I mean, more than half a million is a lot, so it's not nothing, but it's not the same as what you see for Windows; that's orders of magnitude bigger with Windows. So I just want to be clear, because I don't want people saying my anti-Mac bias is coming into play. The same group, that is, Atlas VPN, found ninety-one point oh five million new Windows malware samples. That means that Windows would rack up as many examples of new malware in just about three days as Mac had for the entire year of twenty twenty; you can check that arithmetic in the quick sketch after this story. Still, this is a reminder that Mac computers are not magically immune to malware. I'm not actually sure what the perception is these days, but when I got started in podcasting, there was this kind of general belief among the public that Mac computers were effectively malware-proof. And there was, you know, some truth to that, but not because the Mac operating system was just magically better than Windows. It had more to do with opportunity. If you are someone who's developing malware, you probably want that malware to hit as many targets as possible, and the market share of Windows versus macOS machines was really out of whack. It just made more sense to develop malware targeting PCs because there were way more PC users out there. But we've seen a rise in instances of Mac-based malware over the past few years, and it's a solid reminder that there is no bulletproof operating system out there. No matter what type of machine we use to access files and the internet, we need to be wary of malware.
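For anyone who wants to double-check those numbers, here's a minimal Python sketch using the figures cited above. Note that the twenty nineteen baseline is implied by the reported percentage rather than something Atlas VPN published, so treat it as an estimate.

```python
# Back-of-the-envelope check of the malware figures cited in this story.

mac_2020 = 674_273          # new macOS malware samples in 2020 (per Atlas VPN)
windows_2020 = 91_050_000   # new Windows malware samples in 2020 ("91.05 million")
increase_pct = 1_092        # reported year-over-year increase for macOS malware

# A 1,092% increase means the 2020 count is (1 + 10.92) times the 2019 count,
# so the implied 2019 baseline (not a reported number) is:
mac_2019_implied = mac_2020 / (1 + increase_pct / 100)
print(f"Implied 2019 macOS baseline: ~{mac_2019_implied:,.0f} samples")

# How quickly does Windows malware pile up to a full macOS year?
windows_per_day = windows_2020 / 365
print(f"New Windows samples per day: ~{windows_per_day:,.0f}")
print(f"Days of Windows malware to match macOS's year: ~{mac_2020 / windows_per_day:.1f}")
```

At roughly a quarter million new Windows samples a day, the macOS total for the whole year is matched in a bit under three days, which lines up with the comparison above.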
And now we transition to a segment I like to call Robots Are Scary. And they can be. Over in the UK, the Defense Secretary, Ben Wallace, said that Britain's military will be able to achieve a greater effect with fewer actual soldiers in the future thanks to technology, and part of that involves drones. Now, the established strength of the UK Army in the mid-two-thousands was set at eighty-two thousand troops. This includes all people who have received basic training and then a secondary specialized training to focus on a particular role or area of expertise. Today, the army has seventy-six thousand five hundred personnel, including seventy-six thousand three hundred fifty soldiers. Wallace's plans would reduce this number to seventy-two thousand five hundred by twenty twenty-five. At the same time, the defense budget in the UK is set to increase by twenty-four billion pounds over the next four years. So the question is, where is that money going if the army is actually scaling back on the number of soldiers? Well, a big part of it is automated systems and drones, including replacing existing Reaper drones with Protector drones. That is a little bit confusing to some folks, I'm sure, because both the Reaper and the Protector are themselves variants of the Predator B class of drone. Other big expenses include establishing a national cyber force, building out a digital backbone for the purposes of rapid data sharing, and the development of a future combat air system. Technology is going to play a much bigger role, and in theory it will reduce the need for as many human soldiers as are currently in service in the UK, so at least some of that responsibility will fall to technology and the operators who are in charge of it, including the drones, which, by the way, are pretty terrifying things.
Meanwhile, on this side of the pond, Ben Kallos, a New York City Council member, has raised concerns about the New York police using robots to respond to a hostage situation that took place in the Bronx. The robot in question was a Digidog from Boston Dynamics, and according to Ars Technica, Kallos reacted with horror upon seeing this robot in use, which then prompted him to propose a ban on police forces owning or operating robots that are armed with weaponry. But I do want to be clear that the Digidog robot wasn't armed. It was only equipped with surveillance cameras, which gave police a view into an area that was considered too dangerous for a human officer to enter. It wasn't like Digidog was packing heat or anything. Even so, without weapons, the Digidog still has its critics. The American Civil Liberties Union has asked why the Digidog didn't show up on a police list of the surveillance devices they use. That's a problem, because New York recently passed a law that states law enforcement agencies have to divulge that kind of information. Groups like the ACLU are concerned that there aren't proper privacy protections in place that would prevent police from abusing this surveillance power on citizens. I did not realize how many words starting with P there were in that sentence until I actually said it out loud. But there have been a lot of people, myself included, who have been warning against the use of armed robots, because going down that pathway could lead to another type of arms race, and that's one that would undoubtedly lead to tragic consequences, whether by intent or by accident. Kallos himself isn't totally anti-robot. He said that utility robots like the Digidog are not really what he's concerned about, despite that initial, you know, reaction of horror, I guess. Nor would he want to see bomb disposal robots get banned either. But he is concerned about there being a slippery slope.
The Ars Technica piece actually quotes the director of the Ethics and Emerging Sciences Group at California Polytechnic, and he points out that what is a non-lethal robot today could be tweaked and modified to become a lethal robot in the future, and we've already seen what happens when we militarize police forces. To learn more about this, I highly recommend reading the full article. It is titled "New York lawmaker wants to ban police use of armed robots," and it's by Sidney Fussell, who writes for Wired.com; like I said, I found the article over at Ars Technica. It is extremely well written and well researched, so go check that out.

And now let's move on to some stories of varying degrees of weird. Our first one is that the final bid for Twitter CEO Jack Dorsey's first tweet has been made. The transaction has happened. Specifically, this was a bid for the NFT, or non-fungible token, version of that tweet. Now, I have an episode coming up later this week explaining what NFTs are and how they work, but for the purposes of this story, it's safe to boil it down and just say this is a way to certify a digital thing as being unique. Anyway, in this case, the NFT was Dorsey's first tweet, which reads, "just setting up my twttr." Dorsey posted that back on March twenty-first, two thousand six. Back then, Twitter had no vowels. I mean, the service would let you use vowels, but the company didn't use vowels in its name, so it was spelled t-w-t-t-r. The winning bid came from Sina Estavi, the CEO at Bridge Oracle. His winning bid was for two million, nine hundred fifteen thousand, eight hundred thirty-five dollars and forty-seven cents, which is not a round number, and I don't know if that number has any special significance.
He paid for the tweet using the Ether cryptocurrency, and Dorsey took the money, converted it to bitcoin, and then donated it to GiveDirectly, a charitable organization that gives money directly to those who need it. That two million and such-and-such dollars ended up being just under fifty-one bitcoin; the actual value was fifty point eight seven five one six six nine bitcoin, because that's how bitcoin works. Five percent of the bid actually went to the platform Cent, that's C-E-N-T, which is the platform that hosted the auction. Pretty wild.
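If you're curious how the sale's numbers fit together, here's a minimal Python sketch using the figures reported above. The implied exchange rate is derived from those figures rather than quoted anywhere, and it naively assumes the entire bid was converted to bitcoin.

```python
# Rough check of the Dorsey-tweet NFT sale figures cited in this story.

bid_usd = 2_915_835.47      # winning bid, in US dollars
btc_donated = 50.8751669    # bitcoin ultimately donated to GiveDirectly
platform_fee_pct = 5        # Cent's cut of the bid

# Implied BTC/USD rate, naively assuming the full bid was converted:
implied_btc_price = bid_usd / btc_donated
print(f"Implied bitcoin price: ~${implied_btc_price:,.0f}")

# The platform's five percent cut versus the remainder:
fee_usd = bid_usd * platform_fee_pct / 100
print(f"Cent's cut: ${fee_usd:,.2f}")
print(f"Remainder: ${bid_usd - fee_usd:,.2f}")
```

That puts the implied bitcoin price in the high fifty thousands of dollars, which is at least in the right neighborhood for March of twenty twenty-one.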
Also wild is that the actor William Shatner, perhaps best known as the original Captain James T. Kirk in the Star Trek franchise, has turned ninety years old, and he has also spawned an AI version of himself with the help of a company called StoryFile. Now, according to the CEO of StoryFile, there will be a video version of William Shatner that will not be a deepfake and will not be an avatar; in her words, it will be the real Shatner. What that actually means, I guess, is up to interpretation. But the idea is that this video version of Shatner will be able to interact with people and respond to people just as William Shatner himself would if he were, you know, doing a video conference with you. So, in other words, it should be a digital copy of William Shatner, though I have questions about how faithfully the video will recreate the experience of actually interacting with the star. Shatner said he wanted to create a way that would allow his family and friends to interact with him for all time. The video version should be up and running by May of this year, and I really, really hope that if you ask it very trivial questions about Star Trek, it will prompt the video Shatner to respond the same way the real Shatner did in an old Saturday Night Live sketch, in which he appeared as, uh, a guest at a science fiction convention and fields increasingly weird questions from a big group of nerdy Star Trek fans until he just explodes and yells, "Get a life, will you, people?" That's what I want from my Shatner interaction. I'll be disappointed if I don't get it.

And our final story for today's episode comes as a huge personal challenge for me, but I will do my best to keep things at the standards that we expect for TechStuff. In San Francisco, a couple who founded a company called uBiome have now been indicted on multiple fraud charges by the federal government. uBiome was in the, how do I put this, uh, fecal matter testing business. Now, the pitch was that this startup company would take your sample, run tests on it, determine, like, your gut health, and give recommendations on how to improve your general well-being. The only problem, according to the charges anyway, is that their methods were totally untested and without evidence, and there was no proof that they were at all effective. In the meantime, the company developed so-called clinical tests that they urged medical professionals to give to patients, but these tests, again according to the charges, also lacked any sort of actual validation or accreditation. The whole purpose of the tests was just to create a way for uBiome to seek reimbursements from health insurance companies. So, essentially, again according to the charges, this was about committing a type of insurance fraud, if in fact the tests had no validity and no, you know, medical necessity.
The story is very similar to that of Theranos, in that the pitch sounds plausible, right? A company that, you know, analyzes poop to determine gut health; that seems like it would be achievable for a startup kind of company. But according to the charges, the couple fooled an awful lot of people in the process of trying to get this business going. It sounds like a really crappy situation to me. Aw, beans, I done went and did the thing I said I wasn't gonna do anyway. But they also gained praise from various parties, including Gwyneth Paltrow's lifestyle company, Goop, which, I mean, for Goop to praise poop? Not a big surprise if you follow Goop. The story of uBiome should remind everyone that hype can be exciting, but it can also be really hazardous, not just to your investment, but potentially to your health. If you're counting on something that doesn't actually have any medical validity to it, you could be making really bad decisions. Before uBiome began to fall apart back in twenty nineteen, it reached a valuation of more than half a billion dollars. That is terrible. It tells us that there's something seriously broken in the tech investment sector. It's really just enough for you to give them all the stink eye.

Well, that wraps up all the stories for Tuesday, March twenty-third, or at least all the stories I wanted to talk about. We'll be covering more tech news later in the week. If you have any suggestions for topics I should cover in episodes of TechStuff, let me know. Reach out to me on Twitter; the handle for the show is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.