Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, May eighteenth, twenty twenty-three.

So first up, here in the United States, the Supreme Court decided not to hear two cases that otherwise could have forced a decision about the infamous Section two thirty rules. So, in case you need a refresher, Section two thirty is part of Title forty-seven of the United States Code. Probably need to add more context to that. It was introduced in the Communications Decency Act of nineteen ninety-six, which in turn is part of an even larger act called the Telecommunications Act of nineteen ninety-six. And the whole point of Section two thirty is that it gives online platforms protection from liability if users post stuff that's, you know, illegal on those platforms. So, in other words, let's say I created a YouTube video and my video contained illegal content in it. Well, YouTube slash Google slash Alphabet wouldn't be held legally responsible for what I did, because of Section two thirty. It's just the platform; I'm the one who committed the crime. So this protects the platforms from being held legally liable for hosting content that's illegal for whatever reason. It does get a little more complicated than that, but that's the basic idea.

And there were a couple of notable cases, big, big emotionally charged cases, that were recently submitted to the Supreme Court for consideration, and they could have served as a test for Section two thirty's legitimacy from a constitutional standpoint. But it turns out that's not happening, because the Court found that neither case had merit for other reasons. Like, they wouldn't hear the case not because of the Section two thirty thing, but for other reasons.
Essentially, the Supreme Court said that the cases were accusing platforms of violating the Anti-Terrorism Act and that that particular law shouldn't have applied in the first place. So there was no case there, right? Like, they can't use that law as the reason to bring a case against the company, because it doesn't apply. So the Supreme Court said it would not be weighing in with a decision about Section two thirty because the case isn't relevant. So you could say the Supreme Court has sort of punted the decision regarding Section two thirty down the field, and it will take some other legal matter in the future that involves Section two thirty to make the Supreme Court, you know, make an actual decision that settles the question about whether or not Section two thirty is constitutional.

The US state of Montana has become the first state in our nation to issue a ban on TikTok. The ban will not actually take effect until January first next year, assuming that the various challenges to this new law don't end up making the whole matter moot. So the justification for banning TikTok boils down to a concern that the Chinese Communist Party is essentially relying on TikTok to gather intelligence about US citizens and institutions. So the reason for banning TikTok is to prevent Chinese surveillance. TikTok, for what it's worth, disputes these accusations and says that no one from the Chinese Communist Party has access to data on its US data servers. The American Civil Liberties Union, or ACLU, argues that banning TikTok amounts to violations of the First Amendment, aka the right to free speech, due to the fact that folks depend upon the platform to express their views and to view others. So the ACLU's argument says the law is unconstitutional, so it should just be voided. It should not be put into place because it violates constitutional rights. From a technical perspective, banning an online service from a specific state comes with its own set of challenges.
If TikTok is allowed elsewhere, like, if it's available anywhere other than Montana, how do you prevent it from crossing state lines? So Montana says it will fine TikTok if the service continues to operate within Montana's state boundaries, and further, it will also fine app stores like Apple's and Google's if they do not prevent folks within Montana, citizens of Montana, from downloading the app. But again, this gets tricky. I mean, you could use a VPN, a virtual private network, which would make it look as if you're not in Montana. So you're in Montana, you decide you're going to use this VPN, and it makes you look like you're in North Dakota or something. Well, you've just bypassed whatever geofencing strategy was in place to prevent TikTok from getting to you. So does that then mean Montana would also have to consider a ban on VPNs to try and prevent the workaround?

I honestly think this law is a lost cause at a state level. I just don't think it works. It doesn't really work on a technical level. It may not work on a legal level, I'm no legal expert, and I don't think it works on a social level either. As for TikTok's potential for harm, I have some thoughts about that. I mean, it is true that TikTok is a subsidiary of a Chinese company, ByteDance, and it's also true that China has laws that compel citizens and companies to gather intelligence on behalf of the Chinese Communist Party. These things are true. But even if you were to wipe TikTok off the face of the earth, let's say we just obliterated it from space, because it's the only way to be sure, the Chinese Communist Party could still scoop up tons of information about US citizens, because, news flash, our personal data is floating around in various databases all over the place. Like, our activities online are constantly being tracked, and not, like, actively monitored, but certainly notated.
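To put the geofencing point from a moment ago in concrete terms, here is a minimal sketch of how a state-level block is typically enforced: the service looks up the region tied to the visitor's IP address and refuses to serve the app if that region is Montana. The lookup_region helper is hypothetical, standing in for whatever IP geolocation database a real service would use, and the whole thing is an illustration rather than anything TikTok or the app stores actually run. The weakness is visible right in the logic: a VPN swaps the visitor's apparent IP for the VPN exit server's IP, so the check never sees Montana at all.

```python
# Illustrative sketch only: how an IP-based state geofence might gate a download.
# lookup_region() is a hypothetical stand-in for a real IP geolocation database.

def lookup_region(ip_address: str) -> str:
    """Hypothetical helper: map an IP address to a US state code (e.g. 'MT')."""
    geo_db = {"203.0.113.7": "MT", "198.51.100.4": "ND"}  # fake sample data
    return geo_db.get(ip_address, "UNKNOWN")

def may_serve_app(client_ip: str) -> bool:
    """Refuse service when the visitor appears to be in Montana."""
    return lookup_region(client_ip) != "MT"

# A Montana user connecting directly is blocked...
print(may_serve_app("203.0.113.7"))   # False
# ...but the same user behind a VPN exit in North Dakota presents the VPN's IP,
# so the geofence only ever sees 'ND' and happily serves the app.
print(may_serve_app("198.51.100.4"))  # True
```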
There are records of all this stuff that we do online, particularly on things like social networks, and there's an industry that's grown up around the buying and selling of personal information. And I'm not even talking about clandestine stuff here, like for the purposes of espionage. The online advertising ecosystem depends upon this infrastructure of personal data being bought and sold. So even if we get rid of TikTok, there are plenty of ways anyone, including the Chinese Communist Party, could gobble up personal data, because there's so much that's out there that's just being bought and sold all the time anyway.

Now, that being said, I also recognize that TikTok has the potential to cause great harm, not by being a surveillance tool necessarily, but by serving up harmful content to users, particularly impressionable younger users, and a lot of young people use TikTok. However, that's also the case with lots of other social networks and platforms that serve up user-generated content. They also can be potentially really harmful to specific people, particularly impressionable young people. But this really gets more into how platforms depend heavily on algorithms to serve up content in an effort to try and keep eyeballs on the service for as long as possible. Right? Like, their whole goal is to keep you there and serve you ads, and the longer they keep you there, the more ads they serve and the more revenue they generate. So to do that, they design algorithms that are essentially looking for hooks. They're looking for things that interest you, and if you indicate that you're interested in something, like if you were to watch a specific type of TikTok video all the way through, the algorithm says, aha, this is what this person likes. Let's grab things that are similar to that and keep serving it to them so that they stay on here longer.
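That watch-it-all-the-way-through signal is the heart of the loop being described, so here is a minimal sketch of an engagement-driven feed under those assumptions. It is purely illustrative, not TikTok's actual system: items are tagged with a topic, a completed watch boosts the weight on that topic, and the next recommendation simply favors whichever topics have the highest weights, which is exactly how one strong interest signal starts to dominate a feed.

```python
# Minimal illustrative sketch of an engagement-driven feed, not any real platform's algorithm.
import random
from collections import defaultdict

CATALOG = [
    {"id": 1, "topic": "cooking"},
    {"id": 2, "topic": "cooking"},
    {"id": 3, "topic": "fitness"},
    {"id": 4, "topic": "fitness"},
    {"id": 5, "topic": "comedy"},
]

interest = defaultdict(lambda: 1.0)  # per-topic weights, start neutral

def record_watch(video: dict, completed: bool) -> None:
    """Treat a completed watch as a strong 'more like this' signal."""
    if completed:
        interest[video["topic"]] *= 2.0

def next_video() -> dict:
    """Sample the next video, favoring topics with higher interest weights."""
    weights = [interest[v["topic"]] for v in CATALOG]
    return random.choices(CATALOG, weights=weights, k=1)[0]

# Watching a couple of fitness videos to the end quickly tilts the whole feed toward fitness.
record_watch(CATALOG[2], completed=True)
record_watch(CATALOG[3], completed=True)
print([next_video()["topic"] for _ in range(10)])
```

Swap the fitness topic for something harmful and the concern in the next bit falls right out of this loop.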
If that thing that you watched happens to be harmful in some way, like, for instance, let's say it's promoting something like a behavior that falls into the category of anorexia, and you happen to have a vulnerable self-image issue, you could end up seeing video after video that's reinforcing that particular message, and that can be harmful. I think that's where TikTok poses a lot of harm. But at the same time, like I said, it's true across lots of different platforms. It's not unique to TikTok. It is insidious, it is a problem, but it's not a TikTok problem. It's bigger than that. So I guess what I'm really saying is that Montana's law, it seems like it's going to be challenging to enforce, it might not be constitutionally sound, which means it won't be around forever anyway, and in the end, the most tragic thing, I think, is that it's not actually addressing the problems that really do exist, whether TikTok is around or not.

On a related note, the Federal Trade Commission, or FTC, here in the United States has accused a fertility tracking app called Premom of being real loosey-goosey with sensitive data, and thus violating the FTC's Health Breach Notification Rule in the process. So obviously, if you are using a fertility tracking app, you are also sharing some very personal, private information with that app. This is data that traditionally would be entrusted to a healthcare provider, and healthcare providers need to follow very strict rules to keep patient information secure and private. That's like one of the top concerns for handling data in the healthcare sector. But the FTC says that Premom was sharing personal and private data with third parties, including advertisers, and that also included data that would make it trivial to identify a specific user, rather than anonymized data that would keep your identity secret.

And the FTC says that among those entities that it shared this information with were two Chinese mobile analytics companies that had previously been flagged as showing, quote, suspect privacy practices, end quote, according to Connecticut's Attorney General. So this is an example of what I was just talking about, that, you know, TikTok does not represent the sole weakness in protecting US citizens' private data, even just from China. The company Easy Healthcare has copped to the fact that it has inappropriately shared private information with companies, including two Chinese companies. It has agreed to pay a one hundred thousand dollar civil penalty and has marked another one hundred grand to be split between the states of Connecticut and Oregon, as well as Washington, D.C. Also, Premom will no longer be allowed to share personal data with third parties, and it also has to ask the third parties with whom it had previously shared information to delete that information. However, there's no legal requirement for those third parties to do that. There's no, like, enforcement teeth that will make them have to go and delete that information. So whether that happens or not remains to be seen, but I remain skeptical.

Okay, we've got a lot more stories to go over. Before we get to those, let's take a quick break.

We're back. So the British broadband and mobile provider BT Group has announced that over the next several years, from now to twenty thirty essentially, the company plans to cut around fifty-five thousand jobs. That would represent more than forty percent of its current workforce. Now, this isn't just some sort of disastrous news. BT Group, over the last several years, has been working to roll out internet fiber infrastructure as well as 5G deployment, and throughout that process they've had to bring on lots of extra hands to get stuff done, including a lot of contractors.
So part of this is just that once it actually has finished that project of creating this infrastructure and deploying 5G, it's not going to have the need for all those people who are currently making that happen. For another matter, BT Group anticipates that AI and automation are going to end up handling a lot of the tasks that are behind the scenes, and that will mean there will need to be fewer actual human beings doing that stuff. Teams themselves on BT Group's side will need to be smaller, or they won't need to be as large, if you want to look at it that way. So the company has already met with the Communication Workers Union, or CWU, to kind of talk this over, right, because obviously, with unions and everything, a company can't just willy-nilly make decisions to cut tens of thousands of people over the next several years. And the CWU says that BT Group really needs to focus on contractors first, you know, the people who had been hired specifically for things like building out this infrastructure, and to sunset those positions first. That's where the priority should be before you start to even touch company staff.

On the one hand, the story does feel like it's starting to lean a little bit into the fear and uncertainty and doubt about AI taking jobs away. There is an element of that here. But on the other hand, it also stresses how it is important for companies to try and be efficient and to avoid the trap of creating a workforce that's too large to support the actual amount of work to be done. You know, we've heard again and again in the States that tech companies had come to a conclusion, that leaders in tech companies, I should say, came to the conclusion, that they were overinflated in their workforce. They had too many people and not enough work to go around.
I'm sure that was true in some cases, and it doesn't do anyone any favors for a company to just kind of act as a holding facility for adults when they don't have anything to do. They're not being productive, they're not adding anything to the company, they're not adding anything to society. Their time could be better spent doing something else. Although it might be comforting to know that you've got a steady paycheck even if you don't have steady work. So yeah, there's both sides to that, and I can see both sides in that particular story.

Over in Italy, the government has actually set aside thirty million euros to create programs to help people improve their work skills, specifically in an effort to smooth the transition to a future where certain types of work are more likely to be automated or handed over to AI. So the goal is to identify sectors that are most likely to be impacted by automation and to prepare people who are currently employed in those jobs to learn marketable skills so that they can then change career paths to something that's more sustainable. I think that's great. I think it recognizes that if you have an unskilled workforce, that's harmful to everybody. It's not just the workers themselves, although clearly it's a hardship on them, because if you're suddenly out of work and you don't have marketable skills, it becomes very difficult, right? But beyond that, it is hard and bad for society at large for that to happen. It becomes an impediment to the whole, not just to the person. So it makes sense to build in these systems to try and help people prepare for the future, so that you have a minimal impact on both the individuals and the country as a whole. I do think it's going to require way more than thirty million euros to adequately prepare people for how AI and automation are going to disrupt multiple industries.

But it's a start, and something like that should be applauded, that there's actually effort being put in to work on this now.

Now for some really fun stuff. YouTube participated in essentially what amounted to some upfronts recently and made a few announcements that are sure to irritate certain users. So, first off, upfronts are a type of industry event. If you've never heard the phrase upfront before, here's what it is. It's a kind of event where a platform that carries some form of content, and thus advertises against that content, gets up front of actual advertising companies, or rather up in front of them. So it's pretty typical for these platforms, which can include everything from a streaming service to a cable television network, to trot out some talent. It becomes kind of a dog and pony show. They'll promote upcoming content, and it's all in an effort to get advertisers excited and to attract that sweet, sweet cash.

Well, YouTube held its own event that was pretty much an upfront, and during that event announced that one change coming to its service is that people who watch YouTube on a television may soon encounter ads that are thirty seconds long. They might get a single thirty-second-long ad rather than two fifteen-second ads, and these will be unskippable ads when they start a video, so you get a full thirty-second commercial before you can start watching whatever it is you're watching on YouTube on your television. Further, YouTube is going to test a feature that will show ads to people who pause a video. So you've got a video going, you need to pause it for a bit, and then an ad begins to play while the video itself is on pause. Now, the example that YouTube showed is not quite as obnoxious as what I first imagined. Like, to me, it sounded like the frame of the YouTube video would suddenly be taken over by a commercial.
No, apparently it's more like a banner that appears to the side of the video, and you might have a video ad playing out in that banner, but it doesn't replace whatever it was you were watching, so it's not quite that bad. Plus, they showed a dismiss button beneath the ad itself, which at least indicates that you could quickly click on dismiss so that that little automated ad stops playing. So, not as bad as I first imagined based upon the description, I guess. You know, it's not the worst thing in the world. But for people who are watching YouTube on television who are not part of YouTube Premium, I'm sure this will be frustrating. For people who are on YouTube Premium, you know, you're paying a subscription fee, you don't get ads at all, so y'all are good.

A professor at Texas A and M University reportedly gave several students an X grade, which indicates an incomplete. It's not a fail, but it is an incomplete. And this included students who were at senior level, who otherwise would have graduated but then were denied diplomas because they had a course where they had an incomplete. So why did the professor give incompletes to these students? Well, allegedly what happened is the professor assigned several essays, then took the essays that were submitted by students and fed them into what he referred to as chat GTP. Of course he meant ChatGPT, not GTP, but that mistake is easy to make. I'm not going to give him too much grief for that. I will say the Rolling Stones article, or the Rolling Stone article, it's not the band, the magazine, the Rolling Stone article was way more snarky about this than I will be. But anyway, he was asking ChatGPT if it had been responsible for part or all of the various essays, and apparently ChatGPT said it was responsible for at least some part of these essays. So, boom, students get an incomplete, because it appears that the work they submitted was not their own.
The problem with this is that ChatGPT doesn't really work that way. You can submit material to ChatGPT that it definitively did not create and then ask it if it created the material, and it might say it did, or it might say it could have, which isn't quite the same thing but still, you know, raises doubt. People have actually shown this off by using passages from classic novels, and ChatGPT just confidently says maybe it actually wrote that. So you could say, wow, according to ChatGPT, it created Great Expectations or Pride and Prejudice, which would be quite a trick for ChatGPT to have done. So the students are understandably upset that they got an incomplete and were accused of plagiarism, essentially of foisting their work onto an AI chatbot, when they did no such thing, and the professor did not realize that ChatGPT can't be relied upon to indicate whether or not it generated a particular work. Heck, some folks went so far as to dig up the professor's own doctoral thesis from when he was a graduate student, submitted passages to ChatGPT, and asked if it wrote the professor's thesis, and ChatGPT essentially said, huh, yeah, I might have written this.

Now, as I said, Rolling Stone has a pretty snarky article that throws massive shade at the professor for this, and I get it. Holding up a person's diploma through the misapplication of technology is a big deal. But on the flip side, and at least a little bit in the professor's defense, the discussion around ChatGPT and education has been so dramatic and so disruptive over the past several months that I think it's natural for educators to be concerned that students are passing off AI-generated work as their own work. That is an understandable concern. The problem is you can't trust the robots to claim authorship, because those rotten wotsits will say they wrote stuff that was published one hundred years ago, and clearly that's not the case.
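For anyone wondering what the professor's check actually amounted to, here is a minimal sketch of that workflow using the OpenAI chat endpoint as the Python library exposed it around the time of this episode. It is an illustration of the flawed approach, not a recommended detector, and the exact prompt and model name here are assumptions: the model keeps no record of what it has or has not generated, so its yes-or-no answer is just another generated guess, which is exactly why it will happily claim novels from the eighteen hundreds.

```python
# Sketch of the unreliable "ask the model if it wrote this" check, for illustration only.
# Uses the openai Python library's chat endpoint as it existed in mid-2023; requires an
# OPENAI_API_KEY to be set, and is NOT a valid way to detect AI-written text.
import openai

def naive_authorship_check(passage: str) -> str:
    prompt = "Did you write the following passage? Answer yes or no.\n\n" + passage
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    # Whatever comes back is a freshly generated guess, not an authorship record.
    return response.choices[0].message.content

# Feed it the opening of a novel from 1813 and it may well still answer "yes".
print(naive_authorship_check(
    "It is a truth universally acknowledged, that a single man in possession "
    "of a good fortune, must be in want of a wife."
))
```

Nothing in that call consults any log of past outputs; the answer is produced the same way the essay itself would have been.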
So yeah, kind of an absurdly comical situation here, if it weren't for the fact that it also means a bunch of students were denied the chance to graduate with a diploma, at least temporarily, because of this incomplete. There are people working towards trying to get all this resolved, but as I recorded this, I didn't have an update to give about where we are in that process.

All right. Hey, so you know how AI large language models are trained by analyzing tons of data from various sources? Like, ChatGPT is built on top of a model that crawled through millions and millions of web-based documents. Well, what if you did that same thing, but instead of using the web, you turned to the content on the dark web as your training material? Of course, the dark web is inaccessible through normal links on the World Wide Web. You typically get to the dark web by using special types of browsers that allow you to access these sorts of things, and you can encounter all sorts of stuff, like, you know, hacker communities that post malware so that you can take it and tweak it and deploy it, and obviously stolen information is bought and sold on the dark web. Well, I would say, don't do that. Don't train AI on dark web material, not because I think it's going to create dangerous AI, but because someone already beat you to it. Some researchers in South Korea introduced an AI system that they call DarkBERT, and they trained it on information exclusively from the dark web. So BERT in this case actually stands for Bidirectional Encoder Representations from Transformers, and it was originally created by Google back in twenty eighteen. BERT, that is, was created by Google. And then Meta researchers took BERT and they continued to evolve it. They began to tweak it, change it a little bit, and they turned it into a new AI system called RoBERTa. Cute, right? By the way, this was back when Meta was still just Facebook, but of course today it's Meta.
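To make the BERT to RoBERTa to DarkBERT lineage a bit more concrete, here is a minimal sketch of what continuing to pretrain a RoBERTa-style model on a new, domain-specific corpus generally looks like with the Hugging Face transformers library. This is purely illustrative and assumes a toy list of documents standing in for the corpus; it is not the South Korean team's actual pipeline, and the real DarkBERT work obviously involved a much larger, carefully filtered dark web crawl.

```python
# Illustrative sketch: continue masked-language-model pretraining of RoBERTa on a
# small domain corpus (a stand-in for a dark web crawl). Not the DarkBERT pipeline.
import torch
from transformers import (
    RobertaTokenizerFast,
    RobertaForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

corpus = [
    "toy document one standing in for domain-specific text",
    "toy document two standing in for domain-specific text",
]

tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")

class TextDataset(torch.utils.data.Dataset):
    def __init__(self, texts):
        self.encodings = tokenizer(texts, truncation=True,
                                   padding="max_length", max_length=64)
    def __len__(self):
        return len(self.encodings["input_ids"])
    def __getitem__(self, idx):
        return {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}

# Random 15% token masking gives the model its fill-in-the-blank training signal.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True,
                                           mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="domain-roberta", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=TextDataset(corpus),
    data_collator=collator,
)
trainer.train()  # enough of this on real domain text yields a DarkBERT-style model
```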
RoBERTa then provided the foundation for these South Korean researchers. They took that framework, but they trained it on information from the dark web, and thus we get DarkBERT. Apparently, they say, it worked really well, like surprisingly well, and tools built on top of this AI model perform at least as well as, if not better than, other AI tools. So, for example, if you were to create an AI chatbot based on this model, it might end up being as impressive as ChatGPT. That being said, they are not going to unleash DarkBERT on the general public. They're going to keep it under wraps. They are going to allow academic researchers access to it, so there can be academic applications to try the tool out, test it out, and to develop different techniques. It may also be used in ways to get a better understanding of how the dark web works from kind of an architectural approach, which could be really useful in everything from cybersecurity to government investigations. So it is important. But I just wanted y'all to know that it's not like an even more evil version of ChatGPT is going to be running on the loose out there.

Okay, we're going to take another quick break. When we come back, I've got a few final stories to cover for this week.

All right. So the Pew Research Center, which has done lots and lots of surveys about various things connected to technology in general and the Internet in particular, recently held a survey that found six out of ten respondents, so sixty percent of their respondents, indicated that they've taken a bit of a break from Twitter since Elon Musk took over. I don't know how many people were actually involved in this survey; I just know that around sixty percent of them said that this was the case. So forty percent said they had not taken a break, sixty percent said they had, and those breaks ranged from like a couple of weeks to essentially leaving the platform since Musk took over.
People of color were more likely to say that they had taken a break than white users were. But interestingly, other major factors didn't seem to show as much of a difference. So, for example, there was very little difference found between people who leaned conservative versus people who leaned liberal. Both conservatives and liberals indicated that about sixty percent of them had taken a break from Twitter recently. And things got a little more complicated when you started to break it down by gender, also within political leaning. But I'm not going to dive into all of that, because it would take up too much time, and honestly, I don't know what conclusions you could draw from it either, other than one really big takeaway, which is this: if that survey is reflective of a larger trend, and that's a big if, you never know, like if the survey size was really small, then you can't really make any big predictions based off that. But if it's a representative survey and if the findings are true, it could mean that Twitter's new CEO is going to have a lot of challenging work ahead of her to pull Twitter out of the doldrums. Because it's not enough to tell advertisers, we value you and we want you back on the platform, because, you know, famously Twitter has lost a lot of ad revenue since Musk took over. They also have to show that their platform is a place where users want to be. And if advertisers are seeing reports that sixty percent of Twitter users are kind of jumping ship, that's not a great selling point for them to come back to the platform, because, you know, the people just aren't there anymore, so why would you spend money to advertise there? So yeah, I think this is bad news for Twitter overall, if in fact the survey is delivering dependable information, and again, that's a big if. It would need, I think, further investigation to make sure that that's actually what's happening.

Hey, do you remember Elizabeth Holmes?
She's the disgraced founder of the medical tech company Theranos. So if you don't remember her, here's a very quick overview of who she is and what she did. So Holmes dropped out of Stanford and went on to found a company whose aim was to create a medical device capable of testing a micro drop of blood for more than one hundred different medical conditions and diseases. So with a teeny tiny pinprick, you would, in theory, be able to submit that sample to this device, which theoretically would be small enough to be like a desktop printer, and run banks of tests on it to determine if you were at risk for any particular medical conditions. And Theranos received a lot of positive press in the early days. Like, they were talking about it as the democratization of medicine, making medicine and proactive health care far more accessible. And there were heavy hitter investors who poured hundreds of millions of dollars into the fledgling startup.

But then, a few years later, an exposé revealed that things were pretty shady behind the scenes at Theranos. The exposé claimed that advancements in Theranos technology did not stand up to scrutiny, that the company was making claims it could not back up, that it was outright misleading investors regarding technological progress, and that it was relying on various competitor technologies to make it look like Theranos tech was working. So charges of fraud and other things were brought against Holmes and some other folks at Theranos, and Holmes was ultimately found guilty of at least some of those charges. And she was headed to prison when she decided to petition the court to ask to allow her to remain free while she challenges the conviction. She's attempting to have her conviction overturned, and she said in the process she would very much like to not be in prison, please. And the judge said nah, nah, you're going to the pokey. So now Holmes appears likely to be headed to prison in a couple of weeks.
And on top of that, the judge has also levied a four hundred and fifty-two million dollar judgment in restitution that Holmes is supposed to pay the various victims of her crimes. And when I say victims, keep in mind I'm mostly talking about really rich people who put money into her company. I'm not necessarily talking about the folks who were depending upon Theranos to deliver reliable medical information so that they could make the right decisions regarding their health care. No, those aren't the victims that the court's particularly concerned about. They're concerned more about, you know, Rupert Murdoch, who's obviously really hurting for cash. Anyway, the moral of this story should be that the Silicon Valley mantra, move fast and break things, doesn't, you know, apply to breaking the law, at least not when it means that rich people lose money, because, like I've said many times before, they hate that.

Finally, Bloomberg's Mark Gurman reports that Apple had to make lots and lots of concessions while designing the upcoming mixed reality headset that we expect to see unveiled at some point this year, possibly at the Worldwide Developers Conference, or WWDC, in June. Now, the fact that Apple made concessions is not a surprise. We have heard about this before. I think everyone has heard that the initial hope was Apple was going to produce an augmented reality headset that would appear indistinguishable from a stylish pair of eyeglasses. However, the technical requirements that were needed to achieve the desired performance meant it just wasn't plausible for a company to get both that performance and that form factor in one package. You could either have a stylish pair of eyeglasses that had very limited utility, or you could have a more useful device, but it is definitely not going to fit into a small form factor. So the headset we're getting, which is reportedly called Apple Reality, will feature a screen that will feed live video from external cameras to the viewer.

So it's kind of like if you're holding your smartphone up to your eyes, you've activated your smartphone's rear-facing camera, and you're just looking at a live feed of the world around you. That's essentially what this is doing when it's working in augmented reality mode. It'll also be able to do virtual reality applications. It will connect to a separate battery pack, so there will be some sort of a cable, I guess, attaching the headset to a battery pack that you would wear somewhere else, maybe like on a belt or in a pocket or something. And reportedly that's so it can take some of the weight off the headset itself, so that it's a little more comfortable to wear; you're not wearing both a screen and a battery pack. And also it'll give you a little more juice, so that you could actually use the darn thing for more than, like, half an hour, right?

So Apple's had to make lots of compromises in its quest to build this gadget, and I have a feeling that the company is really hoping that it becomes similar to the iPhone. Right? Because the iPhone was not the first smartphone on the market. It was the first smartphone to get massive consumer interest and demand. That's what really set the iPhone apart. Not that Apple was first, but that Apple was able to refine the approach to that gadget and get the general public really excited about it. I think they're hoping for the same thing with this mixed reality headset, because, as we've seen, lots of other companies have introduced mixed reality gadgets, but Apple is hoping to kind of define that market, to not, you know, be the innovator, but the best in class. We've also heard that the price of this particular technology is likely to be somewhere in the neighborhood of three thousand dollars, which, yikes, that's super expensive. I think even hardcore Apple fans might hesitate before dropping three g's on strapping a screen to their face. But then, I've been wrong about them before, so who knows.
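For anyone who wants the holding-your-phone-up-to-your-eyes comparison in concrete form, here is a minimal webcam passthrough loop in Python with OpenCV. It is obviously nothing like Apple's actual headset pipeline, which would involve per-eye rendering, low-latency compositing, and dedicated hardware; it just shows the basic idea of piping live camera frames straight to a display, with any augmented content drawn on top of each frame before it is shown.

```python
# Minimal sketch of the "video passthrough" idea: show the live camera feed on a screen.
# This is just a webcam loop with OpenCV, not Apple's headset pipeline.
import cv2

capture = cv2.VideoCapture(0)  # the outward-facing camera
try:
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        # A headset would render one such feed per eye and composite virtual
        # objects onto the frame before display; here we simply show the frame.
        cv2.imshow("passthrough", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
finally:
    capture.release()
    cv2.destroyAllWindows()
```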
Okay, that's it for the tech news for today, Thursday, May eighteenth, twenty twenty-three. I hope you are all well, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.