Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? It's time for the tech news for Tuesday, February twenty-first, twenty twenty-three, and we're going to start this episode off with a news segment that I would like to call Blame the Robots. So The Washington Post published an article by Pranshu Verma titled "AI is starting to pick who gets laid off," which is perhaps a bit sensationalized once you read the actual story, but maybe only a little bit sensationalized.

Speaker 1: All right, so here's how this all unfolds. There are software packages and various services and algorithms that some hiring managers rely upon in order to do kind of a first-pass filter across job applicants in order to narrow down the search. There are companies that offer that very service, and it starts to make sense if you're hiring at a company that gets a lot of attention. So let's say that you're a popular company, you have fairly rare job openings, and you get lots, like maybe thousands, of applicants per job that you list. Well, you probably need some help whittling down the applicant list to get to a manageable pool of potential hires, right? Like, you need something to separate the cream from everything else. It's not easy to do, especially at really high volumes. And so these services and software packages essentially reduce applicants down to data points, and they kind of have to. And depending on how tough that filter is, folks can get weeded out. Sometimes a lot of folks. Maybe you get down to, like, less than a dozen applicants out of thousands, which has to be pretty brutal. Well, this article postulates that we could see the reverse come into play as well: that a company might lean on similar software and services to identify the people who contribute the least to the company, or perform at a level that's considered to be below their peers.
Speaker 1: Therefore, they could be candidates for layoffs when the corporate overlords deem that it is time to reduce headcount in these troubling times. And therein lies the story, right? That an algorithm might determine that you are expendable instead of your human boss. And I am aware that I'm making an assumption here that your boss is in fact human. Some program, you know, some freaking robot, has determined that you're getting laid off. It sounds positively dystopian, doesn't it? And with big tech companies laying off thousands of folks over this year and the previous year, it's easy to imagine managers shrugging off the responsibility of telling someone they no longer have a job by leaving it to the old zeros and ones of a presumably objective and emotionless system.

Speaker 1: But this in turn brings up other problems. As I have mentioned in tons of episodes, one of the many problems we have with AI systems can come down to unintended bias within the system itself. So if the system is biased, it could end up targeting employees of specific ethnicities or backgrounds. Now, Verma makes this argument in the Washington Post article. He says that if the algorithm were to, say, determine that people of color have a higher incidence rate of leaving their jobs, that a person of color is more likely to leave their job than, say, their white colleagues, well, then the system might naturally start to target employees who happen to be people of color for the purposes of layoffs. But then you're getting into very dangerous legal and ethical territory. It's as if you're targeting these specific people because of their race. Also, I'm not sure how well an algorithm can actually judge a person's contributions. Presumably stuff from employee reviews and such would play a big part, but in highly collaborative work, a person could act as the sort of linchpin that keeps a team working really well together, even if they themselves don't have the highest numbers on whatever the deliverables are.
Speaker 1: So in my opinion, relying on AI to make or even guide decisions regarding layoffs is really a bad move all around. It can make sense in the applicant phase, but in layoffs, I would say avoid it. It doesn't look good for the company, and it could ultimately lead to choices that will harm the overall organization in the long run. In this article, Verma mentions that while folks at Google wondered if perhaps they had been laid off due to an algorithm choosing them, because there didn't seem to be much rhyme or reason to the layoffs, the company denies making use of anything of the sort. There's kind of a distinct lack of cases where we know that an algorithm definitively played a part in layoffs. However, Verma in the article also cites a survey that showed ninety-eight percent of HR managers, and there were three hundred of them participating in the survey, said that they plan to rely on software and algorithms to help make such decisions about layoffs this year. So even if you were to argue it hasn't happened yet, it looks like it's going to happen real soon. My guess is we'll see some high-profile cases where some company relies too heavily on algorithms and it'll come back to haunt them, perhaps only in PR, but it will be a big blowback, and then maybe we'll start to see people form best practices around the whole thing.

Speaker 1: I still think it feels a bit like shirking responsibility, in my opinion. If the top brass decides that layoffs are necessary, then they are obligated to make each and every layoff decision transparent and honest. I think they owe their employees as much. And it's really infuriating, because you'll see managers who get a directive saying you have to apply this artificial bell curve to the employees who are reporting to you.
Speaker 1: We heard a story about that just a couple of weeks ago, where a director essentially was fired for refusing to follow that, because it arbitrarily requires managers to assign people as low performers even if you don't have any low performers on your team, and that just, again, seems inherently unfair. I feel like relying on AI to make these choices also is inherently unfair and can miss some really important factors that may not reduce down to pure data.

Speaker 1: But we've got a lot of other AI news to get through today. A lot of it is bad, I'm not gonna lie. And our next story comes from Vanderbilt University. The Peabody College at Vanderbilt, and that college's Office of Equity, Diversity and Inclusion, sent out a message to students in the wake of the terrible shooting at Michigan State University. And clearly this was a delicate task that needed empathy and support. It needed a message that showed that Vanderbilt's staff have students and their welfare at the top of their priority list. So of course they used ChatGPT to help craft the message. This pretty much sent me spiraling, y'all, because passing the buck to AI to handle things that are this important, things that intrinsically involve a very human connection, it just feels beyond shortsighted and crass to me. At best, you could say this was a poor decision, but at worst it implies that leadership has little to no regard for students and instead will just lean on the robots to handle the tough stuff. Anyway, the end of the message contained the line "paraphrase from OpenAI's ChatGPT AI language model, personal communication," and at least the word paraphrase indicates that there was human involvement in taking the generated message and shaping it properly for students. So it was a collaborative effort, you could say. But still, the fact that staff tapped AI in the first place to help with such a sensitive matter doesn't look good.
Speaker 1: It looks like people who want to avoid the hard stuff, and the hard stuff here is the human connection stuff, the stuff that has incredible impact on emotion and mental health, whether it's layoffs or counseling people in the wake of a violent act like the shooting at Michigan State University. That is the wrong way to use AI, in my opinion. That is inherently the realm of humanity, and to outsource that to AI, it shows such a huge disregard for the people who are ultimately the recipients of those messages that I think it's unconscionable. Now, the associate dean and assistant dean who were part of this process have both stepped back from that Office of Equity, Diversity and Inclusion, which is probably for the best. But yeah, this was a really bad use case for AI.

Speaker 1: Last week, representatives from around the world attended the Summit on Responsible Artificial Intelligence in the Military Domain, or REAIM, and we've talked about how incorporating AI into military processes and hardware raises really difficult questions regarding safety, accountability, escalation, and more. Reps from many countries attended, including the United States and China, but excluding Russia, which wasn't invited. Ukraine did not attend; they were invited, but clearly have other things going on at the moment. Anyway, these representatives all met to discuss the issues of AI and its role in military operations. At the conclusion of the summit, all but one of the representatives of the countries that attended signed an agreement to commit to developing AI military applications that, quote, "do not undermine international security, stability and accountability," end quote. So what was the one nation that abstained? That would be Israel. Now, don't, like, heap tons of criticism on Israel, because there are critics who say this entire meeting was largely for show, because according to critics, there was nothing in the summit or the agreement that is legally binding for any of the countries involved.
Speaker 1: So, in other words, the critics are saying that the reps are all like, yeah, yeah, totally, AI killing people would really be bad, let's totally not do that, but they would have no real accountability to follow through on that promise. Further, the agreement did not include certain AI-assisted or AI-controlled systems that are already in use, like AI-controlled drones. And so there's concern that this agreement represents little more than just putting on a show to say, yes, we're all aware of this and it is a bad thing. Now, honestly, I heavily suspect that several countries, including the United States and also China, will continue incorporating AI into military applications, including weaponized AI. I would be absolutely shocked if they didn't continue down that pathway, because there's a very real fear that if you don't do it, the other guy will, and then there will be an AI gap. Now, maybe I only think that because I'm a child of the seventies and eighties and I saw how this very similar scenario played out with nuclear armaments, because boy howdy, was that a thing. So I would love for us to avoid the mistakes of the past, but I really am skeptical that that's going to happen, because again, unless everybody is held accountable and agrees to not further that work, someone is going to. And if someone's going to, then everyone is going to, because otherwise you are at a disadvantage.

Speaker 1: Okay, with all that doom and gloom out of the way, let's take a quick break. When we come back, I've got some more news items to talk about.

Speaker 1: We're back now. I still have a couple more AI stories, but these are not quite as apocalyptic as the ones we started off with. The first one is that Clarkesworld Magazine, which has been publishing fantasy and science fiction stories online since two thousand and six, has temporarily stopped accepting submissions.
Speaker 1: Why? Because apparently the magazine has received too many AI-generated submissions, and until they have access to better tools to detect those kinds of things, they have chosen to hold off on accepting any more. That's both understandable and it stinks. Not that I think the magazine made the wrong call here, I think this is the right call, but rather it stinks because there are genuine authors and would-be authors out there who have great stories to tell, and they're seeing an outlet closed off to them, at least for now, and it's all thanks to AI-generated stories. Now, I think in most cases the AI-generated stuff has to be part of a collaborator kind of relationship, because in my own experience, the stories generated by AI aren't very good. Like, grammatically they work, and you know, you get some interesting descriptions and stuff, but the actual stories tend to be pretty mundane and uninteresting. I imagine that most folks who are using AI are leaning on it for stuff like generating initial ideas, maybe shaping a certain part of the narrative, at least for any stories where it's difficult to determine, oh, this was made by AI. Right? If they haven't done any massaging, it often is pretty easy to detect that it's AI, or at the very least, it's easy to detect that it's not a very good story and it wouldn't pass the bar for publication. But still, here's another example of how AI can end up harming creative types, whether it's from the unauthorized copying of their style or displacing them from the creator community.

Speaker 1: Insider reports something that I think most folks already have a pretty good handle on, and that is that the emergence of and reception to ChatGPT probably means we're going to see a whole bunch of copycats in the very near future. And to be clear, chatbots have been a thing for years. I'm sure you're all aware of that.
Speaker 1: In fact, someone who was once in the business of reporting on tech, a person whom I know and respect and like very, very much, ended up working at a company that developed sophisticated chatbots. But these were tools that were intended for narrow use cases, something that would work well within the confines of a particular company's services and processes. The stuff we're seeing now is made to be more general purpose, and with that comes the problems of reliability and accuracy, as well as transparency. It is easier, not easy, mind you, but easier, to build a reliable and accurate tool that works within an enclosed system, like the customer service arm of a consumer-facing company. But it's another thing when it's just, you know, a free-range AI chatbot. And meanwhile, the guy who runs the company that made ChatGPT has said repeatedly that he thinks ChatGPT isn't that good, or at least it is far from perfect. And yet we're currently living through a buzzy, hype-heightened age of ChatGPT and its peers, like Bard from Google.

Speaker 1: TechCrunch has a piece titled "The AI photo app trend has already fizzled, new data shows," and you should totally check out this article. The author of the piece, Sarah Perez, lays out some of the data, including download numbers and revenue, and she shows that while the text-to-image AI tools initially made a really big splash when they started to emerge, particularly late last year, excitement has dropped off considerably since then. There's been a lot of backlash in the space, ranging from artists who are understandably upset to see their style co-opted by AI, to users who are concerned that the tools can create inappropriate images far too readily, and that any restrictions that are designed to limit that sort of stuff aren't always the best.
Speaker 1: Whether those actually played a big part in cooling this trend, or maybe it was just that folks were getting tired of the shiny new thing and they had already moved on, I am uncertain. But my guess is that we're going to see the space continue to evolve, perhaps with fewer players as this goes on, if some of them find it too difficult to cover costs with the declining revenues. But I don't think AI-generated imagery is going to just go away at this point.

Speaker 1: Now, that being said, one fun story, or at least in my opinion it's fun, that relates to AI-generated imagery involves a robot from Carnegie Mellon University. So it's a robot arm that has the name FRIDA, which, yes, is both a tribute to Frida the artist and an acronym that stands for Framework and Robotics Initiative for Developing Arts, and it also generates images based on text prompts. Only in this case, the images it makes are not digital images. They're not computer-generated images. They are real-world paintings. You have a robot that paints with actual brushes and actual paint. It creates works of art based off text input and directions. According to TechSpot, it takes about an hour from the point where the robot receives input in the form of the text to the point where it begins to paint, because it actually has to plot out how it's going to physically paint this. How are the brushstrokes going to go, how long are they going to go, how much pressure is going to be used, what style is it going to follow? And that's very understandable, because as we all know, there are different strokes for different folks, because the world don't move to the beat of just one drum. Shout out to me if you get that reference. Anyway, the roboticists and engineers are quick to say that FRIDA isn't an artist. FRIDA is a collaborator. FRIDA is not creative.
Speaker 1: FRIDA just follows instructions as best it can to paint the subject of the art in the style that was dictated by its collaborator. Anyway, I just thought this was a neat take on AI-generated images. Somehow it feels different because it's, you know, a physical painting. It's something that you could hold, or put into a frame and hang on a wall, or, you know, something you could rip apart in rage as robots get another art commission and you don't.

Speaker 1: Now, finally, we're off of the AI stories and we can get to everything else. So next up: part of the big news this week is that Meta really shook things up on Sunday, announcing that the company is introducing a subscription service called Meta Verified. It's just in the testing phase now, and, you know, the plan is to widely deploy it, but we'll see if things go poorly in the test markets that Meta is trying out at the moment. But essentially, this subscription service is a verification tool. Users would have to submit proof that they are who they claim to be, using government-issued ID for example, and in return for twelve bucks a month, or fifteen bucks a month if you're doing it on iOS or Android, because Google and Apple take their own cut of the fee, you then get a little blue badge on Facebook and/or Instagram saying you're bona fide. On top of that badge, subscribers will also have access to services that are meant to protect against imposter accounts. They're supposed to get better customer support, and they're supposed to get improved discoverability when folks are actually searching for them, which, in my opinion, is stuff that should really be standard for all users, whether they're paying a subscription or not. I think it's kind of bullpucky to say one of the benefits to verification is that Meta will make sure other folks aren't trying to impersonate you. I mean, arguably, this is a bigger problem for notable folks like celebrities and brands. And I'm not talking me, I'm talking real celebrities.
Speaker 1: I have no illusions that I'm a celebrity, but I've still seen plenty of instances of friends being impersonated, as someone has either gained access to their account or created a copy account in an attempt to phish for data. Like, this is still something that affects the average person on these platforms. It's not just for the celebrities. But yeah, Meta's facing issues with revenue for a lot of different reasons, so it's not surprising that the company is now introducing the subscription feature. It just feels like the quote unquote benefits of the service are things that really everyone on the platform should have access to by default. Maybe I'm just being unreasonable here.

Speaker 1: Today, Microsoft will attempt to defend its planned acquisition of Activision Blizzard in the EU, in a meeting that's behind closed doors in Brussels. Previously, EU regulators indicated that they would block the purchase, saying it would result in less competition in the video game space and allow Microsoft to engage in actively anti-competitive practices, such as preventing other platforms, like Sony PlayStation, from having access to popular video game franchises like Call of Duty. Earlier, Microsoft reps signed a deal with Nintendo reps that legally binds Microsoft to bring Call of Duty titles to Nintendo platforms for ten years, and further, that all titles will be available on Nintendo platforms the same day that they come out for Xbox platforms, with, quote, "full feature and content parity," end quote, between these versions, meaning Nintendo won't have to be happy with a watered-down version of Call of Duty. It's going to get the real thing, just like Xboxes do.
Speaker 1: This puts pressure on Sony to make a similar agreement, or else Microsoft could argue before the EU regulators that Microsoft has made attempts to ensure fairness between the various console companies, but Sony isn't playing ball on purpose in an effort to scuttle the deal. Surprisingly, at least to me, the Communications Workers of America, the CWA, a union organization here in the US, has also urged the EU to approve the acquisition deal. They say that Microsoft has been more receptive to attempts at unionizing than Activision Blizzard has, and that without Microsoft's oversight, employees at Activision could find themselves facing tough managerial resistance to unionizing. By the time you hear this, a decision has probably been made one way or the other, but as I write this episode, it has yet to be announced, and again, the meeting is behind closed doors, so it might be a little while before we find out what the results are.

Speaker 1: Corporate employees at Amazon are looking at decreased compensation this year, like an actual pay cut. The reason for that is because some of their compensation is tied up in stock units, so as part of their salary, Amazon corporate workers get stock in Amazon. However, Amazon's stock price has taken some massive hits over the last year, and that means that the stock awarded to corporate employees is worth much less than it was a year earlier. That's particularly tough, because when Amazon structures its salary deals, they are at least partly based on the idea of the stock having a value of around one hundred and seventy dollars per share. So, in other words, that's part of the justification of, yes, your salary is X amount of dollars instead of Y, because you're also being compensated by stock units that are considered to be worth one hundred and seventy dollars per share.
Speaker 1: However, at the time of recording, Amazon stock is currently at ninety-four dollars and fifty-eight cents per share, so a little more than half of what it was when these salary figures were first calculated. So if the cash part of your salary is dependent upon the fact that the rest of your compensation is coming in the form of stock that was calculated at one hundred and seventy bucks per share, it means you're getting significantly less per year.
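Speaker 1: If you want to see that math spelled out, here's a rough back-of-the-envelope sketch using just the two share prices I quoted. The cash and stock dollar amounts in it are made up purely for illustration; this is not Amazon's actual compensation formula.

```python
# Back-of-the-envelope check of the Amazon RSU math described above.
# Only the two share prices come from the episode; the salary figures
# below are hypothetical, purely to illustrate the effect.

assumed_share_price = 170.00   # price reportedly assumed when pay packages were set
current_share_price = 94.58    # share price quoted at the time of recording

ratio = current_share_price / assumed_share_price
print(f"RSUs are worth about {ratio:.0%} of the assumed value")  # roughly 56%

cash_portion = 160_000         # hypothetical cash salary, for illustration only
planned_rsu_value = 100_000    # hypothetical RSU value assumed at $170/share

actual_rsu_value = planned_rsu_value * ratio
print(f"Planned total comp:            ${cash_portion + planned_rsu_value:,.0f}")
print(f"Comp at today's share price:   ${cash_portion + actual_rsu_value:,.0f}")
```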
Speaker 1: On top of that, the company has been laying off thousands of employees. I wouldn't be surprised if there were some managers over at Amazon who were giving wistful glances toward ChatGPT when it comes time to communicate these issues to their team members.

Speaker 1: Okay, I've got a couple more stories to talk about, including one that's going to get me all het up again. But before we get to that, I'm going to take a quick break, and so are you. But we'll be right back.

Speaker 1: Okay, here's where Jonathan gets upset for multiple reasons. All right, so our next story is that TorrentFreak reports that filmmakers are demanding to know the identities of certain Reddit members who have been active in subreddits and talked about content piracy, like the illegal downloading and distribution of films and such. Hey, y'all, here we go again. Like, I've been through this a few times, because I remember the good old Napster days. All right, so the filmmakers want to hold pirates accountable, and that is understandable, right? You know, they don't want their films to be pirated, and that makes sense. Like, this is not just art, it's commerce, and to see people get access to something without legitimately paying for it, that is a problem. However, the arguments that filmmakers and studios make are at best facetious. Now, by that I mean you'll hear filmmakers and studios cite huge figures for damages, like millions and millions of dollars in damages that these companies and these filmmakers experience due to piracy.

Speaker 1: But the truth of the matter is, you cannot say that with any kind of certainty. Those damages, on the face of it, assume that the people who pirated the content would have otherwise purchased a ticket, or subscribed to a service, or whatever, and so piracy, based on this argument, amounts to lost revenue. Thus the damages, right? Like, we would have sold X number of tickets, except that this number of people pirated it, and therefore we're out X number of dollars. Except you don't know that. You do not know if the person who pirated something would have otherwise sought a legitimate way to view the material. You don't know that you actually lost out on money. Maybe that person would just have gone without seeing it at all. So, I mean, you can't accuse people of not going to see a movie, right? Like, I haven't gone to see Ant-Man and the Quantum Maniacs, or whatever it is, but Marvel can't come to me and say, hey, you failed to see the movie at the theater, so we're going to fine you. That doesn't make any sense. So you can't argue that the pirates would have otherwise gone and paid legitimate money to go and see stuff, and therefore the companies are out of money, because maybe they wouldn't. Maybe they just wouldn't see it at all. So pirating a film or a series is not the same thing as someone stealing a physical something, like a TV from a big box store. Right? That is a physical item. There is only one of that specific television in the world, and once it's gone from an inventory, it is gone. It is not magically replaced by a digital duplicate. That is something where you can look at it and say, yes, this amounts to real losses. That's a lost sale, if not to the person who stole it, then to the person who would have ultimately bought it. That you could say, and you could point to that and say these are real damages. You cannot do that with digital media.
Speaker 1: The Government Accountability Office of the United States agrees with me: you cannot do that. Does it amount to damages? Is there a loss of revenue? Undoubtedly, yes, there is definitely a loss of revenue, but there's no way to determine the extent of it. And because filmmakers and studios depend upon these inflated numbers, which represent the quote unquote damages that they incurred as a result of piracy, to use as a kind of bludgeon against pirates, to cow people into avoiding piracy, it unfairly targets people who may or may not have actually caused any damages at all. You just don't know. That's the thing: because you don't know, you cannot make firm claims of damages. And yet time and again we see filmmakers and studios do this.

Speaker 1: Now, that is part of it. I should also mention that Reddit is resisting these demands to hand over user data. And just in case you were curious, if you were to do something like, I don't know, use a VPN and create a unique email address, and only use your VPN when you're accessing something like Reddit, and you register for Reddit using the unique email address that isn't tied to anything else of yours, that could be a way to avoid imperial entanglements. That being said, now, I say that because I don't like seeing companies go super hard against people. But I also firmly believe piracy is wrong. Okay? I do not condone piracy at all. I pay for the content I consume or I go without. I even bought a cheap region-free DVD player so that I can import DVDs from the UK for series that just never get released over here in the US. Mitchell and Webb Look, I'm looking at you. But I still end up buying the actual stuff. I don't just try and pirate it. I condemn piracy, but I also condemn an industry throwing its power around, making assertions that it simply cannot support with evidence, at the cost of people who may not have ever gone to see your quantum maniac ant movie.
Speaker 1: Okay, I'm done with that. Finally: hey, do you remember when you were a kid and you got that mint-in-the-box original iPhone, the two thousand and seven iPhone? Can you remember how excited you were, but part of you thought, you know, maybe I shouldn't open this, because this is a collector's item? No? Well, of course not, because you're a sensible person. But it turns out, if you had been less sensible and chose not to use the thing you got for the purpose that it was intended, you could have made some crazy money. Why? Because at an auction this past Sunday, an unopened original two thousand and seven iPhone sold for more than sixty-three thousand dollars. When that phone came out, it cost five hundred and ninety-nine bucks. Now, if we adjust that for inflation today, that would be the same amount as around eight hundred and sixty of today's dollars, so eight hundred sixty bucks. But it sold for sixty-three thousand dollars at auction. Actually, it sold for sixty-three thousand, three hundred fifty-six dollars and forty cents at auction, which is really specific. I don't typically see it go into the cents like that. Maybe it was a winning bid that came from overseas and it was a currency conversion thing. But if we adjust the iPhone's price for inflation, then the value of the phone increased by nearly seventy-four times. If we don't adjust for inflation, the value increased by more than one hundred times.
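Speaker 1: And just to show my work on those multiples, here's a quick sketch using only the figures I just quoted. The inflation-adjusted price is my rough eight-hundred-and-sixty-dollar estimate, not an official CPI calculation, so treat the results as ballpark numbers.

```python
# Quick check of the original-iPhone multiples described above. The
# inflation-adjusted price is the rough ~$860 estimate from the episode,
# not an official CPI figure.

launch_price_2007 = 599.00            # original 2007 iPhone price
launch_price_today_dollars = 860.00   # rough inflation-adjusted estimate quoted above
auction_price = 63_356.40             # winning bid at the February 2023 auction

nominal_multiple = auction_price / launch_price_2007
adjusted_multiple = auction_price / launch_price_today_dollars

print(f"Not adjusting for inflation: about {nominal_multiple:.0f}x the launch price")   # ~106x
print(f"Adjusting for inflation:     about {adjusted_multiple:.0f}x the launch price")  # ~74x
```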
Speaker 1: So why the heck didn't the person who owned this two thousand and seven iPhone ever open the box? Well, actually, there is a sensible answer to this. You see, back when the iPhone first came out in the US, and you may have forgotten about this, Apple made an exclusive deal with AT&T. It became the exclusive carrier for the iPhone in the US. And the woman who had received this particular iPhone as a gift way back in two thousand and seven, a woman named Karen Green, well, she had a service contract with Verizon, and it would have cost her a lot of money to cancel out of that contract and then start up with AT&T. That's a whole hassle. I don't know how many people have had to go through that process, but it can sometimes be really frustrating. So the iPhone that she received wouldn't work on Verizon, her carrier, so Green just never opened the darn thing. Instead, she kept it, and kept it in good condition, and I'm sure she's very glad she did. Now, I guess if there is a moral to the story, it's: if you do not want that nice tech gift that someone gave to you, keep it unopened. You never know when it'll be worth sixty grand. Just, you know, don't hold your breath about it.

Speaker 1: All right, that's it for the tech news for Tuesday, February twenty-first, twenty twenty-three. I hope you enjoyed this episode and my various rants. And if you have suggestions for topics I should tackle in future episodes of TechStuff (I've got one coming up that's going to be dealing with our old buddies, the activist group Anonymous; that's coming up soon), just let me know. You can get in touch with me via Twitter. The handle for the show is TechStuffHSW. You can download the iHeartRadio app, which is free to download, free to use. You navigate over to TechStuff using the search field, and that will bring you to the TechStuff page, where you'll see a little microphone icon. You can click on that and leave a voice message for me, or you can be like Nathan and find my email address hiding out there on the web and just send me an email, because that's how we're going to talk about Anonymous. All right, that's it for me. I'll talk to you again really soon.
Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.