1 00:00:04,440 --> 00:00:12,480 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, 2 00:00:12,560 --> 00:00:15,920 Speaker 1: and welcome to TechStuff. I'm your host, Jonathan Strickland. 3 00:00:16,000 --> 00:00:19,640 Speaker 1: I'm an executive producer with iHeartRadio. And how the tech 4 00:00:19,680 --> 00:00:22,959 Speaker 1: are you? It's time for the tech news for Tuesday, 5 00:00:23,040 --> 00:00:28,680 Speaker 1: April eighteenth, twenty twenty three. Yesterday SpaceX had to scrub 6 00:00:28,840 --> 00:00:34,000 Speaker 1: its planned orbital test of the Starship spacecraft. Starship 7 00:00:34,159 --> 00:00:37,440 Speaker 1: is a two-stage launch vehicle. The second stage also 8 00:00:37,520 --> 00:00:41,400 Speaker 1: serves as a spacecraft that's capable of carrying a crew 9 00:00:41,720 --> 00:00:45,280 Speaker 1: or, you know, cargo. It is the most powerful launch 10 00:00:45,400 --> 00:00:49,000 Speaker 1: vehicle ever built to date. It is capable of producing 11 00:00:49,080 --> 00:00:54,120 Speaker 1: nearly twice as much thrust as NASA's Space Launch System, 12 00:00:54,320 --> 00:00:58,240 Speaker 1: which is NASA's latest launch vehicle. But Starship has not 13 00:00:58,440 --> 00:01:02,960 Speaker 1: yet left the ground. SpaceX has test-fired Starship engines 14 00:01:02,960 --> 00:01:06,440 Speaker 1: in the past, igniting thirty one of the thirty three 15 00:01:06,520 --> 00:01:09,960 Speaker 1: engines in a test, but those tests had the launch 16 00:01:10,040 --> 00:01:12,640 Speaker 1: vehicle bolted to the platform, so, you know, it didn't 17 00:01:12,680 --> 00:01:16,280 Speaker 1: go anywhere. This was really just a test of the engines. Yesterday, 18 00:01:16,360 --> 00:01:20,319 Speaker 1: the plan was to launch Starship into orbit. It would 19 00:01:20,319 --> 00:01:24,560 Speaker 1: be an uncrewed mission, so there's no one aboard, and 20 00:01:24,600 --> 00:01:26,480 Speaker 1: it was meant to go into orbit and take a 21 00:01:26,520 --> 00:01:30,920 Speaker 1: full orbit of the Earth, but that didn't happen. And ultimately, 22 00:01:31,240 --> 00:01:34,080 Speaker 1: this is also important because it's different from how it 23 00:01:34,120 --> 00:01:37,600 Speaker 1: will normally be operated: Starship was meant to have 24 00:01:37,640 --> 00:01:41,160 Speaker 1: both stages crash into the ocean. Now, in normal operation, 25 00:01:42,160 --> 00:01:47,440 Speaker 1: these components will return to Earth with a controlled landing, 26 00:01:48,080 --> 00:01:52,320 Speaker 1: so that you can reuse the same vehicle repeatedly and 27 00:01:52,320 --> 00:01:55,320 Speaker 1: thus bring down the cost of launching things into space, 28 00:01:55,840 --> 00:01:59,400 Speaker 1: just like they've been doing with the Falcon nine. But 29 00:01:59,520 --> 00:02:03,440 Speaker 1: this is a much larger launch vehicle, and for the 30 00:02:03,480 --> 00:02:05,840 Speaker 1: first test, the plan was just to let it crash 31 00:02:05,880 --> 00:02:09,040 Speaker 1: into the ocean. Unfortunately, none of that happened because a 32 00:02:09,120 --> 00:02:13,040 Speaker 1: technical error in the form of a frozen pressurization valve 33 00:02:13,600 --> 00:02:17,520 Speaker 1: meant that they had to scrub the launch and plan 34 00:02:17,680 --> 00:02:21,320 Speaker 1: for another attempt later this week. 
SpaceX has posted on Twitter that 35 00:02:21,440 --> 00:02:25,480 Speaker 1: it is now aiming to try again on Thursday, April twentieth, 36 00:02:25,919 --> 00:02:29,680 Speaker 1: when the company hopes the engines will just blaze and 37 00:02:29,760 --> 00:02:33,760 Speaker 1: shake free the shackles of gravity. Now onto Elon Musk, 38 00:02:34,120 --> 00:02:36,959 Speaker 1: who appeared on Fox News in an interview with Tucker 39 00:02:37,040 --> 00:02:41,320 Speaker 1: Carlson yesterday. Part two of that interview will air today, 40 00:02:42,120 --> 00:02:45,639 Speaker 1: and they covered a lot of ground. Musk waved off 41 00:02:45,720 --> 00:02:48,680 Speaker 1: Twitter's troubles and said that they were mostly due to 42 00:02:48,760 --> 00:02:52,720 Speaker 1: just bad timing. I think a lot of folks would 43 00:02:52,720 --> 00:02:56,560 Speaker 1: disagree that bad timing is the only or even primary 44 00:02:56,720 --> 00:03:01,400 Speaker 1: cause for Twitter's woes, which are pretty widely distributed across 45 00:03:01,400 --> 00:03:04,839 Speaker 1: things from having laid off eighty percent of the workforce 46 00:03:05,000 --> 00:03:09,720 Speaker 1: to seeing around fifty percent of ad revenue drop out, 47 00:03:10,960 --> 00:03:15,480 Speaker 1: to seeing high-profile accounts leave the service. Like, we've seen 48 00:03:15,480 --> 00:03:18,120 Speaker 1: a lot of things happen at Twitter that I don't 49 00:03:18,200 --> 00:03:21,640 Speaker 1: think relate just to bad timing. He also made an 50 00:03:21,760 --> 00:03:26,520 Speaker 1: unsubstantiated claim that the US government has had backdoor access 51 00:03:26,560 --> 00:03:30,800 Speaker 1: to Twitter, that essentially government agencies, though I don't believe 52 00:03:30,800 --> 00:03:34,440 Speaker 1: he named any in particular, could even look at people's 53 00:03:34,520 --> 00:03:38,440 Speaker 1: private direct messages, that that was included with the access 54 00:03:38,480 --> 00:03:42,120 Speaker 1: the government had to Twitter's back end. Now, he didn't 55 00:03:42,160 --> 00:03:46,720 Speaker 1: produce any evidence for this claim, and if it is true, 56 00:03:47,000 --> 00:03:49,360 Speaker 1: it's rather shocking that we didn't see any hint of 57 00:03:49,400 --> 00:03:53,320 Speaker 1: that during the Trump administration, or that Trump himself would 58 00:03:53,320 --> 00:03:57,320 Speaker 1: have been banned from the platform. That seems odd if the 59 00:03:57,360 --> 00:04:02,400 Speaker 1: government had that level of access to and manipulation of Twitter, 60 00:04:02,440 --> 00:04:07,440 Speaker 1: because it's not like Joe Biden has been president forever. 61 00:04:08,000 --> 00:04:10,640 Speaker 1: He became president in twenty twenty one, and it was 62 00:04:10,720 --> 00:04:13,480 Speaker 1: less than a year later that Musk started to buy 63 00:04:13,560 --> 00:04:17,320 Speaker 1: up Twitter stock with the intent of ultimately purchasing the platform. 
64 00:04:18,279 --> 00:04:21,760 Speaker 1: And I infer from Musk's comments that he believes tools 65 00:04:21,800 --> 00:04:25,719 Speaker 1: like ChatGPT and Google Bard and Microsoft Bing represent 66 00:04:25,880 --> 00:04:31,520 Speaker 1: dangerous implementations of AI, perhaps even representing an existential crisis, 67 00:04:32,040 --> 00:04:36,760 Speaker 1: and that his AI tool would somehow be different from 68 00:04:36,800 --> 00:04:41,320 Speaker 1: these and peaceful and beneficial to humanity by just recognizing 69 00:04:41,320 --> 00:04:45,440 Speaker 1: that humans are pretty nifty. Musk himself recently showed support 70 00:04:45,480 --> 00:04:48,480 Speaker 1: for a proposal to put a halt on AI development 71 00:04:48,520 --> 00:04:52,560 Speaker 1: for six months, and some people have said that might 72 00:04:52,640 --> 00:04:55,200 Speaker 1: not have been so much about trying to make things 73 00:04:55,320 --> 00:04:59,880 Speaker 1: safe for AI development, but rather it was intended to 74 00:05:00,080 --> 00:05:02,520 Speaker 1: give Musk's own efforts a chance to catch up to 75 00:05:02,560 --> 00:05:07,040 Speaker 1: everyone else, because everyone else was way ahead of the game. Honestly, 76 00:05:07,600 --> 00:05:11,239 Speaker 1: I found most of what Musk said to be speculative 77 00:05:11,920 --> 00:05:15,479 Speaker 1: and difficult to believe. Now, I will say this: I 78 00:05:15,600 --> 00:05:20,360 Speaker 1: do think ChatGPT and other AI tools are potentially 79 00:05:20,720 --> 00:05:24,360 Speaker 1: dangerous insomuch as they can be used to do stuff 80 00:05:24,400 --> 00:05:29,480 Speaker 1: like spread misinformation or help craft malware or to perform 81 00:05:29,600 --> 00:05:33,839 Speaker 1: other malicious acts. But companies like OpenAI are at 82 00:05:33,920 --> 00:05:36,440 Speaker 1: least trying to put protections in place to prevent that 83 00:05:36,480 --> 00:05:40,599 Speaker 1: from happening. So far, those protections haven't been very robust 84 00:05:40,680 --> 00:05:43,359 Speaker 1: and people have found ways around them, but they're still 85 00:05:43,480 --> 00:05:47,120 Speaker 1: trying to prevent that. I do not see chatbots as 86 00:05:47,120 --> 00:05:52,279 Speaker 1: being an existential threat. There's nothing inherent in ChatGPT 87 00:05:52,440 --> 00:05:57,520 Speaker 1: that gives it incredible power. It seems really compelling and 88 00:05:57,640 --> 00:06:01,320 Speaker 1: powerful and somewhat scary because it appears to communicate the 89 00:06:01,360 --> 00:06:04,800 Speaker 1: way we do. But that is the extent of what 90 00:06:04,880 --> 00:06:09,839 Speaker 1: ChatGPT does. Like, it's not able to take action, really, 91 00:06:09,960 --> 00:06:16,240 Speaker 1: and it's ultimately generating responses on a probabilistic basis. In 92 00:06:16,279 --> 00:06:20,000 Speaker 1: other words, it's predicting what the next word should be 93 00:06:20,320 --> 00:06:24,039 Speaker 1: and then putting it there. It's not thinking. So I 94 00:06:24,080 --> 00:06:28,920 Speaker 1: think Musk is conflating generative AI stuff like ChatGPT 95 00:06:29,120 --> 00:06:33,360 Speaker 1: and Google Bard with AI as a whole. And you know, 96 00:06:33,520 --> 00:06:35,880 Speaker 1: this is one of those things where you're like, you say, 97 00:06:35,960 --> 00:06:39,560 Speaker 1: all ducks are birds, but not all birds are ducks, right? 
98 00:06:39,680 --> 00:06:43,560 Speaker 1: All generative AI is a type of artificial intelligence, but 99 00:06:43,720 --> 00:06:46,719 Speaker 1: not all of artificial intelligence is generative AI, and I 100 00:06:46,760 --> 00:06:50,240 Speaker 1: think some of the more dangerous implementations of AI have 101 00:06:50,520 --> 00:06:55,240 Speaker 1: nothing to do with large language models and chatbots. Anyway, 102 00:06:55,760 --> 00:06:57,880 Speaker 1: we've got a lot more to talk about with AI, 103 00:06:58,000 --> 00:07:03,360 Speaker 1: so let's just move on. Sundar Pichai, Alphabet's CEO, appeared 104 00:07:03,400 --> 00:07:06,559 Speaker 1: on Sixty Minutes this week to essentially say that AI 105 00:07:06,880 --> 00:07:10,800 Speaker 1: is going to be disruptive, that it will impact lots 106 00:07:10,800 --> 00:07:14,920 Speaker 1: of jobs, including knowledge-based jobs, so jobs like my job, 107 00:07:15,520 --> 00:07:17,880 Speaker 1: and that it's not up to the tech industry to figure 108 00:07:17,880 --> 00:07:20,120 Speaker 1: out how to do that responsibly. All right, so 109 00:07:20,680 --> 00:07:23,760 Speaker 1: that last bit was probably a little bit of interpretation 110 00:07:23,920 --> 00:07:26,440 Speaker 1: on my own part. What he actually said was that 111 00:07:26,480 --> 00:07:30,000 Speaker 1: it's up to society to figure out regulations and laws, 112 00:07:30,720 --> 00:07:35,000 Speaker 1: to create the borders within which AI can operate, and 113 00:07:35,040 --> 00:07:38,640 Speaker 1: to make sure that the rules quote align with human values, 114 00:07:38,880 --> 00:07:43,680 Speaker 1: including morality end quote, and that quote it's not for 115 00:07:43,840 --> 00:07:48,000 Speaker 1: a company to decide end quote. This is really interesting 116 00:07:48,040 --> 00:07:50,640 Speaker 1: coming from a leader of a company that used to 117 00:07:50,800 --> 00:07:55,360 Speaker 1: have the motto don't be evil. Of course, Google shed 118 00:07:55,520 --> 00:08:00,600 Speaker 1: that motto years ago, so you could say that doesn't 119 00:08:00,640 --> 00:08:04,160 Speaker 1: really apply anyway. One would think that creating AI that 120 00:08:04,320 --> 00:08:07,680 Speaker 1: does not cause harm would in the long run be 121 00:08:07,720 --> 00:08:10,680 Speaker 1: in the best interest of a company in that business. 122 00:08:11,680 --> 00:08:15,040 Speaker 1: But I guess that's just crazy talk. Anyway, what Pichai 123 00:08:15,200 --> 00:08:19,520 Speaker 1: is saying goes beyond chatbots and into broader implementations 124 00:08:19,560 --> 00:08:22,040 Speaker 1: of AI. And while I disagree with him about the 125 00:08:22,160 --> 00:08:26,200 Speaker 1: role companies should play vis-à-vis making sure AI doesn't 126 00:08:26,240 --> 00:08:30,280 Speaker 1: cause harm, I do agree that AI is going to 127 00:08:30,320 --> 00:08:35,720 Speaker 1: have an increasingly undeniable and disruptive impact on countless jobs 128 00:08:35,760 --> 00:08:39,320 Speaker 1: and tasks. Now, this does not necessarily mean that the 129 00:08:39,360 --> 00:08:43,400 Speaker 1: impact will always be bad, or that it will definitely 130 00:08:43,520 --> 00:08:48,360 Speaker 1: eliminate jobs, although that is certainly a possibility. My hope 131 00:08:48,760 --> 00:08:52,040 Speaker 1: is that we're gonna push AI to augment rather than 132 00:08:52,320 --> 00:08:58,000 Speaker 1: replace human employees. 
Otherwise, well, let's just take this to 133 00:08:58,480 --> 00:09:04,120 Speaker 1: an absurd conclusion, right? Let's imagine we're in a world 134 00:09:04,320 --> 00:09:07,640 Speaker 1: where AI is doing all the work. Humans have been 135 00:09:07,880 --> 00:09:12,240 Speaker 1: replaced by AI. So humans are out of the equation 136 00:09:13,080 --> 00:09:18,240 Speaker 1: because they are no longer needed to get work done. 137 00:09:18,320 --> 00:09:22,320 Speaker 1: So what is the AI doing work for? For what purpose? 138 00:09:23,120 --> 00:09:26,200 Speaker 1: For whom is the AI doing work? If there are 139 00:09:26,200 --> 00:09:29,600 Speaker 1: no more humans working, what is the AI doing? You 140 00:09:29,640 --> 00:09:33,040 Speaker 1: don't have consumers anymore because you don't have income, right? 141 00:09:33,160 --> 00:09:35,680 Speaker 1: People aren't doing jobs, so they're not making money, so 142 00:09:35,720 --> 00:09:38,800 Speaker 1: there's no real economy, which means no one can buy 143 00:09:38,840 --> 00:09:42,640 Speaker 1: anything because no one has income. The companies would cease 144 00:09:42,640 --> 00:09:44,839 Speaker 1: to exist because there's no way for them to even 145 00:09:44,880 --> 00:09:48,320 Speaker 1: make money. At this point, money is meaningless. There's no money, 146 00:09:49,000 --> 00:09:51,320 Speaker 1: no one has a job. So it seems to me 147 00:09:51,559 --> 00:09:56,400 Speaker 1: like that absurd conclusion would quickly fall in on itself 148 00:09:56,440 --> 00:09:59,160 Speaker 1: without the implementation of something like, I don't know, universal 149 00:09:59,240 --> 00:10:03,079 Speaker 1: basic income. With that in place, you could reach that Star Trek future, 150 00:10:03,200 --> 00:10:06,840 Speaker 1: right, where nobody has to work, everybody makes the amount 151 00:10:06,840 --> 00:10:09,360 Speaker 1: of money they need to be able to do all 152 00:10:09,400 --> 00:10:11,920 Speaker 1: the basic things that we need to do, and then 153 00:10:11,960 --> 00:10:14,000 Speaker 1: we spend the rest of our time pursuing whatever it 154 00:10:14,080 --> 00:10:17,680 Speaker 1: is we want. But we don't have universal basic income. 155 00:10:17,760 --> 00:10:21,079 Speaker 1: That's a piece that's missing. And meanwhile, if everyone's pushing 156 00:10:21,120 --> 00:10:27,040 Speaker 1: for this future where AI is replacing everything, where does 157 00:10:27,040 --> 00:10:28,679 Speaker 1: that get us in the long run? Sure, in the 158 00:10:28,720 --> 00:10:30,840 Speaker 1: short term, you could say we've cut way back on 159 00:10:31,000 --> 00:10:34,800 Speaker 1: costs because we fired all the human employees, so we 160 00:10:34,840 --> 00:10:38,560 Speaker 1: don't have those costs anymore. But that doesn't sustain itself 161 00:10:38,559 --> 00:10:41,360 Speaker 1: for very long at all. I don't know. Maybe I'm 162 00:10:41,400 --> 00:10:44,160 Speaker 1: just missing the big picture here. All right, Well, while 163 00:10:44,200 --> 00:10:50,480 Speaker 1: I'm spiraling in this weird future reality, let's take a 164 00:10:50,559 --> 00:11:03,480 Speaker 1: quick break. Okay, we're back now. Before the break, I 165 00:11:03,520 --> 00:11:08,240 Speaker 1: talked about Alphabet and Google's CEO talking about AI's impact. 166 00:11:08,880 --> 00:11:12,640 Speaker 1: Of course, Google could be looking at a very specific 167 00:11:12,679 --> 00:11:16,480 Speaker 1: situation in which AI could have a potentially negative impact. 
168 00:11:16,679 --> 00:11:19,760 Speaker 1: In fact, it already has had a negative impact on Google. 169 00:11:20,200 --> 00:11:25,200 Speaker 1: So apparently last month, Google employees got word that Samsung 170 00:11:25,400 --> 00:11:30,200 Speaker 1: is considering ditching Google Search for Microsoft Bing, which of 171 00:11:30,200 --> 00:11:35,480 Speaker 1: course is augmented by ChatGPT. There's our AI angle. Now, 172 00:11:35,480 --> 00:11:38,559 Speaker 1: if Samsung did do this, if Samsung chose to switch 173 00:11:38,559 --> 00:11:41,679 Speaker 1: from Google Search to Microsoft Bing, that would be a 174 00:11:41,800 --> 00:11:45,640 Speaker 1: huge blow to Google's dominant market share in the search space. 175 00:11:46,280 --> 00:11:49,640 Speaker 1: For years, Google has enjoyed being the eight-hundred-pound 176 00:11:49,720 --> 00:11:54,520 Speaker 1: gorilla in online search, which, honestly, that's an understatement. So 177 00:11:54,559 --> 00:11:57,800 Speaker 1: according to Statista, which, you know, keep that in mind, 178 00:11:57,840 --> 00:12:01,720 Speaker 1: like that's just one source, Google Search took up nearly 179 00:12:01,880 --> 00:12:06,320 Speaker 1: ninety seven percent of the mobile market share in January 180 00:12:06,400 --> 00:12:10,760 Speaker 1: twenty twenty three. Now that's mobile, not search overall, but 181 00:12:10,920 --> 00:12:15,920 Speaker 1: ninety seven percent. So if that's true, if ninety seven 182 00:12:16,000 --> 00:12:19,920 Speaker 1: percent of mobile devices use Google Search as, like, the 183 00:12:19,920 --> 00:12:23,280 Speaker 1: default search, there's really nowhere to go but down. You're 184 00:12:23,280 --> 00:12:26,600 Speaker 1: not really going to creep up. You certainly are in 185 00:12:26,640 --> 00:12:29,000 Speaker 1: the realm of being called a monopoly, and it'd be 186 00:12:29,120 --> 00:12:34,240 Speaker 1: very difficult to argue against that, right? But if you're Google, 187 00:12:34,440 --> 00:12:36,840 Speaker 1: you do not want to see those numbers go down 188 00:12:37,559 --> 00:12:42,400 Speaker 1: because that's bad for your business. So the word that Samsung, 189 00:12:42,520 --> 00:12:47,440 Speaker 1: a massively important player in the mobile space, could turn 190 00:12:47,559 --> 00:12:51,840 Speaker 1: to Bing instead of Google then sent a panic through Google, 191 00:12:51,920 --> 00:12:54,040 Speaker 1: which The New York Times picked up on and then 192 00:12:54,080 --> 00:12:57,319 Speaker 1: published an article about it. And so that panic then 193 00:12:57,400 --> 00:13:02,600 Speaker 1: spilled out from Google internally to Google shareholders, and yesterday 194 00:13:02,679 --> 00:13:05,760 Speaker 1: the company saw stock prices drop by around four percent. 195 00:13:06,559 --> 00:13:10,679 Speaker 1: As for revenue, according to Gizmodo, a Samsung switch could mean 196 00:13:10,720 --> 00:13:14,280 Speaker 1: Google misses out on around three billion dollars of revenue 197 00:13:14,280 --> 00:13:17,839 Speaker 1: per year. Yowza. That's a huge amount of money. Of course, 198 00:13:18,640 --> 00:13:22,000 Speaker 1: Google rakes in more than one hundred and sixty billion 199 00:13:22,160 --> 00:13:27,000 Speaker 1: dollars per year, so it's still an enormous company. 
But 200 00:13:27,120 --> 00:13:31,160 Speaker 1: maybe Pichai was warning us about AI because Google's hoping 201 00:13:31,160 --> 00:13:34,480 Speaker 1: to push out its own AI-augmented search tool in 202 00:13:34,559 --> 00:13:38,320 Speaker 1: order to keep Samsung's business and keep that stranglehold, 203 00:13:38,920 --> 00:13:43,480 Speaker 1: particularly on the mobile search market. Bloomberg reports that a 204 00:13:43,480 --> 00:13:47,520 Speaker 1: couple of research papers show ChatGPT is pretty good 205 00:13:47,640 --> 00:13:50,880 Speaker 1: at figuring out whether news will be good or bad 206 00:13:51,040 --> 00:13:54,720 Speaker 1: for a company's stock price. So one of the two 207 00:13:54,760 --> 00:13:59,360 Speaker 1: papers analyzed how well ChatGPT could analyze statements that 208 00:13:59,400 --> 00:14:02,040 Speaker 1: came out of the Federal Reserve to determine if 209 00:14:02,080 --> 00:14:06,240 Speaker 1: they were quote unquote hawkish or dovish, and the 210 00:14:06,320 --> 00:14:11,920 Speaker 1: other paper analyzed ChatGPT's ability to parse financial news 211 00:14:12,040 --> 00:14:16,320 Speaker 1: headlines about companies and then figure out whether those headlines 212 00:14:16,320 --> 00:14:18,840 Speaker 1: were a good indicator for the stock price or a 213 00:14:18,880 --> 00:14:23,520 Speaker 1: bad indicator. And apparently the findings show that ChatGPT 214 00:14:23,840 --> 00:14:27,400 Speaker 1: is pretty darn good at sussing that stuff out, you know, 215 00:14:27,480 --> 00:14:31,200 Speaker 1: almost as good as a trained human analyst would be. 216 00:14:32,000 --> 00:14:35,480 Speaker 1: So I wouldn't call ChatGPT superhuman. It's not like 217 00:14:35,520 --> 00:14:40,440 Speaker 1: it's doing something that people cannot do. However, chatbots can 218 00:14:40,640 --> 00:14:44,080 Speaker 1: analyze way more information at a much faster rate than 219 00:14:44,240 --> 00:14:49,640 Speaker 1: a human can. And moreover, it's possible that individual 220 00:14:49,680 --> 00:14:53,480 Speaker 1: investors could start to lean on tools like ChatGPT 221 00:14:54,280 --> 00:14:57,520 Speaker 1: to figure out which investments could be safe bets, which 222 00:14:57,560 --> 00:15:00,680 Speaker 1: ones could be long shots, and which ones might just 223 00:15:00,720 --> 00:15:03,600 Speaker 1: be throwing your money away. So if you can get 224 00:15:03,600 --> 00:15:07,240 Speaker 1: the same sort of guidance from ChatGPT that you 225 00:15:07,280 --> 00:15:11,560 Speaker 1: would normally need a professional analyst to provide, well, that 226 00:15:11,600 --> 00:15:15,440 Speaker 1: definitely can change the game. And it's possibly bad news 227 00:15:15,440 --> 00:15:18,640 Speaker 1: for the analysts out there because their jobs could become 228 00:15:18,680 --> 00:15:23,920 Speaker 1: one of the ones potentially impacted by AI. Sony recently 229 00:15:23,960 --> 00:15:28,560 Speaker 1: held its World Photography Awards and chose photographer Boris Eldagsen 230 00:15:28,840 --> 00:15:31,920 Speaker 1: as the recipient of an award in the Creative Open 231 00:15:32,120 --> 00:15:37,400 Speaker 1: category for photography. Eldagsen has declined to accept this award 232 00:15:37,520 --> 00:15:40,880 Speaker 1: because he says the image he submitted was not in 233 00:15:40,920 --> 00:15:45,280 Speaker 1: fact a photograph he snapped, but rather a computer-generated image. 
234 00:15:45,680 --> 00:15:49,240 Speaker 1: Eldagsen says that his intent was to test a major 235 00:15:49,280 --> 00:15:51,960 Speaker 1: photography competition to see if the field is ready to 236 00:15:52,000 --> 00:15:58,160 Speaker 1: distinguish between photographs taken by human photographers and computer-generated imagery, 237 00:15:58,200 --> 00:16:00,760 Speaker 1: and he described his state of mind as that of 238 00:16:00,800 --> 00:16:05,320 Speaker 1: a cheeky monkey, his words, not mine. I can certainly 239 00:16:05,360 --> 00:16:08,560 Speaker 1: appreciate that. And while I'm sure that this matter will 240 00:16:08,600 --> 00:16:12,840 Speaker 1: spawn a lot of criticism for Eldagsen and the competition, 241 00:16:13,120 --> 00:16:15,800 Speaker 1: I think his intentions were good ones. His actions show 242 00:16:15,840 --> 00:16:20,680 Speaker 1: that for things like competitions, we really need to take into 243 00:16:20,680 --> 00:16:25,280 Speaker 1: consideration the possibility of AI-aided or AI-generated design as 244 00:16:25,360 --> 00:16:29,440 Speaker 1: part of that competition, or figure out how we prevent 245 00:16:29,480 --> 00:16:33,360 Speaker 1: it from being part of the competition if that's our desire. So, 246 00:16:33,560 --> 00:16:36,920 Speaker 1: Eldagsen said that the photography world needs to have an 247 00:16:36,920 --> 00:16:41,040 Speaker 1: open discussion about AI's place in photography. Does it have 248 00:16:41,160 --> 00:16:44,880 Speaker 1: a place? Should AI-generated photography even be considered photography 249 00:16:44,960 --> 00:16:47,640 Speaker 1: at all? If not, how do we detect it to 250 00:16:47,680 --> 00:16:51,240 Speaker 1: prevent someone with access to a powerful image generator from 251 00:16:51,440 --> 00:16:53,960 Speaker 1: dominating what is otherwise meant to be a fair competition 252 00:16:54,000 --> 00:16:59,000 Speaker 1: between human photographers? So these questions aren't just hypothetical now 253 00:16:59,000 --> 00:17:02,200 Speaker 1: that image-generating technology has reached a sufficient level of 254 00:17:02,200 --> 00:17:05,439 Speaker 1: sophistication so that it can pass itself off as a 255 00:17:05,520 --> 00:17:09,000 Speaker 1: human-created work. A spokesperson from the World Photography 256 00:17:09,119 --> 00:17:14,400 Speaker 1: Organisation appears to contradict at least some of Eldagsen's statements, 257 00:17:14,840 --> 00:17:17,760 Speaker 1: claiming that he had made it clear that a generative 258 00:17:17,840 --> 00:17:20,439 Speaker 1: tool at least played a part in constructing the image, 259 00:17:20,640 --> 00:17:23,680 Speaker 1: and that they in turn thought that it was interesting, 260 00:17:24,520 --> 00:17:27,080 Speaker 1: and that he had quote unquote fulfilled the criteria for 261 00:17:27,160 --> 00:17:32,520 Speaker 1: the category, and that the organization only withdrew from conversations 262 00:17:32,560 --> 00:17:36,920 Speaker 1: with him after he said that he had purposefully attempted 263 00:17:36,920 --> 00:17:40,679 Speaker 1: to mislead the competition and then declined the award. 
So 264 00:17:40,800 --> 00:17:47,280 Speaker 1: whether the organization was aware of the AI involvement, or maybe 265 00:17:47,520 --> 00:17:51,320 Speaker 1: the extent of the involvement was miscommunicated, I can't really tell, 266 00:17:51,840 --> 00:17:54,240 Speaker 1: but it sounds like the organization says, no, we knew 267 00:17:54,640 --> 00:17:56,479 Speaker 1: what was going on when we gave him the award. 268 00:17:56,560 --> 00:18:00,280 Speaker 1: He just declined it, whereas he's saying, I submitted this 269 00:18:00,320 --> 00:18:02,879 Speaker 1: as my own work, but in fact it was AI-generated. 270 00:18:03,359 --> 00:18:05,520 Speaker 1: I don't know who's telling the truth or where it's 271 00:18:05,560 --> 00:18:09,920 Speaker 1: getting lost in this article. It may be that it's 272 00:18:09,920 --> 00:18:11,880 Speaker 1: a little more complex than that, and I just don't 273 00:18:11,960 --> 00:18:17,280 Speaker 1: understand it all yet. Okay, one more generative AI story. Recently, 274 00:18:17,280 --> 00:18:20,440 Speaker 1: a person using the handle Ghostwriter used generative AI 275 00:18:20,520 --> 00:18:23,760 Speaker 1: to create a song featuring the generated voices of Drake 276 00:18:24,080 --> 00:18:27,159 Speaker 1: and the Weeknd. The song is called Heart on My Sleeve, 277 00:18:27,240 --> 00:18:29,879 Speaker 1: and it became a kind of viral sensation on platforms 278 00:18:29,920 --> 00:18:33,600 Speaker 1: like Spotify and TikTok. So the voices sound like Drake 279 00:18:33,760 --> 00:18:37,359 Speaker 1: and the Weeknd, but they are generated by AI. So 280 00:18:37,400 --> 00:18:41,159 Speaker 1: some refer to this as deepfake audio. Neither artist 281 00:18:41,240 --> 00:18:43,199 Speaker 1: was involved in the actual making of the song, and 282 00:18:43,240 --> 00:18:46,560 Speaker 1: it raises a bunch of questions about image and personality rights. 283 00:18:46,560 --> 00:18:49,200 Speaker 1: So here in the United States, we don't have any 284 00:18:49,320 --> 00:18:53,919 Speaker 1: laws at the federal level that protect personality or image rights. 285 00:18:54,480 --> 00:18:57,000 Speaker 1: So you have no right to your image or to 286 00:18:57,240 --> 00:19:01,359 Speaker 1: the expression of your personality at the federal level, anyway. Some 287 00:19:01,359 --> 00:19:04,600 Speaker 1: of the states do, but not at the federal level. So 288 00:19:05,119 --> 00:19:07,800 Speaker 1: if you were to create a deepfake audio for 289 00:19:08,960 --> 00:19:12,119 Speaker 1: some song and you were using someone else's voice for it, 290 00:19:12,160 --> 00:19:15,440 Speaker 1: there's nothing illegal about that at the federal level. There 291 00:19:15,440 --> 00:19:20,119 Speaker 1: have been companies that have made copyright strikes for deepfake 292 00:19:20,119 --> 00:19:23,920 Speaker 1: audios, but you can't copyright a voice or a personality. 293 00:19:24,359 --> 00:19:26,520 Speaker 1: So I guess they're just using that because it's the 294 00:19:26,520 --> 00:19:29,760 Speaker 1: only weapon they have right now for those kinds of cases. 295 00:19:30,080 --> 00:19:32,919 Speaker 1: But it really shows how the US needs to revisit 296 00:19:33,240 --> 00:19:39,000 Speaker 1: concepts like image protection and personality protection laws. All right, 297 00:19:39,240 --> 00:19:41,760 Speaker 1: that's it for this episode. 
Before I run too long, 298 00:19:41,800 --> 00:19:43,480 Speaker 1: I just want to say I hope you're all well, 299 00:19:43,880 --> 00:19:53,280 Speaker 1: and I'll talk to you again really soon. TechStuff 300 00:19:53,400 --> 00:19:57,920 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 301 00:19:57,960 --> 00:20:01,359 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen 302 00:20:01,400 --> 00:20:02,479 Speaker 1: to your favorite shows.