1 00:00:04,440 --> 00:00:12,360 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,360 --> 00:00:15,840 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:15,840 --> 00:00:19,279 Speaker 1: I'm an executive producer with iHeart Podcasts, and how the 4 00:00:19,440 --> 00:00:22,560 Speaker 1: tech are you? It is time for the tech news 5 00:00:22,760 --> 00:00:28,120 Speaker 1: for Tuesday, November seventh, twenty twenty three. First up, last week, 6 00:00:28,200 --> 00:00:31,360 Speaker 1: I mentioned that the company WeWork was rumored to 7 00:00:31,400 --> 00:00:36,040 Speaker 1: be headed for bankruptcy. Now the company has filed for bankruptcy. 8 00:00:36,400 --> 00:00:39,280 Speaker 1: So at its height, WeWork was worth somewhere in 9 00:00:39,320 --> 00:00:43,560 Speaker 1: the neighborhood of forty seven billion, with a B, dollars. 10 00:00:43,960 --> 00:00:48,080 Speaker 1: Currently it's worth less than fifty million dollars. That is 11 00:00:48,120 --> 00:00:53,360 Speaker 1: an incredible fall from grace. WeWork is not actually 12 00:00:53,479 --> 00:00:56,520 Speaker 1: a tech company. It was treated like one. It was 13 00:00:56,520 --> 00:00:59,560 Speaker 1: treated like it was a tech startup, but you know, 14 00:00:59,640 --> 00:01:04,800 Speaker 1: it is just all about buying or leasing office space 15 00:01:04,880 --> 00:01:09,080 Speaker 1: and then subleasing that to customers. WeWork was already 16 00:01:09,080 --> 00:01:13,360 Speaker 1: in a rough spot before the pandemic even hit, and 17 00:01:13,400 --> 00:01:16,560 Speaker 1: obviously in the wake of the pandemic, corporate America has 18 00:01:16,640 --> 00:01:20,039 Speaker 1: changed its approach to work in quite a few places. 19 00:01:20,600 --> 00:01:23,280 Speaker 1: So this doesn't actually mean that WeWork is totally 20 00:01:23,280 --> 00:01:26,200 Speaker 1: going to go out of business. That's not what bankruptcy means. 21 00:01:26,240 --> 00:01:27,640 Speaker 1: What it means is that the company is going to 22 00:01:27,720 --> 00:01:31,199 Speaker 1: receive protection against, you know, its various creditors, because it's 23 00:01:31,800 --> 00:01:36,440 Speaker 1: in deep, deep debt. And reps are saying that the 24 00:01:36,480 --> 00:01:39,360 Speaker 1: plan is to reorganize the company and to have as 25 00:01:39,400 --> 00:01:43,360 Speaker 1: little disruption in business as possible. While it will likely 26 00:01:43,400 --> 00:01:46,840 Speaker 1: have to get rid of certain properties, you know, sell them 27 00:01:46,880 --> 00:01:50,320 Speaker 1: off or cancel leases or whatever, they want to keep 28 00:01:50,360 --> 00:01:53,520 Speaker 1: as many properties in operation as possible. At least that's 29 00:01:53,560 --> 00:01:57,960 Speaker 1: what the reps are saying. Personally, I think WeWork 30 00:01:57,960 --> 00:02:00,640 Speaker 1: as a business plan has never been super strong. 31 00:02:00,680 --> 00:02:04,440 Speaker 1: The whole concept isn't a new one, and it typically 32 00:02:04,520 --> 00:02:08,560 Speaker 1: has pretty small profit margins. So I'm not saying it's 33 00:02:08,560 --> 00:02:12,320 Speaker 1: impossible for WeWork to find stability, but I'm personally 34 00:02:12,720 --> 00:02:16,720 Speaker 1: pretty pessimistic about it. OpenAI held its first developer 35 00:02:16,760 --> 00:02:21,120 Speaker 1: conference yesterday.
They called it DevDay, and early on 36 00:02:21,240 --> 00:02:24,799 Speaker 1: OpenAI kicked things off by giving developers an incentive 37 00:02:25,240 --> 00:02:28,760 Speaker 1: to make apps built on top of the GPT technology 38 00:02:29,400 --> 00:02:33,079 Speaker 1: through a five hundred dollar credit. The company showed off 39 00:02:33,120 --> 00:02:36,520 Speaker 1: some new capabilities to developers, including a simplified way to 40 00:02:36,560 --> 00:02:40,840 Speaker 1: create customized GPT agents using natural language, which is pretty 41 00:02:40,880 --> 00:02:44,560 Speaker 1: incredible, really. OpenAI is also going to launch an 42 00:02:44,560 --> 00:02:47,239 Speaker 1: online store where developers will be able to sell their 43 00:02:47,240 --> 00:02:51,680 Speaker 1: custom GPT agents to customers. The company also announced it 44 00:02:51,680 --> 00:02:56,680 Speaker 1: would offer protection to developers against copyright infringement claims, so 45 00:02:57,280 --> 00:03:00,600 Speaker 1: OpenAI says it will cover costs for those kinds 46 00:03:00,639 --> 00:03:05,880 Speaker 1: of legal claims against developers. The company also announced that 47 00:03:05,960 --> 00:03:10,160 Speaker 1: it hit a huge milestone of one hundred million weekly users, 48 00:03:10,919 --> 00:03:13,399 Speaker 1: and it sounds like things are just getting started, which 49 00:03:13,440 --> 00:03:15,359 Speaker 1: I'm sure is the source of anxiety for a lot 50 00:03:15,400 --> 00:03:17,920 Speaker 1: of creative types out there who are already concerned that 51 00:03:18,120 --> 00:03:23,480 Speaker 1: models like GPT may be devouring their work and ultimately 52 00:03:23,480 --> 00:03:27,359 Speaker 1: will set up technologies that compete directly against them. Tech 53 00:03:27,440 --> 00:03:31,000 Speaker 1: Xplore's Rob Nichols has an article titled "Do you trust 54 00:03:31,080 --> 00:03:35,160 Speaker 1: AI to write the news? It already is, and not 55 00:03:35,280 --> 00:03:39,240 Speaker 1: without issues." So in this article, Nichols tells the story 56 00:03:39,480 --> 00:03:44,160 Speaker 1: of how Microsoft reposted an article that was originally published 57 00:03:44,200 --> 00:03:48,120 Speaker 1: in The Guardian. So, in the repost of this article, 58 00:03:48,200 --> 00:03:53,000 Speaker 1: Microsoft also enabled a generative AI tool that automatically created 59 00:03:53,120 --> 00:03:57,320 Speaker 1: a poll and connected it to the news story. Unfortunately, 60 00:03:58,160 --> 00:04:00,680 Speaker 1: this news story was about the murder of a young 61 00:04:00,720 --> 00:04:05,160 Speaker 1: woman in Australia named Lilie James, and the poll was 62 00:04:05,240 --> 00:04:08,920 Speaker 1: asking readers to speculate about the nature of her death, 63 00:04:09,560 --> 00:04:13,880 Speaker 1: which is absolutely horrifying, right? It's clearly negligent on the 64 00:04:13,920 --> 00:04:16,880 Speaker 1: part of Microsoft to allow that through, and The Guardian 65 00:04:16,920 --> 00:04:19,640 Speaker 1: had no involvement with the use of this AI tool. 66 00:04:20,040 --> 00:04:23,599 Speaker 1: As Nichols points out, media companies are experimenting with AI 67 00:04:23,680 --> 00:04:27,359 Speaker 1: to generate stuff like polls because polls are proven to 68 00:04:27,400 --> 00:04:31,400 Speaker 1: boost engagement, but they also take time away from staff.
69 00:04:31,440 --> 00:04:34,159 Speaker 1: I'm reminded of when I was working for howstuffworks dot 70 00:04:34,200 --> 00:04:39,640 Speaker 1: com and we would have things like quizzes and galleries 71 00:04:39,680 --> 00:04:41,360 Speaker 1: and this kind of stuff that took a lot of 72 00:04:41,400 --> 00:04:44,080 Speaker 1: time to create. They did create a lot of engagement, 73 00:04:44,279 --> 00:04:47,719 Speaker 1: which is why the company loved them so much, but 74 00:04:47,960 --> 00:04:50,360 Speaker 1: it meant that we weren't actually spending time doing stuff 75 00:04:50,360 --> 00:04:54,120 Speaker 1: like researching and writing articles, which is really what most 76 00:04:54,120 --> 00:04:57,560 Speaker 1: of us wanted to do. So why not offload those 77 00:04:57,640 --> 00:05:04,120 Speaker 1: kinds of somewhat mindless tasks to AI? Well, as this story indicates, 78 00:05:04,440 --> 00:05:08,520 Speaker 1: the subject matter of the news article is incredibly important. 79 00:05:09,120 --> 00:05:12,840 Speaker 1: Nichols goes on to reference various media companies that are 80 00:05:12,880 --> 00:05:15,400 Speaker 1: even going a step further. They're leaning on AI to 81 00:05:15,520 --> 00:05:20,599 Speaker 1: actually generate articles, not just supplemental material, but full on 82 00:05:20,720 --> 00:05:24,080 Speaker 1: news articles. And I've talked in the past about, again, 83 00:05:24,120 --> 00:05:27,640 Speaker 1: how my former employer HowStuffWorks did that for 84 00:05:28,160 --> 00:05:31,240 Speaker 1: HowStuffWorks articles. I have not actually been back to 85 00:05:31,279 --> 00:05:33,560 Speaker 1: the website for a few months now to see if 86 00:05:33,560 --> 00:05:35,479 Speaker 1: that's still the case, but that's what it was like 87 00:05:35,560 --> 00:05:41,560 Speaker 1: in the summer. Nichols argues that AI's shortcomings can create unfortunate, tragic, 88 00:05:41,880 --> 00:05:45,000 Speaker 1: and even dangerous consequences, and I think that's right on 89 00:05:45,040 --> 00:05:48,320 Speaker 1: the money. With a very strong editorial staff, you could 90 00:05:48,360 --> 00:05:53,640 Speaker 1: potentially weed out articles that are misleading or harmful, but 91 00:05:53,760 --> 00:05:57,200 Speaker 1: at some point you're asking editors to act both as 92 00:05:57,240 --> 00:06:01,839 Speaker 1: an editor and as a writer to rewrite pieces, and 93 00:06:01,880 --> 00:06:05,360 Speaker 1: then you start getting into these unmanageable workloads. So I'm 94 00:06:05,400 --> 00:06:09,840 Speaker 1: not entirely convinced that it even makes sense from a 95 00:06:09,880 --> 00:06:13,159 Speaker 1: business perspective. It certainly isn't going to help things like 96 00:06:13,360 --> 00:06:18,200 Speaker 1: editorial morale. Meanwhile, the news division of CBS has launched 97 00:06:18,200 --> 00:06:22,840 Speaker 1: a unit dedicated to investigating things like deepfakes and misinformation, 98 00:06:23,000 --> 00:06:28,320 Speaker 1: particularly from generative AI. The unit has the name CBS 99 00:06:28,560 --> 00:06:32,760 Speaker 1: News Confirmed, and they will have actual real life human 100 00:06:32,800 --> 00:06:36,920 Speaker 1: beings at the helm, thankfully. Claudia Milne and Ross Dagan 101 00:06:37,040 --> 00:06:40,040 Speaker 1: are going to oversee the department. The company is looking 102 00:06:40,080 --> 00:06:45,000 Speaker 1: to hire experts in journalism and AI.
Again, this is 103 00:06:45,240 --> 00:06:50,040 Speaker 1: really encouraging, y'all. I mean, not to get on a soapbox, 104 00:06:50,080 --> 00:06:52,880 Speaker 1: but journalism in general has taken a real bad hit 105 00:06:53,160 --> 00:06:56,000 Speaker 1: over the last couple of decades. And to see a 106 00:06:56,040 --> 00:07:00,359 Speaker 1: company say, no, we want experts in journalism and in 107 00:07:00,480 --> 00:07:05,760 Speaker 1: artificial intelligence so that we are taking a responsible and 108 00:07:05,839 --> 00:07:09,440 Speaker 1: accountable approach toward reporting on this kind of stuff, I 109 00:07:09,440 --> 00:07:12,600 Speaker 1: think that's a huge step in the right direction. And moreover, 110 00:07:12,760 --> 00:07:16,480 Speaker 1: this is something that's arguably already a necessity, because generative 111 00:07:16,480 --> 00:07:21,640 Speaker 1: AI tools are pretty sophisticated, and they are widely distributed, 112 00:07:21,920 --> 00:07:26,840 Speaker 1: and they are largely unregulated. So it is something that 113 00:07:26,880 --> 00:07:28,960 Speaker 1: we do need to put in place in order 114 00:07:29,000 --> 00:07:36,560 Speaker 1: to prevent harm from being committed across entire populations. Microsoft 115 00:07:36,600 --> 00:07:40,880 Speaker 1: announced a partnership with Inworld AI to create Xbox 116 00:07:40,920 --> 00:07:44,320 Speaker 1: developer tools that, well, I mean, with Inworld AI, I'm 117 00:07:44,320 --> 00:07:46,520 Speaker 1: sure you've already guessed it: they're going to integrate AI 118 00:07:46,680 --> 00:07:50,760 Speaker 1: in various ways in the game development cycle. So the 119 00:07:50,800 --> 00:07:53,360 Speaker 1: idea is that developers will be able to create AI 120 00:07:53,480 --> 00:07:57,880 Speaker 1: powered elements in their games, including stuff like AI generated 121 00:07:57,960 --> 00:08:02,320 Speaker 1: stories and quests, and even characters. Tom Warren of The 122 00:08:02,440 --> 00:08:05,480 Speaker 1: Verge wrote about this, and his piece actually surprised me 123 00:08:05,520 --> 00:08:09,920 Speaker 1: because originally I assumed that these AI tools would only 124 00:08:09,960 --> 00:08:13,400 Speaker 1: cover the actual game development phase on the back end, 125 00:08:13,960 --> 00:08:16,920 Speaker 1: that developers would be able to use these tools to 126 00:08:17,040 --> 00:08:20,960 Speaker 1: flesh out content in a game, while the human writers 127 00:08:20,960 --> 00:08:23,560 Speaker 1: would focus on the most important parts of the game. 128 00:08:24,560 --> 00:08:28,720 Speaker 1: So your human writers might be crafting a really satisfying 129 00:08:28,760 --> 00:08:33,240 Speaker 1: and emotional story, right, and you might offload things like 130 00:08:33,840 --> 00:08:38,960 Speaker 1: random NPC conversations to your AI so that you're not 131 00:08:39,120 --> 00:08:43,400 Speaker 1: spending a ton of time just generating lines that players 132 00:08:43,400 --> 00:08:46,760 Speaker 1: may or may not ever encounter. But according to Warren, 133 00:08:47,040 --> 00:08:50,360 Speaker 1: the tool will also allow, quote, an AI character engine 134 00:08:50,360 --> 00:08:54,160 Speaker 1: that can be integrated into games and used to dynamically 135 00:08:54,240 --> 00:09:00,440 Speaker 1: generate stories, quests, and dialogue, end quote.
Now, maybe my 136 00:09:00,520 --> 00:09:04,640 Speaker 1: interpretation is off, but by my reading, that sounds like 137 00:09:04,800 --> 00:09:08,640 Speaker 1: it could mean that you could have these active within 138 00:09:08,720 --> 00:09:10,800 Speaker 1: a game, not just in the game development, but in 139 00:09:10,840 --> 00:09:14,040 Speaker 1: the game itself, so that as you're playing the game, 140 00:09:14,080 --> 00:09:18,640 Speaker 1: you are encountering characters who are dynamically generating dialogue at 141 00:09:18,640 --> 00:09:21,320 Speaker 1: that moment, as opposed to having done it during the 142 00:09:21,320 --> 00:09:24,560 Speaker 1: game development phase and then humans saying, yes, let's include 143 00:09:24,559 --> 00:09:27,400 Speaker 1: that in the game, or no, that doesn't really work, 144 00:09:27,480 --> 00:09:31,160 Speaker 1: let's strike it. If it's something that's truly dynamic, then 145 00:09:31,160 --> 00:09:33,520 Speaker 1: it may be, like, in the game itself. So you 146 00:09:33,520 --> 00:09:37,120 Speaker 1: could have a conversation with a character and it could 147 00:09:37,120 --> 00:09:39,320 Speaker 1: be totally different than someone else who's playing the game 148 00:09:39,360 --> 00:09:42,000 Speaker 1: and having a conversation with that same character. That to 149 00:09:42,040 --> 00:09:45,880 Speaker 1: me is really interesting. Now that's assuming that my interpretation 150 00:09:46,000 --> 00:09:48,400 Speaker 1: is correct, and I could be wrong. But if I 151 00:09:48,440 --> 00:09:50,439 Speaker 1: am right, that means that we could see an end 152 00:09:50,440 --> 00:09:53,440 Speaker 1: to NPCs spouting off the same lines over and over, 153 00:09:54,040 --> 00:09:57,480 Speaker 1: which would mean that we would have no more memes 154 00:09:57,559 --> 00:09:59,440 Speaker 1: like "I used to be an adventurer like you, and 155 00:09:59,480 --> 00:10:02,240 Speaker 1: then I took an arrow in the knee." Microsoft said 156 00:10:02,280 --> 00:10:06,040 Speaker 1: that developers will determine if and to what extent they'll 157 00:10:06,160 --> 00:10:09,200 Speaker 1: use AI, so it's not like this is mandated, and 158 00:10:09,280 --> 00:10:13,800 Speaker 1: obviously this is also a very sensitive topic. You can 159 00:10:13,960 --> 00:10:17,959 Speaker 1: frame this as a way for developers to make better 160 00:10:18,080 --> 00:10:20,080 Speaker 1: use of their time and to be more efficient, but 161 00:10:20,120 --> 00:10:22,960 Speaker 1: you could also frame this as a way to take 162 00:10:23,040 --> 00:10:26,720 Speaker 1: work away from people, right? Whether it's a voice actor 163 00:10:27,040 --> 00:10:33,520 Speaker 1: with a simulated voice, or game developers or writers, et cetera, 164 00:10:34,040 --> 00:10:37,240 Speaker 1: there's this deep concern that some game studios could choose 165 00:10:37,280 --> 00:10:40,480 Speaker 1: to go with the cheaper AI option rather than 166 00:10:40,559 --> 00:10:43,760 Speaker 1: pay, you know, those pesky human beings to do the work, 167 00:10:44,400 --> 00:10:47,200 Speaker 1: and among gamers there's also a concern that AI generated 168 00:10:47,240 --> 00:10:50,880 Speaker 1: games will not measure up to the top tier of 169 00:10:51,080 --> 00:10:56,720 Speaker 1: titles that human beings have made in past years.
On Saturday, xAI, 170 00:10:57,679 --> 00:11:02,520 Speaker 1: the artificial intelligence startup from X-obsessed Elon Musk, launched 171 00:11:02,559 --> 00:11:08,679 Speaker 1: a chatbot called Grok. If you're curious what sets 172 00:11:08,720 --> 00:11:12,200 Speaker 1: Grok apart from other chatbots, well, to the shock of 173 00:11:12,320 --> 00:11:15,200 Speaker 1: absolutely no one, it's a bit of a potty mouth. 174 00:11:15,800 --> 00:11:19,760 Speaker 1: It takes a more grouchy and vulgar approach to answering questions, 175 00:11:20,200 --> 00:11:23,439 Speaker 1: almost as if the chatbot is insulted that it's being bothered 176 00:11:23,440 --> 00:11:26,240 Speaker 1: to answer those questions in the first place in some cases. 177 00:11:26,840 --> 00:11:30,160 Speaker 1: Elon Musk was very coy about, gosh, I wonder who 178 00:11:30,200 --> 00:11:34,680 Speaker 1: decided that the chatbot should have an attitude. Anyway, xAI 179 00:11:34,760 --> 00:11:37,360 Speaker 1: has indicated that the actual chatbot will have a couple 180 00:11:37,360 --> 00:11:41,560 Speaker 1: of different modes. It'll have the fun mode, which presumably 181 00:11:41,640 --> 00:11:44,040 Speaker 1: is the one that has all the attitude, and then 182 00:11:44,120 --> 00:11:47,160 Speaker 1: I'm guessing it'll have an alternative that'll be a little 183 00:11:47,160 --> 00:11:49,720 Speaker 1: more straightforward and standard, something that's more in line with 184 00:11:49,760 --> 00:11:52,920 Speaker 1: the other chatbots that you can find out there. Musk's 185 00:11:52,960 --> 00:11:56,319 Speaker 1: plan is to release the chatbot to X Premium subscribers 186 00:11:56,360 --> 00:11:59,559 Speaker 1: once it emerges from beta, which is pretty darn funny 187 00:11:59,600 --> 00:12:01,719 Speaker 1: because for ages, Musk has argued that one of the 188 00:12:01,720 --> 00:12:04,720 Speaker 1: biggest problems with Twitter is the bots, and now he's 189 00:12:05,120 --> 00:12:10,480 Speaker 1: releasing one to Twitter. But whatever. Okay, I'm gonna take 190 00:12:10,480 --> 00:12:12,600 Speaker 1: a quick break to thank our sponsors. We'll be back 191 00:12:12,679 --> 00:12:24,680 Speaker 1: with some more news in just a moment. We're back. So, 192 00:12:25,240 --> 00:12:29,679 Speaker 1: Lucas Ropek of Gizmodo has an article titled "Cruise robotaxis 193 00:12:29,720 --> 00:12:33,800 Speaker 1: require remote human assistance every four to five miles." As 194 00:12:33,800 --> 00:12:36,480 Speaker 1: that headline suggests, it has been a bumpy road for 195 00:12:36,559 --> 00:12:39,560 Speaker 1: the autonomous taxi company, and you've got to remember, 196 00:12:39,600 --> 00:12:44,720 Speaker 1: Cruise is also owned by General Motors. Just recently, the 197 00:12:44,720 --> 00:12:49,120 Speaker 1: state of California revoked Cruise's license to operate autonomous vehicles 198 00:12:49,240 --> 00:12:52,280 Speaker 1: due to concerns that the company's vehicles were, quote, an 199 00:12:52,360 --> 00:12:57,160 Speaker 1: unreasonable risk to public safety, end quote. This news story 200 00:12:57,280 --> 00:13:00,560 Speaker 1: is that apparently staff at Cruise frequently have to intervene 201 00:13:00,559 --> 00:13:04,920 Speaker 1: and provide what was called remote assistance to Cruise vehicles 202 00:13:05,000 --> 00:13:08,079 Speaker 1: due to the tendency to encounter situations that the vehicles 203 00:13:08,120 --> 00:13:12,920 Speaker 1: aren't able to navigate.
Tiffany Testo, spokesperson for Cruise, said 204 00:13:12,920 --> 00:13:16,160 Speaker 1: this happened every four to five miles of travel among 205 00:13:16,200 --> 00:13:20,160 Speaker 1: the company's vehicles, so not four to five miles per vehicle, 206 00:13:20,200 --> 00:13:23,040 Speaker 1: but rather, across the fleet, every four to five miles 207 00:13:23,080 --> 00:13:27,480 Speaker 1: there was a need to provide remote assistance. Ultimately, the 208 00:13:27,480 --> 00:13:30,000 Speaker 1: story seems to reinforce that we're still a pretty good ways 209 00:13:30,000 --> 00:13:33,240 Speaker 1: from a future of truly autonomous vehicles and that human 210 00:13:33,280 --> 00:13:37,120 Speaker 1: intervention is still a necessary component. I will add that 211 00:13:37,280 --> 00:13:42,640 Speaker 1: it wasn't very clear to what extent that assistance goes, right, 212 00:13:42,720 --> 00:13:45,880 Speaker 1: whether it's just providing a little bit of data and 213 00:13:45,920 --> 00:13:48,640 Speaker 1: then the car takes care of everything else itself, or 214 00:13:48,679 --> 00:13:51,640 Speaker 1: if it goes so far as to require remote operations. 215 00:13:52,400 --> 00:13:56,200 Speaker 1: TikTok is ending its Creator Fund on December sixteenth. So 216 00:13:56,240 --> 00:13:59,560 Speaker 1: in case you're not aware, the Creator Fund is a 217 00:13:59,640 --> 00:14:02,960 Speaker 1: pool of money. Currently it is valued at around two 218 00:14:03,080 --> 00:14:06,880 Speaker 1: billion dollars, and TikTok uses that pool of money to 219 00:14:06,920 --> 00:14:10,199 Speaker 1: issue payments to creators who generate a ton of videos 220 00:14:10,200 --> 00:14:13,439 Speaker 1: from their work, or a ton of views, I 221 00:14:13,480 --> 00:14:16,680 Speaker 1: should say, for their videos. So the whole idea was 222 00:14:16,760 --> 00:14:19,200 Speaker 1: this would be a direct way for TikTok stars to 223 00:14:19,280 --> 00:14:22,560 Speaker 1: monetize their work, because in the past they really had 224 00:14:22,560 --> 00:14:24,880 Speaker 1: to hustle, right? They could go viral, but there was 225 00:14:24,920 --> 00:14:26,880 Speaker 1: no way to make money off of that unless they 226 00:14:26,920 --> 00:14:31,600 Speaker 1: also landed a sponsorship deal with a third party. The 227 00:14:31,840 --> 00:14:34,600 Speaker 1: Creator Fund was meant to be a more direct path 228 00:14:34,640 --> 00:14:38,240 Speaker 1: to monetization, but it didn't get a very good reception. Lots 229 00:14:38,240 --> 00:14:41,640 Speaker 1: of creators complained that when they did receive a payout 230 00:14:41,840 --> 00:14:44,320 Speaker 1: it was pennies on the dollar. They were barely making 231 00:14:44,400 --> 00:14:47,160 Speaker 1: any money at all, and it wasn't worth the amount 232 00:14:47,200 --> 00:14:50,920 Speaker 1: of work, nor did it reflect the tremendous number of 233 00:14:51,000 --> 00:14:53,600 Speaker 1: views some of these folks were stacking up. So that 234 00:14:53,640 --> 00:14:57,440 Speaker 1: program is going to go away on December sixteenth.
However, 235 00:14:57,520 --> 00:15:02,200 Speaker 1: TikTok does have an alternative in place called the Creativity Program, 236 00:15:02,840 --> 00:15:05,680 Speaker 1: and it sounds to me like it's pretty similar to 237 00:15:05,720 --> 00:15:09,720 Speaker 1: the Creator Fund, except this one is specifically focused on 238 00:15:09,880 --> 00:15:14,040 Speaker 1: longer form videos, stuff that's at least, you know, longer 239 00:15:14,080 --> 00:15:18,000 Speaker 1: than a minute. And I'm not sure if this also 240 00:15:18,040 --> 00:15:21,640 Speaker 1: means that TikTok will be better when it comes to 241 00:15:21,720 --> 00:15:25,240 Speaker 1: storing creator data. It came to light earlier this year 242 00:15:25,920 --> 00:15:30,960 Speaker 1: that some creator financial data, like personally identifying and very 243 00:15:32,640 --> 00:15:36,480 Speaker 1: private financial data of creators, was being stored on servers 244 00:15:36,520 --> 00:15:41,040 Speaker 1: in China. This was despite the fact that TikTok representatives 245 00:15:41,040 --> 00:15:43,480 Speaker 1: had been claiming that all that kind of information would 246 00:15:43,560 --> 00:15:47,320 Speaker 1: only be on servers in the United States or in Singapore. 247 00:15:47,800 --> 00:15:50,880 Speaker 1: But Forbes investigated this and found that at least some 248 00:15:50,920 --> 00:15:53,280 Speaker 1: of it was showing up on servers in China, which 249 00:15:53,320 --> 00:15:58,680 Speaker 1: is concerning. Sony is following Microsoft's lead by discontinuing the 250 00:15:58,720 --> 00:16:02,360 Speaker 1: PS four and PS five integrations with X, also known 251 00:16:02,400 --> 00:16:06,160 Speaker 1: as Twitter. Microsoft ended integration for the Xbox way back 252 00:16:06,200 --> 00:16:09,720 Speaker 1: in April. Sony has not commented on the reason for 253 00:16:10,000 --> 00:16:13,320 Speaker 1: ending integration with X, but if I had to guess, 254 00:16:13,480 --> 00:16:15,920 Speaker 1: I would say it has something to do with X's 255 00:16:16,040 --> 00:16:20,920 Speaker 1: change to its API, or application programming interface. So back 256 00:16:20,920 --> 00:16:24,400 Speaker 1: in April, Twitter at that time shut down most of 257 00:16:24,440 --> 00:16:27,320 Speaker 1: the features that were found in the free tier of 258 00:16:27,400 --> 00:16:32,520 Speaker 1: its API. Instead, they introduced these pricey paid tiers, 259 00:16:32,560 --> 00:16:38,320 Speaker 1: and the enterprise level tier had potentially really hefty price tags. 260 00:16:38,360 --> 00:16:42,160 Speaker 1: Wired reported last May that some companies could be looking 261 00:16:42,160 --> 00:16:44,680 Speaker 1: to pay as much as forty two thousand dollars a 262 00:16:44,920 --> 00:16:48,920 Speaker 1: month in order to make use of this enterprise API. So, 263 00:16:49,000 --> 00:16:52,800 Speaker 1: assuming Sony was incurring substantial fees to allow for X integration, 264 00:16:53,600 --> 00:16:56,200 Speaker 1: it's no wonder that they've decided to shut it down now. 265 00:16:56,720 --> 00:16:59,720 Speaker 1: It's more surprising that it actually stuck around half a 266 00:16:59,760 --> 00:17:05,040 Speaker 1: year longer than the Xbox integration did, so that's something. Now,
you 267 00:17:05,119 --> 00:17:08,119 Speaker 1: might remember that Epic Games, the maker of the insanely 268 00:17:08,200 --> 00:17:11,600 Speaker 1: popular title Fortnite, got into a massive legal battle with 269 00:17:11,640 --> 00:17:15,600 Speaker 1: Apple regarding how Apple handles payments within iOS. That is 270 00:17:15,640 --> 00:17:19,280 Speaker 1: still not fully resolved, because while a judge ruled mostly 271 00:17:19,640 --> 00:17:23,320 Speaker 1: in Apple's favor, the judge did give Epic some considerations. 272 00:17:23,960 --> 00:17:27,119 Speaker 1: That whole case is now headed toward the US Supreme 273 00:17:27,160 --> 00:17:29,800 Speaker 1: Court to weigh in on it. Meanwhile, Epic is now 274 00:17:29,800 --> 00:17:34,000 Speaker 1: pursuing a similar legal strategy against Google. So, like Apple's, 275 00:17:34,240 --> 00:17:37,600 Speaker 1: Google's strategy in mobile is to funnel payment options through 276 00:17:37,680 --> 00:17:41,080 Speaker 1: Google itself, and that allows Google to take a commission, 277 00:17:41,160 --> 00:17:45,399 Speaker 1: sometimes as large as thirty percent per payment. Epic argues 278 00:17:45,440 --> 00:17:48,320 Speaker 1: that Google's policies are anticompetitive, that they hurt 279 00:17:48,359 --> 00:17:51,800 Speaker 1: consumers, and that they drive up prices. Whether the court will 280 00:17:51,840 --> 00:17:55,439 Speaker 1: follow the path that the Apple lawsuit took remains to 281 00:17:55,440 --> 00:17:58,639 Speaker 1: be seen. Complicating matters is the fact that Google is 282 00:17:58,680 --> 00:18:01,119 Speaker 1: currently in the hot seat with the US government in 283 00:18:01,160 --> 00:18:06,200 Speaker 1: a much larger antitrust investigation. You know, I've often talked 284 00:18:06,200 --> 00:18:08,800 Speaker 1: about how the way NFTs got rolled out was a 285 00:18:08,800 --> 00:18:12,760 Speaker 1: total disaster; while I don't necessarily think NFT technology 286 00:18:13,320 --> 00:18:15,920 Speaker 1: has no place in the world, the way it has 287 00:18:16,000 --> 00:18:20,800 Speaker 1: been introduced was really, really dumb. They didn't really amount 288 00:18:20,800 --> 00:18:23,919 Speaker 1: to much more than a digital receipt, despite promises that 289 00:18:24,000 --> 00:18:28,600 Speaker 1: NFTs were going to enable all sorts of interesting implementations. Instead, 290 00:18:28,640 --> 00:18:32,600 Speaker 1: it just became a speculative circus that ultimately ended in disillusionment, 291 00:18:33,119 --> 00:18:36,720 Speaker 1: with the tech's reputation suffering a massive and maybe even 292 00:18:36,880 --> 00:18:40,720 Speaker 1: fatal blow. But despite all that, there are still true 293 00:18:40,760 --> 00:18:45,359 Speaker 1: believers out there. Some of them are pretty badly burnt. 294 00:18:45,920 --> 00:18:47,960 Speaker 1: And I don't mean that they were burnt by NFT 295 00:18:48,119 --> 00:18:53,400 Speaker 1: values crashing. I mean they literally got burnt. You see, 296 00:18:53,400 --> 00:18:57,159 Speaker 1: in Hong Kong there was this big event called ApeFest, 297 00:18:57,440 --> 00:19:01,320 Speaker 1: which was for devotees of the Bored Ape Yacht Club 298 00:19:01,480 --> 00:19:05,800 Speaker 1: NFTs, and while there, a bunch of attendees reported that 299 00:19:06,160 --> 00:19:10,480 Speaker 1: they later suffered really bad pains. Some of them 300 00:19:10,680 --> 00:19:14,800 Speaker 1: weren't even able to see.
And while I haven't seen any 301 00:19:14,840 --> 00:19:18,240 Speaker 1: definitive proof about the matter, the speculation is that the 302 00:19:18,440 --> 00:19:22,240 Speaker 1: venue hosting this event was using these powerful UV lights 303 00:19:22,280 --> 00:19:25,800 Speaker 1: as part of its lighting rig, and those lights overexposed 304 00:19:25,840 --> 00:19:30,320 Speaker 1: attendees to UV radiation, so they essentially got sunburnt 305 00:19:30,720 --> 00:19:34,800 Speaker 1: inside, and they even suffered burns to their eyes. According 306 00:19:34,800 --> 00:19:38,119 Speaker 1: to Jess Weatherbed of The Verge, a totally different event 307 00:19:38,160 --> 00:19:41,120 Speaker 1: in Hong Kong that happened way back in twenty seventeen 308 00:19:41,680 --> 00:19:44,760 Speaker 1: had a similar problem. Folks discovered that the venue was 309 00:19:44,880 --> 00:19:47,320 Speaker 1: using UV lights that were meant to be used to 310 00:19:47,560 --> 00:19:51,320 Speaker 1: disinfect stuff, so they were emitting UV at a much 311 00:19:51,359 --> 00:19:55,040 Speaker 1: higher intensity than, say, your average black light. Now, just 312 00:19:55,080 --> 00:19:58,280 Speaker 1: to be real here, I'm wishing everyone affected a swift 313 00:19:58,320 --> 00:20:02,000 Speaker 1: recovery, because it just sounds awful. And that's it 314 00:20:02,200 --> 00:20:06,160 Speaker 1: for the tech news today for November seventh, twenty twenty three. 315 00:20:06,600 --> 00:20:09,399 Speaker 1: I hope you are all well, and I'll talk to 316 00:20:09,400 --> 00:20:19,560 Speaker 1: you again really soon. Tech Stuff is an iHeartRadio production. 317 00:20:19,840 --> 00:20:24,880 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 318 00:20:25,000 --> 00:20:30,880 Speaker 1: or wherever you listen to your favorite shows.