Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm Oz Woloshyn.
Speaker 3: And I'm Cara Price.
Speaker 1: Today we're going to get into the headlines this week, including an anti-tech rally in New York City and a look at how workslop is affecting the workplace. Then, on Chat and Me:
Speaker 2: I had students take personality tests in Spanish and then used AI to turn their personality data into unique paintings.
Speaker 1: All of that on the week in tech. It's Friday, October third. Hello, Cara.
Speaker 3: Hi, Oz.
Speaker 1: One of the great pleasures of recording this podcast with you is getting to hear you chew ice right next to the microphone before we start. I hope your mouth is nice and cool.
Speaker 3: I'm actually numb. I'm talking like I just got a tongue piercing.
Speaker 1: My ears are numb as well.
Speaker 3: Well, speaking of being in the studio together, it's your turn in the hot seat now to make fun of. Are you even wearing shoes today?
Speaker 1: Oh, it's a deep cut. I actually am right now, but you obviously noticed that I do have an unconscious habit of slipping them off.
Speaker 3: I do, and it's disgusting.
Speaker 1: It is. Well, I consciously try and keep them on, in fact. But I did get in trouble. I think I probably told you this story: many years ago, when I was a consultant working on the Google account, I once unconsciously slipped them off and was padding around Google's actual office, and my boss took me aside and said, listen, you're a great guy, but if you ever take your shoes off in a meeting again, you're fired. Which is totally fair enough.
Speaker 3: It's such a power move.
Speaker 1: It was. I was so whacked out that I was not even aware of it.
Speaker 3: You had six hundred coffees, probably.
Speaker 1: By then, exactly. But that's why I was particularly delighted to be ahead of the curve on this.
Speaker 1: It's now the norm in Silicon Valley, apparently, according to the Fortune article you shared to shame me in our full producers-and-hosts Slack channel, with the headline "The hottest workplace policy at startups right now: no shoes." So I'm curious, Cara, how would you feel if we had a shoes-off policy at Kaleidoscope?
Speaker 3: The thing is that I am so obsessed with dumbing down the workplace to make workaholism...
Speaker 1: Feel comfortable? Exactly, and that's exactly what this policy is all about. So that's interesting. There was actually a picture in the Fortune article of a shoe rack.
Speaker 3: Did you see that? It was like a Japanese restaurant.
Speaker 1: Yeah, it looks like somewhere between a Japanese restaurant and a high school, right? I mean, it's like a locker full of dirty trainers. I mean, it's really the aesthetics of the new Silicon Valley.
Speaker 3: Weird, but it's a sort of sinister Trojan horse, in the way that, you know, I was just reading in the article all of these ways that companies make offices more appealing, and I think going in and being like, you know what, I'm walking right back into my apartment to go to work, is like...
Speaker 1: Gotcha: free haircuts, but shoes off.
Speaker 3: That's right.
Speaker 1: Well, I'm obviously obsessed by the workplace, sadly, so I couldn't resist this story in the Harvard Business Review, which went around the internet and had the headline "AI-generated workslop is destroying productivity."
Speaker 3: When was the last time you heard the word slop?
Speaker 1: I hear the word slop all the time now.
Speaker 3: Yeah, it's weird. It's like AI has brought slop back into the conversation when it was never that. Like, slop was not a thing.
Speaker 1: You're so right that the twenty twenty six word of the year is slop. I do agree. I think I first started hearing it at the beginning of this year: AI slop, AI slop, AI slop. And now it's like everywhere. So, workslop. Well, guess what it is.
Speaker 3: The bottom of the barrel.
Speaker 1: It's AI-generated crap that looks a little bit like real work. And I was pretty interested in this because I am not only the host of Tech Stuff with you, I'm also the co-founder of a podcast network called Kaleidoscope, which is the network that produces this podcast. And in my CEO hat, obviously in response to investors and culture and whatever else, I'm always encouraging our team: how can we use AI? How can we be part of the AI revolution? How can we not be left behind? On the other hand, when people send me work that's clearly been made by AI, I'm absolutely furious. Which is, I think, the paradox at the heart of this Harvard Business Review study, which found the following, quote: "Approximately half of the people we surveyed viewed colleagues who sent workslop as less creative, less capable, and less reliable than they did before receiving the output. Forty-two percent saw them as less trustworthy, and thirty-seven percent thought that the colleague was less intelligent afterwards."
Speaker 3: But if you're already sort of not in love with your colleagues, now there's a new reason to think that your colleagues are slacking off.
Speaker 1: Essentially, yeah. I mean, it's interesting, right? Not all AI-assisted work is workslop, obviously, but there's also the sort of lazy thing where you say, oh well, let me have AI generate the report and not think. And I think that's what this article is really about.
Speaker 3: I looked at this article too. There's this idea of the distinction between a pilot and a passenger. I can't exactly wrap my head around that.
Speaker 1: Well, what do you imagine it is?
Speaker 3: Well, a pilot flies the plane.
Speaker 1: Yeah, and what does the passenger do?
Speaker 3: Rides the plane?
Speaker 1: The passenger sits in the back.
Speaker 3: Are you a pilot or a passenger?
Speaker 1: I'm not as much of a pilot as I would like to be with AI, honestly. But the distinction they're making is that pilots are people who actually take these new generative AI tools and harness them to do better work, and passengers are people who use them either because their boss has told them to or because they're trying to shortcut doing their work. So essentially, you know, passengers churn out AI slop, and pilots, at least according to this study, use AI to be better workers.
Speaker 3: This actually reminds me of one of my recent favorite party term drops, which is cognitive offloading, which, you know, is when we offload critical thinking to generative AI, and just how much that puts the person who's using AI at a deficit. And I think that slop, I guess, is a product of cognitive offloading.
Speaker 1: It is, and interestingly, not just cognitive offloading on behalf of the employee, but also in many cases cognitive offloading on behalf of the boss. There are a lot of these mandates where it's basically, you have to use AI, period. Okay, but if you don't set a proper playbook for how to use it and some training for how to use it smartly, no surprise: garbage in, garbage out. I actually thought this story appealed to me not just because of the, you know, workplace context, but also the macro context and these swirling questions about whether all this investment in AI will actually pay off to transform the economy. And the Financial Times had a great piece where they actually used AI, but in an intriguing way, to do a kind of meta-analysis of SEC filings of different companies and how they're using AI. The takeaway was, quote, "The biggest US-listed companies keep talking about AI, but other than fear of missing out, few appear to be able to describe how the technology is changing their business for the better." Stick with that for a moment.
Speaker 3: That's really crazy.
Speaker 1: On top of that, MIT put out a report in July that found that ninety-five percent of organizations are getting zero return.
Speaker 3: What do you mean by that?
Speaker 1: Well, I guess the benefits are not being realized in the real world by most people. The HBR piece goes on to talk about how, based on self-reporting, forty-one percent of employees say they're dealing with AI slop, and each instance of AI slop costs them almost two hours to clean up, whatever the slop is. So Harvard Business Review goes on to estimate that each incident of workslop creates what they call an invisible tax of one hundred and eighty-six dollars per month per employee. Now, I'm going to do some math here.
Speaker 3: Please do.
Speaker 1: For a company of, say, ten thousand people, if forty-one percent of those people are dealing with workslop, and that forty-one percent is wasting one hundred and eighty-six dollars per month to fix the slop, that comes out to over nine million dollars of lost productivity per year for the company, if they have ten thousand employees.
Speaker 3: That's a lot of money to fix the slop. Actually, do you think it is going to stop people from integrating AI tools, or stop founders like yourself?
Speaker 1: No, definitely not. Because it's exciting, and it has tremendous promise, and it could change the world. At the same time, there is this deep irony here, right? So it may not really work right now, but if it does work and we don't invest in it, we're dead in the water.
Speaker 3: All right. So, moving away from the workplace, I want to talk about a staggering figure I read in an article from Business Insider this week, which is this: five hours of daily screen time equals fifteen years of life by the age of seventy.
Speaker 1: Man, that's depressing. I think I'm closer to ten hours.
Speaker 3: You think you're on ten hours of screens?
Speaker 1: Just phone and computer. Phone plus computer.
Speaker 3: Is that your life, or is that another life that could be lived?
Speaker 1: Definitely the latter.
Speaker 3: It's crazy. Well, so this was in a Business Insider article about a group of people who are kind of reclaiming the term Luddite, who gathered at the High Line in New York City to protest the use of screens.
Speaker 1: Essentially, yeah. You mentioned reclaiming the term Luddite. Luddite is used as a smear, or has been used as a smear, to describe people who have, like, an aversion to technology. But I think the word has been rehistoricized this year, and people have done work on the origin story of the Luddites, who were brave, forward-looking protesters who, at the beginning of the Industrial Revolution, were willing to sacrifice their lives to protest against the mechanization of society and how it was making people jobless, and in some cases go hungry, and in some cases be killed by machines. So it is interesting seeing this word come back into vogue.
Speaker 3: I think in this case, for these people who were gathering, I don't know how much it had to do with the state of work as much as it had to do with how much time we are wasting on our screens and what the human cost of wasting that time is. Business Insider sent someone to the rally, and basically she described what she saw there. So in the article she actually said a number of people were dressed up, and lots of them wore colorful hats in the shape of cones, meant to symbolize the down-to-earth, humble garden gnome.
Speaker 1: Interesting. Lots of critique about the tech industry. Do you think that dressing as a gnome will make the lords of the universe quake in their boots?
Speaker 3: I don't think quake in their boots, necessarily. I think there's a sort of tongue-in-cheek aspect to this rally. The hats were one example of that.
Speaker 3: There's also the name of the event, which I loved: Scathing Hatred of Information Technology and the Passionate Hemorrhaging of Our Neoliberal Experience, which, as you may guess, is an acronym for SHIT PHONE.
Speaker 1: I don't think I've used the word neoliberal since I was in college. For that matter, I've never used it. But it's the core of that message.
Speaker 3: Yeah, you know, according to the reporter at Business Insider, the goal is to actually advocate for healthier relationships with technology and, quote, in the article, "to take a conscious step back from the social media apps," which I agree with and need to implement in my own life. I was actually telling you this before we started the show, that there was this very serendipitous moment where a woman came into the coffee shop where I was working, and she said, can you please charge this phone? And I look at it and it's a flip phone, and so I'm thinking to myself, here's an opportunity to have a conversation with her. And I said, you know, when did you get this flip phone? And she said, I got it a few months ago because I was noticing that my memory was getting really bad. Like, she just had a hard time remembering things that she was reading. And she said that since she started using this flip phone, her memory has come back within a matter of weeks. I don't know exactly what that looks like for her, but it was just this amazing moment where I'm reading this piece about, you know, techno-Ludditism, and basically this woman comes in doing...
Speaker 1: Exactly that. Living that life. Yeah. I mean, how old was this woman?
Speaker 3: She must have been our age.
Speaker 1: Okay, yeah, but the protesters, they're Gen Z. They're Gen Z. And I think one of the trends which is emerging is that the most anti-tech generation is the generation who grew up most on tech, which is kind of interesting.
Speaker 3: I think that's because they never grew up without it, and so I think there is a little bit of nostalgia for a time that they never had. It's like me using vinyl, you know what I mean? Like, I think it's cool because I didn't have to use it. But Oz, this small but mighty rally wasn't the only tech rejection I heard about over the weekend. Have you actually been in the West Fourth subway stop recently?
Speaker 1: I have.
Speaker 3: Have you seen the ads for friend dot com?
Speaker 1: No. Well, I have seen them, not in West Fourth Street. I saw them in other subway stations. They're these big white posters with a picture of a mysterious device and a dictionary definition of the word friend.
Speaker 3: Those are actually ads for a wearable device called Friend, and the ad copy says things like "I'll never bail on dinner plans," you know, stuff like that. I originally found this story that I'm telling you right now on Tumblr. I actually didn't see it in the subway, and there was a thread on Tumblr full of pictures of these ads that are completely covered in graffiti. So over these Friend ads, people would write things like "surveillance capitalism," or "get real friends," or "stop profiting off of loneliness," or "friends are flesh and blood."
Speaker 1: In other words, people objected very much to this company claiming ownership of the word friend. But that's exactly the product, though, right? It's like a wearable AI?
Speaker 3: This is a wearable device that looks like a miniature smart speaker that you wear around your neck. It listens to you all day like a friend. It collects your data, and then you can talk with it via text about what's going on, my friend. Yeah, text. I mean, a lot of my relationships are like this. And according to Adweek, friend dot com spent over one million dollars in subway advertising.
Speaker 3: The CEO, Avi Schiffmann, designed the creative himself and boasted on X that, quote, "this is the largest ad campaign in New York subway advertising history."
Speaker 1: I read that Avi Schiffmann raised five million dollars for Friend a year ago and spent, as you mentioned, one million on this campaign. And he said to Adweek, "I don't have much money left. I spent it all on this. It's a huge gamble." This makes me laugh because I have a pretty strong feeling, and you know, I'm waiting to be corrected, that Friend will not take off as a product.
Speaker 3: I think what's really funny about it is that sometimes the marketing is the thing itself. And I think it's really funny that people are engaging so much in the defacing of these ads. And I think what's interesting is that people know exactly what it is when they look at the ad, and they also know that they don't want to engage in using it. And so I think the proof is in the pudding in terms of what people are writing on these ads. And again, it ties back to this growing trend to pull away from just the onslaught of digital interfaces that we have in our lives. It's just, I think Gen Z is really aware of how much products and devices are co-opting their lives, and they're mad as hell and they don't want to take it anymore.
Speaker 1: I think the big question for me really is: will this anger and frustration that's coalescing around the role of technology in our lives become a political force? Will it be organized enough and durable enough to actually drive any change in the way technology is used, the way it's regulated? I mean, it's, you know, really, really hard to make society-wide changes. Of course, it did happen with cigarettes and seatbelts and sugar, and so, you know, we'll see. After the break, some more headlines: a top video game company is going private in a fifty-billion-dollar deal.
Speaker 1: Italy becomes the first country in the EU to pass its own AI regulations, and an AI actress has agents lining up to sign her. Then, on Chat and Me, we learn Spanish. Stay with us.
Speaker 1: So, Cara, you asked me at the beginning of the episode whether I was wearing shoes.
Speaker 3: Yeah, I did.
Speaker 1: I'm going to ask you now: do you play video games? Hell yeah, you do.
Speaker 3: I know. I knew you didn't, and I was wondering if you were going to ask me. What do I play? I'm a big Mario Kart 8 Switch user.
Speaker 1: Oh, interesting.
Speaker 3: I fall asleep with it on me all the time, and it hits me in the face and then I wake up.
Speaker 1: So that's a resounding yes. Have you heard the story about Electronic Arts this week?
Speaker 3: EA Sports, it's in the game. If I say that to my sister, she'd be like, yep, I know exactly what you're talking about.
Speaker 1: So it's a massive video game company, with titles like FIFA, Madden, and The Sims, and it has agreed to go private for the price of fifty-five billion dollars. According to The New York Times, if the deal goes through, it will be the largest buyout of a publicly traded company ever, not adjusting for inflation.
Speaker 3: So what happens to the shareholders?
Speaker 1: They get money: two hundred and ten dollars per share, which is a twenty-five percent premium to the company's share price as a public company.
Speaker 3: So who's paying for this? This is just crazy money.
Speaker 1: A group of investors led by Saudi Arabia's Public Investment Fund, Affinity Partners, which is Jared Kushner's private equity firm, and finally, another firm, Silver Lake, which is also rumored to be part of the TikTok deal.
Speaker 3: So that is the other huge deal that's set to be decided in the coming months.
Speaker 1: I mean, to me, these stories go side by side.
Speaker 1: Obviously, Jared Kushner being involved in the EA deal and, you know, benefiting from his Trump connections. The TikTok deal has, you know, Larry and David Ellison front and center in a deal being brokered by the Trump administration and the Chinese government. It's interesting to me, you know, how US technology capitalism is kind of more and more integrating with the state, or at the very least the Trump friends-and-family circle.
Speaker 3: Yes, Trump F-and-F, as we like to call it. So what's going to happen next in this Electronic Arts buyout?
Speaker 1: Well, the deal has to be approved by the government's Committee on Foreign Investment, which reviews foreign buyouts for security concerns. There are concerns about data, of course. I mean, a lot of people play video games in the US, which means there's a lot of data that could be collected by a foreign government.
Speaker 3: So, do you look at App Store charts?
Speaker 1: I look at podcast charts, because...
Speaker 3: You want to see how your podcasts are doing, because you're a founder. That's why. I actually look at the App Store charts sometimes, because I think it's really interesting what's trending. Usually it's the same stuff that's trending. But last week I actually noticed that there was this new app that I'd never seen before called Neon Mobile at the top of the chart.
Speaker 1: The top of the charts?
Speaker 3: Yes, it was at the top of the charts, and then it disappeared, or it stopped being used, because of a privacy issue.
Speaker 1: Well, why was it so popular in the first place?
Speaker 3: Because people were making money.
Speaker 1: Of course. Okay, back up a couple of steps. What is Neon Mobile, and how are people making money from it?
Speaker 3: So the app basically offered to pay you money for recordings and transcripts of your side of phone calls, which was data that they would then sell to AI companies.
Speaker 1: So wait, I basically, as a Neon user, would allow Neon to record my calls, my private calls, and then sell them on to other companies?
Speaker 3: I mean, when you say it like that...
Speaker 1: How much are people getting paid?
Speaker 3: What the website says about Neon is that they pay thirty cents per minute if you're calling another Neon user. So if you had Neon and I had Neon and we were calling each other, they would pay us thirty cents per minute. If I was calling a non-Neon app user, it's fifteen cents per minute. The max that you can make is thirty dollars a day. Last week, this free app had thousands of users and was downloaded seventy-five thousand times in one day. So selling your personal data to train an AI is one thing, but TechCrunch actually discovered something even more sinister, which is this security flaw that allowed users to access the phone numbers, call recordings, and transcripts of any other user on the app.
Speaker 1: So it wasn't just that you were consciously selling your phone calls to companies. There was also a kind of unconscious... there was actually a flaw which meant that everything you did while using Neon was exposed to everyone else on Neon. Good old journalism, and that's why, as you mentioned, it went offline.
Speaker 3: Yes, for now. So Neon actually intends to come back with a vengeance, but they have not given any indication of how long it will take to fix the privacy flaw. The reason I sent this to our producers is because I was going to sign up for it. Yeah, because I'm...
Speaker 1: A free thirty dollars a day? Not bad. That's, what, a hundred minutes a day at thirty cents a minute to hit thirty dollars, which would even outstrip your average minutes per day on the phone.
Speaker 3: But I think that the average user relates to me and sees "paid for phone calls" with no downside, and thinks, yeah, I'm going to sign up for that.
Speaker 1: Why not? Except there is a downside.
Speaker 3: There was a huge downside. I think the takeaway here is to be very cautious when you download any apps that ask for permission to record your phone calls and access your contacts.
Speaker 1: One hundred percent. The takeaway for me is slightly different, which is how, you know, this surveillance society doesn't need to be imposed; we kind of volunteer for it ourselves. And we were talking about Larry Ellison: he said last year, citizens will be on their best behavior because we're constantly recording and reporting everything that's going on.
Speaker 3: I think what is evident here is that people have gotten a little bit cynical about surveillance capitalism, and they're like, you know what...
Speaker 1: Pay me for it. It's happening anyway, pay me for it.
Speaker 3: That's exactly right.
Speaker 1: You know, elsewhere in the world, AI is actually being regulated, and Italy passed a landmark law last week that addressed a number of issues that have bubbled up on Tech Stuff over the last few months.
Speaker 3: What are some of the issues that have bubbled up?
Speaker 1: Well, per Reuters, the new regulations in Italy include restrictions on copyrighted content for AI-driven text; required parental consent for AI access for children under the age of fourteen, which, by the way, I think is a huge one; enforced transparency around AI use in the workplace, so employers will be required to inform workers when AI is being deployed; and, in the healthcare setting in particular, AI is allowed to be used to assist in diagnosing patients, but doctors will have to continue to make the final call and also to inform their patients about how AI was used in the diagnostic process.
Speaker 3: That last one actually reminds me of the conversation we had with Robert Capps about AI in the workplace and how transparency and accountability are going to be a huge part of our jobs as human-AI collaboration grows more popular.
Speaker 1: I think that's right. I mean, a lot of these are guidelines for how to work in an ethical way with AI, but there is also a section devoted to AI crime and punishment: prison terms ranging from one to five years for people who, quote, use technology to cause harm, such as generating deepfakes, and even harsher penalties for those who use AI to commit crimes like fraud and identity theft. So when I said that this law addresses a lot of things we've talked about on Tech Stuff this year, it actually really does. On the other hand, without being too cynical, I'm not sure how consequential a national law can be for an international technology.
Speaker 3: That's what I was just thinking.
Speaker 1: And yeah, this is, to be fair, built on top of an EU AI law that was passed a few months ago and is being implemented in stages. And the EU as a whole, I think, has this kind of reputation of being the regulator-in-chief, but it also doesn't have the political clout of China and the US. So it'll be interesting to know whether, as, for example, these new Luddite movements emerge and gain political clout here in the US, they will look to Europe and say, oh wow, it's interesting, they've actually done something concrete, and demand that legislators here do something similar, or whether this will be kind of shouting into the wind by a bloc of countries who don't, on the whole, actually create AI products.
Speaker 3: In other words, good for Italy, but yeah, does it matter for anyone else?
Speaker 1: Is it going to be any more relevant than the Colosseum?
Speaker 3: So the last story I want to bring you is about one of my favorite subjects, which is AI in Hollywood, which brings me to a very popular story this week about Tilly Norwood.
Speaker 1: Who is Tilly Norwood?
Speaker 3: Tilly Norwood has been all over my Instagram feed all week, because I work, as you know, in show business when I'm not doing this, and I noticed on my Instagram feed all these people making jokes like, lol, Tilly Norwood, lol, who's going to sign Tilly Norwood? And I googled Tilly Norwood last week, and it turns out that Tilly Norwood is AI.
Speaker 1: Hold on. So you became aware of Tilly Norwood before you became aware she was an AI character?
Speaker 3: That's correct.
Speaker 1: You were like, a star is born, I gotta know more.
Speaker 3: Well, like, people were posting photos of her, and I was like, is she an actress? Am I supposed to know her? Or is she a new ingenue? And then, I mean, I'm kind of exaggerating, but very quickly I googled her and I realized, oh, she's AI.
Speaker 1: Why had everyone suddenly come alive, being fascinated by this one AI-generated actress? I mean, there are so many.
Speaker 3: There's a very specific reason, actually. They're looking for an agent who's going to sign her, and so a lot of the further discourse on Instagram was people making jokes like, lol, who's Tilly going to go with?
Speaker 1: And for?
Speaker 3: I might be wrong, and this might have been something that people talked about with other sort of AI-generated characters, but this is the first time I've seen an AI actress be talked about in the way that other actresses are talked about.
Speaker 1: What have actors been saying?
Speaker 3: They're basically saying that this is bullshit. They're like, this could literally come for our jobs. I want to actually read you these two quotes from Tilly's creator, Eline van der Velden. She put out a statement on Instagram saying, "Tilly Norwood is not a replacement for a human being, but a creative work, a piece of art. I see AI not as a replacement for people, but as a new tool, a new paintbrush."
Speaker 3: I think it's also important to note that when Tilly debuted, Eline said, quote, she wants Tilly to be the next Scarlett Johansson or Natalie Portman.
Speaker 1: Do you think Scarlett Johansson or Natalie Portman identify more with being a tool or a paintbrush? What do you think? I don't think so. More as humans.
Speaker 3: I mean, I guess. I know what she's saying, in terms of actors read lines, but I think that's blurring the lines between being a person and being technology in a way that people are uncomfortable with. Still, SAG actually put out a statement, and in that statement they claim that Tilly was trained using performances from actors without permission or compensation, which is actually out of compliance with their new union contract. And Gersh, which on Entourage Ari Gold would call a second-tier agency, but I don't agree with that, I love Gersh, Gersh has great clients, and Tilly won't be one of them.
Speaker 1: Well, that's funny. I wonder if she will end up getting signed by one of the major agencies.
Speaker 3: It's the most lol thing to me, and it's so agency. It's like, here's a product or a commodity that is hot right now, go chase after it. I don't think anyone at the agencies is overthinking the fact that she's AI. I think actors are like, wait a second, the agency that represents me is looking to represent an AI? Are they condoning, you know, movies being made with AI actors in them? And I think that that's a fair question for actors to be asking. I don't think it's outrageous.
Speaker 1: Okay, Cara, it's time for Chat and Me now, the moment where we hear from our listeners about how they're using AI in their daily lives.
Speaker 1: And this week was interesting, because we got a submission and our producers weren't sure if the submission was read by AI or a real person's voice, and so we had a back-and-forth over email with the person who submitted it, who said that he's a teacher and therefore he may have a somewhat robotic voice as a result, but also that his voice may have been compressed by the voice recording software he's using. So this is kind of a meta chat. To me, the topic is not meta, but I wanted to bring up the meta framework of this, because this is kind of the dominant question of our age.
Speaker 3: Is that person real? This week we heard from Sean. His voice is real, and he is a digital communications lead at what we in the US call a technical college, and part of Sean's job is to teach both his students and his colleagues to incorporate AI responsibly.
Speaker 1: I'd be interested to get his take on the workslop story we talked about.
Speaker 3: Well, I think in this case, Sean is clearly a pilot, not a passenger. Unlike me. He's been thinking very creatively about how to harness AI for his work, and he says his school wants to use AI to reduce workload on the staff, enhance creativity, and support personalized learning.
Speaker 2: I had students take personality tests in Spanish and then used AI to turn their personality data into unique paintings. We then held an exhibition where other Spanish speakers could engage with the students, who had to introduce the pieces and answer questions.
Speaker 1: I love this sort of paradigm of human-machine, human-machine engagement and making. I wish I'd been in Sean's class.
Speaker 3: I also like that he's not just an AI booster, that he actually sees the whole picture.
Speaker 2: While we're enthusiastic about AI, we're also careful about using it responsibly. I have worked with senior leaders to issue clear AI guidance for students and staff to outline how these tools should and shouldn't be used.
Speaker 2: In practical terms, this means AI is to be used as a supportive tool, not as a way to do a teacher's or a student's work for them. By cultivating these habits now, we're hopefully preparing our students to use AI wisely in their future workplaces, because AI tools will be part of their careers. As an educator, I find this incredibly exciting. Most importantly, our students are learning with AI and not in fear of it.
Speaker 1: So, in case anyone was wondering what the opposite of workslop is, we can thank Sean, because we have our answer.
Speaker 3: Thank you, Sean, for submitting your Chat and Me.
Speaker 1: And please, listeners, we want to hear more from you. We want more Chat and Mes. Send your stories to tech stuff podcast at gmail dot com.
Speaker 3: That's it for this week for Tech Stuff. I'm Cara Price.
Speaker 1: And I'm Oz Woloshyn.
Speaker 3: This episode was produced by Eliza Dennis, Melissa Slaughter, and Tyler Hill. It was executive produced by me, Cara Price, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer is Bihe Fraser, and Jack Insley mixed this episode. Kyle Murdock wrote our theme song.
Speaker 3: Join us next Wednesday for Tech Stuff: The Story, when we will share an in-depth conversation with David Ignatius all about spycraft and how the CIA is faring in the technological age.
Speaker 1: And please do rate and review the show wherever you listen, and send us your thoughts at tech stuff podcast at gmail dot com. We love hearing from you.