Speaker 1: Now, a massive trial in the US spells a bit of trouble for the social media giants. Meta and Google have been found liable by an LA jury for making addictive products that harm young people. They've been ordered to pay six million dollars to a woman who says she became addicted to YouTube and Instagram as a child. Doctor Samantha Marsh is a senior public health researcher at the University of Auckland and is with us. Hi, Samantha. Do you think this is a big tobacco moment, like many commentators are saying it is?

Speaker 2: Well, yeah, one hundred percent. You know, they had an addictive product that was causing harm for many years, and they dodged, you know, going to trial for it, and they had their day in the nineties. And this is what I think we're seeing today. This is a very historic moment for public health.

Speaker 1: Is it an inflection point? Because is it just yet another thing that we are starting to see, which I guess absolutely reinforces in our minds, yes, these are the bad guys, they are bad actors? Or is it that this case provides a chink in their legal armor that will be used against them in future court cases?

Speaker 2: Yeah, I mean, I think this is what is so important about this: it's the first time that they've actually been taken to court and been found liable for their product. Up until now, it's all just been about, you know, what people post on their platforms, and they can't be held responsible for that. But this kind of changes everything, because what has come out of this is that these companies knew that they were designing a product to be addictive. They were doing that on purpose. They knew it was addictive, and they went ahead and did it anyway, and then they promoted it to children.
Speaker 1: But so what? I mean, here's the devil's advocate: so what? A ten million New Zealand dollar payout to this woman is nothing. These guys are squillionaires. It doesn't even touch the sides.

Speaker 2: So this was actually the second one this week. The day before, in New Mexico, Meta was ordered to pay three hundred and seventy-five million dollars in a separate lawsuit, which is quite a sizable amount of money. That was because they misled consumers about the safety of their platform, knowing it was harming kids' mental health, and concealed child exploitation on their platforms. But I think, you know, this is just the first case of many cases. It's a bellwether case, so there are thousands of cases waiting in the wings now. And what I think we really hope is that this is going to make the companies actually change their product, you know, make meaningful changes to their platforms.

Speaker 1: I mean, so if we use the big tobacco analogy, then it's like asking big tobacco to make the cigarettes less addictive. They don't do that. So is Meta actually going to do that, or will they simply adapt and hide it?

Speaker 2: Well, I think this is exactly the problem, and that's why, when people are saying, well, we should continue to let young children, or thirteen-year-olds or whatever, access these platforms and we're just going to make them safer, it's like, well, are they going to? As I always say, you can go and get cigarettes. They're addictive and they can kill you. You can still buy them, because it's a very profitable product. And so I don't think that they will make meaningful changes to their platforms, because it's not just about the algorithm.
It's not just about content or predators; it's the whole way the product that is social media is designed. So if you change it to make it not harmful to young people, it's not a profitable product anymore, which is why one of the things we need now is age restrictions.

Speaker 1: I mean, Meta's share price went up today on the back of it. So did Alphabet's, which is the parent company of Google. How do we explain that?

Speaker 2: I can't explain that. That's not my area, I'm sorry. I can't explain why.

Speaker 1: Because that would be the logical thing. I mean, no, fair enough, it's not your area. But the logical thing is the shareholders looked at that court ruling and thought, oh, that's not that bad. Maybe they're just judging it on the damages, the payout, and they've just gone, no problem, and they've actually put more down on the business.

Speaker 2: Perhaps because it wasn't, you know, for that single case. But, you know, they won the day before. And I don't know how quickly the market would respond to this, but I see it as a huge thing. And we had TikTok and Snap, which owns Snapchat; they didn't even go to trial, because, we assume, they didn't want the stuff that has come out in these trials to come out. And some of what has come out has been so damning, even to people like me who have been a very vocal critic of these companies. I've been shocked by what they've said that they've been doing.

Speaker 1: Now, Samantha, if we accept that this is the big tobacco moment, how does it go from here? What is the thing that takes big tech to where big tobacco is today?
Speaker 2: Well, as I said, I'd like to see them make meaningful changes to their products, but I think the most important thing would be a fundamental shift in how society approaches new consumer technologies, you know, thinking about AI that's coming out now, particularly those technologies that we're giving kids. The burden of proof has got to shift, and that's what I would hope would come out of this, because these companies should be required to demonstrate the safety of their products before those products reach young children, not years later in a courtroom, like has just played out.

Speaker 1: Hey, thank you, Samantha, appreciate your time and your expertise. That's Doctor Samantha Marsh, senior public health researcher at the University of Auckland. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from four pm weekdays, or follow the podcast on iHeartRadio.