1 00:00:00,320 --> 00:00:02,600 Speaker 1: Oh, g'day team. It's the bloody You Project. That's you, of 2 00:00:02,640 --> 00:00:05,040 Speaker 1: course it is. It's Patrick, it's Tiffany, it's me, it's 3 00:00:05,120 --> 00:00:09,480 Speaker 1: Jumbo in the studio. It's minus fifteen degrees in Hampton. 4 00:00:09,720 --> 00:00:11,920 Speaker 1: I was rugged up like a bloody... I was going 5 00:00:12,000 --> 00:00:14,520 Speaker 1: to say an Eskimo. You're not allowed to say that. 6 00:00:15,240 --> 00:00:18,960 Speaker 1: What do we say, Patrick? Is it Inuit person, 7 00:00:19,840 --> 00:00:20,119 Speaker 1: is it? 8 00:00:20,360 --> 00:00:22,480 Speaker 2: I guess it depends on where you come from, if 9 00:00:22,480 --> 00:00:25,640 Speaker 2: it's North America or Canada. I think it's Inuit. 10 00:00:25,480 --> 00:00:29,560 Speaker 1: Yeah, okay. And Tiffany Cook, who when she laughs, 11 00:00:29,600 --> 00:00:32,320 Speaker 1: which she will in this episode, everyone, you'll hear sound 12 00:00:32,360 --> 00:00:36,599 Speaker 1: like a familiar cartoon character. Aka... there she is. There 13 00:00:36,640 --> 00:00:38,199 Speaker 1: she is, there she is. 14 00:00:38,640 --> 00:00:39,240 Speaker 3: It's Muttley. 15 00:00:40,080 --> 00:00:40,519 Speaker 1: Muttley. 16 00:00:40,680 --> 00:00:42,600 Speaker 3: Yeah, Wacky Races. 17 00:00:42,120 --> 00:00:46,120 Speaker 1: There she is. Even Patrick can tell without being told. 18 00:00:47,120 --> 00:00:53,920 Speaker 1: Oh, like, listen to you. Oh gosh, I don't know 19 00:00:54,000 --> 00:00:57,080 Speaker 1: what makes that resonance in your throat. I'm guessing it's 20 00:00:57,200 --> 00:01:03,960 Speaker 1: phlegm. Ew. No, no. Well, I wonder how many days 21 00:01:03,960 --> 00:01:07,160 Speaker 1: of podcasts are going to be infiltrated by the bloody 22 00:01:07,280 --> 00:01:12,880 Speaker 1: Muttley factor. Anyway, we had yesterday, we've got today. Patrick, 23 00:01:12,959 --> 00:01:13,640 Speaker 1: how are you?
24 00:01:14,040 --> 00:01:14,440 Speaker 3: I'm good. 25 00:01:14,480 --> 00:01:17,480 Speaker 2: By coincidence, with the Wacky Races, my only nickname at 26 00:01:17,520 --> 00:01:19,119 Speaker 2: school was Professor Pat Pending. 27 00:01:20,280 --> 00:01:23,440 Speaker 1: That, everybody, as in Patent Pending. 28 00:01:24,080 --> 00:01:26,120 Speaker 2: Yeah, because Pat Pending was one of the characters in 29 00:01:26,200 --> 00:01:29,200 Speaker 2: the Wacky Races. He was the wacky professor, and because 30 00:01:29,600 --> 00:01:32,720 Speaker 2: I was into gadgets even as a kid, someone decided that 31 00:01:32,760 --> 00:01:34,600 Speaker 2: Pat Pending would be a good nickname for me. 32 00:01:36,400 --> 00:01:38,720 Speaker 1: Well, we might start calling you Pat from now on, 33 00:01:38,880 --> 00:01:41,520 Speaker 1: although you are Pat, so maybe we'll just call you 34 00:01:41,600 --> 00:01:45,319 Speaker 1: PP, Pat Pending. How have you been? It's been a 35 00:01:45,319 --> 00:01:48,680 Speaker 1: couple of weeks. Well, other than bullying Craig Harper savagely 36 00:01:48,760 --> 00:01:54,600 Speaker 1: before the recording started, both of you. Honestly... yes, now 37 00:01:54,640 --> 00:01:57,640 Speaker 1: you're lying. Can I say, everyone, it was about 38 00:01:57,680 --> 00:02:01,080 Speaker 1: three minutes of just, let's attack Harps and laugh at 39 00:02:01,120 --> 00:02:05,200 Speaker 1: our own bullying, by these two. Poor little defenseless Harps, 40 00:02:05,320 --> 00:02:09,680 Speaker 1: metaphorically in the corner, cowering, and these two standing over him, 41 00:02:09,840 --> 00:02:14,120 Speaker 1: pointing their fat, little fucking fingers at me and laughing 42 00:02:14,160 --> 00:02:17,600 Speaker 1: at each other with the barbs that they were each throwing. 43 00:02:18,000 --> 00:02:20,960 Speaker 2: We were laughing with each other at you, not laughing 44 00:02:21,000 --> 00:02:21,560 Speaker 2: at each other.
45 00:02:21,560 --> 00:02:23,639 Speaker 3: We would never laugh at each other. Tiff is so 46 00:02:23,960 --> 00:02:26,240 Speaker 3: nice to me. She would never laugh at me. 47 00:02:26,400 --> 00:02:32,000 Speaker 1: Great. All right, I'm bringing in HR, which is me, straight 48 00:02:32,040 --> 00:02:32,480 Speaker 1: after this. 49 00:02:32,880 --> 00:02:35,720 Speaker 2: I think you'll find it's People and Culture, Craig. People... 50 00:02:37,000 --> 00:02:40,520 Speaker 2: not here, Patrick. Fuck People and Culture. 51 00:02:40,720 --> 00:02:43,920 Speaker 3: The only culture we have, you freaks. 52 00:02:44,840 --> 00:02:51,440 Speaker 1: All we have is people. There's no culture. Now Tiff's 53 00:02:51,440 --> 00:02:53,919 Speaker 1: turned away, look at her. She's fucking just... oh, 54 00:02:54,040 --> 00:02:56,679 Speaker 1: she's thrown her spleen out onto the desk. She's put 55 00:02:56,680 --> 00:03:01,640 Speaker 1: her bloody mute on. Why did you put your 56 00:03:01,680 --> 00:03:04,880 Speaker 1: mute on? Did you nearly dislocate one of your kidneys again? 57 00:03:05,840 --> 00:03:08,320 Speaker 1: Do I have to put that back in with a spatula? Fuck, 58 00:03:08,360 --> 00:03:09,359 Speaker 1: what's going on with you? 59 00:03:11,880 --> 00:03:14,400 Speaker 2: What does it say about my life? You asked me earlier 60 00:03:14,440 --> 00:03:16,960 Speaker 2: how I've been. What does it say about my 61 00:03:17,000 --> 00:03:18,960 Speaker 2: life that one of the most exciting things that happened 62 00:03:19,000 --> 00:03:22,480 Speaker 2: in the last fortnight was Fritz and I walking along 63 00:03:22,560 --> 00:03:24,840 Speaker 2: a new footpath that was made in our town? 64 00:03:25,480 --> 00:03:28,320 Speaker 1: Wow, it's bland. I'd say that probably made the front 65 00:03:28,360 --> 00:03:30,000 Speaker 1: page of the Ballan Gazette. 66 00:03:30,080 --> 00:03:31,360 Speaker 3: It was pretty, pretty exciting.
67 00:03:31,480 --> 00:03:34,680 Speaker 2: Is that... I mean, in all honesty, what does 68 00:03:34,720 --> 00:03:36,600 Speaker 2: that say about me, that I was excited to walk 69 00:03:36,640 --> 00:03:38,520 Speaker 2: on a new footpath? Mind you, I've been here for 70 00:03:38,560 --> 00:03:41,480 Speaker 2: eighteen years and we haven't had any footpaths in that time. 71 00:03:41,640 --> 00:03:44,360 Speaker 2: Like, you know, most streets don't have footpaths. My street, 72 00:03:44,480 --> 00:03:47,000 Speaker 2: where I live, doesn't have a footpath, so having a 73 00:03:47,000 --> 00:03:49,920 Speaker 2: new footpath going in was quite big, for an entire block 74 00:03:50,000 --> 00:03:50,440 Speaker 2: as well. 75 00:03:51,240 --> 00:03:53,520 Speaker 1: Well, it's good that you're telling everyone in Australia, 76 00:03:53,560 --> 00:03:57,240 Speaker 1: because, fuck, we're on the edge of our seats. But no, 77 00:03:57,400 --> 00:03:59,040 Speaker 1: I think it's good. You know why I think it's 78 00:03:59,080 --> 00:04:02,360 Speaker 1: good? It means it doesn't take much to make you happy, 79 00:04:02,360 --> 00:04:05,160 Speaker 1: and that's a nice thing. Yeah, yeah, because for a lot 80 00:04:05,200 --> 00:04:07,840 Speaker 1: of people it takes a lot. Even when their life's great, 81 00:04:07,880 --> 00:04:11,120 Speaker 1: they're still bitching about shit. Let me 82 00:04:11,160 --> 00:04:13,440 Speaker 1: look past all the good stuff to find something to 83 00:04:13,480 --> 00:04:16,720 Speaker 1: bitch about. You know, Tiff's dying over there of fucking 84 00:04:16,839 --> 00:04:21,000 Speaker 1: some lung disease, but she's still smiling. She's 85 00:04:21,040 --> 00:04:25,960 Speaker 1: still showing up, Muttley. She's still muttling. Speaking of Muttley, 86 00:04:26,000 --> 00:04:29,320 Speaker 1: there's another dog on the other side.
Fritz, Fritz the wonder dog, 87 00:04:29,920 --> 00:04:32,400 Speaker 1: who hasn't spoken a word, but somehow is a co 88 00:04:32,480 --> 00:04:37,040 Speaker 1: host. Tiff, other than dying from the black plague, how 89 00:04:37,040 --> 00:04:37,320 Speaker 1: are you? 90 00:04:37,800 --> 00:04:39,599 Speaker 4: I'm feeling good. I'm good, thank you. 91 00:04:40,480 --> 00:04:42,479 Speaker 1: Mm, are you better than yesterday? 92 00:04:43,320 --> 00:04:46,120 Speaker 4: Yes, yeah, I am actually, and the... 93 00:04:46,160 --> 00:04:49,280 Speaker 1: Day before, Patrick... sorry, the day before, Patrick, we did 94 00:04:49,279 --> 00:04:53,400 Speaker 1: a podcast, and about halfway through the podcast, Tiff hit 95 00:04:53,440 --> 00:04:57,680 Speaker 1: the wall, and you could just see the deterioration. And 96 00:04:57,760 --> 00:04:59,760 Speaker 1: every time I offered to bring her into the chat, 97 00:05:00,000 --> 00:05:04,080 Speaker 1: she's like, nah. Oh good. And then we had another 98 00:05:04,120 --> 00:05:07,719 Speaker 1: podcast recording straight after that one. She's like, I can't 99 00:05:07,800 --> 00:05:11,440 Speaker 1: do it. I'm like, okay, you go and lie down. 100 00:05:12,400 --> 00:05:16,200 Speaker 1: So it's been an interesting medical kind of time over here. 101 00:05:17,240 --> 00:05:22,720 Speaker 1: Let's talk about tech and such interesting things, Patrick. I 102 00:05:22,760 --> 00:05:27,080 Speaker 1: actually spoke about this one yesterday, but we'll cover 103 00:05:27,120 --> 00:05:30,880 Speaker 1: it again quickly, with David Gillespie, about the lady who 104 00:05:30,960 --> 00:05:35,040 Speaker 1: sued Meta and YouTube and ended up getting a payout of 105 00:05:35,160 --> 00:05:37,640 Speaker 1: six million. I thought it was three, but apparently it 106 00:05:37,600 --> 00:05:41,960 Speaker 2: was six. Three was the US conversion? No, no, 107 00:05:42,080 --> 00:05:42,599 Speaker 2: it wasn't.
108 00:05:42,800 --> 00:05:45,120 Speaker 1: I said that same thing and he corrected me, because 109 00:05:45,160 --> 00:05:48,039 Speaker 1: it was three, and three for something else, apparently. Anyway, 110 00:05:48,080 --> 00:05:50,880 Speaker 1: it was millions of dollars. It doesn't matter, the money doesn't matter, 111 00:05:51,279 --> 00:05:53,080 Speaker 1: but yeah, walk us through this. 112 00:05:54,120 --> 00:05:54,280 Speaker 3: Well. 113 00:05:54,320 --> 00:05:56,600 Speaker 2: The thing is, this has been in the courts, and 114 00:05:56,720 --> 00:06:00,680 Speaker 2: it's pretty landmark, because once you have a 115 00:06:00,720 --> 00:06:04,120 Speaker 2: test case in the US courts, something's been proven. 116 00:06:04,200 --> 00:06:06,680 Speaker 2: In this particular case, this is a woman who's twenty 117 00:06:06,800 --> 00:06:09,960 Speaker 2: years old. She had been on social media since she 118 00:06:10,040 --> 00:06:12,600 Speaker 2: was a young child, so first it was YouTube and 119 00:06:12,640 --> 00:06:16,200 Speaker 2: then she went over to the Meta stuff, so Instagram. 120 00:06:16,680 --> 00:06:20,120 Speaker 2: But she was claiming, and the jury have come out 121 00:06:20,160 --> 00:06:24,279 Speaker 2: and said, well, yes, she's correct, that she became addicted 122 00:06:24,440 --> 00:06:27,680 Speaker 2: to social media and it impacted on her mental state, 123 00:06:28,320 --> 00:06:32,640 Speaker 2: her sense of self, and that it was the algorithms 124 00:06:32,720 --> 00:06:37,320 Speaker 2: and the mechanisms behind it that are inherently addictive. 125 00:06:37,880 --> 00:06:38,960 Speaker 3: So it was just that.
126 00:06:39,240 --> 00:06:41,600 Speaker 2: And I know you have very strong opinions about this 127 00:06:41,600 --> 00:06:43,800 Speaker 2: too, Craigo, and at the end of the day, we 128 00:06:43,839 --> 00:06:46,920 Speaker 2: all make those decisions ourselves, whether to flick or turn 129 00:06:46,960 --> 00:06:48,200 Speaker 2: the phone off and put it away. 130 00:06:48,760 --> 00:06:51,520 Speaker 3: Ultimately, in this particular instance, 131 00:06:51,279 --> 00:06:54,520 Speaker 2: it may be a watershed case, because it means that 132 00:06:54,560 --> 00:06:58,120 Speaker 2: it's now opened the way for potentially hundreds and hundreds, 133 00:06:58,520 --> 00:07:02,040 Speaker 2: even thousands. Some of it may very well be legitimate. 134 00:07:02,080 --> 00:07:05,720 Speaker 2: You know, people who are susceptible to things like addictions. 135 00:07:06,480 --> 00:07:09,560 Speaker 2: Electronic addictions are definitely a thing, so you can see 136 00:07:09,640 --> 00:07:11,440 Speaker 2: where there would be a whole lot of people. But 137 00:07:11,840 --> 00:07:15,080 Speaker 2: we're talking, you know, there are potentially sixteen hundred 138 00:07:15,120 --> 00:07:18,760 Speaker 2: plaintiffs in additional cases, three hundred and fifty families, two 139 00:07:18,840 --> 00:07:22,400 Speaker 2: hundred and fifty school districts. They're all watching what's happening 140 00:07:22,440 --> 00:07:24,760 Speaker 2: here, and potentially lots more 141 00:07:24,600 --> 00:07:27,080 Speaker 3: cases of this sort. But I mean, this is a 142 00:07:27,160 --> 00:07:28,240 Speaker 3: drop in the ocean, mind 143 00:07:28,280 --> 00:07:31,760 Speaker 2: you. Whether it's six million dollars US, that's not 144 00:07:31,800 --> 00:07:32,960 Speaker 2: a lot of money to a big 145 00:07:32,840 --> 00:07:33,640 Speaker 3: company like Meta.
146 00:07:33,720 --> 00:07:36,760 Speaker 2: But what it may mean is they may have to 147 00:07:36,920 --> 00:07:40,520 Speaker 2: change the way that they use their algorithms; they may 148 00:07:40,560 --> 00:07:44,320 Speaker 2: have to change the mechanisms. So they're still defending themselves, saying, well, 149 00:07:44,360 --> 00:07:45,480 Speaker 2: we didn't do anything wrong, 150 00:07:45,560 --> 00:07:48,880 Speaker 3: it's not our platform. And the same with Google and YouTube. 151 00:07:49,640 --> 00:07:51,640 Speaker 2: So it could mean a lot of different things, in 152 00:07:51,800 --> 00:07:55,080 Speaker 2: terms of, A, more legal cases, and B, whether it 153 00:07:55,120 --> 00:07:58,200 Speaker 2: could force these big tech companies to change the way 154 00:07:58,400 --> 00:07:59,000 Speaker 2: they try to 155 00:07:58,920 --> 00:08:03,160 Speaker 1: hook people in. Yeah, yeah, no, it is fascinating, and there's 156 00:08:03,160 --> 00:08:07,680 Speaker 1: kind of an intersection of legal stuff and psychological 157 00:08:07,680 --> 00:08:12,600 Speaker 1: stuff and neuroscience stuff. And yeah, it was interesting chatting 158 00:08:12,640 --> 00:08:17,240 Speaker 1: with Gillespie, because he's a lawyer, right? So he was 159 00:08:17,520 --> 00:08:22,640 Speaker 1: across, like, the challenges to prove causation, not just correlation. 160 00:08:23,360 --> 00:08:25,760 Speaker 1: And then you go, well, if all of these 161 00:08:25,880 --> 00:08:29,040 Speaker 1: millions of people are using what she's using, but they 162 00:08:29,120 --> 00:08:32,280 Speaker 1: don't have what she has, it's very hard to go, 163 00:08:32,880 --> 00:08:35,480 Speaker 1: this caused it, when there are so many other variables.
164 00:08:35,520 --> 00:08:38,880 Speaker 1: But apparently they figured out a way. And I think 165 00:08:39,679 --> 00:08:42,840 Speaker 1: it was the only kind of 166 00:08:42,840 --> 00:08:47,559 Speaker 1: social media stuff she used. Anyway, they found 167 00:08:47,600 --> 00:08:53,280 Speaker 1: a way, they found a way, to allegedly 168 00:08:53,559 --> 00:08:57,640 Speaker 1: prove it, or to prove it. So the judge ruled, obviously, in 169 00:08:57,720 --> 00:09:01,160 Speaker 1: favor of the plaintiff, and now Meta has appealed it. 170 00:09:01,200 --> 00:09:05,600 Speaker 1: So yeah, like you said, if that goes down and 171 00:09:05,640 --> 00:09:08,960 Speaker 1: it sticks... David reckons it could take five years, in 172 00:09:09,000 --> 00:09:11,760 Speaker 1: and out of court. So that's what they want to do. 173 00:09:11,800 --> 00:09:15,440 Speaker 1: They want to manipulate the system so 174 00:09:15,520 --> 00:09:17,800 Speaker 1: they're not guilty just yet, or they don't have to 175 00:09:17,840 --> 00:09:19,960 Speaker 1: pay out. And also, like you said, it's not 176 00:09:20,000 --> 00:09:23,240 Speaker 1: about the dough, because whatever it is, three or six million, 177 00:09:23,679 --> 00:09:26,920 Speaker 1: that's four minutes of work for them. But 178 00:09:27,000 --> 00:09:32,480 Speaker 1: it's the consequences: if this sticks, then that's a 179 00:09:32,640 --> 00:09:36,720 Speaker 1: nightmare for them moving forward. But staying in the legal 180 00:09:36,760 --> 00:09:38,840 Speaker 1: space, the man in court using...
181 00:09:39,160 --> 00:09:41,240 Speaker 2: Sorry, can I just, before you go on, make this one 182 00:09:41,240 --> 00:09:43,320 Speaker 2: other little point that you may not have talked about. 183 00:09:43,400 --> 00:09:46,080 Speaker 2: I mean, obviously there are the legal ramifications, 184 00:09:46,120 --> 00:09:48,720 Speaker 2: but what I wanted to point out in this particular 185 00:09:48,800 --> 00:09:52,040 Speaker 2: case is that she was a young child when 186 00:09:52,080 --> 00:09:55,520 Speaker 2: she started using social media. And it kind of vindicates 187 00:09:55,559 --> 00:09:57,440 Speaker 2: the fact that we've now got a ban in 188 00:09:57,480 --> 00:10:01,360 Speaker 2: Australia for under-sixteens. The UK is now looking at taking 189 00:10:01,360 --> 00:10:05,400 Speaker 2: this on board, because we know that the teenage brain 190 00:10:05,600 --> 00:10:09,160 Speaker 2: can be more susceptible; there's a higher level of plasticity. 191 00:10:09,520 --> 00:10:12,319 Speaker 2: So if you are manipulating the teenage brain, or 192 00:10:12,440 --> 00:10:16,280 Speaker 2: a child's brain, at a really early age, that's 193 00:10:16,320 --> 00:10:18,480 Speaker 2: another factor that came out in this case. I think 194 00:10:18,520 --> 00:10:22,440 Speaker 2: that's really important to understand: these are young people 195 00:10:22,440 --> 00:10:25,280 Speaker 2: who don't have the ability to make those conscious decisions 196 00:10:25,280 --> 00:10:28,240 Speaker 2: that we do as adults, because we've got fully formed brains. 197 00:10:28,280 --> 00:10:29,960 Speaker 2: You know, we can look at right and wrong, we 198 00:10:30,000 --> 00:10:32,679 Speaker 2: can make decisions, but young people have a different sort 199 00:10:32,720 --> 00:10:35,440 Speaker 2: of way of thinking.
And I think that's the other 200 00:10:35,480 --> 00:10:37,679 Speaker 2: thing. I don't know how much it's been discussed, 201 00:10:37,720 --> 00:10:40,360 Speaker 2: but I think it's a very interesting way to look 202 00:10:40,400 --> 00:10:42,120 Speaker 2: at the ban that we've got in Australia, and how 203 00:10:42,120 --> 00:10:44,760 Speaker 2: other countries are now doing the same thing that we've done, 204 00:10:45,160 --> 00:10:48,360 Speaker 2: taking social media away from children under the age 205 00:10:48,360 --> 00:10:48,920 Speaker 2: of sixteen. 206 00:10:49,880 --> 00:10:53,360 Speaker 1: Yeah, two things. One, I agree with taking it 207 00:10:53,400 --> 00:10:55,720 Speaker 1: away from them, so I'm not anti this; I agree 208 00:10:55,800 --> 00:10:59,679 Speaker 1: with you. But manipulating the teenage brain, or the 209 00:10:59,760 --> 00:11:04,000 Speaker 1: child's brain, happens constantly in life: with parents, with school, 210 00:11:04,120 --> 00:11:08,800 Speaker 1: with peers, with television, with music, with influencers. I mean, 211 00:11:08,880 --> 00:11:11,320 Speaker 1: so to go, they manipulate the brain... yeah, but it's 212 00:11:11,360 --> 00:11:15,719 Speaker 1: always been manipulated. It's always being affected or impacted. But yeah, 213 00:11:15,720 --> 00:11:17,560 Speaker 1: I agree with you, and I don't 214 00:11:17,559 --> 00:11:21,200 Speaker 1: know what the answer is, but we'll see. All right, now, 215 00:11:21,240 --> 00:11:23,960 Speaker 1: tell us about the bloke who got caught using smart 216 00:11:23,960 --> 00:11:29,440 Speaker 1: glasses to get advice. This is so clever. So he's 217 00:11:29,559 --> 00:11:32,920 Speaker 1: gone to court, he's wearing smart glasses, and he's getting 218 00:11:32,960 --> 00:11:36,520 Speaker 1: advice while he's being cross-examined by the, I guess, 219 00:11:36,559 --> 00:11:39,280 Speaker 1: prosecutors. Isn't it so smart?
220 00:11:39,960 --> 00:11:40,480 Speaker 3: Is it easy? 221 00:11:40,559 --> 00:11:43,720 Speaker 2: It isn't, because he got caught. So it's an interesting case. 222 00:11:43,800 --> 00:11:45,880 Speaker 2: This was in January, and it was in the UK 223 00:11:46,040 --> 00:11:48,760 Speaker 2: High Court, and it was a Lithuanian guy who's the 224 00:11:48,840 --> 00:11:52,160 Speaker 2: CEO of a company that was battling insolvency, 225 00:11:52,200 --> 00:11:56,280 Speaker 2: so he was fighting against an insolvency ruling. But what 226 00:11:56,800 --> 00:12:00,400 Speaker 2: the judge noticed was that during questioning, the guy would pause 227 00:12:00,760 --> 00:12:03,480 Speaker 2: for a few seconds and then start to answer, 228 00:12:03,600 --> 00:12:06,839 Speaker 2: during cross-examination. So he was being prompted by 229 00:12:07,040 --> 00:12:11,160 Speaker 2: smart glasses. And then of course, once the judge realized 230 00:12:11,160 --> 00:12:13,680 Speaker 2: and told him to take off the glasses, it became 231 00:12:13,760 --> 00:12:16,640 Speaker 2: pretty obvious he had just been cheating, because he went, 232 00:12:17,120 --> 00:12:20,680 Speaker 2: I don't know, I don't know, I don't know. So 233 00:12:20,760 --> 00:12:24,400 Speaker 2: that was kind of interesting, you know. It's funny, 234 00:12:24,679 --> 00:12:25,920 Speaker 2: because it is. And 235 00:12:27,600 --> 00:12:28,199 Speaker 3: I think that... 236 00:12:28,280 --> 00:12:30,319 Speaker 2: the interesting thing is, now it's going to get 237 00:12:30,360 --> 00:12:33,280 Speaker 2: more subtle, because I saw an article recently: there are 238 00:12:33,320 --> 00:12:36,480 Speaker 2: contact lenses coming out now, smart 239 00:12:36,520 --> 00:12:38,480 Speaker 2: contacts, that can actually have 240 00:12:38,480 --> 00:12:39,800 Speaker 3: a little screen built into them.
241 00:12:39,840 --> 00:12:42,240 Speaker 2: So it's going to get harder to work out whether 242 00:12:42,320 --> 00:12:44,760 Speaker 2: or not someone's being prompted in a 243 00:12:44,760 --> 00:12:47,800 Speaker 3: case like this. So, isn't it funny? 244 00:12:48,080 --> 00:12:52,320 Speaker 1: I thought so. Was the prompt visual or auditory? 245 00:12:52,040 --> 00:12:53,880 Speaker 2: On a screen. They were on a screen. He's 246 00:12:53,920 --> 00:12:55,959 Speaker 2: got his smart glasses; it was projecting on the screen 247 00:12:56,000 --> 00:12:56,920 Speaker 2: of the smart glasses. 248 00:12:57,400 --> 00:13:00,720 Speaker 1: So was it generating like a script or something? 249 00:13:00,800 --> 00:13:03,040 Speaker 2: No... I think, no, actually no. In this particular case, it 250 00:13:03,120 --> 00:13:05,840 Speaker 2: was through his ear. It was the little speakers inside 251 00:13:05,880 --> 00:13:08,040 Speaker 2: the smart glasses. So I think there may have been 252 00:13:08,080 --> 00:13:11,240 Speaker 2: a second person listening and then prompting him, and he 253 00:13:11,280 --> 00:13:13,760 Speaker 2: was hearing it, so he was sitting back and listening 254 00:13:13,800 --> 00:13:14,840 Speaker 2: and then replying. 255 00:13:15,160 --> 00:13:17,120 Speaker 3: In this position, he should've 256 00:13:16,480 --> 00:13:23,120 Speaker 1: just worn an earpiece, way easier. Yeah, this is interesting.
257 00:13:23,200 --> 00:13:27,920 Speaker 1: Victorian business is fined for telling influencers to lie about 258 00:13:27,920 --> 00:13:30,840 Speaker 1: paid Instagram posts in a first-of-its-kind penalty. And 259 00:13:30,880 --> 00:13:35,160 Speaker 1: I think... well, to me, I mean, every 260 00:13:35,280 --> 00:13:39,480 Speaker 1: influencer that's selling this or that protein powder, or that 261 00:13:39,800 --> 00:13:42,800 Speaker 1: range of vitamins, you know, and they're like, well, this 262 00:13:42,880 --> 00:13:45,440 Speaker 1: is why I look like this. No, that's 263 00:13:45,600 --> 00:13:47,960 Speaker 1: just not true. But it's almost like we know that. 264 00:13:48,679 --> 00:13:52,240 Speaker 2: Yeah, you're right, normally, in those cases. But in this instance, 265 00:13:52,320 --> 00:13:55,120 Speaker 2: what they did was they told the influencers not to 266 00:13:55,160 --> 00:13:58,840 Speaker 2: say that they received something in return for the review. 267 00:13:58,960 --> 00:14:02,320 Speaker 3: That's where it's different. All right. Yeah, so you're quite right. 268 00:14:02,480 --> 00:14:04,760 Speaker 2: That's what influencers do. They promote a product, and it's 269 00:14:04,760 --> 00:14:07,120 Speaker 2: pretty obvious that they got it. You know, we were 270 00:14:07,120 --> 00:14:10,520 Speaker 2: sent this Tesla, and now we're saying it's a great car. 271 00:14:10,760 --> 00:14:12,760 Speaker 3: But in this instance, that's not what happened. 272 00:14:12,800 --> 00:14:17,520 Speaker 2: So the company was fined thirty-nine thousand, six hundred dollars, 273 00:14:17,600 --> 00:14:21,360 Speaker 2: a company called Thomson Consolidated, and they have an online 274 00:14:21,400 --> 00:14:26,960 Speaker 2: service called Photobook Shop. And so thirty- 275 00:14:27,000 --> 00:14:29,160 Speaker 2: nine thousand dollars, for a smaller business, is quite a 276 00:14:29,200 --> 00:14:32,000 Speaker 2: significant amount.
But it also, you know, puts the message 277 00:14:32,000 --> 00:14:35,120 Speaker 2: out there about the way that you use influencers, and 278 00:14:35,160 --> 00:14:37,920 Speaker 2: the way that products are disclosed, and whether or not 279 00:14:37,960 --> 00:14:41,280 Speaker 2: they're disclosed. They also changed someone's review as well. They 280 00:14:41,280 --> 00:14:42,640 Speaker 2: were called out for changing 281 00:14:42,360 --> 00:14:44,560 Speaker 3: a review that was a little bit critical. 282 00:14:44,760 --> 00:14:46,840 Speaker 2: I think the review said something along the lines of, 283 00:14:47,080 --> 00:14:49,840 Speaker 2: the product was a bit clunky to use, and they 284 00:14:50,040 --> 00:14:51,080 Speaker 2: edited that bit out. 285 00:14:51,840 --> 00:14:54,640 Speaker 1: Wow. Well, you won't like the next one, you or Tiff. 286 00:14:55,400 --> 00:14:58,920 Speaker 1: Android brands face a rough twenty twenty-six with more 287 00:14:59,040 --> 00:15:02,400 Speaker 1: iPhone switchers. In other words, people going to iPhone. 288 00:15:02,720 --> 00:15:03,400 Speaker 3: Yeah, that was 289 00:15:03,600 --> 00:15:06,280 Speaker 1: distressing for you, you bloody Android groupie. 290 00:15:06,440 --> 00:15:08,960 Speaker 2: Well, not really, because I've got a Pixel phone, and 291 00:15:09,000 --> 00:15:12,680 Speaker 2: Pixel was the least problematic of the Android ecosystem, so 292 00:15:12,720 --> 00:15:16,000 Speaker 2: Pixel didn't do too bad. They're looking like they're still tracking okay. 293 00:15:16,320 --> 00:15:19,720 Speaker 2: It's the Samsungs, Huaweis, and all the other phones on 294 00:15:19,760 --> 00:15:21,880 Speaker 2: the market, Xiaomi, that sort of thing. But it 295 00:15:22,000 --> 00:15:24,960 Speaker 2: is interesting that there has been a shift. Australia has 296 00:15:25,000 --> 00:15:27,200 Speaker 2: a very big uptake of Apple products.
I've got to 297 00:15:27,240 --> 00:15:30,920 Speaker 2: say, now the iPhone is very prolific in Australia. 298 00:15:31,000 --> 00:15:31,360 Speaker 3: I don't know. 299 00:15:31,400 --> 00:15:33,040 Speaker 2: I think it might be one in two, or it 300 00:15:33,080 --> 00:15:35,800 Speaker 2: might even be more. I can't remember what the stats were; 301 00:15:35,840 --> 00:15:37,600 Speaker 2: I haven't looked at them for a few years. But 302 00:15:37,720 --> 00:15:41,440 Speaker 2: certainly Australia is a great adopter of Apple iPhones. People 303 00:15:41,600 --> 00:15:45,680 Speaker 2: love iPhones in Australia. That said, that's how it's tracking 304 00:15:45,720 --> 00:15:46,120 Speaker 3: at the moment. 305 00:15:46,200 --> 00:15:48,160 Speaker 2: There are concerns because of what's going on in the 306 00:15:48,200 --> 00:15:52,320 Speaker 2: Middle East and access to chips, the chips that are 307 00:15:52,440 --> 00:15:55,320 Speaker 2: used in phones. And it looks like the production 308 00:15:55,520 --> 00:15:59,160 Speaker 2: of Android devices, and Snapdragon, the different chips in those, 309 00:15:59,240 --> 00:16:02,400 Speaker 2: and the Tensor chips, the chips that are being 310 00:16:02,480 --> 00:16:06,120 Speaker 2: used by Androids... whereas Apple has its own little ecosystem, 311 00:16:06,160 --> 00:16:09,680 Speaker 2: they have a different production line, and so it seems 312 00:16:09,240 --> 00:16:11,480 Speaker 3: like there's more confidence in the Apple products. 313 00:16:13,440 --> 00:16:16,080 Speaker 1: I thought of you. How's yours going, Tiff? Is your 314 00:16:16,120 --> 00:16:18,400 Speaker 1: phone being a bit problematic at the moment? 315 00:16:18,320 --> 00:16:21,800 Speaker 5: Yes. Right today, I would jump ship to an iPhone. 316 00:16:21,840 --> 00:16:24,800 Speaker 5: My phone keeps dropping out in phone calls all the time.
317 00:16:24,920 --> 00:16:26,640 Speaker 5: I don't know if it's a phone issue or a 318 00:16:26,680 --> 00:16:27,880 Speaker 5: network issue or what. 319 00:16:28,320 --> 00:16:29,480 Speaker 4: Maybe Patrick could help you. 320 00:16:29,640 --> 00:16:31,520 Speaker 3: What is it? What sort of phone is it? 321 00:16:33,200 --> 00:16:33,320 Speaker 4: So... 322 00:16:33,880 --> 00:16:35,480 Speaker 3: Do you restart the phone regularly? 323 00:16:36,200 --> 00:16:40,280 Speaker 4: Yeah. And I believe it started after I did an update. 324 00:16:41,920 --> 00:16:43,920 Speaker 3: Yeah, that can actually be really problematic 325 00:16:44,000 --> 00:16:48,000 Speaker 2: sometimes, when an update comes through. You can roll back updates; 326 00:16:48,040 --> 00:16:50,200 Speaker 3: there's a way to do it. You'd need to google it. 327 00:16:50,280 --> 00:16:53,160 Speaker 2: But one of the things that can solve a 328 00:16:53,160 --> 00:16:55,760 Speaker 2: lot of those problems is to go into aeroplane mode 329 00:16:55,840 --> 00:16:57,640 Speaker 2: for a few minutes, or a few seconds, 330 00:16:57,680 --> 00:17:01,800 Speaker 2: and then switch aeroplane mode off, and that basically reinitiates 331 00:17:01,840 --> 00:17:03,200 Speaker 2: the connection to the network. 332 00:17:03,360 --> 00:17:05,119 Speaker 3: That may help as well, if that's going to be an 333 00:17:05,200 --> 00:17:09,080 Speaker 3: ongoing problem. She did good, didn't she, Craigo? 334 00:17:09,760 --> 00:17:12,880 Speaker 1: Oh yeah, she's better than me, and potentially better than you. 335 00:17:13,680 --> 00:17:17,160 Speaker 1: I'm interested in this because my dad has sleep apnea, 336 00:17:17,440 --> 00:17:21,200 Speaker 1: as do a bunch of people. Of course, the 337 00:17:21,320 --> 00:17:25,280 Speaker 1: one hundred and ninety-nine dollar Vita ring sounds almost 338 00:17:25,320 --> 00:17:27,920 Speaker 1: like a drink.
A Vita ring wakes you up mid- 339 00:17:28,040 --> 00:17:30,919 Speaker 1: apnea before you ever know it happened. So what is 340 00:17:30,920 --> 00:17:33,800 Speaker 1: it, does it just vibrate on your finger or something? 341 00:17:33,920 --> 00:17:36,360 Speaker 2: Yeah. The thing to bear in mind is that one 342 00:17:36,400 --> 00:17:38,800 Speaker 2: hundred and ninety-nine is a special on Kickstarter, in 343 00:17:39,000 --> 00:17:42,000 Speaker 2: US dollars, so it's probably about two hundred 344 00:17:42,040 --> 00:17:44,359 Speaker 2: and sixty, two hundred and seventy dollars. This is one 345 00:17:44,400 --> 00:17:47,320 Speaker 2: of the first smart rings that I've considered getting for myself. 346 00:17:48,280 --> 00:17:52,040 Speaker 2: I live by myself, and Fritz never complains. Of course, 347 00:17:52,400 --> 00:17:54,040 Speaker 2: he'll go around to the other side, but no, he doesn't 348 00:17:54,040 --> 00:17:57,200 Speaker 2: normally complain about me, you know, snoring or making any 349 00:17:57,240 --> 00:17:57,960 Speaker 2: noises at night. 350 00:17:58,359 --> 00:17:58,680 Speaker 3: That's it. 351 00:17:59,440 --> 00:18:02,720 Speaker 2: This is an interesting one, because for people who do 352 00:18:02,840 --> 00:18:05,600 Speaker 2: use smart tech to monitor their sleep, say it's an 353 00:18:05,600 --> 00:18:09,520 Speaker 2: Apple Watch or whatever device, or a Samsung, the problem 354 00:18:09,560 --> 00:18:11,520 Speaker 2: is that generally you've got to charge them up every 355 00:18:11,640 --> 00:18:14,439 Speaker 2: two or three days, whereas a smart ring like this 356 00:18:14,600 --> 00:18:18,000 Speaker 2: lasts a lot longer in terms of its ability to track. 357 00:18:18,080 --> 00:18:20,600 Speaker 2: But it's also less cumbersome. It looks nice, and it's 358 00:18:20,720 --> 00:18:25,560 Speaker 2: very solid.
Like looking at this particular one, it 359 00:18:25,600 --> 00:18:30,359 Speaker 2: looks pretty substantial, you know, it's scratch proof, 360 00:18:30,359 --> 00:18:31,359 Speaker 2: all that sort of stuff. 361 00:18:31,720 --> 00:18:32,840 Speaker 3: And they have this system. 362 00:18:32,920 --> 00:18:35,920 Speaker 2: So Vita's come up with this idea of alert, advise 363 00:18:36,119 --> 00:18:39,880 Speaker 2: and act. So some of these health metrics that you 364 00:18:39,920 --> 00:18:43,240 Speaker 2: look at are just monitoring and they don't actively do anything. 365 00:18:43,400 --> 00:18:45,399 Speaker 2: So you wake up in the morning, you look at 366 00:18:45,400 --> 00:18:48,399 Speaker 2: your smart watch statistics and it says, right, you had 367 00:18:48,600 --> 00:18:50,639 Speaker 2: six and a half hours of really good sleep, but 368 00:18:50,760 --> 00:18:53,000 Speaker 2: you tossed and turned for two hours or something. You know, 369 00:18:53,000 --> 00:18:55,560 Speaker 2: I'm not entirely sure what the algorithm says. But in 370 00:18:55,600 --> 00:18:58,680 Speaker 2: this instance, it monitors in real time and it looks 371 00:18:58,800 --> 00:19:02,320 Speaker 2: at what your sleep patterns are, and when it notices you 372 00:19:02,359 --> 00:19:05,840 Speaker 2: potentially going into an apnoea, that's when the vibration 373 00:19:06,040 --> 00:19:08,880 Speaker 2: starts, enough for you to be jolted to the point 374 00:19:08,880 --> 00:19:11,040 Speaker 2: where you may roll over in bed or do something 375 00:19:11,080 --> 00:19:13,720 Speaker 2: different just because of the movement. So it may not 376 00:19:13,800 --> 00:19:16,199 Speaker 2: even wake you up, but it will cause you to 377 00:19:16,440 --> 00:19:19,159 Speaker 2: alter your sleep pattern or your movement in bed. And 378 00:19:19,160 --> 00:19:21,000 Speaker 2: this sounds really interesting.
It's got a whole lot of 379 00:19:21,000 --> 00:19:25,639 Speaker 2: different health metrics, so seventeen different health metrics are 380 00:19:25,680 --> 00:19:26,760 Speaker 2: being monitored 381 00:19:26,320 --> 00:19:27,040 Speaker 3: by the ring. 382 00:19:27,440 --> 00:19:29,159 Speaker 2: But I can also see this would be great for 383 00:19:29,280 --> 00:19:33,160 Speaker 2: older people potentially, because they're probably more used to wearing 384 00:19:33,160 --> 00:19:36,160 Speaker 2: a ring than a smart watch, which can become cumbersome 385 00:19:36,200 --> 00:19:38,480 Speaker 2: for them and too technical, whereas this is a lot 386 00:19:38,520 --> 00:19:41,760 Speaker 2: more subtle. And that's what I like about the idea 387 00:19:41,760 --> 00:19:43,680 Speaker 2: of wearing something like this. I tried wearing my watch 388 00:19:43,720 --> 00:19:46,000 Speaker 2: to bed a few times. I struggle wearing a watch 389 00:19:46,040 --> 00:19:48,000 Speaker 2: to bed. What about you, guys? Do you do 390 00:19:48,040 --> 00:19:48,840 Speaker 2: health tracking at all? 391 00:19:49,160 --> 00:19:51,120 Speaker 3: Tiff bed. 392 00:19:53,040 --> 00:19:55,960 Speaker 1: I can't wear watches, old fashioned watches, because they stop, 393 00:19:56,000 --> 00:19:59,440 Speaker 1: you know, the chronograph watches. If I wear a watch, 394 00:19:59,480 --> 00:20:01,880 Speaker 1: the watch stops, like, within about an hour. 395 00:20:02,520 --> 00:20:07,000 Speaker 3: What? Yep, yeah, electrically charged. What the hell? 396 00:20:07,080 --> 00:20:11,040 Speaker 1: Yeah, I've had it fifty times. I get a watch 397 00:20:11,160 --> 00:20:14,800 Speaker 1: and put it on and the watch stops. Yeah. I 398 00:20:14,840 --> 00:20:17,159 Speaker 1: know that's weird and that you might not believe me. 399 00:20:17,200 --> 00:20:21,440 Speaker 1: Like a digital watch is fine, but like old school watches, yeah, 400 00:20:21,480 --> 00:20:24,720 Speaker 1: they just won't work.
That's why you've never seen, like, neither 401 00:20:24,760 --> 00:20:27,119 Speaker 1: of you have seen me ever with a watch. I 402 00:20:27,760 --> 00:20:29,400 Speaker 1: like watches. I just can't wear them. 403 00:20:29,840 --> 00:20:32,760 Speaker 2: So were they battery powered or wind up? So 404 00:20:32,800 --> 00:20:35,320 Speaker 2: you've tried both the battery ones and the wind up ones? 405 00:20:36,280 --> 00:20:39,640 Speaker 1: That's a good question. I well, definitely had wind up ones. Yeah, 406 00:20:39,680 --> 00:20:42,320 Speaker 1: but they stop. Yeah, I don't know. Well, I'm sure 407 00:20:42,359 --> 00:20:44,560 Speaker 1: there's a reason for that. I was just going to say, Patrick, 408 00:20:44,600 --> 00:20:47,800 Speaker 1: I could see you and a potential mister Patrick getting 409 00:20:47,840 --> 00:20:50,320 Speaker 1: married and exchanging Vita rings at the altar. 410 00:20:52,800 --> 00:20:53,440 Speaker 3: Would that work? 411 00:20:53,480 --> 00:20:53,879 Speaker 2: I'd do that. 412 00:20:54,840 --> 00:20:57,320 Speaker 1: Yeah, like as long as he's a geek as well. 413 00:20:59,480 --> 00:21:03,520 Speaker 1: Google starts shaming apps that drain your battery. 414 00:21:03,840 --> 00:21:06,159 Speaker 2: How good is that? This is interesting. See, what 415 00:21:06,280 --> 00:21:08,680 Speaker 2: people may not realize... Well, look, for most of us, 416 00:21:08,720 --> 00:21:10,639 Speaker 2: we know when we're in a new city or somewhere 417 00:21:10,640 --> 00:21:13,679 Speaker 2: where we haven't been, and we turn on our GPS and 418 00:21:13,800 --> 00:21:16,960 Speaker 2: use our phone to walk around and get around the place. 419 00:21:17,280 --> 00:21:20,320 Speaker 3: Our batteries get drained pretty significantly. I don't know if you've 420 00:21:20,200 --> 00:21:24,760 Speaker 2: noticed that before, but certainly using GPS drains your battery.
However, 421 00:21:25,119 --> 00:21:27,359 Speaker 2: what a lot of people don't realize is that lots 422 00:21:27,359 --> 00:21:29,920 Speaker 2: of apps do lots of different things and use up 423 00:21:29,920 --> 00:21:34,439 Speaker 2: different system resources. Consequently, they actually take up more power 424 00:21:34,440 --> 00:21:35,159 Speaker 2: than other apps. 425 00:21:35,520 --> 00:21:36,000 Speaker 3: So in this 426 00:21:36,040 --> 00:21:39,320 Speaker 2: instance, what Google is now doing is it's rolling out 427 00:21:39,359 --> 00:21:43,640 Speaker 2: in the Play Store a little indicator to say how 428 00:21:43,760 --> 00:21:45,680 Speaker 2: much that app is power 429 00:21:45,520 --> 00:21:46,879 Speaker 3: hungry, which is kind of cool. 430 00:21:47,119 --> 00:21:49,240 Speaker 2: So it means that when you're loading up new apps 431 00:21:49,240 --> 00:21:51,960 Speaker 2: on your phone, you can say, okay, this may be problematic, 432 00:21:52,200 --> 00:21:53,960 Speaker 2: or you switch the app off so you know that 433 00:21:54,000 --> 00:21:56,960 Speaker 2: it's not going to cause problems if you want. You know, 434 00:21:57,560 --> 00:22:00,320 Speaker 2: I like that. A lot of phones now go into 435 00:22:00,359 --> 00:22:03,320 Speaker 2: a power save mode, so you can set... and this 436 00:22:03,400 --> 00:22:04,879 Speaker 2: is an interesting one. If you're going to spend a 437 00:22:04,880 --> 00:22:08,480 Speaker 2: whole day away from chargers and you don't want to 438 00:22:09,320 --> 00:22:11,879 Speaker 2: waste battery, what you can do is go into power 439 00:22:11,880 --> 00:22:14,560 Speaker 2: save mode. That will effectively turn off all the things 440 00:22:14,560 --> 00:22:17,520 Speaker 2: that aren't necessary, so you'll extend the battery life.
I 441 00:22:17,600 --> 00:22:20,440 Speaker 2: was at a conference recently, and a friend of mine, 442 00:22:20,760 --> 00:22:23,840 Speaker 2: his phone had dropped down to like about fifteen percent 443 00:22:23,920 --> 00:22:26,400 Speaker 2: or less than that. And so I've got a battery 444 00:22:26,480 --> 00:22:29,240 Speaker 2: share feature. So at dinner, I flipped my phone over 445 00:22:29,440 --> 00:22:32,160 Speaker 2: and my phone can share my battery power and charge 446 00:22:32,160 --> 00:22:32,760 Speaker 2: his phone. 447 00:22:33,000 --> 00:22:34,280 Speaker 3: So that was kind of cool. We did that for 448 00:22:34,280 --> 00:22:35,520 Speaker 3: a little bit. The Pixel phone does that. 449 00:22:35,560 --> 00:22:37,879 Speaker 2: I'm not sure about other phones that can do that 450 00:22:37,960 --> 00:22:40,200 Speaker 2: as well, but I can share my battery with other people, 451 00:22:40,520 --> 00:22:41,560 Speaker 2: which is kind of cool. 452 00:22:41,480 --> 00:22:46,159 Speaker 1: Can I tell you something funny? That is interesting. When I 453 00:22:46,320 --> 00:22:50,879 Speaker 1: saw this heading, Google starts shaming apps, I 454 00:22:50,880 --> 00:22:57,119 Speaker 1: thought it was apps that shamed people. That's how I 455 00:22:57,200 --> 00:23:00,920 Speaker 1: read it. Yeah, yeah, and I thought they'd created these 456 00:23:00,960 --> 00:23:06,320 Speaker 1: apps that shame people and also happen to drain your battery. 457 00:23:06,400 --> 00:23:09,280 Speaker 1: So I'm a fucking idiot, is the take home message. 458 00:23:10,040 --> 00:23:16,320 Speaker 1: Chemistry student develops clear nail polish that turns your fingernail 459 00:23:16,640 --> 00:23:18,679 Speaker 1: into a touchscreen stylus. 460 00:23:19,040 --> 00:23:19,919 Speaker 3: How good is that?
461 00:23:19,920 --> 00:23:23,479 Speaker 2: This is really interesting because I've observed... now, there's a 462 00:23:23,560 --> 00:23:26,159 Speaker 2: lady who's a receptionist at a company I do some 463 00:23:26,200 --> 00:23:29,080 Speaker 2: work with, and whenever she uses her smartphone, it does 464 00:23:29,119 --> 00:23:31,960 Speaker 2: my head in because she's got really long fingernails. When 465 00:23:31,960 --> 00:23:34,639 Speaker 2: she uses a smartphone, there's this really bizarre way that 466 00:23:34,680 --> 00:23:37,000 Speaker 2: she has to flatten her hand to use the screen, 467 00:23:37,680 --> 00:23:40,320 Speaker 2: because generally we just kind of prod and punch. 468 00:23:40,200 --> 00:23:41,640 Speaker 1: 'Cause the nail gets in the way. 469 00:23:41,800 --> 00:23:42,600 Speaker 3: Yeah, yeah, yeah. 470 00:23:42,640 --> 00:23:45,800 Speaker 2: So this student, and I love this, this 471 00:23:45,880 --> 00:23:52,000 Speaker 2: is a student at Centenary College of Louisiana and her 472 00:23:52,080 --> 00:23:55,679 Speaker 2: name is Manside de Sai, and she noticed that it 473 00:23:55,720 --> 00:23:56,399 Speaker 2: was problematic. 474 00:23:56,480 --> 00:23:59,159 Speaker 3: So she is a researcher 475 00:23:59,640 --> 00:24:02,720 Speaker 2: in the area of, I think it was, 476 00:24:02,800 --> 00:24:07,040 Speaker 2: cosmetic chemistry. So she had the smarts in that area. 477 00:24:07,359 --> 00:24:10,239 Speaker 2: So she's come up with a clear nail polish that 478 00:24:10,520 --> 00:24:14,080 Speaker 2: acts effectively as a stylus. Once it's coated on the nail, 479 00:24:14,359 --> 00:24:16,639 Speaker 2: it becomes a stylus and it makes it easier for 480 00:24:16,640 --> 00:24:17,639 Speaker 2: you to use on your phone. 481 00:24:17,760 --> 00:24:20,080 Speaker 3: I thought that's awesome. Isn't it a great idea? 482 00:24:21,160 --> 00:24:23,359 Speaker 1: Okay, so I see a couple of issues.
So the 483 00:24:23,440 --> 00:24:26,080 Speaker 1: nail keeps growing, right, and then you've got to cut 484 00:24:26,119 --> 00:24:28,159 Speaker 1: the nail or shape the nail. Do you have to 485 00:24:28,240 --> 00:24:30,800 Speaker 1: keep... you must have to keep reapplying the stuff. 486 00:24:31,280 --> 00:24:33,439 Speaker 2: Yeah, but I guess if it's just the one finger, 487 00:24:33,600 --> 00:24:36,840 Speaker 2: you know, yeah, that's fine. You're right. Yeah, for sure. 488 00:24:36,680 --> 00:24:38,440 Speaker 3: It's not going to affect the rest of it. 489 00:24:39,760 --> 00:24:42,560 Speaker 1: Yeah, but I'm just planning for my stylus fingernail. 490 00:24:44,359 --> 00:24:47,479 Speaker 3: See, do you let your fingernails grow very much? 491 00:24:47,520 --> 00:24:49,159 Speaker 2: Because I get to a point where I have to 492 00:24:49,160 --> 00:24:50,840 Speaker 2: cut them because I used to do a lot of 493 00:24:50,880 --> 00:24:53,320 Speaker 2: rock climbing and these are way too long. I'm just 494 00:24:53,359 --> 00:24:55,359 Speaker 2: noticing the talons at the end of my fingers at 495 00:24:55,400 --> 00:24:55,760 Speaker 2: the moment. 496 00:24:56,240 --> 00:24:59,119 Speaker 1: Well, that's creepy. It's funny you ask, because I 497 00:24:59,160 --> 00:25:02,000 Speaker 1: actually have an interesting answer. So, as some people know, 498 00:25:02,119 --> 00:25:04,480 Speaker 1: I used to play guitar a lot for a long time, 499 00:25:04,520 --> 00:25:07,240 Speaker 1: but I have not picked up a guitar in ten years. 500 00:25:07,880 --> 00:25:11,600 Speaker 1: And so the other day I took my guitars, I've 501 00:25:11,640 --> 00:25:14,000 Speaker 1: got a few, up to the old guitar mechanic and went, 502 00:25:14,080 --> 00:25:16,040 Speaker 1: can you make these all razzle dazzle, put some 503 00:25:16,119 --> 00:25:19,280 Speaker 1: new strings on them? Blah blah blah.
So a lot 504 00:25:19,280 --> 00:25:22,399 Speaker 1: of guitarists, like people who play a lot, you'll notice 505 00:25:23,080 --> 00:25:25,879 Speaker 1: if they're a normal right handed guitar player, their left 506 00:25:25,960 --> 00:25:29,040 Speaker 1: hand will have no fingernails. Their right hand will have 507 00:25:29,119 --> 00:25:33,359 Speaker 1: long fingernails, so they can pluck, right. And so 508 00:25:33,600 --> 00:25:36,320 Speaker 1: right now my right nails are... my right hand 509 00:25:36,400 --> 00:25:38,840 Speaker 1: has got long nails, my left hand has got short nails. 510 00:25:39,280 --> 00:25:41,960 Speaker 1: So that's a weird little bit of trivia that people 511 00:25:41,960 --> 00:25:44,800 Speaker 1: don't need to know. But generally mine are short. Tiff, 512 00:25:44,880 --> 00:25:49,440 Speaker 1: yours are short? Short and functional? 513 00:25:49,960 --> 00:25:52,040 Speaker 4: Yes, because I'm learning guitar, Harps. 514 00:25:52,480 --> 00:25:55,080 Speaker 1: Yeah, I know, I know you are. Keep your nails down, 515 00:25:55,680 --> 00:25:58,520 Speaker 1: the female Tommy Emmanuel. You can maybe let the right 516 00:25:58,800 --> 00:26:03,040 Speaker 1: side grow. Patrick, tell me about the ideal amount of 517 00:26:03,119 --> 00:26:08,800 Speaker 1: coffee, allegedly. It seems almost a counterintuitive statement, 518 00:26:09,280 --> 00:26:12,240 Speaker 1: the right amount of coffee to lower stress. 519 00:26:12,480 --> 00:26:16,320 Speaker 2: Yeah, isn't that interesting? This is not just an arbitrary study. 520 00:26:16,359 --> 00:26:18,600 Speaker 2: So the researchers are in China, and I find 521 00:26:18,680 --> 00:26:22,119 Speaker 2: this interesting because it actually says China. The 522 00:26:22,200 --> 00:26:24,359 Speaker 2: most I have ever paid for a cup of coffee was in China, 523 00:26:24,480 --> 00:26:28,600 Speaker 2: and it was crap, because the Chinese generally don't drink that much coffee.
524 00:26:28,800 --> 00:26:30,879 Speaker 1: And they're not renowned for it. 525 00:26:30,960 --> 00:26:35,639 Speaker 2: No. I paid fourteen dollars for a coffee. Like, that's outrageous, 526 00:26:35,680 --> 00:26:35,960 Speaker 2: isn't it? 527 00:26:36,560 --> 00:26:37,399 Speaker 3: And it was terrible. 528 00:26:37,440 --> 00:26:39,359 Speaker 2: It was. Yeah, anyway, I should have gone back and 529 00:26:39,359 --> 00:26:43,840 Speaker 2: got more, but, you know, the language barrier. But so, what 530 00:26:43,960 --> 00:26:48,480 Speaker 2: these researchers did at Fudan University in China, they took 531 00:26:48,680 --> 00:26:53,120 Speaker 2: some findings, and it was findings from over four hundred 532 00:26:53,160 --> 00:26:58,400 Speaker 2: and sixty thousand individuals over thirteen point four years. And 533 00:26:58,760 --> 00:27:02,639 Speaker 2: what they did was they looked at the mental health 534 00:27:02,680 --> 00:27:05,560 Speaker 2: at the start of the study, and then over that 535 00:27:05,720 --> 00:27:08,560 Speaker 2: span of time they charted how much coffee people drank. It was just 536 00:27:08,600 --> 00:27:11,000 Speaker 2: self reporting, but they said, you know, how much coffee 537 00:27:11,000 --> 00:27:15,120 Speaker 2: do you drink? And they found that after the number crunching, 538 00:27:16,320 --> 00:27:18,720 Speaker 2: those people who drank two to three cups of coffee 539 00:27:18,760 --> 00:27:23,200 Speaker 2: a day were the least likely to develop mental health 540 00:27:23,240 --> 00:27:26,120 Speaker 2: problems compared to those that didn't drink coffee at all 541 00:27:26,440 --> 00:27:29,399 Speaker 2: or who drank more than three cups. So that sweet 542 00:27:29,440 --> 00:27:32,040 Speaker 2: spot is two to three cups, is what they found out. 543 00:27:32,160 --> 00:27:34,359 Speaker 2: Now, again, they were looking at other people's research 544 00:27:34,359 --> 00:27:36,399 Speaker 2: and they collated it all.
But it seems like, you know, 545 00:27:36,440 --> 00:27:39,119 Speaker 2: nearly half a million people over such a long amount 546 00:27:39,160 --> 00:27:43,120 Speaker 2: of time, and those are the conclusions that they drew from that, 547 00:27:43,400 --> 00:27:45,639 Speaker 2: you know, that number crunching of that data. 548 00:27:46,240 --> 00:27:48,560 Speaker 1: That's interesting. One thing I would want to know is 549 00:27:48,560 --> 00:27:51,600 Speaker 1: how many milligrams of caffeine per cup, because that can 550 00:27:51,680 --> 00:27:54,359 Speaker 1: vary from, like, forty to 551 00:27:54,440 --> 00:27:59,320 Speaker 1: one hundred and forty. But yeah, I don't know. I 552 00:27:59,480 --> 00:28:03,480 Speaker 1: think moderate caffeine intake, for me, and this is not 553 00:28:03,560 --> 00:28:08,120 Speaker 1: medical advice everyone, but for me, is definitely better. Like it's 554 00:28:08,160 --> 00:28:10,720 Speaker 1: a kind of a good stimulant, not a crazy, out 555 00:28:10,720 --> 00:28:14,560 Speaker 1: of control stimulant. Definitely an improved cognitive performance for me, 556 00:28:15,119 --> 00:28:18,760 Speaker 1: attention, focus for a period of time. And also, 557 00:28:18,920 --> 00:28:21,280 Speaker 1: I just like my little morning ritual, Patrick. 558 00:28:21,240 --> 00:28:23,560 Speaker 2: Yeah, me too, mate. I mean, I was running... 559 00:28:23,920 --> 00:28:26,520 Speaker 2: I wasn't actually running late this morning, but I had 560 00:28:26,520 --> 00:28:28,159 Speaker 2: a shower, I got ready, because I've got to go 561 00:28:28,160 --> 00:28:30,280 Speaker 2: into Melbourne on the train, on the choo choo. 562 00:28:31,480 --> 00:28:33,080 Speaker 3: I can't believe I just said that. Can you cut that 563 00:28:33,119 --> 00:28:34,560 Speaker 3: out? Anyway, I love it. 564 00:28:34,640 --> 00:28:39,440 Speaker 1: The choo choo that you do. Did you do, yeah?
565 00:28:41,080 --> 00:28:42,959 Speaker 3: Did you choose a direction? 566 00:28:43,160 --> 00:28:43,360 Speaker 1: Yeah? 567 00:28:43,480 --> 00:28:43,640 Speaker 3: Nah? 568 00:28:43,800 --> 00:28:47,520 Speaker 1: Fuck that. No, let's leave that all in. Yep, 569 00:28:47,600 --> 00:28:49,600 Speaker 1: go on. So what's your coffee ritual? 570 00:28:50,240 --> 00:28:52,000 Speaker 2: Well, this morning I had to have a coffee before 571 00:28:52,040 --> 00:28:53,520 Speaker 2: I came in, and you're probably not going to... you 572 00:28:53,760 --> 00:28:55,160 Speaker 2: might be able to see it. It's a really 573 00:28:55,200 --> 00:28:56,760 Speaker 2: cute mug that I've got. Can you see that? 574 00:28:56,960 --> 00:28:57,280 Speaker 1: Yeah? 575 00:28:57,360 --> 00:29:01,720 Speaker 3: Yeah, yeah, yeah. "Loved by a schnauzer." Someone gave it to me. 576 00:29:01,920 --> 00:29:05,400 Speaker 1: You've obviously heard Dame Edna's... or watched the video... 577 00:29:06,160 --> 00:29:09,560 Speaker 1: have you ever seen Dame Edna talking about his 578 00:29:09,640 --> 00:29:17,720 Speaker 1: wife's schnauzer? Yes, viewers... viewers, you probably... if you haven't, viewers, listeners, 579 00:29:17,800 --> 00:29:21,200 Speaker 1: if you haven't seen that, just look up Dame Edna. It's on 580 00:29:21,600 --> 00:29:23,800 Speaker 1: Parkinson. 581 00:29:23,640 --> 00:29:25,640 Speaker 3: Les Patterson. It was Les Patterson. 582 00:29:26,080 --> 00:29:31,840 Speaker 1: Oh yeah, that's right, sorry, Les Patterson. Exactly right. Yeah, 583 00:29:32,080 --> 00:29:38,880 Speaker 1: very, very funny. New psychology research reveals the cognitive cost 584 00:29:39,240 --> 00:29:42,360 Speaker 1: of smartphone notifications. Ooh, this is... see.
585 00:29:42,360 --> 00:29:44,800 Speaker 2: This is an interesting one because we talk about the 586 00:29:44,840 --> 00:29:48,520 Speaker 2: impact of using your phones and being distracted by technology, 587 00:29:48,920 --> 00:29:52,160 Speaker 2: and it seems that it's not how long you spend 588 00:29:52,160 --> 00:29:55,640 Speaker 2: on the device, it's how often you're distracted by the device, 589 00:29:55,760 --> 00:29:58,760 Speaker 2: is what this particular study found. So this was a study 590 00:29:58,880 --> 00:30:02,440 Speaker 2: in Computers in Human Behavior, and what they looked at 591 00:30:02,560 --> 00:30:06,040 Speaker 2: is how your concentration gets broken by notifications. So one 592 00:30:06,040 --> 00:30:08,240 Speaker 2: of the things I realized many, many, many years ago 593 00:30:08,600 --> 00:30:12,040 Speaker 2: was that every time I was using my computer and 594 00:30:12,160 --> 00:30:15,560 Speaker 2: Outlook was checking for emails, I'd get a ping, and 595 00:30:15,840 --> 00:30:18,280 Speaker 2: that would be distracting, because the first thing that you 596 00:30:18,320 --> 00:30:20,120 Speaker 2: do when you hear the ping is you rush to 597 00:30:20,120 --> 00:30:22,480 Speaker 2: your inbox to see... it might be a new email. 598 00:30:22,480 --> 00:30:25,160 Speaker 2: It's that kind of reward, you know. The ping is 599 00:30:25,280 --> 00:30:27,400 Speaker 2: the notification. The reward is to see if there's a 600 00:30:27,440 --> 00:30:32,320 Speaker 2: new email. So I turned off Outlook's ability to 601 00:30:32,440 --> 00:30:35,680 Speaker 2: check for emails, so that only I choose when I 602 00:30:35,720 --> 00:30:38,360 Speaker 2: want to check for emails. So I kind of take 603 00:30:38,400 --> 00:30:40,880 Speaker 2: it away from there and I'm not distracted. But smartphones 604 00:30:40,920 --> 00:30:43,600 Speaker 2: do this all the time when you look at your notifications.
605 00:30:43,920 --> 00:30:46,640 Speaker 2: You know, I always turn notifications off when I'm teaching 606 00:30:46,760 --> 00:30:48,440 Speaker 2: or I'm going to be in a meeting. One of 607 00:30:48,520 --> 00:30:51,200 Speaker 2: the loveliest features, my favorite feature of my phone, my 608 00:30:51,240 --> 00:30:53,800 Speaker 2: Pixel, is I like that when you turn it upside down, 609 00:30:54,160 --> 00:30:56,320 Speaker 2: it goes into do not disturb mode. And I find 610 00:30:56,360 --> 00:30:58,200 Speaker 2: that when I go into a meeting or I'm talking 611 00:30:58,280 --> 00:31:01,200 Speaker 2: to someone, putting your phone face down is also a 612 00:31:01,360 --> 00:31:04,200 Speaker 2: very visual cue that I want to talk to you 613 00:31:04,400 --> 00:31:06,640 Speaker 2: and not be distracted by my phone. I think that 614 00:31:06,760 --> 00:31:10,480 Speaker 2: sends a clear message when you're with somebody, because there's 615 00:31:10,480 --> 00:31:13,240 Speaker 2: nothing worse than having a conversation and the person picks 616 00:31:13,320 --> 00:31:15,640 Speaker 2: up the phone and you think, are you talking to 617 00:31:15,680 --> 00:31:17,280 Speaker 2: me or are you looking at your phone? And I 618 00:31:17,360 --> 00:31:20,320 Speaker 2: know people who are like that, mid conversation will start 619 00:31:20,360 --> 00:31:23,120 Speaker 2: looking at their phone. So it makes a lot of 620 00:31:23,160 --> 00:31:26,360 Speaker 2: sense when it comes to being distracted. You think about 621 00:31:26,360 --> 00:31:28,680 Speaker 2: all the different ways you're distracted during the day. Tiff, 622 00:31:29,000 --> 00:31:32,320 Speaker 2: do you have notifications on? What pings on your phone? 623 00:31:32,360 --> 00:31:33,960 Speaker 2: You know, what's your computer doing? 624 00:31:35,360 --> 00:31:38,960 Speaker 1: It's not a fair question because she's got ADHD. Yeah.
625 00:31:39,080 --> 00:31:43,040 Speaker 5: I turned all my notifications off and very few of 626 00:31:43,080 --> 00:31:44,240 Speaker 5: them will show the 627 00:31:44,160 --> 00:31:46,280 Speaker 4: red dots, and I hide things. 628 00:31:46,360 --> 00:31:49,520 Speaker 5: But I'm still stupidly addicted. I still pick it up, 629 00:31:49,640 --> 00:31:51,840 Speaker 5: pick it up all the time to check. I'll pick it 630 00:31:51,960 --> 00:31:54,960 Speaker 5: up for one thing, and I open four other apps first, 631 00:31:55,000 --> 00:31:56,800 Speaker 5: and then I forget what I picked it up to do. 632 00:31:57,440 --> 00:31:59,600 Speaker 3: There was a landmark court case recently, Tiff, that you 633 00:31:59,640 --> 00:32:01,160 Speaker 3: may be leaving your day job. 634 00:32:03,280 --> 00:32:04,120 Speaker 4: We're thinking about it. 635 00:32:04,480 --> 00:32:08,960 Speaker 2: You can retire. Six million US. Holy crap, that's worth 636 00:32:08,960 --> 00:32:09,800 Speaker 2: the case, isn't it? 637 00:32:11,480 --> 00:32:13,920 Speaker 4: No mate, I'm on big bucks here. That's not 638 00:32:14,000 --> 00:32:14,520 Speaker 4: worth it for me. 639 00:32:14,720 --> 00:32:18,080 Speaker 2: We do, in fact... Craig, didn't he double what I'd 640 00:32:18,080 --> 00:32:19,360 Speaker 2: get for the job this year? 641 00:32:22,880 --> 00:32:27,120 Speaker 1: You two... okay, now you go. Yeah, fucking idiots. That's 642 00:32:27,160 --> 00:32:29,280 Speaker 1: what I had before the start of the show, everyone. 643 00:32:30,080 --> 00:32:32,880 Speaker 1: And they expect me to pay them. Get good and we 644 00:32:32,960 --> 00:32:37,800 Speaker 1: might talk about it. AI software for smart glasses wins 645 00:32:37,840 --> 00:32:41,960 Speaker 1: a million dollar, or a million pound, prize for technology 646 00:32:42,080 --> 00:32:44,840 Speaker 1: to help people with dementia.
I might have to get 647 00:32:44,840 --> 00:32:47,080 Speaker 1: some for some people that I know quite closely. 648 00:32:47,760 --> 00:32:50,840 Speaker 2: This is interesting because, you know, there's lots of talk 649 00:32:50,920 --> 00:32:54,040 Speaker 2: around smart glasses and the use and applications, and there's 650 00:32:54,120 --> 00:32:57,960 Speaker 2: the cynics, in one sense, that call them glassholes because 651 00:32:58,000 --> 00:33:00,600 Speaker 2: they can film things when you're not aware. But 652 00:33:00,680 --> 00:33:03,040 Speaker 2: this is actually a really interesting one, and it's kind 653 00:33:03,080 --> 00:33:05,880 Speaker 2: of close to my heart because Mum had dementia. 654 00:33:06,320 --> 00:33:09,640 Speaker 2: And what they're talking about is embedding AI software into 655 00:33:09,680 --> 00:33:13,920 Speaker 2: smart glasses so that people wearing them can be prompted 656 00:33:14,440 --> 00:33:18,320 Speaker 2: with verbal cues, and that's really good. So think 657 00:33:18,360 --> 00:33:20,040 Speaker 2: about... maybe, you know, one of the things that happened 658 00:33:20,040 --> 00:33:22,920 Speaker 2: with Mum. Dad still worked and Mum was at home, 659 00:33:23,480 --> 00:33:25,680 Speaker 2: and there were a few times... there was 660 00:33:25,720 --> 00:33:27,840 Speaker 2: one instance where Mum turned the burner on on the 661 00:33:27,880 --> 00:33:30,920 Speaker 2: stove and put the plastic kettle, the electric kettle, on it. 662 00:33:31,480 --> 00:33:36,360 Speaker 2: Oh yeah, and that's the reality of how dementia can work. 663 00:33:36,400 --> 00:33:40,000 Speaker 2: And you know, keeping people at home safe is really important.
664 00:33:40,320 --> 00:33:43,360 Speaker 2: So potentially the smart glasses could be monitoring what's happening 665 00:33:43,400 --> 00:33:45,400 Speaker 2: on a day to day basis, and if you went 666 00:33:45,440 --> 00:33:48,080 Speaker 2: to... you know, if you could imagine "then turn the 667 00:33:48,120 --> 00:33:52,360 Speaker 2: gas off", if those prompts were able to be 668 00:33:52,440 --> 00:33:55,120 Speaker 2: given to the person with dementia, it could make their 669 00:33:55,160 --> 00:33:58,320 Speaker 2: life so much safer. So there's no surprise 670 00:33:58,400 --> 00:34:00,960 Speaker 2: that they've won a one million pound prize 671 00:34:00,720 --> 00:34:02,320 Speaker 3: to further develop this. 672 00:34:02,440 --> 00:34:06,600 Speaker 2: So it's called CrossSense, and basically you wear 673 00:34:06,640 --> 00:34:10,040 Speaker 2: it and it's got a chatty assistant, it's called Wispy, and yeah, 674 00:34:10,120 --> 00:34:13,120 Speaker 2: it's able to give you prompts and feedback during tasks, 675 00:34:13,120 --> 00:34:16,320 Speaker 2: so verbal cues, and now you can even have text 676 00:34:16,360 --> 00:34:18,839 Speaker 2: floating in front of your eyes, and you can ask 677 00:34:18,920 --> 00:34:21,879 Speaker 2: questions to it, you know, and engage in conversation. So 678 00:34:22,160 --> 00:34:25,480 Speaker 2: even the loneliness factor of somebody suffering from dementia, they 679 00:34:25,480 --> 00:34:27,280 Speaker 2: may be able to combat that as well. 680 00:34:27,440 --> 00:34:29,960 Speaker 3: I think it's awesome. This is really exciting stuff. 681 00:34:31,239 --> 00:34:34,200 Speaker 1: Yeah, I think that's brilliant too.
682 00:34:34,560 --> 00:34:38,759 Speaker 1: One of the challenges is, like, my mum, you know, 683 00:34:39,080 --> 00:34:41,359 Speaker 1: she's in that space, and she's eighty six and she's 684 00:34:41,400 --> 00:34:45,040 Speaker 1: had an iPhone... like, I buy her a new one every few 685 00:34:45,120 --> 00:34:48,360 Speaker 1: years, like for the last ten years, 686 00:34:49,200 --> 00:34:51,920 Speaker 1: and she still doesn't know how to use an iPhone. 687 00:34:52,440 --> 00:34:54,600 Speaker 1: Like, she taps it with her nail 688 00:34:54,800 --> 00:34:58,200 Speaker 1: like a fucking woodpecker. I'm like, Mum, you know, just 689 00:34:58,200 --> 00:35:02,320 Speaker 1: put your index on, you know, just use your finger, 690 00:35:02,440 --> 00:35:05,040 Speaker 1: not your nail. And it's like, if 691 00:35:05,200 --> 00:35:08,200 Speaker 1: it doesn't work, then 692 00:35:08,239 --> 00:35:12,040 Speaker 1: she taps it harder, you know? Like it's like force 693 00:35:12,360 --> 00:35:14,040 Speaker 1: is going to make it work. I don't know. Like it's... 694 00:35:13,920 --> 00:35:15,840 Speaker 3: That's as capable as you can buy now. 695 00:35:16,080 --> 00:35:18,480 Speaker 1: Yeah, let's get that. Let's get some of that for her. 696 00:35:20,320 --> 00:35:24,239 Speaker 1: I wanted to talk about this one because I... and 697 00:35:24,440 --> 00:35:28,399 Speaker 1: I wear headphones all the time when I'm out and about. 698 00:35:28,440 --> 00:35:32,640 Speaker 1: Hazardous substances found in all headphones, oh my goodness, 699 00:35:32,800 --> 00:35:35,600 Speaker 1: tested by the Tox Free Project. 700 00:35:36,200 --> 00:35:38,239 Speaker 2: So this was an article in The Guardian a few 701 00:35:38,280 --> 00:35:41,320 Speaker 2: weeks ago, and it really kind of frightened me a 702 00:35:41,360 --> 00:35:45,080 Speaker 2: little bit, because I do wear headphones all day long.
703 00:35:45,520 --> 00:35:48,319 Speaker 2: I have them in my office and they sit on, they 704 00:35:48,360 --> 00:35:50,360 Speaker 2: literally sit on me all the time. But that means 705 00:35:50,440 --> 00:35:54,359 Speaker 2: that that plastic is sitting against my skin, and so 706 00:35:54,400 --> 00:35:58,440 Speaker 2: it heats up and it cools down. And so this organization, 707 00:35:58,640 --> 00:36:02,640 Speaker 2: this Tox Free Project, is looking at what actually happens 708 00:36:02,960 --> 00:36:05,560 Speaker 2: when we come into contact with plastics. And I know 709 00:36:05,600 --> 00:36:09,560 Speaker 2: plastics are the whole new Achilles heel of the human 710 00:36:09,680 --> 00:36:12,600 Speaker 2: race at the moment, and it's frightening to think that, 711 00:36:12,800 --> 00:36:15,160 Speaker 2: you know, I mean, how long have plastics been around 712 00:36:15,200 --> 00:36:18,239 Speaker 2: for? It's a petroleum product, that's, you know, and it's 713 00:36:18,320 --> 00:36:21,319 Speaker 2: in everything. But now they're saying that even the top 714 00:36:21,400 --> 00:36:24,080 Speaker 2: level brands, this is what the report said in The Guardian, 715 00:36:24,160 --> 00:36:28,040 Speaker 2: Bose, Panasonic, Samsung, Sennheiser, were found to contain 716 00:36:28,120 --> 00:36:31,560 Speaker 2: harmful chemicals because of the way the plastics are made. 717 00:36:31,600 --> 00:36:34,840 Speaker 2: So the formula of the plastics which they're actually made of, 718 00:36:34,880 --> 00:36:38,439 Speaker 2: and there's nothing regulating what sort of plastic is being 719 00:36:38,560 --> 00:36:42,360 Speaker 2: used in manufacturing. So there is something they call, 720 00:36:42,480 --> 00:36:44,640 Speaker 2: I think they call it medical 721 00:36:44,680 --> 00:36:48,120 Speaker 2: grade plastics.
So they're designed to be used in medical 722 00:36:48,160 --> 00:36:52,719 Speaker 2: products, or they're safer to use, whether it's 723 00:36:52,719 --> 00:36:55,440 Speaker 2: being used in a microwave or being used to store food. 724 00:36:55,800 --> 00:36:58,319 Speaker 2: But that's not necessarily the case with headphones. And this 725 00:36:58,360 --> 00:37:01,520 Speaker 2: is where the industry has now come under scrutiny, because 726 00:37:01,840 --> 00:37:05,239 Speaker 2: these plastics are in contact with us constantly, and that's 727 00:37:05,280 --> 00:37:07,839 Speaker 2: the concern, because we've got them in our ears, we've 728 00:37:07,840 --> 00:37:08,320 Speaker 2: got them. 729 00:37:08,239 --> 00:37:11,600 Speaker 3: Vibrating against our jaws or whatever, or over ear headphones. 730 00:37:12,000 --> 00:37:14,560 Speaker 2: And now they're kind of saying, well, there's no immediate 731 00:37:14,680 --> 00:37:16,839 Speaker 2: health risk, but what they're concerned about is the long 732 00:37:16,920 --> 00:37:20,239 Speaker 2: term exposure. So don't throw away your headphones now. But 733 00:37:20,400 --> 00:37:23,879 Speaker 2: they talk again about vulnerable groups like teenagers. As 734 00:37:24,000 --> 00:37:27,640 Speaker 2: children are growing, what are the impacts of having chemicals 735 00:37:27,760 --> 00:37:31,279 Speaker 2: coming out of these plastics that potentially could be 736 00:37:31,320 --> 00:37:35,200 Speaker 2: influencing them, because they're even talking about affecting young males, and 737 00:37:36,400 --> 00:37:41,239 Speaker 2: some of the plastics mimic hormones as well, so 738 00:37:41,280 --> 00:37:44,120 Speaker 2: they have the same effect that hormones do on young 739 00:37:44,200 --> 00:37:45,120 Speaker 2: forming bodies here.
740 00:37:45,800 --> 00:37:49,640 Speaker 1: Yeah, well they're called endocrine disruptors, and you're 741 00:37:49,680 --> 00:37:52,160 Speaker 1: exactly right. And what we know is the softer 742 00:37:52,280 --> 00:37:56,600 Speaker 1: the plastic, the worse, because it's more permeable. And so 743 00:37:56,640 --> 00:37:58,960 Speaker 1: you get a bottle of water, you leave it in 744 00:37:59,000 --> 00:38:03,080 Speaker 1: your car all day. You should absolutely not drink 745 00:38:03,120 --> 00:38:07,720 Speaker 1: that, absolutely. Also somewhat distressing for most of our audience 746 00:38:07,760 --> 00:38:11,280 Speaker 1: is what I'm about to share. Think about a warm 747 00:38:11,520 --> 00:38:14,759 Speaker 1: or a hot coffee, or worse, a black coffee in 748 00:38:14,800 --> 00:38:18,160 Speaker 1: a takeaway cup with a plastic lid. You put your 749 00:38:18,200 --> 00:38:21,800 Speaker 1: lips on the plastic, and then you have this heated 750 00:38:21,840 --> 00:38:26,320 Speaker 1: liquid coming through the plastic into your mouth. I think, 751 00:38:26,800 --> 00:38:29,120 Speaker 1: and I've never heard this theory, this is mine, maybe 752 00:38:29,160 --> 00:38:31,719 Speaker 1: somebody else came up with it, I think we're going 753 00:38:31,760 --> 00:38:34,120 Speaker 1: to find out that's a big fucking problem down the 754 00:38:34,160 --> 00:38:39,000 Speaker 1: track, because you've literally got this stuff that can permeate, 755 00:38:39,440 --> 00:38:42,640 Speaker 1: that's a very vulnerable type of plastic, and 756 00:38:42,719 --> 00:38:47,399 Speaker 1: we're drinking through that plastic. That's a pretty easy entry 757 00:38:47,400 --> 00:38:51,400 Speaker 1: system into your physiology. So either drink out of a 758 00:38:51,480 --> 00:38:55,320 Speaker 1: mug, like a ceramic mug or whatever, is my advice, 759 00:38:56,000 --> 00:38:59,960 Speaker 1: or get yourself a steel keep cup.
760 00:39:00,760 --> 00:39:02,640 Speaker 2: Now that's interesting you say that though, because I've got 761 00:39:02,640 --> 00:39:04,920 Speaker 2: a glass keep cup and we thought, oh yeah, glass 762 00:39:04,960 --> 00:39:05,359 Speaker 2: keep cup. 763 00:39:05,360 --> 00:39:08,920 Speaker 3: That's a better option. But the lid is still plastic. 764 00:39:08,840 --> 00:39:11,280 Speaker 1: That's right, and that is an issue. 765 00:39:11,520 --> 00:39:13,680 Speaker 3: That's a harder plastic though, isn't it? 766 00:39:13,680 --> 00:39:16,440 Speaker 1: Yeah, that's very, very hard. I've got the same, I've 767 00:39:16,440 --> 00:39:20,080 Speaker 1: got a steel keep cup and the lid is like 768 00:39:21,160 --> 00:39:25,400 Speaker 1: hard plastic. It's still, I'm not, I'm not, it's great, 769 00:39:25,800 --> 00:39:27,680 Speaker 1: but I will often take the lid off anyway and 770 00:39:27,760 --> 00:39:30,400 Speaker 1: just drink out of the, you know, like, unless you 771 00:39:30,440 --> 00:39:32,520 Speaker 1: need it to stay hot for hours. We've got to 772 00:39:32,760 --> 00:39:34,960 Speaker 1: wind up very quickly because we've all got to get 773 00:39:35,000 --> 00:39:37,359 Speaker 1: out early today. Let's do one more. Tell us 774 00:39:37,400 --> 00:39:41,640 Speaker 1: about mini brains replacing lab animals. I do not even 775 00:39:41,680 --> 00:39:42,720 Speaker 1: know what that means. 776 00:39:43,320 --> 00:39:45,760 Speaker 3: Oh, this is, yeah, what I love, being a vegan. 777 00:39:46,560 --> 00:39:49,239 Speaker 2: You. I finally introduced it into the conversation and there 778 00:39:49,320 --> 00:39:51,840 Speaker 2: was a bit of a segue there. No, the idea 779 00:39:51,960 --> 00:39:55,160 Speaker 2: is that they're now lab growing cells. 780 00:39:56,080 --> 00:39:58,440 Speaker 3: That's tickled his fancy there, by the. 781 00:39:58,360 --> 00:40:01,000 Speaker 4: way, the look on his face is.
782 00:40:01,040 --> 00:40:03,440 Speaker 3: Just, it is. 783 00:40:03,640 --> 00:40:06,840 Speaker 2: I wish we could show this to our listeners, 784 00:40:06,920 --> 00:40:09,839 Speaker 2: because, you know, he does get that look. Wait a minute, 785 00:40:09,880 --> 00:40:11,520 Speaker 2: he's going to pick up his shammy now and wipe 786 00:40:11,520 --> 00:40:11,879 Speaker 2: his face. 787 00:40:11,880 --> 00:40:14,120 Speaker 4: I think you could just swap in his smart glasses, so he. 788 00:40:15,960 --> 00:40:17,880 Speaker 3: Maybe put him at the start of the show. 789 00:40:18,680 --> 00:40:21,760 Speaker 1: See what I mean, everyone? See what I mean? Bullying. Bullying. 790 00:40:21,800 --> 00:40:26,279 Speaker 1: It's subversive, it seems low level, and they gang up 791 00:40:26,320 --> 00:40:29,560 Speaker 1: on me like that all the time. Yeah, the comments 792 00:40:29,600 --> 00:40:32,440 Speaker 1: are going to be, boo fucking hoo, Harps. And then 793 00:40:32,520 --> 00:40:34,000 Speaker 1: that makes you a bully as well. 794 00:40:34,480 --> 00:40:39,640 Speaker 3: That boo hoo, Harps? 795 00:40:40,000 --> 00:40:42,720 Speaker 1: Is that the title of the episode? Yeah, boo hoo, 796 00:40:43,320 --> 00:40:47,799 Speaker 1: poor, poor fragile Jumbo. I'll survive. All right, tell us, 797 00:40:47,840 --> 00:40:48,120 Speaker 1: come on. 798 00:40:49,440 --> 00:40:52,919 Speaker 2: So obviously, mice and rats and pigs have often been 799 00:40:53,040 --> 00:40:56,919 Speaker 2: used as the animals for bio research. Now we can 800 00:40:57,000 --> 00:41:00,239 Speaker 2: culture, and we're talking two hundred million animals used in 801 00:41:00,320 --> 00:41:04,160 Speaker 2: labs every year, and if we can combat that. And 802 00:41:04,400 --> 00:41:07,200 Speaker 2: the thing is, you know, we've cured cancer in so 803 00:41:07,239 --> 00:41:11,880 Speaker 2: many different mice, but no mouse is genetically identical to us.
So 804 00:41:11,920 --> 00:41:14,719 Speaker 2: the reality of it is, even though we're using animals 805 00:41:14,719 --> 00:41:17,680 Speaker 2: that are very similar to us genetically, they're not identical 806 00:41:17,719 --> 00:41:21,200 Speaker 2: to us. So if we can culture stem cells that 807 00:41:21,239 --> 00:41:23,920 Speaker 2: then grow into, say, a brain culture, and then we 808 00:41:24,080 --> 00:41:27,399 Speaker 2: use that for study and research, that has a lot 809 00:41:27,440 --> 00:41:30,840 Speaker 2: of merit, because they're human brain cells, not a pig's 810 00:41:31,160 --> 00:41:34,400 Speaker 2: or a chicken's or whatever it is we're using. So it 811 00:41:34,480 --> 00:41:37,239 Speaker 2: has this double benefit, one being that, you know, two 812 00:41:37,320 --> 00:41:40,879 Speaker 2: hundred million animals may not necessarily have to be experimented on, 813 00:41:41,360 --> 00:41:46,520 Speaker 2: and we're closely aligning whatever we're researching with what's genetically 814 00:41:46,600 --> 00:41:47,600 Speaker 2: much closer to us. 815 00:41:49,600 --> 00:41:52,360 Speaker 1: Well, that's good. That's good for a range of reasons. 816 00:41:52,600 --> 00:41:53,480 Speaker 3: It's a win win, isn't it? 817 00:41:53,800 --> 00:41:54,000 Speaker 4: Oh? 818 00:41:54,040 --> 00:41:58,359 Speaker 1: It certainly is. Patrick, where can people find you and, 819 00:41:59,320 --> 00:42:02,280 Speaker 1: you know, come and see you and employ your 820 00:42:02,160 --> 00:42:06,920 Speaker 2: services? Website is noow dot com dot au, and if you 821 00:42:07,000 --> 00:42:09,320 Speaker 2: have anything you want us to talk about, if you 822 00:42:09,360 --> 00:42:12,359 Speaker 2: think we should be bullying Craig regularly on the show, 823 00:42:12,400 --> 00:42:14,799 Speaker 2: please feel free to contact me and I will pass 824 00:42:14,840 --> 00:42:15,040 Speaker 2: it on. 825 00:42:16,000 --> 00:42:19,200 Speaker 1: I will accept it.
I will take it now because 826 00:42:19,320 --> 00:42:22,799 Speaker 1: it's all about getting uncomfortable. It's okay, I'm okay with it. 827 00:42:23,400 --> 00:42:27,000 Speaker 1: Your bullying equals my resilience, so have at it. You 828 00:42:27,120 --> 00:42:30,800 Speaker 1: just, you just let loose, you two, and bully Tif. 829 00:42:30,840 --> 00:42:32,400 Speaker 1: What have you got to say for yourself? 830 00:42:33,360 --> 00:42:33,680 Speaker 4: Nothing? 831 00:42:33,719 --> 00:42:34,000 Speaker 1: For me? 832 00:42:34,080 --> 00:42:37,279 Speaker 5: I think you're fabulous in your floating today and it's 833 00:42:37,280 --> 00:42:37,920 Speaker 5: been a pleasure. 834 00:42:38,400 --> 00:42:40,239 Speaker 1: So don't you try and wind it back. 835 00:42:40,280 --> 00:42:44,080 Speaker 1: Now what's done is done, and don't worry, it's 836 00:42:44,200 --> 00:42:47,640 Speaker 1: locked away in the fucking cognitive vault. So don't you 837 00:42:47,680 --> 00:42:52,160 Speaker 1: come to me for any favors. Ah, we're just gagging. 838 00:42:52,200 --> 00:42:54,839 Speaker 1: We love each other, everyone, so don't think, oh my god, 839 00:42:54,880 --> 00:42:57,480 Speaker 1: there actually is a rift. There's no fucking rift. I 840 00:42:57,560 --> 00:42:59,399 Speaker 1: always say to people, if I pick on you, I 841 00:42:59,400 --> 00:43:04,120 Speaker 1: love you. So if I don't pick on you, oh. 842 00:43:03,600 --> 00:43:06,960 Speaker 3: Sorry, can I just tell you a quick thing? 843 00:43:06,960 --> 00:43:09,319 Speaker 3: I know we shouldn't do this. Ah, when I was 844 00:43:09,320 --> 00:43:09,920 Speaker 3: in Italy. 845 00:43:10,600 --> 00:43:13,080 Speaker 1: Fucker. Now it's like, you know, when you've said good 846 00:43:13,160 --> 00:43:15,799 Speaker 1: night to your mum and she still keeps talking on 847 00:43:15,840 --> 00:43:19,319 Speaker 1: the phone, right, and you're not fucking getting.
848 00:43:19,280 --> 00:43:20,080 Speaker 3: So we're on. 849 00:43:20,400 --> 00:43:23,759 Speaker 2: We're on a golf buggy tour of Rome, right, so 850 00:43:23,880 --> 00:43:25,560 Speaker 2: three of us in the back of this little cab, 851 00:43:25,560 --> 00:43:27,760 Speaker 2: at the back of this thing. Anyway, we're twenty minutes 852 00:43:27,760 --> 00:43:29,920 Speaker 2: into the tour and the tour guide turns around and says, 853 00:43:30,120 --> 00:43:32,919 Speaker 2: you guys must be really good friends, because you keep 854 00:43:32,960 --> 00:43:33,919 Speaker 2: giving each other shit. 855 00:43:34,520 --> 00:43:36,400 Speaker 3: It's like, because that's what I do with my friends. 856 00:43:36,440 --> 00:43:38,799 Speaker 2: And it's true, isn't it? When we rip each other 857 00:43:38,920 --> 00:43:40,799 Speaker 2: and do all that, we don't really mean it, because 858 00:43:40,840 --> 00:43:41,399 Speaker 2: we love you. 859 00:43:41,560 --> 00:43:43,120 Speaker 1: Do you know who's not as good at it? 860 00:43:43,760 --> 00:43:43,880 Speaker 3: Who? 861 00:43:44,680 --> 00:43:49,600 Speaker 1: Ladies. Ladies. Ladies, like a lady. 862 00:43:49,760 --> 00:43:51,880 Speaker 4: And then I realized you weren't talking about me, as 863 00:43:51,920 --> 00:43:52,600 Speaker 4: I'm an old lady. 864 00:43:52,760 --> 00:43:55,520 Speaker 1: No, no, lady. Well, you know, you've said that, 865 00:43:55,840 --> 00:43:58,000 Speaker 1: but what about, you know, Diane comes up in a 866 00:43:58,040 --> 00:44:02,560 Speaker 1: new frock and Gail goes, that is fucking horrible, like, 867 00:44:02,680 --> 00:44:05,919 Speaker 1: what, did someone pay you? Is this a bet? What's 868 00:44:05,920 --> 00:44:06,560 Speaker 1: going on with that? 869 00:44:07,080 --> 00:44:07,120 Speaker 2: No?
870 00:44:07,600 --> 00:44:11,799 Speaker 1: Because ladies are more, I guess, caring and empathetic, and 871 00:44:12,360 --> 00:44:15,040 Speaker 1: they'll lie their asses off to make someone feel good. 872 00:44:15,760 --> 00:44:18,040 Speaker 1: Am I right? You're not, like, if you think that 873 00:44:18,080 --> 00:44:20,960 Speaker 1: something's fucking horrible, your lady friends, you know, 874 00:44:22,000 --> 00:44:24,399 Speaker 1: I don't. I think some women would go, I would 875 00:44:24,400 --> 00:44:27,480 Speaker 1: tell them. I don't know, though. In the real world, ladies, 876 00:44:27,600 --> 00:44:29,960 Speaker 1: let me know. Am I wrong or right? Go to 877 00:44:30,000 --> 00:44:33,239 Speaker 1: the Facebook page. I think, because you want to make 878 00:44:33,280 --> 00:44:35,719 Speaker 1: people feel good, so you'll tell, as Mary would say, 879 00:44:35,719 --> 00:44:38,160 Speaker 1: a little white lie, a little white lie, because you 880 00:44:38,160 --> 00:44:43,080 Speaker 1: don't want to hurt people's feelings. See, but blokes, we're like, nah, 881 00:44:43,120 --> 00:44:50,080 Speaker 1: fuck you, you look like a bag of shit. All right, 882 00:44:50,480 --> 00:44:53,400 Speaker 1: great story, great ad, by the way, fucking hell, that'll 883 00:44:53,440 --> 00:44:56,360 Speaker 1: go down then in the fucking trophy cabinet of 884 00:44:56,400 --> 00:44:57,560 Speaker 1: Tif. Do you. 885 00:44:57,480 --> 00:45:00,400 Speaker 2: reckon anyone's even listening, because we already kind of signed off? 886 00:45:01,520 --> 00:45:04,839 Speaker 3: See you guys, see you Craig, see you Tif. Bye.