Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? So today I thought I would talk about something that hits a bit close to home, which is the relationship between social media, social networks, and mental health. This is a very complicated topic for a whole bunch of reasons. I mean, for one thing, just to be transparent with all of y'all, I'm a Gen Xer, okay. I grew up in an era in which there was a pretty darn hefty stigma attached to all things mental health. Like, if you had mental health struggles, the feeling was that somehow that was your fault and a personal failing. So to this day, while I recognize the importance of mental health and seeking help when you're struggling, like when a friend of mine tells me, "Oh, I found this awesome therapist," I'm so happy for them, it's still a barrier for me, which is screwed up.
Like, rationally, I can recognize it as being important, and I can be happy for my friends who seek that help, and yet I still have these mental blocks that are rock solid when it comes to my own mental health, which kind of stinks. Like, it really stinks when you are trying to think of things rationally and you still encounter this, because you're like, okay, some things go beyond rationality. I have to admit that. But apart from my own personal reasons, it's a tricky topic because it's really hard to be definitive about things that relate to mental health. There are all different types of human beings out there in the world, and there's stuff that could roll right off the back of one person but really traumatically impact someone else, and it often can be really difficult to determine a causal relationship between different factors. Now, by that, I mean there are a ton of studies out there that have looked into the potential impact of social media on mental health.
For example, a study might find that people who identify as being depressed or experiencing anxiety might be spending a lot more time on social media sites than people who do not identify that way. But does that mean the social media sites are causing this anxiety and depression, that by staying on these sites, that's what's making people feel anxious and depressed? Or could it mean that people who are already experiencing anxiety and depression are seeking out social media sites? You know, maybe that's a coping mechanism for them. In other words, it's the old phrase: correlation is not causation. Just because two things appear to happen together doesn't mean that one caused the other. They could be unrelated. They could both be caused by the same common factor. We just don't know without looking into it further. So today I thought, you know what, I'm going to actually go through one of these studies. Because I've talked about studies in general, but typically I'm reading an article that's written about the study. I'm not reading the study itself.
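That "same common factor" point is easy to see with a toy simulation. This is a minimal sketch, not real data: I'm inventing a hidden factor (call it loneliness) that drives both hours on social media and a depression score, so the two end up strongly correlated even though neither causes the other.

```python
import random

random.seed(0)

# Hypothetical illustration: a hidden common factor ("loneliness") drives
# BOTH hours spent on social media AND a depression score. Neither of the
# two measured variables causes the other, yet they correlate strongly.
n = 1000
loneliness = [random.gauss(0, 1) for _ in range(n)]
hours_online = [2.0 + 1.5 * x + random.gauss(0, 0.5) for x in loneliness]
depression = [5.0 + 2.0 * x + random.gauss(0, 0.5) for x in loneliness]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two lists."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(hours_online, depression)
print(f"correlation: {r:.2f}")  # strongly positive despite no direct causal link
```

A cross-sectional study that only saw `hours_online` and `depression` would find a strong correlation and could tell you nothing about which way, if any, the causal arrow points.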
So I found a study from twenty twenty that was a sort of meta-analysis on the subject of mental health and social networking sites. Now, if you are unfamiliar with the term meta-analysis, that is a type of study that looks at the results of other studies in order to reach some conclusion. So you might say, all right, let's take these thirty studies about this one topic, really look at what the conclusions are of all of them, and see if we can use that to draw new conclusions. So in this case, the researchers of this particular study identified fifty papers about social media and mental health using Google Scholar. By the way, when they ran searches for terms like mental health and social media and social networking, they came up with tens of thousands of papers. Even with social media and mental health together, it was like eighteen thousand papers. From that group, they got fifty, and from that list of fifty, they pared the list down to sixteen studies. They had a whole process for reviewing these papers and determining whether or not they fit within their own study.
And as you can see, with sixteen, that means fewer than half of those fifty made it through. Eight of the studies that they'd considered were cross-sectional studies. That means it's a study where the original researchers analyzed data from a group of subjects at a single point in time, so it's like a cross section of time. That made up half of the papers that they were looking at. Two of the sixteen were qualitative studies. That means they were looking at non-quantifiable information and trying to draw conclusions from that. So, in other words, these studies look at stuff like social phenomena, which you cannot really measure with scientific units. Right, if you're measuring, like in chemistry, you're using units of weight and volume. Those are quantifiable. You can put actual units of measurement to them. If you're asking how happy society is, that's not really quantifiable. That's qualitative, not quantitative. So those were the kinds of studies for two of the sixteen papers that they chose.
It's tricky to do a qualitative analysis because you're still trying to come to a scientific conclusion, but you're using unquantifiable factors to do so. So they get a little wibbly wobbly, and a lot of this stuff falls into fields like sociology, which, by the way, I love. Sociology was one of my favorite classes when I was in college. But sociology is by its nature difficult to gauge because of a lack of quantifiable units. Anyway, three of the remaining studies were longitudinal studies. That means they explored the same list of variables and how those variables changed over a long period of time. So this is like if you have a group of subjects and you are observing them periodically over a long course, like potentially years. The remaining studies were systematic, meaning they looked at patterns that would indicate cause and effect. Are there recognizable and reliable patterns, so much so that should you start to see one thing, you would immediately begin to draw conclusions that a pattern exists, even if it's not readily evident at the time?
One of the really big challenges of meta-analyses is that you have to try and synthesize the findings of different studies that are all using totally different methodologies, and you do this while you're trying to draw your own conclusions. That's pretty tough, because you might accidentally misinterpret findings in an effort to reach the conclusion you've already made. Or you might pair two different papers together to say these papers support one another, but because they used different methodologies, it may not be as clear as that. Right, if the methods were totally different, then yeah, the conclusions might be similar, but you might not be able to say this study supports this other study, because they took such different pathways to get there. You can't be sure that they are actually saying the same thing. And there's bias that you have to deal with. Everybody has bias, and suppressing bias is important. It's also really hard to do. Sometimes it's impossible.
126 00:07:44,960 --> 00:07:47,520 Speaker 1: You likely are going to have your own bias when 127 00:07:47,560 --> 00:07:50,680 Speaker 1: you go into a study, like you might already have 128 00:07:50,760 --> 00:07:56,160 Speaker 1: a preconceived idea of something that you're just expecting to prove. 129 00:07:56,560 --> 00:07:59,400 Speaker 1: So that will make you pay more attention to the 130 00:07:59,440 --> 00:08:04,640 Speaker 1: things that really reinforce your bias and potentially dismiss or 131 00:08:04,680 --> 00:08:08,680 Speaker 1: discount things that are not aligned with your bias unless 132 00:08:08,720 --> 00:08:11,840 Speaker 1: it gets to a point where it's just overwhelmingly impossible 133 00:08:11,960 --> 00:08:15,520 Speaker 1: to ignore. So this gets into things like cherry picking, 134 00:08:15,680 --> 00:08:18,360 Speaker 1: right where you're cherry picking the points of data that 135 00:08:18,480 --> 00:08:23,000 Speaker 1: support your perspective or your argument. Now, I'm not saying 136 00:08:23,080 --> 00:08:26,840 Speaker 1: that all meta analyzes are bad. I'm just saying they're 137 00:08:26,880 --> 00:08:30,640 Speaker 1: tricky to do and they're easy to do poorly, So 138 00:08:31,000 --> 00:08:34,400 Speaker 1: they're not bad just out of the gate, but they 139 00:08:34,400 --> 00:08:38,640 Speaker 1: are hard to do well. And obviously your conclusions are 140 00:08:38,640 --> 00:08:41,480 Speaker 1: only as reliable as the individual studies are. Like, you 141 00:08:41,480 --> 00:08:45,040 Speaker 1: could do a fantastic meta analysis, but if all the 142 00:08:45,080 --> 00:08:48,360 Speaker 1: studies that are part of your meta analysis are crap, 143 00:08:48,800 --> 00:08:52,480 Speaker 1: then the results of your meta analysis aren't reliable either. 144 00:08:53,080 --> 00:08:55,600 Speaker 1: Garbage in, garbage out kind of thing. 
So that's why that selection process was important. So while they only used sixteen out of the fifty papers that they selected out of a larger pool of like eighteen thousand potential papers, you can at least say, well, they had a process there to try and weed out things that would either be a bad fit or were poorly designed. So this paper, I haven't even mentioned the title yet. Here's the title. You can look this up and read it yourself. It's "Social Media Use and Its Connection to Mental Health: A Systematic Review." And this was by a collection of authors. There's like six or seven authors attached to this. I found it by using the National Library of Medicine when I was looking for a paper to talk about, and it was originally published in a web-based, peer-reviewed medical journal called Cureus. That's c-u-r-e-u-s. We're going to get to the paper in a second, and I'll also have more to say about Cureus at the end of this episode, because, as it turns out, Cureus has its own curious reputation.
163 00:09:57,760 --> 00:10:00,120 Speaker 1: I'm not saying it's a bad paper, but I am 164 00:10:00,120 --> 00:10:03,640 Speaker 1: saying like it is a matter of debate among the 165 00:10:03,800 --> 00:10:07,600 Speaker 1: research circle, and yeah, I kind of tripped into that 166 00:10:07,640 --> 00:10:11,120 Speaker 1: one without anticipating it. So first, before we get to 167 00:10:11,160 --> 00:10:14,000 Speaker 1: the actual paper, I think it is important to establish 168 00:10:14,040 --> 00:10:19,120 Speaker 1: the connection between socialization in general and mental health. Human 169 00:10:19,160 --> 00:10:22,200 Speaker 1: beings are social animals, even though some days I feel 170 00:10:22,240 --> 00:10:23,880 Speaker 1: like I should just run off to be a hermit 171 00:10:23,920 --> 00:10:28,120 Speaker 1: in the woods. Some days, y'all, that compulsion is a 172 00:10:28,160 --> 00:10:32,559 Speaker 1: strong one. So in a different scientific paper by Deborah 173 00:10:32,600 --> 00:10:36,920 Speaker 1: Umberson and Jennifer carraz Montez title Social Relationships and Health 174 00:10:36,960 --> 00:10:40,600 Speaker 1: a Flashpoint for Health Policy, there is a very powerful 175 00:10:40,600 --> 00:10:45,360 Speaker 1: statement that I wanted to share. Quote Captors use social 176 00:10:45,480 --> 00:10:49,920 Speaker 1: isolation to torture prisoners of war to drastic effect. Social 177 00:10:50,000 --> 00:10:55,160 Speaker 1: isolation of otherwise healthy, well functioning individuals eventually results in 178 00:10:55,200 --> 00:11:01,040 Speaker 1: psychological and physical disintegration and even death. In quote, that 179 00:11:01,160 --> 00:11:03,640 Speaker 1: is a heck of a way to argue for the 180 00:11:03,679 --> 00:11:07,839 Speaker 1: power of socialization, because when we are deprived of socialization, 181 00:11:08,320 --> 00:11:12,720 Speaker 1: we suffer. 
Generally speaking, studies show that the quantity and quality of our social relationships have an enormous impact on our well-being, both mental health and physical health. People who maintain more, and higher-quality, social relationships tend to live longer and healthier than those who do not. So, you know, for being someone who has very few friends at this point, like I don't hang out with very many people at all, I look at this and I think I need to get out there more, and actually form meaningful friendships, not just be like, "Hey, how's it going? What's your sign? Nice to see you. Come here often?" Like, to actually build meaningful relationships, because they are very important to our health. So there is strong evidence supporting a link between socialization in general and mental health. There's lots of research that says this aspect of socialization is an important factor for our mental health. Not that people who are kind of loners or whatever are mentally unwell. That's not necessarily the case. But generally speaking, we humans tend to do better when we have good socialization.
Now let's move on to social networks. Now, I'm sure some of y'all out there are old enough, like me, to remember a time before there were really online social networks, or at least a time before we had sites that served purely as a social network. I think back to the bulletin board system, or BBS, days, and I can remember logging into a service and skimming the message boards. And these BBSs often existed on a single person's computer somewhere. So this wasn't the Internet. You weren't logging into a network of networks. You were literally dialing into a computer that hosted this bulletin board. Now, that computer might link to other computers and share a message board between them, which increased the reach of the bulletin board system, but it still wasn't the Internet yet, not for the average person. But for a lot of folks, it was a preview of what the Internet would be. It was just on a much smaller scale. Kind of think of it like a community bulletin board version of the Internet. And back in those days, a lot of folks, myself included, thought, wow, this technology is going to transform the world.
We're going to be able to communicate with each other instantly, no matter where in the world we happen to be. We'll be able to find people who share our interests and make friends in brand new ways. It is going to be amazing. In fact, I'm going to tell you another story to sort of illustrate this. Before I get to that, however, let's take a quick break to thank our sponsors. Okay, so before the ad break, I promised y'all a story. So when I was a kid, I loved fantasy novels. I mean, I still do, but I don't read them as much as I used to because I read a lot of other stuff now. But when I was a kid, I wasn't really into science fiction very much. I mean, I liked some science fiction movies and television shows, but I didn't read science fiction books. I loved fantasy novels. I knew precisely three other kids in my personal life who also liked fantasy novels to various degrees.
So our tiny little social group of four people kind of helped us get through the experiences of, like, middle and high school, because none of us fit in particularly well with the rest of the student body. I wouldn't say we were, like, ostracized or ridiculed or anything. I mean, maybe we were, but I wasn't aware of it, which is probably for the best. But like, I just didn't integrate well with the main student body, not being so savvy with things like mainstream entertainment or sports or any of that. However, you know, there was something special about my childhood that my friends lacked, and that was that my parents write science fiction and fantasy and horror and mysteries and other types of fiction. They are published authors. My father has written more than a hundred published works at this point in that field in fiction. And one way that my parents would promote their work, and it was really my dad at this point, Mom would also write, but that was later on...
Dad would go to different regional science fiction and fantasy conventions, where fans would come together and they would hang out and party and have a great time for a weekend. These conventions had names like Dixie Trek, or Phoenix Con (Atlanta is known as the City of the Phoenix), or the Atlanta Fantasy Fair. That was a really big one. So these days, the really big one in the Southeast is Dragon Con. And in fact, my dad was the first toastmaster at Dragon Con when it first got started. And it was at these science fiction and fantasy conventions that I saw the power of community.
265 00:16:06,720 --> 00:16:09,800 Speaker 1: So in the quote unquote real world, a fantasy novel 266 00:16:09,840 --> 00:16:13,480 Speaker 1: geek could end up feeling pretty darn isolated in those days, 267 00:16:13,680 --> 00:16:16,560 Speaker 1: but at these conventions I would become part of an 268 00:16:16,760 --> 00:16:19,920 Speaker 1: enormous community of fans, so you could go to panel 269 00:16:20,000 --> 00:16:23,520 Speaker 1: discussions about your favorite book where people would talk about 270 00:16:23,600 --> 00:16:28,040 Speaker 1: fan theories or discuss certain works in depth, or sometimes 271 00:16:28,160 --> 00:16:30,080 Speaker 1: you might even get a chance to hear the author 272 00:16:30,240 --> 00:16:34,560 Speaker 1: himself or herself speak, And everything was a celebration of 273 00:16:34,600 --> 00:16:38,040 Speaker 1: the geeky interests for the most folks attending, I mean, 274 00:16:38,040 --> 00:16:40,640 Speaker 1: it was an experience you just couldn't replicate back home, 275 00:16:40,760 --> 00:16:43,320 Speaker 1: because there just weren't enough people you knew in your 276 00:16:43,360 --> 00:16:46,880 Speaker 1: everyday life where you could have these kinds of interactions. 277 00:16:47,120 --> 00:16:49,800 Speaker 1: These conventions were special. They gave fans a place in 278 00:16:49,840 --> 00:16:53,640 Speaker 1: time to really engage in their interests and to celebrate them. 279 00:16:53,880 --> 00:16:57,320 Speaker 1: So early on the Internet seemed to be shaping up 280 00:16:57,680 --> 00:17:00,880 Speaker 1: in a way that it could do this, but through computers. 
Right, 281 00:17:01,280 --> 00:17:05,720 Speaker 1: that could involve creating, you know, communities that celebrate specific 282 00:17:05,760 --> 00:17:10,080 Speaker 1: interests online, and you wouldn't be restricted to just attending 283 00:17:10,320 --> 00:17:12,680 Speaker 1: a convention one weekend out of the year in order 284 00:17:12,680 --> 00:17:14,520 Speaker 1: to get together with friends and talk about, you know, 285 00:17:14,560 --> 00:17:17,399 Speaker 1: the latest episode of Quantum Leap or whatever. Now you 286 00:17:17,440 --> 00:17:21,000 Speaker 1: could go online and join a forum dedicated to your 287 00:17:21,040 --> 00:17:24,119 Speaker 1: favorite show or movie or book series or whatever. And 288 00:17:24,200 --> 00:17:26,440 Speaker 1: if there wasn't one, out there, you could make one 289 00:17:26,640 --> 00:17:29,160 Speaker 1: and folks would find it. Now way back in the day, 290 00:17:29,200 --> 00:17:32,720 Speaker 1: I remember joining a forum called the Bronze, and it 291 00:17:32,800 --> 00:17:35,960 Speaker 1: was a community that celebrated the television series Buffy the 292 00:17:36,040 --> 00:17:38,600 Speaker 1: Vampire Slayer, and I ended up meeting up a bunch 293 00:17:38,600 --> 00:17:41,600 Speaker 1: of other fans that way, including someone who ultimately went 294 00:17:41,640 --> 00:17:44,760 Speaker 1: on to write for the series Angel, which spun off 295 00:17:44,760 --> 00:17:47,920 Speaker 1: of Buffy. I met some of the musicians who provided 296 00:17:48,280 --> 00:17:51,439 Speaker 1: music for the show's soundtrack. I even ultimately ended up 297 00:17:51,440 --> 00:17:54,200 Speaker 1: meeting some of the actors, writers, and directors of the series. 298 00:17:54,520 --> 00:17:56,560 Speaker 1: So first I met them online and then later I 299 00:17:56,600 --> 00:17:58,639 Speaker 1: met them in person. It was great. 
So for a while it seemed like the Internet, and the Web in particular, were going to revolutionize the way we socialize with one another. And a lot of us who were optimists thought that we might be able to form really deep, meaningful relationships online, and that those would be just as important and relevant and deep and meaningful as the relationships we had out in the quote unquote real world. It ended up being true for me. I mean, I met my partner online way back in the nineteen nineties, and we're still together thirty years later. But social platforms would end up introducing a lot more than just a way to connect with other people. I don't think the optimists out there took into account the development of recommendation algorithms, for example. So the algorithm's job, when you really get down to it, is to convince you to stay on the platform for as long as possible, to hold your attention as long as it possibly can. So the algorithm is supposed to serve up material that you're going to find engaging.
Now 318 00:18:58,800 --> 00:19:03,320 Speaker 1: that doesn't mean good, but engaging, because it doesn't matter 319 00:19:03,640 --> 00:19:06,639 Speaker 1: if the stuff you see makes you feel happy or sad, 320 00:19:07,160 --> 00:19:11,080 Speaker 1: or angry or scared or any other specific emotion. That 321 00:19:11,160 --> 00:19:14,760 Speaker 1: doesn't matter at all to the algorithm. What matters is 322 00:19:14,760 --> 00:19:18,159 Speaker 1: that you stay there, you don't leave the page. Preferably, 323 00:19:18,320 --> 00:19:21,640 Speaker 1: you engage with whatever the content is, you know, by 324 00:19:21,640 --> 00:19:24,840 Speaker 1: clicking that you like it, or leaving a comment, or 325 00:19:25,080 --> 00:19:28,320 Speaker 1: perhaps best of all, sharing it with other people. That's 326 00:19:28,400 --> 00:19:32,119 Speaker 1: really the algorithm's job. And the recommendation algorithm is necessary 327 00:19:32,200 --> 00:19:36,520 Speaker 1: because the social networking site is a business. We're the product, 328 00:19:37,280 --> 00:19:40,760 Speaker 1: right? The site isn't selling anything to us, apart from 329 00:19:40,800 --> 00:19:43,720 Speaker 1: some sites that offer like a premium experience in return 330 00:19:43,800 --> 00:19:48,240 Speaker 1: for a subscription. Otherwise, these companies make their money through advertising. 331 00:19:48,600 --> 00:19:51,760 Speaker 1: The more valuable the landscape is, the more these sites 332 00:19:51,800 --> 00:19:55,160 Speaker 1: can charge to put ads across that landscape. And if 333 00:19:55,160 --> 00:19:58,280 Speaker 1: the site can assure advertisers that their ads are going 334 00:19:58,320 --> 00:20:02,200 Speaker 1: to match up with appropriate audiences, thus improving the chance 335 00:20:02,240 --> 00:20:05,080 Speaker 1: that folks will actually click through the ad to buy something, 336 00:20:05,359 --> 00:20:08,359 Speaker 1: then they can charge even more for those ads.
So again, 337 00:20:08,840 --> 00:20:12,480 Speaker 1: the emotional reaction users have while they're using the service, 338 00:20:12,840 --> 00:20:17,040 Speaker 1: that doesn't matter, because sad people they're worth exactly the 339 00:20:17,040 --> 00:20:20,160 Speaker 1: same amount of money as happy people as long as 340 00:20:20,160 --> 00:20:23,600 Speaker 1: they're staying engaged on the site. If you are happy 341 00:20:23,720 --> 00:20:26,760 Speaker 1: or sad, you're worth the same amount to Facebook. And 342 00:20:26,960 --> 00:20:29,560 Speaker 1: it's easier to find stuff that makes users sad or 343 00:20:29,600 --> 00:20:32,680 Speaker 1: anxious or whatever. Well, naturally, if it's easier to find 344 00:20:32,720 --> 00:20:34,679 Speaker 1: that stuff, that material is going to get served up 345 00:20:34,720 --> 00:20:37,919 Speaker 1: to users more frequently, and you're going to be encountering 346 00:20:37,920 --> 00:20:40,679 Speaker 1: that stuff on a more frequent basis. Now, if you 347 00:20:40,800 --> 00:20:44,560 Speaker 1: contrast that with the early days of social networking sites, 348 00:20:44,600 --> 00:20:47,800 Speaker 1: before they had found a way to monetize their operations, 349 00:20:48,119 --> 00:20:51,840 Speaker 1: then things are drastically different. You often had sites that 350 00:20:51,960 --> 00:20:55,760 Speaker 1: used a much more straightforward approach to content organization. 
I 351 00:20:55,800 --> 00:20:58,639 Speaker 1: still miss the old days where I could log into 352 00:20:58,680 --> 00:21:01,160 Speaker 1: a site like Facebook and I could just look at 353 00:21:01,160 --> 00:21:05,119 Speaker 1: my friends' posts in reverse chronological order, so all I 354 00:21:05,119 --> 00:21:07,680 Speaker 1: had to do was keep scrolling, and I would eventually 355 00:21:07,680 --> 00:21:10,680 Speaker 1: catch up on what everyone was doing, and I would 356 00:21:10,720 --> 00:21:13,440 Speaker 1: have a good idea that, like, I'd seen everything. 357 00:21:13,760 --> 00:21:15,840 Speaker 1: But today, if I go to Facebook, I get a 358 00:21:15,880 --> 00:21:20,040 Speaker 1: hodgepodge of posts from the last several days. They're organized 359 00:21:20,040 --> 00:21:24,000 Speaker 1: in no discernible order, and there are tons of ads peppered 360 00:21:24,040 --> 00:21:26,960 Speaker 1: in to boot. All right, now, let's get back to the paper. 361 00:21:27,400 --> 00:21:30,320 Speaker 1: So according to the meta analysis paper that I mentioned 362 00:21:30,359 --> 00:21:33,000 Speaker 1: at the top of the episode, the negative impact of 363 00:21:33,080 --> 00:21:36,320 Speaker 1: social media, or the correlation between social media and 364 00:21:36,359 --> 00:21:41,439 Speaker 1: mental health problems, is apparent. And that sounds logical. 365 00:21:41,480 --> 00:21:44,240 Speaker 1: I mean, there's like a common sense element to that, right? 366 00:21:44,280 --> 00:21:46,639 Speaker 1: I mean I just explained that the site is designed 367 00:21:46,640 --> 00:21:48,679 Speaker 1: to keep people there as long as possible, and the 368 00:21:48,680 --> 00:21:51,200 Speaker 1: way to do that is to catch and hold their attention, 369 00:21:51,480 --> 00:21:54,400 Speaker 1: and negative stuff can do that fairly effectively.
So it's 370 00:21:54,440 --> 00:21:57,800 Speaker 1: no surprise that negative stuff rises to the top and 371 00:21:57,800 --> 00:22:01,000 Speaker 1: that this can have an impact on people. But that's 372 00:22:01,000 --> 00:22:03,040 Speaker 1: still a long way from proving there's a 373 00:22:03,160 --> 00:22:07,960 Speaker 1: causal link between social networking use and mental health issues. 374 00:22:08,320 --> 00:22:13,280 Speaker 1: So the paper did share some pretty interesting findings. One 375 00:22:13,320 --> 00:22:15,879 Speaker 1: of those is that a person's age didn't seem to 376 00:22:16,560 --> 00:22:20,520 Speaker 1: affect the impact of social networking use, so whether you 377 00:22:20,560 --> 00:22:22,640 Speaker 1: were old or young, there didn't seem to be much 378 00:22:22,640 --> 00:22:25,080 Speaker 1: of a change there, although a lot of the studies 379 00:22:25,119 --> 00:22:29,439 Speaker 1: would end up focusing on preadolescent users. Gender, however, 380 00:22:29,640 --> 00:22:33,080 Speaker 1: did seem to be a factor when it came to impact. 381 00:22:33,080 --> 00:22:35,840 Speaker 1: Those who identified as female were, in the words of 382 00:22:35,880 --> 00:22:39,959 Speaker 1: the authors, much more likely to experience a negative impact 383 00:22:39,960 --> 00:22:44,440 Speaker 1: to mental health than those who identified as male. Now, 384 00:22:44,480 --> 00:22:47,520 Speaker 1: I'm not sure how much I should actually trust this 385 00:22:47,640 --> 00:22:50,359 Speaker 1: paper if I'm honest, because as I was reading it 386 00:22:50,520 --> 00:22:54,520 Speaker 1: early on, I found an error in the paper. It 387 00:22:54,560 --> 00:22:57,919 Speaker 1: includes a bar graph that shows gender distribution among the 388 00:22:58,000 --> 00:23:02,080 Speaker 1: various platforms. And this paper came out in twenty twenty, 389 00:23:02,359 --> 00:23:05,320 Speaker 1: so keep that in mind.
But even so, the distribution 390 00:23:05,520 --> 00:23:08,119 Speaker 1: caught me off guard because it flew in the face 391 00:23:08,160 --> 00:23:10,639 Speaker 1: of what I had believed. It doesn't mean that I 392 00:23:10,800 --> 00:23:13,440 Speaker 1: was right and the paper was wrong, but it did 393 00:23:13,440 --> 00:23:15,640 Speaker 1: surprise me. So the one for Twitter was the one 394 00:23:15,640 --> 00:23:17,720 Speaker 1: that really surprised me. Now keep in mind, again, this 395 00:23:17,760 --> 00:23:20,240 Speaker 1: paper came out in twenty twenty. Twitter was still Twitter 396 00:23:20,320 --> 00:23:22,760 Speaker 1: back in those days. And if you had asked me 397 00:23:22,800 --> 00:23:25,680 Speaker 1: in twenty twenty, what do you think the gender distribution 398 00:23:25,960 --> 00:23:29,080 Speaker 1: is on Twitter? I would have guessed it skewed male, 399 00:23:29,400 --> 00:23:32,720 Speaker 1: that there'd be more men on Twitter than women, But 400 00:23:32,840 --> 00:23:35,960 Speaker 1: in fact, it apparently was much more skewed toward females. 401 00:23:36,400 --> 00:23:39,200 Speaker 1: So men made up only thirty eight percent of Twitter 402 00:23:39,320 --> 00:23:42,439 Speaker 1: users according to this study. However, this is where we 403 00:23:42,480 --> 00:23:45,000 Speaker 1: get to the mistake in the bar chart. So the 404 00:23:45,160 --> 00:23:48,760 Speaker 1: chart says that eighty two percent of Twitter users were 405 00:23:48,800 --> 00:23:52,520 Speaker 1: female in twenty twenty. Eighty two percent. Now clearly that's wrong. 406 00:23:52,840 --> 00:23:55,280 Speaker 1: Like if I told you eighty two percent of the 407 00:23:55,320 --> 00:23:57,359 Speaker 1: people on Twitter in twenty twenty were women, you would 408 00:23:57,359 --> 00:24:00,680 Speaker 1: automatically say no, that cannot be right. 
But even the 409 00:24:00,760 --> 00:24:04,240 Speaker 1: chart itself proves that it's wrong because the two numbers 410 00:24:04,280 --> 00:24:06,359 Speaker 1: are supposed to add up to one hundred, right? Eighty 411 00:24:06,400 --> 00:24:09,680 Speaker 1: two percent are women, and yet it also says thirty 412 00:24:09,720 --> 00:24:12,040 Speaker 1: eight percent are men. If you add those together you 413 00:24:12,040 --> 00:24:14,639 Speaker 1: get one hundred and twenty percent. So my guess is 414 00:24:14,680 --> 00:24:18,600 Speaker 1: the bar chart should have said sixty two percent women, 415 00:24:19,320 --> 00:24:22,080 Speaker 1: not eighty two. I was surprised to see a mistake 416 00:24:22,160 --> 00:24:26,680 Speaker 1: like that make it all the way through edits into publishing, because, again, Cureus, 417 00:24:26,800 --> 00:24:31,160 Speaker 1: the journal that published this paper, is a peer reviewed journal, 418 00:24:31,240 --> 00:24:35,399 Speaker 1: and typically part of peer review means checking for things 419 00:24:35,480 --> 00:24:39,239 Speaker 1: like stupid mistakes, and yet this one made it all 420 00:24:39,240 --> 00:24:43,239 Speaker 1: the way through into published format. And maybe it's not 421 00:24:43,440 --> 00:24:46,680 Speaker 1: fair to judge a paper purely by a single mistake, 422 00:24:47,160 --> 00:24:50,840 Speaker 1: but that is such a simple, careless error, and one 423 00:24:50,880 --> 00:24:54,320 Speaker 1: that's actually really easy to catch. I mean, 424 00:24:54,440 --> 00:24:57,760 Speaker 1: I was just casually reading this. I wasn't reading this 425 00:24:57,920 --> 00:25:01,080 Speaker 1: as an editor, and I just caught it immediately.
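As an aside, the sanity check described above is simple enough to script. Here's a minimal sketch of that arithmetic; the 82 and 38 figures are the ones from the chart as published, and the 62 percent figure is just my guess at the intended value:

```python
# Sanity check for a percentage breakdown: for mutually exclusive
# categories (e.g. female/male users), the percentages should sum
# to roughly 100.

def check_distribution(percentages):
    """Return True if the percentages plausibly sum to 100.

    Allows one percentage point of slack for rounding.
    """
    return abs(sum(percentages) - 100) <= 1

# The chart as published: 82% female + 38% male = 120%. Impossible.
print(check_distribution([82, 38]))  # False

# The likely intended figure: 62% female + 38% male = 100%.
print(check_distribution([62, 38]))  # True
```

A check like this is trivial, which is the point: it's the kind of mistake that any automated or human pass over the chart should have caught.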
Well, 426 00:25:01,200 --> 00:25:04,000 Speaker 1: that raises concerns about the rest of the findings of 427 00:25:04,040 --> 00:25:06,920 Speaker 1: the paper, right? Like, if this mistake made it through, 428 00:25:07,240 --> 00:25:09,239 Speaker 1: and it made it through not just the writing, but 429 00:25:09,280 --> 00:25:11,800 Speaker 1: the peer review and the editing processes, and still made 430 00:25:11,800 --> 00:25:15,320 Speaker 1: it through to publishing, how can I count on the 431 00:25:15,359 --> 00:25:18,040 Speaker 1: findings of the rest of the paper? But let's carry on, 432 00:25:18,080 --> 00:25:20,439 Speaker 1: because I had already chosen this one, so I was like, well, 433 00:25:20,640 --> 00:25:23,119 Speaker 1: let's see it through to the grisly end. So the 434 00:25:23,160 --> 00:25:25,720 Speaker 1: paper actually takes its time getting going, which I appreciate; 435 00:25:26,040 --> 00:25:28,920 Speaker 1: it's kind of like me. The researchers justify their work 436 00:25:28,920 --> 00:25:32,080 Speaker 1: by calling out the need for systematic reviews, essentially pointing 437 00:25:32,080 --> 00:25:36,000 Speaker 1: out that social networking sites are still relatively young and 438 00:25:36,040 --> 00:25:39,760 Speaker 1: that as a result, there's not much research information available 439 00:25:39,840 --> 00:25:42,440 Speaker 1: that you can work from, and that their own paper 440 00:25:42,520 --> 00:25:45,680 Speaker 1: stands as a resource mainly for future research. Like, this 441 00:25:45,760 --> 00:25:49,440 Speaker 1: isn't to draw firm conclusions, but rather to help serve 442 00:25:49,440 --> 00:25:52,320 Speaker 1: as a sort of summary for more than a dozen 443 00:25:52,520 --> 00:25:55,360 Speaker 1: studies conducted in the area, so that people who are 444 00:25:55,720 --> 00:25:59,480 Speaker 1: looking into it further are more readily able to identify relevant studies.
445 00:26:00,560 --> 00:26:03,399 Speaker 1: I don't know how useful that is, honestly, because again 446 00:26:03,760 --> 00:26:06,040 Speaker 1: they also pointed out that when they went through Google 447 00:26:06,160 --> 00:26:09,960 Speaker 1: Scholar to look for scholarly works on the subject of 448 00:26:10,160 --> 00:26:13,920 Speaker 1: mental health and social networks, they found like seventeen or eighteen 449 00:26:13,960 --> 00:26:17,040 Speaker 1: thousand hits. And if there are that many hits, and then 450 00:26:17,040 --> 00:26:21,160 Speaker 1: they selected fifty, I don't know what criteria they used 451 00:26:21,240 --> 00:26:24,240 Speaker 1: to select the initial fifty, apart from wanting 452 00:26:24,240 --> 00:26:27,440 Speaker 1: to avoid duplicates. They selected fifty and narrowed it down 453 00:26:27,440 --> 00:26:29,480 Speaker 1: to sixteen. I'm not sure that that's going to be 454 00:26:29,520 --> 00:26:33,440 Speaker 1: a huge help to future researchers, so I question that 455 00:26:33,560 --> 00:26:38,280 Speaker 1: particular part of the justification. But the research cited in 456 00:26:38,320 --> 00:26:41,840 Speaker 1: the paper is interesting and it really runs the spectrum. 457 00:26:41,960 --> 00:26:44,680 Speaker 1: They summarize what each of the papers is, but they 458 00:26:44,680 --> 00:26:47,080 Speaker 1: don't go into a lot of detail about the findings, 459 00:26:47,080 --> 00:26:50,200 Speaker 1: which is interesting to me. There are studies that concluded 460 00:26:50,240 --> 00:26:52,879 Speaker 1: that there's no real link between social media use and 461 00:26:52,920 --> 00:26:56,440 Speaker 1: mental health, which seems to run counter to the point 462 00:26:56,480 --> 00:26:59,399 Speaker 1: the paper was making.
Others that were cited found that 463 00:26:59,440 --> 00:27:03,800 Speaker 1: social media could exacerbate mental health problems, so the suggestion 464 00:27:03,880 --> 00:27:06,639 Speaker 1: there was that the issues were already present and the 465 00:27:06,720 --> 00:27:09,760 Speaker 1: use of social media made them worse. Some discovered that 466 00:27:10,040 --> 00:27:15,600 Speaker 1: reading posts correlated more with depression than creating posts. So 467 00:27:16,160 --> 00:27:18,679 Speaker 1: it's not just using social networks, but how you are 468 00:27:18,840 --> 00:27:22,160 Speaker 1: using them. Like if you're just doom scrolling, that would 469 00:27:22,160 --> 00:27:25,199 Speaker 1: be associated more with stuff like anxiety and depression, but 470 00:27:25,240 --> 00:27:28,639 Speaker 1: if you are creating, that isn't. A few of the 471 00:27:28,640 --> 00:27:31,479 Speaker 1: studies focused on gender and found that people identifying as 472 00:27:31,520 --> 00:27:34,840 Speaker 1: women were more prone to social media addiction than those 473 00:27:34,880 --> 00:27:38,080 Speaker 1: who identified as men. One study titled the Use of 474 00:27:38,080 --> 00:27:41,320 Speaker 1: Social Media by Australian Pre-Adolescents and its Links with 475 00:27:41,440 --> 00:27:45,560 Speaker 1: Mental Health found that young users of sites like Instagram 476 00:27:45,760 --> 00:27:50,520 Speaker 1: and YouTube reported more body image issues and more eating 477 00:27:50,520 --> 00:27:53,960 Speaker 1: disorders than those who did not use those sites. That 478 00:27:54,040 --> 00:27:55,639 Speaker 1: was something that was really brought to light when the 479 00:27:55,640 --> 00:27:58,840 Speaker 1: Facebook whistleblower came forward a couple of years ago.
And 480 00:27:59,280 --> 00:28:02,000 Speaker 1: the paper goes on to explain that many, but not all, 481 00:28:02,119 --> 00:28:05,480 Speaker 1: of the various studies included in their meta analysis indicated 482 00:28:05,520 --> 00:28:09,679 Speaker 1: a correlation between mental health and social media use. I'll 483 00:28:09,760 --> 00:28:13,320 Speaker 1: expand on that further, and then I'll talk more about Cureus, 484 00:28:13,480 --> 00:28:16,280 Speaker 1: the journal that this was published in. But first let's 485 00:28:16,320 --> 00:28:28,480 Speaker 1: take another quick break. Okay, we're back. So before that break, 486 00:28:28,520 --> 00:28:32,760 Speaker 1: I was talking about how the various studies, most of 487 00:28:32,800 --> 00:28:36,240 Speaker 1: them were indicating some form of correlation between mental health 488 00:28:36,280 --> 00:28:38,840 Speaker 1: and social media use, which, again, seems to go 489 00:28:38,880 --> 00:28:42,040 Speaker 1: along with common sense. I think most people, if you 490 00:28:42,120 --> 00:28:44,480 Speaker 1: ask them, if they were familiar with social networks at 491 00:28:44,480 --> 00:28:46,400 Speaker 1: any rate, they would probably say, yeah, I think that 492 00:28:46,440 --> 00:28:49,920 Speaker 1: if you use social networks a lot, you're probably dealing 493 00:28:49,960 --> 00:28:54,200 Speaker 1: with some mental health issues, challenges like anxiety and depression. However, 494 00:28:54,280 --> 00:28:56,960 Speaker 1: it's dangerous to just go along with common sense, right? Like, 495 00:28:57,240 --> 00:29:00,720 Speaker 1: everyone could have this kind of common sense and still be wrong. 496 00:29:01,160 --> 00:29:03,800 Speaker 1: It could be that once you look into something purely 497 00:29:03,840 --> 00:29:08,360 Speaker 1: from a scientific approach, the links that were believed to 498 00:29:08,360 --> 00:29:11,920 Speaker 1: be there don't actually exist.
I'm thinking of stuff like 499 00:29:12,160 --> 00:29:16,280 Speaker 1: quantum mechanics. Like, the world of quantum mechanics is counterintuitive 500 00:29:16,320 --> 00:29:20,000 Speaker 1: because it doesn't behave according to the same laws as what 501 00:29:20,040 --> 00:29:24,600 Speaker 1: we experience in the classical world. Like, classical physics and 502 00:29:24,680 --> 00:29:28,160 Speaker 1: quantum physics seem to conflict with one another, and it 503 00:29:28,200 --> 00:29:32,360 Speaker 1: can be really hard to grasp certain concepts in quantum 504 00:29:32,360 --> 00:29:36,480 Speaker 1: physics because they run counter to the way we 505 00:29:36,560 --> 00:29:40,160 Speaker 1: experience the world. So there, common sense would fail you 506 00:29:40,440 --> 00:29:43,120 Speaker 1: if you were just to use it to guide your way. 507 00:29:43,320 --> 00:29:46,360 Speaker 1: So again, while common sense might say, yeah, mental 508 00:29:46,400 --> 00:29:51,719 Speaker 1: health and excessive social networking use are dangerously linked, without 509 00:29:51,800 --> 00:29:54,800 Speaker 1: actually studying it, you can't say that definitively. So the 510 00:29:54,880 --> 00:29:58,600 Speaker 1: authors say that a causal relationship is unsupported based on 511 00:29:58,680 --> 00:30:01,600 Speaker 1: the studies at this time. So again they're just kind 512 00:30:01,600 --> 00:30:03,640 Speaker 1: of saying what I said before, which is that, yeah, 513 00:30:03,680 --> 00:30:07,000 Speaker 1: there are these two different factors that appear to be correlated, 514 00:30:07,280 --> 00:30:11,800 Speaker 1: but we can't definitively say one causes the other.
So 515 00:30:12,200 --> 00:30:15,400 Speaker 1: more studies are needed, in other words, and these studies 516 00:30:15,440 --> 00:30:17,760 Speaker 1: need to be designed in order to determine if there 517 00:30:17,840 --> 00:30:21,320 Speaker 1: is an actual causal relationship here, or if both mental 518 00:30:21,360 --> 00:30:24,800 Speaker 1: health issues and an increase in social media use are 519 00:30:24,920 --> 00:30:30,040 Speaker 1: perhaps symptoms of something else, or maybe just a comorbidity. So, 520 00:30:30,080 --> 00:30:31,880 Speaker 1: in other words, the findings say pretty much all of 521 00:30:31,960 --> 00:30:33,840 Speaker 1: what I said earlier in this episode: we don't have 522 00:30:34,000 --> 00:30:37,480 Speaker 1: enough information to make a determination. Let's talk about some 523 00:30:37,600 --> 00:30:41,000 Speaker 1: of the problems I have with this study. For one thing, 524 00:30:41,040 --> 00:30:43,720 Speaker 1: I mean, it doesn't say anything ultimately. I mean that's 525 00:30:43,840 --> 00:30:46,920 Speaker 1: kind of unfair, like saying, oh, it doesn't really say anything, 526 00:30:47,000 --> 00:30:49,840 Speaker 1: or it says exactly what everybody already knows, which is 527 00:30:49,840 --> 00:30:52,040 Speaker 1: that we don't know. But the whole point of 528 00:30:52,040 --> 00:30:55,400 Speaker 1: it was to analyze these other studies and 529 00:30:55,440 --> 00:30:59,240 Speaker 1: to see, like, if there were any common points that 530 00:30:59,360 --> 00:31:03,760 Speaker 1: supported a more firm stance. And ultimately they found that 531 00:31:04,120 --> 00:31:06,800 Speaker 1: it appears that there is a link between mental health 532 00:31:06,800 --> 00:31:10,080 Speaker 1: and social networking use, but what that link is, precisely, 533 00:31:10,600 --> 00:31:14,400 Speaker 1: cannot be determined at this point.
Now, 534 00:31:14,760 --> 00:31:17,680 Speaker 1: I also wanted to talk about Cureus, the journal that 535 00:31:17,760 --> 00:31:20,520 Speaker 1: it was published in. It has, I would argue, a 536 00:31:20,560 --> 00:31:23,800 Speaker 1: bit of a shaky reputation based upon what I have 537 00:31:24,000 --> 00:31:28,520 Speaker 1: seen. It is a peer reviewed journal. That is a 538 00:31:28,520 --> 00:31:31,959 Speaker 1: good thing. In general, it's a good thing. Peer review 539 00:31:32,240 --> 00:31:37,840 Speaker 1: is important in that, if it's done correctly, then papers 540 00:31:37,920 --> 00:31:42,640 Speaker 1: that have issues are less likely to be accepted and published, 541 00:31:42,920 --> 00:31:48,440 Speaker 1: which means they're less likely to muddy the scholarly output 542 00:31:48,800 --> 00:31:53,200 Speaker 1: of researchers. You want good papers to get published so 543 00:31:53,240 --> 00:31:57,280 Speaker 1: that we continue to build knowledge and not make things 544 00:31:57,400 --> 00:32:01,240 Speaker 1: more murky by including stuff that is unsupported or poorly 545 00:32:01,320 --> 00:32:05,680 Speaker 1: researched or poorly designed, whatever it may be. So you 546 00:32:05,800 --> 00:32:09,920 Speaker 1: want a good, robust peer review process. However, peer review 547 00:32:10,000 --> 00:32:15,160 Speaker 1: is a tricky thing to do. Even really notable journals 548 00:32:15,200 --> 00:32:17,920 Speaker 1: that have really good reputations have issues with peer review. 549 00:32:18,040 --> 00:32:21,200 Speaker 1: It's tough. The peer review process over at Cureus is 550 00:32:21,240 --> 00:32:25,320 Speaker 1: reportedly a very fast one. There's a quick turnaround. Now, 551 00:32:25,360 --> 00:32:27,960 Speaker 1: that can be a good thing for researchers who need 552 00:32:28,000 --> 00:32:31,960 Speaker 1: their work to be published.
There are students who need 553 00:32:32,000 --> 00:32:35,720 Speaker 1: to publish works as part of their graduate work before 554 00:32:35,760 --> 00:32:39,719 Speaker 1: they can graduate with an advanced degree. I suspect that 555 00:32:39,880 --> 00:32:44,240 Speaker 1: this article or this paper was such a project. It 556 00:32:44,320 --> 00:32:47,440 Speaker 1: comes across to me as, oh, these were students who 557 00:32:47,520 --> 00:32:50,240 Speaker 1: took a bunch of other studies and then they produced 558 00:32:50,320 --> 00:32:52,840 Speaker 1: this paper. It strikes me that way. I don't know 559 00:32:52,920 --> 00:32:55,440 Speaker 1: that for sure, by the way, that's just the feeling 560 00:32:55,480 --> 00:32:58,040 Speaker 1: I get as I read it. And of course, there 561 00:32:58,040 --> 00:33:01,280 Speaker 1: are also positions and titles that require that the holder of 562 00:33:01,320 --> 00:33:04,360 Speaker 1: that position or title publish work at regular 563 00:33:04,440 --> 00:33:08,479 Speaker 1: intervals or else risk losing their position. Like professors: there 564 00:33:08,480 --> 00:33:10,800 Speaker 1: are a lot of professors at universities who are required 565 00:33:10,800 --> 00:33:13,840 Speaker 1: to publish a certain number of papers per year. That's 566 00:33:14,120 --> 00:33:17,120 Speaker 1: just the expectation, or else they can lose their position. 567 00:33:17,680 --> 00:33:22,960 Speaker 1: And publication takes time, especially for scientific papers. If 568 00:33:23,000 --> 00:33:26,360 Speaker 1: you're talking about scientific or medical papers, that review process 569 00:33:26,360 --> 00:33:29,200 Speaker 1: could take as long as a year, and ultimately there's 570 00:33:29,200 --> 00:33:31,600 Speaker 1: no guarantee that the paper is going to go through.
571 00:33:32,120 --> 00:33:35,360 Speaker 1: So if you're under the gun and you have to publish, 572 00:33:35,400 --> 00:33:38,200 Speaker 1: and you really need your work to get out there 573 00:33:38,280 --> 00:33:40,800 Speaker 1: in order for that to count toward, you know, 574 00:33:40,840 --> 00:33:43,920 Speaker 1: your graduation or holding your job or whatever, going through 575 00:33:43,920 --> 00:33:47,040 Speaker 1: a lengthy peer review and editing process is not high 576 00:33:47,080 --> 00:33:49,960 Speaker 1: on your list of priorities. So a resource that takes 577 00:33:50,160 --> 00:33:52,880 Speaker 1: work and fast tracks it toward publication can be a 578 00:33:53,000 --> 00:33:55,520 Speaker 1: huge help to those who need to have their work published. 579 00:33:55,680 --> 00:33:58,760 Speaker 1: But obviously the flip side of this is that if 580 00:33:58,760 --> 00:34:03,240 Speaker 1: this process is in fact fast-tracked, mistakes can slip through. 581 00:34:03,960 --> 00:34:07,400 Speaker 1: Like the one I mentioned earlier in this episode: it was 582 00:34:07,560 --> 00:34:10,080 Speaker 1: clearly a mistake, and it made it all the way 583 00:34:10,080 --> 00:34:12,719 Speaker 1: through the process. I actually found a few things in 584 00:34:12,760 --> 00:34:16,360 Speaker 1: this paper that struck me as odd or poorly worded. 585 00:34:16,680 --> 00:34:19,719 Speaker 1: Like, there were bits where I thought, that sentence is 586 00:34:19,800 --> 00:34:23,440 Speaker 1: missing something, the syntax doesn't quite work, I'm not entirely 587 00:34:23,520 --> 00:34:27,040 Speaker 1: certain what they were trying to say. So several passages 588 00:34:27,120 --> 00:34:30,280 Speaker 1: in the paper struck me as in need of editing, 589 00:34:30,400 --> 00:34:33,160 Speaker 1: just for the purposes of clarity, if nothing else.
And 590 00:34:33,840 --> 00:34:36,040 Speaker 1: the thought occurred to me that if I had written 591 00:34:36,120 --> 00:34:39,160 Speaker 1: this for HowStuffWorks dot Com and had submitted it, my 592 00:34:39,320 --> 00:34:41,200 Speaker 1: editor would have returned it to me with a note 593 00:34:41,239 --> 00:34:44,959 Speaker 1: that said I needed to rewrite that passage. Then again, 594 00:34:45,600 --> 00:34:48,680 Speaker 1: maybe it's because I'm not a scientist, because I'm reading 595 00:34:48,719 --> 00:34:51,920 Speaker 1: this the way an English major reads a paper. I'm 596 00:34:51,960 --> 00:34:55,200 Speaker 1: not reading it the way a scientific researcher does. And 597 00:34:55,640 --> 00:34:58,600 Speaker 1: that's a fair statement, right? I am not a scientific researcher, 598 00:34:58,840 --> 00:35:03,120 Speaker 1: so maybe I am being unfair with this. I did 599 00:35:03,120 --> 00:35:06,040 Speaker 1: some digging and found there's actually quite a bit 600 00:35:06,080 --> 00:35:09,920 Speaker 1: of disagreement about Cureus in the research space as to 601 00:35:10,560 --> 00:35:14,160 Speaker 1: whether it's a good resource or, like, you know, a 602 00:35:14,280 --> 00:35:18,000 Speaker 1: junk journal or something along those lines. So some people 603 00:35:18,400 --> 00:35:21,080 Speaker 1: have pointed to it as being really helpful if you 604 00:35:21,120 --> 00:35:23,279 Speaker 1: need to get your work published and seen, and that 605 00:35:23,760 --> 00:35:26,040 Speaker 1: when it comes to that, it ends up being an 606 00:35:26,040 --> 00:35:29,880 Speaker 1: incredible resource.
Others have argued that the journal has a 607 00:35:29,920 --> 00:35:32,600 Speaker 1: low rejection rate, meaning it doesn't reject a lot of 608 00:35:32,680 --> 00:35:35,520 Speaker 1: articles right off the bat, and that the fast turnaround 609 00:35:35,600 --> 00:35:39,200 Speaker 1: time means that as a result, they publish a lot 610 00:35:39,280 --> 00:35:42,920 Speaker 1: of low quality studies, or at least lower quality studies. 611 00:35:43,320 --> 00:35:45,920 Speaker 1: And I fell down a rabbit hole that is the 612 00:35:46,040 --> 00:35:49,920 Speaker 1: mire of scientific publishing and how it puts researchers in 613 00:35:49,960 --> 00:35:52,520 Speaker 1: a really tough position, and how a lot of journals 614 00:35:52,600 --> 00:35:55,960 Speaker 1: end up being predatory, right? Like, they end up looking 615 00:35:56,000 --> 00:35:58,640 Speaker 1: to get researchers to spend thousands of dollars on things 616 00:35:58,680 --> 00:36:01,799 Speaker 1: like editing and peer review services, which makes me 617 00:36:01,920 --> 00:36:04,279 Speaker 1: question the whole system, if I'm being honest with you. Now, 618 00:36:04,280 --> 00:36:07,200 Speaker 1: I will say that even the critics of Cureus said, no, 619 00:36:07,280 --> 00:36:09,759 Speaker 1: it's not predatory. It's not like it's one of those 620 00:36:10,040 --> 00:36:13,560 Speaker 1: journals set up to bilk people out of money so 621 00:36:13,600 --> 00:36:16,040 Speaker 1: that they can get their work in print. They're not 622 00:36:16,160 --> 00:36:18,600 Speaker 1: like that, which is good. Like, I'm glad to hear that, 623 00:36:18,920 --> 00:36:22,400 Speaker 1: so I don't want to cast that aspersion on Cureus. 624 00:36:22,560 --> 00:36:25,960 Speaker 1: It does appear to be very much legitimate in that regard.
625 00:36:26,000 --> 00:36:29,400 Speaker 1: It's just that the process being so fast tracked means 626 00:36:29,480 --> 00:36:33,400 Speaker 1: that stuff that shouldn't slip through sometimes does. I have 627 00:36:33,520 --> 00:36:37,319 Speaker 1: not read other papers in Cureus, so I don't know 628 00:36:37,360 --> 00:36:41,400 Speaker 1: how prevalent that is, but just reading this one, I 629 00:36:41,440 --> 00:36:44,239 Speaker 1: thought there were some issues here. So anyway, the reason I 630 00:36:44,280 --> 00:36:46,239 Speaker 1: went through this whole paper was to kind of get 631 00:36:46,680 --> 00:36:51,520 Speaker 1: my head wrapped around what the science says about this, 632 00:36:51,760 --> 00:36:55,920 Speaker 1: because we often will hear things, especially in politics, that 633 00:36:56,200 --> 00:37:00,120 Speaker 1: end up relating to the use of social media and 634 00:37:00,160 --> 00:37:04,040 Speaker 1: social networking sites and how that impacts people's health. And 635 00:37:04,960 --> 00:37:08,080 Speaker 1: while again it seems to go along with common sense, 636 00:37:08,360 --> 00:37:11,360 Speaker 1: I think it's important for us to really recognize that 637 00:37:11,560 --> 00:37:14,120 Speaker 1: we need more research in this area just so that 638 00:37:14,160 --> 00:37:18,360 Speaker 1: we address the issue properly. Right, if the underlying problem 639 00:37:18,760 --> 00:37:23,080 Speaker 1: is not the use of social networks, then limiting people's 640 00:37:23,080 --> 00:37:26,799 Speaker 1: time on social networks, or policing social networks so that 641 00:37:27,000 --> 00:37:30,319 Speaker 1: they cause less harm or perceived harm.
That's not going 642 00:37:30,360 --> 00:37:32,520 Speaker 1: to actually solve the problem if it turns out that 643 00:37:32,560 --> 00:37:35,840 Speaker 1: there's another issue that's really at play here and it 644 00:37:36,000 --> 00:37:41,120 Speaker 1: just manifests both as mental health challenges and a desire 645 00:37:41,120 --> 00:37:43,640 Speaker 1: to use social networks more. If you're just limiting the 646 00:37:43,680 --> 00:37:46,920 Speaker 1: social networks, then you're not really solving that common problem. 647 00:37:47,239 --> 00:37:49,880 Speaker 1: So that's why more studies are really needed. Now, it 648 00:37:49,920 --> 00:37:52,680 Speaker 1: may very well turn out that the overuse 649 00:37:52,920 --> 00:37:56,520 Speaker 1: of social networks does in fact impact mental health 650 00:37:56,520 --> 00:37:59,880 Speaker 1: in a negative way, and thus by limiting your exposure 651 00:37:59,880 --> 00:38:02,480 Speaker 1: to social networks you can improve your mental health. That 652 00:38:02,600 --> 00:38:06,560 Speaker 1: might be true, but without the actual studies to support that, 653 00:38:06,640 --> 00:38:08,640 Speaker 1: we don't know for sure, and we're just kind of 654 00:38:08,640 --> 00:38:11,160 Speaker 1: stumbling around in the dark trying to come to a 655 00:38:11,200 --> 00:38:15,000 Speaker 1: solution that may or may not address the problems we have. 656 00:38:15,360 --> 00:38:17,440 Speaker 1: And there are better ways to go about doing that, 657 00:38:17,880 --> 00:38:23,400 Speaker 1: and more scientific research is certainly one of those ways.
Hopefully, 658 00:38:23,640 --> 00:38:26,319 Speaker 1: the research that's done in the future will be done 659 00:38:26,320 --> 00:38:29,160 Speaker 1: in such a way that the methodologies will be clear 660 00:38:29,640 --> 00:38:32,440 Speaker 1: and replicable, so that if someone else wants to 661 00:38:32,440 --> 00:38:34,360 Speaker 1: do the same study, they're going to get more or 662 00:38:34,440 --> 00:38:36,799 Speaker 1: less the same sort of results, and that we can 663 00:38:36,880 --> 00:38:42,200 Speaker 1: then draw firm conclusions and create real solutions from that work. 664 00:38:42,520 --> 00:38:45,600 Speaker 1: Science is tricky. I mean, ultimately it's not. Science is 665 00:38:45,640 --> 00:38:47,640 Speaker 1: not tricky if you boil it 666 00:38:47,680 --> 00:38:51,440 Speaker 1: down to its core principles: 667 00:38:51,520 --> 00:38:54,680 Speaker 1: you're asking questions, you're designing tests to answer 668 00:38:54,719 --> 00:38:57,800 Speaker 1: those questions, and you're coming up with answers. That's pretty simple, 669 00:38:58,040 --> 00:39:01,239 Speaker 1: but going about it ends up being a lot more 670 00:39:01,320 --> 00:39:05,600 Speaker 1: work and pretty complicated. But I hope you appreciated this 671 00:39:05,680 --> 00:39:10,799 Speaker 1: episode and the look into what it actually means 672 00:39:11,200 --> 00:39:13,319 Speaker 1: to read one of these studies. This one, I think, 673 00:39:13,360 --> 00:39:16,800 Speaker 1: was almost like Baby's first study for me because, 674 00:39:17,040 --> 00:39:19,640 Speaker 1: again, it was a meta-analysis. It didn't actually dive 675 00:39:19,719 --> 00:39:23,600 Speaker 1: into things like statistical analysis or anything like that. Like 676 00:39:23,640 --> 00:39:26,480 Speaker 1: there were no complicated formulas 677 00:39:26,520 --> 00:39:30,360 Speaker 1: that I needed to read over.
I was just reading conclusions 678 00:39:30,400 --> 00:39:34,879 Speaker 1: about other studies, so this was a pretty simple one. 679 00:39:35,239 --> 00:39:38,680 Speaker 1: But yeah, it gave me a deeper appreciation for the 680 00:39:38,760 --> 00:39:41,840 Speaker 1: challenges that people in the field face when they're trying 681 00:39:41,880 --> 00:39:46,280 Speaker 1: to design their studies and publish their work, which wasn't 682 00:39:46,400 --> 00:39:49,120 Speaker 1: my intent when I started out on this episode, but 683 00:39:49,160 --> 00:39:51,680 Speaker 1: that's where I ended up. I hope all of you 684 00:39:51,719 --> 00:39:55,279 Speaker 1: out there are doing really well, and I will talk 685 00:39:55,320 --> 00:40:06,120 Speaker 1: to you again really soon. Tech Stuff is an iHeartRadio production. 686 00:40:06,400 --> 00:40:11,440 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 687 00:40:11,560 --> 00:40:13,520 Speaker 1: or wherever you listen to your favorite shows.