1 00:00:00,120 --> 00:00:02,480 Speaker 1: I think it's a problem that we have 2 00:00:02,680 --> 00:00:06,920 Speaker 1: become used to having free digital services, and free in 3 00:00:06,960 --> 00:00:09,520 Speaker 1: quotation marks, because of course we've been really paying with our 4 00:00:09,600 --> 00:00:16,120 Speaker 1: private data. These days, there seems to be a smartphone 5 00:00:16,120 --> 00:00:19,520 Speaker 1: app for just about everything. One of the more popular 6 00:00:19,600 --> 00:00:23,400 Speaker 1: categories is apps that track menstrual cycles. More than a 7 00:00:23,480 --> 00:00:27,040 Speaker 1: hundred million women around the world already use them. Some 8 00:00:27,120 --> 00:00:30,320 Speaker 1: are trying to get pregnant, others are trying to avoid 9 00:00:30,360 --> 00:00:33,600 Speaker 1: getting pregnant, and many just want to get a better handle on 10 00:00:33,680 --> 00:00:38,440 Speaker 1: what their bodies are doing. But these free apps have 11 00:00:38,640 --> 00:00:43,320 Speaker 1: trade-offs. Some use sensitive health information to place targeted ads. 12 00:00:44,520 --> 00:00:47,479 Speaker 1: The apps all need to make money, and that might 13 00:00:47,560 --> 00:00:50,400 Speaker 1: mean, in a big-picture sense, that their main product is 14 00:00:50,440 --> 00:00:57,480 Speaker 1: actually their users' data. Welcome to Prognosis, Bloomberg's podcast about 15 00:00:57,480 --> 00:01:01,280 Speaker 1: the intersection of health and technology and the unexpected places 16 00:01:01,320 --> 00:01:04,800 Speaker 1: that is taking us. I'm your host, Michelle Fay Cortez. 17 00:01:06,360 --> 00:01:09,000 Speaker 1: For most of us, giving away our data is just 18 00:01:09,040 --> 00:01:11,199 Speaker 1: the deal we make when we're using a free app.
19 00:01:11,880 --> 00:01:14,520 Speaker 1: But a growing number of women are saying no, thank you, 20 00:01:15,280 --> 00:01:18,520 Speaker 1: and some are taking matters into their own hands. Bloomberg's 21 00:01:18,560 --> 00:01:21,200 Speaker 1: Naomi Kresge went to meet a group of feminist coders. 22 00:01:21,640 --> 00:01:25,120 Speaker 1: Our story starts at the Rainbow Factory, a community space 23 00:01:25,160 --> 00:01:33,720 Speaker 1: in Berlin's hot spot for startups, the Kreuzberg neighborhood. I'm 24 00:01:33,720 --> 00:01:37,520 Speaker 1: standing in the cafe at the community space. It's packed 25 00:01:37,800 --> 00:01:42,400 Speaker 1: with about two dozen people crammed around mismatched schoolroom-style 26 00:01:42,480 --> 00:01:46,959 Speaker 1: tables and chairs. Someone just popped open a bottle of 27 00:01:47,040 --> 00:01:50,400 Speaker 1: East German sparkling wine behind the bar, and they're handing 28 00:01:50,400 --> 00:01:55,760 Speaker 1: it out for free. People are here to celebrate. Everyone 29 00:01:55,840 --> 00:01:59,280 Speaker 1: stops toasting each other and settles down, turning their attention 30 00:01:59,320 --> 00:02:02,520 Speaker 1: to a projection screen. Next to that screen is 31 00:02:02,600 --> 00:02:07,080 Speaker 1: Marie Kochsiek, a thirty-year-old sociologist and software developer. 32 00:02:07,960 --> 00:02:10,679 Speaker 1: She pulls up a slide with a grid of logos 33 00:02:10,880 --> 00:02:14,040 Speaker 1: for different period-tracking apps. Don't get me wrong, I 34 00:02:14,360 --> 00:02:17,359 Speaker 1: don't think pink is a bad color. I just think 35 00:02:17,400 --> 00:02:22,440 Speaker 1: it's not the only one. And I also think flowers, 36 00:02:22,560 --> 00:02:27,600 Speaker 1: or hearts, or butterflies, or this very 37 00:02:27,760 --> 00:02:32,480 Speaker 1: cute girl with these big eyes.
They don't really 38 00:02:32,840 --> 00:02:36,480 Speaker 1: represent me, they don't really represent us. Marie is one 39 00:02:36,560 --> 00:02:39,800 Speaker 1: third of the leadership team for Bloody Health, the coding 40 00:02:39,840 --> 00:02:44,280 Speaker 1: collective that just released the first female-developed open source 41 00:02:44,400 --> 00:02:48,680 Speaker 1: menstrual cycle tracker. In English, it's called drip. And this 42 00:02:48,760 --> 00:02:52,320 Speaker 1: is the launch party. Now, after all these things, oh no, 43 00:02:52,600 --> 00:02:57,720 Speaker 1: there's like no privacy, no security, black-box algorithms, gender stereotypes. 44 00:02:57,880 --> 00:03:01,000 Speaker 1: Now, this is drip. The seeds for the project were 45 00:03:01,040 --> 00:03:06,040 Speaker 1: sown back when Marie got her first smartphone. It 46 00:03:06,120 --> 00:03:08,880 Speaker 1: was an Android. She was a bit of a late 47 00:03:08,960 --> 00:03:12,200 Speaker 1: adopter on the smartphone, but she knew what she wanted 48 00:03:12,200 --> 00:03:15,040 Speaker 1: to download right away. I looked into the Play Store 49 00:03:15,080 --> 00:03:18,919 Speaker 1: and I found a period app, and I just 50 00:03:18,960 --> 00:03:22,160 Speaker 1: took like the first one. Yeah, I think it was WomanLog, 51 00:03:22,400 --> 00:03:26,160 Speaker 1: which also has the icon with a pink flower on 52 00:03:26,200 --> 00:03:28,960 Speaker 1: the black screen, and it was the first 53 00:03:29,000 --> 00:03:33,200 Speaker 1: app that I'd seen. And the very same day, I 54 00:03:33,280 --> 00:03:36,120 Speaker 1: met with a friend of mine and I was very 55 00:03:36,200 --> 00:03:38,480 Speaker 1: excited and I told her, oh my god, I just 56 00:03:38,600 --> 00:03:42,320 Speaker 1: found this cool period app. It's so useful. I love it. 57 00:03:42,680 --> 00:03:45,800 Speaker 1: If you have ever had a period, you know why.
58 00:03:45,880 --> 00:03:48,520 Speaker 1: Marie was so happy at first when she found that app. 59 00:03:49,360 --> 00:03:53,200 Speaker 1: Periods can turn up at the most annoying times. They 60 00:03:53,200 --> 00:03:57,200 Speaker 1: can ruin your sheets and your Tinder dates. But period 61 00:03:57,240 --> 00:03:59,839 Speaker 1: apps promise to do more than just help you keep 62 00:04:00,000 --> 00:04:03,360 Speaker 1: track of bleeding. The broader pitch is that if you 63 00:04:03,440 --> 00:04:06,160 Speaker 1: give the app enough details, you can get a 64 00:04:06,200 --> 00:04:09,680 Speaker 1: heads-up on all the other, sometimes frustrating, changes that 65 00:04:09,760 --> 00:04:13,360 Speaker 1: come with your cycle. Why am I feeling depressed? What 66 00:04:13,480 --> 00:04:16,320 Speaker 1: the heck is going on with my digestion? Could I 67 00:04:16,360 --> 00:04:19,919 Speaker 1: be pregnant? The desire to connect the dots between the 68 00:04:20,000 --> 00:04:24,039 Speaker 1: period and everything else has helped make the apps so popular. 69 00:04:24,720 --> 00:04:27,680 Speaker 1: They're consistently in the top ten most downloaded in the 70 00:04:27,680 --> 00:04:34,520 Speaker 1: app store. Just about every commercial period app lets you 71 00:04:34,560 --> 00:04:37,000 Speaker 1: do more than just click a calendar day so you 72 00:04:37,040 --> 00:04:40,719 Speaker 1: remember when your last one happened. They also make predictions 73 00:04:40,720 --> 00:04:43,719 Speaker 1: about when your next period will come. They let you 74 00:04:43,760 --> 00:04:48,440 Speaker 1: track everything from sex and boozy nights out to ovulation tests, 75 00:04:48,880 --> 00:04:54,839 Speaker 1: vaginal discharge, and resting body temperature. But apps don't necessarily 76 00:04:55,000 --> 00:04:58,960 Speaker 1: have to meet the privacy standards of, say, doctor or 77 00:04:59,040 --> 00:05:03,400 Speaker 1: hospital records.
In fact, as one of Marie's friends told her, 78 00:05:03,839 --> 00:05:06,800 Speaker 1: the commercial apps can use your data for plenty of 79 00:05:06,880 --> 00:05:09,359 Speaker 1: things that do not have anything to do with keeping 80 00:05:09,400 --> 00:05:13,240 Speaker 1: track of your period. She first looked at me. She 81 00:05:13,360 --> 00:05:15,200 Speaker 1: was a very close friend, so she looked at me 82 00:05:15,200 --> 00:05:19,360 Speaker 1: and said, like, Marie, have you read the privacy policy 83 00:05:19,400 --> 00:05:22,919 Speaker 1: of this app? And I was a bit annoyed 84 00:05:22,960 --> 00:05:25,000 Speaker 1: at her because she kind of took away my joy 85 00:05:25,279 --> 00:05:29,279 Speaker 1: in that moment. But the following day I deleted 86 00:05:29,320 --> 00:05:32,120 Speaker 1: the app. All of that data can be used for 87 00:05:32,160 --> 00:05:35,760 Speaker 1: the same type of business that Facebook, Instagram and other 88 00:05:35,839 --> 00:05:40,680 Speaker 1: major apps use it for. Facebook is already controversial, but 89 00:05:40,800 --> 00:05:43,720 Speaker 1: here we're talking about even more personal data than what 90 00:05:43,800 --> 00:05:48,400 Speaker 1: most people share with social media. Joana Varon, a Brazilian 91 00:05:48,440 --> 00:05:52,600 Speaker 1: researcher and digital rights activist who leads the group Coding Rights, explains: 92 00:05:53,120 --> 00:05:55,480 Speaker 1: So it's a lot, a lot, a lot of information 93 00:05:55,720 --> 00:05:58,799 Speaker 1: that is collected, and the information that you get back 94 00:05:59,440 --> 00:06:02,960 Speaker 1: is when's your period and when's your fertile moment. 95 00:06:03,400 --> 00:06:06,159 Speaker 1: It doesn't need to ask all those questions to have 96 00:06:06,560 --> 00:06:12,080 Speaker 1: this outcome as information. So of course this is 97 00:06:12,120 --> 00:06:15,080 Speaker 1: going somewhere else.
And then if you go and look 98 00:06:15,080 --> 00:06:18,080 Speaker 1: at the terms of service, they say we might give 99 00:06:18,120 --> 00:06:24,400 Speaker 1: the data to third parties for advertisement purposes 100 00:06:25,200 --> 00:06:28,880 Speaker 1: and for whatever purpose that's not the purpose of the app. 101 00:06:29,400 --> 00:06:32,800 Speaker 1: Joana's group did a study of period apps called Menstruapps: 102 00:06:32,880 --> 00:06:36,719 Speaker 1: how to turn your period into money for others. 103 00:06:37,720 --> 00:06:41,600 Speaker 1: They found that information users give the apps can be 104 00:06:41,680 --> 00:06:46,520 Speaker 1: shared with third parties, including social media platforms, marketing agencies, 105 00:06:46,960 --> 00:06:53,000 Speaker 1: research institutes, employers, and Google Analytics. Flo, the biggest period 106 00:06:53,040 --> 00:06:56,400 Speaker 1: app in the US, has collected more than thirteen billion 107 00:06:56,520 --> 00:07:00,520 Speaker 1: data points about its users. Flo says the data helps 108 00:07:00,560 --> 00:07:03,599 Speaker 1: women monitor irregular periods and has helped a lot of 109 00:07:03,600 --> 00:07:07,200 Speaker 1: its users get pregnant. The app, based out of Minsk 110 00:07:07,279 --> 00:07:10,920 Speaker 1: in Belarus, got into hot water recently over sharing some 111 00:07:11,040 --> 00:07:15,200 Speaker 1: user data with Facebook. After an uproar, the company said 112 00:07:15,240 --> 00:07:18,239 Speaker 1: it was only using Facebook analytics to improve the user 113 00:07:18,320 --> 00:07:23,040 Speaker 1: experience and that it would stop. But in China, where 114 00:07:23,040 --> 00:07:26,160 Speaker 1: the biggest period apps in the world are based, companies 115 00:07:26,200 --> 00:07:30,119 Speaker 1: have gone much further.
Dayima, one of the market leaders there, 116 00:07:30,400 --> 00:07:33,160 Speaker 1: said it crunches its users' data to find out whether 117 00:07:33,200 --> 00:07:39,320 Speaker 1: to show them ads for tampons or ads for ovulation kits. Still, 118 00:07:39,560 --> 00:07:43,160 Speaker 1: Joana's group found that when it comes to privacy policies, 119 00:07:43,400 --> 00:07:47,880 Speaker 1: not all commercial period apps are created equal. They cited Clue, 120 00:07:48,000 --> 00:07:51,560 Speaker 1: a Berlin-based period app that's especially popular in Europe, 121 00:07:52,040 --> 00:07:56,000 Speaker 1: as an example of a clear privacy policy. Clue also 122 00:07:56,080 --> 00:07:58,840 Speaker 1: lets you store your health tracking data on your phone 123 00:07:59,440 --> 00:08:02,320 Speaker 1: without sharing it with the company, as long as you 124 00:08:02,440 --> 00:08:06,320 Speaker 1: do not set up a user account. I wanted to 125 00:08:06,440 --> 00:08:10,800 Speaker 1: get an industry perspective on this open source backlash against 126 00:08:10,800 --> 00:08:15,120 Speaker 1: commercial period apps, so I went to visit Clue CEO 127 00:08:15,800 --> 00:08:20,000 Speaker 1: Ida Tin. I never started this company because I 128 00:08:20,440 --> 00:08:23,000 Speaker 1: wanted to build a business. I wanted to solve a problem, 129 00:08:23,040 --> 00:08:26,000 Speaker 1: and so I think there are definitely many nasty ways 130 00:08:26,040 --> 00:08:28,040 Speaker 1: that you can be commercial in this industry, and that 131 00:08:28,200 --> 00:08:32,160 Speaker 1: is happening. I do understand their sentiment. I still believe 132 00:08:32,240 --> 00:08:34,920 Speaker 1: that there must be a different way.
I think it's 133 00:08:34,920 --> 00:08:37,720 Speaker 1: a problem that we have become used to 134 00:08:37,800 --> 00:08:42,640 Speaker 1: having free digital services, and free in quotation marks, because of course 135 00:08:42,679 --> 00:08:46,600 Speaker 1: we've been really paying with our private data. Clue's terms of service 136 00:08:46,760 --> 00:08:51,040 Speaker 1: allow user data to be shared with academic researchers, and 137 00:08:51,160 --> 00:08:57,240 Speaker 1: it doesn't explicitly rule out commercial use. However, Ida 138 00:08:57,360 --> 00:09:00,800 Speaker 1: said she does not want to run targeted ads or 139 00:09:00,880 --> 00:09:04,840 Speaker 1: sell user data. But Clue has thirty million dollars in 140 00:09:04,920 --> 00:09:08,280 Speaker 1: capital from investors, and she's wrestling right now with how 141 00:09:08,360 --> 00:09:12,760 Speaker 1: to turn her period app into a sustainable business. She 142 00:09:12,760 --> 00:09:16,240 Speaker 1: wouldn't talk about exactly how that might work, except to 143 00:09:16,240 --> 00:09:18,960 Speaker 1: say that she has three concrete ideas that she's working 144 00:09:19,000 --> 00:09:24,040 Speaker 1: on right now. Clue has already experimented with a subscription model, 145 00:09:24,720 --> 00:09:28,560 Speaker 1: charging women about one dollar a month for more data analysis. 146 00:09:30,200 --> 00:09:34,160 Speaker 1: Marie started seriously thinking about period apps after doing a 147 00:09:34,200 --> 00:09:37,800 Speaker 1: study of Clue. She was working on her master's thesis 148 00:09:37,800 --> 00:09:43,079 Speaker 1: in sociology, and she interviewed ten women about 149 00:09:43,160 --> 00:09:45,640 Speaker 1: how they used the app and what they were getting 150 00:09:45,640 --> 00:09:50,160 Speaker 1: out of it. Some were bisexual, some heterosexual. Some wanted 151 00:09:50,200 --> 00:09:53,079 Speaker 1: to get pregnant, and some did not.
They were all 152 00:09:53,120 --> 00:09:56,040 Speaker 1: getting something different out of the app. At some point, 153 00:09:56,040 --> 00:09:58,280 Speaker 1: I had like nine different period apps on my phone 154 00:09:58,320 --> 00:10:01,760 Speaker 1: and I was trying to see how different they are. 155 00:10:03,320 --> 00:10:05,680 Speaker 1: And that's where we get back to the idea of 156 00:10:05,720 --> 00:10:09,000 Speaker 1: a black box. Marie had all these different apps on 157 00:10:09,000 --> 00:10:12,200 Speaker 1: her phone, but she could not see into any of them. 158 00:10:12,200 --> 00:10:15,800 Speaker 1: She didn't have access to any of the code. Even Clue, 159 00:10:15,920 --> 00:10:19,679 Speaker 1: with its easy-to-understand privacy policy, didn't let her 160 00:10:19,760 --> 00:10:23,559 Speaker 1: see what was going on behind the screens. If we 161 00:10:23,720 --> 00:10:27,240 Speaker 1: use technology and we think it's useful and we 162 00:10:27,280 --> 00:10:30,400 Speaker 1: think it makes our life easier, but at the same 163 00:10:30,480 --> 00:10:34,240 Speaker 1: time we can't look behind the scenes, we can't understand 164 00:10:35,360 --> 00:10:39,200 Speaker 1: what is actually happening with the data we generate. What 165 00:10:39,400 --> 00:10:42,120 Speaker 1: is the code actually about? It became more and more 166 00:10:42,120 --> 00:10:46,080 Speaker 1: important to me to rely on open 167 00:10:46,120 --> 00:10:50,840 Speaker 1: source software. And so I was in a way 168 00:10:51,559 --> 00:10:55,000 Speaker 1: like unsatisfied with period apps, and I think with 169 00:10:55,080 --> 00:11:01,520 Speaker 1: health tracking apps in general. By September, Marie was 170 00:11:01,559 --> 00:11:04,640 Speaker 1: toying with the idea of writing her own open source 171 00:11:04,679 --> 00:11:08,920 Speaker 1: period app.
She had already linked up with one potential collaborator, 172 00:11:09,440 --> 00:11:13,839 Speaker 1: a thirty-year-old mathematician named Tina Baumann. Marie turned 173 00:11:13,840 --> 00:11:16,920 Speaker 1: to Twitter to see if anyone else was interested in 174 00:11:17,000 --> 00:11:20,920 Speaker 1: making it a more serious project. She tweeted in German 175 00:11:21,120 --> 00:11:24,160 Speaker 1: something along the lines of: who wants to fiddle around 176 00:11:24,160 --> 00:11:28,160 Speaker 1: with an open source menstrual cycle app with me? She 177 00:11:28,200 --> 00:11:31,000 Speaker 1: included a GIF of a little girl with a lot 178 00:11:31,040 --> 00:11:35,840 Speaker 1: of attitude. Yeah, it's just a GIF showing a 179 00:11:35,880 --> 00:11:41,080 Speaker 1: small girl very confidently throwing away three tampons. 180 00:11:41,120 --> 00:11:44,560 Speaker 1: I guess I just thought in this moment, okay, let's 181 00:11:44,640 --> 00:11:46,840 Speaker 1: just give it a try, and let's see what happens 182 00:11:46,880 --> 00:11:51,880 Speaker 1: if I tweet about it. And I got 183 00:11:52,120 --> 00:11:54,840 Speaker 1: quite a few retweets, quite a few likes. More than 184 00:11:54,880 --> 00:11:59,280 Speaker 1: a dozen people responded. Among them was Julia Frazel, a 185 00:11:59,400 --> 00:12:03,120 Speaker 1: thirty-four-year-old software developer. She became the third 186 00:12:03,240 --> 00:12:06,840 Speaker 1: member of the team and brought the coding experience they needed. 187 00:12:07,400 --> 00:12:12,040 Speaker 1: But there was one important thing they were still missing: money. 188 00:12:12,120 --> 00:12:15,280 Speaker 1: In order to work full time on the period app project, 189 00:12:15,520 --> 00:12:18,480 Speaker 1: they needed some other way to pay the rent. Then 190 00:12:18,559 --> 00:12:23,600 Speaker 1: they got a German government grant called the Prototype Fund.
191 00:12:24,280 --> 00:12:28,600 Speaker 1: It's designed to promote open source technology and public interest projects. 192 00:12:28,880 --> 00:12:31,720 Speaker 1: And it felt like, okay, this is it. We 193 00:12:31,840 --> 00:12:36,559 Speaker 1: were really super motivated by this fund and we thought, okay, 194 00:12:36,600 --> 00:12:39,679 Speaker 1: this is our chance, now or never. Marie and her 195 00:12:39,760 --> 00:12:42,760 Speaker 1: team already had a list of ways they knew they 196 00:12:42,840 --> 00:12:46,000 Speaker 1: wanted their period app to be different. Some of them 197 00:12:46,000 --> 00:12:49,840 Speaker 1: had to do with design, things like no gendered colors, pictures, 198 00:12:50,000 --> 00:12:52,880 Speaker 1: or text. Some had to do with making the tech 199 00:12:53,000 --> 00:12:56,400 Speaker 1: open source, and some had to do with how the 200 00:12:56,440 --> 00:12:59,720 Speaker 1: app itself would work and what kinds of predictions it 201 00:12:59,720 --> 00:13:06,160 Speaker 1: would make, or not make. Beyond the privacy issues, predicting 202 00:13:06,280 --> 00:13:10,440 Speaker 1: when women might get pregnant is probably the most controversial 203 00:13:10,520 --> 00:13:14,760 Speaker 1: thing about period apps. Most of the commercial apps you'll 204 00:13:14,760 --> 00:13:17,120 Speaker 1: find in the app store will show you a few 205 00:13:17,200 --> 00:13:19,679 Speaker 1: days or a week when you should be most fertile. 206 00:13:20,600 --> 00:13:23,560 Speaker 1: Some even give the likelihood of whether you'll get pregnant 207 00:13:23,559 --> 00:13:25,880 Speaker 1: on a certain day, down to the tenth of a percent, 208 00:13:26,400 --> 00:13:30,839 Speaker 1: based purely on when your last periods started. But everyone's 209 00:13:30,880 --> 00:13:35,960 Speaker 1: cycle is slightly different and some irregularity is common.
Cycles 210 00:13:35,960 --> 00:13:39,360 Speaker 1: can last anywhere from twenty-one days to forty days. 211 00:13:40,200 --> 00:13:43,439 Speaker 1: Period apps do not come close to being fail-proof 212 00:13:43,520 --> 00:13:48,640 Speaker 1: as contraceptives. The drip team decided they needed to base 213 00:13:48,720 --> 00:13:53,400 Speaker 1: their app on science. They turned to Petra Frank-Herrmann, a 214 00:13:53,440 --> 00:13:58,280 Speaker 1: gynecologist at Heidelberg University Hospital. Petra is an expert on 215 00:13:58,320 --> 00:14:03,880 Speaker 1: what's called FAB, or fertility awareness based methods, for determining 216 00:14:04,000 --> 00:14:08,040 Speaker 1: when women can get pregnant. Recently, she has become interested 217 00:14:08,120 --> 00:14:11,920 Speaker 1: in period apps. For more than thirty years, I've been doing 218 00:14:12,000 --> 00:14:16,000 Speaker 1: research on the fertile window of the female cycle and 219 00:14:16,120 --> 00:14:19,560 Speaker 1: on FAB methods, and of course I'm very interested in 220 00:14:19,600 --> 00:14:24,320 Speaker 1: apps supporting women in using those methods. And my second 221 00:14:24,400 --> 00:14:29,920 Speaker 1: motivation is that, as a gynecologist, I meet young 222 00:14:30,000 --> 00:14:32,920 Speaker 1: women and teens, or I've met already young women and 223 00:14:32,960 --> 00:14:38,520 Speaker 1: teens, who experienced unplanned pregnancies with those apps, even the 224 00:14:38,600 --> 00:14:41,640 Speaker 1: daughter of a close friend of mine. Petra's team surveyed 225 00:14:41,680 --> 00:14:44,480 Speaker 1: a dozen apps that were already on the market last 226 00:14:44,560 --> 00:14:47,760 Speaker 1: year and which claimed to help women choose the best 227 00:14:47,800 --> 00:14:50,040 Speaker 1: time of the month to have sex in order to 228 00:14:50,080 --> 00:14:53,880 Speaker 1: get pregnant.
She graded them on a thirty-point scale 229 00:14:54,240 --> 00:14:57,760 Speaker 1: based on how they determined the fertile window, what study 230 00:14:57,800 --> 00:15:01,600 Speaker 1: results existed to back up that method, what study 231 00:15:01,640 --> 00:15:05,000 Speaker 1: results they had to show the app works, and whether 232 00:15:05,040 --> 00:15:09,440 Speaker 1: they offer any counseling to users. All the calendar-based 233 00:15:09,440 --> 00:15:14,720 Speaker 1: apps she reviewed, including Clue and Flo, got zero out 234 00:15:14,720 --> 00:15:19,520 Speaker 1: of thirty points. Even Natural Cycles, a Swedish app that 235 00:15:19,600 --> 00:15:23,440 Speaker 1: asks women to input their resting body temperature, or basal 236 00:15:23,480 --> 00:15:27,520 Speaker 1: body temperature, for better accuracy, got only two out of 237 00:15:27,560 --> 00:15:31,840 Speaker 1: thirty points in the survey. Part of the reason Petra's 238 00:15:31,880 --> 00:15:35,480 Speaker 1: team graded all the apps so low is that there's not 239 00:15:35,600 --> 00:15:38,840 Speaker 1: much independent research to show whether they work. The other 240 00:15:38,960 --> 00:15:42,680 Speaker 1: reason is that when it comes to periods, past performance 241 00:15:42,840 --> 00:15:45,480 Speaker 1: is no guarantee of what will happen in the future. 242 00:15:46,280 --> 00:15:50,320 Speaker 1: Most of the apps do predictions, even if they use, 243 00:15:50,480 --> 00:15:55,560 Speaker 1: for example, parameters like the temperature. This is, for example, 244 00:15:55,640 --> 00:16:01,200 Speaker 1: the app Natural Cycles. It does predictions as well, and 245 00:16:01,240 --> 00:16:06,840 Speaker 1: in our opinion, therefore they are useless for contraception. When 246 00:16:06,880 --> 00:16:10,200 Speaker 1: you say predictions, do you mean making predictions based on 247 00:16:10,320 --> 00:16:15,320 Speaker 1: past cycles?
They base the predictions of fertile days on the 248 00:16:15,320 --> 00:16:19,440 Speaker 1: basis of past cycles. Okay, and that is not 249 00:16:19,560 --> 00:16:22,920 Speaker 1: going to help you not get pregnant? Yes. Or get 250 00:16:22,920 --> 00:16:29,680 Speaker 1: pregnant? Yes, yes, due to the variation of ovulation 251 00:16:29,920 --> 00:16:35,360 Speaker 1: day and fertile phase, even in women with fairly 252 00:16:35,440 --> 00:16:38,160 Speaker 1: regular cycles. I think that's something that a lot of women 253 00:16:38,200 --> 00:16:42,120 Speaker 1: don't realize. Yes, yes, that's true. This is 254 00:16:42,160 --> 00:16:46,000 Speaker 1: the point. So do you think people should be trusting 255 00:16:46,080 --> 00:16:51,120 Speaker 1: their family planning to these apps? No, because most 256 00:16:51,160 --> 00:16:56,240 Speaker 1: apps are lacking scientific standards, or they showed poor results 257 00:16:56,360 --> 00:17:01,080 Speaker 1: up to now. Your body can actually tell you more 258 00:17:01,400 --> 00:17:04,760 Speaker 1: about when you're likely to get pregnant. There's a method 259 00:17:04,800 --> 00:17:08,200 Speaker 1: that's been around for decades called the symptothermal method. 260 00:17:09,000 --> 00:17:13,040 Speaker 1: It involves taking your resting body temperature every morning and 261 00:17:13,119 --> 00:17:16,680 Speaker 1: checking your vaginal secretions to see when the cervical mucus 262 00:17:16,840 --> 00:17:21,440 Speaker 1: turns clear and stretchy like egg whites. It's a lot 263 00:17:21,480 --> 00:17:24,359 Speaker 1: more work than ticking off days on a calendar, but 264 00:17:24,520 --> 00:17:27,800 Speaker 1: it fits the ethos of the project, which is helping 265 00:17:27,840 --> 00:17:31,280 Speaker 1: women learn about their own bodies and take control 266 00:17:31,359 --> 00:17:34,359 Speaker 1: of their own fertility.
I think that's the point, that 267 00:17:34,400 --> 00:17:38,120 Speaker 1: you can't really predict when ovulation will happen, but you can 268 00:17:38,280 --> 00:17:42,160 Speaker 1: only like watch your body symptoms to see when it happened, 269 00:17:42,760 --> 00:17:44,840 Speaker 1: and so you can only be sure after it happened. 270 00:17:44,880 --> 00:17:47,640 Speaker 1: But like, I don't know, bodies are different. It also depends 271 00:17:47,720 --> 00:17:55,440 Speaker 1: on your stress level, or, yeah, I think time zones, traveling, sleeping. 272 00:17:55,840 --> 00:17:58,720 Speaker 1: There's so much that can influence ovulation, the 273 00:17:58,760 --> 00:18:02,520 Speaker 1: time of ovulation. That was Tina, the mathematician on the 274 00:18:02,600 --> 00:18:06,080 Speaker 1: drip team. Before the team could start work on the 275 00:18:06,080 --> 00:18:12,320 Speaker 1: project in earnest, something ironic happened: Tina got pregnant. I 276 00:18:12,400 --> 00:18:15,960 Speaker 1: was kind of afraid to tell them because I didn't 277 00:18:16,000 --> 00:18:17,480 Speaker 1: know if this would mean like the end of the 278 00:18:17,520 --> 00:18:20,080 Speaker 1: project for me. I was sitting down in the kitchen, like, 279 00:18:20,119 --> 00:18:24,520 Speaker 1: I have to tell you something. And they were like, 280 00:18:24,520 --> 00:18:27,920 Speaker 1: oh, how cool. And I'm like, yeah, 281 00:18:27,960 --> 00:18:29,920 Speaker 1: I don't know what this means for the project, and 282 00:18:29,960 --> 00:18:32,760 Speaker 1: they're like, oh, we'll make it work. The team finally 283 00:18:32,800 --> 00:18:35,720 Speaker 1: started work on their app in April of last year. 284 00:18:36,320 --> 00:18:39,880 Speaker 1: In late June, Tina had her baby.
A few weeks 285 00:18:39,960 --> 00:18:43,520 Speaker 1: after that, they were back to weekly meetings at Tina's apartment, 286 00:18:44,040 --> 00:18:47,960 Speaker 1: working on the project while she breastfed. But the 287 00:18:47,960 --> 00:18:51,119 Speaker 1: prototype funding from the German government was only good for 288 00:18:51,160 --> 00:18:54,439 Speaker 1: about six months of work, and the team knew that 289 00:18:54,520 --> 00:18:57,760 Speaker 1: they would not be able to finish the entire project themselves. 290 00:18:58,560 --> 00:19:01,119 Speaker 1: They needed the open source part of the app to 291 00:19:01,240 --> 00:19:04,280 Speaker 1: really come into play. They needed a bigger team of 292 00:19:04,359 --> 00:19:08,359 Speaker 1: people to help code. And so we actually organized events, 293 00:19:08,520 --> 00:19:12,480 Speaker 1: I think three at least. We said, hey, if you're 294 00:19:12,520 --> 00:19:15,239 Speaker 1: interested in contributing, or if you like, if you want 295 00:19:15,280 --> 00:19:18,280 Speaker 1: to see what the project is all about, you can 296 00:19:18,400 --> 00:19:20,800 Speaker 1: meet us and we will set up the project together. 297 00:19:20,840 --> 00:19:22,800 Speaker 1: Because it's a little bit of a pain, and I 298 00:19:22,840 --> 00:19:26,480 Speaker 1: think sometimes it even took two hours from like saying 299 00:19:26,520 --> 00:19:30,679 Speaker 1: hello to the first person having the app running on 300 00:19:30,720 --> 00:19:32,600 Speaker 1: their phone. But I think it was like, it's kind 301 00:19:32,600 --> 00:19:37,520 Speaker 1: of nice to get people over this sometimes painful step, 302 00:19:37,880 --> 00:19:40,280 Speaker 1: like not sitting at home and you're like, oh, why 303 00:19:40,320 --> 00:19:43,080 Speaker 1: do I get this error message?
And then maybe 304 00:19:43,160 --> 00:19:46,560 Speaker 1: quitting. But you're sitting there all together, like talking a 305 00:19:46,600 --> 00:19:49,560 Speaker 1: little bit, but also another person might be already a 306 00:19:49,560 --> 00:19:52,600 Speaker 1: step ahead and they can tell you, like, what they did. 307 00:19:55,359 --> 00:19:59,119 Speaker 1: The drip team won another fellowship from the Mozilla Foundation, 308 00:19:59,600 --> 00:20:01,679 Speaker 1: and that helped them get the word out about the 309 00:20:01,680 --> 00:20:06,240 Speaker 1: project, and they joined some programming workshops called Code and Cake. 310 00:20:06,800 --> 00:20:09,239 Speaker 1: The workshops are run by a group of programmers who 311 00:20:09,400 --> 00:20:12,720 Speaker 1: use the web framework Ruby on Rails, and they help 312 00:20:12,760 --> 00:20:16,040 Speaker 1: teach newcomers to code. They found a lot of interest. 313 00:20:16,840 --> 00:20:20,560 Speaker 1: A designer helped figure out the user interface. Another designer 314 00:20:20,600 --> 00:20:23,960 Speaker 1: made a logo; it's not pink and curly. And over 315 00:20:24,000 --> 00:20:26,520 Speaker 1: the course of a few months, they created an app 316 00:20:26,680 --> 00:20:30,960 Speaker 1: that lets users track ovulation and other symptoms while keeping 317 00:20:31,000 --> 00:20:34,760 Speaker 1: their personal data private. Along the way, the number of 318 00:20:34,800 --> 00:20:39,720 Speaker 1: code contributors grew from three, Marie, Tina and Julia, to seventeen. 319 00:20:40,560 --> 00:20:43,879 Speaker 1: Here are two of the volunteers, Sophia and Maria, a 320 00:20:44,000 --> 00:20:47,120 Speaker 1: Russian couple who live in Berlin, talking about how they 321 00:20:47,119 --> 00:20:50,640 Speaker 1: got involved. We were standing outside the Rainbow Cafe during 322 00:20:50,640 --> 00:20:53,760 Speaker 1: a break in the drip launch party.
So, okay, I 323 00:20:53,800 --> 00:20:57,400 Speaker 1: will reveal the secret. So I'm already working 324 00:20:57,480 --> 00:21:01,800 Speaker 1: as a programmer, and this is my wife, and she was 325 00:21:01,840 --> 00:21:06,040 Speaker 1: really curious about programming. She actually has a degree in programming, 326 00:21:06,119 --> 00:21:10,560 Speaker 1: but she never used it. Yeah, yeah. So we decided to 327 00:21:10,600 --> 00:21:13,920 Speaker 1: make something real, like to find some project to work 328 00:21:13,920 --> 00:21:18,479 Speaker 1: on, something real that we can potentially also use. 329 00:21:19,000 --> 00:21:22,280 Speaker 1: Sophia doesn't even have an Android phone, so she cannot 330 00:21:22,359 --> 00:21:26,639 Speaker 1: use Drip herself, but Maria does. She had been using Flo, 331 00:21:26,920 --> 00:21:30,200 Speaker 1: the Belarusian app that came under fire for sharing data 332 00:21:30,240 --> 00:21:34,359 Speaker 1: with Facebook. So she switched from this pink and fairy Flo 333 00:21:35,400 --> 00:21:39,760 Speaker 1: and tried Drip out. For Ida Tin at Clue, the 334 00:21:39,800 --> 00:21:43,399 Speaker 1: Bloody Health Collective and the Drip app are a sign 335 00:21:43,400 --> 00:21:47,040 Speaker 1: of the times. She thinks people are caring more and 336 00:21:47,160 --> 00:21:49,840 Speaker 1: more about what health apps and the rest of the 337 00:21:49,880 --> 00:21:53,679 Speaker 1: tech industry do with their data. When a group of 338 00:21:53,720 --> 00:21:55,879 Speaker 1: people get together and say, hey, let's just build this, 339 00:21:56,160 --> 00:21:57,919 Speaker 1: I think that's great. I mean, it's the same thing that 340 00:21:57,960 --> 00:22:00,679 Speaker 1: I did, right? And I'm very eager to talk to 341 00:22:00,680 --> 00:22:02,880 Speaker 1: them because I think we are, you know, very much 342 00:22:02,880 --> 00:22:04,760 Speaker 1: on the same mission.
And I would be very curious 343 00:22:04,800 --> 00:22:06,560 Speaker 1: also to ask them, like, what is it that you 344 00:22:06,600 --> 00:22:10,560 Speaker 1: feel that we are doing that doesn't meet your requirements? 345 00:22:10,640 --> 00:22:12,840 Speaker 1: Why did you feel that there is a need for this? 346 00:22:13,119 --> 00:22:16,480 Speaker 1: Ida argues that for lots of users, having their data 347 00:22:16,560 --> 00:22:19,639 Speaker 1: stored somewhere other than their phones is actually a really 348 00:22:19,640 --> 00:22:23,280 Speaker 1: good thing. It helps ensure they don't lose years' worth 349 00:22:23,359 --> 00:22:26,600 Speaker 1: of period tracking data if they lose their phones. It 350 00:22:26,720 --> 00:22:30,919 Speaker 1: could help researchers use aggregated data from lots of different 351 00:22:31,000 --> 00:22:36,439 Speaker 1: users to learn more about periods themselves. And finally, because 352 00:22:36,520 --> 00:22:40,520 Speaker 1: Clue's developers can see how users are interacting with the app, 353 00:22:40,920 --> 00:22:43,239 Speaker 1: it's a lot easier for them to fix things that 354 00:22:43,280 --> 00:22:46,560 Speaker 1: go wrong. And then maybe they have something also to 355 00:22:46,680 --> 00:22:50,080 Speaker 1: learn from me, who has, you know, maybe seen the 356 00:22:50,119 --> 00:22:52,240 Speaker 1: limitations of what happens when you don't have a back end, 357 00:22:52,280 --> 00:22:55,520 Speaker 1: when you don't... like, well, maybe they want to do 358 00:22:55,600 --> 00:22:57,800 Speaker 1: something different. Now, I don't know. I haven't spoken 359 00:22:57,840 --> 00:23:00,359 Speaker 1: to them, but I'd be very curious. The back end, 360 00:23:00,800 --> 00:23:03,040 Speaker 1: all the parts of the app that users don't really 361 00:23:03,080 --> 00:23:06,639 Speaker 1: see, will be a bigger challenge for Marie and Tina's group.
362 00:23:07,240 --> 00:23:10,840 Speaker 1: They are dependent on users telling them what's working with 363 00:23:10,880 --> 00:23:14,000 Speaker 1: the app and what isn't; they cannot see it themselves. 364 00:23:14,720 --> 00:23:18,160 Speaker 1: And another challenge is that for now, Drip is only 365 00:23:18,200 --> 00:23:22,520 Speaker 1: available on GitLab, an open source software forum. It's 366 00:23:22,560 --> 00:23:25,120 Speaker 1: not in the app stores where most people download new 367 00:23:25,160 --> 00:23:31,720 Speaker 1: programs for their phones. To make the first female-designed 368 00:23:31,800 --> 00:23:36,040 Speaker 1: open source period app a success, Drip first needs a 369 00:23:36,080 --> 00:23:39,560 Speaker 1: lot more people to start using it and, even more importantly, 370 00:23:39,680 --> 00:23:42,720 Speaker 1: to give feedback about how to make it better. And 371 00:23:42,800 --> 00:23:45,199 Speaker 1: Marie and the team will eventually need to focus on 372 00:23:45,280 --> 00:23:50,920 Speaker 1: other jobs, so they need coders to contribute to it too. 373 00:23:52,480 --> 00:23:55,680 Speaker 1: One important step in creating that team was the launch 374 00:23:55,680 --> 00:23:59,119 Speaker 1: party at the Rainbow Factory. The initial response was good. 375 00:23:59,720 --> 00:24:02,840 Speaker 1: Here's Tam Eastley, a Canadian developer who has lived in 376 00:24:02,840 --> 00:24:05,720 Speaker 1: Berlin for years. The first thing that I was really 377 00:24:05,800 --> 00:24:08,399 Speaker 1: excited about is normally, when you download an app, it 378 00:24:08,480 --> 00:24:11,040 Speaker 1: says this app is going to have access to, like, 379 00:24:11,119 --> 00:24:14,520 Speaker 1: your camera, your location.
Or, like, this random folder you 380 00:24:14,560 --> 00:24:16,719 Speaker 1: didn't even know existed, and then you have to 381 00:24:16,760 --> 00:24:20,280 Speaker 1: agree, otherwise you can't use it, and you're so rarely 382 00:24:20,440 --> 00:24:23,040 Speaker 1: aware of what you're agreeing to, and you just 383 00:24:23,119 --> 00:24:24,800 Speaker 1: end up saying okay because you want to use it. 384 00:24:24,840 --> 00:24:27,160 Speaker 1: But with this app, it was like, this app will 385 00:24:27,200 --> 00:24:30,080 Speaker 1: not have access to anything on your device. And I forget, 386 00:24:30,160 --> 00:24:32,040 Speaker 1: I assume you had to say okay to this, but 387 00:24:32,080 --> 00:24:34,880 Speaker 1: I was just like, I've never ever seen this before 388 00:24:35,240 --> 00:24:37,919 Speaker 1: in an app. This is so cool. They will need 389 00:24:37,960 --> 00:24:40,600 Speaker 1: to find a lot more users like Tam to make 390 00:24:40,720 --> 00:24:43,920 Speaker 1: Drip self-sustaining and to ensure that the work they've 391 00:24:43,960 --> 00:24:47,320 Speaker 1: put into it so far won't be in vain. A 392 00:24:47,320 --> 00:24:50,399 Speaker 1: few weeks after the launch, Marie told me that about 393 00:24:50,480 --> 00:24:54,199 Speaker 1: fifty beta testers had downloaded the app and suggestions for 394 00:24:54,240 --> 00:24:58,280 Speaker 1: tweaks to the code were pouring in. Marie is optimistic. 395 00:24:58,760 --> 00:25:02,919 Speaker 1: It's also about switching the role of this passive user 396 00:25:03,160 --> 00:25:06,320 Speaker 1: to an active contributor, if you want to see it 397 00:25:06,440 --> 00:25:09,880 Speaker 1: like that. So being like, do I actually like that? 398 00:25:10,119 --> 00:25:12,439 Speaker 1: Did I expect something else? And this is kind of 399 00:25:12,440 --> 00:25:16,240 Speaker 1: the question that everybody can answer.
Next month, the team 400 00:25:16,320 --> 00:25:19,200 Speaker 1: is planning to add Drip to F-Droid, an app 401 00:25:19,280 --> 00:25:23,240 Speaker 1: store for free and open source software. They hope that 402 00:25:23,280 --> 00:25:27,720 Speaker 1: will increase their user base. To Marie, the small community 403 00:25:27,800 --> 00:25:31,800 Speaker 1: that's developing around the app is its biggest strength. It's 404 00:25:31,840 --> 00:25:34,639 Speaker 1: also a sign of how people are thinking more and 405 00:25:34,720 --> 00:25:38,800 Speaker 1: more about not just how the software they use helps, 406 00:25:39,440 --> 00:25:43,840 Speaker 1: but also what it takes from users, and whether that 407 00:25:43,960 --> 00:25:51,560 Speaker 1: trade-off is worth it. And that's it for this 408 00:25:51,600 --> 00:25:56,000 Speaker 1: week's Prognosis. Thanks for listening. Do you have a story 409 00:25:56,000 --> 00:25:59,000 Speaker 1: about healthcare in the US or around the world? We 410 00:25:59,040 --> 00:26:01,680 Speaker 1: want to hear from you. We're on Twitter at 411 00:26:01,880 --> 00:26:05,320 Speaker 1: fe Cortes or at Naomi Kresge. If you were a fan 412 00:26:05,400 --> 00:26:07,919 Speaker 1: of this episode, please take a moment to rate and 413 00:26:07,960 --> 00:26:10,880 Speaker 1: review it. It really helps new listeners find the show. 414 00:26:12,160 --> 00:26:15,919 Speaker 1: This episode was produced by Lindsey Craterwell. Our story editors 415 00:26:15,920 --> 00:26:19,440 Speaker 1: were Drew Armstrong and Rick Shine. Francesca Levy is head of 416 00:26:19,440 --> 00:26:23,399 Speaker 1: Bloomberg Podcasts. We'll be back on April twenty fifth with 417 00:26:23,440 --> 00:26:24,960 Speaker 1: a new episode.