Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? So on Saturday, August twenty-fourth, twenty twenty-four, French law enforcement officials arrested a man named Pavel Durov shortly after his private plane landed at an airport outside of Paris. So why did they do that? Well, Durov is the CEO of the company Telegram, a communications and file-sharing app. Pavel and his brother Nikolai founded Telegram partly as a way to allow users to communicate with one another without fear of censorship. But this freedom also means that some people use Telegram to either commit or facilitate illegal activities, and that's what led to Pavel Durov's arrest. Now, I should probably put this in right here, even though it is a spoiler: Durov has since been released from police custody, but will still have to appear in court, which, as I'm recording this, has not yet happened. By the time you hear this, maybe it has.
Now, I thought I would give a history of Telegram and some information on the charges that Pavel faces, as well as more information about what Telegram itself does. But to do that, we first have to talk about a predecessor to Telegram. It is a social media platform called VKontakte, or VK for short, because this launched in Russia. So in two thousand and six, two years after Facebook, or rather TheFacebook, launched, Pavel Durov created VK with Nikolai. It featured many of the same functions that you would find on platforms like Facebook or MySpace back in the day. Users could create profiles, they could connect with friends. And as I said, Pavel lived in Russia at the time. He has citizenship in many different countries at this point, including France, and the Russian government has, let's say, a little history with wanting to control the flow of information. That's quite hard to do on a platform made up of user-generated content. The information isn't coming from some central source. It's not like a media company.
It's being generated by the people who are actually on the platform. In twenty eleven, some Russian citizens were taking to VK to protest recent parliamentary elections. The Russian government didn't want anyone upsetting the borscht cart, so to speak, and so Durov received the polite request that perhaps he censor the posts that were protesting the elections, you know, or else. Durov declined to acquiesce to their request, and so the Russian government decided to pull out all the stops. Now, those stops included using essentially propaganda to heap aspersions upon both VK and Durov himself, but perhaps the more effective tactic was that they sent an armed police presence to Durov's house to intimidate him into complying with Russian demands. On top of that, the government allegedly demanded that Durov hand over user information linked to accounts identified as having posted anti-government messages. Durov decided that he had had enough of that kind of stuff, so ultimately he would cash out and sell his interest in VK.
This, from what I can tell, happened around twenty fourteen, when he officially sold off his interest in the company. But he was already thinking ahead to the next thing, and getting out of Dodge, or Moscow as it turns out, much earlier than that. So he and his brother would found Telegram back in twenty thirteen, so he still had interests in VK at this point but would sell them off the following year. Now, another thing that happened in twenty thirteen that I'm guessing influenced the brothers was the tale of Edward Snowden. So Snowden was a contractor for the NSA here in America. Now, in the course of Snowden's work, he learned about massive surveillance programs, including ones that involved not just the United States but its allies collaborating with one another and sharing surveillance information between them. And the scope and depth of these surveillance programs really concerned Snowden, so he purposefully leaked information, ultimately to the public, about them, and that got various governments very upset at Snowden.
To put it lightly, Snowden ultimately sought and was granted asylum in Russia, and in twenty twenty-two he received citizenship in Russia. While the US government viewed Snowden's actions as a violation of several laws, including the Espionage Act of nineteen seventeen, lots of other people saw what he did as a heroic act of defiance, uncovering questionable and disturbing practices across the world, some of those practices subsequently being found unconstitutional and illegal here in the United States. And he also helped drive home the concern that your communications with the people in your life might not be as private as you would like to think. One other important change that had happened between the launch of VK and the launch of Telegram was the rise of the smartphone. Back in two thousand and six, consumer smartphones weren't actually a thing yet. The only folks who toted smartphones around were business executives and bleeding-edge technology enthusiasts. Everyone else was sporting a cell phone, you know. Maybe it was a flip phone, maybe it was a candy-bar-style phone.
Maybe you were super cool and you had a Sidekick. I really, really wanted a Sidekick. Anyway, the point is, accessing the Web in general and social media in particular typically meant you were on a desktop or a laptop computer. But by twenty thirteen we were well into a sea change in which more people were relying on smartphones in order to access web content. Now, it wouldn't really be until early twenty fourteen that we started to see Internet usage on mobile eclipsing desktop usage here in the United States. It was different for different parts of the world, but here in the US, around twenty fourteen is when mobile phones were becoming the primary way that people were interacting with content on the web, which meant web designers were hectically trying to figure out how to optimize pages for users on mobile devices, or to create mobile-specific apps and try to drive traffic to those.
I've seen plenty. Heck, I recorded ads for HowStuffWorks's mobile app way back in the day, because optimizing a page so that it looks good no matter what device you're on was tricky. It's easier today, because best practices have been created to handle that sort of thing, but back in the twenty tens era that was still a developing discipline, and so there were a lot of outlets out there that created apps hoping that they could drive people to use those apps and manage to monetize that traffic. I never saw download figures on how well the HowStuffWorks app did. If I had to guess, I would say it was fairly low. I thought it was a decent app, but not equivalent to the web page experience. Anyway, the mobile landscape was clearly becoming much more important, and so the Durovs decided they would focus on a mobile-centered application for Telegram, at least initially. They set up their headquarters in the United Arab Emirates, specifically in Dubai.
So Durov chose this because he said it was, quote, "the best place for a neutral platform like ours to be in if we want to make sure we can defend our users' privacy and freedom of speech," end quote. So in other words, Durov was confident that the UAE government wouldn't get as handsy with Telegram as Russia had with VK. The company would launch apps for iOS in August twenty thirteen and Android in October of that same year. That was after Telegram had held a contest for Android developers to essentially port the code over to the Android environment. Nikolai developed the initial mobile protocol for Telegram, which included the encryption protocol. Now, I should add that not all messages are encrypted on Telegram, and that's actually a key component of the issues that Durov currently faces. In fact, even a one-on-one conversation inside Telegram is not end-to-end encrypted by default. You actually have to manually turn on encryption if you want your communication to be encrypted. For those one-on-one conversations, you can have end-to-end encryption that protects the message from prying eyes.
And I've talked about encryption many times before, but basically, from a very high level, here's how it works: you have two users, and they have a means of encrypting and decrypting messages so that the information that's actually sent between the two appears to be gibberish. An outside party snooping in on the conversation would just see a string of apparently meaningless characters with no real message inside of it, but the individual users would get the decrypted raw text messages or files or whatever it might be. End-to-end encryption has lots of legitimate, important uses that can help people maintain secure and private methods of communication, and law enforcement agencies often really don't like it because it makes their jobs harder. I guess it depends on how you look at their jobs, because some people would describe law enforcement's job as surveilling all innocent citizens and trying to look for crimes, and others would describe it as being able to detect criminal activity and then protect people from it. I guess it really depends on your point of view.
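That high-level picture can be sketched in a few lines of code. To be clear, this is a toy illustration of the shared-key idea, not Telegram's actual MTProto scheme: the two users here share a random key agreed out of band, and an eavesdropper in the middle sees only bytes that look like gibberish.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time-pad style XOR; the key must be at least as long as the data."""
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
shared_key = secrets.token_bytes(len(message))  # both users hold this; the snooper does not

ciphertext = xor_bytes(message, shared_key)   # what travels over the wire
plaintext = xor_bytes(ciphertext, shared_key) # what the recipient recovers

assert plaintext == message  # only the key holders get the real message back
```

The point is simply that without `shared_key`, `ciphertext` carries no readable message, which is why a platform that never holds the key cannot decrypt traffic even if compelled to.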
But detecting illegal activities is a lot harder if all the communications between the practitioners are scrambled. And there's been a lot of pressure in different parts of the world on different platforms to open up backdoor access to encrypted communications, essentially to give law enforcement a universal key to decrypt communications so that they can see if there's anything hinky going on. Now, in many cases this really isn't possible unless you grant access to the endpoints themselves, like you somehow get access to the actual end devices where the decryption is taking place, because good end-to-end encryption means that even the platform offering it is unable to break that encryption. Moreover, the design was such that these encrypted chats would only exist on those end devices, because Telegram didn't store the encrypted communications in the cloud. So even if someone were to compromise Telegram's systems, they wouldn't be able to access the encrypted communications stored there, because there weren't any stored there.
Anyway, this protocol, which was primarily about encryption, was called MTProto, and Nikolai authored the first version of it, and you can visit the Telegram website to read up on how it works. Although, to be fair, the version that Telegram uses now is MTProto 2.0, and so it's different. It's evolved considerably since the launch in twenty thirteen, and it gets a bit complicated to talk about the details in a podcast that doesn't have visual aids. Besides that, even if this were a video podcast, I am no expert in encryption, and so chances are I would end up communicating something poorly or just outright incorrectly if I were to really tackle it. The important bit is that MTProto was one of the early building blocks Nikolai made for Telegram, and they also chose to make this an open protocol, meaning the entire community could review and examine how the protocol worked. This was meant to eliminate trust issues, like the idea is to remove trust from being a concern: there's no reason to just trust Telegram.
You can look into all of this and make sure that things are run the way the app claims, so users could see exactly how the protocol was handling encryption and verify that communications were secure, right? Like, no copies of that communication would be going to Telegram itself, et cetera. All right, we've got a lot more to talk about when it comes to Telegram and its features. Before we get to that, let's take a quick break to thank our sponsors.

Speaker 1: So, we're back. We talked a bit about encryption with Telegram. Another feature that the app would offer early on was a self-destruct option for messages. That is, once a message had been received (and later this was improved, so once it had been read), it would self-delete after a given amount of time. So early on, the self-destruct was literally within a certain amount of time since the message was received. Problem is, not everyone reads messages just when they get them, right? You could be in a situation where you can't.
Maybe you're on a flight or something, and you haven't had your phone connected to Wi-Fi and you're not connected to the Internet, so you could receive the message through Telegram, but you haven't had a chance to read it yet, and meanwhile the timer is ticking down for when it's going to self-delete. Later on, they did change this so that the timer would start only after you had opened the message. So this was kind of like the original concept behind Snapchat, in which a user could send an image, and after a given amount of time, that image would go poof in the receiver's app. Except, of course, in the case of Snapchat, copies of those images could still exist on Snapchat's cloud servers. That was a whole thing. Telegram was saying, okay, well, that's not going to be the case with us. The messages are only going to exist on the end devices, not in the cloud. So once they delete, that's it. We don't have a copy, so they're gone.
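The design change described above, starting the countdown at first read rather than at delivery, can be sketched like this. This is my own assumption-laden simplification, not Telegram's actual implementation; the class and field names are invented for illustration.

```python
import time

class SelfDestructMessage:
    """Toy model: the deletion timer starts when the message is first opened."""

    def __init__(self, text, ttl_seconds):
        self.text = text
        self.ttl = ttl_seconds
        self.opened_at = None  # countdown has not started while unread

    def read(self):
        now = time.monotonic()
        if self.opened_at is None:
            self.opened_at = now          # first open starts the countdown
        if now - self.opened_at > self.ttl:
            self.text = None              # expired: content is deleted
        return self.text

msg = SelfDestructMessage("the eagle has landed", ttl_seconds=0.05)
assert msg.read() == "the eagle has landed"  # sitting unread doesn't burn the timer
time.sleep(0.1)
assert msg.read() is None                    # gone once the TTL elapses after first read
```

Under the original delivery-based scheme, the countdown would have started in `__init__` instead, which is exactly why a message could expire while you were still on that flight.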
Also, originally, the self-delete feature only worked on the receiver's side, so the person who sent the message would still be able to see the message after it had been deleted off the second party's device. Eventually, Telegram would change that as well, so that the message would get deleted off both devices. This only works on end-to-end, user-to-user communications, so one-on-one situations. But yeah, originally it was just the recipient whose copy of the message would get deleted. Clearly, if you want to send, let's say, a really sensitive message and you don't want other people to snoop in and see it, having it self-delete not just on the recipient's device but also on your own would be really important. I mean, what if someone got hold of your phone and then thumbed through it to try and see what kind of messaging you'd been up to?
If you are in a country that has a really authoritarian government that is just looking for a reason to throw you in the hoosegow, yeah, you want to have a messaging app that's going to clear your history so that you don't have to worry about getting hauled away for violating some authoritarian rule. Now, Telegram would allow for more than just one-on-one communications, though these other methods would not enjoy the benefit of end-to-end encryption, which again becomes part of the problem for Durov. So you could have a session in which one user was posting to many other users. This would be kind of a broadcast approach, one person broadcasting to many. I like to think of this as kind of similar to the way that Twitter, now X, works, or Threads or something like that, where you can post a general message and it goes out to anyone who follows you, or sometimes, if you're posting to the public, anybody at all. You could also have a group chat, potentially a really massive group chat.
Telegram is said to be able to accommodate up to two hundred thousand users in a single chat session. I have no idea how you would be able to parse such a thing or even just keep up with the chat. Now, I've been in YouTube chat rooms where there were, you know, around one thousand people watching something, and obviously only a slice of that population is even bothering to chat while they're watching, and even in that case, keeping up with what's going on is really challenging to do. So I don't know how you do it in a Telegram chat room. To be fair, I've also never used Telegram personally, so I don't have any real experience with this app on a user level. Telegram also allows for file transfers between users, which, along with the encryption and messages that self-delete after a given amount of time, makes up a large part of the reason that authorities have been concerned about this app, because anytime you have ways for people to share information, there's a concern that people are going to do that in order to further nefarious goals.
By December twenty thirteen, the Telegram community had plugged one quote-unquote significant vulnerability in MTProto as part of the first Telegram crypto contest. That's according to Telegram's own timeline of how things evolved in the app. The person who discovered the vulnerability received a one-hundred-thousand-dollar bug bounty for doing so. So there was an incentive to help improve the protocols and to look for things like vulnerabilities, and overall this would benefit the entire community, so it was an effective way to improve the product. You make your own community your QA testers, in a way. The following month, Telegram offered the feature of file transfers, with an initial file size limit of one and a half gigabytes. That's a fairly hefty file size, and Telegram did not restrict the types of documents that could be transferred across its service. So whether it was a JPEG or a DOC file or a PDF or whatever it might be, you could send it as long as it wasn't larger than one and a half gigabytes.
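That policy, cap the size, ignore the type, amounts to a one-line check. The sketch below is purely illustrative (the function name and the byte arithmetic are mine; only the 1.5 GB figure comes from the episode).

```python
# Early Telegram's upload rule, as described: any file type, capped at 1.5 GB.
MAX_FILE_SIZE = int(1.5 * 1024**3)  # 1.5 GB expressed in bytes

def can_upload(size_bytes: int) -> bool:
    """Only the size matters; the file's format is never inspected."""
    return 0 < size_bytes <= MAX_FILE_SIZE

assert can_upload(500 * 1024**2)      # a 500 MB video is fine
assert not can_upload(2 * 1024**3)    # a 2 GB file is rejected
```

Notice there's no extension or MIME-type test anywhere, which is exactly the "did not restrict the types of documents" point.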
Around that same time, developers created a client for PCs, so this expanded Telegram beyond the smartphone environment. This would eventually evolve into the Telegram desktop app. In February twenty fourteen, developers created a web-based app for the service, and others made a version of the app for Windows Phone. Do you remember Windows Phone? I mean, that was still a thing back in twenty fourteen. In fact, it would remain a thing for five more years; Microsoft officially ended support for Windows Phone in twenty nineteen. Telegram was evolving rapidly, so the next feature added to the app, in March twenty fourteen, was support for voice messages. So now you've got file transfers, you've got one-on-one communication, you have one-to-many in the broadcast channels, you've got chat, you've got voice messaging. At this point, the app also changed how it handled secret messages. This is when we get to the change for self-destruct.
So again, earlier self-destruct 320 00:20:33,720 --> 00:20:36,880 Speaker 1: only affected the receiving device, but now, starting in March 321 00:20:36,880 --> 00:20:39,720 Speaker 1: twenty fourteen, the messages would disappear from both the sender's 322 00:20:39,760 --> 00:20:43,840 Speaker 1: device and the receiver's devices, so you no longer had 323 00:20:44,000 --> 00:20:48,160 Speaker 1: a trail of these messages. To list all the feature 324 00:20:48,320 --> 00:20:52,960 Speaker 1: upgrades month by month would become really tedious, and as 325 00:20:53,000 --> 00:20:57,760 Speaker 1: I said, if you go to Telegram's timeline of the 326 00:20:57,800 --> 00:21:02,400 Speaker 1: evolution of the app, you can actually read all about this. Weirdly, 327 00:21:02,640 --> 00:21:06,119 Speaker 1: it's listed in reverse chronological order. I guess that 328 00:21:06,119 --> 00:21:07,600 Speaker 1: makes sense if you just want to know what the 329 00:21:07,600 --> 00:21:11,520 Speaker 1: most recent additions are to the app, but if you're 330 00:21:11,560 --> 00:21:15,240 Speaker 1: looking at it from a historical perspective, it's very weird 331 00:21:15,680 --> 00:21:17,800 Speaker 1: to start from the most recent and work your way backward, 332 00:21:17,800 --> 00:21:20,520 Speaker 1: because it just means that as you go on, the 333 00:21:20,600 --> 00:21:23,760 Speaker 1: app gets fewer features over time because you're going back 334 00:21:23,800 --> 00:21:27,320 Speaker 1: in time. I actually read it in reverse order. Anyway, 335 00:21:27,800 --> 00:21:30,879 Speaker 1: there's no reason to go through all of them. It 336 00:21:30,880 --> 00:21:34,280 Speaker 1: would just get very tedious. You did see a lot 337 00:21:34,320 --> 00:21:37,440 Speaker 1: more features get added over time. 
The app just would 338 00:21:37,480 --> 00:21:40,480 Speaker 1: get more and more robust every year, gaining support for 339 00:21:41,040 --> 00:21:45,320 Speaker 1: everything from multiple file uploads, where you could do several 340 00:21:45,359 --> 00:21:48,080 Speaker 1: files at a time, particularly photos, like you 341 00:21:48,119 --> 00:21:50,680 Speaker 1: could choose multiple photos as opposed to doing them one 342 00:21:50,720 --> 00:21:53,800 Speaker 1: at a time, to playing animated GIFs, you know, 343 00:21:53,920 --> 00:21:56,320 Speaker 1: the standard stuff that you encounter in various 344 00:21:56,359 --> 00:21:59,880 Speaker 1: social platforms today. Later updates incorporated the ability to play 345 00:22:00,080 --> 00:22:04,080 Speaker 1: media files directly within the Telegram app. You know, previously 346 00:22:04,200 --> 00:22:06,680 Speaker 1: you would have to download the file and then play 347 00:22:06,720 --> 00:22:10,840 Speaker 1: it in some other media player app. Now that capability 348 00:22:10,880 --> 00:22:13,959 Speaker 1: was built directly into Telegram itself. It also opened up 349 00:22:13,960 --> 00:22:17,639 Speaker 1: support for bots. These could be incorporated into chat rooms 350 00:22:17,640 --> 00:22:21,080 Speaker 1: to enhance the experience in various ways, though frequently not 351 00:22:21,720 --> 00:22:25,440 Speaker 1: really for moderation, content moderation I mean, 352 00:22:25,680 --> 00:22:28,480 Speaker 1: and that is another issue that we'll talk about in 353 00:22:28,520 --> 00:22:31,399 Speaker 1: this episode. So the list of features has grown now 354 00:22:31,440 --> 00:22:33,640 Speaker 1: to a point where it would take an entire episode 355 00:22:33,680 --> 00:22:36,960 Speaker 1: to cover all the things that Telegram facilitates today. 
It 356 00:22:37,080 --> 00:22:40,960 Speaker 1: encompasses everything from allowing payments through the app for physical 357 00:22:40,960 --> 00:22:44,920 Speaker 1: and digital goods, to an in-app currency called Stars 358 00:22:45,280 --> 00:22:48,560 Speaker 1: that users can use to reward other accounts for stuff. 359 00:22:48,600 --> 00:22:51,520 Speaker 1: It's kind of like the bits you would find on Twitch, 360 00:22:51,800 --> 00:22:56,439 Speaker 1: that kind of stuff. Okay, So that's all fine. Telegram 361 00:22:56,520 --> 00:22:59,280 Speaker 1: has evolved since it first hit the scene in mid 362 00:22:59,320 --> 00:23:02,720 Speaker 1: twenty thirteen. You would expect that, right? It has many of 363 00:23:02,760 --> 00:23:05,360 Speaker 1: the same features that you find in other platforms, while 364 00:23:05,400 --> 00:23:09,720 Speaker 1: also being largely free from traditional advertising and, according to 365 00:23:09,840 --> 00:23:14,879 Speaker 1: the Durovs, from data mining. Right? Like, if you're on WhatsApp, 366 00:23:15,240 --> 00:23:18,840 Speaker 1: which is owned by Meta, you know that Meta is 367 00:23:19,160 --> 00:23:21,399 Speaker 1: scraping a lot of data from you, because that's what 368 00:23:21,440 --> 00:23:23,600 Speaker 1: they do with all their platforms. That's their bread and butter, 369 00:23:23,960 --> 00:23:27,520 Speaker 1: right? Like, whether it's on Facebook or Instagram or WhatsApp, 370 00:23:28,000 --> 00:23:31,680 Speaker 1: that ends up being really valuable information when Facebook is 371 00:23:31,720 --> 00:23:36,200 Speaker 1: looking for ways to sell targeted advertising with you in mind. 372 00:23:36,359 --> 00:23:39,920 Speaker 1: So the original concept behind Telegram was that it wouldn't 373 00:23:39,960 --> 00:23:44,400 Speaker 1: be profit oriented at all. 
It was not organized as 374 00:23:44,440 --> 00:23:48,320 Speaker 1: a not-for-profit organization, however, that's important to remember, 375 00:23:48,680 --> 00:23:51,600 Speaker 1: but the concept was that profit would not be a 376 00:23:51,680 --> 00:23:55,360 Speaker 1: driving consideration for the service. The Durovs were determined 377 00:23:55,359 --> 00:23:59,800 Speaker 1: to create a useful communications platform free from external interference, 378 00:24:00,000 --> 00:24:03,399 Speaker 1: which included not just governments, but things like corporations. But 379 00:24:03,480 --> 00:24:07,760 Speaker 1: Telegram does earn revenue. Primarily, it does this through in 380 00:24:07,920 --> 00:24:12,560 Speaker 1: app purchases, so rather than settle for the bog standard experience, 381 00:24:13,040 --> 00:24:16,640 Speaker 1: a user can shell out money to access premium features 382 00:24:16,880 --> 00:24:20,320 Speaker 1: like customization tools and such. This can include stuff like 383 00:24:20,400 --> 00:24:23,679 Speaker 1: stickers that can be used in chat rooms or themes 384 00:24:23,720 --> 00:24:26,600 Speaker 1: for chat spaces, that kind of thing. Users can even 385 00:24:26,640 --> 00:24:29,600 Speaker 1: design their own stickers and offer them up for sale 386 00:24:29,800 --> 00:24:33,000 Speaker 1: in a digital marketplace, with Telegram getting a cut of 387 00:24:33,040 --> 00:24:36,240 Speaker 1: the action, so they take a little percentage of each 388 00:24:36,359 --> 00:24:40,159 Speaker 1: sale done through that method. Telegram also does work with 389 00:24:40,240 --> 00:24:43,840 Speaker 1: businesses in ways that allow for other revenue generation methods. 390 00:24:44,040 --> 00:24:47,439 Speaker 1: I did say Telegram is largely free of advertising, but 391 00:24:47,600 --> 00:24:52,920 Speaker 1: not totally free of advertising. 
There are ads on Telegram, however, 392 00:24:53,000 --> 00:24:55,840 Speaker 1: they are limited to appearing within broadcast channels that have 393 00:24:55,920 --> 00:24:59,800 Speaker 1: at least one thousand subscribers. What's more, these ads can 394 00:24:59,840 --> 00:25:02,320 Speaker 1: only appear as text messages, and they have a 395 00:25:02,359 --> 00:25:05,280 Speaker 1: hard limit of one hundred and sixty characters. On top 396 00:25:05,359 --> 00:25:08,800 Speaker 1: of that, businesses can establish their own channels and their 397 00:25:08,840 --> 00:25:13,240 Speaker 1: own groups within Telegram, and that in turn is another 398 00:25:13,280 --> 00:25:17,080 Speaker 1: revenue stream for Telegram itself. But now let's talk about 399 00:25:17,320 --> 00:25:22,679 Speaker 1: the dark side of this application. So one consequence of 400 00:25:22,720 --> 00:25:26,240 Speaker 1: creating a platform that aims to be free from censorship 401 00:25:26,320 --> 00:25:29,640 Speaker 1: and government involvement is that people who wish to engage 402 00:25:29,680 --> 00:25:33,080 Speaker 1: in illegal activities will make use of those services in 403 00:25:33,160 --> 00:25:37,000 Speaker 1: order to further their own goals. Complicating matters is that 404 00:25:37,040 --> 00:25:40,840 Speaker 1: you're talking about an app that is available around the world, 405 00:25:41,240 --> 00:25:43,800 Speaker 1: and what is legal in one nation may not be 406 00:25:43,960 --> 00:25:49,159 Speaker 1: legal in another. Plus, Durov himself has citizenship in multiple countries, 407 00:25:49,280 --> 00:25:55,639 Speaker 1: including France. Further, Telegram's general policy in response to governments 408 00:25:55,680 --> 00:25:59,960 Speaker 1: demanding data is to tell those governments to pound sand. 
Really, 409 00:26:00,000 --> 00:26:04,000 Speaker 1: it doesn't matter what the situation is, Telegram will say 410 00:26:04,000 --> 00:26:06,840 Speaker 1: it's not our policy to share that user information with you, 411 00:26:06,960 --> 00:26:09,200 Speaker 1: and we're not going to do it. Durov has said 412 00:26:09,240 --> 00:26:13,080 Speaker 1: that Telegram's commitment to privacy is more important than things 413 00:26:13,160 --> 00:26:17,119 Speaker 1: like our fear of how bad people could use Telegram. 414 00:26:17,200 --> 00:26:20,720 Speaker 1: He says privacy is more important than what someone might 415 00:26:20,960 --> 00:26:24,480 Speaker 1: use Telegram to do. So, when US lawmakers asked Telegram 416 00:26:24,520 --> 00:26:28,000 Speaker 1: to hand over information connected to people who were involved 417 00:26:28,080 --> 00:26:32,200 Speaker 1: or suspected of being involved in the insurrection on January sixth, 418 00:26:32,240 --> 00:26:36,199 Speaker 1: twenty twenty one, Telegram denied that request. They said, no, 419 00:26:36,560 --> 00:26:40,199 Speaker 1: that's against our policy. Now there are concerns that, you know, 420 00:26:40,320 --> 00:26:43,800 Speaker 1: things like terrorist cells are making use of Telegram in 421 00:26:43,960 --> 00:26:47,120 Speaker 1: order to communicate with each other. There are also cases 422 00:26:47,119 --> 00:26:50,320 Speaker 1: in which people are using Telegram to distribute everything from 423 00:26:50,359 --> 00:26:55,680 Speaker 1: pirated content to really serious issues like child pornography. 
And 424 00:26:55,720 --> 00:26:59,720 Speaker 1: that encryption, as I said earlier, really only works for 425 00:26:59,760 --> 00:27:02,920 Speaker 1: one on one communication, so for the case of things 426 00:27:02,960 --> 00:27:06,320 Speaker 1: like chat rooms or broadcast channels, as well as just 427 00:27:06,400 --> 00:27:11,119 Speaker 1: the default settings for user to user communications, there is 428 00:27:11,800 --> 00:27:17,080 Speaker 1: no encryption, which means any illegal activity that's happening across 429 00:27:17,200 --> 00:27:21,960 Speaker 1: those channels is potentially viewable. In fact, the only type 430 00:27:21,960 --> 00:27:25,840 Speaker 1: of communication that actually uses encryption is one on one messaging, and only if it's 431 00:27:26,000 --> 00:27:30,720 Speaker 1: manually switched on, which means that Telegram isn't really an encrypted 432 00:27:30,720 --> 00:27:35,439 Speaker 1: communication tool. It's just a communication app that happens to 433 00:27:35,480 --> 00:27:39,359 Speaker 1: have one end-to-end encryption feature that's only on 434 00:27:39,480 --> 00:27:42,480 Speaker 1: if you turn it on, and only for certain use cases. 435 00:27:42,880 --> 00:27:46,960 Speaker 1: That ends up becoming a massive problem for Durov, or 436 00:27:47,000 --> 00:27:49,680 Speaker 1: potentially a massive problem. I mean, it could turn out 437 00:27:50,000 --> 00:27:54,520 Speaker 1: that no charges are filed and nothing happens. But the 438 00:27:54,640 --> 00:27:58,280 Speaker 1: reason why he was detained by police in the first place, 439 00:27:58,680 --> 00:28:03,200 Speaker 1: you could argue, is because Telegram does not encrypt everything. 440 00:28:03,720 --> 00:28:07,399 Speaker 1: I'll explain more, but first let's take a quick break 441 00:28:07,440 --> 00:28:21,240 Speaker 1: to thank our sponsors. 
Okay, So the reason that it's 442 00:28:21,280 --> 00:28:26,200 Speaker 1: important that Telegram does not offer encryption across all methods 443 00:28:26,200 --> 00:28:29,639 Speaker 1: of communication on the app is that it means Telegram 444 00:28:29,720 --> 00:28:34,560 Speaker 1: potentially could view the stuff that happens on its own network. 445 00:28:35,000 --> 00:28:38,760 Speaker 1: It could be aware of the things that are transpiring 446 00:28:39,040 --> 00:28:43,040 Speaker 1: on Telegram, and lots of countries have rules in place 447 00:28:43,600 --> 00:28:48,200 Speaker 1: that state a platform is obligated to moderate the content 448 00:28:48,600 --> 00:28:52,680 Speaker 1: that happens on the platform itself. It's not responsible for 449 00:28:52,880 --> 00:28:57,760 Speaker 1: generating that content necessarily, but it is responsible for moderating it. 450 00:28:58,080 --> 00:29:00,719 Speaker 1: Here in the United States, we have rules that protect 451 00:29:00,720 --> 00:29:03,719 Speaker 1: platforms from being held accountable for the stuff that users 452 00:29:03,840 --> 00:29:08,240 Speaker 1: post to them. The infamous Section two thirty is about this. 453 00:29:08,600 --> 00:29:10,840 Speaker 1: So the thought behind all of this is that you 454 00:29:10,960 --> 00:29:14,480 Speaker 1: can't really blame a platform for something that a 455 00:29:14,680 --> 00:29:19,720 Speaker 1: user does on the platform itself. The responsible party is 456 00:29:19,800 --> 00:29:22,920 Speaker 1: the person who did the illegal activity, not the platform 457 00:29:22,960 --> 00:29:27,960 Speaker 1: where that illegal activity happened. However, this protection only extends 458 00:29:28,200 --> 00:29:32,680 Speaker 1: so far. 
If a platform is unwilling or unable to 459 00:29:32,880 --> 00:29:38,280 Speaker 1: moderate content, to remove illegal content, to act when needed 460 00:29:38,400 --> 00:29:42,680 Speaker 1: to curtail illegal activity on the platform itself, then it 461 00:29:42,760 --> 00:29:46,560 Speaker 1: can see that protection get stripped away. So your protection 462 00:29:46,720 --> 00:29:50,680 Speaker 1: only lasts as long as you are accountable. So, as 463 00:29:50,680 --> 00:29:53,640 Speaker 1: an example, if someone were to upload a movie to 464 00:29:53,720 --> 00:29:55,840 Speaker 1: YouTube and they don't have the right to do this, 465 00:29:56,280 --> 00:30:00,000 Speaker 1: right? Like, they've taken a pirated copy of a film. 466 00:30:00,280 --> 00:30:03,880 Speaker 1: Let's say it's Big Trouble in Little China, arguably the 467 00:30:03,880 --> 00:30:06,560 Speaker 1: best movie ever made, and they've put Big Trouble in 468 00:30:06,600 --> 00:30:08,880 Speaker 1: Little China up on YouTube and they don't own the 469 00:30:08,960 --> 00:30:11,400 Speaker 1: rights to Big Trouble in Little China, and YouTube is 470 00:30:11,440 --> 00:30:15,560 Speaker 1: made aware of this. They're alerted, Hey, someone has uploaded 471 00:30:15,600 --> 00:30:18,239 Speaker 1: copyrighted material and they don't have the right to do it. 472 00:30:18,480 --> 00:30:22,320 Speaker 1: Then YouTube is obligated to take that video down or 473 00:30:22,440 --> 00:30:27,320 Speaker 1: whatever action the copyright holder deems appropriate, or else YouTube 474 00:30:27,440 --> 00:30:32,040 Speaker 1: risks losing that protection we talked about. YouTube itself would 475 00:30:32,040 --> 00:30:34,960 Speaker 1: not be held responsible for the initial upload as long 476 00:30:35,000 --> 00:30:40,120 Speaker 1: as it did act accordingly once alerted to the infraction. Now, 477 00:30:40,160 --> 00:30:45,520 Speaker 1: Telegram largely doesn't police content on its platform. 
However, there 478 00:30:45,560 --> 00:30:48,920 Speaker 1: are exceptions. One big one is in matters that deal 479 00:30:48,960 --> 00:30:52,600 Speaker 1: with child abuse. The company relies on users to report 480 00:30:52,680 --> 00:30:56,720 Speaker 1: instances of content relating to child abuse and then takes action. 481 00:30:56,800 --> 00:30:59,880 Speaker 1: According to an article by Jordan Pearson of The Verge, 482 00:31:00,000 --> 00:31:04,320 Speaker 1: Telegram claims to do this around one thousand times 483 00:31:04,560 --> 00:31:08,960 Speaker 1: per day. Yikes. It is horrifying to think that child 484 00:31:09,000 --> 00:31:12,200 Speaker 1: abuse material is that rampant to begin with. And of 485 00:31:12,200 --> 00:31:16,080 Speaker 1: course that just marks the instances where someone actually reported it, 486 00:31:16,480 --> 00:31:20,080 Speaker 1: so that's pretty horrifying. Complicating matters is that Telegram has 487 00:31:20,120 --> 00:31:23,000 Speaker 1: been accused of only putting up a show when it 488 00:31:23,040 --> 00:31:28,280 Speaker 1: comes to content moderation, that rather than outright removing offending 489 00:31:28,400 --> 00:31:32,360 Speaker 1: channels and material, Telegram simply makes them hidden, so 490 00:31:32,400 --> 00:31:35,560 Speaker 1: they're not removed, they're just hidden from average users, which 491 00:31:35,560 --> 00:31:38,000 Speaker 1: means people who know where to go could still go 492 00:31:38,040 --> 00:31:40,720 Speaker 1: there and still engage in this activity, and that seems 493 00:31:40,760 --> 00:31:42,480 Speaker 1: to be a pretty big problem. 
It's kind of the 494 00:31:42,560 --> 00:31:46,680 Speaker 1: look-the-other-way approach, which you can easily 495 00:31:46,800 --> 00:31:51,840 Speaker 1: argue is essentially facilitating and being complicit in illegal 496 00:31:51,880 --> 00:31:56,400 Speaker 1: activity that's happening on those channels, and these policies are 497 00:31:56,400 --> 00:31:59,480 Speaker 1: what put Durov on thin ice with authorities in France. 498 00:32:00,000 --> 00:32:04,880 Speaker 1: Telegram could technically take a much firmer stance with content moderation. 499 00:32:05,200 --> 00:32:09,000 Speaker 1: There's nothing stopping the company from doing so. The communications 500 00:32:09,400 --> 00:32:13,760 Speaker 1: are not encrypted in things like chat channels and broadcast 501 00:32:13,840 --> 00:32:19,960 Speaker 1: channels and most user to user communications, unless users have manually 502 00:32:20,000 --> 00:32:23,600 Speaker 1: turned that setting on. So the fact that Telegram doesn't 503 00:32:23,600 --> 00:32:26,760 Speaker 1: appear to take this kind of action opens the possibility 504 00:32:26,760 --> 00:32:30,080 Speaker 1: for authorities to charge Durov and the company overall with 505 00:32:30,200 --> 00:32:35,760 Speaker 1: facilitating illegal activity. The act of not acting becomes the issue. 506 00:32:35,800 --> 00:32:38,800 Speaker 1: The French authorities have argued that Durov is complicit in 507 00:32:38,840 --> 00:32:42,400 Speaker 1: crimes that range from money laundering to the distribution of 508 00:32:42,480 --> 00:32:47,560 Speaker 1: abusive materials and everything in between. This is what Durov 509 00:32:47,640 --> 00:32:49,760 Speaker 1: is going to have to face when brought before a judge, 510 00:32:49,840 --> 00:32:53,160 Speaker 1: where he might possibly be indicted. 
By the time you 511 00:32:53,200 --> 00:32:56,120 Speaker 1: hear this episode, that decision may already have been made, 512 00:32:56,240 --> 00:32:58,680 Speaker 1: but as I record this, it is yet to happen, 513 00:32:58,800 --> 00:33:02,360 Speaker 1: though he again was released from police custody. Now that's 514 00:33:02,360 --> 00:33:04,800 Speaker 1: not necessarily an indication of where things are going to go, 515 00:33:05,000 --> 00:33:08,800 Speaker 1: because the authorities had until today, which is Wednesday, August 516 00:33:08,840 --> 00:33:11,440 Speaker 1: twenty eighth, twenty twenty four, when I'm recording this, to 517 00:33:11,600 --> 00:33:15,080 Speaker 1: officially charge Durov or to let him go. They 518 00:33:15,120 --> 00:33:18,760 Speaker 1: couldn't hold him longer, not legally anyway. So whether he 519 00:33:18,760 --> 00:33:22,960 Speaker 1: gets charged, that's up to the judge. But yeah, the 520 00:33:23,040 --> 00:33:25,840 Speaker 1: case is a really complicated one. So on the one hand, 521 00:33:26,120 --> 00:33:28,480 Speaker 1: I do believe there is a real need for systems 522 00:33:28,480 --> 00:33:31,960 Speaker 1: that allow for secure and private communication. There are people 523 00:33:32,080 --> 00:33:35,240 Speaker 1: all around the world whose lives could be in danger 524 00:33:35,360 --> 00:33:38,200 Speaker 1: if they do not have access to those kinds of tools, 525 00:33:38,520 --> 00:33:42,480 Speaker 1: and there are plenty of examples, including here in the 526 00:33:42,560 --> 00:33:48,840 Speaker 1: United States, where if your communications were open to surveillance, 527 00:33:49,240 --> 00:33:51,800 Speaker 1: then you could really suffer as a result of that, 528 00:33:51,960 --> 00:33:54,560 Speaker 1: even if you were not guilty of any crimes. 
I mean, 529 00:33:54,600 --> 00:33:58,360 Speaker 1: there were cases at the NSA of people who were 530 00:33:58,400 --> 00:34:03,040 Speaker 1: allegedly spying on communications that had no illegal activity 531 00:34:03,120 --> 00:34:07,040 Speaker 1: connected to them, but happened to belong to, say, an 532 00:34:07,400 --> 00:34:12,839 Speaker 1: ex. Right, like an NSA contractor or agent was using 533 00:34:12,920 --> 00:34:15,680 Speaker 1: the tools of the agency to spy on people they 534 00:34:15,760 --> 00:34:18,960 Speaker 1: knew personally, or to look at things like, let's say 535 00:34:19,000 --> 00:34:22,960 Speaker 1: someone is sending a nude photo of themselves to their 536 00:34:23,000 --> 00:34:26,400 Speaker 1: loved one or whatever, being able to intercept that and 537 00:34:26,480 --> 00:34:29,440 Speaker 1: look at it. I mean, that happened a lot. And 538 00:34:29,840 --> 00:34:33,440 Speaker 1: you know, again that's not connected with illegal activity necessarily, 539 00:34:33,640 --> 00:34:38,600 Speaker 1: so there's no justification for intercepting that information and then, 540 00:34:38,760 --> 00:34:41,239 Speaker 1: you know, saving it or looking at it or whatever it 541 00:34:41,280 --> 00:34:45,280 Speaker 1: may be. So there is a real need for ways 542 00:34:45,320 --> 00:34:50,360 Speaker 1: to communicate securely and privately. However, we're not really talking 543 00:34:50,360 --> 00:34:53,359 Speaker 1: about the secure component of Telegram in this case. We're 544 00:34:53,360 --> 00:34:57,120 Speaker 1: talking about a platform on which Durov could conceivably be 545 00:34:57,280 --> 00:35:02,040 Speaker 1: made aware of illegal activity going on across his platform, 546 00:35:02,320 --> 00:35:05,960 Speaker 1: which obligates him to take action. Failure to do so 547 00:35:06,600 --> 00:35:10,839 Speaker 1: indicates an element of complicity in those crimes. 
Now, if 548 00:35:10,920 --> 00:35:14,440 Speaker 1: everything were encrypted, then Durov would really be free and clear, 549 00:35:14,880 --> 00:35:17,480 Speaker 1: because true encryption would mean he would have no way 550 00:35:17,480 --> 00:35:21,360 Speaker 1: of knowing what is actually transpiring across the platform. 551 00:35:21,400 --> 00:35:25,160 Speaker 1: People might make use of the platform to conduct illegal activities, 552 00:35:25,239 --> 00:35:28,520 Speaker 1: but that's beside the point, because people commit crimes all 553 00:35:28,560 --> 00:35:31,680 Speaker 1: the time on things like the road, right? You don't 554 00:35:31,719 --> 00:35:34,480 Speaker 1: shut the road down. It's not the road's fault that 555 00:35:34,520 --> 00:35:38,000 Speaker 1: anyone did that; that's just where it happened. So if 556 00:35:38,040 --> 00:35:40,480 Speaker 1: everything were encrypted and there was no way to know 557 00:35:41,000 --> 00:35:44,760 Speaker 1: what anyone was doing on the platform, Durov would probably 558 00:35:44,840 --> 00:35:48,439 Speaker 1: have a really strong defense. But the fact that there 559 00:35:48,520 --> 00:35:51,160 Speaker 1: are all these methods that are not encrypted, in fact, 560 00:35:51,200 --> 00:35:55,600 Speaker 1: only one method is encrypted, that's what really gives him 561 00:35:56,040 --> 00:35:59,719 Speaker 1: potentially a huge problem, because you can make the argument, hey, 562 00:36:00,040 --> 00:36:04,520 Speaker 1: there's nothing stopping you from being aware of this illegal activity, 563 00:36:04,680 --> 00:36:06,960 Speaker 1: and the fact that you're not doing enough to curtail 564 00:36:07,040 --> 00:36:09,839 Speaker 1: that means you are complicit in that, and we're going 565 00:36:09,880 --> 00:36:12,080 Speaker 1: to hold you accountable. So that's kind of where he 566 00:36:12,320 --> 00:36:15,680 Speaker 1: finds himself today. 
The ferocity of authorities in this matter 567 00:36:15,840 --> 00:36:19,719 Speaker 1: also raises concerns about security and privacy. Some experts that 568 00:36:19,800 --> 00:36:23,600 Speaker 1: Pearson quotes in his article on The Verge say 569 00:36:23,680 --> 00:36:26,440 Speaker 1: that really the matter is more about how much did 570 00:36:26,480 --> 00:36:29,520 Speaker 1: Durov know about the illegal activity, not so much about 571 00:36:29,520 --> 00:36:34,040 Speaker 1: the private, secure communication aspect of Telegram. But still, we 572 00:36:34,120 --> 00:36:38,399 Speaker 1: do live in a post NSA PRISM world. We live 573 00:36:38,440 --> 00:36:41,680 Speaker 1: in a world where we are aware of the various 574 00:36:41,719 --> 00:36:46,520 Speaker 1: attempts to monitor communications, whether those communications are criminal or otherwise, 575 00:36:46,880 --> 00:36:49,799 Speaker 1: and we know that people have exploited those programs to 576 00:36:49,880 --> 00:36:54,880 Speaker 1: various degrees to the harm of innocent citizens. So seeing 577 00:36:54,960 --> 00:36:59,080 Speaker 1: authorities go after the CEO of a company that provides 578 00:36:59,360 --> 00:37:02,000 Speaker 1: that kind of communication, even though that's just one small 579 00:37:02,040 --> 00:37:05,040 Speaker 1: part of what Telegram does, it does raise concerns. It 580 00:37:05,080 --> 00:37:09,920 Speaker 1: makes you worry about surveillance states and this almost pathological 581 00:37:10,160 --> 00:37:15,120 Speaker 1: need to have access to all information just in 582 00:37:15,160 --> 00:37:20,320 Speaker 1: case something in that giant mass of data represents 583 00:37:20,320 --> 00:37:24,320 Speaker 1: illegal activity. It's kind of the concern about being presumed 584 00:37:24,320 --> 00:37:27,440 Speaker 1: guilty until proven innocent. 
That's kind of the opposite of, 585 00:37:27,800 --> 00:37:32,480 Speaker 1: at least, how we like to think the American system goes. 586 00:37:32,880 --> 00:37:35,600 Speaker 1: Once in a while, it actually is true that people 587 00:37:35,640 --> 00:37:39,480 Speaker 1: are presumed innocent until proven guilty. That's nice when that happens, 588 00:37:39,880 --> 00:37:43,480 Speaker 1: but yeah, something like this, it gives the lie to 589 00:37:43,560 --> 00:37:47,360 Speaker 1: that, right? The implication is that you're presumed to have 590 00:37:47,480 --> 00:37:52,200 Speaker 1: been guilty of something. It's just that something may not 591 00:37:52,360 --> 00:37:58,760 Speaker 1: yet be discoverable. Pretty dark stuff. But yeah, my personal 592 00:37:58,800 --> 00:38:01,920 Speaker 1: opinion doesn't really matter in this case. I'm curious 593 00:38:01,960 --> 00:38:05,880 Speaker 1: what other people's opinions are. But I feel I have 594 00:38:05,920 --> 00:38:10,400 Speaker 1: a complex reaction to this. I don't like the concept 595 00:38:10,560 --> 00:38:16,040 Speaker 1: of a platform allowing illegal content, particularly illegal content that 596 00:38:16,200 --> 00:38:20,719 Speaker 1: disproportionately hurts children, to continue to be able to do 597 00:38:20,760 --> 00:38:25,239 Speaker 1: that without repercussions. I find that to be really disturbing. 598 00:38:25,680 --> 00:38:29,040 Speaker 1: I appreciate the need for a place where free speech 599 00:38:29,120 --> 00:38:33,960 Speaker 1: can freely happen, but even free speech, at least here 600 00:38:33,960 --> 00:38:37,040 Speaker 1: in the United States, has its limitations. Free speech is 601 00:38:37,080 --> 00:38:40,719 Speaker 1: not meant to be absolutely free of consequence. It just means 602 00:38:40,760 --> 00:38:44,160 Speaker 1: the government can't dictate what you can and cannot say, 603 00:38:44,200 --> 00:38:47,080 Speaker 1: but there can be consequences to what you do say. 
604 00:38:47,320 --> 00:38:51,720 Speaker 1: It's a fine line and it's complicated. Anyway, I hope 605 00:38:51,960 --> 00:38:54,799 Speaker 1: that you learned something in this episode, that you learned 606 00:38:54,880 --> 00:38:58,680 Speaker 1: more about what Telegram is and where it came from. Some 607 00:38:58,719 --> 00:39:01,440 Speaker 1: of you out there may be Telegram users. I know 608 00:39:01,560 --> 00:39:05,160 Speaker 1: that a lot of folks who use Telegram are in 609 00:39:05,239 --> 00:39:10,279 Speaker 1: other countries, in places like Iran and India, and these 610 00:39:10,320 --> 00:39:14,520 Speaker 1: are places where governments can be quite authoritarian in their 611 00:39:15,200 --> 00:39:18,239 Speaker 1: desire to control the flow of information. Also, I do 612 00:39:18,360 --> 00:39:22,120 Speaker 1: find it somewhat ironic that when it was announced that 613 00:39:22,200 --> 00:39:27,719 Speaker 1: Durov was arrested in France, some countries, notably Russia, expressed 614 00:39:28,280 --> 00:39:32,240 Speaker 1: condemnation for that, saying this is a strike against free speech, 615 00:39:32,320 --> 00:39:35,440 Speaker 1: which is rich coming from Russia. I mean, that's the 616 00:39:35,480 --> 00:39:41,080 Speaker 1: same country that Durov fled from after Russian authorities essentially 617 00:39:41,160 --> 00:39:45,440 Speaker 1: tried to seize control of VK, and Durov left Russia 618 00:39:45,520 --> 00:39:49,080 Speaker 1: to found Telegram in the UAE largely because of that. 619 00:39:49,560 --> 00:39:52,200 Speaker 1: And here you have Russia saying shame on you, France, 620 00:39:52,239 --> 00:39:55,680 Speaker 1: for arresting this guy, who's a Russian citizen as well; 621 00:39:55,719 --> 00:39:59,240 Speaker 1: he still maintains Russian citizenship. And meanwhile, it's the same 622 00:39:59,320 --> 00:40:03,080 Speaker 1: country that caused Durov to flee in the first place. 
623 00:40:03,360 --> 00:40:06,280 Speaker 1: So yeah, everything's politics. I guess that's what this boils 624 00:40:06,320 --> 00:40:09,080 Speaker 1: down to. That's a cheerful thought. You know what? I'm 625 00:40:09,080 --> 00:40:11,680 Speaker 1: just gonna leave that there, and I'm gonna go off 626 00:40:11,719 --> 00:40:15,320 Speaker 1: and I'm gonna have a snack, maybe, I think, a cupcake. 627 00:40:15,840 --> 00:40:19,000 Speaker 1: Gonna have a little cupcake to kind of soothe my 628 00:40:19,120 --> 00:40:22,000 Speaker 1: feelings on this matter. I hope all of you out 629 00:40:22,080 --> 00:40:25,400 Speaker 1: there are doing well, and I'll talk to you again 630 00:40:26,080 --> 00:40:35,839 Speaker 1: really soon. Tech Stuff is an iHeartRadio production. For more 631 00:40:35,920 --> 00:40:40,640 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or 632 00:40:40,680 --> 00:40:46,320 Speaker 1: wherever you listen to your favorite shows.