Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer here at HowStuffWorks and I love all things tech. And in that last episode about the birth of Instagram, I described how Kevin Systrom and Mike Krieger built the app by paring down an earlier idea for a related app. I left off right around two thousand twelve, after the company had a successful second round of funding for seven million dollars and saw its number of users steadily increase, even though the app was only available for iOS at the time. By the end of two thousand eleven, celebrities were in on the game, with folks like Justin Bieber giving the service a huge boost by making their own accounts and sharing their photos with their baby, baby, baby. In January two thousand twelve, Instagram made another hire. Philip McAllister joined the company as an engineer. He had previously worked for Gowalla, which was a location-based social network that got gobbled up by Facebook.
In March, two more employees joined the company. They were Ryan Gomba, a developer who had founded a business called app That, and Bailey Richardson, a Stanford graduate and community manager who had previously worked for you Gallery. I mention these hires only to point out that Instagram at this phase was still a very small company, despite having a massive installed base. So you had millions of people using the app, but fewer than a dozen people actually working for the company. And that base got larger in early April of two thousand twelve, when Instagram finally launched on Android. You needed a device that was capable of running at least Android two point two, also known as Froyo. It had taken about a year and a half to bring the app over to Google's mobile operating system, and clearly Android users were eager for that app. More than a million people downloaded the app in the first twenty-four hours. Instagram would hit fifty million active users in early April, which was a huge jump since it had only hit ten million the previous September.
Its meteoric rise helped the company hold a successful funding round, raising fifty million dollars in investments in the process, which means that's a dollar per active user, which is pretty darn good, and it put the valuation of the company at about five hundred million dollars. Investors included Sequoia Capital, Thrive Capital, Benchmark, and Greylock. And then, almost immediately after that happened, came the big news: Facebook had come knocking, and Systrom answered the door. The news broke that Facebook would acquire Instagram for the princely sum of one billion dollars in a combination cash-and-stock deal. The timing of both the funding round and the acquisition announcement meant that it was very likely Instagram was in talks with both groups at the same time. So what the heck was going on here? Well, chances are Systrom and company were using the investment round as two things: a negotiation tactic and a safety net. The fact that investors were rushing to put more money into Instagram heightened the company's value, which gave Instagram a better position when Facebook was looking to acquire the company.
Instagram could point at its recent funding round as evidence of the company's worth and thus demand a higher acquisition price, and the companies that had just invested stood to see a return on those investments practically overnight. If you had poured fifty million dollars into a company and then that company sold for a billion dollars, you would stand to receive a really tidy profit on your investment. And worst case scenario, if the deal broke down and Facebook walked away, you'd still have fifty million dollars of investment money to work with. At the time of its acquisition, Kevin Systrom owned about forty percent of the company, Mike Krieger owned about ten percent, and the rest was owned by their various investors. The one billion dollar deal made them both multimillionaires overnight, at least on paper, with Systrom's stake being worth about four hundred million dollars. And there were just thirteen employees at Instagram at the time of the acquisition, and that includes the two founders. So it was a phenomenal deal. However, that would not stop people from later saying Instagram made the move too quickly.
Now, before I get into that analysis, why did Facebook want Instagram in the first place? Well, for one thing, Facebook was heading toward its own initial public offering, or IPO. That's when a company switches from being a privately held corporation into a publicly traded one, where you can end up trading stocks on the market. And if an IPO goes well, it earns a company a whole lot of cash. So the company becomes flush with cash and can suddenly do lots of stuff with it, typically things like building out resources or acquiring other companies. But Facebook had already found itself in a great position as far as cash goes. Forbes estimated that due to private share sales to Goldman Sachs, which happened in advance of the IPO, Facebook had about four billion dollars burning a hole in its enormous corporate wallet. Making a move like buying another company before Facebook had its own IPO was a pretty baller move, I guess. For another thing, Facebook really wanted to get at Instagram's reach with mobile users.
Facebook was doing well on desktops and laptops, but hadn't managed to be as successful in the mobile world, and other entities like Google and Twitter were eyeing Instagram too, which created a sense of urgency. And Facebook needed to find ways to make sure it would remain relevant as demographics and technology changed. Instagram has always skewed younger with its demographics than other social platforms. As Facebook as a platform started to age, it needed the next big thing to help keep everything moving forward. So why did some analysts later say Instagram made a big mistake selling to Facebook back in April of two thousand twelve? In two thousand fourteen, Business Insider ran a piece about this very question. By that time, Instagram boasted two hundred million users. That put it on par with Twitter, which had already held its own IPO and raised two billion dollars in the process, reaching a valuation of more than twenty billion dollars. And then Facebook went on to purchase the messaging service WhatsApp, which was five hundred million users large, and that's about two and a half times the size of Instagram.
But Facebook spent nineteen billion dollars on it in the process, so way more than two and a half times the amount it spent on Instagram. So some analysts have said that Instagram could have held out a little longer, continued working under the investment money it had raised in that fifty million dollar funding round, and watched as its price tag ticked higher and higher. Systrom, for his part, says that it is impossible to see the future and that playing this game in hindsight is pointless. Rather, he said, the acquisition enabled his team to work on the project that they loved with an enormous amount of support, and without that support, they may never have hit that two hundred million user mark. So it could have just been a moot point, a moot question. When the acquisition became public knowledge, Systrom and Zuckerberg both reassured people that Instagram would continue as its own entity and not be swallowed up into Facebook, incorporated into Facebook's features, and then later disappear. And so far, that has been the case.
While Instagram employees celebrated their windfall, the company continued to roll out features, build a larger audience, and generally stay on target until late two thousand twelve, when a corporate decision would backfire hard on Instagram and cause the company to backpedal a bit. So what happened? Well, it all had to do with those pesky terms of service that we all agree to but almost none of us ever bother to actually read. I'll explain more in just a moment, but first let's take a quick break to thank our sponsor. In December of two thousand twelve, Instagram updated its terms of service with the intent for the new policies to take effect starting the following January. Among those updates was a sentence that caused a bit of a ruckus, and the sentence read, quote, you agree that a business may pay Instagram to display your photos in connection with paid or sponsored content or promotions without any compensation to you, end quote. Now, at first blush, that seems to say that if you agree to use Instagram, any or all of your photos could be featured in commercials for products you never use, or you've never heard of, or you oppose morally.
And what's more, you would never see a dime in compensation, whether you liked the product or not. The advertisers would get the wealth of countless photographs uploaded to Instagram by millions of users without having to pay anyone other than Instagram for that privilege. Outrage followed. But what was really going on? Well, as it turns out, the terms of service were actually restricting what Instagram could do. Earlier language was more vague and permissive. But we need to understand a few truths. One of those truths is that companies that display user content require a lot of protection to make sure they can't be held liable if someone says, hey, I didn't want you to share that. The terms of service give broad licenses to such companies to display images, because that's what the companies do. Instagram displays the photos you upload, and it stores them as well. So in order for that to happen, Instagram has to have permission from you to do that, because the photos don't belong to Instagram. Those are your photos.
Presumably you're uploading your own work and not someone else's, and so to be able to display those images and to store them on servers, Instagram requires users' permission to do so. Instagram can't sell your photos to anyone else, because the service doesn't own your photos. I can't sell something that doesn't belong to me, at least not legally. Nor can Instagram change your photos beyond the ways you've indicated. So, in other words, you can use Instagram's tools to change your photos, but once you do that and you've settled on what the photo needs to look like, Instagram doesn't have permission to change it again. It doesn't have a license to create derivative works based off of the images you upload. There are similar terms of service elsewhere online, such as with Google Drive. Without those terms, someone might try to bring a case against a company saying, hey, I only authorized this company to show me my material on this particular device, nothing more.
So some of this is in the interest of self-protection, but that didn't mean that Instagram didn't have some wording that concerned people. For example, there was this passage, quote, to help us deliver interesting, paid, or sponsored content or promotions, you agree that a business or other entity may pay us to display your username, likeness, photos, along with any associated metadata, and/or actions you take in connection with paid or sponsored content or promotions, without any compensation to you, end quote. But the wording stated that Instagram could be paid to allow advertisers to display your content in connection with advertising. Now, as The Verge pointed out in an insightful article on the subject back in two thousand twelve, this actually created limitations for Instagram and advertisers. A sponsor would not be allowed to alter a photo in any way. It would be able to display a photo in connection with advertising, but couldn't turn the photo itself into an ad. So if you took a picture of a perfect day on a seashore, a nearby resort would not be allowed to display that photo with overlaid text all about the resort.
The Verge piece goes on to point out that Facebook already did something akin to this. If you were to like a page on Facebook that was run by a brand, that like could show up in friends' news feeds as a sponsored post. So if I liked the podcast page for The Soundtrack Show, which I recently did do, the podcast could pay Facebook to promote that action across my friends' news feeds, and they would see "Jonathan liked this page," and that spreads the word and perhaps even inspires my friends to also like that page. The Verge argued that Instagram didn't make a mistake with the way it formulated its terms of service, but rather made a huge mistake by not explaining those terms clearly in advance so that people wouldn't freak out. Instead, users got the impression that all their pictures of vacations, pets, and, let's face it, food were to become the ads of tomorrow, and while the users were the ones generating all the content, they wouldn't get paid for it. The reaction was, to put it lightly, negative.
Systrom issued a statement apologizing for the communication and the misunderstanding, but at that point the outrage was spiraling out of control, and so Instagram chose to roll back the language. Just a little bit later in December of two thousand twelve, still a month before the policies were even supposed to take effect, the terms of service essentially reverted back to what they were when Instagram first launched, which, as the Verge piece pointed out, actually meant that there were fewer protections in place for users. So it ironically got worse, not better. Another piece, also in The Verge, went so far as to suggest that the whole debacle would feed into a culture of deceit among companies, that Instagram's experience would encourage other companies to develop terms of service that would be difficult to suss out. They wouldn't be written in plain language, in order that people didn't get bent out of shape from the outset. And as the piece pointed out, Instagram's updated policy put their strategy in plain language. It made it easier to understand, and that's what set people off. When the terms were more obtuse, no one got upset, because no one took the effort to understand what the heck the terms meant in the first place. But once you spell it out, everyone gets up in arms about it, even though nothing had really changed that dramatically. Whether Instagram's experience has had an overall negative impact on the way companies create their terms of service is difficult to say, but Instagram did have its first real brush with user blowback, and at first some analysts claimed the move had a real impact on user behavior. There was a company called AppData, which measured Instagram usage through users who had linked their Instagram accounts to their Facebook profiles. They reported that the app had seen a twenty percent decline in daily activity over the week that followed the policy announcements and rollback, but a later statement from AppData cleared things up.
According to a representative, the report was based on inaccurate information, and the drop was far more likely to be connected to the fact that the week in question included Christmas. People were just not posting as much because they were enjoying a holiday, and there probably wasn't any direct connection to the policy controversy. Now, while the policy issue was a blip on the radar, Instagram was able to get back on track. By February of two thousand thirteen, the service boasted one hundred million monthly active users, and while some were starting to whisper that Systrom had sold too quickly, Systrom continued to argue that without the support of Facebook, the service wouldn't have grown that quickly to start with. Now, I've got a little bit more to say about where Instagram is today and how it got there, but first let's take another quick break to thank our sponsor. In two thousand thirteen, Instagram introduced a few new features. One was photo tagging. Now people could tag one another in photos, and it made it much easier to find photos of yourself with the inclusion of a Photos of You tab.
That tab displayed all the pictures of a user that other people had uploaded and tagged. Pretty useful when you go to a lot of events and parties with friends and then you want to see how many pictures of you looking like an idiot there are out there. I might be projecting a little bit. I'm not suggesting that you look like an idiot at parties. I'm admitting that I always do look like an idiot at parties. In June, Instagram introduced video sharing. This seemed to be a response to another popular product that had launched the year before. This one was called Vine. Vine would allow users to create video loops that lasted just six seconds and then upload those clips to sites like Twitter, which bought Vine just before it launched officially. The short form challenged users to find creative ways to leverage that format. Instagram's approach was similar, only it was nearly three times longer, with clips lasting up to fifteen seconds. Instagram also introduced more than a dozen special filters specifically designed to work with video.
This marked the first really big change to Instagram's features since it launched, though the team behind Instagram had been working the whole time to improve the service's speed and reliability, so it wasn't like they were shirking their duty. They just hadn't really added any big features. It also meant some people began to worry that Instagram could become Facebook's proving ground for features that Facebook wanted to explore but couldn't easily implement into the Facebook experience directly. Instead, you'd just add more stuff to Instagram. Ironically, this would mean pushing Instagram closer to its predecessor, the app known as Burbn, b-u-r-b-n. Also, Facebook had really wanted to buy Vine, and that didn't work out for them. This would become an ongoing thing for Facebook: Mark Zuckerberg would want to go out and acquire a company, he would get a negative reply, and then he would tell Instagram, hey, can you make a tool that essentially does this thing that we wanted to buy? And they'd say okay. In October of two thousand thirteen, Instagram encountered another controversy, one that continued for a while.
A photographer and model named Petra Collins 313 00:19:35,240 --> 00:19:37,919 Speaker 1: posted a photo of herself in a bikini, and in 314 00:19:38,000 --> 00:19:41,320 Speaker 1: the photo, some of Collins's pubic hair was actually visible 315 00:19:41,359 --> 00:19:45,320 Speaker 1: beneath her bikini bottom. Instagram ended up deleting her account, 316 00:19:45,359 --> 00:19:48,439 Speaker 1: which led to Collins protesting the move, claiming her photo 317 00:19:48,520 --> 00:19:52,960 Speaker 1: didn't violate any of Instagram's stated terms and conditions, and 318 00:19:53,000 --> 00:19:56,760 Speaker 1: that led to a wider discussion about Instagram's policies and 319 00:19:56,800 --> 00:20:00,760 Speaker 1: the seemingly inconsistent way the company enforced them. You could 320 00:20:00,840 --> 00:20:05,000 Speaker 1: argue that some photos, while not technically showing any nudity whatsoever, 321 00:20:05,320 --> 00:20:08,840 Speaker 1: were incredibly suggestive or sexual in nature, and therefore were 322 00:20:08,840 --> 00:20:13,720 Speaker 1: quote unquote fine by Instagram standards. Other photos that 323 00:20:13,800 --> 00:20:17,120 Speaker 1: might include a little nudity, but were not overtly sexual 324 00:20:17,280 --> 00:20:21,320 Speaker 1: or sensual, might end up being violations of Instagram's policies. 325 00:20:21,960 --> 00:20:26,200 Speaker 1: And there were other issues about a disparity between photos 326 00:20:26,240 --> 00:20:30,640 Speaker 1: of men and women, and this all started to spiral 327 00:20:30,720 --> 00:20:34,639 Speaker 1: into a big debate online that continues to this day. 328 00:20:35,119 --> 00:20:37,680 Speaker 1: It's been an ongoing issue for Instagram.
There were similar 329 00:20:37,720 --> 00:20:42,399 Speaker 1: incidents that happened in and there are still arguments about whether 330 00:20:42,520 --> 00:20:46,560 Speaker 1: or not the policies make sense, whether they are enforced equitably, 331 00:20:47,160 --> 00:20:49,919 Speaker 1: and whether it's a different story for men than it 332 00:20:50,040 --> 00:20:53,040 Speaker 1: is for women. Yet another controversy popped up in November, 333 00:20:54,000 --> 00:20:56,680 Speaker 1: which must have been a pretty rough year to work 334 00:20:56,720 --> 00:21:01,480 Speaker 1: in PR at Instagram. A BBC investigative report 335 00:21:01,680 --> 00:21:05,800 Speaker 1: revealed that people were selling illegal drugs using Instagram to 336 00:21:05,920 --> 00:21:10,560 Speaker 1: advertise their products, typically with a message that would send 337 00:21:10,600 --> 00:21:14,560 Speaker 1: people to the WhatsApp messenger in order to complete the transaction. 338 00:21:15,000 --> 00:21:18,600 Speaker 1: Instagram responded by cracking down, no pun intended, on the 339 00:21:18,680 --> 00:21:21,920 Speaker 1: various hashtags being used by people to sell drugs over 340 00:21:21,960 --> 00:21:27,080 Speaker 1: the service. Also in two thousand thirteen, Instagram introduced sponsored posts, 341 00:21:27,400 --> 00:21:30,960 Speaker 1: which would insert advertisements within the feeds of users as 342 00:21:30,960 --> 00:21:34,040 Speaker 1: a way to monetize the service, and the company also 343 00:21:34,119 --> 00:21:38,159 Speaker 1: introduced a feature called Direct, in which users could choose 344 00:21:38,200 --> 00:21:41,640 Speaker 1: to send photos to specific people as a private message 345 00:21:41,760 --> 00:21:44,960 Speaker 1: rather than sharing them in their public feed.
This last 346 00:21:44,960 --> 00:21:47,640 Speaker 1: feature seemed to be a response to apps like Snapchat, 347 00:21:47,880 --> 00:21:51,080 Speaker 1: which was growing in popularity at that time. Two thousand 348 00:21:51,160 --> 00:21:54,359 Speaker 1: thirteen was also the year Instagram finally came to the 349 00:21:54,400 --> 00:21:58,960 Speaker 1: Windows Phone platform. Nokia had been pressuring Instagram for a 350 00:21:59,080 --> 00:22:02,640 Speaker 1: version of its app to run on the Windows Mobile operating system. 351 00:22:02,760 --> 00:22:05,959 Speaker 1: Nokia's handset division was in the process of being acquired 352 00:22:06,000 --> 00:22:08,159 Speaker 1: by Microsoft by the time the app was on the 353 00:22:08,200 --> 00:22:12,439 Speaker 1: verge of launching. The app came out in November, and 354 00:22:12,560 --> 00:22:17,680 Speaker 1: immediately there were problems. For example, in the beta build, 355 00:22:17,920 --> 00:22:21,920 Speaker 1: you could not take photos directly from the app. Instead, 356 00:22:22,240 --> 00:22:26,159 Speaker 1: you would take photos with your phone's camera app, then 357 00:22:26,200 --> 00:22:30,960 Speaker 1: you would import the photos from your film roll into Instagram. Worse, 358 00:22:31,320 --> 00:22:35,200 Speaker 1: the original announcement mentioned video support, but the Windows Mobile 359 00:22:35,280 --> 00:22:38,879 Speaker 1: version of Instagram had no such capability. In fact, it 360 00:22:38,920 --> 00:22:42,440 Speaker 1: would take nearly two and a half years before Instagram 361 00:22:42,560 --> 00:22:46,280 Speaker 1: on Windows ten Mobile could support video. There have been 362 00:22:46,320 --> 00:22:50,800 Speaker 1: some standalone apps that interoperate with Instagram that officially come 363 00:22:50,960 --> 00:22:54,840 Speaker 1: from Instagram the company.
In fourteen, the company launched a 364 00:22:54,880 --> 00:22:58,040 Speaker 1: messaging app called Bolt, in which users could send an 365 00:22:58,040 --> 00:23:02,040 Speaker 1: image to other users that disappears after it's been viewed, 366 00:23:02,240 --> 00:23:06,919 Speaker 1: similar to Snapchat's design. Later, the company also released Hyperlapse, 367 00:23:07,080 --> 00:23:10,400 Speaker 1: a time-lapse video app. In two thousand and fifteen, 368 00:23:10,440 --> 00:23:15,080 Speaker 1: Instagram launched a companion app called Boomerang. Boomerang allows users 369 00:23:15,119 --> 00:23:17,919 Speaker 1: to take a burst of five photos within the span 370 00:23:17,960 --> 00:23:20,680 Speaker 1: of a second. The photos are then shown in sequence 371 00:23:20,760 --> 00:23:23,320 Speaker 1: as if they are five frames of a video. Then 372 00:23:23,800 --> 00:23:26,280 Speaker 1: the sequence reverses back to the beginning and so on 373 00:23:26,400 --> 00:23:29,520 Speaker 1: in a never ending loop, and it can be funny 374 00:23:29,640 --> 00:23:34,560 Speaker 1: or irritating, all depending on the execution. In s, Instagram 375 00:23:34,600 --> 00:23:38,400 Speaker 1: introduced Stories, which lets you create a series of photos 376 00:23:38,400 --> 00:23:41,880 Speaker 1: and videos, complete with layers and effects, and string them 377 00:23:41,880 --> 00:23:45,800 Speaker 1: together to form a narrative. The story expires after 378 00:23:45,840 --> 00:23:48,560 Speaker 1: twenty four hours; it disappears from feeds at that point. 379 00:23:49,160 --> 00:23:53,040 Speaker 1: In April, Instagram changed up its algorithm in a way 380 00:23:53,040 --> 00:23:58,000 Speaker 1: that still drives me crazy, kind of the way Facebook's 381 00:23:58,040 --> 00:24:00,919 Speaker 1: algorithm drives me crazy. And maybe this is just me.
382 00:24:01,440 --> 00:24:03,320 Speaker 1: I know it's not just me, because there are so many 383 00:24:03,400 --> 00:24:06,320 Speaker 1: articles about it, but maybe it doesn't bother you. But 384 00:24:06,400 --> 00:24:10,119 Speaker 1: the app changed from listing entries in reverse chronological order. 385 00:24:10,680 --> 00:24:12,840 Speaker 1: That would be where you have the most recent posts 386 00:24:12,840 --> 00:24:15,359 Speaker 1: at the top of your feed and older posts are 387 00:24:15,400 --> 00:24:18,720 Speaker 1: following behind, so you just scroll down and you're seeing 388 00:24:18,720 --> 00:24:21,840 Speaker 1: the older posts. Instead, they used a new algorithm 389 00:24:21,960 --> 00:24:23,760 Speaker 1: to show the posts in an order that was 390 00:24:23,800 --> 00:24:27,520 Speaker 1: supposed to surface more 391 00:24:27,520 --> 00:24:29,840 Speaker 1: of the type of stuff you really liked at the top. 392 00:24:30,400 --> 00:24:33,719 Speaker 1: Presumably the algorithm would take into account the people and 393 00:24:33,800 --> 00:24:36,960 Speaker 1: hashtags of photos that you had hit like on in 394 00:24:37,000 --> 00:24:39,760 Speaker 1: the past and use that to determine what would rise 395 00:24:39,760 --> 00:24:41,960 Speaker 1: to the top of your feed, so that you would 396 00:24:41,960 --> 00:24:44,880 Speaker 1: not miss out on those things, which I guess makes 397 00:24:44,920 --> 00:24:47,399 Speaker 1: some sort of sense, except it really cuts down on 398 00:24:47,480 --> 00:24:51,280 Speaker 1: discoverability. I much prefer the chronological approach, which I 399 00:24:51,320 --> 00:24:54,280 Speaker 1: think works better as long as you're not following thousands 400 00:24:54,280 --> 00:24:56,719 Speaker 1: of accounts.
If you are following thousands of accounts, then 401 00:24:56,720 --> 00:24:58,600 Speaker 1: clearly you need some help to see the stuff that 402 00:24:58,640 --> 00:25:01,919 Speaker 1: you would most likely enjoy anyway. But if you're like 403 00:25:02,040 --> 00:25:04,720 Speaker 1: me and you're following maybe a couple hundred accounts, 404 00:25:05,000 --> 00:25:08,840 Speaker 1: then chronological makes perfect sense. That same year, the company 405 00:25:08,880 --> 00:25:11,800 Speaker 1: faced more controversy when The Daily Star reported that it 406 00:25:11,840 --> 00:25:16,080 Speaker 1: had discovered users were uploading pornography to Instagram and hiding 407 00:25:16,119 --> 00:25:19,840 Speaker 1: it through Arabic hashtags. And another awful moment came when 408 00:25:19,840 --> 00:25:23,760 Speaker 1: Olivia Solon, the senior technology reporter for the US 409 00:25:23,880 --> 00:25:26,800 Speaker 1: branch of The Guardian, posted a screenshot of an 410 00:25:26,800 --> 00:25:29,919 Speaker 1: email she had received, and that email included death and 411 00:25:30,080 --> 00:25:33,600 Speaker 1: rape threats against her. That part is awful enough, but 412 00:25:34,040 --> 00:25:37,560 Speaker 1: the truly terrible part of this story was that the 413 00:25:37,600 --> 00:25:42,480 Speaker 1: post got a lot of engagement, and Instagram's ad algorithm 414 00:25:42,520 --> 00:25:47,880 Speaker 1: plucked the picture out for use in an advertisement, which 415 00:25:47,880 --> 00:25:52,520 Speaker 1: appeared on Solon's sister's feed. It's pretty horrifying to 416 00:25:52,560 --> 00:25:56,480 Speaker 1: see an email addressed to your sister appearing as a 417 00:25:56,520 --> 00:26:00,440 Speaker 1: sponsored post in your feed.
The purpose behind the algorithm 418 00:26:00,440 --> 00:26:03,359 Speaker 1: was to help encourage engagement, so if a post was 419 00:26:03,400 --> 00:26:06,040 Speaker 1: doing well, it's supposed to get featured on your friends' 420 00:26:06,080 --> 00:26:08,919 Speaker 1: feeds so that they don't miss out. But in this case, 421 00:26:09,520 --> 00:26:12,600 Speaker 1: it seemed as if Instagram was promoting the awful threats 422 00:26:12,760 --> 00:26:16,879 Speaker 1: against Ms. Solon, a very upsetting situation. Also, full disclosure, 423 00:26:16,880 --> 00:26:19,359 Speaker 1: I should say this. I can't say that I know 424 00:26:19,480 --> 00:26:22,760 Speaker 1: Olivia Solon, but she and I have interacted online a 425 00:26:22,800 --> 00:26:26,520 Speaker 1: few times, so I admit I'm not unbiased in this 426 00:26:26,560 --> 00:26:29,120 Speaker 1: particular story, though I would imagine I'd be horrified even 427 00:26:29,160 --> 00:26:31,960 Speaker 1: if I had never communicated with her. But that particular 428 00:26:31,960 --> 00:26:34,719 Speaker 1: problem is indicative of the sort of things that can 429 00:26:34,800 --> 00:26:39,040 Speaker 1: happen with algorithms without very careful supervision. An algorithm is 430 00:26:39,200 --> 00:26:42,600 Speaker 1: just a set of instructions, and if a situation seems 431 00:26:42,640 --> 00:26:46,240 Speaker 1: to fall within those parameters, the algorithm will act upon it. 432 00:26:46,240 --> 00:26:50,520 Speaker 1: It's not indicative of anything malicious necessarily, but it certainly falls 433 00:26:50,600 --> 00:26:54,920 Speaker 1: in the problematic or perhaps even negligent categories. And leading 434 00:26:55,000 --> 00:26:57,919 Speaker 1: up to the recording of these episodes, some more news 435 00:26:57,920 --> 00:27:01,200 Speaker 1: about Instagram recently came out.
One feature that has yet 436 00:27:01,240 --> 00:27:03,800 Speaker 1: to be implemented as of the time I'm recording this, 437 00:27:03,840 --> 00:27:06,320 Speaker 1: but should soon be part of the service, is a 438 00:27:06,400 --> 00:27:09,840 Speaker 1: mute button, which will let you silence friends without actually 439 00:27:09,840 --> 00:27:13,439 Speaker 1: unfollowing them, so you can see who is bugging 440 00:27:13,480 --> 00:27:15,040 Speaker 1: you and then you can mute that person. You 441 00:27:15,040 --> 00:27:18,080 Speaker 1: would stop seeing photos of my dog Tibolt without having 442 00:27:18,080 --> 00:27:21,040 Speaker 1: to let me know you've unfollowed me. Though, let's be honest, 443 00:27:21,080 --> 00:27:23,880 Speaker 1: Tibolt is so darn cute he deserves his own Instagram account. 444 00:27:23,880 --> 00:27:28,639 Speaker 1: He's just so darn cute. Instagram is also introducing a 445 00:27:28,640 --> 00:27:31,320 Speaker 1: feature to let you know when you've seen all the 446 00:27:31,440 --> 00:27:35,080 Speaker 1: latest posts from the past given amount of time, which 447 00:27:35,080 --> 00:27:38,160 Speaker 1: would save you the effort of scrolling past images you've 448 00:27:38,200 --> 00:27:41,760 Speaker 1: seen multiple times in the hopes of glimpsing something new. Now, 449 00:27:41,800 --> 00:27:44,160 Speaker 1: this would not be necessary if the app, I don't know, 450 00:27:44,800 --> 00:27:49,199 Speaker 1: listed images in reverse chronological order or something, but it 451 00:27:49,280 --> 00:27:51,280 Speaker 1: lets you know you've hit the last of the new 452 00:27:51,359 --> 00:27:53,760 Speaker 1: stuff and that scrolling further will just show you more 453 00:27:53,760 --> 00:27:56,600 Speaker 1: of what you've already seen.
So you start scrolling down 454 00:27:56,840 --> 00:27:59,000 Speaker 1: and eventually you see a message saying you have seen 455 00:27:59,240 --> 00:28:03,800 Speaker 1: all the latest posts from the last six hours, let's 456 00:28:03,800 --> 00:28:06,480 Speaker 1: say. I'm just arbitrarily making up that time, but you 457 00:28:06,520 --> 00:28:08,440 Speaker 1: get the idea, and that way you know, all right, well, 458 00:28:08,480 --> 00:28:10,720 Speaker 1: that's how many there were for the last six hours. 459 00:28:10,720 --> 00:28:12,760 Speaker 1: I'll check again in a couple of hours and it 460 00:28:12,760 --> 00:28:14,440 Speaker 1: will tell me when I get to that same spot 461 00:28:14,440 --> 00:28:16,119 Speaker 1: and say, oh, you've seen the latest ones from the 462 00:28:16,160 --> 00:28:19,040 Speaker 1: last two hours. And you can keep doing it that 463 00:28:19,119 --> 00:28:21,600 Speaker 1: way without having to dig way back in your Instagram 464 00:28:21,640 --> 00:28:24,240 Speaker 1: feed. Again, it wouldn't be necessary if it were in reverse 465 00:28:24,320 --> 00:28:27,280 Speaker 1: chronological order, but I'm beating a dead horse. And another 466 00:28:27,359 --> 00:28:30,480 Speaker 1: thing that the app is launching is a native payments feature, 467 00:28:30,920 --> 00:28:32,679 Speaker 1: so you will soon be able to, if 468 00:28:32,680 --> 00:28:35,120 Speaker 1: you can't already, register 469 00:28:35,280 --> 00:28:38,280 Speaker 1: a credit or debit card in a profile. You would 470 00:28:38,280 --> 00:28:42,200 Speaker 1: then set up a security PIN, a personal identification number, and 471 00:28:42,240 --> 00:28:45,680 Speaker 1: then you can buy stuff through Instagram. You might be 472 00:28:45,720 --> 00:28:48,440 Speaker 1: able to do things like make dinner reservations or book 473 00:28:48,440 --> 00:28:51,480 Speaker 1: a session in a hair salon.
You may soon be 474 00:28:51,520 --> 00:28:54,760 Speaker 1: able to buy things advertised on Instagram directly through the app, 475 00:28:55,120 --> 00:28:58,600 Speaker 1: removing the necessity of navigating away from Instagram to buy 476 00:28:58,640 --> 00:29:02,480 Speaker 1: those killer shades you saw or whatever. Now again, will 477 00:29:02,520 --> 00:29:07,000 Speaker 1: these features make Instagram ever more popular? Or are we 478 00:29:07,040 --> 00:29:11,400 Speaker 1: starting to see that feature creep problem crop up here, where 479 00:29:11,400 --> 00:29:14,600 Speaker 1: more and more things that were never part of Instagram's 480 00:29:15,360 --> 00:29:19,160 Speaker 1: makeup back in the old days start getting crammed 481 00:29:19,200 --> 00:29:23,400 Speaker 1: in there? Are we going to see Instagram become Burbn, 482 00:29:24,240 --> 00:29:29,160 Speaker 1: the old proto Instagram app? And if it does, 483 00:29:29,280 --> 00:29:33,720 Speaker 1: will users still enjoy using Instagram, or will they say 484 00:29:34,080 --> 00:29:37,760 Speaker 1: there are so many features here that it's cluttered and it's clumsy, 485 00:29:38,240 --> 00:29:41,400 Speaker 1: it's not fun to use, I much prefer a streamlined app, 486 00:29:41,440 --> 00:29:44,240 Speaker 1: and they'll go looking for something else? It's too early 487 00:29:44,280 --> 00:29:46,560 Speaker 1: to say for sure. I know that I tend to 488 00:29:46,600 --> 00:29:49,239 Speaker 1: think that it would be best to keep things as 489 00:29:49,240 --> 00:29:54,280 Speaker 1: streamlined as possible, and perhaps develop partner apps that have 490 00:29:54,480 --> 00:29:57,480 Speaker 1: these other features that are starting to creep into Instagram, 491 00:29:58,000 --> 00:30:01,080 Speaker 1: and go that route instead of putting them all 492 00:30:01,080 --> 00:30:03,080 Speaker 1: in one. But who am I to say?
I'm not 493 00:30:03,200 --> 00:30:09,240 Speaker 1: worth four million dollars, so I certainly can't argue that 494 00:30:09,280 --> 00:30:12,840 Speaker 1: I have all the answers. I'm curious what you guys think, 495 00:30:13,120 --> 00:30:16,160 Speaker 1: and also if you have any suggestions for future episodes, 496 00:30:16,720 --> 00:30:20,200 Speaker 1: you should write me your thoughts. The email address is 497 00:30:20,320 --> 00:30:23,800 Speaker 1: tech stuff at how stuff works dot com, or you 498 00:30:23,840 --> 00:30:26,520 Speaker 1: can drop me a line on Facebook or Twitter. The 499 00:30:26,560 --> 00:30:28,800 Speaker 1: handle at both of those is tech Stuff H S W. 500 00:30:29,520 --> 00:30:32,600 Speaker 1: Or also, hey, follow us on Instagram, which I've been talking 501 00:30:32,600 --> 00:30:35,360 Speaker 1: about so much. Just go there, follow tech stuff, and 502 00:30:35,400 --> 00:30:37,880 Speaker 1: it'd be great to see you there, and I'll talk 503 00:30:37,920 --> 00:30:46,200 Speaker 1: to you again really soon. For more on this and 504 00:30:46,240 --> 00:30:48,760 Speaker 1: thousands of other topics, visit how stuff works dot 505 00:30:48,800 --> 00:30:58,880 Speaker 1: com