Speaker 1: Zuckerberg gets grilled. I'm Rich DeMuro. This is Rich on Tech Daily. Well, if you're watching the news at all, you saw that Facebook CEO and founder Mark Zuckerberg talked to Congress yesterday and also today, and man, was that interesting to watch. If you watched any of it, you realized several things. First off, I think our lawmakers are a little bit in the dark about how the internet works, how Facebook works, how they track us, how they make money. But I think it also showed another side of things: that what these tech companies are doing is not always transparent to the average person.

Speaker 1: And I can't knock these guys in Washington for not knowing the inner workings of the internet and how technology works, because let's face it, most people don't when you think about it. Back in the day, we had newspapers, and they were a trusted source. If something was published in a newspaper or a magazine, well, you knew an editor went through it. Now, with Facebook, people post stories from all kinds of sources, and you don't really know where they're coming from. We have to be the editor at the end of the day, and that's a big job for a lot of people. Most of us have a normal job. We're sitting there working. We don't have time to fact-check every single article or story or post that we see, so we take it at face value, and as we've seen, that has sort of backfired.

Speaker 1: Let me go through some of the remarks that Zuckerberg made, and I'll comment on some of them. First off, he said, Facebook is an idealistic and optimistic company. For most of our existence, we've been focused on all the good that connecting people can do. As Facebook has grown, people everywhere have gotten a powerful new tool for staying connected to the people they love, making their voices heard, and building communities and businesses. All of that is very true.
Speaker 1: So when you think about Facebook, it really is connecting people in new ways, in different ways, in ways that we never really thought of. But as they've gone down that road, we've seen it's been a tough one, because when you think about the ways that Facebook has always been trying to connect people, they've come under a lot of heat for the ways that they do that. So these are all things that they're learning from.

Speaker 1: Zuckerberg then goes on to say, it's clear now we didn't do enough to prevent these tools from being used for harm as well. That goes for fake news, foreign interference in elections, hate speech, as well as developers and data privacy. And right there he's really talking about what happened with Cambridge Analytica. And here's the thing: when you're something like Facebook, a large organization that's building these tools, you don't necessarily know how everyone's going to use them. You might have an idea how they're going to be used, but you can't figure out every possibility in the world, and I think that's what he's saying here. Since twenty fourteen or twenty fifteen, when they changed a lot of these things, Facebook has done better, and I think now, with the spotlight on them, they will continue to do better.

Speaker 1: Zuckerberg goes on to say, it's not enough to just connect people; we want to make sure those connections are positive. That's a big one too, because this really deals with fake news. And here's the thing when it comes to Facebook: they want to spread information. It is an amazing place to spread information. If you have an announcement, if you've ever had a kid, you put a birth announcement on there, or you say you're going to the hospital, and it's amazing all the people that come out of the woodwork to offer you congratulations or to give you some information. Or if you're traveling somewhere, you know how people can tell you, oh, I've been to that place.
Speaker 1: You can't possibly poll every single one of your friends and say, hey, have you been to France? Tell me the best restaurants. But on Facebook you can do that. And so again, they're saying here that these are tools that really do connect people, but you have to make sure that they're not used to spread misinformation. And that's a tough thing to do, because Facebook can't fact-check everything that's posted to their site, but they can create some tools that let them figure out what's good and what's bad.

Speaker 1: Zuckerberg goes on to say, I believe deeply in what we're doing, and I know that when we address these challenges, we'll look back and view helping people connect and giving people a voice as a positive force in the world. I remember when I first got access to Facebook. It was such an honor. I mean, I remember I used my alumni email address to get access to the website, because you had to have a .edu email address to get access to Facebook back in the day. And when I did that, it was like a whole new world had been unlocked for me. A lot of people felt that way, and we've gotten away from that. Now we take Facebook for granted, but the reality is it has done a lot of good for people. At the same time, it has become a huge force in our lives, and Facebook needs to know that. They need to be cognizant of the strength and the power that they have. And that's part of this learning situation.

Speaker 1: So, by the way, what's the point of all this tracking? Why do these companies want to track us so badly? It all comes down to one thing: money. I've said this before. They want to sell us stuff, whether it's Facebook selling ads, whether it's Google selling ads, whether it's Target getting us into their store to buy stuff. The more that they know about us, the easier it is to sell us stuff. And the more that companies know about us, the better it is for advertisers who want to target us.
Speaker 1: And that's really what's going on here. So when you think about Facebook, what they're doing is trying to collect as much data on us as possible, because that way marketers can sell us stuff in an easy way. If you were on Facebook back in the early days, you'd see a bunch of ads in your news feed and along the side, and the biggest complaint about those ads was that they meant nothing to you. They'd be for products that you just didn't care about. Well, now, obviously, Facebook has gotten really good at targeting us, so a lot of the ads you see are for products where people say, oh, I was just thinking about that product, or I searched for that product. Same thing with Instagram, same thing with pretty much any website that uses Google Ads or any major online ad provider.

Speaker 1: In reality, this is kind of a good thing for us. Think about it: it's bad for your wallet, because you're probably going to spend more money, but it's a good thing because you're interested, or could be interested, in the products that you see. Back in the day, let's just use the hypothetical of watching a national telecast. Millions of people would be watching that show, and you'd probably see an ad for a really broad product like Colgate toothpaste. Millions of people would see that ad. It would build brand awareness, and yes, maybe a percentage of those people would purchase it. Now we're getting highly targeted ads for products that we may like, that we may be interested in for this period in our lives. And this is really nothing new. It just happens to be on the internet, so it's much more optimized. Think about when you went to the mall and saw that big shiny car in the center of the mall that you could win.
Speaker 1: Well, guess what? It wasn't just, hey, let's pick someone out of a raffle. No, it was: fill out this card that asks you everything, where you live, how much you make, what your email address is, what your cell phone number is. They're giving away a car that might cost thirty thousand dollars, but they're gathering thousands of dollars' worth of data on the individuals who enter the raffle. And guess what happens to that information? It's all sold and resold to marketers, and that's what it's all about. So this has been going on since the beginning of time. It's just that companies like Facebook and Google have gotten extremely good at it.

Speaker 1: Overall, putting all this stuff on TV and having it all come to light is a good thing for all of us, because tech companies in general are now going to realize that they can't get away with all the crazy stuff that they might have been able to get away with in the past.

Speaker 1: There you have it. Thanks so much for listening. Rich DeMuro, rich on tech dot TV. Tell me what you think. You can tweet me on Twitter or find me on Facebook as well. I know you're still on there, because I am. So thanks so much for listening, and I'll talk to you real soon.