Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Thursday, June 10, 2021. Let's get to it.

First, some background. In 2009, some hackers were able to access databases that were hosting a social app called RockYou, and they got away with more than 30 million user passwords, which was considered to be a lot. And those passwords, by the way, were in plain text. In other words, they were unencrypted. So once the hackers got them, they had passwords that were valid for as long as people failed to change them. Well, now word is out that a new data breach could mean anywhere from 8.4 billion to potentially as many as 82 billion passwords have been leaked. Now, 8.4 billion unique users, or unique logins, were identified in this batch, so that's already bad. But multiple passwords belonging to the same login could be in that group.
So that's why 82 billion is the upper end. So people are calling it RockYou 2021, in reference to that data breach from 2009. I would like to suggest an alternative: Rock You Like a Hurricane. But what's actually going on here? That's a good question. Some hacker has posted a text file that is 100 gigabytes in size, which is huge for a text file, and they've posted it on various hacker forums. This text file contains billions of usernames and passwords for various sites and services. All the passwords are between six and 20 characters long. There's no telling how many of those passwords are currently valid versus inactive. In fact, I would not be surprised if a lot of those logins were for services that don't even really exist anymore. But generally speaking, it's a good idea to check to see if you were potentially affected. One way you can do that is to search your email address against tools like Have I Been Pwned.
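As an aside on how a service like Have I Been Pwned can check a credential without ever seeing it: its companion Pwned Passwords range API uses k-anonymity, so the client sends only the first five hex characters of the password's SHA-1 and matches the rest locally. Here's a minimal sketch of that client-side logic; the helper names are my own, and actually fetching the range response over HTTPS is left out.

```python
import hashlib


def pwned_prefix_suffix(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 into the 5-character prefix that is sent
    to the range API and the 35-character suffix that never leaves your
    machine."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]


def is_pwned(password: str, range_response: str) -> bool:
    """Match the local suffix against the body returned by
    GET https://api.pwnedpasswords.com/range/<prefix>, where each
    non-empty line has the form '<suffix>:<breach count>'."""
    _, suffix = pwned_prefix_suffix(password)
    return any(
        line.split(":", 1)[0].strip() == suffix
        for line in range_response.splitlines()
        if line.strip()
    )
```

For example, `pwned_prefix_suffix("password")` yields the prefix `5BAA6`; the client fetches `/range/5BAA6` and scans the response lines for the remaining 35 characters, so the server only ever learns that five-character prefix.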
But another is to use a password vault program, because a lot of those will monitor news about data breaches, and they will proactively reach out to users and prompt them to change their passwords at various sites and services. It's also, by the way, a very good idea to activate two-factor authentication for those services that allow for it. If a service allows for it, you should probably do it. These sorts of data breaches are the kind that are outside of our control as users, so we kind of have to take steps to protect ourselves. We also have to remember, every time there's a big data breach like this, those passwords, even if they're not active, even if they aren't currently associated with those accounts, go into an ever-growing dictionary of existing passwords.
So if you're the type of person who creates a password and then over time cycles back to that password, let's say that you alternate between a few, that means that if you're part of this breach, then even a password you're not using right now but could use in the future could leave you vulnerable. So again, it's always a good idea to get a password vault program. There are some that will generate very tough passwords on your behalf and store them for you, so you don't have to keep track of all that. It is a little bit more of a hassle, obviously more of a hassle than just using a single password for everything. But obviously that is not safe, and it means that everywhere you go you're potentially vulnerable. So I recommend getting a password vault and using it. There are a lot of good ones out there, so I recommend researching those and choosing one that fits your needs.
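To give a sense of what those vault generators are doing under the hood, here's a small sketch using Python's `secrets` module, which is designed for cryptographically strong randomness. The length and character-class rules here are my own illustrative choices, not any particular vault's policy.

```python
import secrets
import string


def generate_password(length: int = 20) -> str:
    """Return a random password drawn from letters, digits, and punctuation,
    rejecting any draw that happens to miss one of those character classes."""
    if length < 4:
        raise ValueError("length must be at least 4 to cover every class")
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        candidate = "".join(secrets.choice(alphabet) for _ in range(length))
        if (any(c.islower() for c in candidate)
                and any(c.isupper() for c in candidate)
                and any(c.isdigit() for c in candidate)
                and any(c in string.punctuation for c in candidate)):
            return candidate
```

The rejection loop almost always succeeds on the first draw at realistic lengths; the point is that a 20-character password from a 94-symbol alphabet is far outside any dictionary built from leaked credentials.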
And speaking of hackers, the video game developer and publisher Electronic Arts, or EA, got hacked, and I feel like a lot of people outside of EA don't have a whole lot of sympathy for the company. But that's more because there are people with very strong negative opinions about EA. I think it is a shame. I don't like it when any company gets hacked. And apparently, in this case, the hackers were able to access some of EA's internal systems, and they gained access to the source code for certain games as well as access to other internal tools. Now, according to EA, no user data was compromised as a result of this breach, so in theory at least, users were not affected by this. The company also says it's already taking steps to address the issue by changing things up so that the source code the hackers accessed would be useless. In other words, they wouldn't be able to exploit EA titles, because EA is going to change those titles significantly enough that the source code will be, you know, a moot thing.
Now, unless the source code was for some pretty minor stuff, I imagine that last bit is going to be really challenging to pull off. It's not as easy as just waving your hand at it. The hackers claim to have stolen around 780 gigabytes of data, though so far the hackers have largely just alluded to what they've stolen. They've shown some screenshots and stuff, but they haven't actually shared the information itself. They are advertising the data as being up for sale on various black market forums, so there's some crime right there.

El Salvador recently announced that the country may soon accept bitcoin as legal tender and consider it parallel to the US dollar, which is El Salvador's national currency, in the sense that, you know, this would be an accepted currency. You could use bitcoin anywhere within El Salvador and it would be accepted just as the actual currency of the country would. And the government of El Salvador says that this will aid people who are from El Salvador but who are living abroad to be able to send remittances back home.
That it will remove some of the barriers that would exist otherwise when they want to transfer funds back to people in El Salvador. But the International Monetary Fund has expressed concern about this move, stating that cryptocurrencies in general, and bitcoin in particular, represent some legal and economic concerns that have yet to be addressed. They didn't go into a whole lot of detail on that. I'm sure there's a lot of different things that we could easily point to, like the fact that cryptocurrencies frequently get used in concert with illicit activities. Not exclusively, but they do have that association, unfortunately. And also taxation becomes an issue, and that brings us to here in the United States, where the Internal Revenue Service, or IRS, is seeking authority from Congress to oversee operations that relate to cryptocurrencies. So as it stands, there is no clear law that gives the IRS authority to do stuff like collect taxes on gains that are generated through cryptocurrency transactions. And further, the IRS lacks the legal foundation to go after instances of fraud that involve cryptocurrencies.
The decentralized nature of cryptocurrencies is one of the many features that attract people to them, and another is that a lot of transactions, while part of a publicly viewable ledger, are otherwise kind of off the radar. And it remains to be seen if Congress will grant the authority to the IRS to pursue, you know, more responsibility with regard to cryptocurrencies. And I imagine that this news is affecting the cryptocurrency market a bit, because everything seems to affect the cryptocurrency market a bit. I mean, if Elon Musk tweets about it, it will really get going.

Sticking with politics for a moment, a group of Democratic politicians in the United States has proposed a few different pieces of antitrust legislation that seem to be aimed specifically at really big tech companies, and particularly the Big Five of Apple, Amazon, Alphabet, Facebook, and Microsoft. While there are draft bills on the matter, and there are several of them, and they come from different people and they say slightly different things, you can kind of boil them all down to trying to deal with two main concerns.
Concern number one regards companies that can both operate a marketplace and sell their own goods and services through that marketplace, along with goods and services from other companies. So this would include corporations like Apple and Amazon. Both of them have their own products listed for sale in marketplaces that they each control, respectively. Right? So Amazon's got its own shop, obviously, and it has its own products in that shop side by side with products from other companies. That's kind of what they're talking about. That represents a conflict of interest, and it means that these companies could conceivably promote their own products over those from other entities. So if you've ever done a search for a product on Amazon and you noticed that the first few results all seem to be Amazon-branded products, then you get what I'm saying. This is a little different than stores that have a store brand. Usually in those stores, you don't see the store brand presented in a way that obscures or, you know, eclipses the other brands. It's side by side, and the idea is that, well, it's not promoted over these others.
It may be cheaper and thus more attractive, but it's not like the store is necessarily using tricks to promote one over the other. The concern here is that, through technology, those are exactly the tricks these companies are using, and there's a worry that this creates an unfair market and puts all the other parties at a disadvantage relative to the main company. The other big concern is about acquisitions. So a lot of these big tech companies got that big not just through organic growth, but by gobbling up other companies along the way. Sometimes these acquisitions lead to less competition in a market, which is rarely good news for consumers, and we see this across numerous industries, from tech hardware to entertainment media and beyond. The draft bills generally would make it more difficult for a large company to pursue acquisitions. Now, all this stuff is, again, just in draft form. It could change significantly before it ever goes to the floor for a vote, and of course there's no guarantee that any of these bills will even get that far, but it is interesting to see where things are going.
Speaking of Amazon, the company could face a fine in excess of $425 million in the EU for violations of the GDPR. That's the General Data Protection Regulation. The GDPR is a set of pretty strict rules that dictate how companies can collect and use information that's generated by citizens of the EU. And apparently Amazon's approach was not in line with those rules, and now the company will have to pay up. I recently did a couple of episodes about privacy and how tech companies collect and use our personal information, so I recommend you check out the episodes that published on Monday and Wednesday to learn more about that.

And related to Amazon is Ring, a company Amazon purchased in 2018. Ring is famous for its smart doorbells, and it's also famous for being a part of massive surveillance systems utilized by police forces around the United States.
Ring filed a report earlier this year that showed that in 2020, the company received more than 1,800 legal demands to share video footage with various agencies, which was more than double the number from the year before. But what the report does not include is how many users actually had footage from their Ring systems shared with police. You know, that's 1,800 separate demands. Those demands could have included more than one video source, and Ring didn't really say how many users had their footage shared with police. Privacy advocacy groups and civil rights organizations have long protested how Ring will work with law enforcement, including the fact that, you know, you can have instances in which surveillance footage is being shared with police without the police first securing a warrant for that information, which does seem to tread pretty close to unreasonable search and seizure, which is, you know, something that we should be protected against by the Bill of Rights to the Constitution of the United States.
Meanwhile, over at Google, the company has recently made changes to down-rank search results from sites that focus on slander. That is, sites that are perpetrating slander. So if you've ever googled someone's name and you saw some results that seemed to be targeting that person for whatever reason and slagging off on them, that's really what we're talking about here. And some of these sites have been using various tactics to kind of game the system and rise up the page rankings so they dominate the top search results for those people, and now Google is going to actively remove that advantage. They're going to start down-ranking, or pushing down, the results for sites that appear to be dedicated to dragging other people down. But this is a really delicate thing. You know, over in the EU there's this concept of the right to be forgotten, and that's caused Google a lot of grief. And some folks point out that if it is easy to get negative things removed from search results, that could be doing a public disservice in some cases.
So, for example, let's say you've got a high-ranking public official, and this person does something truly terrible. It is in the public's best interest to know about that thing, and having a means to squelch that kind of information would effectively be suppressing the truth. So I am interested to see how this progresses. I do think it's important to knock down the trolls who are, you know, slagging off people in bad faith, usually as a way of either harming those people or profiting off of it, versus the cases where you have actual journalists who are attempting to document misbehavior.

The Verge reports that Facebook is working on a smartwatch that, should things go as planned, will debut next summer. The smartwatch will have a display that incorporates two cameras, and that display will be detachable from the watch band. One of the two cameras will be forward-facing, and users will rely on that one to make video calls and the like.
The second camera is rear-facing, and thus it's really only useful if you detach the smartwatch display from the band. Otherwise you're just taking very dark pictures and videos of your wrist, I imagine. Anyway, the rear-facing camera will have a 1080p resolution. Facebook is already reportedly working with other companies to develop tech that could house the smartwatch display, like you could plug the smartwatch into other types of stuff, like maybe a backpack. Or imagine a robot that can accept this little smartwatch in a slot in the robot, and then it uses that smartwatch to power some functions. I might just be dreaming here. Anyway, the watch might also include a heart rate sensor, as well as other sensors. That, of course, has raised questions from people and organizations that are concerned about privacy, because Facebook has, well, let's just say it's known for being pretty darn hungry when it comes to personal information. And so if you've got yet another device feeding data points to Facebook, that's even more juicy information that the company can use to sell to advertisers and whatnot.
But a year is a really long time in the tech industry, so we'll have to wait and see if this watch ever really comes out. If it doesn't, it will have been a very expensive lesson for Facebook, because the company has reportedly invested around a billion dollars in this endeavor already.

And finally, Liz Hamren, the CVP of Xbox's Gaming Experience and Platforms, laid out future plans to GamesRadar with regard to where Xbox is going next. Hamren says that the company will continue to develop new consoles and hardware, so that's not going away, but we shouldn't expect anything new for a while, because the Series X console is still a very young one. Also, I would like to propose that the name of whatever follows should be the Next Box, so I'll take my check, thank you. Beyond that, however, it appears that, you know, Microsoft is really focusing on creating more ways for gamers to access the company's services. That includes partnerships with various smart TV manufacturers so that future televisions have a version of Xbox Game Pass just built right into them.
So that means all you would need is a compatible controller to connect to your TV, and then you could stream games from the cloud to your TV, and you don't need a console at all. There's also talk of standalone game-streaming devices, you know, kind of like the little streaming sticks that you could plug right into a smart TV, possibly through an HDMI connector, and then you link that device to your WiFi network at home and you stream games through that. So if your television doesn't come with Xbox Game Pass built into it, you could use this method instead. In other words, Microsoft's really exploring ways to reach beyond the consoles themselves to let players get access to games content through the cloud. And this is really interesting to me, because not that long ago Microsoft was really pushing consoles as being sort of a nexus for cloud-based services. You would buy a console, and then that console would be your gaming rig. It would be your set-top device where you could watch YouTube or Netflix or Hulu or whatever.
Plus it could interact with live television broadcasts and cable. And now Microsoft is sort of taking an opposite approach, at least when it comes to the games themselves. Again, the company is not abandoning hardware or consoles or anything like that, and I'm sure that future consoles will continue to support that nexus approach to services, but this does show a change in strategy. I like to think of it as the Netflix approach, where your goal is to get your service on every possible platform you can, and you extend your reach that way.

And that wraps up the news for Thursday, June 10, 2021. I hope you're all doing well, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.