1 00:00:04,440 --> 00:00:12,280 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, 2 00:00:12,280 --> 00:00:16,599 Speaker 1: and welcome to TechStuff. I'm your host, Jonathan Strickland, an 3 00:00:16,640 --> 00:00:21,119 Speaker 1: executive producer at iHeartRadio. And how the tech are you? 4 00:00:21,720 --> 00:00:24,800 Speaker 1: It is time for the tech news for Tuesday, July eleventh, 5 00:00:24,880 --> 00:00:28,840 Speaker 1: twenty twenty three, and we've got several stories relating to 6 00:00:29,120 --> 00:00:33,800 Speaker 1: TikTok and the ongoing concerns about whether that app poses 7 00:00:34,080 --> 00:00:39,320 Speaker 1: a national security threat to various countries. And up first, 8 00:00:39,760 --> 00:00:43,519 Speaker 1: US lawmakers are looking to amend a proposed law that 9 00:00:43,560 --> 00:00:47,559 Speaker 1: would give the US president broad powers to ban TikTok 10 00:00:47,600 --> 00:00:52,320 Speaker 1: in the United States as well as other Chinese-made apps. Already, 11 00:00:52,320 --> 00:00:55,800 Speaker 1: we've seen bans for various federal and state agencies here, 12 00:00:56,520 --> 00:01:00,840 Speaker 1: meaning that you are not supposed to install TikTok on a 13 00:01:01,880 --> 00:01:05,440 Speaker 1: government-owned device in a lot of these places. But 14 00:01:05,520 --> 00:01:08,479 Speaker 1: this legislation would give the president the authority to block 15 00:01:08,520 --> 00:01:11,560 Speaker 1: transactions that would allow foreign companies to even build out 16 00:01:11,680 --> 00:01:15,600 Speaker 1: information and communications infrastructure here in the United States. And 17 00:01:15,640 --> 00:01:18,280 Speaker 1: in a lot of ways, this actually mirrors how the 18 00:01:18,400 --> 00:01:23,280 Speaker 1: US treated radio way back in the early twentieth century. 19 00:01:23,760 --> 00:01:28,520 Speaker 1: As the First World War escalated, US authorities seized radio 20 00:01:28,640 --> 00:01:33,360 Speaker 1: broadcast assets, essentially radio stations that had previously been under 21 00:01:33,360 --> 00:01:36,560 Speaker 1: the ownership of companies that had a foreign parent company, 22 00:01:37,120 --> 00:01:42,319 Speaker 1: namely the British Marconi Company. So after the war, the 23 00:01:42,440 --> 00:01:46,920 Speaker 1: US declined to return those assets to those foreign companies, 24 00:01:47,240 --> 00:01:49,440 Speaker 1: concerned that it might not be a good idea to 25 00:01:49,520 --> 00:01:53,160 Speaker 1: allow a company that's native to some other country to 26 00:01:53,360 --> 00:01:58,880 Speaker 1: own and operate telecommunications infrastructure within America. So this legislation 27 00:01:59,840 --> 00:02:03,040 Speaker 1: is taking kind of a similar approach. I think the 28 00:02:03,120 --> 00:02:06,600 Speaker 1: legislation was introduced in early spring, but according to Senator 29 00:02:06,680 --> 00:02:11,360 Speaker 1: Mark Warner, TikTok quote spent one hundred million dollars in 30 00:02:11,480 --> 00:02:15,160 Speaker 1: lobbying and slowed a bit of our momentum end quote. 31 00:02:15,680 --> 00:02:18,560 Speaker 1: But this proposal is still on the table.
I think 32 00:02:18,600 --> 00:02:21,400 Speaker 1: the main concern among critics is that the law would 33 00:02:21,400 --> 00:02:24,480 Speaker 1: expand the president's powers in ways that could have negative 34 00:02:24,520 --> 00:02:28,320 Speaker 1: consequences down the road, which is a reasonable concern; you 35 00:02:28,440 --> 00:02:31,800 Speaker 1: want to make sure that whatever solution you're proposing 36 00:02:32,320 --> 00:02:37,040 Speaker 1: doesn't end up being worse than whatever the perceived problem is. Meanwhile, 37 00:02:37,080 --> 00:02:41,680 Speaker 1: Down Under in Australia, the country's leaders are grilling TikTok 38 00:02:41,760 --> 00:02:45,399 Speaker 1: executives about the company's operations as well as an incident 39 00:02:45,919 --> 00:02:50,280 Speaker 1: that happened last December, and that incident was when ByteDance, 40 00:02:50,400 --> 00:02:54,000 Speaker 1: ByteDance being the parent company to TikTok, 41 00:02:54,040 --> 00:02:58,000 Speaker 1: admitted that some of its employees had tracked journalists, and 42 00:02:58,040 --> 00:03:03,040 Speaker 1: these were journalists who were covering TikTok, and apparently this 43 00:03:03,280 --> 00:03:07,200 Speaker 1: was in an attempt to cross-reference the journalists' locations 44 00:03:07,760 --> 00:03:10,760 Speaker 1: with those of ByteDance employees in an effort to track down 45 00:03:10,800 --> 00:03:15,480 Speaker 1: whistleblowers or leaks. The TikTok representatives said that while that 46 00:03:15,639 --> 00:03:19,040 Speaker 1: incident did happen, the employees who did the tracking did 47 00:03:19,040 --> 00:03:23,119 Speaker 1: so without the knowledge or direction of the company itself. 48 00:03:23,400 --> 00:03:27,000 Speaker 1: So their explanation is that these employees went rogue, in 49 00:03:27,080 --> 00:03:30,640 Speaker 1: other words. Some of the Australian leaders became frustrated with 50 00:03:30,680 --> 00:03:35,080 Speaker 1: the TikTok representatives and accused them of obfuscation. At one point, 51 00:03:35,160 --> 00:03:37,720 Speaker 1: a TikTok rep said she could not say for certain 52 00:03:38,080 --> 00:03:42,800 Speaker 1: where ByteDance's quote unquote formal headquarters were. That led 53 00:03:42,840 --> 00:03:47,840 Speaker 1: Senator James Paterson to say, are you seriously not able 54 00:03:47,880 --> 00:03:51,040 Speaker 1: to say how your parent company, which ultimately owns and 55 00:03:51,080 --> 00:03:54,600 Speaker 1: controls you, is operated? So I think it's safe to 56 00:03:54,600 --> 00:03:57,360 Speaker 1: say that TikTok isn't doing any favors for itself in 57 00:03:57,440 --> 00:04:01,240 Speaker 1: the attempt to walk a little tightrope here to try 58 00:04:01,280 --> 00:04:06,320 Speaker 1: and answer questions without making itself seem guilty. But also 59 00:04:06,360 --> 00:04:08,840 Speaker 1: the company reps are in a pretty tough position, because 60 00:04:09,040 --> 00:04:11,320 Speaker 1: a lot of places around the world have already worked 61 00:04:11,360 --> 00:04:15,840 Speaker 1: with the presumption that TikTok is guilty, and therefore they're 62 00:04:16,040 --> 00:04:22,200 Speaker 1: looking for any bit of evidence that backs up that assumption.
Now, 63 00:04:22,240 --> 00:04:25,720 Speaker 1: back here in the United States, Clemson University announced this 64 00:04:25,760 --> 00:04:28,200 Speaker 1: week that no one will be able to access TikTok 65 00:04:28,600 --> 00:04:31,600 Speaker 1: through the campus network. So if your device connects to 66 00:04:31,760 --> 00:04:36,080 Speaker 1: either Clemson University's wired network or its Wi-Fi, you 67 00:04:36,120 --> 00:04:39,799 Speaker 1: would find access to TikTok blocked. Of course, you could 68 00:04:39,960 --> 00:04:42,920 Speaker 1: swap to a cellular network and still access it. I 69 00:04:42,960 --> 00:04:45,800 Speaker 1: assume you could also use a VPN and still access it, 70 00:04:46,200 --> 00:04:50,400 Speaker 1: but campus-provided internet would be a non-starter. The 71 00:04:50,520 --> 00:04:53,880 Speaker 1: ban began yesterday, and the reason Clemson gave for the 72 00:04:53,920 --> 00:04:57,720 Speaker 1: ban was to quote unquote protect the integrity of information 73 00:04:57,880 --> 00:05:01,960 Speaker 1: and resources. So with that wording, my guess is that 74 00:05:02,000 --> 00:05:05,080 Speaker 1: this concern may relate to worries that TikTok could serve 75 00:05:05,080 --> 00:05:08,520 Speaker 1: as sort of a carrier for other types of code 76 00:05:08,839 --> 00:05:12,080 Speaker 1: that could be used to exploit networks. I can't say 77 00:05:12,080 --> 00:05:14,320 Speaker 1: for sure. I mean, maybe they're just worried that someone's 78 00:05:14,360 --> 00:05:19,120 Speaker 1: gonna do a TikTok dance in front of some sensitive 79 00:05:19,720 --> 00:05:24,839 Speaker 1: records that relate to Clemson University. I don't know. News 80 00:05:24,880 --> 00:05:29,000 Speaker 1: stories like the ones I have just mentioned probably contributed 81 00:05:29,160 --> 00:05:32,039 Speaker 1: to our next story. And that is, according to a 82 00:05:32,120 --> 00:05:37,320 Speaker 1: survey that the Pew Research Center conducted, about fifty-nine 83 00:05:37,360 --> 00:05:40,919 Speaker 1: percent of Americans see TikTok as a threat to national 84 00:05:40,920 --> 00:05:43,800 Speaker 1: security in the United States. Some of them think it 85 00:05:43,960 --> 00:05:46,760 Speaker 1: is a major threat, others feel it's more like a 86 00:05:46,839 --> 00:05:51,560 Speaker 1: minor threat. The survey found that conservatives were more likely 87 00:05:51,640 --> 00:05:54,359 Speaker 1: to see TikTok as being dangerous. There was something 88 00:05:54,400 --> 00:05:58,599 Speaker 1: like seventy percent of conservatives who said so. And also 89 00:05:58,640 --> 00:06:02,160 Speaker 1: they found that the older the person was, the more 90 00:06:02,240 --> 00:06:05,479 Speaker 1: likely they were to feel TikTok is a risk. I 91 00:06:05,520 --> 00:06:09,600 Speaker 1: am guessing there's probably a correlation between people who have 92 00:06:09,640 --> 00:06:13,400 Speaker 1: had little to no experience using the app and a 93 00:06:13,560 --> 00:06:16,200 Speaker 1: fear that the app could potentially serve as a threat 94 00:06:16,240 --> 00:06:20,640 Speaker 1: to national security, though I don't think the survey directly 95 00:06:20,839 --> 00:06:24,039 Speaker 1: established that. It just, again, it feels to me like 96 00:06:24,120 --> 00:06:28,479 Speaker 1: that's probably true.
I know that the survey used a 97 00:06:28,520 --> 00:06:33,159 Speaker 1: fairly modest sample size of five thousand one hundred one respondents, 98 00:06:33,400 --> 00:06:36,200 Speaker 1: so the headlines that generalize this to all Americans are 99 00:06:36,200 --> 00:06:39,800 Speaker 1: perhaps a little bit overzealous, and any conclusions you 100 00:06:39,839 --> 00:06:41,880 Speaker 1: would draw from this survey would need to have some 101 00:06:41,920 --> 00:06:44,599 Speaker 1: really big qualifiers. You need to keep in mind, 102 00:06:45,160 --> 00:06:47,920 Speaker 1: one, what was the size of the sample, and two, 103 00:06:48,040 --> 00:06:50,200 Speaker 1: how did they word the questions? And for the record, 104 00:06:50,760 --> 00:06:53,479 Speaker 1: the Pew Research Center actually shares all the information so 105 00:06:53,520 --> 00:06:57,360 Speaker 1: you can see that. But personally, I think the whole 106 00:06:57,400 --> 00:07:00,600 Speaker 1: TikTok issue is one of those things where you can't 107 00:07:00,640 --> 00:07:04,200 Speaker 1: see the forest for the trees. I've 108 00:07:04,240 --> 00:07:06,280 Speaker 1: said it a lot of times before, but it does 109 00:07:06,360 --> 00:07:11,720 Speaker 1: bear repeating. Should a country be concerned about TikTok's ties 110 00:07:11,760 --> 00:07:15,800 Speaker 1: to China? Well, probably. I mean, that is something to 111 00:07:15,840 --> 00:07:20,000 Speaker 1: be concerned about, but we shouldn't focus solely on that 112 00:07:20,040 --> 00:07:23,360 Speaker 1: connection to the point where we ignore the broader problem 113 00:07:23,880 --> 00:07:27,520 Speaker 1: with data brokers and data collection. As long as there 114 00:07:27,520 --> 00:07:30,760 Speaker 1: are platforms that are gobbling up data and then selling 115 00:07:30,760 --> 00:07:34,320 Speaker 1: it to whomever pays for it, there's a potential threat 116 00:07:34,320 --> 00:07:38,040 Speaker 1: to national security. The connection may not be as blatant 117 00:07:38,480 --> 00:07:41,520 Speaker 1: as a company owned by a Chinese parent company. I mean, yeah, 118 00:07:41,960 --> 00:07:45,600 Speaker 1: that's going to send up red flags immediately, but that's 119 00:07:45,720 --> 00:07:50,360 Speaker 1: just obvious, right? Even without that, the connections are still there. 120 00:07:50,440 --> 00:07:52,920 Speaker 1: Even if the United States were to ban TikTok and 121 00:07:53,160 --> 00:07:58,200 Speaker 1: any other Chinese-based app, the data floodgates are still 122 00:07:58,240 --> 00:08:04,600 Speaker 1: wide open. It does not address the problem, and that 123 00:08:03,600 --> 00:08:06,480 Speaker 1: is something that we have to grapple with. 124 00:08:06,960 --> 00:08:09,320 Speaker 1: And my fear is that TikTok ends up being an 125 00:08:09,440 --> 00:08:12,800 Speaker 1: enormous distraction, that whether TikTok is a danger or not 126 00:08:13,000 --> 00:08:15,600 Speaker 1: is kind of beside the point, because the danger exists 127 00:08:15,640 --> 00:08:19,480 Speaker 1: whether TikTok does or doesn't, and that we can't just 128 00:08:19,640 --> 00:08:23,240 Speaker 1: have it become a conversation about TikTok, because then we 129 00:08:23,320 --> 00:08:26,360 Speaker 1: have a false sense of security that we're all safe 130 00:08:26,840 --> 00:08:29,840 Speaker 1: even if TikTok gets wiped off the face of the planet, 131 00:08:30,400 --> 00:08:34,120 Speaker 1: and that's just an incorrect assumption.
On a related note, 132 00:08:34,200 --> 00:08:37,760 Speaker 1: in the state of Massachusetts, legislators are proposing a law 133 00:08:37,920 --> 00:08:40,920 Speaker 1: that would make it illegal for data brokers to buy 134 00:08:41,000 --> 00:08:45,160 Speaker 1: and sell cellular location data. Now, this is in part 135 00:08:45,200 --> 00:08:48,880 Speaker 1: an effort to protect people who seek abortions, and it 136 00:08:48,880 --> 00:08:52,280 Speaker 1: would keep their location data private so that some other 137 00:08:52,480 --> 00:08:56,240 Speaker 1: entity can't deduce where those people have gone or what 138 00:08:56,280 --> 00:08:58,880 Speaker 1: they've been up to, and that is really important here 139 00:08:58,880 --> 00:09:01,719 Speaker 1: in the United States. States here in the US are 140 00:09:01,840 --> 00:09:06,280 Speaker 1: aggressively criminalizing abortion in the wake of the overturning of 141 00:09:06,440 --> 00:09:11,480 Speaker 1: Roe versus Wade. That includes having laws that would punish 142 00:09:11,480 --> 00:09:15,040 Speaker 1: citizens who travel out of state to seek an abortion. 143 00:09:15,200 --> 00:09:17,840 Speaker 1: So even if an abortion is legal somewhere else, if 144 00:09:17,880 --> 00:09:19,760 Speaker 1: you're a citizen of one of those states, you would 145 00:09:19,800 --> 00:09:22,400 Speaker 1: be breaking the law. So this kind of law that 146 00:09:22,440 --> 00:09:27,040 Speaker 1: Massachusetts is proposing would protect people who are seeking medical 147 00:09:27,080 --> 00:09:30,680 Speaker 1: help outside of their home state. The bill is called 148 00:09:30,679 --> 00:09:36,480 Speaker 1: the Location Shield Act, and it would ban the buying, selling, renting, leasing, 149 00:09:36,679 --> 00:09:41,520 Speaker 1: or trading of location data across the state of Massachusetts. Moreover, 150 00:09:42,000 --> 00:09:46,520 Speaker 1: companies would first have to get consent from citizens before 151 00:09:46,559 --> 00:09:49,080 Speaker 1: they could even collect that kind of data in the 152 00:09:49,120 --> 00:09:52,000 Speaker 1: first place. They would have to have the express consent 153 00:09:52,200 --> 00:09:56,000 Speaker 1: of people in Massachusetts before they do that. Companies that 154 00:09:56,040 --> 00:09:59,040 Speaker 1: failed to comply with this bill, if it does become law, 155 00:09:59,360 --> 00:10:02,400 Speaker 1: would face fines as well as open up the possibility 156 00:10:02,440 --> 00:10:06,560 Speaker 1: of class-action lawsuits against the company. Chances are pretty 157 00:10:06,559 --> 00:10:09,880 Speaker 1: good that this legislation is going to be passed into 158 00:10:10,000 --> 00:10:12,679 Speaker 1: law in Massachusetts. It would become a pioneering piece of 159 00:10:12,800 --> 00:10:16,240 Speaker 1: legislation here in the US. Moreover, it would be a 160 00:10:16,360 --> 00:10:20,680 Speaker 1: very tiny step toward oversight and regulation of the data 161 00:10:20,840 --> 00:10:25,960 Speaker 1: brokerage industry, which has gone unchecked in the US for decades. 162 00:10:26,000 --> 00:10:29,679 Speaker 1: And in my mind, it is the larger problem compared 163 00:10:29,720 --> 00:10:32,120 Speaker 1: to the TikTok stuff that we talked about just a 164 00:10:32,160 --> 00:10:34,920 Speaker 1: moment ago. I think this is a really good step.
165 00:10:35,480 --> 00:10:37,959 Speaker 1: It's one of many steps that I think need to 166 00:10:38,000 --> 00:10:42,199 Speaker 1: be taken in order to place some restrictions on the 167 00:10:42,280 --> 00:10:50,199 Speaker 1: rampant collection and exploitation of information. That information has real value, obviously, 168 00:10:50,280 --> 00:10:52,920 Speaker 1: because companies are willing to buy it, and other companies 169 00:10:52,960 --> 00:10:57,760 Speaker 1: make billions of dollars by selling it. And meanwhile, we, 170 00:10:58,040 --> 00:11:01,480 Speaker 1: the people who are generating the information, end up being 171 00:11:01,800 --> 00:11:05,000 Speaker 1: exploited by it as opposed to profiting from it or 172 00:11:05,040 --> 00:11:07,760 Speaker 1: having any control of it whatsoever. So I think that 173 00:11:07,840 --> 00:11:11,839 Speaker 1: this is a good step. Okay, we're going to take 174 00:11:11,840 --> 00:11:14,480 Speaker 1: a quick break to thank our sponsors. When we come back, 175 00:11:14,480 --> 00:11:26,199 Speaker 1: we've got some more news to talk about. We're back. 176 00:11:26,320 --> 00:11:29,320 Speaker 1: So the United States and the European Union have come 177 00:11:29,360 --> 00:11:32,760 Speaker 1: to an agreement on data policies that will re-establish 178 00:11:32,920 --> 00:11:37,480 Speaker 1: transatlantic data transfers between the US and the EU, at 179 00:11:37,520 --> 00:11:41,560 Speaker 1: least for the time being. So previously, the European Union 180 00:11:41,720 --> 00:11:45,000 Speaker 1: turned off the data valves and cited concerns that US 181 00:11:45,080 --> 00:11:51,680 Speaker 1: intelligence officials could potentially comb through EU citizens' data and 182 00:11:51,760 --> 00:11:54,840 Speaker 1: that the data belonging to EU citizens would not be 183 00:11:54,960 --> 00:11:59,640 Speaker 1: safe from surveillance. And that is a perfectly reasonable concern, y'all. 184 00:12:00,200 --> 00:12:04,280 Speaker 1: Goodness knows, there are US intelligence agencies that are dedicated 185 00:12:04,320 --> 00:12:07,880 Speaker 1: to specifically doing that kind of thing, even to the 186 00:12:07,920 --> 00:12:12,520 Speaker 1: extent of doing it to American citizens. Anyway, the EU 187 00:12:12,600 --> 00:12:17,080 Speaker 1: demanded assurances that EU citizen data would receive the same 188 00:12:17,200 --> 00:12:20,440 Speaker 1: level of protection over in the US as it does 189 00:12:20,480 --> 00:12:23,600 Speaker 1: in the EU, and that US intelligence agencies would have 190 00:12:24,320 --> 00:12:27,280 Speaker 1: limited access to any kind of data from the EU. 191 00:12:27,720 --> 00:12:31,200 Speaker 1: This agreement, which is called the EU-US Data Privacy 192 00:12:31,200 --> 00:12:35,840 Speaker 1: Framework, or DPF, is the third attempt to establish new 193 00:12:35,880 --> 00:12:40,960 Speaker 1: safeguards that adequately meet EU concerns over citizen privacy and safety. 194 00:12:41,559 --> 00:12:44,240 Speaker 1: That being said, critics of this new agreement argue that 195 00:12:44,280 --> 00:12:48,440 Speaker 1: it doesn't actually make substantive changes in policy, and that 196 00:12:48,600 --> 00:12:51,320 Speaker 1: inevitably there are going to be legal challenges to this 197 00:12:51,400 --> 00:12:54,880 Speaker 1: new agreement, and in all likelihood it will end up 198 00:12:54,920 --> 00:12:59,600 Speaker 1: getting revoked, as the previous two agreements were.
So what 199 00:12:59,760 --> 00:13:02,880 Speaker 1: the critics are saying is that this still isn't actually 200 00:13:03,120 --> 00:13:07,520 Speaker 1: enough to meet the needs of the EU and to 201 00:13:07,679 --> 00:13:11,600 Speaker 1: protect EU citizens. So chances are this is a temporary 202 00:13:11,760 --> 00:13:15,320 Speaker 1: fix and we'll be back to the drawing board within 203 00:13:15,840 --> 00:13:19,120 Speaker 1: a year or so. Apple has pushed out some Rapid 204 00:13:19,160 --> 00:13:23,319 Speaker 1: Security Response, or RSR, updates to address zero-day vulnerabilities 205 00:13:23,679 --> 00:13:27,800 Speaker 1: that hackers can use to compromise devices like iPhones, iPads, 206 00:13:27,880 --> 00:13:31,400 Speaker 1: and Macs. The flaw is within the WebKit browser engine, 207 00:13:31,440 --> 00:13:33,959 Speaker 1: so a hacker can essentially set up a web page 208 00:13:33,960 --> 00:13:36,560 Speaker 1: that contains malicious content, and if you get tricked into 209 00:13:36,600 --> 00:13:39,439 Speaker 1: visiting that web page, then you can end up downloading 210 00:13:39,440 --> 00:13:42,400 Speaker 1: and executing some code that then allows the hacker to 211 00:13:42,520 --> 00:13:45,800 Speaker 1: remotely execute code on your device, so essentially you hand 212 00:13:45,840 --> 00:13:48,800 Speaker 1: over control of your device to the hacker. And once 213 00:13:48,840 --> 00:13:51,360 Speaker 1: again this illustrates how it's important to keep your devices 214 00:13:51,480 --> 00:13:54,000 Speaker 1: up to date with the latest security patches, and that it's 215 00:13:54,080 --> 00:13:56,800 Speaker 1: a good idea to allow updates to install as they 216 00:13:56,840 --> 00:14:00,320 Speaker 1: become available, rather than to hold off until later, to 217 00:14:00,360 --> 00:14:03,560 Speaker 1: make sure you don't become one of the victims of 218 00:14:03,559 --> 00:14:07,960 Speaker 1: these kinds of schemes. CNN reports that Meta, while basking 219 00:14:08,000 --> 00:14:09,920 Speaker 1: in the fact that more than one hundred million people 220 00:14:09,920 --> 00:14:13,720 Speaker 1: have signed up for Threads already, has also made drastic 221 00:14:13,800 --> 00:14:17,640 Speaker 1: cuts to its teams that focused on fighting misinformation and 222 00:14:17,760 --> 00:14:21,840 Speaker 1: disinformation across company platforms. Now, this is cause for concern 223 00:14:21,920 --> 00:14:24,200 Speaker 1: for lots of reasons, not the least of which is 224 00:14:24,200 --> 00:14:26,760 Speaker 1: that the United States is heading into an election year 225 00:14:26,840 --> 00:14:29,320 Speaker 1: next year, and Meta is a company with a pretty 226 00:14:29,400 --> 00:14:34,120 Speaker 1: awful reputation for facilitating the spread of election misinformation and worse. 227 00:14:34,200 --> 00:14:37,640 Speaker 1: After all, it was Meta's platform Facebook that took center 228 00:14:37,720 --> 00:14:42,080 Speaker 1: stage during the Cambridge Analytica scandal. CNN had trouble getting 229 00:14:42,120 --> 00:14:45,680 Speaker 1: firm answers to some pretty simple questions, such as how 230 00:14:45,680 --> 00:14:48,680 Speaker 1: many cuts to those departments have been made, or what 231 00:14:49,240 --> 00:14:51,960 Speaker 1: is the size of those departments now compared to before, 232 00:14:52,480 --> 00:14:55,040 Speaker 1: or even what tools do they have at their disposal 233 00:14:55,160 --> 00:14:57,640 Speaker 1: to do their jobs?
And the fact that there were 234 00:14:57,720 --> 00:15:01,680 Speaker 1: cuts at all isn't totally surprising, obviously, you know. Mark Zuckerberg, 235 00:15:01,840 --> 00:15:05,320 Speaker 1: CEO of Meta, famously said that twenty twenty three is 236 00:15:05,360 --> 00:15:08,560 Speaker 1: the year of efficiency for the company, and that mostly 237 00:15:08,600 --> 00:15:11,440 Speaker 1: seems to boil down to making massive job cuts and 238 00:15:11,480 --> 00:15:14,960 Speaker 1: trying to do more with fewer people. I imagine it's 239 00:15:15,000 --> 00:15:19,200 Speaker 1: easier from a corporate standpoint to make cuts to departments 240 00:15:19,240 --> 00:15:23,400 Speaker 1: like content moderation and trust and safety teams because they 241 00:15:23,400 --> 00:15:27,280 Speaker 1: don't necessarily contribute directly to the bottom line the way 242 00:15:27,320 --> 00:15:29,680 Speaker 1: some other departments do. But as we have seen in 243 00:15:29,720 --> 00:15:33,200 Speaker 1: past years, a lack of resources and policy can create 244 00:15:33,480 --> 00:15:37,280 Speaker 1: massive negative consequences. So if Zuckerberg would like to avoid 245 00:15:37,320 --> 00:15:40,560 Speaker 1: another future date with US Congress, it might be wise 246 00:15:40,600 --> 00:15:44,040 Speaker 1: to make some investments in those departments rather than cuts. 247 00:15:44,720 --> 00:15:48,000 Speaker 1: I talked a lot about Twitter in yesterday's TechStuff episode, 248 00:15:48,000 --> 00:15:50,400 Speaker 1: but one thing I did not cover was how the 249 00:15:50,400 --> 00:15:54,760 Speaker 1: company has changed its API policies. So an API is 250 00:15:54,800 --> 00:15:58,880 Speaker 1: an application programming interface. This is what allows a third-party 251 00:15:58,880 --> 00:16:01,600 Speaker 1: developer to build a tool that can tap into some 252 00:16:01,720 --> 00:16:05,360 Speaker 1: other platform or software. So for Twitter, this is the 253 00:16:05,360 --> 00:16:07,960 Speaker 1: interface that app developers can use if they want to 254 00:16:07,960 --> 00:16:11,080 Speaker 1: create their own third-party Twitter client, for example, or 255 00:16:11,120 --> 00:16:14,720 Speaker 1: any other tool that sends requests or posts to Twitter. 256 00:16:15,320 --> 00:16:18,360 Speaker 1: Not long ago, Twitter changed its API policy to increase 257 00:16:18,360 --> 00:16:21,120 Speaker 1: the amount charged to developers for this privilege, with three 258 00:16:21,160 --> 00:16:25,120 Speaker 1: different tiers. If you're at the enterprise tier, you're talking 259 00:16:25,120 --> 00:16:29,160 Speaker 1: about starting at forty-two thousand dollars per month. So 260 00:16:29,240 --> 00:16:32,240 Speaker 1: it really depends upon which tier you're subscribed to and 261 00:16:32,280 --> 00:16:35,520 Speaker 1: what the purpose is. But you are paying a good 262 00:16:35,520 --> 00:16:38,600 Speaker 1: deal of money for access to the API, and yet, 263 00:16:38,640 --> 00:16:43,360 Speaker 1: according to developers, the API has become unreliable and unstable 264 00:16:43,440 --> 00:16:46,640 Speaker 1: this year, meaning that apps are not always functional. Twitter 265 00:16:46,720 --> 00:16:50,680 Speaker 1: will make changes without clearly communicating those changes to app 266 00:16:50,720 --> 00:16:53,880 Speaker 1: developers ahead of time, and it kind of breaks the app, 267 00:16:54,240 --> 00:16:56,960 Speaker 1: and meanwhile the developers have to scramble to fix things.
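For a concrete picture of what "a tool that sends requests or posts to Twitter" looks like in practice, here is a minimal sketch in Python. It assumes a hypothetical user access token, which in real life would come from Twitter's developer program on one of the paid tiers just described, and it uses the publicly documented v2 create-Tweet endpoint; as the developer complaints above suggest, details like this can change with little warning.

# A minimal sketch of a third-party tool posting to Twitter through its API.
# Assumption: USER_ACCESS_TOKEN is a hypothetical placeholder; a real token
# comes from Twitter's OAuth flow and requires a paid developer-tier account.
import requests

USER_ACCESS_TOKEN = "YOUR_OAUTH2_USER_ACCESS_TOKEN"  # placeholder, not a real credential


def post_to_twitter(text: str) -> dict:
    """Send one post via the documented v2 'create Tweet' endpoint."""
    response = requests.post(
        "https://api.twitter.com/2/tweets",
        headers={"Authorization": f"Bearer {USER_ACCESS_TOKEN}"},
        json={"text": text},
        timeout=10,
    )
    # Rate limits, auth problems, or the kind of API instability developers
    # describe all surface here as HTTP errors.
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(post_to_twitter("Hello from a hypothetical third-party client."))

When Twitter changes the endpoint's behavior or its pricing, a small client like this is exactly the kind of tool that breaks, which is the complaint the developers are making.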
268 00:16:57,320 --> 00:16:59,840 Speaker 1: So it's not due to flaws in the apps themselves. 269 00:17:00,000 --> 00:17:03,600 Speaker 1: Instead, it's due to problems with the API or changes to 270 00:17:03,640 --> 00:17:06,520 Speaker 1: the API, and that's something that developers don't have any 271 00:17:06,560 --> 00:17:10,119 Speaker 1: control over. So they're complaining that they're spending more money 272 00:17:10,119 --> 00:17:12,840 Speaker 1: than ever to be able to work with Twitter, and 273 00:17:12,880 --> 00:17:15,399 Speaker 1: that Twitter isn't working as well as it should, 274 00:17:15,520 --> 00:17:18,600 Speaker 1: according to Mashable. The problem has led to several developers abandoning 275 00:17:18,640 --> 00:17:22,119 Speaker 1: Twitter altogether, which seems to be quite the trend, because 276 00:17:22,240 --> 00:17:26,240 Speaker 1: users and advertisers have been doing that as well. There 277 00:17:26,280 --> 00:17:29,320 Speaker 1: have been countless apps and services designed to let you 278 00:17:29,359 --> 00:17:32,199 Speaker 1: make notes and access them wherever you happen to be. 279 00:17:32,560 --> 00:17:35,360 Speaker 1: One of the most famous is Evernote, which traces 280 00:17:35,400 --> 00:17:38,240 Speaker 1: its history way back to two thousand, before the era 281 00:17:38,320 --> 00:17:41,720 Speaker 1: of the consumer smartphone, although you could argue that really 282 00:17:41,760 --> 00:17:45,920 Speaker 1: it was the explosion of the smartphone market that made 283 00:17:45,920 --> 00:17:49,680 Speaker 1: Evernote rise to prominence. But last November, a Milan-based 284 00:17:49,720 --> 00:17:54,360 Speaker 1: company called Bending Spoons purchased Evernote, or at least announced 285 00:17:54,359 --> 00:17:56,200 Speaker 1: that it was going to, and then the acquisition was 286 00:17:56,240 --> 00:17:59,800 Speaker 1: completed in early twenty twenty three. Evernote is based out 287 00:17:59,840 --> 00:18:02,160 Speaker 1: of Redwood City in America, or at least it used 288 00:18:02,160 --> 00:18:05,560 Speaker 1: to be, and this week news broke that Bending Spoons 289 00:18:05,560 --> 00:18:08,560 Speaker 1: had laid off nearly all of Evernote's staff, and the 290 00:18:08,560 --> 00:18:11,560 Speaker 1: company plans to move all operations to Europe. This has 291 00:18:11,600 --> 00:18:14,240 Speaker 1: prompted Evernote fans to worry about the future of the app. 292 00:18:14,280 --> 00:18:17,000 Speaker 1: It's not the first time that Evernote has faced challenges. 293 00:18:17,480 --> 00:18:19,720 Speaker 1: The company upset a lot of users with an overhaul 294 00:18:19,760 --> 00:18:22,200 Speaker 1: of the app several years ago. A lot of users 295 00:18:22,240 --> 00:18:26,080 Speaker 1: felt that the company failed to deliver upon promised features 296 00:18:26,119 --> 00:18:28,840 Speaker 1: and broke a lot of stuff that they liked about Evernote, 297 00:18:29,040 --> 00:18:31,639 Speaker 1: but it kind of recovered from that, although the company 298 00:18:31,720 --> 00:18:34,880 Speaker 1: also held layoffs in twenty fifteen, twenty eighteen, and even 299 00:18:34,920 --> 00:18:39,399 Speaker 1: as recently as twenty twenty two, and now Evernote faces 300 00:18:39,440 --> 00:18:42,760 Speaker 1: a whole lot more competition in the field, so a 301 00:18:42,800 --> 00:18:44,480 Speaker 1: lot of people worry that this could be the end 302 00:18:44,520 --> 00:18:47,440 Speaker 1: of Evernote. And it may not be.
Evernote may still 303 00:18:47,480 --> 00:18:50,560 Speaker 1: be sticking around, but things are certainly looking a little 304 00:18:50,640 --> 00:18:53,359 Speaker 1: dark at the moment. So here's hoping all the staff 305 00:18:53,359 --> 00:18:56,480 Speaker 1: who formerly worked for Evernote are able to land on 306 00:18:56,520 --> 00:19:00,800 Speaker 1: their feet. Finally, protesters in San Francisco who object to 307 00:19:00,920 --> 00:19:05,160 Speaker 1: autonomous vehicles have employed a reportedly effective way to paralyze 308 00:19:05,200 --> 00:19:08,720 Speaker 1: self-driving cars, and that's to gently place a traffic 309 00:19:08,760 --> 00:19:12,719 Speaker 1: cone on the hood of the vehicle, which then will 310 00:19:12,800 --> 00:19:16,320 Speaker 1: just make it sit still. The protesters, who belong to 311 00:19:16,359 --> 00:19:19,719 Speaker 1: a group that calls itself Safe Street Rebel, have 312 00:19:19,840 --> 00:19:24,120 Speaker 1: dubbed this the Week of Cone. They object to startups 313 00:19:24,119 --> 00:19:26,800 Speaker 1: that are putting more vehicles on the streets. They argue 314 00:19:26,840 --> 00:19:30,000 Speaker 1: that cities should invest in making streets more pedestrian- and 315 00:19:30,080 --> 00:19:34,600 Speaker 1: bike-friendly rather than accommodating more vehicles. They also 316 00:19:34,680 --> 00:19:38,680 Speaker 1: object to vehicles that are covered in various sensors and cameras. 317 00:19:38,960 --> 00:19:43,200 Speaker 1: They have likened autonomous vehicles to surveillance pods, because 318 00:19:43,560 --> 00:19:45,679 Speaker 1: in order to operate, they have to be able to 319 00:19:45,800 --> 00:19:49,880 Speaker 1: survey the entire area around them, but that also poses 320 00:19:50,000 --> 00:19:54,760 Speaker 1: a potential risk to privacy and security for citizens. The 321 00:19:54,800 --> 00:19:57,879 Speaker 1: protests aren't meant to cause physical damage to the vehicles, 322 00:19:57,920 --> 00:20:01,399 Speaker 1: but they can still create a hazard, because protesters have 323 00:20:01,480 --> 00:20:03,679 Speaker 1: coned cars that have just come to a stop at 324 00:20:03,680 --> 00:20:07,080 Speaker 1: an intersection, and then the self-driving vehicle ends up 325 00:20:07,080 --> 00:20:10,520 Speaker 1: being stuck there and ends up blocking traffic. So it 326 00:20:10,520 --> 00:20:13,879 Speaker 1: does have its consequences. And that's it for the news 327 00:20:13,920 --> 00:20:18,200 Speaker 1: for today, Tuesday, July eleventh, twenty twenty three. I hope 328 00:20:18,280 --> 00:20:20,840 Speaker 1: you are all well, and I'll talk to you again 329 00:20:21,760 --> 00:20:31,720 Speaker 1: really soon. TechStuff is an iHeartRadio production. For more 330 00:20:31,800 --> 00:20:36,520 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or 331 00:20:36,560 --> 00:20:42,080 Speaker 1: wherever you listen to your favorite shows.