Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech. And today's episode requires a bit of an explanation. You see, this is going to be the episode I recorded for yesterday, Thursday, March one. It's a news-oriented episode, but I did that forgetting that I had already agreed to run the trailer for the next season of Smart Talks, hosted by Malcolm Gladwell. And I didn't want to get rid of the episode I had researched and written and recorded, because I felt like there was a lot of good information there. Even though the news is a little less fresh than it would have been yesterday, it's still important stuff, because it's really focusing on a lot of stories about how governments and companies are using technology to track us and to amass huge amounts of data, sometimes without our knowledge, sometimes explicitly without our consent. What does that mean, and is there anything we can do about it? I feel like this is an important thing to know about, even if you don't really have any plans to change things. It's good to at least be aware that it's going on. So with that in mind, I hope you enjoy this slightly stale tech news episode. I mean, it's only a day late, and that means no classic episode this week. But next week we should be pretty much back to normal. And in the coming weeks TechStuff is going to change up a little bit, in that occasionally we will run an episode of Smart Talks in the TechStuff feed. And again, that's going to be hosted by Malcolm Gladwell, not by me. That is a big level up, but that's only going to be occasional. Most of the time it's gonna be the regular old TechStuff, so you're stuck with me. Anyway, let's listen to a slightly less fresh tech news episode. Take it away, Jonathan from the past.
Wikipedia, the online resource that you're not supposed to cite in your term papers, and for good reason. But I won't get off track here. I'll just say Wikipedia is a great resource to use as a starting point. It just isn't a primary source and was never intended to be. Anyway, it will soon launch a paid-for service, but don't worry. This won't mean you'll have to cough up cash the next time you want to read up on, you know, a Michael Bay Transformers movie, or you want to learn about medieval villages in the Netherlands, or you want to skim articles about quantum entanglement or whatever. The paid-for service customers will actually be really big companies like Google and Amazon and Facebook. The service will offer up developer tools, so these companies can use those to republish database information on other platforms, you know, to repurpose the info that's on Wikipedia for their own uses. And presumably these customers will have access to tools and data that aren't necessarily available to the average Wikipedia user. According to Lane Becker, a senior director at the Wikimedia Foundation, some companies have been repurposing Wikipedia articles on their own sites for years, and often they employ people to clean and reformat articles to better fit the owned-and-operated site's design. I see this all the time, where I'll be doing research and it will send me to a page that looks like it's an owned-and-operated page, but as I read, I realize it is literally pulling the article from Wikipedia into this page. That's the kind of thing that this paid-for service will cover. Wikimedia has formed a division called Wikimedia Enterprise to develop this tool and to negotiate agreements with various customers. Now, the foundation is still working out the finer details, and I could see this being used in lots of ways, including with Google smart home products. Asking a Google Home device a question could lead it to pull from data that originated from Wikipedia, and it would be enabled by this sort of licensing agreement. It's also good to remember that the Wikimedia Foundation is a nonprofit organization. The money from these projects would presumably go back into supporting the hosting and continued development of Wikimedia itself.
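If you're curious what machine-readable Wikipedia access looks like, the existing free REST API gives a taste. Here's a minimal sketch in Swift against the public summary endpoint; this is just a stand-in illustration, since the actual Wikimedia Enterprise tooling isn't public in detail here and would differ.

```swift
import Foundation

// Minimal sketch: pull a structured page summary from the free, public
// Wikipedia REST API. A stand-in illustration only; the Wikimedia
// Enterprise product discussed above is a separate, higher-volume offering.
struct PageSummary: Decodable {
    let title: String
    let extract: String   // short plain-text summary of the article
}

let url = URL(string: "https://en.wikipedia.org/api/rest_v1/page/summary/Quantum_entanglement")!
let done = DispatchSemaphore(value: 0)

URLSession.shared.dataTask(with: url) { data, _, error in
    defer { done.signal() }
    guard let data = data, error == nil else { return }
    if let page = try? JSONDecoder().decode(PageSummary.self, from: data) {
        print("\(page.title): \(page.extract)")
    }
}.resume()

done.wait()   // keep the script alive until the request completes
```

Anyone can hit that endpoint for free, so presumably what the paid service adds is scale, reliability guarantees, and richer tooling on top of the same underlying data.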
Now cast your memory back to the summer of twenty twenty, which, by my reckoning, was approximately a lifetime ago. One of the many news stories that summer was how the Twitter accounts of several prominent people, including Bill Gates, Elon Musk, and Joe Biden, all got hijacked by hackers, who used those accounts to perpetuate a scam. Now, basically, the scam promised a big return on investments in a supposed money-making strategy. In fact, the scam claimed that participants would double their money. They would give a certain amount in bitcoin, and they would get twice that back. And of course some folks fell for it and handed their hard-earned cryptocurrency over to the hackers, to the tune of more than one hundred fifteen thousand dollars. Well, one of those hackers, Graham Ivan Clark, was caught and charged, and he pled guilty to charges of fraud. In return, he received a sentence of three years in prison, or a juvenile boot camp type thing, followed by three years of probation. And during that probation he's not supposed to use a computer without permission and supervision. At the time of the crime, Clark was seventeen. He has since turned eighteen, so he was sentenced as a youthful offender. Otherwise he would be looking at a mandatory sentence of ten years in prison. He and the other two hackers used social engineering to get administrative access to various Twitter handles. So, according to investigators, what they did was they scoured LinkedIn to find profiles of Twitter employees who could have, you know, administrative access to the back end.
Then they did deep dives to find out how to contact their marks, typically by phone, and then convinced those Twitter employees that the hackers were in fact authorized to access Twitter systems for the purposes of maintenance. They tricked the Twitter employees into going to a mocked-up login page, which was really just a means to phish those login credentials in order to get access to the back end of Twitter, and then they moved on from there. The two other hackers, Nima Fazeli and Mason Sheppard, are older than Clark and will likely face more serious sentences for their part in the crime. And I just want to point out, social engineering is a major tool in the hacker tool set. It's one of those things where, you know, you don't have to figure out how to crack a security system or find a vulnerability if you can just leverage the people who have access to that system and get into it through them. That's a very effective approach. And you know, with COVID-19 making a lot of people have to work from home, it created, and still creates, a lot of opportunities for hackers to go after that social engineering point of attack. So just be aware of that, and, you know, use critical thinking whenever you get a request to sign into something that you didn't anticipate. That's not to say that every case like that is illegitimate, but it is a red flag, so just be wary.

Another big tech story in twenty twenty was how companies like Uber campaigned really hard to defeat a proposition in California that would have forced companies like Uber, but also companies like Lyft, to classify drivers as employees rather than as contract workers, as sort of independent contractors. Such a classification would require Uber to provide additional compensation and benefits to drivers, and that's something the company is not too keen on doing.
While Uber was successful in convincing enough voters to oppose the proposition in California, things are different across the pond. The courts in the United Kingdom, after a five-year legal battle, ruled that Uber drivers are effectively employees, and Uber now says that drivers will earn at least the UK's national living wage, which is currently set at eight pounds seventy-two pence per hour. It will also offer holiday pay and pensions to drivers. Uber already offered free insurance to cover cases of sickness or injury, and that will remain in place. The change only applies to those who are driving passengers around, however. Uber drivers who are delivering food as part of Uber Eats are still classified as being self-employed. Also, that hourly rate only applies to the times when Uber drivers are actually transporting customers. Once someone has been dropped off, that clock effectively stops until another fare enters the car. That's something that unions say is inadequate. But still, this may mark a change in direction for the gig economy in general, and possibly we'll see further measures in the future. Meanwhile, over here in the United States, the Washington Post had a pretty critical piece about Uber, saying that while the company was seeing huge boosts to its stock, so the value of the company was going through the stratosphere, especially after it helped get that California proposition off the table, it also wasn't really helping out when it came to things like unemployment benefits for drivers. That left a lot of drivers in a very tough economic position. So those drivers instead largely depended upon government assistance, around eighty million dollars of it all told. They received funds from the Economic Injury Disaster Loans program. Now, as the name suggests, that program gives out loans and grants to small businesses in times of economic upheaval to help those businesses survive, as well as, you know, the people who run those businesses. That's really what's important here.
So you've got a multibillion-dollar company in Uber, which, I should add, has never once turned a profit by the end of a fiscal year in its entire existence, and meanwhile it has workers who have qualified for a small business government assistance program. Again, that's because Uber was able to maintain the arrangement that these workers are independent contractors; they're self-employed, they're not employees according to the law. So Uber saw its value increase while the US government took over the job of helping Uber's drivers make ends meet. Now, this experience really points to how people in the gig economy are particularly vulnerable to economic disruption, which I know is kind of like me telling you that water is wet. These are people who have to hustle constantly just to make ends meet. So if you are someone who works in the gig economy, my hat is off to you, and I really hope things are going well for you right now and that they just keep getting better.

Joseph Cox over at Vice Media has written a piece titled "Cars Have Your Location. This Spy Firm Wants to Sell It to the US Military," which is a heck of a headline. I mean, it made me click on the story. So what's going on here? Well, it really kind of boils down to telematics. Telematics is a kind of portmanteau of telecommunications and informatics. Modern cars have numerous sensors to monitor car performance and safety parameters. You know, that's how you get that check engine light or whatever, and someone always has to plug your car up to a computer to read what is actually going on. Well, that's kind of what I'm talking about here. But these sensors are doing more than just keeping an eye on how things are handling while you're driving the vehicle. Many of these systems pair with communications devices, essentially a SIM card and a modem, and send data back to automotive companies. So what do these companies do with that data? Well, they do a lot of things.
They might use that information to help design the next generation of vehicles based on how people are using their cars today. Or they might have deals with major insurance companies, which then use those telematics to figure out what kind of driver you are and how big a risk you pose, and that in turn affects the rates you pay for car insurance. Or they might sell data to other parties, which is probably where the company mentioned in the Vice article comes in. That company is called the Ulysses Group. It's a company that has worked in various surveillance-related products and services for several years. Ulysses has proposed a deal with the US government to provide data that could give real-time location information drawn from more than fifteen billion vehicle locations around the world, in pretty much every country except North Korea and Cuba. Golly. Now, according to Ulysses' own document about this proposal, quote, "the data can be used to geolocate, track and target time-sensitive mobile targets, tip and cue sensors, develop patterns of life, identify networks and relationships, and enhance situational awareness, among many other applications," end quote. See, this is the kind of thing that makes me long for the old days of cars, where the systems were really more just mechanical devices and less like computers. Now, I should add that Ulysses doesn't have some sort of magic bug or tracker that's installed in every car. The company would be relying on data provided by those telematics systems. It's just a question of how they get hold of that data. So in a way, Ulysses could be sort of a resale business in that regard. Now, I presume the company would first purchase the data from some other party and then package it specifically for the US government, should the government want to pursue this opportunity. I should also add that not every vehicle out there actually has an onboard telematics system. They are really common in the commercial vehicle market, so, you know, businesses that have fleets of cars frequently have this just so they can keep an eye on how all those cars are doing in order to maintain the proper efficiencies. And some automakers have embraced telematics more passionately, I suppose, than others; BMW and GM are leaders in the space.
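To make the idea concrete, here's a rough sketch of the kind of record a telematics unit might phone home with. Everything here, the field names, values, and payload shape, is invented for illustration; real manufacturer systems are proprietary and vary widely.

```swift
import Foundation

// Hypothetical sketch of a telematics "ping." All field names and values
// are invented for illustration; real OEM payloads are proprietary.
struct TelemetryPing: Encodable {
    let vin: String            // vehicle identification number
    let latitude: Double       // GPS fix
    let longitude: Double
    let speedKph: Double
    let faultCodes: [String]   // e.g. OBD-II diagnostic trouble codes
    let timestamp: Date
}

let ping = TelemetryPing(
    vin: "EXAMPLEVIN1234567",
    latitude: 33.7490,
    longitude: -84.3880,
    speedKph: 62.0,
    faultCodes: ["P0420"],     // catalyst efficiency below threshold
    timestamp: Date()
)

// In a real unit this JSON would go out over the built-in SIM and modem
// to the manufacturer's servers; here we just print it.
let encoder = JSONEncoder()
encoder.dateEncodingStrategy = .iso8601
encoder.outputFormatting = .prettyPrinted
let json = try! encoder.encode(ping)   // encoding this simple struct cannot realistically fail
print(String(data: json, encoding: .utf8)!)
```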
Now, there's not a whole lot you can do about this as a driver, apart from maybe buying and maintaining older vehicles that don't have onboard telematics systems. But those have their own issues. For example, they might not be terribly efficient, they could have some emissions problems, and getting them repaired can sometimes be a bit of a pain because it may be hard to find parts for them. So there is definitely a trade-off there. But I did want to cover the story, because it's one of those things that people just should be aware of.

Ars Technica reports that TikTok parent company ByteDance is investigating ways to track iPhone users with the intent of serving those users targeted advertisements. But see, that happens to be against Apple's privacy rules, which now state that apps have to alert users before they can track those users, and they have to give users the option to opt out of tracking. But that's something that companies in general aren't too keen on, because targeted advertising is really a cash cow in revenue terms. It's an incredibly valuable tool for companies like Facebook, for example; it's very valuable to sell that capability to advertisers. However, these companies worry that if you give people the choice, they're gonna opt out of being tracked, because, hey, you know what, most folks aren't super keen on feeling like they're in a song by The Police. You know that song, "Every Little Thing She Does Is Magic." Wait, no, no, no, I'm sorry. I'm sorry, man. "Every Breath You Take." Anyway.
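On the technical side, the rule being described here is Apple's App Tracking Transparency framework. Here's a minimal sketch, assuming an iOS 14 or later app, of how an app has to ask before it can read the advertising identifier; nothing in it is specific to ByteDance's apps.

```swift
import AppTrackingTransparency
import AdSupport

// Minimal sketch of Apple's App Tracking Transparency flow (iOS 14+).
// Before an app can read the advertising identifier (IDFA) used for
// cross-app ad targeting, it must show the system prompt and honor
// whatever the user chooses.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user opted in, so the real IDFA is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // The user opted out (or never decided); the IDFA comes back all zeros.
            print("Tracking not authorized: \(status)")
        @unknown default:
            print("Unknown authorization status")
        }
    }
}
```

The workaround described next is essentially an attempt to fingerprint devices without ever triggering this prompt.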
The China Advertising Association, which, I'm sure you will not be surprised to hear, is a state-backed institution in China, is now trying an alternative way to track iPhone users that bypasses the methods apps are using right now. Essentially, this group is looking to sidestep the process and keep tracking people without having them know about it and potentially opt out of it. Apple says it is going to ban any app that tries to circumvent the privacy rules. But on the flip side, if most of China's apps are actually using this alternative method, it would be very weird to see Apple actually take action against all of them, because doing so would essentially open up the opportunity for the Chinese government to just outright ban Apple from operating in the country. They could say, hey, you need us more than we need you, you're out of here. And, you know, China's got a lot of people over there. Anyway, while TikTok tries to downplay its connection to its Chinese parent company, it is worth remembering that ByteDance is definitely one of the companies that is pursuing this.

And we're not done with stories about companies and governments tracking people. Apple's new privacy rules also meant that companies had to disclose more information about how they track and use personal data from users, and that includes Google, a company that has built its entire empire around the aggregation and exploitation of data, personal and otherwise. Google took a long time to comply with Apple's new rules, but once it did, it became clear that the company collects a lot of data for lots of different reasons. Sometimes it's to provide a personalized experience through an app. Sometimes it's just to monitor app functionality, to make sure that if an app keeps crashing, they can figure out why it's doing that. Sometimes it's general analytics. But it prompted DuckDuckGo, a web browser and search engine, to take to Twitter and fire a few shots at Google. The company posted, quote,
"After months of stalling, Google finally revealed how much personal data they collect in Chrome and the Google App. No wonder they wanted to hide it. Spying on users has nothing to do with building a great web browser or search engine. We should know, our app is both in one," end quote. Some serious shade there, DuckDuckGo. Meanwhile, Google is also facing a class action lawsuit brought against the company by users who allege that Google violated their privacy by collecting data while the users were using Chrome in incognito mode. The claim is that Google was collecting info on browser history even when people were in private mode. Google moved to have the case dismissed, but a judge denied that request, so it's going to go to court. Google representatives have pointed out that when you open an incognito window in Chrome, you're greeted with a page that says Chrome doesn't save browsing history, but your activity could still be visible to the websites you visit, and more along those lines. So I suppose this case will try to determine whether Google is being a bit coy about that whole browser history thing or not.

And finally, do y'all remember the Apple commercials in which Justin Long would come on screen and announce that he's a Mac, and then John Hodgman would come on screen and announce that he's a PC? Justin Long was always portrayed as kind of a hip, young, cool guy with a lot of creative ideas, and Hodgman always came across as outdated and out of touch and a bit of a fuddy-duddy. Well, now how the turns have tabled, or whatever, because Justin Long is now appearing in a series of commercials for Intel in which he's kind of slagging off on his old Mac buddies. The ads show Long comparing Macs, which now sport the Apple-designed CPUs, not the Intel processors, against PCs that do have Intel chips inside them.
And I guess you can figure out where this is going. Over and over in each of these ads, Justin Long's job is to suggest that Apple is really a hassle and that Macs limit what users can do with their machines, and he's particularly brutal when it comes to gaming. Now, I normally wouldn't report on ads, but this was just one of those things I found kind of amusing for those of us who have been subjected to tech company advertisements for a few decades. So if you remember the I'm a Mac, I'm a PC ads, maybe watch a couple of the new Intel ones just to see how they are leaning hard on that history. And it just makes me think, as someone who has read plenty of ads himself, how awkward that initial conversation must have been. I mean, granted, Apple's a very different company now than it was when Justin Long was doing ads for it, but even so. Awkward.

That wraps up the news for Thursday, March one. If you have any suggestions for topics I should tackle in future episodes of TechStuff, let me know. The best way to get in touch with me is over on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.