Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and it is time for the tech news for Tuesday, September twenty one. First up today is when Apple holds its iPhone event. Now, as I record this, that event has not yet happened. It is scheduled to happen at one pm Eastern, and I'm recording this approximately ten am Eastern, so it's quite possible that it has already happened by the time you listen to this podcast. And in that case, if you could do me a solid and send me an email through some sort of subspace time warp to let me know what's going to be talked about, I would really appreciate it. And... just checked email. There's nothing there. You have all failed me. But the rumor is that the new iPhone is not going to look that much different from the previous generation, and that it will probably have a relatively modest set of upgrades.
Now, it's still undoubtedly going to be more powerful and have more features than earlier iPhones, but the general expectation is that it won't be a gargantuan leap forward or departure. Other rumors suggest that we're gonna see an updated Apple Watch with perhaps a bigger screen, and maybe some updates to the AirPods earbuds. Some real optimists out there are hoping that Apple will finally talk about its VR and AR products in development. Now, maybe that will happen. It would be one heck of a "one more thing." But I suspect we are not going to get that at this event. I would love to be proven wrong.

Speaker 1: However, speaking of Apple, late last week, a federal judge granted an injunction against Apple on behalf of Epic Games. So, for those of you not familiar with this story, here's the gist of it. Epic Games makes a game called Fortnite, and one way that Fortnite makes money is through in-game purchases. Players spend real-world money to get in-game currency, which they can then use to purchase all sorts of stuff inside the game, like, you know, character skins and emotes and that kind of thing.
Both Apple and Google have a policy that any app on their platform that has in-app purchases is supposed to use the official app store transaction system that gives Apple or Google a cut, and it's typically around the thirty percent mark. Epic cheekily encouraged players on iOS and Android devices to use a workaround to purchase currency in game directly from Epic, bypassing those app transaction systems, and it would give Epic a share of those transactions, cutting out, you know, Apple and Google in the process. Apple and Epic have been in a series of fights, both legal and public relations fights, since that time. This recent injunction would allow developers to use those outside payment solutions for iOS apps. Now, that's obviously a big hit to Apple's revenue stream, although, as I've mentioned on this show before, Apple does not get so granular as to break out how much money the App Store brings into the company. It gets lumped in with a bunch of other stuff. But that move in turn prompted a bit of a sell-off of Apple stock, which dropped three point three percent in value. Now, that's less than five percent.
It doesn't sound like very much, right, but we're talking about a company that's worth more than two trillion dollars. So that meant Apple lost about eighty-five billion dollars in value due to that slip in stock price, and that blows my mind. Like, a billion dollars is an unimaginable amount of money all by itself, let alone the fact that Apple is worth more than two trillion. Anyway, the battles will likely not end here, although I have been seeing more of a tendency for courts to push against companies like Google and Apple with the argument that the in-app transaction policies are anti-competitive.

Speaker 1: If you've been listening to my news episodes, you've also heard me talk about the NSO Group, an Israeli company that makes iOS malware designed to infect the devices of highly targeted individuals through a zero-click iMessage vector.
Now, that means that the NSO Group sells this product to various government agencies, at least those that don't have an adversarial relationship with Israel, because the NSO Group has to do this with Israel's, you know, permission. And those agencies can then send out an attack message via iMessage to a specific target. Then, if the target does so much as open that message, it's game over. There's no need to convince them to open a file or click on a link. A flaw in Apple security allowed for that kind of attack. The Citizen Lab out of the University of Toronto has been keeping tabs on these types of attacks and managed to isolate the exploit in the malware to understand how this really works. The lab passed this information on to Apple, and Apple subsequently patched the vulnerability and pushed out the iOS fourteen point eight and iPadOS fourteen point eight updates in order to address this weakness. Now, generally speaking, most iOS users should be fine, because these attacks really are targeted attacks. So unless there's a government out there that has an axe to grind with you, you're probably okay.
But for diplomats, politicians, movement leaders like union leaders and, uh, activist leaders, journalists, for them it's a different story. It'll be interesting to see if Apple's update renders the NSO Group's main product less valuable, or if the hackers at the company are even more steps ahead and have, you know, contingencies in place in case Apple did address this particular vulnerability.

Speaker 1: And in Apple-adjacent news, Steve Wozniak, the co-founder of Apple, who hasn't been part of the company for decades, is launching, figuratively and hopefully one day literally, a private space company called Privateer. We do not know much about it yet, though by the end of today that might change, because he plans to announce further details at the AMOS, that's A-M-O-S, tech conference, and that is going on this week in Hawaii. The general rumor is that Privateer will focus on the problem of space junk. That's stuff that's whizzing around in orbit at super fast speeds.
Usually it's stuff from, you know, man-made satellites and things, or rockets, launch vehicles, that kind of thing, that have broken apart and then serve as a potential obstacle in space. You know, when it's moving that fast, it can be catastrophic if that collides with something, and it can threaten stuff like satellites, space stations, and other spacecraft. Now, considering that lots of companies are looking to launch thousands of satellites up into orbit in the near future, that's probably a pretty decent motivation to get a company like this going. But I don't know enough details to give an opinion about whether or not the approach makes sense. However, I will allow that Wozniak is orders of magnitude smarter than I am. We'll just have to, you know, make a call once we get more details about this.

Speaker 1: Now, let us consider Google, and how South Korea has hit Google with a fine of nearly a hundred seventy-nine million dollars for what the government says are anti-competitive practices.
According to the Korea Fair Trade Commission, Google has used its dominant market position to pressure certain handset companies to prevent them from allowing anything other than the official Google Android operating system to work on their devices. So at the heart of the matter is that some companies would make Android derivatives, you know, operating systems for handsets that had Android as the foundation, but they had been tweaked so much that you effectively create a new operating system, you create a fork. It's kind of like the timeline stuff in the MCU, if you watched Loki, uh. And these forks can have proprietary features. They could even have, you know, different user interface approaches, and could be different enough from vanilla Android to potentially cause issues. Now, according to the Fair Trade Commission, Google was making companies sign agreements saying they would knock that off and they would just go with pure Android, or else they would lose access to early builds of Android and such, or they might see their own app get buried in the Google app store.
Now, on the one hand, I definitely see how Google's approach can be anti-competitive. That's not cool. On the other hand, as a consumer, I really hate having a ton of confusion in the market. I mean, years ago, I made sure that I would just buy Google flagship phones, so that way I could be certain that I was having a pure form of the Android operating system, without all the stuff I saw as being superfluous from various handset manufacturers and telecommunications providers. Still, I feel like that choice should be left to the individual consumer, and if companies want to muddy the waters, then they can. I don't think they should, but they should be able to. I don't think it's super cool for Google to just muscle in and say, hey, don't do that, or else.

Speaker 1: Amazon continues to build out its enormous database of biometric data with the Amazon One system. This is Amazon's palm scanning technology, which uses a scanner to look for unique details in a palm, including the arrangement and pattern of veins under your skin. This is all as a way to authenticate a person's identity.
Amazon has been using the system in some of its physical storefronts, including a few Whole Foods locations. So the way it works is you establish an account and you scan your palm as part of that, and it associates your palm scan with your account. Then you can use your palm scan to act as a way to authorize payments, for example. So in one of these stores, you might be able to just walk in, pick up a product, you walk to a scanner, you hold your palm over the scanner, and boom, you've bought that, whatever it is. Well, now Amazon has expanded this technology beyond just those, you know, owned-and-operated storefronts and has offered it as a way to access the Red Rocks Amphitheatre in Colorado. Presumably, you would create a similar account with the amphitheater, and as a part of that you would scan your palm. Then when you buy a ticket to an event, you could have that ticket associated with your identity. Then you just show up at the amphitheater, you scan your palm, it validates your ticket, and you're in, you can go into the event.
Now, that could be kind of cool, in that it could lead to stuff like a massive decrease in scalping, and I'm all for that. I love the idea of systems that allow people who really want to go to something to get the chance to do it, rather than, you know, some enterprising folks with a ton of money just buying up all the tickets and then selling those tickets for obscene markups. But then there are other things to think about. One of those is that we could tie tickets to specific mobile devices like our phones. We don't have to make it tied to, you know, a palm print or other biometric data, and it's not really any less convenient, because most of us never go anywhere without our phones. For another thing, you could argue that the whole scheme is really geared toward giving Amazon access to even more personal data, which could be used in ways that we don't anticipate in the future, and that we would essentially be signing that stuff over to a company for the sake of some convenience without really understanding or appreciating what might be done with that data in the future.
That's usually not the best idea for the individual consumer. We have more news to cover, but before we get to that, let's take a quick break.

Speaker 1: We're back, and in news that hits a little close to home, the company Intuit, best known for its tax preparation products, plans to acquire Mailchimp for twelve billion dollars. Mailchimp is a local Atlanta startup that focuses on marketing, primarily through email lists. So companies that want to create, say, email newsletters or a marketing campaign can lean on Mailchimp to put something together for them and to maintain those email lists and handle the distribution. For the moment, Mailchimp has an office space that's close to our own here in Atlanta, though it will soon relocate to new office buildings that are currently under construction and are not too far away from where we are right now.
But anyway, the announcement caused some folks to engage in some head scratching, as it wasn't immediately apparent how Mailchimp would factor into Intuit's strategy, like, how would the two companies meld together? In a press release, Intuit said, "Together, Intuit and Mailchimp will work to deliver on the vision of an innovative end-to-end customer growth platform for small and mid-market businesses, allowing them to get their business online, market their business, manage customer relationships, benefit from insights and analytics, get paid, access capital, pay employees, optimize cash flow, be organized, and stay compliant, with experts at their fingertips." That's the end of that quote. And sure, I guess, like, I'm sure that there are lots of plans on how these companies are going to integrate. Uh, but yeah, like a lot of other people, this was one of those moves that surprised me and puzzled me a little bit. But we'll just have to wait and see how this all shakes out.
Now it's cryptocurrency time. So if you've listened to the show for any length of time, you know that I tend to be a little bit wary of cryptocurrency in general, for lots of reasons. There are tons of reasons to be a little skeptical about cryptocurrencies, but one of those reasons is that people sometimes go to great lengths to inflate a cryptocurrency's value in order to make a ton of money in a very short time before hitting the eject button and leaving everyone else holding the bag. That appears to be what happened with Litecoin. Someone went to a great deal of trouble to fabricate a fake press release that announced retail goliath Walmart would soon accept the cryptocurrency Litecoin as payment for online transactions. The scammers were able to get this fake press release accepted by GlobeNewswire. GlobeNewswire distributes press releases and sends them out to various news outlets, but apparently doesn't do a whole lot of quality control or fact checking before they do that. Then the news outlets, they get these press releases, and they typically want to get the scoop on major news.
A lot of these news outlets will just run a press release verbatim, so there's not even an article written about it. They'll just push publish on the press release and, boom, they've got some content up on their sites. So some of these news outlets ran with that story without actually checking with Walmart first to verify that it was in fact true. Well, that initial surge of information ended up pushing up the value of Litecoin. It jumped from around a hundred seventy-five dollars per coin to right around two hundred and twenty dollars per coin, so a pretty significant increase in the cryptocurrency's value. But then a Walmart spokesperson said the whole thing was a hoax, and the value came back down to the amount it was around the time that the hoax was first, you know, unveiled. Presumably this was all a pump-and-dump scheme that worked like gangbusters for at least a short while. Such schemes typically have a pretty short shelf life, but if you're in the know, you can make a whole lot of money.
Of course, you are engaging in some seriously shady practices, and this is the kind of stuff that turns the heat up on all cryptocurrencies as regulatory agencies start to look into them more closely.

Speaker 1: Well, from Walmart to Walgreens, let's keep the bad news train going. So, security experts criticized Walgreens for a lapse in data security. At the heart of the matter is personal data related to people who registered with Walgreens in order to get a COVID nineteen test. So in this process, you would register with Walgreens, you would create essentially an account, and you would get a unique thirty-two digit ID number to associate with your account. Then a back-end system on Walgreens', you know, network would automatically create a patient page, a web page specifically for you, with all of your personal information included on that web page, and the thirty-two digit ID that you received would be part of the URL for this specific web page. So as long as you have a link to that URL, you can visit the page. And there was no authentication process.
There was no, like, password or anything you needed to use before you saw it. You just needed the link. Well, that means someone could potentially create a means to test out various thirty-two digit IDs, just, like, generating them, you know, randomly or in sequence in a brute force attack, and then look to see which ones lead to valid URLs, and use that to scrape some pretty valuable personal information off the website. So you could, you know, essentially make wild guesses and try and find people's personal information. Now, thirty-two digits is a big number, so the odds of you actually getting hits are pretty remote, but it's not outside the realm of possibility. But in addition, just having access to someone's browsing history would give you the link to their personal URL, and thus all their personal information, because again, there was no protection on that. Recode published an article about this titled "How Walgreens' sloppy COVID nineteen test registration system exposed patient data." According to that piece, Recode alerted Walgreens of this problem.
They also said that other security researchers had done the same, and then Recode gave the company some time to address the issue. When that didn't happen, they went ahead and published the article. So there you go.

Speaker 1: You know, there are certain things that we all know to be true, and one of those is that the rules most of us have to follow don't necessarily apply to everyone, particularly people who have a lot of money and/or status. And that's true on Facebook, at least. The Wall Street Journal reports that Facebook has a system in place that essentially says people in this group are above the law. The system apparently includes around five point eight million Facebook users, and it includes people that Facebook deems as being influential, or newsworthy, or just a PR risk. In other words, if Facebook were to hold these people accountable, those people would make such a stink about it and would be able to get such publicity around it that it would create an enormous headache for Facebook. So I guess it's best to just let them do whatever they want and not be subject to the rules.
So when users get 332 00:20:13,640 --> 00:20:16,399 Speaker 1: added to this list, moderators will find it more 333 00:20:16,440 --> 00:20:19,639 Speaker 1: difficult to take any kind of action against those accounts. 334 00:20:19,680 --> 00:20:22,840 Speaker 1: So let's say that I got added to this list. 335 00:20:23,160 --> 00:20:26,280 Speaker 1: I'm sure I'm not on it. I'm not nearly important enough. 336 00:20:26,680 --> 00:20:28,800 Speaker 1: But let's say I was added to this list, and 337 00:20:28,840 --> 00:20:33,600 Speaker 1: then I started posting stuff that was explicitly against Facebook policies, 338 00:20:33,960 --> 00:20:39,320 Speaker 1: and someone rightfully flags my post and gets the attention 339 00:20:39,320 --> 00:20:42,320 Speaker 1: of a moderator. Well, the moderator might find that their 340 00:20:42,359 --> 00:20:46,960 Speaker 1: normal options, which might include everything from sequestering a post 341 00:20:47,040 --> 00:20:49,600 Speaker 1: so that fewer people see it, to blocking the 342 00:20:49,640 --> 00:20:53,600 Speaker 1: post entirely or deleting it, maybe even deactivating my account 343 00:20:53,760 --> 00:20:56,720 Speaker 1: temporarily or permanently, they might find that those are not 344 00:20:56,960 --> 00:21:00,800 Speaker 1: valid options and that they can't actually choose them. 345 00:21:01,119 --> 00:21:03,960 Speaker 1: That could be what happens in the case of someone 346 00:21:04,359 --> 00:21:08,320 Speaker 1: on this list getting flagged.
Now, not everyone inside Facebook 347 00:21:08,400 --> 00:21:11,480 Speaker 1: is really a big fan of this particular system, which, 348 00:21:11,720 --> 00:21:15,119 Speaker 1: according to the article, is called XCheck. And it 349 00:21:15,160 --> 00:21:17,360 Speaker 1: does sound a little bit like it was taken out 350 00:21:17,359 --> 00:21:20,040 Speaker 1: of Animal Farm, in which we learned that all animals 351 00:21:20,080 --> 00:21:23,840 Speaker 1: are equal, but some are more equal than others. I've 352 00:21:23,880 --> 00:21:26,880 Speaker 1: got a few more stories, including a couple more Facebook pieces, 353 00:21:27,200 --> 00:21:30,440 Speaker 1: but before we get to that, let's take another quick break. 354 00:21:38,000 --> 00:21:40,320 Speaker 1: We're back and, uh, I lied. I only have one 355 00:21:40,320 --> 00:21:43,280 Speaker 1: other Facebook piece, but here it is. Also in 356 00:21:43,359 --> 00:21:47,119 Speaker 1: Facebook news, the New York Times investigated how Facebook was 357 00:21:47,160 --> 00:21:51,879 Speaker 1: sharing data relating to misinformation campaigns on the platform. So 358 00:21:52,119 --> 00:21:56,320 Speaker 1: Facebook has been working with various researchers on this matter, 359 00:21:56,920 --> 00:21:59,800 Speaker 1: and as part of that, Facebook has been giving researchers 360 00:21:59,840 --> 00:22:05,000 Speaker 1: access to data about misinformation campaigns and how they affect people, 361 00:22:05,040 --> 00:22:08,760 Speaker 1: like how do people interact with misinformation campaigns? How many 362 00:22:08,800 --> 00:22:11,840 Speaker 1: people, you know, like it or share it or comment 363 00:22:11,960 --> 00:22:14,760 Speaker 1: on it and such, and how does that affect the 364 00:22:14,840 --> 00:22:18,320 Speaker 1: spread of misinformation?
However, the New York Times discovered that 365 00:22:18,320 --> 00:22:23,119 Speaker 1: Facebook was omitting half of all the US user activity on 366 00:22:23,200 --> 00:22:25,879 Speaker 1: Facebook in the data it shared, 367 00:22:25,960 --> 00:22:29,080 Speaker 1: like fifty percent of the data that needed to go 368 00:22:29,119 --> 00:22:33,159 Speaker 1: to researchers was just not there. And the belief was 369 00:22:33,160 --> 00:22:36,520 Speaker 1: that Facebook was sharing all that data. So that meant 370 00:22:36,560 --> 00:22:39,520 Speaker 1: researchers were working under the assumption that they had the 371 00:22:39,600 --> 00:22:42,280 Speaker 1: big picture, the full picture, when in fact they only 372 00:22:42,320 --> 00:22:44,440 Speaker 1: had half of it. Now, in the world of research, 373 00:22:45,280 --> 00:22:47,640 Speaker 1: that's a pretty big problem, because it means that any 374 00:22:47,720 --> 00:22:51,920 Speaker 1: conclusions you have drawn are based off of incomplete data sets, 375 00:22:52,080 --> 00:22:56,400 Speaker 1: and those conclusions could very well be faulty. 376 00:22:56,440 --> 00:22:59,679 Speaker 1: It appears as though this was an innocent oversight, at 377 00:22:59,760 --> 00:23:03,480 Speaker 1: least that's the implication I get from reading about it, 378 00:23:03,600 --> 00:23:06,840 Speaker 1: that this was not a deliberate attempt to create misinformation 379 00:23:06,920 --> 00:23:11,240 Speaker 1: about misinformation. But you know, I can't say that for sure. 380 00:23:12,200 --> 00:23:15,320 Speaker 1: It could be that this was at least partly deliberate 381 00:23:15,440 --> 00:23:18,520 Speaker 1: the whole time. I don't think it was, but I 382 00:23:18,560 --> 00:23:22,240 Speaker 1: don't have evidence pointing one way or the other.
383 00:23:23,320 --> 00:23:26,119 Speaker 1: But couple this problem with the fact 384 00:23:26,160 --> 00:23:29,879 Speaker 1: that Facebook banned some security researcher accounts, you know, 385 00:23:30,200 --> 00:23:32,600 Speaker 1: just a few months ago, saying that those 386 00:23:32,600 --> 00:23:35,840 Speaker 1: accounts were effectively scraping data from Facebook, which is against 387 00:23:35,880 --> 00:23:39,560 Speaker 1: the platform's policies, so as a result the researchers found 388 00:23:39,600 --> 00:23:43,080 Speaker 1: their accounts suspended. This all makes it seem as though 389 00:23:43,160 --> 00:23:47,120 Speaker 1: the company is throwing roadblocks in the way of researchers 390 00:23:47,119 --> 00:23:51,639 Speaker 1: who are looking into the platform's involvement with misinformation campaigns. 391 00:23:52,520 --> 00:23:57,600 Speaker 1: Whether that's intentional or not, it isn't great. It 392 00:23:57,640 --> 00:24:02,680 Speaker 1: at least seems like Facebook is protecting itself. Maybe not consciously, 393 00:24:03,080 --> 00:24:07,440 Speaker 1: but effectively, that's what's happening. And I would say 394 00:24:07,480 --> 00:24:11,080 Speaker 1: that these stories point to a need for Facebook 395 00:24:11,119 --> 00:24:14,320 Speaker 1: to get more proactive in making sure that people have 396 00:24:14,400 --> 00:24:16,800 Speaker 1: access to the information they need in order to really 397 00:24:16,800 --> 00:24:21,640 Speaker 1: get to the bottom of this particular subject matter. Now, 398 00:24:21,720 --> 00:24:26,119 Speaker 1: let's head on over to Pinterest.
So Christine Martinez, a 399 00:24:26,160 --> 00:24:30,439 Speaker 1: woman who claims to essentially be a co-founder of 400 00:24:30,480 --> 00:24:35,040 Speaker 1: the popular site Pinterest, is suing co-founders Ben Silbermann 401 00:24:35,359 --> 00:24:41,080 Speaker 1: and Paul Sciarra for allegedly stealing ideas and engaging in 402 00:24:41,240 --> 00:24:46,320 Speaker 1: unfair business practices, as well as a breach of implied contract. Now, 403 00:24:46,480 --> 00:24:50,119 Speaker 1: Martinez was never an employee of Pinterest, you know, she 404 00:24:50,240 --> 00:24:54,840 Speaker 1: was never officially associated with the company. However, in the lawsuit, 405 00:24:54,960 --> 00:24:58,040 Speaker 1: she says that she and the co-founders had this 406 00:24:58,280 --> 00:25:02,040 Speaker 1: implied agreement that she would be compensated for her ideas 407 00:25:02,600 --> 00:25:07,560 Speaker 1: when they were first thinking about creating Pinterest. And according 408 00:25:07,560 --> 00:25:11,199 Speaker 1: to the lawsuit, Martinez consulted with the co-founders and 409 00:25:11,280 --> 00:25:15,000 Speaker 1: gave guidance on the design and marketing of the site, 410 00:25:15,480 --> 00:25:19,640 Speaker 1: including the idea of organizing the site into boards as 411 00:25:19,680 --> 00:25:24,960 Speaker 1: a means of conceptualizing how Pinterest is structured.
They are 412 00:25:25,040 --> 00:25:29,520 Speaker 1: essentially like virtual corkboards upon which people can pin stuff 413 00:25:29,760 --> 00:25:33,240 Speaker 1: related to whatever the board's focus is, often with a heavy 414 00:25:33,280 --> 00:25:37,800 Speaker 1: emphasis on things like interior design, something that Martinez has 415 00:25:37,840 --> 00:25:42,520 Speaker 1: worked extensively in. And she says that the co-founders 416 00:25:42,600 --> 00:25:47,160 Speaker 1: had verbally indicated that they would compensate her for her contributions, 417 00:25:47,160 --> 00:25:49,800 Speaker 1: but that never happened. And after the company went 418 00:25:49,840 --> 00:25:52,879 Speaker 1: public in twenty nineteen and Martinez still had not received 419 00:25:52,880 --> 00:25:57,320 Speaker 1: any compensation, she says she realized those were empty promises 420 00:25:57,359 --> 00:26:00,160 Speaker 1: and they were never going to follow through on them. Now, 421 00:26:00,160 --> 00:26:02,560 Speaker 1: the company states it is going to fight the lawsuit 422 00:26:02,640 --> 00:26:05,560 Speaker 1: and that the charges are without merit. And in 423 00:26:05,600 --> 00:26:08,840 Speaker 1: semi-related news, Pinterest has been in some pretty hot water 424 00:26:09,040 --> 00:26:11,080 Speaker 1: for the last year or so, as women in the 425 00:26:11,119 --> 00:26:15,000 Speaker 1: company have raised issues relating to pay disparities within Pinterest, 426 00:26:15,359 --> 00:26:19,520 Speaker 1: pointing out that Pinterest targets a largely female user base 427 00:26:19,760 --> 00:26:23,480 Speaker 1: and yet apparently practices unfair compensation policies that show a 428 00:26:23,600 --> 00:26:27,639 Speaker 1: gap between male and female employees. We will put a 429 00:26:27,720 --> 00:26:30,520 Speaker 1: pin in this story for now, but we will come 430 00:26:30,520 --> 00:26:33,560 Speaker 1: back to it as more develops.
New York is the 431 00:26:33,640 --> 00:26:36,679 Speaker 1: latest state to legislate a ban on the sale of 432 00:26:36,760 --> 00:26:42,200 Speaker 1: all gas-powered vehicles, meaning all new vehicles 433 00:26:43,560 --> 00:26:48,000 Speaker 1: will have to be zero-emissions vehicles. Nothing else will legally 434 00:26:48,040 --> 00:26:51,080 Speaker 1: be allowed to be sold. Nothing new, that is. People 435 00:26:51,119 --> 00:26:53,840 Speaker 1: will still be allowed to sell their used vehicles that 436 00:26:53,960 --> 00:26:57,639 Speaker 1: run on internal combustion engines. It's not like, if you've got, 437 00:26:57,800 --> 00:27:01,879 Speaker 1: you know, an internal combustion car, 438 00:27:01,920 --> 00:27:04,639 Speaker 1: you're just stuck with it until it falls apart. However, 439 00:27:04,680 --> 00:27:06,480 Speaker 1: if you want to sell new vehicles in the state 440 00:27:06,480 --> 00:27:08,280 Speaker 1: of New York, they have to be zero emissions. This 441 00:27:08,359 --> 00:27:11,760 Speaker 1: puts a lot of pressure on the various car companies 442 00:27:11,760 --> 00:27:15,800 Speaker 1: out there to really get their electric vehicle and 443 00:27:15,880 --> 00:27:20,280 Speaker 1: zero-emission vehicle strategies in gear, so to speak. 444 00:27:20,280 --> 00:27:24,280 Speaker 1: That pun was semi-intentional. New York is not 445 00:27:24,359 --> 00:27:26,880 Speaker 1: the first state to do this. California has done it already. 446 00:27:27,840 --> 00:27:31,919 Speaker 1: There are other states that have similar bans in place. 447 00:27:32,040 --> 00:27:34,639 Speaker 1: There are other places in the world that are talking 448 00:27:34,680 --> 00:27:39,480 Speaker 1: about this as well.
So there's a general move toward 449 00:27:39,520 --> 00:27:44,320 Speaker 1: this, uh, forcing companies to migrate away from internal combustion 450 00:27:44,600 --> 00:27:48,679 Speaker 1: and fossil-fuel-powered vehicles and toward vehicles that 451 00:27:48,840 --> 00:27:53,200 Speaker 1: are electric. And again, I've said this many times: moving 452 00:27:53,240 --> 00:27:56,600 Speaker 1: to electric on its own is great. However, it does 453 00:27:56,800 --> 00:27:59,240 Speaker 1: require that you look at the big picture and see 454 00:27:59,240 --> 00:28:01,480 Speaker 1: where the electricity is coming from. If it's coming from 455 00:28:01,520 --> 00:28:05,480 Speaker 1: a renewable source, like a hydroelectric source, where you 456 00:28:05,480 --> 00:28:08,520 Speaker 1: know you've got a dam that's just generating electricity through turbines, 457 00:28:09,359 --> 00:28:13,160 Speaker 1: that's great, that's not creating carbon emissions. But if you're 458 00:28:13,200 --> 00:28:16,760 Speaker 1: getting your electricity from, say, a coal-fired power plant, 459 00:28:17,480 --> 00:28:22,280 Speaker 1: then while your car is not directly generating carbon emissions, 460 00:28:22,760 --> 00:28:27,280 Speaker 1: the charging of your car is contributing to carbon emissions. 461 00:28:27,400 --> 00:28:30,600 Speaker 1: So we always have to remember to keep looking outward. 462 00:28:31,040 --> 00:28:33,520 Speaker 1: There are ripples in these kinds of strategies, and we 463 00:28:33,560 --> 00:28:35,280 Speaker 1: have to keep looking outward to make sure that we're 464 00:28:35,280 --> 00:28:40,240 Speaker 1: addressing each piece of that bigger picture, or else we're 465 00:28:40,280 --> 00:28:44,320 Speaker 1: really just shifting problems from one part to another. And 466 00:28:44,440 --> 00:28:48,720 Speaker 1: now for a final couple of weird news items. One 467 00:28:48,840 --> 00:28:52,480 Speaker 1: is that a company called Sidus Studio X.
I think 468 00:28:52,480 --> 00:28:55,600 Speaker 1: I'm saying that correctly. It's S I D U S. Anyway, 469 00:28:55,640 --> 00:29:00,080 Speaker 1: this company has generated an AI-created influencer called 470 00:29:00,680 --> 00:29:03,280 Speaker 1: Rozy, spelled R O Z Y, and the whole purpose of Rozy, 471 00:29:03,320 --> 00:29:07,960 Speaker 1: who appears to be a young woman, is to serve 472 00:29:08,000 --> 00:29:11,600 Speaker 1: as an influencer and brand ambassador for, you know, whichever 473 00:29:11,720 --> 00:29:15,880 Speaker 1: companies decide to use Rozy's services. Rozy will be forever 474 00:29:17,040 --> 00:29:21,080 Speaker 1: young, presumably because Forever 21 was already taken, and will 475 00:29:21,160 --> 00:29:25,480 Speaker 1: post to various social platforms as part of marketing campaigns. 476 00:29:25,560 --> 00:29:28,320 Speaker 1: So I guess you could argue the only thing separating 477 00:29:28,440 --> 00:29:32,959 Speaker 1: Rozy from some other influencers out there is that Rozy 478 00:29:33,040 --> 00:29:37,280 Speaker 1: isn't, quote unquote, real, but rather a computer-generated entity. 479 00:29:37,360 --> 00:29:40,240 Speaker 1: But heck, some influencers appear to be nothing more than 480 00:29:40,280 --> 00:29:44,000 Speaker 1: a persona that's crafted to market and sell stuff. So 481 00:29:44,040 --> 00:29:46,280 Speaker 1: you can say, well, what's the difference? If an influencer 482 00:29:46,360 --> 00:29:49,400 Speaker 1: is just doing the same thing as this computer-generated model, 483 00:29:49,440 --> 00:29:54,160 Speaker 1: then, you know, to 484 00:29:54,000 --> 00:29:57,800 Speaker 1: the outward-facing audience, it's the same. But according to Sidus Studio X, 485 00:29:58,120 --> 00:30:00,280 Speaker 1: the really big difference is that brands will never have 486 00:30:00,400 --> 00:30:03,840 Speaker 1: to worry about Rozy going off script.
Rozy is not 487 00:30:03,880 --> 00:30:06,520 Speaker 1: going to engage in some sort of online feud with 488 00:30:06,560 --> 00:30:11,080 Speaker 1: other influencers. Rozy won't get pulled into a scandal. Rozy 489 00:30:11,120 --> 00:30:14,560 Speaker 1: is not going to post inflammatory statements about disadvantaged or 490 00:30:14,640 --> 00:30:17,920 Speaker 1: vulnerable groups. We've seen tons of stories over the last 491 00:30:17,920 --> 00:30:20,760 Speaker 1: few years of various folks who have become really influential 492 00:30:20,800 --> 00:30:24,840 Speaker 1: online and how they've sometimes engaged in behavior that was at 493 00:30:24,920 --> 00:30:30,120 Speaker 1: best problematic. Rozy will not do that. Now, the question remains, 494 00:30:30,160 --> 00:30:34,280 Speaker 1: will people follow Rozy? Will they find Rozy to be 495 00:30:34,680 --> 00:30:38,720 Speaker 1: influential and interesting? Will folks care to follow an artificial 496 00:30:38,760 --> 00:30:42,000 Speaker 1: person to see what they post? On the sponsor side, 497 00:30:42,160 --> 00:30:44,960 Speaker 1: the company says that it's already received a hundred offers 498 00:30:45,280 --> 00:30:48,600 Speaker 1: and is processing them. But we just don't know if 499 00:30:48,600 --> 00:30:50,520 Speaker 1: a lot of people are going to eagerly follow the 500 00:30:50,640 --> 00:30:55,520 Speaker 1: fictional exploits of a computer-generated person. Then again, there 501 00:30:55,560 --> 00:30:59,200 Speaker 1: are entire real-world concerts that feature CGI 502 00:30:59,320 --> 00:31:03,560 Speaker 1: characters performing in front of a crowd. So stranger 503 00:31:03,600 --> 00:31:08,520 Speaker 1: things have happened.
And finally, a company called Colossal wants 504 00:31:08,520 --> 00:31:12,160 Speaker 1: to use the gene editing technology known as CRISPR, that's C 505 00:31:12,560 --> 00:31:17,040 Speaker 1: R I S P R, to clone woolly mammoths. And to 506 00:31:17,080 --> 00:31:23,360 Speaker 1: call it cloning is kind of being a little generous. Rather, 507 00:31:23,440 --> 00:31:26,479 Speaker 1: they really want to take genetic information from woolly mammoth 508 00:31:26,560 --> 00:31:30,640 Speaker 1: remains and then use that genetic information to edit the 509 00:31:30,680 --> 00:31:37,200 Speaker 1: gene sequences of elephants, to introduce genetic sequences that will 510 00:31:38,080 --> 00:31:43,320 Speaker 1: allow elephants to take on certain genetic aspects of woolly mammoths, 511 00:31:43,360 --> 00:31:48,360 Speaker 1: like the ability to adapt to extreme cold-weather environments. 512 00:31:48,400 --> 00:31:51,640 Speaker 1: So this isn't so much a Jurassic Park situation as 513 00:31:51,720 --> 00:31:55,200 Speaker 1: it is an Island of Dr. Moreau situation, except, as 514 00:31:55,200 --> 00:31:57,440 Speaker 1: far as I know, Colossal is not planning on making 515 00:31:57,440 --> 00:32:01,040 Speaker 1: elephants more human-like in the process. The company does 516 00:32:01,080 --> 00:32:04,000 Speaker 1: say that its dream goal is the restoration of the 517 00:32:04,000 --> 00:32:07,080 Speaker 1: woolly mammoth, though I would argue that really this is 518 00:32:07,080 --> 00:32:10,560 Speaker 1: more of a soft reboot of woolly mammoths as opposed 519 00:32:10,560 --> 00:32:14,080 Speaker 1: to a restoration.
According to Ben Lamm, the chief executive of 520 00:32:14,120 --> 00:32:17,840 Speaker 1: the company, the real purpose for Colossal is to serve 521 00:32:17,920 --> 00:32:21,920 Speaker 1: the field of genetic preservation: that by creating these types 522 00:32:21,960 --> 00:32:26,000 Speaker 1: of creatures, we can then preserve the genetic info and 523 00:32:26,480 --> 00:32:30,520 Speaker 1: the populations of various endangered and threatened species. It gives 524 00:32:30,600 --> 00:32:33,960 Speaker 1: species an evolutionary speed boost to allow them to potentially 525 00:32:34,040 --> 00:32:37,240 Speaker 1: roam new regions when their traditional homes are under threat 526 00:32:37,680 --> 00:32:41,240 Speaker 1: due to, like, human encroachment or climate change, that kind 527 00:32:41,240 --> 00:32:44,640 Speaker 1: of thing. Lamm argues that the company will protect 528 00:32:44,680 --> 00:32:47,960 Speaker 1: biodiversity this way and bring more attention to the crisis 529 00:32:48,040 --> 00:32:53,320 Speaker 1: of biodiversity collapse, which is a real thing for multiple species, 530 00:32:53,360 --> 00:32:55,760 Speaker 1: you know, not just animals but also plants. Like, this 531 00:32:55,800 --> 00:33:00,240 Speaker 1: is a big crisis. However, there are other people out there 532 00:33:00,240 --> 00:33:02,920 Speaker 1: who are arguing that the money being spent to do 533 00:33:02,960 --> 00:33:08,120 Speaker 1: this could be better spent toward protecting existing species rather 534 00:33:08,200 --> 00:33:11,840 Speaker 1: than trying to get a simulacrum of an extinct species 535 00:33:12,000 --> 00:33:15,000 Speaker 1: up and about again. I don't know where I come 536 00:33:15,040 --> 00:33:17,719 Speaker 1: down on this.
I kind of would rather see us 537 00:33:18,000 --> 00:33:22,400 Speaker 1: protect the stuff that's already out there and perhaps build 538 00:33:22,400 --> 00:33:27,080 Speaker 1: out, you know, kind of genetic repositories so that we 539 00:33:27,160 --> 00:33:33,400 Speaker 1: do have the capability of potentially helping species that are 540 00:33:33,520 --> 00:33:37,440 Speaker 1: on the verge of extinction make a recovery. It's 541 00:33:37,480 --> 00:33:41,280 Speaker 1: a little squicky to me simply because, 542 00:33:42,280 --> 00:33:44,440 Speaker 1: you know, we can't really understand what the 543 00:33:44,480 --> 00:33:49,080 Speaker 1: consequences will be of trying to bring an extinct species back. 544 00:33:49,720 --> 00:33:52,360 Speaker 1: Like, if we did it and then that species were 545 00:33:52,400 --> 00:33:55,080 Speaker 1: to die out again because we did a bad job 546 00:33:55,120 --> 00:33:58,280 Speaker 1: of it, I would say that the whole attempt was 547 00:33:58,400 --> 00:34:02,800 Speaker 1: rather unethical in that case, right? Like, it's tricky. I 548 00:34:02,840 --> 00:34:07,680 Speaker 1: get the motivation. I just worry about the execution. Anyway, 549 00:34:08,160 --> 00:34:13,719 Speaker 1: that's it for the news for Tuesday, September twenty one. If 550 00:34:13,719 --> 00:34:16,600 Speaker 1: you have suggestions for topics I should cover on tech Stuff, 551 00:34:16,920 --> 00:34:18,920 Speaker 1: reach out to me. The best way to do that 552 00:34:19,040 --> 00:34:21,320 Speaker 1: is over on Twitter. The handle we use for the 553 00:34:21,320 --> 00:34:25,520 Speaker 1: show is tech Stuff HSW. I look forward to hearing from you, 554 00:34:25,840 --> 00:34:35,000 Speaker 1: and I'll talk to you again really soon. Tech Stuff 555 00:34:35,080 --> 00:34:38,239 Speaker 1: is an I Heart Radio production.
For more podcasts from 556 00:34:38,239 --> 00:34:42,040 Speaker 1: I Heart Radio, visit the I Heart Radio app, Apple Podcasts, 557 00:34:42,160 --> 00:34:44,120 Speaker 1: or wherever you listen to your favorite shows.