Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, June fifteenth, twenty twenty three. First up, Google leadership has reportedly warned employees against using chatbots to do stuff like, you know, organize information that could include sensitive or proprietary data, and that warning also covers Google's own chatbot, Bard. You might think it's a bit concerning that a company that has developed an AI chatbot, and is actively marketing that chatbot to business customers, has now warned its own staff against using such tools in the first place, for everything from organizing information to developing code. I also think that's concerning. Google's messaging has been that the company wishes to remain transparent, and that it acknowledges these tools are far from perfect. Not only can they generate responses that are unreliable due to AI hallucinations, they also have the potential to incorporate any information that you submit to them. In other words, if you're using an AI chatbot to help you create a presentation meant for an internal meeting, and that data includes stuff that's not meant for public consumption, it's possible the chatbot could essentially absorb that data and, who knows, maybe populate some future response to someone else, perhaps a competitor, with the information you provided. We've already seen how an unintentional error can compromise information shared with a chatbot: ChatGPT famously had a glitch in which users were able to see past conversations that other people had had with ChatGPT. Interestingly, Google does offer a version of Bard to commercial customers that, for a price, will keep conversations strictly on the DL. That is, Bard won't incorporate data entered in such interactions into the larger public database of information.
Speaker 1: I think the story really reinforces the fact that Google rushed into the AI space, that it felt pressured by the launch of ChatGPT, and that the company had not intended to go to market so soon with Bard. But that's already known. The fear is that tools like ChatGPT could potentially spell disaster for traditional web searches, and so to Google, the emergence of generative AI represented an existential threat. With Google's business so heavily dependent upon search advertising, a hit to the search business would be potentially devastating, and so the company barreled into launching its own AI chatbot. Whether this ultimately keeps Google's business safe remains to be seen, but based upon the company's warning to its own employees, I'm not confident that it's the best tech to push out worldwide. I guess Google's former motto, don't be evil, is now do as I say, not as I do.

Speaker 1: Věra Jourová, the Vice President of the European Commission for Values and Transparency, is calling for EU legislators to update the Code of Practice on Disinformation to include new rules about generative AI. She is calling for tech companies in the space to create guidelines and rules that will protect EU citizens against misinformation and disinformation created by AI, and that could include things like labeling when content was generated by AI in the first place. According to PCGamer dot com, she was asked if the new rules would have a negative impact on the freedom of expression. To that she replied, quote, I don't see any right of machines to freedom of expression, end quote. That's an interesting point, saying that, well, machines don't have that right; they're not people. The EU is also a region where, in the past, advocates have argued that legislators should consider whether robots have a right to personhood. And I know robots and AI chatbots are different things, but they start to bleed together awfully quickly, so it might sound a bit premature to hold conversations about whether or not robots should be treated as people.
Speaker 1: To be clear, we are a very long way away from machines that have a sense of self. But part of the call for such consideration was to help lay out clear rules as to who should be held responsible if a machine operating under artificial intelligence causes harm, either to people or to property. Should the manufacturer be responsible in that event? Should the programmers? The end user? These are big questions, and they are relevant in a world where we have stuff like vehicles operating at some level of autonomy. If there's an accident, who should be held responsible? But back to Jourová's point. She believes that the Code of Practice on Disinformation needs updates, specifically to address the threat of generative AI, and that any restrictions on that communication can't be considered a violation of free speech. I'm not sure that argument would actually fly here in the United States, not because there's a strong legal basis to provide freedom of speech to machines (there's not), but because such speech could be seen as an extension of a corporation's communications, namely the company responsible for creating the AI in the first place. And as I'm sure most of you know, here in the United States corporations are legally considered to be people, complete with the right to expression. So it will be interesting to see where this goes from here, whether it will progress in the EU, and how companies and the government in the United States might end up shifting as a result.

Speaker 1: Now, here in the US, there is a bipartisan effort to alter a different law to make exceptions for generative AI. This time we're talking about the infamous Section 230. Generally speaking, this law limits the liability of web-based platforms for the stuff that their users post to those platforms. So, in other words, because of Section 230, if someone were to post illegal material on Facebook, Meta would not be held responsible for hosting that information.
Speaker 1: Now, this rule has its own limitations. Platforms are supposed to take reasonable steps to address illegal material, and there are further questions as to what role the platforms actually play in disseminating information, including misinformation and illegal material, because these platforms have their own recommendation algorithms. So there are questions about, well, if a platform is promoting something, doesn't that mean the platform is at least partly accountable for it? But generally speaking, platforms enjoy a great deal of legal protection regarding the stuff that people post to them, which has vexed both sides of the political aisle here in the United States, for very different philosophical reasons. Now, Democrat Richard Blumenthal and Republican Josh Hawley have proposed a bill that would essentially say content from generative AI exists outside the protections of Section 230. So if this bill were to become law, there would be a legal foundation to bring lawsuits against platforms that allow harmful generative AI content on them, and that includes stuff like deepfake videos or AI impersonations of people's voices. So if this did become law, and someone uploaded a deepfake video that appears to show you committing illegal or awful acts, you would have the legal foundation to bring a lawsuit not just against the person responsible for creating the deepfake, but against, say, Meta for allowing it on the Facebook platform in the first place. I imagine the various platforms out there will have a lot of objections to this proposal. Most of them have limited, if any, involvement with actual generative AI as it stands. And it also raises the question of what makes AI-generated harmful material different from harmful material created by a person. Right? If Meta can't be held responsible because Jimbo posted illegal material, but it can be held responsible if Jimbo's bot posted the illegal material, what's the reasoning behind that? Why is one allowed and one not? Or why is one protected and one not?
Speaker 1: Now, I suppose one major difference you could argue is that you could create a whole lot of harmful material using AI a whole lot faster than if you went, you know, the more bespoke malicious content route: this content was handcrafted to be evil. Anyway, this is still in the bill phase. It will have to move through a lot of different steps if it is ever to become law, and there's no guarantee that will happen. But it is interesting that it's a bipartisan proposal.

Speaker 1: Do you need a get-rich-quick scheme? Well, how about creating an AI-centric company? The Financial Times reported on a new startup in the EU called Mistral, which is about a month old. And you might wonder, what is Mistral? Right now, it's a company that has about a dozen staff and no product. It's also a company that received more than one hundred million dollars in seed funding over the last month. So you've got yourself a small staff, and you have nothing to sell. What could possibly make this company worth that much of an early investment? Well, Mistral's goal is to produce its own large language model. That means it would be in competition with the likes of Google and OpenAI, among many others. And it shows how the AI gold rush is still going strong, that lots of people are betting big on AI having a huge impact moving forward. For that to extend to investing in small startups early on, when you've already got established players like Google and OpenAI out there, is pretty remarkable. And the investors are folks who are heavy hitters, including a former Google CEO, namely Eric Schmidt. Now, I should add, it's not like Mistral is just a bunch of folks who don't know what they're doing. It's not like a bunch of people who said we're going to make some AI and they have no background. That's not true. In fact, the CTO for the company is Tim Lacroix, who cut his teeth over at DeepMind.
Speaker 1: That is the AI-focused subsidiary of Alphabet, which, in case you've forgotten, is Google's parent company. So I suppose that knowing the expertise of the team in place goes a long way toward raising expectations among investors, and it gives them the confidence to pour that kind of huge money into a small company that doesn't have anything to show for it so far. All right, we've got some more news items we're going to cover in this episode, but first let's take a quick break.

Speaker 1: We're back. So TechCrunch reports that Twitter's offices in Boulder, Colorado are headed toward eviction. The Chicago-based company that owns the office space sought approval to evict Twitter after the company failed to pay three months' worth of rent. In fact, according to TechCrunch, Twitter had a letter of credit worth nearly a million dollars with this landlord and was simply drawing upon that letter of credit to cover the rent month after month, until the credit ran out back in March of this year. Now a judge has ordered the sheriff's office to assist in the eviction before the end of July. We've heard time and again that part of Twitter's cost-saving strategy was, you know, to just stop paying vendors and rent and stuff, so I'm sure this move didn't come as a complete surprise to the company when they were told that their offices were going to get evicted. I'm also pretty sure that after all the rounds of layoffs, there probably aren't that many Twitter employees left in Boulder to begin with. But it's another ugly chapter in the post-Elon Musk takeover Twitter story.

Speaker 1: The United States Federal Communications Commission, or FCC, previously approved a labeling rule that applies to broadband service providers.
Speaker 1: Now, this rule is meant to make the various elements of a customer's bill more transparent, so that consumers can actually see what they're really paying for, like how much of the bill is going to cover the base service versus government fees versus, you know, weird service fees that don't seem to cover anything other than padding the bill. And Comcast, a ginormous broadband service provider that also owns lots of other stuff, including NBCUniversal, has filed a request to drop some of the requirements of the labeling, stating that quote, two aspects of the Commission's order impose significant administrative burdens and unnecessary complexity in complying with the broadband label requirements, end quote. Now, I'm not sure that arguing your fees are so complicated that explaining them is a hardship is the right way to go, but anyway, the purpose of the rules is to make it harder for broadband providers to obfuscate how much consumers will actually pay when they sign up for service. The argument goes that providers will often run promotions that make it seem as if customers will pay relatively low monthly bills, but then when it actually comes time to pay the bill, the customer will see all sorts of different fees piled on top of their basic service, which inflates the amount significantly. So the rules are meant to force providers to be more upfront about such things, and Comcast is essentially saying this is too hard. Now, to be somewhat fair to Comcast, some of the rules do get pretty involved. For example, providers have to keep a record of whenever they provide labels to customers through quote unquote alternate sales channels, and that can include stuff like if you were to have an interaction with someone at a kiosk, like a Comcast kiosk in a mall, or if you were to call up a customer service rep and ask to be told the label over the phone. All of those would count as alternate sales channels, and Comcast is supposed to keep a record of every single one. Comcast argues that the company has millions of interactions with customers and potential customers, and recording every single time that a Comcast employee communicates the label information to a customer would rapidly become very difficult to manage. And you know what, I think that's probably true. But at the same time, the FCC wants to hold these companies accountable, and part of that is keeping track of when they're following the rules and when they fail to do so. So I'm not sure there's an easy solution here. I do think that more transparency is absolutely needed, because a lot of those fees, if I'm being honest, seem a bit sus, if you're asking me.
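To make that record-keeping requirement a little more concrete, here's a minimal sketch in Python of the kind of record a provider might log for each alternate-channel disclosure. The structure and field names are invented for illustration; they are not drawn from the actual FCC order.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    # Hypothetical record of one label disclosure through an "alternate
    # sales channel" (field names are illustrative, not from the FCC order).
    @dataclass
    class LabelDisclosure:
        channel: str            # e.g. "mall_kiosk" or "phone_support"
        disclosed_at: datetime  # when the label was shown or read aloud
        interaction_id: str     # provider's internal reference for the interaction

    # One phone-support disclosure; at Comcast's scale there would be
    # millions of these records, which is the burden the company cites.
    record = LabelDisclosure(
        channel="phone_support",
        disclosed_at=datetime.now(timezone.utc),
        interaction_id="example-0001",
    )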
Speaker 1: San Francisco, California has become the first city in the United States where more than half of its car sales fall into either the electric or hybrid vehicle categories. Now, technically the city hit that benchmark in March, and April sales figures showed an even stronger tendency for car shoppers to go electric or hybrid. And that's particularly good news for Tesla, because about half of all those vehicles actually came from Tesla, so about a quarter of all car sales in San Francisco are Tesla vehicles. That's incredible.
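For a quick sanity check on that "quarter" figure, here's the back-of-the-envelope arithmetic in Python. The 0.5 values are the rough shares quoted above, not exact sales data.

    # Rough shares quoted above (illustrative, not exact sales figures)
    ev_hybrid_share = 0.5  # "more than half" of car sales were EV or hybrid
    tesla_within_ev = 0.5  # "about half" of those came from Tesla

    # Tesla's share of all car sales: half of a half is a quarter
    print(ev_hybrid_share * tesla_within_ev)  # 0.25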
Speaker 1: Now, this doesn't necessarily mark a trend for the larger United States. For one thing, you do have states like Wyoming that have obstinately introduced legislation that would actually phase out new electric vehicle sales, which was just a response to being told that they should phase out internal combustion engine vehicle sales. I should also add that those states have very small populations, so ultimately they don't have a whole lot of say in the matter, because the automotive industry is going to respond to the majority, not the minority. So, you know, it doesn't make sense to produce internal combustion engine vehicles if it's only the state of Wyoming that's buying them. But anyway, another reason why this isn't necessarily a trend across the entire country is that electric vehicles are still more expensive than other vehicles, and the people who buy them are typically on the affluent side, and that describes a large part of the population of San Francisco. Obviously, there are a lot of people living in San Francisco who are not affluent, but the city has a larger affluent population than a lot of other cities do, and so we're not likely to see a huge shift toward electric vehicles just because San Francisco did it. But it still has become the first US city where more than half of the cars purchased in a month were electric or hybrid vehicles.

Speaker 1: Finally, a rear admiral for the country of Iran recently showed off what was claimed to be a quantum processor. It was a circuit board with lots of chips installed on it, like a circular array of chips. It looked, you know, circuit-boardy. But if you know anything about quantum computers, you would probably think, well, that can't be right, and you would have been correct. The circuit board, while admittedly kind of nifty looking, turned out to be nothing more than a ZedBoard Zynq-7000 development board. It's a system on a chip, or SoC, and it's meant for developers to use to, you know, develop applications and software and stuff. It is not a quantum processor. It's a classical computer processor, or system on a chip. And it's not even that powerful. It has 512 megabytes of DDR3 RAM (note I said megabytes, not gigabytes) and a dual-core ARM Cortex-A9 processor, so it's not the kind of equipment you would need to keep qubits in superposition as you run incredibly complicated quantum algorithms through the system.
Speaker 1: Now, I'm not sure if the Iranian government or the Iranian military was behind this as kind of a way of posturing and making claims that, you know, they can't actually back up, or if perhaps the authorities had been hoodwinked by some snake oil salespeople who passed off this development board as a quantum processor and said, yeah, yeah, sure, sure, it's quantum, now cough over the dough, Mac. Because there's a history of tech scam artists in the Middle East passing off substandard or outright fake technology as if it were the real thing, it's hard to say.

Speaker 1: Now, before I go: on Monday we will be publishing an episode of IBM Smart Talks in the feed, so you'll see that as opposed to a normal TechStuff episode. And yeah, that's about it for that. We'll be having more Smart Talks episodes publishing about once a month moving forward, and everything else is all Jonathan, all the time. So I hope you are all well, and I know I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.