[00:00:02] Speaker 1: Bloomberg Audio Studios, podcasts, radio news.

[00:00:13] Speaker 2: Ring's Super Bowl ad showcasing AI to help find lost pets has put a new spotlight on the expanding role of artificial intelligence in home security, and it's raising fresh questions about surveillance. Joining us now is Jamie Siminoff, founder and chief inventor of Ring. Jamie, safe to say these cameras have become ubiquitous. I saw in Weapons, you know, they were highlighted as a way to investigate the disappearance of children. That was a fictional movie, but clearly law enforcement is hoping to use these cameras more and more. Where do you draw the line between, you know, helping people find lost pets or people, and trying to limit the surveillance of US citizens?

[00:01:03] Speaker 1: I mean, the line for us is simple, and it always has been: you control your video. If you want to share it with law enforcement, if you want to share it with a neighbor, you can do that, but you always control your video. In the case of Search Party for dogs, we just tell you, hey, this dog looks like the dog in front of your house. Do you want to contact your neighbor? If you say no, you're one hundred percent anonymous. If you say yes, then you talk to your neighbor, just like you did when you found their dog in front of your house years ago and called the number on the tag.

[00:01:35] Speaker 4: I do want to ask you, because there's this big question of surveillance and the usage of it. You had told Fortune, talking about the missing-person case of Nancy Guthrie, basically that if there had been more cameras in the house, it might have been solved. And Jamie, you faced a little bit of backlash online for those comments. So I would love to give you the opportunity to explain your thinking there and clear up any misunderstandings.

[00:01:56] Speaker 1: Sure. I mean, if you actually saw the quote, it's not that. But what I did say, and what I do, is I wake up every morning, you know, with a mission, and knowing that the business that we're in has been extremely impactful, and we see it. The video evidence that they have so far in the Nancy Guthrie case, I think everyone would agree, is some of the most impactful evidence in the case. And so what I did say is that if there had been just more cameras in general, I would have hoped that this would have helped us get to a resolution in the case, and that is what I believe in. I mean, this is what I do every day, and I think it's hard to watch a Dateline episode or anything else and not see where Ring has had a major impact in helping solve cases. And I do believe, you know, as I think I should, that more cameras should help solve more cases and reduce more crime.

[00:02:52] Speaker 2: How do you work with cybersecurity within your firm? Because, you know, I also have a ton of cameras around my house, and I would hate to have anyone else tap into that network and use my video surveillance against me. I'm sure it's something you think about often, Jamie. So what kind of work do you do to counter that kind of cyber attack?
[00:03:18] Speaker 1: Yeah, I think cybersecurity is a living, breathing thing. You have to just continue, I mean, you have to continue to invest in it. You have to have people always looking at it. It's always changing and evolving. I think we're best in class in it. Also, I believe we're one of the only companies in the world in this space that offers end-to-end encryption as an option. So we really do focus on security. We focus on trust. We call our customers neighbors, and we need our neighbors to trust us, or else obviously they're not going to want to have our products, and then I can't achieve my mission of making neighborhoods safer.

[00:03:48] Speaker 4: And Jamie, just on this point, bringing it back to AI too, because AI is advancing so rapidly. There was the story, for example, of a Chinese home robot vacuum brand that someone used Claude to hack into, and accidentally hacked all seven thousand of those robot vacuums in the world. AI is getting really advanced, but it's also exposing some security flaws. What do you and the team need to do at Ring to make sure you're keeping up with the technology, not just mistakes like that, but bad actors who are increasingly able to use AI?

[00:04:21] Speaker 1: I mean, there's always been security threats, and I'd say what's good is that as AI gets better, it means that your counter-security, your, you know, countering of these kinds of things, is also getting better. And so it's just a cat-and-mouse game that we've always had from the beginning of the company, and, you know, I believe we're best in class at it, but it is something you have to stay focused on. I mean, security for every company in the world, you've got to stay on your toes, but you always had to. AI is just the new thing that's happening today. And then you also, you know, use this technology to also do the countermeasures.
[00:04:55] Speaker 2: I want to ask about manufacturing, because President Trump is trying to bring manufacturing back to these shores, and Ring cameras, obviously, are a booming business that uses a lot of hardware, I would imagine for the most part manufactured in Asia. How do you, when you talk to the administration or when you talk to government leaders, impress upon them the fact that we just can't, in this country, really compete in terms of price on manufacturing this kind of hardware?

[00:05:28] Speaker 1: You know, I mean, certainly it's a complex one. You know, there's an ecosystem around this stuff, and that's what we're seeing. But I am hopeful that, you know, over the next few years, with what's happening, we will continue to move and onshore things and be able to do more of it. So I do think it's going to take time for businesses like ours, because there are ecosystem things. You know, there's probably nine hundred to one thousand parts in every single Ring camera that come from different suppliers, and so it's going to take time. But I am seeing, I'd say, a lot of good, you know, positive sort of direction on that.

[00:06:03] Speaker 2: In terms of the tariff uncertainty that businesses are dealing with, Jamie, how do you face that, you know, when you're looking at tariffs that are, you know, twenty percent, and then all of a sudden they're eighty percent, and now they're fifty? I mean, how do you deal with that as a CEO?

[00:06:21] Speaker 1: You know, I mean, I'm sort of a CEO, but I'm also a founder. You know, I started this in my garage in California fifteen years ago. I'd say tariff uncertainty, compared to the things that I've lived through, while it is serious, it's not as serious as all the other stuff that I've gone through. And I think when you're a founder and you've, you know, faced bankruptcy many times, and trying to raise money, and things changing that are completely out of your control, I think it just hardens you a little bit. I have a pretty thick skin. We just have to keep, you know, keep our heads down. I tell the team, all we can do is control our inputs. Just keep inventing great products, making neighborhoods safer, and we'll get rewarded for doing that. And we just got to, like, you know, hope that the world continues to go in a positive direction, which, you know, so far it has.

[00:07:03] Speaker 4: So, less thinking about the hardware that goes into it, Jamie, I'm thinking about the coding and the software and the technology within it. What does your talent pool look like? Are you still expanding when it comes to hiring engineers, or are you looking at technology that's making your employees more efficient and maybe means you need fewer of them?

[00:07:24] Speaker 1: I mean, certainly we are embracing AI. I am, you know, pushing every team member here to be AI first, think in AI. We are seeing great efficiencies from that. We're also growing really fast, though, so it's hard to tell. You know, we are still, I'd say, a net grower, and so we are still growing. Our features are coming out faster now, though; our product cycle times have definitely been reduced by, like, you know, almost a triple-digit percentage. I mean, it's incredible how efficient we are now getting with AI and what it's doing. That said, we're also still growing. We have lots and lots of ideas and business to build. So, you know, on the bigger picture, I don't know where that goes, but I do know for Ring, we're, you know, we're still growing and still building.

[00:08:08] Speaker 4: And Jamie, can I just say, it's always very cool when a founder stays with the company through this era of big growth too. So really fantastic insight from both the head of Ring and an inventor himself. Jamie Siminoff of Ring, thanks for joining us.