Speaker 1: Bloomberg Audio Studios, podcasts, radio news.

Speaker 2: Gaming platform Roblox is introducing a new feature for verifying users' ages. Take a selfie, and the AI will estimate the user's age. If the tech determines they're over thirteen, then the user will have access to the new Trusted Connections feature, where they can chat freely with other Roblox users. Delighted to say that joining us now is Roblox's CEO, David Baszucki. Dave, good morning. Thank you for being back on Bloomberg Tech. I find the tech so interesting here. Could we just go over the basics of how it works and why this is the right technology for this issue?

Speaker 3: Yeah, thank you.

Speaker 1: We've been innovating on safety and civility for almost twenty years, and what we're rolling out today is our vision for age-based communication. Age-based communication, especially for teens thirteen through seventeen, which is a vulnerable time, involves a couple of things.
First, trusted connections, so identifying friends and people that you know and trust, and second, using age estimation to enable trusted connections to actually allow those teens to speak a little more freely on our platform, even though we do scan the text for critical harms. And this is super important because, business notwithstanding, engagement notwithstanding, we're trying to keep people on our platform so they don't jump to other platforms where they may share images or where they may communicate without text monitoring.

Speaker 2: Dave, the skeptic here will say, well, how good is the AI? How can the video, and an AI analysis of it, guarantee a correct result that the person using it is indeed age thirteen or older?

Speaker 3: I want to highlight a couple of things.

Speaker 1: We've been innovating on AI for four years plus, and we're running over two hundred AI systems. We've open sourced our voice moderation system because it's so good, and we've really advanced the state of the art in text moderation systems as well.

Speaker 3: This is age estimation. The AI is really good.
Speaker 1: It's not perfect, of course, but we're leaning conservative in that case, and I'll highlight we're using this in a conservative way to enable freer communication that is still monitored. So we've really thought a lot about it, and we think this is the future of how teens will communicate.

Speaker 2: In testing, what is the proportion of sort of false positives and accurate results? And is there an element of sort of human moderation in the first instance?

Speaker 1: Yeah, well, I want to highlight that historically there's always been an element of moderation on Roblox, and all text, all voice, all images have always gone through moderation, including AI and human moderation. So we have such a firm foundation for safety. We're using age estimation to, in a way, slightly free

Speaker 3: up the communication.

Speaker 1: If you and I were fourteen on Roblox and you called me a butthead, I wouldn't see it.

Speaker 3: We'd see a bunch of hashtags.
Speaker 1: But it's the kind of thing a lot of teens like to chat about, and when it's with people you know and trust, we're open to letting them communicate more freely in that way.

Speaker 2: Dave, it's probably really helpful to put this in the context of a case study example, and Grow a Garden is probably the best, right? So if you just bear with me, and then we'll bring it back to product safety: there are all of these teenagers wanting to be on Grow a Garden and then share with their peers, right, their friends, their community of a similar age. How does this new technology fit into the daily run of that game and how players and users interact with one another?

Speaker 3: Yeah, thanks for highlighting Grow a Garden.

Speaker 1: Just as last summer Dressed to Impress was going crazy on Roblox, right now Grow a Garden has hit over twenty million people playing it at the same time, which is actually a record, we believe,

Speaker 3: for any game in the history of gaming.
Speaker 1: People really want to hang out with their friends and communicate, and up until now, if you and I were fifteen and we were trying to communicate, once again, we might see some hashtags and some blocking when we're trying to compete in the game.

Speaker 3: With the release of this,

Speaker 1: if we are trusted connections and we can validate we're people we know and we trust, and then we take a selfie to estimate our age, we're going to be able to communicate more freely and have possibly more fun in Grow a Garden, and be less likely to go to some other platform where maybe we start sharing selfies or things that we don't allow on Roblox.

Speaker 2: Right. Dave, is Grow a Garden the biggest game you've had ever?

Speaker 3: Well, I think it's been very public.

Speaker 1: A couple of weekends ago we hit over thirty million people on Roblox playing at the same time, and for a moment in time, Grow a Garden hit over twenty million people playing at the same time. It's not just the biggest we've ever had on Roblox.
That is, we believe, the largest concurrent players of any game in history.

Speaker 2: And just very quickly, Dave, would you just define largest concurrent players?

Speaker 1: We believe, from the Guinness Book of World Records, that twenty million is the largest concurrent players of any game in real time in history.

Speaker 2: Okay, understood. What many people don't appreciate about Roblox is the scale of your own infrastructure, which I think that you run yourself. You have a lot of footprint on prem. With all the activity, are your servers and CPUs just melting on Friday and Saturday nights, or how are you handling all of the traffic right now?

Speaker 1: Well, we do like it when our servers melt, as long as we stay live and we stay up. And what we have done over the last couple of years is abstracted all of this infrastructure we have, and you're absolutely correct. We believe we get better performance, better reliability, and better cost building out all of our own data centers around the world.
We have started working with partners though, and you know those partners: Azure, Amazon AWS, GCP, at peak times, to burst into their cloud as well. So we can run all of our own infra, but for Saturday morning for two hours, we're actually relying on some of these partners to hit those peaks.

Speaker 2: Dave, Roblox has been subject to a lot of scrutiny over child safety. The feature that you're announcing today, but also the sort of technological capability, for you as the leader of this company, where does it rank in what you think will sort of address those historic issues but also just allow you to move forward in growing? Because you have an ambition, right, to capture ten percent of the market, and I'm just trying to understand how this piece of tech allows you to tap into a demographic that will get you there.

Speaker 1: We've been focusing on safety and civility as a top priority for almost twenty years.
We believe this is the future of communication on social platforms, especially for teens, and we believe the future will be, for teens in that vulnerable time, trusted connections plus some form of age estimation or verification, or maybe someday help from the phone vendors and other things, but ultimately, for private chat, we believe this is the future. So we haven't been waiting for the law. We haven't been waiting for legislation. We think this thirteen through seventeen segment is just as important

Speaker 3: as thirteen and under and eighteen and up.

Speaker 2: We didn't even get to advertising on the platform, or all of the millionaires that are being minted through the platform. Dave Baszucki, you have to come back. CEO of Roblox, thank you very much.