1
00:00:01,120 --> 00:00:04,680
Speaker 1: It's Brooke and Jeffrey in the morning, and technology is advancing

2
00:00:04,880 --> 00:00:05,600
Speaker 1: so fast.

3
00:00:05,720 --> 00:00:06,800
Speaker 2: Oh, it's freaking me out.

4
00:00:06,800 --> 00:00:09,640
Speaker 1: Man. My phone just auto-updated itself.

5
00:00:09,240 --> 00:00:12,080
Speaker 2: Into a toaster? Actually a benefit.

6
00:00:12,240 --> 00:00:14,520
Speaker 1: It burned my bread, but at least it sent me

7
00:00:14,520 --> 00:00:18,560
Speaker 1: a push notification. "It's ready." Look at that. But look,

8
00:00:18,560 --> 00:00:22,640
Speaker 1: we're at the point now where AI is everywhere.

9
00:00:23,200 --> 00:00:24,520
Speaker 2: Taking all the good jobs too.

10
00:00:25,079 --> 00:00:28,360
Speaker 1: Yeah, it's unavoidable. Even when you try to Google something,

11
00:00:28,720 --> 00:00:31,840
Speaker 1: artificial intelligence is popping up to give you the answer

12
00:00:31,920 --> 00:00:35,120
Speaker 1: even before you see any links. Right, right, and it's

13
00:00:35,200 --> 00:00:35,840
Speaker 1: usually wrong.

14
00:00:36,120 --> 00:00:39,160
Speaker 2: Like, my mom Google-searched our show the other day

15
00:00:39,159 --> 00:00:42,920
Speaker 2: for some reason, and it said something about Jeffrey's laugh instead.

16
00:00:42,600 --> 00:00:48,960
Speaker 1: Of... I have a pretty good laugh, though. He just

17
00:00:49,040 --> 00:00:53,760
Speaker 1: lost a lot of people there. The thing is, in

18
00:00:53,800 --> 00:00:56,480
Speaker 1: a few years, it's not gonna be weird when your

19
00:00:56,520 --> 00:01:00,960
Speaker 1: friend announces he's engaged to his Roomba. Where we're headed,

20
00:01:01,000 --> 00:01:03,440
Speaker 1: at least, it seems to be going in that direction.

21
00:01:03,560 --> 00:01:05,880
Speaker 2: Nobody really knows. I'm falling in love with my fridge.
22
00:01:07,440 --> 00:01:09,479
Speaker 1: If any of you are, don't blame you, which is why

23
00:01:09,520 --> 00:01:12,039
Speaker 1: we need to try to stay slightly ahead of the

24
00:01:12,080 --> 00:01:16,000
Speaker 1: game with a special AI segment we're starting called Robo

25
00:01:16,160 --> 00:01:16,679
Speaker 1: Roundup.

26
00:01:17,280 --> 00:01:19,520
Speaker 2: Uh oh, is this where we spot a fake?

27
00:01:19,640 --> 00:01:21,240
Speaker 1: What do we have to do, Jeff? We're just learning

28
00:01:21,240 --> 00:01:27,200
Speaker 1: about AI in the modern age. First news out of Hollywood.

29
00:01:27,480 --> 00:01:30,400
Speaker 1: You may have seen this. There's an AI actress named

30
00:01:30,480 --> 00:01:35,280
Speaker 1: Tilly Norwood who's currently seeking a talent agent to represent her. Yes,

31
00:01:36,000 --> 00:01:37,560
Speaker 1: a lot of people... Max, we're looking at a picture

32
00:01:37,600 --> 00:01:39,200
Speaker 1: of her right now. We'll have the photo up on

33
00:01:39,240 --> 00:01:41,640
Speaker 1: our Insta story so you can see her too. But yes,

34
00:01:41,680 --> 00:01:44,559
Speaker 1: a lot of folks are describing her as a cute

35
00:01:44,640 --> 00:01:47,320
Speaker 1: young brunette with a Margot Robbie smile.

36
00:01:47,600 --> 00:01:49,640
Speaker 2: Yeah, I mean really, and then she's like the girl

37
00:01:49,720 --> 00:01:51,280
Speaker 2: next door look, right?

38
00:01:51,480 --> 00:01:53,040
Speaker 1: Yeah, there's a reason why I have her as the

39
00:01:53,040 --> 00:01:57,080
Speaker 1: background on my phone. So so far, she has made

40
00:01:57,200 --> 00:02:01,080
Speaker 1: brief public appearances, but only on social media because, again,

41
00:02:01,160 --> 00:02:03,560
Speaker 1: she's fake. Yeah, yeah, yeah, AI-generated.

42
00:02:03,600 --> 00:02:05,520
Speaker 2: I saw Emma Stone, I think it was her, said

43
00:02:05,680 --> 00:02:08,840
Speaker 2: we're doomed as a, as a business.
44
00:02:08,960 --> 00:02:13,679
Speaker 1: But Tilly's handlers say she can perform scenes with other actors,

45
00:02:14,080 --> 00:02:17,920
Speaker 1: and she does monologues facing the camera. She can cry

46
00:02:17,960 --> 00:02:21,160
Speaker 1: on command. Yeah, it'd be funny if she couldn't. Yeah,

47
00:02:22,280 --> 00:02:26,440
Speaker 1: a true weakness. But Tilly's mere presence has all of

48
00:02:26,480 --> 00:02:29,600
Speaker 1: the actors in Hollywood totally freaking out that this could

49
00:02:29,600 --> 00:02:32,519
Speaker 1: be the first step in replacing humans on the silver

50
00:02:32,560 --> 00:02:34,639
Speaker 1: screen altogether. Yeah.

51
00:02:34,720 --> 00:02:37,280
Speaker 2: I was looking at a list of, like, the jobs

52
00:02:37,360 --> 00:02:41,399
Speaker 2: most protected from AI, and it was all hard labor work. Yeah,

53
00:02:41,639 --> 00:02:45,160
Speaker 2: it's supposed to go the opposite way. Technology is supposed

54
00:02:45,160 --> 00:02:46,240
Speaker 2: to save us from all of that.

55
00:02:46,400 --> 00:02:48,520
Speaker 1: Yeah, I mean, obviously the big thing is she would

56
00:02:48,520 --> 00:02:52,239
Speaker 1: be paid much less than a human would, working longer

57
00:02:52,280 --> 00:02:53,720
Speaker 1: hours because she doesn't get tired.

58
00:02:53,840 --> 00:02:56,800
Speaker 2: Doesn't have to have craft services. Yeah.

59
00:02:56,400 --> 00:03:03,200
Speaker 1: Travel budget? It's hardly anything. Phone? Exactly. So anyway, people

60
00:03:03,240 --> 00:03:06,040
Speaker 1: are calling her the next AI Scarlett Johansson. It's only

61
00:03:06,080 --> 00:03:07,920
Speaker 1: a matter of time before she beats Emma Stone and

62
00:03:07,960 --> 00:03:10,800
Speaker 1: wins her first Oscar. Whoa, hold up. We see a

63
00:03:10,880 --> 00:03:14,359
Speaker 1: movie and you see an AI person acting. Are you impressed?
64
00:03:14,400 --> 00:03:16,840
Speaker 1: The whole acting thing is, it's a human putting themselves in

65
00:03:16,880 --> 00:03:17,640
Speaker 1: a different role.

66
00:03:17,760 --> 00:03:21,839
Speaker 2: I want to truly believe that there's something about art

67
00:03:22,000 --> 00:03:23,480
Speaker 2: that is just so human.

68
00:03:23,639 --> 00:03:28,359
Speaker 1: Yeah. No AI, not even in the explosions either. Explosions

69
00:03:30,000 --> 00:03:33,400
Speaker 1: real? Back to Hollywood, all right, they're so mad that

70
00:03:33,480 --> 00:03:35,880
Speaker 1: I have to put in all the explosions, exactly. Anyway, I'd

71
00:03:35,880 --> 00:03:38,600
Speaker 1: probably watch her movies. That's your first AI story in

72
00:03:38,800 --> 00:03:41,640
Speaker 1: Robo Roundup.

73
00:03:41,560 --> 00:03:42,800
Speaker 2: I could have just talked about that one.

74
00:03:42,920 --> 00:03:45,400
Speaker 1: Yeah, we got more. Do you guys remember a while

75
00:03:45,440 --> 00:03:49,120
Speaker 1: back when everybody started seeing targeted ads online and people

76
00:03:49,160 --> 00:03:52,520
Speaker 1: got paranoid that your phone or your Alexa was secretly

77
00:03:52,600 --> 00:03:54,240
Speaker 1: listening to everything you were saying?

78
00:03:55,240 --> 00:03:56,760
Speaker 2: And I kind of just hope it happens, you know?

79
00:03:56,800 --> 00:03:58,520
Speaker 2: Like, I'm talking about a video, I'm like, please show it.

80
00:03:58,640 --> 00:04:01,320
Speaker 1: Yeah. Oh, I forgot, we were talking about that earlier. Yeah,

81
00:04:01,480 --> 00:04:03,000
Speaker 1: this part, we're really cool with it. But even if

82
00:04:03,000 --> 00:04:06,080
Speaker 1: that wasn't happening, it will be for real in the

83
00:04:06,120 --> 00:04:11,280
Speaker 1: near future, because Meta, Mark Zuckerberg's company, said they're going

84
00:04:11,360 --> 00:04:15,040
Speaker 1: to be collecting data from user interactions with chatbots in

85
00:04:15,160 --> 00:04:17,880
Speaker 1: order to sell targeted ads. He's just coming out and

86
00:04:17,920 --> 00:04:20,120
Speaker 1: saying it. But for example, when...

87
00:04:19,960 --> 00:04:22,320
Speaker 2: You go and, like, you have a problem with your shipping,

88
00:04:22,400 --> 00:04:23,480
Speaker 2: they're gonna...

89
00:04:23,080 --> 00:04:25,520
Speaker 1: Well, like, if you ask your ChatGPT how do

90
00:04:25,600 --> 00:04:28,800
Speaker 1: I fix my dishwasher, boom, twenty ads will pop up

91
00:04:28,839 --> 00:04:30,880
Speaker 1: on your phone right there for a new dishwasher.

92
00:04:31,400 --> 00:04:33,440
Speaker 2: It makes me like AI even less.

93
00:04:33,560 --> 00:04:36,560
Speaker 1: There's really nothing you can do about it. That's the

94
00:04:36,600 --> 00:04:39,839
Speaker 1: bad news. The Zuckerberg Army will be notifying all of

95
00:04:39,880 --> 00:04:42,000
Speaker 1: their users in the coming days. They're going to be

96
00:04:42,120 --> 00:04:45,400
Speaker 1: updating their privacy policies in order to make this official.

97
00:04:45,480 --> 00:04:47,599
Speaker 2: Well, there's not gonna be much Zuckerberg Army left, because

98
00:04:47,600 --> 00:04:49,960
Speaker 2: that's all gonna be AI too.

99
00:04:50,279 --> 00:04:53,320
Speaker 1: Yeah. That's even more disturbing news out of the Robo

100
00:04:53,480 --> 00:04:58,760
Speaker 1: Roundup. And your final, hopefully positive, uplifting AI story...

101
00:04:58,839 --> 00:04:59,640
Speaker 2: Give it to us, Jazz.
102
00:05:00,000 --> 00:05:02,960
Speaker 1: This is out of a publication called Intouch Insight,

103
00:05:03,320 --> 00:05:07,400
Speaker 1: and they released their annual fast food drive-through study. Yeah,

104
00:05:07,480 --> 00:05:10,360
Speaker 1: now we're talking. So if you haven't experienced this yet,

105
00:05:10,520 --> 00:05:13,240
Speaker 1: a lot of chains are starting to use AI at

106
00:05:13,279 --> 00:05:16,160
Speaker 1: the screen in order to take your order. Yeah. I

107
00:05:16,320 --> 00:05:19,120
Speaker 1: actually saw this, uh, an AI try to charge

108
00:05:19,160 --> 00:05:21,560
Speaker 1: a guy like one hundred and fifty thousand dollars. Yeah,

109
00:05:21,560 --> 00:05:24,720
Speaker 1: it's not perfect, but more and more companies are going

110
00:05:24,720 --> 00:05:28,240
Speaker 1: towards it. Last year, the average time in line was

111
00:05:28,320 --> 00:05:30,240
Speaker 1: five minutes and twenty-six seconds.

112
00:05:30,440 --> 00:05:33,320
Speaker 2: We need it to be faster than this. Yeah, Starbucks'

113
00:05:33,400 --> 00:05:34,760
Speaker 2: rule for workers is like under two minutes.

114
00:05:34,880 --> 00:05:39,240
Speaker 1: What, really? Isn't that really fast? Remember when McDonald's had

115
00:05:39,279 --> 00:05:41,320
Speaker 1: like the promise, it's like under a minute or less

116
00:05:41,520 --> 00:05:42,599
Speaker 1: they would get you your food?

117
00:05:43,200 --> 00:05:45,719
Speaker 2: I don't remember. Like, before I could even stop saying "hamburger,"

118
00:05:45,800 --> 00:05:46,680
Speaker 2: it should be in my mouth.

119
00:05:46,800 --> 00:05:50,559
Speaker 1: Yes, they used to do that, but have it ready.

120
00:05:50,640 --> 00:05:53,200
Speaker 1: So last year was five minutes twenty-six seconds, but

121
00:05:53,320 --> 00:05:56,719
Speaker 1: now, with AI in thirty percent more places, the time

122
00:05:57,000 --> 00:06:04,359
Speaker 1: is nine seconds slower. Oh. Taco Bell was the fastest,
123
00:06:04,400 --> 00:06:07,480
Speaker 1: at four minutes and sixteen seconds from order to pickup.

124
00:06:08,240 --> 00:06:11,040
Speaker 1: Chick-fil-A had the slowest drive-through time at

125
00:06:11,080 --> 00:06:14,040
Speaker 1: seven minutes and six seconds, but that's only because they

126
00:06:14,080 --> 00:06:17,479
Speaker 1: have much longer lines. They actually rank first in terms

127
00:06:17,480 --> 00:06:19,680
Speaker 1: of total number of cars and orders filled.

128
00:06:20,000 --> 00:06:22,360
Speaker 2: When we did the Jack in the Box Decathlon and

129
00:06:22,400 --> 00:06:24,640
Speaker 2: I ate it and I had to go around ten times? Yeah,

130
00:06:24,680 --> 00:06:28,919
Speaker 2: they definitely were very slow from order, which was helpful

131
00:06:28,920 --> 00:06:31,520
Speaker 2: to me because I was trying to eat this food

132
00:06:31,600 --> 00:06:32,920
Speaker 2: before I took the next meal.

133
00:06:33,040 --> 00:06:35,240
Speaker 1: The employees were laughing because they kept seeing you come

134
00:06:35,279 --> 00:06:38,480
Speaker 1: by struggling to eat your food. It takes a while.

135
00:06:38,480 --> 00:06:40,120
Speaker 1: I gotta fit in the burgers and stuff. And

136
00:06:40,480 --> 00:06:43,520
Speaker 1: drive-throughs with AI got eighty-three percent of the

137
00:06:43,680 --> 00:06:47,440
Speaker 1: orders right, compared to human workers, who only got seventy-

138
00:06:47,480 --> 00:06:51,960
Speaker 1: four percent correct. They're beating us there. People had to

139
00:06:52,000 --> 00:06:54,280
Speaker 1: repeat their orders about a third of the time with AI,

140
00:06:54,480 --> 00:06:57,320
Speaker 1: so I don't know. Text in to seventy five nine two.

141
00:06:57,640 --> 00:07:01,400
Speaker 1: Do you like being updated on AI and the slow

142
00:07:01,480 --> 00:07:04,640
Speaker 1: demise of humanity in real time? Because that's how we

143
00:07:04,680 --> 00:07:06,880
Speaker 1: do it here on Robo Roundup.

144
00:07:08,279 --> 00:07:10,520
Speaker 2: Yeah, no real person will text you back. That's right.

145
00:07:11,120 --> 00:07:14,120
Speaker 1: Here's good news: a one hundred percent all-human Phone

146
00:07:14,160 --> 00:07:18,000
Speaker 1: Tap is coming up, good old-fashioned humans pranking humans.

147
00:07:18,720 --> 00:07:19,720
Speaker 1: It's coming up next.