Limitless Tech

Balancing Generative AI with the need for humans in CX, Hosted by Chris Dumpleton of GigCX Podcast - Featuring Lately CEO Kate Bradley Chernis

Transcript

Speaker 1: (00:06)

Okay. Hello. Welcome back to the GigCX Decoded podcast, where we talk all things customer service. We talk about technology, we talk about gig-based customer service, brought to you by Limitless. I'm your host, as always, Chris Dumpleton. I am the Gig Officer over at Limitless. And today we are joined by Kate Bradley Chernis, Chernis, as I've just been told. Um, and Kate is the founder and CEO of Lately.ai, which is a lead-gen social selling platform. I was reading a script, Kate, so, anyway, in fact, I'm not gonna ask you to do it, but the one thing I'm really keen for you to expand on is that you were a former rock and roll DJ and an award-winning radio producer. So, you know, like we were talking offline, I'm expecting big things from you: 25 years of broadcast, communications, brand building. Give us a bit of a précis on you and, you know, tell us a bit more about Lately. That'd be great.

Speaker 2: (01:01)

Yeah. Well, my husband always says to me, like, why don't I get the nice voice? You know? 'Cause my radio voice is a little warmer.

Speaker 1: (01:09)

Uh-huh, you

Speaker 2: (01:09)

Know. But yeah, I used to be a rock and roll DJ. Um, I was doing it up and down the eastern seaboard, and this is when radio was mostly live, actually, and before the internet, so you couldn't, like, look people up, you know? Yeah. You had to really just go by their voice. And, um, it was great. I mean, I was really lucky. The kind of radio I was in was very artistic, and we would spend hours, um, creating commercials that we thought were really entertaining and just kind of being, you know, a little bit nutty.

Speaker 1: (01:39)

Oh, I see. Right. Yeah.

Speaker 2: (01:40)

Um, but yeah, it was really, really fun. And then my last gig was at XM, broadcasting to 20 million listeners a day. That was a startup back then, you know. I was there year three. Oh

Speaker 1: (01:49)

Yeah. Yeah.

Speaker 2: (01:50)

Um, hard to believe. But it was wild. You know, the first day, Tony Bennett walked by me, and, uh, this is before, um, you know, Google, and its model of open offices didn't exist yet. And so this was all open, and can you imagine, there's like 200 radio stations all playing at once in this room. Big, huge room, you know, and it was just chaos. There was stuff hanging all over the walls, and, you know, it was like a bunch of college kids. And we weren't in college at all. I mean, we were all absolutely professionals. Yeah. But it was like that, you know, a bit of a zoo. And then, as always happens with startups, it gets big and then the corporate types come in and take all the fun out of it. And they moved us. When we started, we were on the top floor of the building, and they moved all the DJs to the basement. Wow.

Speaker 1: (02:42)

And no windows,

Speaker 2: (02:44)

There were some windows way in the front of the room, I remember. But, um, you know, that's when they started piping in Oprah and Derek Jeter, like, you know, for all the famous people and not just the lowly old DJs, you know? So, uh, I left that gig and somehow got here.

Speaker 1: (03:05)

Tell us more about Lately. I mean, um, yeah, I was just looking offline, but give us a bit of an intro.

Speaker 2: (03:11)

Yeah. Well, so my superpower, Chris, was turning listeners into fans, or customers into evangelists, right? And I, um, was a fiction writing major, and I'd written hundreds of commercials in radio. And so there was a parallel there for me. Um, we were talking about, you know, how you couldn't look people up. I was really interested in the theater of the mind, and there was a connection in writing, I learned, as well. Um, obviously when you read a book, you know, your imagination plays a role there. Mm-hmm. And I remember one of my GMs calling me, like, hey, you were number one, this is impossible. Because AAA, the kind of radio format I did, is like 20th or 21st in the market (this is a long answer to your question), not number one, like country or rock would normally be.

Speaker 2: (04:02)

And so we were never number one. And he was like, how, what did you do? How did you do this? And I was like, I don't know. And, um, I started thinking about, well, what was I doing? And I read Daniel Levitin's book, This Is Your Brain on Music, and sort of learned about the neuroscience of music listening. And I ended up taking what I was doing, translating it into writing, and brought it to Walmart and got them 130% ROI year over year for three years, doing what became the prototype for what is Lately now. So what is Lately now, that's the question. Um, it's this: Lately learns any brand voice or individual voice, and we're able to create a model that's uniquely yours, that understands the patterns of when you write well: what goes into that, what are the ideas, the phrases, the sentence structures.

Speaker 2: (04:57)

And then we're also able to learn what your unique target audience is on social, and marry those two together. And this model learns in real time, all the time. And we apply it to long-form content, like text, audio, or video. So it could be a blog, for example. It could be an interview like we're doing today. It could also be a webinar. And you run it through the model, and the model's trying to find, okay, what parts of this podcast meet the model, and it lifts them out into social posts. And over time, it'll start to write them in your voice so you don't have to make any edits. And then we'll clip up the podcast as well, in all these places where it's highlighting the quotes that meet the model. And so you get dozens of mini show promos in just a few seconds that are designed to get you super high engagement online.
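
Lately's actual model is proprietary, but the extract-the-best-quotes idea described above can be sketched in miniature: score each sentence of a transcript against a stand-in "voice" vocabulary and lift the top scorers out as post candidates. Everything below (the vocabulary, the scoring, the sample text) is invented for illustration and is not Lately's implementation.

```python
import re
from collections import Counter

def voice_vocab(samples):
    """Build a crude 'voice model': word counts from writing that
    historically performed well (a stand-in for a learned model)."""
    words = re.findall(r"[a-z']+", " ".join(samples).lower())
    return Counter(words)

def extract_posts(transcript, vocab, top_n=3):
    """Score each sentence by overlap with the voice vocabulary and
    return the top scorers as social-post candidates."""
    sentences = re.split(r"(?<=[.!?])\s+", transcript.strip())
    def score(s):
        tokens = re.findall(r"[a-z']+", s.lower())
        return sum(vocab[t] for t in tokens) / (len(tokens) or 1)
    return sorted(sentences, key=score, reverse=True)[:top_n]

# Invented sample "voice" and transcript, for illustration only.
vocab = voice_vocab([
    "Trust is why we buy. Turn listeners into fans.",
    "Humans and AI together outperform AI alone.",
])
posts = extract_posts(
    "The weather was fine. Trust is what makes fans buy. "
    "We filed the paperwork on Tuesday.",
    vocab, top_n=1)
print(posts)  # ['Trust is what makes fans buy.']
```

A real system would, as she notes, keep updating the model from engagement feedback in real time; this sketch has no learning loop at all.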

Speaker 1: (05:51)

Well, that sounds... I need a bit of name-dropping from you. Who's your, uh... I saw actually on your board of advisors, you've got Gary Vee on your

Speaker 2: (05:59)

Gary. Yeah, he's

Speaker 1: (06:00)

Gary's

Speaker 2: (06:00)

Very nice. Um, also David Meerman Scott, who wrote, yeah, I dunno, 13 books. Uh, he's on the board of HubSpot, and he opens for Tony Robbins, like, you know, for all of Tony's shows. Yeah, it's a great group. Um, Rob Stefans from Marvel Comics and Disney is on my cap table. And, um, Jason Calacanis, Joanne Wilson. And if you're a venture nerd, like, you know who these people are.

Speaker 1: (06:24)

Well, I know half of those. I'm gonna have to look up the rest.

Speaker 2: (06:29)

Okay, so, well,

Speaker 1: (06:30)

So today we're gonna have a chat about, um, we're gonna try and bridge the gap a little bit. So this is gonna be a lot about sort of AI and a bit of technology. And actually, one thing you've picked up on there is kind of paramount to us, which is turning fans into evangelists, turning

Speaker 1: (06:44)

Customers into ambassadors. And that's exactly what we're all about: getting customers to help answer other customers' inquiries. You go to the ones that are already customers and you say, hey, thanks for being a customer. Do you wanna become a brand ambassador and help them? You know, nine times outta 10 people say, yeah, of course I'd like to do that, because they're already a fan of the brand. And, um, you know, this is the thing, one of my sort of anecdotes, well, not anecdotes, but it's something I always say when I'm talking to companies: the way customer service has kind of been structured, nobody wants there to have to be aftercare service. Mm-hmm. It's, you know, something that comes along with the territory.

Speaker 1: (07:23)

So you try and deal with it as efficiently as possible. And what that word really means is as cheaply as possible, but still maintaining some form of, you know, decent customer service. But that's fundamentally what it's all about. And what that means, obviously, is you're pushing a lot of contacts into low-cost labor markets, but also you're only able to afford to pay people minimum wages, because the economics are not there. And you're trying for there not to be any contacts at all, let alone more contacts. So what that means is, if you were to say to a brand, right, go to your contact center and ask everyone to stop what they're doing and put their hand up if they've actually got the product or service that they're giving customer service on,

Speaker 1: (08:03)

then, you know, again, nine times outta 10, people don't, because they're just there to do a job, mm-hmm, rather than something that they, you know, absolutely care about. So that's what we're trying to tap into. There are fans out there. There are a dozen brands that I'm a fan of, and if someone asked me to become an ambassador, I'd be like, well, I understand why you're asking me, and if I've got the time and inclination, I'll do it. You know, that's what we're all about.

Speaker 2: (08:26)

Amazing.

Speaker 1: (08:27)

So we are gonna talk a bit more about... we're gonna keep this quite technology-orientated, but we are two humans; none of this is generative AI. We're gonna talk about AI later on. So let's just dive into, you know, big broad topics to start with, and work our way through. So how should businesses integrate AI technology into customer experience, based upon your exposure to it? What's your opinion on that?

Speaker 2: (08:58)

Hmm. Well, I mean, it's so funny. Everybody thinks that they know what generative AI is right now, but they don't. So let's just start there.

Speaker 1: (09:07)

Yeah.

Speaker 2: (09:07)

Um, David Meerman Scott, who we just talked about, um, you guys should Google him. He wrote this book called Fanocracy, which is all about, um, the Grateful Dead and how they were able to turn their listeners into fans, the same idea that we're talking about here, right? He's the king of this. Mm-hmm. And, um, so he wanted to sort of distill this and help people who were trying to understand and research AI get the basics. So there's two questions that matter, which are: whose data and whose math, right? Those are the first two questions you wanna think about when you're talking about any kind of AI, really, but certainly generative AI. Now, with Lately, it's your data. So we use your data only, and we keep it 100% private, and we use our math, right?

Speaker 2: (09:52)

Right. And so our math is pretty great. When I write a social post on LinkedIn, I get 86,000 views. And that's part of the secret sauce of the math of how we work. Right now with, um, ChatGPT and, um, Meta and everybody else, and Google's C4, that's public data, right? Your data, like, it's from a giant public data set, and it's generic math. It's open-source math, right? And the reason these things matter, it doesn't really matter what you're using the generative AI for, whether it's CX or accounting or HR or, you know, whatever part of the industry you're in. The legal questions surrounding this are really what it comes down to. So for example, um, I think it was, I always forget if it was the Washington Post or the Wall Street Journal, which are two things no one should confuse, but I do, 'cause of the W. Anyways, one of them had released this widget where you could literally type in any company, any URL, and see what percentage of data from that company Google was using to feed its C4 dataset.
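
The dataset-lookup widget she mentions reportedly boiled down to a per-domain tally: what share of the corpus's tokens came from each site. A toy version of that kind of tally, over a made-up sample rather than any real C4 data, might look like this:

```python
from collections import Counter
from urllib.parse import urlparse

def domain_shares(records):
    """records: iterable of (url, token_count) pairs. Returns each
    domain's share of all tokens, the kind of per-site percentage
    a dataset-lookup widget would surface."""
    tokens = Counter()
    for url, n in records:
        tokens[urlparse(url).netloc] += n
    total = sum(tokens.values())
    return {domain: n / total for domain, n in tokens.items()}

# Hypothetical crawl sample, not real C4 data.
sample = [
    ("https://example.com/a", 1200),
    ("https://example.com/b", 800),
    ("https://blog.example.org/post", 2000),
]
shares = domain_shares(sample)
print(shares)  # {'example.com': 0.5, 'blog.example.org': 0.5}
```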

Speaker 1: (11:01)

Wow. Right.

Speaker 2: (11:02)

I know. Total eye-opener, right? It actually broke because so many people were using it. But, like, it was a real sales booster for us for a while, you know? Yeah.

Speaker 1: (11:12)

Yeah.

Speaker 2: (11:13)

Um, and then, you know, the other kind of thing which is worth pointing out here: Harvard Business Review just released an article, and I believe the title is, um, why AI prompt engineering is not the future. And their argument, it's not even an argument, they did several studies that show when humans and AI are used together, they outperform AI alone two to seven x, every single time.

Speaker 1: (11:43)

Okay. Right.

Speaker 2: (11:44)

And so, like, that's another thing: we built Lately to, um, enfold humans in the mix, right?

Speaker 1: (11:53)

Yeah. From the

Speaker 2: (11:53)

Beginning, you know, from the ground up. So these are just questions you wanna be asking, like, the AI I'm using, is it an afterthought? 'Cause everybody's just putting a dollop of AI on top of their platforms now and being like, yeah, we're hip. Um, so that's something. I mean, it's a very powerful tool to be wielding here, right? So those are the questions to think about, sort of even before you're considering, you know, applications. Um, and I mean, I have a lot of other questions I can talk to you guys about, but, um, you know, the copyright issue is certainly one as well. That's why Sarah Silverman is suing, you know? So again, that's a risk that you take when you're using these big public data sets. Yeah. Um, you know, we have nine years of our own machine learning and natural language processing, which, ironically, Chris, is why we're getting green-lit by legal and IT teams from our customers, because, you know, we're not in this mess. But it's also why investors look at me like I smell, because they're like, well, you're nine years old, you're not a young startup anymore. You know, and you're like, I can't win.

Speaker 1: (13:04)

But they're secure. So that's an interesting one. So is that because, you know, whose data, whose math, but if the data is being processed by companies, is there risk there?

Speaker 2: (13:18)

Yes. Huge

Speaker 1: (13:19)

Because things are going through, you know, machines that aren't on premises, they're, you know, in the cloud. Yeah.

Speaker 2: (13:26)

And where is that,

Speaker 1: (13:26)

Where's that data going? Where is it being processed? It's

Speaker 2: (13:28)

Going out to the public use case, right? So, like, just by using... pretty much every gen AI in the market is a layer on top of OpenAI or Meta or Google, right? I mean, everybody, even Jasper, for example. So just by using them, I'm literally putting my data out into the world to be used for something else. So that's a huge problem, right? And then you can't copyright it, also.

Speaker 1: (13:56)

Yeah.

Speaker 2: (13:56)

Another problem. So the privacy issues, I mean, this is certainly a risk, um, and no one's really solving it yet. I mean,

Speaker 1: (14:06)

And is that probably because of the pace of deployment when it became mainstream? The legislation, and the ability to try and legislate it, just can't catch up yet, and they're still trying to figure out how.

Speaker 2: (14:18)

Yeah, I mean, it reminds me, like, here in the States, organizations like the IRS, for example. Last time I was working with the IRS, for a project I was involved in, they couldn't use, like, Google Docs. There's companies like that who banned the use of Docs, right? Yeah. Because they're worried about, who knows, safety concerns. So that's, I think, where we are now and probably where we will go. It's like companies that ban social media or Facebook, you know. You're like, well, okay, that's kind of silly. People are gonna find a way around it anyways. Um, for us, we actually do have an integration with ChatGPT. We were friends with them back in... we were in their closed beta, like, four years ago. Um, but it's not core to the product, so we can switch it off or on. So we have companies who have banned ChatGPT, and for them, we just turn it off.

Speaker 1: (15:16)

Okay. I didn't realize people were banning it already. That's interesting.

Speaker 2: (15:20)

Yeah, lots of large companies, especially because, to your point, they just haven't figured out how to address it, you know? It's too much of a wild west right now. And, um, we'll see. I mean, you know, the other thing, on the generic point as well: it's like everybody just got CliffsNotes. Do you guys remember CliffsNotes at all, from when I was in school? Um, so

Speaker 1: (15:48)

Is this, sorry, is this a US thing or is this like a, this

Speaker 2: (15:50)

Global? Uh, no, I think it's global. But, um, so they would write summaries of all the classic books that we had to read in school, you know, Wuthering Heights, Moby-Dick, whatever. And they were yellow and black, big black stripes on these little tiny books. And it was a summary, and it had questions that your teacher would probably ask you, with essays written out already, right? So everybody would lift it, right? You know, who's cheating? We're all cheating, you know? Yeah, right. And teachers could tell right away. And the whole world is sort of going like that right now with ChatGPT as well, because, you know, if you and I put in the same prompts, we get out the same thing,

Speaker 1: (16:28)

Uh-huh, right?

Speaker 2: (16:29)

Because there's no, yeah.

Speaker 1: (16:31)

Performance

Speaker 2: (16:32)

Learning loop tied to the results, anyways. It has no idea. Like, I mean, maybe this is awesome: write me a thousand-word blog on, you know, best practices in CX. Great, thank you so much. But there's no understanding of whether this is the content that my audience will want to read, or if it will actually get me the conversions I'm looking for, or do anything for me, because it's just out of the blue, right? Mm-hmm. So yeah, those are problems.

Speaker 1: (17:02)

You've sparked a few questions that I've got for you now. So one was, um, lemme throw this one at you then. So, you know, six months ago, when ChatGPT kind of first, you know, hit the news, if you like. I mean, obviously AI's been around for a long time; it's often been in the background, just sort of helping augment. And I do have a question in a minute about the relationship between AI and people. But what would you say are, based upon your experience, the pitfalls or the potential benefits of saying, right, well, we don't need frontline customer service agents anymore, because we can just deploy very intelligent AI engines sat behind chatbots that can do excellent natural language understanding and deliver very real-world kind of opinions, and therefore we can get rid of a gazillion frontline customer service people?

Speaker 2: (17:54)

Hmm. Well, certainly all automation changes the way people work. I mean, that's to be expected, you know. Um, what we like to say, or a lot of people like to say, is it's not that you're going to lose your job because of AI; it's that you're gonna lose your job because you're not using AI as part of your job, right? So, I mean, that Harvard Business Review study is a great one to point to, but another one is Betty Crocker, the brand, right?

Speaker 1: (18:27)

That sounds good. Yeah. Betty Crocker.

Speaker 2: (18:29)

Betty Crocker. So Betty Crocker invented cake in a box, like Duncan Hines, right? Yeah. And, um, back in the day, and this was in the mid-fifties, all you had to do was add water, right? And the housewives they were marketing this to thought it was too weird. They had no connection. They didn't feel like they'd made a cake or baked anything. And so Betty Crocker took the powdered eggs out of the box. So now you just had to add an egg, and that was the slogan: just add an egg. And sales skyrocketed. That's

Speaker 1: (19:04)

Very interesting. Yeah. Mm-hmm.

Speaker 2: (19:05)

So when the human is in the mix, it is better, not just ethically, right, but the results as well. And we see that too, you know. So, like, for example, at Lately we only use Lately to market Lately and nothing else, and we have a human component woven through the entire process. And I have a 98% trial-to-sale conversion because it's so good.

Speaker 1: (19:32)

Ninety-eight. Ridiculous numbers. Mm-hmm.

Speaker 2: (19:36)

That's the difference between great results and galactic results. And that's what you want when you have humans and AI collaborating together.

Speaker 1: (19:45)

So tell me a bit more about that. Because, like you said, I saw a slogan the other day, which was, you know, like I think you just mentioned: you're not gonna lose your job to AI, but you might lose it to someone who is using it in the right application. So it sounds like the best recipe, using Betty Crocker's analogy, at the moment is a bit of both. Um, how overt are you with people that AI is being used? Is it something you don't tell anybody about? And I'm thinking about it in the contact center or customer service world. Or is it something you openly talk about: we've got this to help people get your answer more quickly, and they're gonna, you know, add the egg in, to use that analogy?

Speaker 2: (20:29)

Right? Yeah. I mean, we work with agencies who have this problem, 'cause they, you know, wanna be able to use our product, for example, to sell it back to their customer as a different line of business. Um, that's a good question. I mean, in that context, I would totally say don't tell anybody, just, like, take the credit for it. I mean, just like you would for any kind of automation or technology that you use all the time, right? I mean, for example, well, there's not telling and then there's lying. Like, if you come over to my house for dinner, the chances that I made anything from scratch on the table are pretty much zero. I probably pulled together a Betty

Speaker 1: (21:10)

Crocker cake, right? If they're still around.

Speaker 2: (21:11)

Yeah, not even the cake. I mean, you know, so, like, I love pulling together different things from places I love. And then my gift is how I present them. I love the presentation, right? So that's what I do. But I'm not running around telling everybody, like, I didn't make this meal. I'm also not saying I made the meal. You know, like,

Speaker 1: (21:30)

Yeah, yeah. If somebody asks you, you know, you tell them, but you're not offering it up. Yeah. Okay. That's,

Speaker 2: (21:36)

That's not passing it off as my own. I mean, I don't know, this is a legal question. Someone's gonna probably write in and be like, oh my God, she's so unethical. Um, but again, just think of it as all technology, right? I mean, back to cooking: like, I can microwave the soup, right? Yeah. Is that bad if I microwaved it versus if I, like, put it on the stove? Yeah. You know, I mean, we're all fine with that, right?

Speaker 1: (22:07)

It's just

Speaker 2: (22:07)

Technology doing it faster for me. I didn't need to spend 10 minutes. No one's gonna judge me. It's still hot, just quicker, right? It

Speaker 1: (22:15)

Still tastes the same, hopefully. Yeah.

Speaker 2: (22:17)

Still, still tastes the same. We just get there faster and better. I mean, that's the,

Speaker 1: (22:21)

I'm really enjoying the, um, the linkage to food in all of this, because I'm hungry. Uh, I know, there's some amazing analogies in all of this, which I'm gonna absolutely call out. And I'm definitely gonna write a blog about the Betty Crocker just-add-an-egg concept. It's

Speaker 2: (22:42)

So great, right? I love that one.

Speaker 1: (22:44)

It's really good. That's really got my brain going already about the connection. Like, you know, like you say, people wanna feel like they're doing something, and if they completely hand it over to something else, then you lose your connection and there's no soul in anything. But how far down the line... I mean, you could get someone doing everything, or there's obviously the sweet spot, the secret sauce if you like, the balance between the two. Which is, you know, if I put an egg in this, I feel like I've really made it; otherwise it's just the dry mix dumped in, sort of thing. Yeah. And,

Speaker 2: (23:17)

You know, uh, Charlie Chaplin had that movie, Modern Times. And this was a commentary on the same thing, 'cause it was, well, not about, but in part about the assembly line for automobiles. And, like, there was this terrible feeling from the workers that they didn't make anything, because they didn't see it made at the end, right? Yeah. That's, like, one of the things I love about startup software so much: we make changes almost daily, and we can see the results of our changes. You know, there's this feeling of pride in your work, right? Mm-hmm. Um, it ties back into music, by the way, also. So remember the imagination I was saying? So there's that famous line, which is: radio is the theater of the mind, and television is the theater of the mindless.

Speaker 2: (24:11)

That's because you're, like, a couch potato; it washes over you, you know, right? You're a vegetable. Um, and there's no ownership. There's no action you have to take for the story to unfold. Whereas if you're listening, or if you're reading, your mind is like an extra character; it literally has to fill in some blanks here for the story, for the journey, to unfold in the way it's supposed to. And this is what occurred to me: a great DJ or a great author knows this and allows space for it. And it's a mystery thing. I can't control it. I just have to know that it's there, but I have to give you enough guidelines to get to the end with me, or, you know, get to the next segue, or whatever it is. And then the other thing I learned from that This Is Your Brain on Music book: oh, and by the way, imagination taps into nostalgia, memory and emotion as it's activating your brain. So then, when you're listening to music, Chris, check this out. You're, um... oh, I just said the phrase that I told everybody never to ever say, which is check this out. Shame on me. It's, like, my marketing rule number one: don't say that.

Speaker 1: (25:21)

It's

Speaker 2: (25:22)

A call to action, okay? So when your brain listens to a new song, Chris, it has to instantly access every other song you've ever heard before. And it's trying to find familiar touchpoints so it knows where to index the new song in the library of your brain. And guess what? It pulls on nostalgia and memory and emotion to try to do that, also. And this happens, really, instantly. So it occurred to me that, all right, well, my voice, your voice, like a song, has a frequency to it. There's sound to it. And if I'm reading a text or an email or a Slack message that you wrote to me, I'm gonna hear your voice in my head. Mm-hmm. Right? So if you're clever, you're gonna be tapping into nostalgia and memory and emotion. These are all the things that need to be in place for trust to happen. And trust is why we buy, right? So we built Lately to do this for you. That's why the results are so galactic. 'Cause it's not just make the sale, you know; it's make the megaphone, kick off the flywheel.

Speaker 1: (26:25)

Yeah. Right? Uh, we're not even at the end of the show, and I've already written more notes on this podcast than I have on any of the others. Uh, I've got things I'm gonna look at myself, right? So, uh, where are we? Right, couple more questions. So we've talked about how it can work really well. Where do you see it failing? I think I know where you're going, but I'm interested in your perspective. And if you've got a food slant to this, feel free to wheel it out. But where do you see gen AI failing in terms of providing customer

Speaker 2: (27:02)

Support? Yeah. I mean, look, it's the fakeness, which is already there. I mean, we can all smell it; it's the CliffsNotes. You can smell it, right? Yeah. We've all already had this experience, even, of somebody reading a script, right? You all know when that's happened with a chatbot. Yeah. Um, or even with a live customer service person; it's just infuriating. Um, you know, and let me be very clear for everybody: sentient machine learning does not exist. It does not exist. The definition of artificial intelligence that we have was made up by Hollywood. It's not actually what it is. So there's an expectation for magic, which, I'm sorry to also say, doesn't exist. And I reread Harry Potter every summer, but, like, you know. You too. Yeah. Um, and so our expectations of what AI can do are radically misaligned, number one. So this fakeness, this robotic-ness, is just gonna be part of it. It can't be left on its own. It just can't be, right? That's why prompt engineering is becoming such a skill right now, hmm, for this exact reason, right? You know? Um, so I think that's just the first thing. It's so funny to me how lazy people are. They're so lazy. Like, they are choosing content for content's sake versus content for effectiveness.

Speaker 1: (28:30)

Yeah. Yeah. Right?

Speaker 2: (28:32)

You're like, guys... I mean, thanks. Now everybody can vomit more garbage into the world than ever before. And guess what? As marketers, which I am, our problem is the same as it's ever been, which is: how do I cut through the noise? Yeah.

Speaker 1: (28:47)

And you cut

Speaker 2: (28:47)

Through the noise by being real. We're doing it right now, Chris. We're being real. We're

Speaker 1: (28:53)

Being real. I'm being real, great. Yeah. We're jumping all over the place, but it's not, you know, like I say, a script. We're both ad-libbing a little bit as we go along, so it's good. Yeah.

Speaker 2: (29:03)

Uh, that's all I do. I can't even think in a straight line. I mean, and I certainly don't talk in a straight line, you know? There's no logic that AI could possibly, you know, figure out here. And another thing, while I'm on this: so I was just at Inbound, which is, like, HubSpot's big party, um, talking about this. It's data and patterns, right? Those are the things that have to be in place for AI to work properly. Mm-hmm. Now, if you know that it's data and patterns, the patterns are able to predict, which is another misnomer. Nobody is, like, looking at a crystal ball telling you who's gonna win the World Series. It's not even predicting at all. It's literally just saying: based on previous information, a lot of previous information I've had before, the next most likely thing to happen should be this. Mm-hmm. We've all had the experience where you type in your phone, like, you know, something, and then Siri picks three words that she's showing to you, because, mm-hmm, she's had enough data and patterns that, after you've written this word, the most likely words someone else is gonna type are these words, right? She's not smart. I mean, she's not thinking of this, like, pulling it out of a hat. It doesn't come from thin air,
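
The "next most likely word" behavior she describes, three suggestions drawn purely from observed patterns, can be illustrated with a toy bigram counter. Real keyboard models are vastly larger and more sophisticated, so this is only a sketch of the principle:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count which word follows which: the 'data and patterns'."""
    follows = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for a, b in zip(words, words[1:]):
            follows[a][b] += 1
    return follows

def suggest(follows, word, k=3):
    """Like the three suggestions above a phone keyboard: the k words
    most often observed after `word`. No understanding, just counts."""
    return [w for w, _ in follows[word.lower()].most_common(k)]

model = train_bigrams([
    "see you soon",
    "see you later",
    "see you later today",
    "thank you so much",
])
print(suggest(model, "you"))  # ['later', 'soon', 'so']
```

"later" wins only because it was observed most often after "you"; there is no decision beyond the counts, which is exactly the "based on previous information" point being made.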

Speaker 1: (30:21)

You know? Yeah.

Speaker 2: (30:23)

It has to be that decision tree: if this, then this, you

Speaker 1: (30:28)

Know, and it's different with machine learning. Yep.

Speaker 2: (30:30)

Yeah. And for it to work on its own, so, to avoid the fakeness or whatever, think of all the variations and instances that would have to be in there, in multiples. Because it only works on huge amounts of data, right? In order to find the patterns, because otherwise the error is too great. Right? So for us, we need 10,000 pieces of content for this one instance to happen. Mm-hmm. Like, we know that's the number, 10,000. That's just for one thing that I'm doing, you know.
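Kate's point that with too little data "the error is too great" can be illustrated with a quick simulation. The 30% rate below is an invented figure for the sketch, not Lately's actual data; 10,000 is just her stated rule of thumb:

```python
import random

# Sketch of why pattern-finding needs volume: the error of an estimated
# frequency shrinks roughly as 1/sqrt(n), so small samples give
# unreliable patterns. TRUE_RATE is an invented illustration value.
TRUE_RATE = 0.3  # imagine 30% of posts contain a certain pattern

def estimate(n, rng):
    """Estimate TRUE_RATE from n simulated posts."""
    hits = sum(rng.random() < TRUE_RATE for _ in range(n))
    return hits / n

rng = random.Random(42)
print(f"estimate from 100 posts:    {estimate(100, rng):.3f}")
print(f"estimate from 10,000 posts: {estimate(10_000, rng):.3f}")
```

Roughly, going from 100 to 10,000 samples cuts the typical error by a factor of ten, which is why small corpora produce "patterns" that are mostly noise.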

Speaker 1: (31:04)

10,000 words, or 10,000, like, articles?

Speaker 2: (31:07)

Uh, it can be any pieces of content, really. Right. Um, but, you know, have you seen the HBO show Silicon Valley?

Speaker 1: (31:15)

No,

Speaker 2: (31:16)

It's great. Um, if you are a startup, you should watch it. It's my life. It's hilarious. It's Mike Judge, you know, the guy.

Speaker 1: (31:24)

Oh, I've, I'm sure I've seen it.

Speaker 2: (31:25)

Yeah. So there's an episode where Jian-Yang produces an app that's like the Shazam for food, right? He can hold it up to a hot dog and it says "hotdog." But if he holds it up to something else, all it says is "not hotdog," because he only had enough data of pictures of hot dogs. Right. So if he holds it up to a pizza, just think about all the instances of pizza he would have to have fed into this thing: the different shapes, round, square, triangle, thick, thin, red, white, and all the different flavors. There's probably millions of different combinations for it to start to recognize this thing as a pizza.
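The Not Hotdog gag maps onto a real property of binary classifiers: trained only to recognize one class, everything else collapses into "not hotdog." A minimal sketch, with made-up numeric features (length, roundness) standing in for real image data:

```python
# Toy version of the "Not Hotdog" app: a model trained only on hotdog
# examples can answer "hotdog" or "not hotdog" -- and nothing else.
# The (length_cm, roundness) features are invented for illustration.
HOTDOG_EXAMPLES = [(15.0, 0.20), (18.0, 0.25), (16.0, 0.22)]

def classify(length_cm, roundness, tolerance=3.0):
    """Label 'hotdog' if close to any training example, else 'not hotdog'."""
    for ex_len, ex_round in HOTDOG_EXAMPLES:
        if abs(length_cm - ex_len) <= tolerance and abs(roundness - ex_round) <= 0.1:
            return "hotdog"
    return "not hotdog"  # pizza, burger, anything unseen collapses here

print(classify(16.5, 0.21))  # prints "hotdog": near the training data
print(classify(30.0, 1.00))  # prints "not hotdog": a round pizza
```

To recognize pizza *as* pizza, the model would need its own mountain of labeled pizza examples, which is exactly the combinatorial point about shapes and toppings.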

Speaker 1: (32:11)

Mm-hmm.

Speaker 2: (32:11)

Right.

Speaker 1: (32:14)

Yeah. Well, I'm still hungry, so, I'm

Speaker 2: (32:18)

I did it again without even noticing. Yeah. But

Speaker 1: (32:20)

What's interesting there is that what AI is doing, based upon the machine learning, is it's able to identify if something is something. But if it isn't, it will just say it's not, not what else it might be. So this is maybe going to get into, if we think about emotions and the role that AI can play there: AI may be able to say things like, "I'm sorry you've had that bad experience," because it's identified that you have some form of distress, and, okay, distress means I must say this thing. But where do you perceive, or where do you understand, AI could get to in terms of feeling very natural in its emotion, things like empathy, where it actually feels far more realistic? Do you think it will get there? Do you think it's already there, or what should be there?

Speaker 2: (33:19)

It just needs all the data to get there. I mean, right? And these are the programmed responses. It depends on the culture, it depends on the age of people, the language, all these things, you know. Something that might seem normal or natural to say in America to a twenty-something may not be the same thing you would say in Albania to a 75-year-old person, you know? Right. Yeah. Culturally, it might not be appropriate. But as long as you have enough instances and data for it to learn from, then it can. But just remember, in that you said empathy: AI will never feel empathy. It's impossible. It can only replicate what we are projecting upon it, theater of the mind, imagination, to be empathy. Right? Like, the examples I find so valuable, this one is not food. Um, I like to think of R2-D2 all the time, because, you know, they programmed him to sound like a dog, basically, right? Like the best friend in the world, you know? Mm-hmm. And all of his little sad beeps, or whatever they are. It's our projection of what we want him to be that makes him feel real. Mm-hmm.

Speaker 1: (34:38)

Yeah.

Speaker 2: (34:39)

And even that, he's a "he," right? I don't know. Yeah. Right.

Speaker 1: (34:46)

Do those robots have gender? That's another, well, now we're getting. Well,

Speaker 2: (34:51)

C-3PO doesn't, I mean, visibly,

Speaker 1: (34:54)

That's what I'm saying. We're, we're all imagining, we're all imagining that now anyway, we, we've gone way off topic, but this is very important.

Speaker 2: (35:01)

Well, but you can think of it, here, here you go. So the executive function of the brain is what we all use to make decisions every day, right? This is the part of your brain that literally takes in everything in front of it, gives you the information and synthesizes, you know, whatever it means, so that you can do the next thing you're gonna be doing. Right? So in this moment right now, my brain just made like a million decisions without me even knowing it.

Speaker 1: (35:27)

Hmm. Right.

Speaker 2: (35:27)

I mean, I know it, but it's happening automatically. Hmm. So if you think of, like, this is the, you asked about a pitfall. The number one pitfall that we make as humans is not giving ourselves credit for the actual magic that happens every second, every millisecond of what we do. Right?

Speaker 1: (35:43)

Right.

Speaker 2: (35:43)

I mean, I think about my friend, so my friend Jeff used to take a commuter bus to New York City from where we live every day. And he would always sit in the front seat. And one day, going on the highway, on the other side of the highway, a car was driving that had recently been in the shop, and the wheels were not very tight because of some mistake. And one flew off into the oncoming traffic and crashed through the bus window. The bus driver ducked, and the wheel, like a cannonball, hit my friend Jeff in the head. Okay. He did not die. Amazingly, he's a miracle, and you wouldn't know that there's anything wrong if you met him at all, because he's a true miracle person. He can drive and talk and do all the things, except his executive function is broken. And so, with the effort it takes him to have a conversation, he's exhausted in 20 minutes; he's gotta lie down. Because that's how much information we're taking in, right? Which is how much information any piece of AI would need. I mean, this is what it takes, right? Yeah. This is the data and the patterns that it takes for our miracle brains to work, you know? Um, so calm down is what I think we all need to do a little bit.

Speaker 1: (37:08)

Well, there's a beautiful blend in there, I completely believe. I mean, you know, do you tell people that you've bought the food in? In terms of, do you tell people you're providing a lot of the service through AI? Maybe you do, maybe you don't. But the thing I think you clearly feel passionate about as well is, I hate it when, I feel like in some way you're being cheated. I don't wanna say inauthentic, but when you feel like you're on some form of treadmill, like they're going for emotion and people are saying the things and you don't believe it, you're not relating to this, and it's just like, just shoot me now. And like, I've been trying to cancel a, um, broadband service in my house.

Speaker 1: (37:52)

So I've gone live with a new provider, and I want to cancel the old one. Every time I try and talk to them, I'm shoved onto a channel and they keep saying, "I've got a few questions for you. Does that sound good?" And then by the time I get to the answer, it's asynchronous. I'm like, yeah, I'm ready for this question. And then it goes again; another one comes in. And look, I'm not, God, I'm not happy with you wanting to ask me questions. I've moved for these reasons, now cancel my service. And they still haven't canceled it. And they

Speaker 2: (38:19)

Keep me,

Speaker 1: (38:20)

God, they keep asking. This is weeks now. They keep saying, "We've got some questions for you. Is it all good?" Because I know what they wanna do. They wanna try and retain me.

Speaker 2: (38:26)

Sure.

Speaker 1: (38:26)

But they can't retain me, because they don't offer the product that is available here. It's

Speaker 2: (38:31)

So annoying.

Speaker 1: (38:32)

It really is annoying. So anyway. And

Speaker 2: (38:33)

Legal, maybe, also? Yeah, it sounds like it must be. But, well, I suggest you social-media them. Tweet at them, or X at them, you know, do that. Embarrass them publicly. Yeah.

Speaker 1: (38:45)

I'll only give 'em another few days, and then, uh, I'll start doing capitals in my responses, shouting letters and that emotion, and see if they can detect that, and see what they can do there. But anyway. Right. Well, this has been fascinating. I have made lots of notes. I'm gonna write a blog about the Betty Crocker cakes. Um, send it to me. I'm gonna talk about the difference between data and math, and when you use it in the outside world there is, you know, public data and open-source math, things like that. I think it's brilliant. And I'm gonna start doing more around nostalgia, memory and emotion. So I've made some great notes here. This is fantastic. Thank you for your, uh, thank you for your lesson. Class

Speaker 2: (39:26)

Dismissed. Awesome. Yeah,

Speaker 1: (39:28)

Brilliant. It's been really good. And I think, you know, the wash that hit the industry when AI landed, even though AI's been around for a very long time, as you know very well, nine years in at least, it was sort of, it was just gonna completely wipe the floor and take away everything. But really it comes down to how it's applied, when it's used, however you make it, how it can enable people to do the job better but still maintain that human connection, which is what we're all striving for. And people can smell it, as you said, when it's not human. And if it feels like it's scripted, then you just switch off. And then people don't buy off people, they don't trust you, you don't get the emotion, you don't get any memory, there's no nostalgia, and therefore, why am I on this ride with you? So that's hopefully what we're all about, 'cause that's pretty much what we're trying to do: just connect people with people, let 'em talk, give them, yes, the guardrails of scripts, like, you know, say these things, don't say these things, but have your conversation, away you go. And we hope that's a better place. Well, that has been absolutely fascinating, Kate. Where do people find you, to hear more about your, um, food allergies?

Speaker 2: (40:36)

Thank you so much, Chris. Um, get me at dub dub dub, lately.ai. Of course.

Speaker 1: (40:44)

Well, I'm gonna, uh, yeah, I'm gonna drop you a line. I do genuinely wanna learn more about the Lately AI, because I think there's some good stuff for our business. But, um, it's been lovely talking to you, and, uh, enjoy your meal when you get to it. I'm gonna go and have mine now, because I'm now absolutely famished.

Speaker 2: (41:02)

Thank you. Bye.
