
Step Four of a 12 Step Recovery Program for AI Narratives


Hi, I’m Frank, and I am an AI narrative addict on the fourth step of a 12 step recovery program. In any good 12 step program, group meetings and the crowdsourcing of experiences and lessons learned are key. And this blog post is no different.

Let me start by saying, Seth Godin has it right. The future of customer service is going to be AI enabling a hierarchy of brand and consumer relationships with large-scale, frictionless, personalized support at their core.

However, the future is not AI taking everyone’s jobs as many fear, and that’s where my recovery begins. You see, there is a messaging and narrative issue in the AI-as-a-solution universe that I and my peers have created. When MIT is publishing articles on how AI predictions need to be more pragmatic, it seems clear that a personal inventory, also known as Step 4, is required.

So here is my own inventory of the noise that is distracting us from getting to where we need to be with AI solutions, and especially the narrative or marketing surrounding it. I’ve reduced this list to three for now after ruminating on my sins and our industry’s self-created narrative problems. But I know we may have more, and the more we engage in an open dialogue, the more we can work on recovery together.

If we can admit that we need to find another way to connect a narrative of exciting possibilities with empirical evidence about how to utilize AI within business NOW, then we can also admit that the sensationalized, near-tabloid approach to AI marketing can be overcome.

First, I commit to never using this image or any of its friends again. The benevolent terminator.

This is an easy one. Maybe too easy. We’ve all seen this or a version of it across every digital piece of content about AI. I could blame Steven Spielberg, but the reality is, this is what humans do to address our innate predisposition to use a picture to say 1,000 words. Oftentimes, if we are being honest, the one word encapsulated here is bullshit.

Right now, we shouldn’t be looking to connect the dots between a creepier C-3PO and an easier way to have conversations with a brand. And if we are being honest with each other, we aren’t even trying to. Instead, how about an image of a customer smiling at their Alexa? Maybe the caption reads that Sofia is changing her hotel reservation to include a cot while she is helping her husband build a spice rack in the kitchen (where most Amazon Echos live). Not as sexy as a robot overlord that will let you live, but it is an AI win that can be delivered now.

Second, I promise not to insinuate that AI requires training that humans don’t even require to operate daily. The hostage-negotiating Mozart.

This is a tough one for me… sales and narrative are in my DNA. I love telling stories. I love listening to stories even more. My best stories came from conducting conflict resolution meetings for high school kids who had committed serious crimes. I was humbled every day. I tried my best to get better, and I hope I showed incremental improvement in my skills to facilitate some level of understanding and peace in those kids’ lives. I’m not sure of the outcomes. But I am certain that this experience is not what should be used to “train” AI. Yet, in our space, I’ve seen sidebars about hostage negotiators being used to train sales AI models. One thing is clear: customers and hostages don’t form a Venn diagram that works for a business, unless it is a Dilbert cartoon on LinkedIn with extreme irony. Ignore this, of course, if your business is hostage negotiation training. Shout out to the NTOA.

In this same vein, nothing makes me cringe more than hearing Common (the former hip-hop artist, now a cross between a rom-com star and a human rights activist) spitting verses about AI. Or better yet, Bob Dylan sighing over losing a songwriting conversation to that AI system that also won on Jeopardy. Do we really want to connect the dots between a fictitious AI Mozart and using AI to simply understand a human utterance and relate it to an intent?

Instead, let’s talk about AI enabling personalization at scale. There are some brands that I simply love… Nike, for example. I am a sneakerhead. I can’t get enough of their app – the smooth, sleek design and the notifications of when they drop the next opportunity for me to hand over my money for a retro Air Jordan XI. But AI can enable me, one of millions of sneakerheads, to achieve a personalized experience based on where I’ve been, what I’ve done, and someday soon, what I’ve said with my own voice. I don’t want Nike to sell me AI-designed sneakers better than Tinker Hatfield’s; I want to be able to buy (and potentially return) Tinker’s elegantly designed kicks.

AI models need not be creative geniuses or hostage negotiators to help me. Stop it.

I’m feeling better already.

My first personal inventory item for AI was an appeal for a change of imagery. The second was an appeal for a change of emotion and tone. My final item is an appeal based, ironically when it comes to AI, on words.

Third, I will not call software, and in this case AI, by terms that misguide, misinform, and violate the actual definitions of the words being used. The intellectual adjudicator.

So I made up “intellectual adjudicator” to try and keep a certain AI brand nameless. I won’t be surprised if someone else takes it and runs with it. This aforementioned brand has built a new narrative to describe AI that can understand what you say, in your own words, and turn it into an intent for customer service. We at Speakeasy AI call attempts at understanding intent derived from unfiltered customer voice Speech-to-Intent™. {Hold for applause} Another company, describing a similar yet different solution or approach, is utilizing a term that indicates a court proceeding – usually related to divorce disputes – and the ability to discern knowledge from such a proceeding.

Maybe this is merely semantics, and I am being nitpicky, but the crux of AI that enables understanding at scale is words, spoken or written.

If an AI needs to act as a divorce attorney to help me get along better with brands, much like the hostage-negotiating narrative, I think we’ve lost. Instead, let’s choose simple language that focuses on “why should anyone care?” and real, useful outcomes. For example, Apple has not described the iPhone X’s new technology as Putative Investigative Cognition technology. Their narrative is: look at your phone, turn it on, and it just knows you. Lovely. Facial recognition AI. I get that. Let’s commit to describing the awesome, pragmatic ways AI can actually deliver something. E.g., voice technology you will use and love. AI that actually understands you. I can get behind those messages.

There is no doubt that the future is bright for an AI-enhanced consumer and business world. But let’s take personal inventory of the current narrative landscape and work on a message that aligns with the outcomes we are delivering today and the future wins we are shooting for, so that we can enable better, personalized service at scale with AI… now.


Note: This post was originally part of the Greenbook Blog Big Ideas series, a column highlighting the innovative thinking and thought leadership at IIeX events around the world. Frank will be speaking at IIeX North America (June 11-13 in Atlanta).

