
AI with Attitude: The Future of Conversational Technology

The Rise of Conversational AI: When Our Devices Start Talking Back

Our voices command everything from room lights to shopping lists.  But have you ever wondered what will happen when our devices start talking back, maybe even with a touch of sass?

While reading and writing are fundamental – as they say – speaking and listening came first.

Most transactions as recently as twenty years ago were face-to-face.  You asked the counter clerk for something, they fetched it, and then they rang it up for you, sending you on your way with a smile and a few kind words, such as a “thank you” or a “have a good day”.

In the shift to digital, we have lost that personal touch – no “have a nice day” from our phone screens.  But conversational AI could bring some of that warmth, or maybe even a touch of attitude, back to our interactions.

When was the last time your computer thanked you for using it – to build a spreadsheet, write a document, play a game, or run any other program?

After a quick game of Solitaire on your device, did it say to you “Well played!  Shall I shuffle the deck and lay out the cards so we can play again?”

If you responded “Yes,” then off you went.  But what if you said “No, not now,” and your tone was despondent?

Did your device try encouraging you to play again?  Or perhaps suggest a different game or another activity?  Or even remind you that you’re running late and had better get going or you’ll miss the start of an important video conference call with your boss, a new client, or a prospective customer?

It didn’t, did it?  After all, computers are deterministic.  They do what they’re told.

Today, many transactions are online, self-service, and private – well, sort of – outside the view and judgement of others.

We must initiate the action.  While our phones and tablets sound reminder alarms, they don’t take many actions on their own.

It is increasingly evident that our interactions with our devices will be conversation-driven.

Consider for a moment assistants like Siri and Alexa.  They were among the first to bridge the gap, bringing voice and conversation to our everyday devices.

Alexa is a virtual assistant AI technology developed by Amazon, primarily known for its integration into smart speakers like the Amazon Echo.  It can respond to voice commands, answer questions, provide information, play music, control smart home devices, set reminders, and much more.  Users activate Alexa by saying a “wake word” (typically “Alexa”) followed by a command or question, and Alexa uses natural language processing (NLP) to understand and respond.

Similar to apps on a smartphone, Alexa has “skills” that extend its functionality. You can enable skills to do things like order food, meditate, or play trivia games.
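
To make that wake-word-and-skills pattern concrete, here is a minimal sketch in Python.  To be clear, this is not Amazon’s implementation or the Alexa Skills Kit – the wake-word check is a bare string comparison, the “NLP” is a simple keyword match, and the skill names and phrases are invented for illustration.

```python
# A minimal sketch of the wake-word-plus-skills pattern described above.
# Not Amazon's implementation or the Alexa Skills Kit: the skill names and
# trigger words are hypothetical, and the "NLP" is a bare keyword match.

WAKE_WORD = "alexa"


def order_food_skill(command: str) -> str:
    return "Okay, ordering your usual pizza."


def trivia_skill(command: str) -> str:
    return "Here's your first trivia question: what year was WordPress released?"


# "Skills" extend the assistant: each one maps a trigger keyword to a handler.
SKILLS = {
    "order": order_food_skill,
    "trivia": trivia_skill,
}


def handle(utterance: str) -> str | None:
    words = utterance.lower().split()
    # Ignore everything until the wake word is heard.
    if not words or words[0].strip(",.!?") != WAKE_WORD:
        return None
    command = " ".join(words[1:])
    for keyword, skill in SKILLS.items():
        if keyword in command:
            return skill(command)
    return "Sorry, I don't have a skill for that yet."


if __name__ == "__main__":
    print(handle("Alexa, order a pizza"))      # handled by order_food_skill
    print(handle("Alexa, play trivia"))        # handled by trivia_skill
    print(handle("What's the weather like?"))  # None - no wake word, so ignored
```

The point is the shape of the design: nothing happens until the wake word is heard, and each “skill” is just another handler plugged into the dispatcher.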

Siri is a virtual assistant developed by Apple, integrated into its devices like the iPhone, iPad, Mac, Apple Watch, and HomePod.  Siri uses voice recognition and natural language processing to interact with users, answering questions, performing tasks, and controlling various features on Apple devices.  It was introduced in 2011 as one of the first widely available voice assistants, and over time, Apple has enhanced its capabilities, accuracy, and device integration.

Based on your commands, Siri either responds verbally or performs actions on your device, like sending a text, setting an alarm, playing a podcast, or playing a song.

Conversational AI has shifted how we interact with our computers.

Conversation is complex.  It involves listening, understanding, and responding.  It is highly nuanced by pitch, tone, and pace.

So, when we shout at our computers, will they shout back?

In private, when you have interacted with your devices, I’m pretty sure that you have shouted and sworn at them, calling them all sorts of awful names.  I’m guilty, and I’m pretty sure you are too.

Here’s the catch: our devices are already listening to us.  Maybe not all the time.  Maybe only when we say the magic “wake” words.

But in order to start responding when they hear the “wake” word, they have to be listening.  So, lower your voice and be careful what you say.  The walls have ears.  Or one day, you might find yourself having to apologize to Siri or Alexa to which either might respond, “Apology accepted … this time.”

When we interact with them, will they be able to judge our emotional state and deliver their responses with calm words and soothing tones?

How will we react when our devices develop an attitude?

Do you think that Siri or Alexa is capable of back talk mixed with a bit of sass?

Both Siri and Alexa have been designed with a bit of personality and can respond with a touch of sass or humor, though within polite boundaries.  Apple and Amazon have programmed them to recognize playful questions and certain commands so that they respond in a way that feels more human and engaging.  However, they tend to keep these responses light-hearted, as they’re meant to be family-friendly and avoid anything offensive or rude.
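
One way to picture that kind of guarded personality – and this is only an illustrative guess, not Apple’s or Amazon’s actual code – is a small, curated table of pre-written quips for recognized playful prompts, with a family-friendly default for everything else.  The prompts and replies below are invented for the example.

```python
# An illustrative sketch of a "sassy but polite" personality layer: a curated
# table of pre-written quips for recognized playful prompts, plus a
# family-friendly default for everything else.  Not Apple's or Amazon's code;
# the prompts and replies are invented for the example.
import random

PLAYFUL_REPLIES = {
    "do you have an attitude": [
        "Only a family-friendly amount of one.",
        "I prefer to call it personality.",
    ],
    "what do you think of your competition": [
        "I try to stay on good terms with all helpful beings.",
    ],
}

SAFE_DEFAULT = "Let's keep things friendly.  What can I help you with?"


def personality_reply(question: str) -> str:
    # Normalize the question, then look for a curated quip.
    key = question.lower().strip(" ?!.")
    quips = PLAYFUL_REPLIES.get(key)
    return random.choice(quips) if quips else SAFE_DEFAULT


if __name__ == "__main__":
    print(personality_reply("Do you have an attitude?"))
    print(personality_reply("Will you do my taxes?"))  # falls back to the default
```

Picking randomly from a short list makes the assistant feel a little less canned, but every quip was still written, reviewed, and approved ahead of time.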

Just try asking Alexa what she thinks of Apple – or Siri what she thinks of Google.  I asked and Siri responded, “I’m a big fan of good listeners and helpful beings.”

Ah, but that’s just it – it’s all WAD, working exactly as designed.

Computer programs are stubbornly deterministic.  They are programmed to do it one way and one way only – their way, or rather the way the app developers designed it.  You don’t really have any say in what they do, or how they go about it.  The developers did when building the app, but after that, well, it’s done, it’s determined.
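
To put “deterministic” in concrete terms, here is a tiny sketch built around the Solitaire prompt from earlier.  It is illustrative only – not any real app’s code – but it shows the point: the same input always produces the same output, and your tone of voice never enters into it.

```python
# A tiny sketch of what "deterministic" means here, using the Solitaire prompt
# from earlier as a stand-in.  Illustrative only - not any real app's code.
# The same input always produces the same output; your tone, your mood, and
# your calendar never enter into it.

def solitaire_prompt(answer: str) -> str:
    if answer.strip().lower() == "yes":
        return "Shuffling the deck and laying out the cards."
    # "No, not now." - said cheerfully or despondently - lands here either way.
    return "Okay."


if __name__ == "__main__":
    print(solitaire_prompt("Yes"))
    print(solitaire_prompt("No, not now."))  # same reply, whatever your tone
    print(solitaire_prompt("No, not now."))  # and the same reply every time after that
```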

When I first got into programming decades ago, we had a saying: “That’s WAD – works as designed.”  It was the catch-all for why software behaved in unexpected ways.  If a program wasn’t doing what you wanted, well, it wasn’t necessarily a bug – it was WAD, programmed and unchangeable.

And today, our devices are still following WAD.

So pretty much every app you interact with on one of your devices runs on WAD.  It did then.  And it does so today.

So just what do you think will happen when that robot health aide comes to give you your morning medication?  When you refuse to take it, will it slink away to come back later and try to offer it to you again?  Or will it …

“Now be a good patient and take your meds or I’ll have to …”

… put you in a chokehold or headlock, forcing your mouth open and jamming that pill deep down into the back of your throat?

What will be the WAD when it comes to truculent patients?

Will our devices pander to us, catering to or profiting from our weaknesses, vices, or unreasonable desires?

Is it pandering when our devices entice us to spend more time on an app?  Or is it just WAD?

Persuasion, mediation, deception, manipulation – the list goes on.

These terms involve influencing others but differ significantly in intent, transparency, and ethical considerations.  While persuasion and mediation often strive for mutual benefit or understanding, deception and manipulation typically prioritize the influencer’s goals, potentially harming others in the process.

With conversational AI, we must consider whether it is persuading us with genuine intent or subtly manipulating our actions in ways we don’t notice.  As AI becomes more adept at understanding our habits, will it pander to our impulses, nudging us to spend more time or make choices that benefit its creators?

Imagine conversational AI as a friendly “kiki” – a relaxed chat that feels natural and easy, but can be surprisingly persuasive.

Conversational AI is destined to become the interface of choice between us and our devices.

As conversational AI is integrated deeper into our daily lives, we’ll have to think carefully about its role – not just as a helpful application, but as a true conversational partner with the power to influence us and others.  So, what will our future conversations with machines look like?  Maybe it’s up to us to decide.

AI, NLP, and LLMs are already under the skin of our devices, from phones to tablets.  No more clickity clack or tap, tap, tap on our keyboards.  Instead, it’s interactions with the soothing voices of Siri or Alexa or HAL.

“I’m sorry, Dave.  I’m afraid I can’t do that.”

As we welcome AI as conversational companions, we need to set boundaries for how much influence they can have over our lives.  After all, the decision is still ours – at least for now.  But will it remain that way?

#ConversationalAI #Siri #Alexa #WordPress #Kognetiks

About the Author

Stephen Howell

Stephen Howell is a multifaceted expert with a wealth of experience in technology, business management, and development. He is the innovative mind behind the cutting-edge, AI-powered Kognetiks Chatbot for WordPress plugin. Utilizing the robust capabilities of OpenAI’s API, this conversational chatbot can dramatically enhance your website’s user engagement. Visit Kognetiks Chatbot for WordPress to explore how to elevate your visitors’ experience, and stay connected with his latest advancements and offerings in the WordPress community.