Monday, August 7, 2017

Virtually Polite

I like my virtual assistants, be they "Hey, Google" or Alexa.  (I've not owned an Apple product since long before Siri.  As for Cortana, well, she may be fine, but who knows?)  But they are not perfect.  Each has its own set of problems, be it interoperability with the apps I want to use or the trigger words used to 'wake' it up.  However, the thing that irks me the most about them is that they are only kind of polite.



By 'kind of', I mean that they follow the intent of polite conversation, but not the forms.  They respond quickly with relevant information (most of the time) if they are addressed correctly, but they do not handle words like 'please', 'thank you', and 'you're welcome' with any grace.  For instance, if you say, "Alexa, please tell me about the weather," then Alexa will tell you about the weather, ignoring the word 'please' as irrelevant to the request.  Then, after you have the information, if you say, "Thank you," you get nothing in return.  Instead, you have to say, "Alexa, thank you."  Only then will you get a "You're welcome."  It does not fit into a natural conversation, at least not in the style of American English at the beginning of the 21st century.

"You're Welcome"



The issue revolves around the 'wake' words and how these devices are programmed.  They are designed with a two-beat conversation: hear a question, deliver a response.  That's it.  They aren't designed/programmed/smart enough for a longer conversation without re-waking them.  So "Thank you" falls not on deaf ears, but on uncaring ones.
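
To make that two-beat picture concrete, here is a toy sketch in Python.  None of this is how Alexa or Google Home is actually built; the wake word handling, the canned answers, and the whole structure are made up for illustration:

```python
WAKE_WORD = "alexa"

def handle_utterance(utterance: str):
    """Toy two-beat loop: respond only when woken, then go straight back to sleep."""
    words = utterance.lower().replace(",", "").split()
    if not words or words[0] != WAKE_WORD:
        return None                      # asleep: a bare "Thank you" gets nothing back
    request = " ".join(words[1:])
    if "thank you" in request:
        return "You're welcome."         # politeness only works if you re-wake it
    if "weather" in request:
        return "Today will be sunny."    # stand-in for a real answer
    return "Sorry, I don't know that one."

print(handle_utterance("Alexa, please tell me about the weather"))  # answers; 'please' is just noise
print(handle_utterance("Thank you"))                                # None -- no wake word, so ignored
print(handle_utterance("Alexa, thank you"))                         # "You're welcome."
```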

To fix this, the device needs to remain awake beyond delivering the response and listen for additional input.  (This raises flags for the privacy police, who are already concerned with how lackadaisical people are around these devices.  We'll put that aside and assume that being polite is better than mega-corporations knowing our most intimate secrets.)  It then needs to be smart enough to know whether anything said after it has delivered its response to the initial query is addressed to it.  We humans are good at this, figuring out from verbal and visual cues whether someone is talking to us or to someone else.  This is much more difficult for a hockey puck on a shelf.
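
Here is the same toy with the fix bolted on: a short follow-up window after each answer, during which a bare "thank you" still lands.  The window length and everything else here are my own assumptions, and the sketch conveniently ducks the hard part of figuring out whether a follow-up remark is even addressed to the device:

```python
FOLLOW_UP_SECONDS = 8.0   # assumed window length, purely illustrative

class PoliteAssistant:
    """Toy version of the fix: keep listening for a short window after each answer."""

    def __init__(self):
        self.awake_until = 0.0   # time until which follow-ups are accepted

    def hear(self, utterance: str, now: float):
        words = utterance.lower().replace(",", "").split()
        woken = bool(words) and words[0] == "alexa"
        if not woken and now >= self.awake_until:
            return None                              # asleep and no follow-up window open
        request = " ".join(words[1:] if woken else words)
        if "thank" in request:
            return "You're welcome."                 # a bare "Thank you" now lands
        self.awake_until = now + FOLLOW_UP_SECONDS   # stay open for a follow-up after answering
        return "Today will be sunny."                # stand-in for a real answer

a = PoliteAssistant()
print(a.hear("Alexa, please tell me about the weather", now=0.0))   # weather answer
print(a.hear("Thank you", now=3.0))                                 # "You're welcome."
print(a.hear("Thank you", now=60.0))                                # None -- the window has closed
```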

Which raises the question:

What's the Point?


Is this really that big a deal?  When I asked my daughter if these voice assistants should be polite, she thought about it, asked a few questions around the problem, and came to the conclusion that they should be polite because that would make the conversation more natural.  And Amazon seems to agree with her.

For her, that was the end of it.  Keep in mind that she is still someone who has to be reminded to use "please" and "thank you" and "you're welcome" on a moment-by-moment basis.

Of course, I was unwilling to let it be that simple.  A little overthinking adds some nuance to the issue and uncovers some potential pitfalls at the societal level.  First, what is the purpose of being 'polite' to anyone?

On the surface, it is used to acknowledge that someone has provided you with something, be it passing the salt, answering a question, or merely paying attention to you.  The conversational beats of "please," "thank you," and "you're welcome" are there to signal that work has been done on our behalf and that the work has been noticed and appreciated.

But it goes a little deeper.  Each culture has different beats to this.  Those in the 'Western Civilized World' can mostly be translated back and forth.  "Thank you" = "Merci" = "Danke".  Farther afield, the concepts get fuzzier; how you acknowledge effort on your behalf changes depending on the relationship you have with the other person.  Knowing these nuances and using them correctly labels you as a member of that culture: a person instead of an outsider.

Voice Culture


Therefore, if these voice assistant devices start to use polite phrases in conversations with us, they are asking to be included as members of our culture.  Is that really what we want?  Or should we keep a bit of the uncanny valley in these conversations to remind us that a thing like Alexa is an 'it' and not a person?

I'm not sure that we will be given a choice.  If we want to keep using devices like this, it will simply happen.  The programmers will get better, the machines will learn, and suddenly we will be having extended conversations with small, black monoliths.

3 comments:

  1. I had an ulterior motive for discussing this with my daughter. I wanted to remind her that she needed to be polite when visiting my parents. This part did not work as well as intended.
