
Giving a human touch to Alexa or Siri can backfire

The all-new Amazon Echo Plus (2nd Gen) can connect several IoT devices at home, apart from reading you the news or playing songs with a smarter Alexa. (Photo: IANS)
New York, April 21 (IANS): An Indian-American researcher-led team has found that giving a human touch to chat bots like Apple's Siri or Amazon's Alexa may actually disappoint users.
Just giving a chat bot a human name or adding human-like features to its avatar might not be enough to win over a user if the device fails to maintain a conversational back-and-forth with that person, according to S. Shyam Sundar, co-director of the Media Effects Research Laboratory at Pennsylvania State University.
"People are pleasantly surprised when a chat bot with fewer human cues has higher interactivity," said Sundar.
"But when there are high human cues, it may set up your expectations for high interactivity - and when the chat bot doesn't deliver that - it may leave you disappointed," he added.
In fact, human-like features might create a backlash against less responsive human-like chat bots.
During the study, Sundar found that chat bots that had human features -- such as a human avatar -- but lacked interactivity disappointed the people who used them.
However, people responded better to a less-interactive chat bot that did not have human-like cues.
High interactivity is marked by swift responses that match a user's queries and feature a threaded exchange that can be followed easily.
According to Sundar, even small changes in the dialogue, like acknowledging what the user said before providing a response, can make the chat bot seem more interactive.
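To make that point concrete, here is a minimal, hypothetical Python sketch (not taken from the study) of a bot that acknowledges what the user said before giving its answer -- the kind of small dialogue change Sundar describes:

```python
# Illustrative sketch only (hypothetical, not from the study): the bot
# mirrors the user's message back before answering, a small change that
# can make the exchange feel more interactive.

def acknowledge(user_message: str) -> str:
    """Build a short acknowledgement that echoes the user's words."""
    return f'You asked about "{user_message.strip().rstrip("?")}".'

def answer(user_message: str) -> str:
    """Placeholder answer lookup; a real bot would call its dialogue engine here."""
    canned = {
        "weather": "It looks sunny today.",
        "news": "Here are today's top headlines.",
    }
    for topic, reply in canned.items():
        if topic in user_message.lower():
            return reply
    return "I'm not sure yet, but I can look that up."

def respond(user_message: str) -> str:
    # Acknowledgement first, then the answer, keeping a threaded back-and-forth.
    return f"{acknowledge(user_message)} {answer(user_message)}"

if __name__ == "__main__":
    print(respond("What's the weather like?"))
    # -> You asked about "What's the weather like". It looks sunny today.
```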
Because people may be leery of interacting with a machine, developers typically add human names to their chat bots -- for example, Apple's Siri -- or program a human-like avatar to appear when the chat bot responds to a user.
The researchers, who published their findings in the journal Computers in Human Behavior, also found that just mentioning whether a human or a machine is involved -- that is, providing an identity cue -- guides how people perceive the interaction.
For the study, the researchers recruited 141 participants through Amazon Mechanical Turk, a crowdsourcing site where people are paid to participate in studies.
Sundar said the findings could help developers improve acceptance of chat technology among users.
"There's a big push in the industry for chat bots," said Sundar.
"They're low-cost and easy-to-use, which makes the technology attractive to companies for use in customer service, online tutoring and even cognitive therapy -- but we also know that chat bots have limitations," he added.
--IANS

(This story was auto-published from a syndicated feed. No part of the story has been edited by The Quint.)
