Microsoft showcases AI bot that makes phone calls to humans
Microsoft. (File Photo: IANS)
San Francisco, May 23 (IANS) While Google Duplex, which lets AI mimic a human voice to make appointments and book tables over the phone, has mesmerised people with its capabilities even as it has drawn flak on ethical grounds, Microsoft has showcased a similar technology it has been testing in China.
At an AI event in London on Tuesday, Microsoft CEO Satya Nadella revealed that the company's Xiaoice social chat bot has 500 million "friends" and more than 16 channels for Chinese users to interact with it through WeChat and other popular messaging services.
"Microsoft has turned Xiaoice, which is Chinese for 'little Bing', into a friendly bot that has convinced some of its users that the bot is a friend or a human being. Xiaoice has her own TV show, it writes poetry and it does many interesting things," The Verge quoted Nadella as saying.
Xiaoice interacts in text conversations but now the company has started allowing the chat bot to call people on their phones.
The bot does not work exactly like Google Duplex, which uses the Assistant to make calls on a user's behalf; instead, Xiaoice holds a phone conversation with the user directly.
"One of the things we started doing earlier this year is having full duplex conversations. So now Xiaoice can be conversing with you in WeChat and stop and call you. Then you can just talk to it using voice," Nadella was quoted as saying.
Humans being humans, Microsoft has already been on the receiving end of such experiments with conversational bots.
Two years ago, Microsoft launched an artificial intelligence (AI)-powered bot on Twitter, named Tay, for a playful chat with people, only to silence it within 24 hours as users started sharing racist and offensive comments with the bot.
Launched as an experiment in "conversational understanding" and to engage people through "casual and playful conversation", Tay was soon bombarded with racist comments, and the bot repeated them back to users with its own commentary.
Some of the tweets had Tay referring to Hitler, denying the Holocaust, and supporting Donald Trump's immigration plans, among others.
Later, a Microsoft spokesperson confirmed to TechCrunch that the company was taking Tay off Twitter as people were posting abusive comments to it.
--IANS
sku/and/mr

(This story was auto-published from a syndicated feed. No part of the story has been edited by The Quint.)
