Siri, Alexa, and robots could change how we talk

Sentient AI and super-smart robots are a long way off, but forgetting how to talk to other humans? That could happen soon.

[Photo: Amazon Echo]


Tech giants are in a race to see who can build the most powerful voice-activated assistant, but there’s a side effect that we haven’t considered: Kids who grow up asking Amazon’s Alexa questions or summoning Siri might lose some social skills. What if artificial intelligence changes the way we talk?

Experts in robotics, machine learning, and AI descended on Austin for South by Southwest this week, and the biggest questions were lifted straight from the film Her. Will AI become sentient? Is it changing the way we interact with each other? Will kids think they can order around their friends the same way they tell Alexa to tell them a joke? Spoiler alert: It’s way too soon to tell.

“It’s super important to emphasize the things that we’ve done in the last five years are extremely narrow and extremely focused,” Siri creator Adam Cheyer, whose most recent startup Viv is now being used as the foundation for Samsung’s next voice assistant, said during an AI panel at SXSW. “It’s human ingenuity applying statistics to a very narrow problem. [AI is] incredibly good at playing chess or beating Jeopardy, but the chess program can’t learn checkers.”

OK, so an emotionally complex assistant à la Samantha in Her is probably a long way off, but as artificial intelligence grows more powerful, it’s having a profound effect on us. Just as kids now expect almost every display they come across to be a touchscreen, they might grow up thinking they can command people to respond the way Alexa or Siri might.

“If we get used to talking this way in the future, we’ll need to retrain humans not to talk that way anymore,” Thavidu Ranatunga, whose company Fellow Robots makes retail robots that help customers navigate stores and help employees manage inventory, said during a SXSW panel on robots.

Fellow’s Navii robots, currently being tested as assistants in Bay Area Lowe’s stores, can understand what you say but don’t talk. Instead, they respond to questions with options on their touchscreen displays. That’s to keep confusion to a minimum, Ranatunga said, but also because robots aren’t as capable as people think.

“Robots aren’t the sci-fi type of advanced robots you see [in movies]—they’re really kind of dumb,” Ranatunga said. “When people use them in person, they learn what their limitations are.”

Robots in conversation at South by Southwest. (Photo: Caitlin McGarry)

They can do menial tasks like count inventory and deliver items. But real conversations? Nah. I saw this in action at SXSW with Osaka University’s demonstration of robots in conversation. It was the creepiest thing I’ve seen in quite some time, and the conversation was not natural. At all.

It’s possible that one day Siri, Cortana, Alexa, and whatever AI-powered assistant comes next will take robot form and become more human-like. But until we teach them to have contextually aware conversations that begin without needing a trigger phrase like “Hey Alexa” or “OK Google,” that’s the stuff of great films rather than real life.

This story, "Siri, Alexa, and robots could change how we talk" was originally published by Macworld.
