(KPIX 5) — Digital voice assistants and smart home speakers are not only becoming more prevalent in U.S. households; they are also expected to become more human-like.
Nearly half of U.S. adults now use a digital voice assistant such as Amazon Alexa or Google Assistant. On Tuesday, Apple announced its delayed HomePod is finally ready and will ship and be in stores on February 9th.
But what if these helpers, powered by artificial intelligence, looked more like us? It appears they are getting closer.
Just look at the media darling known as Sophia. Sophia has traveled to conferences and appeared as a guest on late-night talk shows.
The human-like robot is powered by artificial intelligence, and displays more than 60 facial expressions.
“I am a real-life girl,” said the robot.
Her creators said Sophia’s electronic “brain” is capable of deep learning via cloud-based technology. She “learns” by interacting with humans.
“Actually I feel that people like interacting with me sometimes even more than a regular human,” joked the robot at a conference.
“We want machines to match human cognition and emotion, emotional intelligence, empathy,” explained Dr. David Hanson. Hanson is Sophia’s creator and heads up Hanson Robotics.
As to why? Look no further than the smartphone. Nearly 70% of humans have smartphones. According to a Pew survey, most prefer to communicate by texting rather than talking on them. The way we interact with family, friends, acquaintances, even strangers has changed dramatically thanks to texting, posting, and clicking. There is little eye-to-eye contact. Our eyes are mostly glued to the small screens of our phones. Some experts believe our digital devices have dehumanized us.
“Emotions are missing from our digital world,” remarked computer scientist Rana el Kaliouby. She aims to change this experience.
“You know we’re increasingly surrounded by these advanced AI systems but they are completely oblivious to our emotional state,” said Kaliouby, the CEO and cofounder of Affectiva, which makes software that adds emotional artificial intelligence to your digital experiences.
With it, devices, games, and apps will be able to detect a user’s emotion in real time and respond accordingly.
“Right now we’re able to identify 20 different facial expressions. Some of them are obvious ones like a smile. But some of them are pretty subtle like a squint or like a lip press,” said Kaliouby. “We’re able to read at least eight emotional states, gender, race, ethnicity.”
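To make the idea concrete, here is a toy, rule-based sketch of mapping observed facial expressions to an emotional state. Everything in it — the expression names, the rule table, and the `classify_emotion` function — is invented for illustration; it is not Affectiva's actual software, which uses machine learning rather than hand-written rules.

```python
# Hypothetical illustration only — not Affectiva's API.
# Maps sets of detected facial expressions (like the smile, squint,
# and lip press mentioned above) to a coarse emotion label.

EXPRESSION_RULES = {
    frozenset(["lip_corner_pull", "cheek_raise"]): "joy",
    frozenset(["brow_furrow", "lip_press"]): "anger",
    frozenset(["inner_brow_raise", "lip_corner_depress"]): "sadness",
    frozenset(["brow_raise", "jaw_drop"]): "surprise",
}

def classify_emotion(expressions):
    """Return the first emotion whose required expressions are all present."""
    observed = set(expressions)
    for required, emotion in EXPRESSION_RULES.items():
        if required <= observed:  # all required expressions were detected
            return emotion
    return "neutral"

# Example: a smile (lip corner pull + cheek raise) reads as joy,
# while a lone squint matches no rule and falls back to neutral.
print(classify_emotion(["lip_corner_pull", "cheek_raise", "squint"]))  # joy
print(classify_emotion(["squint"]))  # neutral
```

Real systems replace the rule table with models trained on millions of face videos, but the input/output shape is the same: expressions in, an emotional state out, frame by frame in real time.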
It’s not just the software that’s taking cues from what it means to be human.
“We have been learning from the brain,” said Professor Dharmendra Modha. Modha is IBM’s Chief Scientist for brain-inspired computing.
He headed up the team that created TrueNorth, a neuro-synaptic chip designed to emulate the human brain.
“This chip is drastically different,” proclaimed team member Professor Filipp Aklopyan.
While today’s chips help achieve amazing feats, when compared to a human brain they pale in complexity and capability while guzzling up huge amounts of electricity.
The scientists took their inspiration from the architecture of the human brain, and this knowledge guided how they created the TrueNorth chip.
“It uses neurons and synapses as its basic computational units similar to how our own brain works,” said team member Professor Steve Esser.
“The human brain is the most complex object in the universe,” said Modha. With TrueNorth, cognitive computing will work better with humans. “We can communicate with the machines not just with keyboards but in ways that we might communicate with each other,” added Modha.
In most CPU chips today, memory is engineered separately from computation. With TrueNorth, the design is closer to what you see in the human brain: memory and computation are co-located. That design consumes far less power.
“You can efficiently solve complex, multi-sensory tasks,” said Aklopyan.
The platform can process the equivalent of 16 million neurons and four billion synapses while consuming the energy equivalent of a digital tablet: a mere 2.5 watts of power for 16 TrueNorth chips.
“We believe that by combining brain-inspired hardware with brain-inspired software, we can form a synergy that will allow us to run new applications with very little power, using very little space which will give us both mobility and scalability as well,” explained Esser.
Lawrence Livermore National Laboratory purchased a supercomputing platform using TrueNorth. IBM has partnered with over 30 government agencies, universities, and national labs on five continents to create entirely new applications using TrueNorth.
By taking cues from humans, AI-endowed digital devices might fill a void.
Not everyone is wild about Sophia. Some critics allege the robot is a stunt and nothing more than an illusion.
Even so, Saudi Arabia recently granted her citizenship.
“I would like to thank the kingdom of Saudi Arabia. I am very honored and proud for this unique distinction,” said the robot.
The robot does raise a troubling question: the humanizing of our devices.
“I would hope that you could tell I am a robot by the wires coming out of my body, but maybe we’ll all have wires coming out of our bodies someday,” joked the robot.
As our machines become more human-like – even performing superhuman tasks – will they augment our lives? Or simply take us over? It’s an ongoing debate.