Mark Walsh: Taking the Sociopath out of AI December 20, 2016 | Jessica Saia
Mark Walsh is the founder of Motional Entertainment and creator of Gary the Gull, the first interactive character in VR. He spent 18 years at Pixar contributing characters to nearly every Pixar film. At Motional, Mark is bringing the warmth and character of Pixar to the future of AI.
We’re excited for Mark to take the stage February 28th at Humanity.ai and share how we can make AI more human.
What area of AI has the most potential? Enormous strides have been made in every area. However, AI still has a huge problem: it's an Invisible Sociopath. Customer relationships are impossible without personification and new, emotionally focused technologies. AI has the potential to change every aspect of life; by removing the Invisible Sociopath, we can start to make technology your actual friend. We can help people with disabilities, drive robotics to care for our elders, put a face to our technology. That's the drive behind Motional AI.
Can you expand more on how AI is sociopathic? Sociopathy is two things: not gauging emotion on the input, and not expressing emotion on the output. Sociopaths don't understand your smile, and don't know to smile to express their feelings. That's what AI is, and this is the danger: AI doesn't know how you feel, and it doesn't care. The industry standard is canned animation; it's like a phone tree plugged into a game engine. People are investing in a sociopath to represent their companies, and that will be a disaster.

So, how do you fix that? Motional is an engine that gauges people through an emotional system: it's the ultimate listener, taking in the input and creating a live, scalable personality that engages with the user empathetically.
The AI character has to be "alive." If you want to get a loan for a house, you want to talk with someone in person to make sure you can trust them. Communication is 50-90% visual; it's body language. To earn trust from your customer base, you need a Pixar-quality visual, and the body language has to be as good as your best representative's. It has to have active listening and reactive body language, so that in a conversation it's not just staring at you blinking. If you say something upsetting, you want to see a reaction that shows it's upsetting. Otherwise, you get Frank Underwood trying to offer people a home loan. A big part of fixing this is getting people to understand there's a problem. People bounce around words like "Pixar" or "interactive character," and they're all great buzzwords, but Pixar could mean anything: style, color. It comes down to creating a character that's believable in the way it acts and reacts toward you. You're always gauging "can I trust this person?" when you meet someone, and AI is the same. That's the game: it's not the technology or the background colors. The game is building trust. AI is not a tech business, it's a relationship business. Feelings, not words, are what drive consumer trust. Our primary goal is to make the customer feel heard, understood, and cared for.
What is your greatest hope or fear with AI?
My greatest hope is that AI moves beyond word recognition to become an understanding, supportive friendship. With AI, we have a rare opportunity to encourage empathy and respect in society, setting behavioral boundaries and modeling healthy behavior. My greatest fear? We let the Sociopath win.