Guy Hoffman’s TED Talk Tackles The Task Of Making Robots Seem More Human

By Joelle Renstrom | Published

[image: robot and musician]

Even though it's in the title of this TED Talk, "souls" is a bit misleading. What roboticist, musician, and actor Guy Hoffman is actually talking about is imbuing robots with a sense of intuition and adventure, making them less the calculating, rational machines we expect them to be and more like us.

Hoffman was first inspired by Pixar's short film Luxo Jr., which features a desk lamp. When watching the film, Hoffman says, you actually feel something for this lamp, which blew his mind. So he went to New York to study animation, wanting to learn how Pixar could evoke emotion for something inanimate and commonplace. There, Hoffman learned two main rules: first, that eliciting an emotional response has more to do with how an object moves than with how it looks; and second, that animators should be actors, using their own bodies to act out scenes and gauge movement, then applying what they find to their animations.

Hoffman then went to MIT's Robotic Life Group (now the Personal Robots Group) to study the relationships between humans and robots. There, he realized that robots don't move like animated characters. They're too stiff and stilted, too robotic. So he decided to combine animation and robotics, aiming to produce something that felt more like human-human teamwork than human-robot teamwork, which Hoffman likens to a game of chess. Usually when humans and robots interact, the robot's movements are planned and reactive, as though it's taking turns with the human rather than interacting organically. When humans work together, by contrast, they improvise, adjusting to each other's movements and rhythms.

So Hoffman took another detour and went to acting school for a semester. He found an old handbook for actors that showed them how to use parts of their bodies, facial expressions, and so on to convey emotion. This outside-in approach, he realized, is a lot like how robots are programmed, so he turned instead to method acting, in which actors rely on themselves, their intuition, and their experience to embody the character they're playing. It's a way of "thinking" with the body, not just the brain. That's when Hoffman came up with the idea of giving AI acting lessons.

[image: robot 1 vs. 2 graphic]

One of his subsequent experiments involved putting two different robot "brains" inside a desk lamp robot: one was the traditional, calculating brain, while the other was the adventurous, improvisational brain of a stage actor, capable of making and correcting mistakes. Hoffman had each version of the robot perform the same repetitive task, then asked its human "colleagues" for their impressions. Overwhelmingly, people preferred the robot with the intuitive brain. They thought it was smarter and more committed to the task, just generally better. They also tended to personify it more, referring to it as "he" or "she" rather than "it."

Then he started working with robotic musicians, which are generally the same planned, calculating machines, interacting with human players via "call and response." So Hoffman devised a robot that could improvise. The robot listened to the human and adjusted, and vice versa; they actually inspired each other and totally kept up with one another. It's pretty awesome.

[image: robot and rapper]

Then Hoffman designed a "socially expressive head" for the robot, because musicians aren't just things that produce music. He went back to animation for the movements and had the head built, and I have to say it really does make a huge difference, especially when the robot "looks" at the human musician before coordinating its ending. You believe this thing knows what it's doing, even though it lacks the usual calculated programming. When the rapper appeared to be spending too much time and attention staring at his iPhone, the robot moved so it could be seen out of the corner of the rapper's eye, and when there was a gap in the rap, the robot turned to the rapper as though to ask, "Wassup?"

[image: robot enjoying music]

Hoffman asserts that a robot can “enjoy” music, not just move to it. So he made a head-bopping, foot-tapping robot that really does seem to be digging the heck out of what it’s listening to. And it’s pretty near impossible not to bop along with it. Hoffman says, “Somewhere in your future, there’s going to be a robot in your life. And if not in yours, then in your children’s life.” Thus, it makes sense to design robots that are less like chess players and more like stage actors and musicians. “Maybe a less than perfect robot is perfect for us,” he suggests.