This Autonomous Robot Creates Interpretive Works Of Art
One of the abilities that distinguishes humans from animals is the ability to create art. Sure, animals do some cool stuff, but there’s usually a practical reason, rather than an aesthetic one (with a few debatable exceptions, like the bowerbird). Until fairly recently, humans figured that the desire and ability to create art separated us from robots too, but robotic musicians and other art-generating robots call this once-unique ability into question. Still, most of those robots are programmed to play, and it’s not as though they’re mechanical Beethovens, applying what they’ve learned about music theory to create unique scores of their own. Recently, artist and roboticist Patrick Tresset decided to create a robot that can autonomously create artwork inspired by its own interpretation of its environment.
Tresset has been making robots for a while—namely, upgraded versions of a robot named Paul, a “creative prosthetic” he originally designed when he had a terrible case of painter’s block. A few years ago he created Paul and Pete (I have to wonder if there’s a Mary on the way), robots that sketched human faces using facial recognition technology and showed off their stuff at London’s Tenderpixel gallery. Now, many iterations later, Tresset has developed Paul-IX in an attempt to explore whether robots can autonomously create “artifacts that stand as artworks”—specifically, artworks that comment on the human condition.
In order for robots to even begin doing this, they have to do more than just doodle something pre-programmed. “Their style is not a pastiche but rather an interpretation influenced by the robots’ characteristics,” Tresset says. He developed algorithms and computational systems that allow Paul to interpret what he sees around him and to produce his own art in response. Even when Paul sketches a still life, there’s an unmistakable level of interpretation—in the case of the skull below, Paul clearly hasn’t drawn anything close to an exact replica of the objects. He’s angled the skull in the other direction, so its eyes seem to bore straight into the viewer’s.
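Tresset hasn’t published the internals of Paul’s vision system here, but before any drawing robot can “interpret” a scene, it typically has to reduce a camera image to salient lines it can trace with a pen. As a purely illustrative sketch (an assumption about the general technique, not Tresset’s actual code), a classic Sobel edge detector does exactly that kind of reduction:

```python
# Illustrative only: one classic way a drawing robot might reduce a
# camera image to traceable lines. This is NOT Tresset's actual system.

def sobel_edges(img, threshold=2.0):
    """Return a binary edge map for a 2D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical Sobel gradients at (x, y).
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            # Mark a pixel as an edge if the gradient magnitude is large.
            if (gx * gx + gy * gy) ** 0.5 >= threshold:
                edges[y][x] = 1
    return edges

# A tiny synthetic "still life": a bright square on a dark background.
image = [[1 if 2 <= x <= 5 and 2 <= y <= 5 else 0 for x in range(8)]
         for y in range(8)]
edge_map = sobel_edges(image)
# Edges appear along the square's boundary, not in its flat interior.
```

The point of an interpretive system like Paul’s is everything that happens *after* a step like this: which lines to keep, how to weight them, where to place them on the page. That is where the “interpretation influenced by the robots’ characteristics” would come in.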
Given that humans are capable of empathizing with robots, it’s worth exploring whether the opposite is true. Of course, without sentience, the experiment is perhaps a bit limited, or even arbitrary, but anything that encourages humans to look at themselves and the world from another vantage point—even that of a non-sentient robot—is a worthwhile endeavor. Attempts to understand other points of view generally are.