Robot Dog With a Tech Billionaire's Head Poops Out AI-Generated Art
Robot dogs are already a bit creepy. But slap on a hyper-realistic image of a tech billionaire’s face and have them literally crap out a piece of AI-generated art and you’re left with something that would make Black Mirror producers shudder.
“What if the act of looking at art were no longer a one-way encounter, but part of a feedback loop in which the artwork observes, learns, and remembers us in return?” Beeple said in an artist statement accompanying the installation.
But while the end products are appropriately crappy, no two prints are exactly alike. The piles of prints each carry an aesthetic that reflects the personality of the human head attached to the dog. The Picasso images appear geometric, while those pushed out of the Zuckerberg dog’s rectum look like a clip from a low-budget Matrix knockoff.
More examples of the prints, which Beeple refers to as “memories,” are viewable on the installation website.
Each artist- or billionaire-inspired robot dog has its own “temperament.” For example, Elon Musk’s is described as a “cognitive blueprint,” while Picasso’s is “proto-cubism.” (Beeple’s dog, for what it’s worth, has a temperament of “dystopic futurism.”) Each also has its own speed setting: slow, medium, or fast.
Maybe unsurprisingly, the tech billionaires all fall into the fast category.
Beyond fueling nightmares, Beeple says the bigger point of this robodog project is to draw attention to how more and more of the observable world is, by design, created to fulfill the vision of a select few techno-billionaires.
That, he says, contrasts with past eras, when artists played a greater role in shaping reality.
“It used to be that we saw the world interpreted through the eyes of artists, but now Mark Zuckerberg and Elon, in particular, control a huge amount of how we see the world,” Beeple told The New York Post. “We see the world through their eyes because they control these very powerful algorithms that decide what we see.”
