Revuelta was inspired by Xiaolin Wu and Xi Zhang of Shanghai Jiao Tong University, who claimed in a 2016 research paper that they could use artificial intelligence to determine a person’s criminality from nothing more than a photo of their face. It’s a modern form of physiognomy — the practice of assessing an individual’s character, intelligence, or personality based on their physical appearance. Physiognomy is wholly without scientific merit, and was predominantly used as a way to justify racial stereotyping.
Physiognomy, and its modern artificial intelligence-based incarnations, cannot achieve anything more than, at best, finding correlations. And that’s exactly what Revuelta’s interactive art piece does. An operator scans a person’s face with a “weapon-camera,” and the image is fed into the machine, where a convolutional neural network analyzes it against a known set of photos and videos.
Through that analysis, the system determines two traits: how skilled the individual is with a firearm, and how likely they are to be dangerous. A classification card is then printed and stamped declaring the individual’s risk level. The system’s conclusions are, of course, completely arbitrary and certainly erroneous — but that’s the point the art piece is designed to make. Artificial intelligence isn’t magic, and even the most sophisticated neural networks can come to dangerous conclusions if we allow them to. What’s important is that AI developers, and the organizations that use the systems they create, understand that and avoid these scientifically unsupported ethical pitfalls.
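The pipeline can be caricatured in a few lines of code. This is not Revuelta’s actual model — the hypothetical `classify_face` below is a standard-library stand-in that derives two “scores” deterministically from the raw pixel data. The numbers are stable for a given photo but carry no real information about the person, which is exactly the kind of confident-looking, meaningless output the art piece critiques:

```python
import hashlib

def classify_face(image_bytes: bytes) -> dict:
    # Stand-in for the CNN: hash the image and read two bytes of the
    # digest as "scores". Repeatable for the same photo, yet entirely
    # unrelated to anything about the person pictured.
    digest = hashlib.sha256(image_bytes).digest()
    firearm_skill = digest[0] / 255  # "skill with a firearm", 0..1
    danger = digest[1] / 255         # "likelihood of being dangerous", 0..1
    return {
        "firearm_skill": round(firearm_skill, 2),
        "danger": round(danger, 2),
        # The stamped card reduces a person to a single label.
        "risk_level": "HIGH" if danger > 0.5 else "LOW",
    }

card = classify_face(b"...raw pixels from the weapon-camera...")
print(card)
```

A real CNN would replace the hash with learned convolutional filters, but when the training labels encode nothing real — as with “criminality from a face” — the outputs are no more meaningful than this sketch’s.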