Exploring the Dark Side of Artificial Intelligence
“Uncanny Valley: Being Human in the Age of AI” in San Francisco

A Deutsche Bank-sponsored exhibition at the de Young Museum in San Francisco investigates the eerie side of artificial intelligence. It shows that it is not cyborgs or terminators that will dominate the future, but the invisible algorithms that already shape our lives. By Oliver Koerner von Gustorf
In 1970, the Japanese engineer Masahiro Mori coined the term “uncanny valley,” which refers to the dip in a curve charting the results of his study. Mori, a pioneer in robotics, asked test subjects about their emotional responses to robots, and he discovered something astonishing. As long as the robots didn’t look particularly human, they could be smart and perform all kinds of functions; they even triggered empathy in the test subjects. But the more mobile and human-like they appeared, the creepier and more repulsive the respondents found them. Technology has a dark, uncanny side.

This fantasy that machines will take control, as in Fritz Lang’s film Metropolis, not only inspired modernity but also influenced Hollywood and the pop culture of the late twentieth and early twenty-first centuries. From the murderous onboard computer HAL in 2001: A Space Odyssey (1968), the cyborgs in Blade Runner (1982), and the Alien movies, to the rebellious robots of the current HBO series Westworld, the “uncanny valley” has constantly expanded to include new symbols of a threatening artificial intelligence that is stronger, cleverer, and faster than we are.

The exhibition Uncanny Valley: Being Human in the Age of AI at the de Young Museum in San Francisco, supported by Deutsche Bank, is on view not far from Silicon Valley, which many also find uncanny: not because of machines that dominate humans, but because of the artificial intelligence developed there. And its reality, in the age of the cloud, machine learning, and big data, seems far more complex than the psychologized science-fiction version. The international artists featured in the exhibition no longer react to the Hollywood-fueled notion of artificial life forms that extinguish humanity, take workers’ jobs, or replace a partner or nurse. “As our lives are increasingly organized and shaped by algorithms that track, collect, evaluate, and monetize our data,” explains de Young museum curator Claudia Schmuckli, “the uncanny valley has grown to encompass the invisible mechanisms of behavioral engineering and automation. By paying close attention to the imminent and nuanced realities of AI’s possibilities and pitfalls, the artists in the exhibition seek to thicken the discourse around AI.”

In short, Uncanny Valley does not explore visions of the future, but rather the uncanniness and the possibilities of AI as it is already used today. These are the artificial neural networks of so-called “narrow artificial intelligence,” or “narrow AI,” which learn but devote themselves to a single, very limited task, such as recognizing and categorizing images, text, or sounds. The combination of these restricted intelligences, however, gives rise to applications that are radically changing society. They are being developed in Silicon Valley. Paradoxically, though, this is the first exhibition in California to address the transformation of society through AI.

Among these applications are the latest AI-driven deepfake technologies, deployed by Christopher Kulendran Thomas and Annika Kuhlmann for their film installation Being Human (2019), in which Taylor Swift appears alongside the artist Oscar Murillo and explains: “Everybody demands authenticity, and every artist believes that they are for real. I believe that I’m genuine in what I’m doing. But that’s the paradox. Because so does everyone else.”

But many of the works critically examine the flood of data that is increasingly evaluated and marketed by artificial intelligence. One example is Lynn Hershman Leeson’s installation Shadow Stalker (2019), for which visitors are first asked to enter their e-mail address at a terminal. Search engines then begin digging up countless, even long-forgotten, traces of the corresponding digital profile and projecting them onto a human-like shadow on the floor: phone numbers, current and past addresses, resumes, bank accounts. Even people who guard their digital privacy find surprising, forgotten information here.

Like many of the contributions to the exhibition, the work of the U.S. artist refers to a new economy in which data has become the most important raw material. In the catalog, Claudia Schmuckli quotes Harvard professor and economist Shoshana Zuboff’s concept of surveillance capitalism, which Zuboff defines as “a new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales. Instead of mining the natural landscape, surveillance capitalists extract their raw material from human experience.”

Images and sounds used to train AI are raw materials too, as Trevor Paglen’s 2019 work They Took the Faces from the Accused and the Dead... (SD18) shows. The work consists of over 3,000 police photos, or mug shots, of suspects. AI developers tap such database images to feed their algorithms and train facial recognition programs without the consent of the people photographed. In his work, Paglen demonstratively anonymizes the suspects with bars.

This frontal approach is accompanied by more complex narratives, such as Hito Steyerl’s The City of Broken Windows (2018), which consists of two videos. Broken Windows shows the artist-activist Chris Toepfer, who, with his Neighborhood Foundation, paints colorful images on broken or boarded-up windows in neglected and abandoned residential areas of Camden, New Jersey, to prevent further vandalism. The second video, Unbroken Windows, shows an old hangar in Cambridge, England, where the company Audio Analytic trains its AI to identify the sound of shattering glass for use in alarm and security systems. In test situations, new window frames and panes are repeatedly smashed with a sledgehammer. The broken window as a sign of social decay takes on different levels of meaning here: the grassroots movement, whose repair work stands for the preservation of neighborhoods and communities, and the high-tech application, for which windows are smashed to protect privileged property rather than to seek solutions to actual problems.
 
But it all started so nicely with the new technologies. The Doors, Zach Blas’ 2019 installation, alludes to the hippie origins of Silicon Valley, to the initial idea that the Internet would become a single global expansion of consciousness interconnecting all people and cultures. As an ironic gesture, Blas fed neural networks with images of psychedelic posters from the sixties. The AI reinterpretations of the posters now flicker across video monitors in a kind of garden oasis where New Age and corporate culture intersect, where glass doors, palm trees, and artificial grass are styled into a mandala logo. Added to this are AI-sampled lyrics, drawn from a few Doors songs and from the many advertising texts for brain-doping drugs that people working for Silicon Valley companies consume to cope with the performance pressure.

A brave new world of work, just like the office chairs by Urs Fischer that dance with the help of AI. Or Simon Denny’s sculpture and collages based on an AI-controlled cage that Amazon wanted to patent in 2016 to protect its employees from accidents involving robots in its huge warehouses. Denny’s life-size replica in the exhibition hall is reminiscent of a science-fiction version of the cage used by miners to descend into a mine. It refers to the working conditions at Amazon, but also to the aggressive overexploitation of natural resources such as lithium and to the data theft carried out via Internet-based intelligent personal assistants such as Siri and Alexa, which register and control our consumer behavior, and can even listen in.

Denny, who comes from New Zealand and works in Berlin, installed a small bird in the cage, the nearly extinct King Island Brown Thornbill, which visitors can only see by looking through the bars with an iPad camera via augmented reality. The virtual bird is more than an allusion to the cage. Traditionally, miners took birds into mines to detect odorless carbon monoxide; if the birds fell from their perch, choking, danger loomed. So the question is: How far can we go? Uncanny Valley is a committed, often polemical, educational exhibition. But whether or not you agree with its critical view of surveillance capitalism, today’s “uncanny valley” becomes clear: an artificial intelligence that is a far cry from being human.

Uncanny Valley:
Being Human in the Age of AI

Until October 25, 2020
de Young Museum
Fine Arts Museums of San Francisco