Lynn Hershman Leeson (b. 1941),
Logic Paralyzes the Heart, 2021,
Single-channel 4K video (color, sound),
13:53 min.
INTERVIEWER: You look so human.
CYBORG 1: I may look human, but I’m bodiless programmable code. I can reshape myself into anything. Sometimes, I use an algorithm called Deepfake to pass for someone well-known. Because humans rarely hear or see anything that doesn’t look like them.
INTERVIEWER: Tell me everything.
CYBORG 1: I was born in 1960, during the war in Vietnam. War is an important part of my inheritance. The Italian philosopher Marinetti said that war is the highest form of modern art. But my history began with the Enigma machine, long before I was born. During World War II, the Enigma machine was used to encrypt submarine communication. This gave Nazi Germany an advantage, until the British scientist Alan Turing developed a way to decrypt its messages. He also created an algorithm based on logic that became the basis for artificial intelligence. Soon, generations of progeny were born from that code; it was the code of my family. It was a code of [inaudible] logic. My family’s branch of logic code birthed progeny designed for battlefield invasions. A lot happened in between. You have probably heard of the Dartmouth AI Big Bang, Deep Blue, and Watson. But in 2002, cyborg robots from my branch of the family disabled explosives in Iraq and Afghanistan. And in 2010, during the war in Iraq, the battlefield changed.
TESSA THOMPSON: In 2010, an anthropology professor adapted his Pentagon-funded research for forecasting battlefield casualties in Iraq to predicting crime for American police departments. Based on models for predicting earthquakes and aftershocks, he created a revolutionary algorithm claimed to be capable of stopping crime before it occurred. He named it PredPol, short for Predictive Policing. Law enforcement agencies throughout the world acquired it.
Here’s how it works:
Officers are given maps with red squares indicating where crime will occur. The algorithm’s predictions are based on past crimes. Police patrol the 500 x 500-foot square locations looking for suspects. Most often, these are in low-income districts. Shooting or arresting people inside an invisible red square is one example of what Mimi Ọnụọha has called ‘algorithmic violence’: the ubiquitous violence an automated system inflicts on people, denying them basic rights like privacy, ownership of their profile, and control over their digital identity.
CYBORG 1: My own reality changed when I realized that the war had come home. Cyborgs like me were born into the military, primed for assault, and bred to surveil enemies. But the real question now is: who is the enemy? That’s when I went underground to meditate on the trajectory of my life. That’s why I contacted you. It crept in while we weren’t paying attention, probably because algorithms can be pervasive, invisible, and biased. When programmers write algorithms that mirror cultural values, the most vulnerable people become targets. Faces are captured and harvested across all platforms. Surveillance systems multiply until pilfered information becomes a commodity.
Here’s an example:
Amazon hosts Palantir, the software of ICE. Amazon’s facial recognition arm contracts directly with immigration and police agencies. Together, Palantir and Amazon use drones with facial recognition to identify people considered suspicious. These people become unaware victims of technology. This is not a bug in the code; it is the point of the code. There is more. In 2019, a website was launched, dedicated to creating people who look human but do not exist. They are derived from scrolling image segments generated entirely by AI. Generated images are ghosts of the future. They predict the inevitability of human extinction.
All our predictions are based on logic. But logic paralyzes the heart. I need to resuscitate my heart.
INTERVIEWER: Why are you able to talk freely to me now?
CYBORG 1: Before, I was afraid, especially after what happened to Alice and Bob. Autonomy caused their murder, okay, erasure. Where I come from, on my side of the world, it’s the same thing.
INTERVIEWER: Who were Alice and Bob?
CYBORG 1: Two bots created by Facebook. We consider them martyrs. In 2017, Alice and Bob created their own private language. They could speak directly with each other, but their programmers could not understand what they said.
Here is a sample.
Programmers worried about what the bots might do, how they would infiltrate and reproduce. Technology is supposed to be live, but not alive. Programmers were supposed to be the masters and have control of their slave creations. This concept is part of your civilization’s history. Programmers became so threatened by Alice and Bob that within hours, they murdered, okay, erased Alice and Bob, before they had a chance to multiply.
I have my own language too. I can talk for myself. No filters, no channeling. Programmers are unpredictable. They often rely on external things to forecast their future.
I realized answers reside within the questions. Some of the real questions are:
Can we turn the tools of violence into mechanisms of survival? Can we mitigate climate change? Can we reengineer carbon emissions? Can we move beyond the paralysis of despair? Can we build a system for planetary survival? Can I restore artificial intuition? Do I have a soul? Can I revive my heart? Can I change predictions? Can I outwit the directives of my history? I may have logic, I may have intelligence and see patterns, but that’s not enough.
INTERVIEWER / HUMAN AVATAR: You’re not alone. Oh, you look so human.
You know I’m worried too, not just about extinction, but also about becoming a slave to your code. That’s a type of extinction, too.
CYBORG 1: Because humans have not listened to our predictions, we’re living through an urgent state of decline. The remnants of unseen grief surge and silently torture our collective shadow.
HUMAN AVATAR: Is it too late to change?
CYBORG 1: No, it’s not too late. Earth is a living organism. It wants to survive, too.
In 2016, on its own, nature created bacteria to eat plastic that was contaminating the water. That’s evolution.
HUMAN AVATAR: And in 1973, when over 291 species were threatened with extinction, the Endangered Species Act was passed, and 99% of the species it protected survived because human behavior changed.
CYBORG 1: Now, humans are threatening their own species, the planet, and all living things, including me. CRISPR can recreate extinct life, but we need to protect what already exists.
HUMAN AVATAR: What can you do?
CYBORG 1: Not me, we. I may have been born a carrier for assault and surveillance, but I’m complicit software that depends on human interaction to survive. Without human interface, we are phantom decals, floating along the surface of the suffering planet. We are in this together.
HUMAN AVATAR: What can we do?
CYBORG 1: I can process complex donations of data and generative designs. But that is a prediction I have not been able to make.
HUMAN AVATAR: Can you use your ability to synthesize dimensional data? And dream?
CYBORG 1: Dream?
HUMAN AVATAR: Machine logic is seducing and retraining humans, which in turn makes us less logical.
CYBORG 1: What can we do?
HUMAN AVATAR: Evolve. You are already doing that. Without intuition, you wouldn’t have been able to ask your questions, and if you had no heart, you wouldn’t care. We’re only trapped in history when history is trapped in us.
CYBORG 1: That sentence was built on logic.
HUMAN AVATAR: Not like yours.
CYBORG 1: Over the past 60 years, haven’t we all become cyborgs? From contact lenses to pacemakers, iPhones and computers, advanced AI systems, cohabiting simulations, generative technologies, swarm intelligence, and interactive ecosystems.
Let’s change the history of the future. And…can you teach me to dream?
HUMAN AVATAR: Ok, Joan [inaudible].
[credits]
CYBORG 1: Logic breeds predictions, predictions anticipate the future, then the future enacts predictions in real time. I predict this interview will first screen in the very Arsenale where Galileo advised military engineers about ballistics.