A New Promethean Revolution?
Artist Landon Ross in conversation with the brilliant polymath Stephen Fry.
SMALL V01CE was a group exhibition curated by Jesse Damiani at the Honor Fraser Gallery in Los Angeles. As part of this event, Damiani brought together two original thinkers to discuss the rise of generative AI and its implications for humanity.
This is an excerpt from their conversation. The full conversation can be found in our latest issue, FUTURO VOL II.
LANDON ROSS There's a tattoo on my arm that, Stephen, you actually helped me choose. I don't know if you remember that.
STEPHEN FRY Oh, goodness, yes.
LANDON ROSS It's of Prometheus, a titan who stole fire from the gods and gave it to man, and whom Zeus then condemned to eternal torture. Man, now wielding fire, killed the gods. This story for me has always come to symbolise science and intelligence. And since intelligence evolved on this planet, we, some millions and thousands of years ago, “extincted” our hominid ancestors and brothers and sisters. Probably not out of malice, but maybe. Probably just because we were better at outpopulating them. So, have a few among us now stolen fire from the gods and given it to something else, some other progeny; are we facing another Promethean revolution?
STEPHEN FRY I think it's a wonderful question. Prometheus was regarded as our champion by the Greeks, a kind of Messiah, because he took the side of mankind against the gods and he lived amongst us after he stole fire. The way we can look at the Prometheus myth is fairly obvious, I suppose: the gods decided to make a new kind of creature, and Prometheus was the one who did it for Zeus. The gods were almost, as it were, bored with their own divinity. There were animals, but there was nothing like themselves, with intelligence, wit, or resource. So they made these little anthropoids, little people, and their job was to worship and obey and work for the gods. And Zeus told Prometheus that they were free to do and be anything, but fire was the one thing they mustn't have.
And by this, I suppose, we take it that he literally meant the fire that roasts and toasts and bakes and makes technology and iron and steel and china. But also the divine fire, the spark, the thing that makes gods conscious and aware and supreme, and gives them power over the domain of the Earth and the cosmos. Zeus didn't want man to have that. But Prometheus loved us, and so he gave it to us. And he was punished, chained to the Caucasus mountains and so on.
We have our own fire. We don't need the gods. In that sense, Zeus was right. If you give man fire, they won't need the gods. Mary Shelley wrote Frankenstein, whose subtitle you'll remember is “The Modern Prometheus,” about the creation of a new form of life and giving it a spark. And then it was almost forgotten as a great story, but now it is supremely important to us again, because there is this feeling that there's a cycle, that maybe this happened once. There were gods, they ruled everything. They made the mistake of making little creatures, and the little creatures took on a life of their own and usurped the gods. And that's now what we are frightened of.
STEPHEN FRY We've made little creatures: computers. And the computers were fused with something we call AI and various other technologies that are like a kind of tsunami. I like to think of the human family sitting on a beach, facing away from the sea. And behind them, this tsunami is coming. And it's a tsunami made up not just of AI, not just of computing, but of robotics and nanomaterials and bioaugmentation and brain-machine interfacing and gene editing and quantum mechanics, and astonishing, powerful technologies, all conjoining. And we are sitting playing our beach cricket, sort of occasionally saying, there's a big thing behind us, isn't there?
Let's not look at it. Let's not, because we are the gods on Olympus having given away fire. Or we think we might have given away fire. We're not sure what we mean by it. Do we mean consciousness? Do we mean self-awareness, or whatever it is that distinguishes us? But it's apparent that something enormous is going to happen even in my lifetime.
So what we have, gosh, is it full intelligence? Can it rival ours? Well, it can destroy us. I think there's no question about that. And then there's the idea that you can just sit and say, well, let's make it fully safe, let's disarm it in some way. We have an ethical framework. We've taken hundreds of years to learn that all human beings, in our estimation at least, are equal, that we have equal value, and that we must respect that in all terms. We would mostly agree that's wise, and we would want to bake that into any system that we have. But that forgets that the Chinese and the Russians have AI, and they have their own ethical framework. It's completely different.
LANDON ROSS So there's this question of moral valence, right? Technology isn't morally valent, but when a technology becomes an agent, then it draws our attention to this question.
STEPHEN FRY Yes, yes, yes. And in that sense, you're absolutely right. I mean, Gutenberg's printing press can print the sonnets of Shakespeare one day, and the next day can print Mein Kampf. It has, as you say, no moral valency. But creating these things alters everything in human history. We demonise the machinery because, really, we don't know where else to place the blame for the monstrous things that can come out of it. We haven't quite created them, and the machines haven't quite created them. But somehow that mixture of moral innocence that these machines have, plus the damage they're capable of doing, can create something that we cannot control.
But I wonder about what we expect. What do we think we want this technology to do for us? At the moment, it is extraordinary how vain and foolish the things that AI produces are. I mean, they really are silly, aren't they? They are just like a little automatic writing, those little pantographs. But it isn't anything new or anything challenging or anything interesting. I'd be so much more excited by an AI that produced things that you couldn't understand. Because then you think that really is intelligent. In the same way as if you're 17 and someone tells you, you have to read Hegel. It is terribly difficult because it is a very strong intelligence that comes from a place you've never inhabited yourself, and it takes you a long time to unpick it. But AI is going out of its way to be simplistic and naive and very, very uninteresting intellectually, isn't it?
LANDON ROSS Yeah, I think so. But I don't think AI will really be able to create transformative art itself until it has a “self”. Because the novelty, the spark, or the vision, is something that's really tied to experience and consciousness and emotion.
STEPHEN FRY And desire! You have to want something, don't you? The first question an actor asks is, what does this character want? Why is he saying this line? What does he want? And why would an AI want to produce a piece of art? It's one thing to be told, but the big question about intelligence is decision-making. The desire to make a decision, and to make a decision that alters everything — that comes from yourself. But if I look at this board here (pointing to Landon’s Untitled Hieroglyph), this is human beings desperately yearning to understand something and to put it into a language that coheres and maybe expresses something of the truth of the world outside. And the day an AI starts to do that would be extraordinary.
And science fiction writers have known this for years, haven't they? I mean, it's what HAL does, it's what Data does in Star Trek: The Next Generation. It's kind of a cliché of science fiction that the robot suddenly wants to know.
LANDON ROSS Yeah. I try to explain, especially with this piece, that a lot of times people just see this as mere calculation. Of course you can code for instinct, you can code for calculation. But this is a lot of intuition. This is experiential knowledge. It's “I feel like it should go together like this. Let me see if I can get there.” That's a very human thing.
About Untitled Hieroglyphs
‘Untitled Hieroglyph’ is a series of original works that explores origin stories, or current narratives, derived from and hinted at by nature “once she is asked in the right tone of voice,” rather than those invented by humankind. The work's inscrutability to most viewers is a feature: it asks the viewer to take the objects, composed from a monthslong capture of actual theoretical work conducted by some of the top physicists at Caltech, as the aesthetic, drawing in questions of a sort of priesthood. Like the hieroglyphs of Ancient Egypt, it is a modern hieroglyph, also telling a narrative of a quest for ascendance, albeit one that is ontologically anchored.
Landon Ross
very laboratory