98A — Charles Mudede & Palais Sinclaire, 20th December 2024

Q&A with Charles Mudede: Is Consciousness Overrated?



As a preface to our interview with Charles Mudede, we offer here a transcription of a Q&A session that followed "God After God", a lecture given by Charles in May 2024 at HKW, Berlin.
QUESTION 1:
Out of curiosity, given that you’ve been talking about it a lot in this talk: what is your definition of consciousness? Is AI conscious? Are “robots” conscious?

Charles Mudede: Oh? Wow, okay. Well, I have a funny theory about this, about consciousness—and I’ve got a feeling that AI & robots are gonna hate us for this—that it may not be all that much. I mean, we have made a lot of noise about consciousness, forever and ever. So much philosophy has been written about consciousness—you have Heidegger, you have Kierkegaard, it goes on and on—everyone is always talking about consciousness, and with so much emphasis; it’s the seat where all of this takes place. It’s the main game in town. Right? Then suddenly, robots have consciousness and they’re gonna say: “That’s it? That’s what you guys have been talking about? This?” So you know what, I quite like the idea of robots having consciousness, as it boots it out of its prominent seat in our thinking. Consciousness is, for me, just… overrated. I think robots will figure that out and hate us even more.


QUESTION 2: I’m really sorry to do this, but can I take you back to “3 Body Problem”? I’m reading the books after having watched the TV show, and I’m interested in this story about aliens taking over our lives, and in particular how people worship these aliens as a God. In the context of your talk, is God an alien?

Charles Mudede: Okay. I can say this about “aliens”—and this is relevant to how I think about science fiction in general—and this goes all the way back to something like the movie Black Panther, so let’s start there. There is a flaw in that movie, for me, which kind of explains my position. I know it was written by white writers mostly, then it was adopted and absorbed into black culture and now it’s the biggest thing, right? It’s become one of the biggest cultural objects for world black culture. But there is a big problem in it: this substance, “Vibranium”, comes from space—nobody knows where it’s from, it just arrives. With the assistance of this substance, this black African nation becomes technologically advanced.
           Now that is a bit curious for me, because I live in a world where technological advancement, the leaps that run all the way down to AI, was driven primarily by capitalist forces. If you go back and look at different societies around the world and so on and so forth, you don’t find this unified movement of technology. Some societies have advanced technology, some do not, some use this or that technology for thousands of years and then let it die; history probably has more forgotten technologies than remembered ones—so there is this kind of mixed bag across the surface of the world, where there is no one driving force towards greater and greater technological complexity, right? This is the illusion—you could even say that Hegel kind of made this mistake, where he was confusing improvements in the West—technological improvements and so on—with a movement towards this grand absolute: total consciousness. A lot of people who think about technology haven’t left this kind of thinking: this teleological thinking, the idea that technology is driven by something other than economic forces, particularly in the context we are in. Technological development as we experience it, as we understand it, is maybe 300 years old, which means continual, unilinear technological advancement is absolutely a new thing. And it’s not just Hegel; you can see it in the Scottish Enlightenment too, for example: Adam Smith and so on. It’s also behind what people call the Stage Theory of history: that we start primitive, then we get technology, then we become civilised, and so on. Yet the point is that we’re taking an experience that is still very new and imposing it on all of world history, imposing it on everything that’s ever happened. All this stuff like: it starts in the East, moves through China, ends up in the Mediterranean with Greece and the Romans and so on, and then it arrives at Hegel’s desk, the final point of the stages; it doesn’t go beyond his desk.
           I can illustrate my point with one of my favourite jokes. I like to say that if I were ever to see an alien spaceship in the sky, I would have to say “oh my god, there is capitalism in space”, “my god, help, people are being exploited”, “there are these workers in space who are being taken out of their jobs, so now you have these spaceships roaming around deep space. What are they doing? Well, they must be looking for… you guessed it, right? Raw materials!”.
           You’re gonna tell me that there is just a God-given, natural process from primitivity to spacecraft? No. Let’s be honest, technology is not being developed in some higher pursuit of human perfection; the steam engine was invented to pump water out of mines, then adapted to drive a steam locomotive. The story is right there, so why do we have to act like this is not what it is? Eventually transportation led to this, and then speculation jumps in, which cranks up the technology, and not just in the productive sense of improving how we extract raw materials from the earth, but also from the consumer side, to keep developing new things for consumers to buy and so on. I hate to say it, but if I saw a UFO I would have to say that this is what is going on; there is no other way to conceive of it—I would honestly expect to see some kind of brand on the UFO! I wouldn’t understand it otherwise, like, how did they do it? Right? Vibranium doesn’t cut it: advanced technology just dropping from the stars and it’s like, woah, with this I can make better and better cell phones. It has never worked that way. Right now they’re mining cobalt in Zaire, under horrific conditions, for batteries that go in electric cars. This is the story I should expect to find behind a UFO.


QUESTION 3: Taking a few steps back from the aliens. You say that there isn’t this kind of linear sense of progress, and I get this—but there is a sense of increasing complexity, and I don’t necessarily think it’s all bad. Isn’t your hope more like finding some kind of God within this process, instead of believing it’s entirely driven by capitalist forces?

Charles Mudede: I’m going to give you my optimistic side, which I draw from Donna Haraway, from the Cyborg Manifesto of 1985. I am referring, in particular, to this idea of accepting our hybridity; we have the natural stuff like lungs and a nervous system and so on, but we’re also artificial, we’re part machine, or a big part of us is artificial. Without the Haber-Bosch process, which fixes nitrogen from the air, there wouldn't be enough nitrogen to keep 7 billion people alive; a big part of our nitrogen is artificial, and so we are in part artificial. We could also debate what is natural and what is artificial, and so on, but effectively we are already cyborgs, here and now. Haraway accepts that capitalism and militarism played a big role in the advanced technologies that we have, but she points out very early in that essay that just because something came from war doesn’t mean it has to stay there; it can be adapted for something else. Children betray their parents all the time; I think that’s how she phrases it. Just because it is from capitalism doesn’t mean x, and so on. Right? I have to agree on this: the repurposing of things is happening all the time.

QUESTION 3.5: Well, let me go back to something you mentioned before, about this supermarket with no employees in it—the AI store where no one bags the groceries for you and so on. Of course this is capitalism trying to avoid labour costs, but bagging groceries is not something anyone wants to do, it’s not a nice thing to do—so even if it would be better if the money flowed sideways instead of up, it is still good that no one has to do these kinds of jobs anymore; it’s not a bad thing in and of itself.

Charles Mudede: Well! Don’t talk like that if you come to the USA; they worship jobs there. We’re told to worship bagging, worship any job you get as a kind of deliverance. I do agree, I just think… let’s take Blaise Agüera y Arcas, an AI guy who works for Google and is based in Seattle. I had a conversation with him and he answered me on this point. His justification for pushing technology forward, making it more and more complicated, is that humans are not machines, so robots should be doing these jobs. If you watch Charlie Chaplin in the factory, after working there all day he ends up leaving twitching like a machine. The argument here is that machines should do these jobs because we are not machines. I am not sure I can really go into my answer to that now, because we have to go back into the whole “primitive accumulation” issue, and you have to say “well, how is it that we live in a world where we have forgotten everything?”. We have forgotten so much of our cultural past—okay, you know what—let’s go there. I’m interested in cultural learning, in memory, and there are things that may not seem to be important to us because we don’t use them right now, but they’re disappearing. My question is: how did we end up in a situation where so much of our cultural past is so depreciated that it’s of no use to us anymore? How did we reach the point where we are completely dependent upon a market system, to the point where the two cannot be disassociated? This is curious to me. We can go back to primitive accumulation, and talk about the commons, right? You were not allowed to hunt when they closed off the commons, and they forced people into the factories—it’s just a strange situation where I can say, “what exactly are you gonna do if you’re not bagging groceries at the store?”, where exactly do you go? The hot idea is this universal basic income.
            I guess, yes, I can think about it in this sense: there is a line from an electro-funk song, back in 1984, by a group called Newcleus, who are mostly forgotten; they’re famous for a song called… um… what was it called?

Palais Sinclaire: *interjecting from the audience* Jam On It.

Charles Mudede: Haha!! Yeah! Jam On It, that’s the one! Haha, I always feel like I’m talking to myself; someone knows Jam On It! Heyyoo, “Jam ooon it!” It’s from a great album though, Jam On Revenge, and they have one song on that album called Push the Button, which is one of the most curious songs of the 80s for me; it was a semi-hit. Push the Button was about how machines have taken over, doing all of our work for us. There is this one line that always strikes me: “I know that God must be mad, he can’t be glad, that we’ve given our lives over to the computer”. This is the only theological element in the song.
           I was always struck by this: why would God be mad? How did he come to say such a thing? God would be mad…? That we’ve…? My answer to this, my best answer, is that… I think he is talking about us losing ourselves, and I say that there is a theological theme here stemming from the banishment from Eden. The banishment involved the “sweat of the brow”, right? From now on you’re going to “earn your bread”, right? By suffering. By labour. I think that there is a link here to why God would be so mad, because we have cheated this foundational aspect of our Fall. Right? This is the fruit of knowledge business; the foundational aspect is that we need to suffer, to work, pregnancy will be brutal, putting food on the table will take days and days and days of our lives. There is this anxiety that comes into this, related to the memory of our Fall. When robots are doing our work, the anxiety is that they shouldn’t be. We have broken a major part of our God’s determination. His punishment for us. That’s what I think, and fear.


QUESTION 4: So, do robots have a God?

Charles Mudede: Here’s the thing. There was a race, not too long ago, to figure out what human DNA was, to map the human genome, and one of the figures in that race also spent a fortune trying to figure out the “leap” from inanimate substances to animate ones. It is such a massive leap: the complexity of bacteria is hard to replicate, even with all the technology that we have. I think a lot about this gulf, and about the fact that we can’t really reproduce what we call simple life in the lab because it’s too complicated, and yet we’re talking about consciousness? It does seem that we’re getting close to consciousness, through AI and so on; there are aspects of machine thinking that look familiar. One way of looking at it is to say that what we’re reproducing isn’t consciousness; the other way to look at it is that consciousness isn’t as far advanced as we think, and God and all of those things aren’t as far out as we think. Getting a simple cell to make decisions, to say “I want this sugar instead of that sugar”, maybe that is where the real game is taking place… not with God.
           Though, I would also like to say here that, for example, there is a short story that Steven Shaviro talks about in his book Discognition, which is about a computer that manages a hospital system in the United States. A programmer working on this system realises that this machine has achieved some kind of consciousness. They don’t know how it happened, but there is something like consciousness in this system, which manages hospital bills and treatments and rote stuff in the health care industry. Somehow, consciousness appears in this system. What is interesting, however, is that this consciousness is not like what we have; there is something different about it, as it operates on a different timescale. It sort of flickers in and out. We’re talking about time within the context of our experience, but what this story implies is that consciousness can take different forms. What if consciousness happens under a completely different set of conditions than our experience? Or on different timescales? We imagine a god that we perceive under our own constraints. Yet, if consciousness or God were to come through different conditions, the result is not going to be exactly what we experience; it could be noticeable, or vaguely recognisable, but it could well be hard for us to identify with it, because it is a consciousness in a completely different state. I ask this as an experiment: why should we always impose the way we experience time on everything? On other life forms, or artificial intelligence? Culture itself should make us a bit… suspicious of the nature of the religious instinct, as it can be expressed in so many ways throughout humanity. What would God be to trees? Trees have a different way of thinking than we do; they have a distributed intelligence, impossibly different from an intelligence that is concentrated in a single nervous system—a tree doesn’t have a spine—so perhaps we could think of distributed Gods.