The Hitchhiker’s Guide to the Galaxy is an entertaining adaptation of an amazingly funny book, even if it never becomes anything nearly as well put-together as its literary counterpart. To call Hitchhiker’s a funny book is one of the biggest understatements in fiction; it boasts some of the best one-liners of all time. The movie is still hilarious, keeping many of the book’s funniest quotes, but because of the middling performances it sadly loses a lot of the charm the characters had in the novel. Well, except for Marvin.
Alan Rickman voices the character I love most from the book, who just happens to be a manically depressed android named Marvin. Marvin is a straight-up android, a humanoid autonomous robot with intelligence (a brain the size of a planet, he says). He has enough quips about the meaninglessness of his life to prove that he’s got emotions (or at least depression). The whole idea that humans or other aliens would build an AI with emotions raises an interesting moral question about creating suffering. We all know that human life involves pain and sadness, and we accept that an AI that accurately represents life as we know it would experience those emotions too. But what if we built the unpleasant emotions and failed to create the pleasant ones?
Before I get into the depths of Marvin’s depression, there is another, more behind-the-scenes AI in the film: the onboard computer of the spaceship Heart of Gold. This computer is a never-ending ball of sunshine, a reverse Marvin. Its constant joy feels inhuman in a way that Marvin’s constant depression doesn’t. Since humans experience depression, we can connect with a character like Marvin emotionally, but a constantly happy character just feels strange, or annoying.
If the onboard computer really is experiencing the joy it presents, we might consider it a great thing that its creators have added to the joy in the universe. Still, it feels off-putting that they would create an intelligence incapable of fear or distress even in the face of thermonuclear missiles.
The Marvin we see in the film says he’s a prototype for a series of AIs with emotions, and adds, in typically depressed fashion, “You can tell, can’t you?” The simplest way to take this is that Marvin just hates himself. But what if he means that his personality programming is obviously corrupted, and that everyone can tell it’s wrong for him to be depressed?
The story isn’t particularly invested in the ethical questions of Marvin’s existence, but it stands as a big warning against creating an AI that doesn’t feel like it’s living up to its potential. If we build an AI with the power of today’s supercomputers and the primary emotions it feels are depression and anger at those who made it suffer, we’re fucking screwed.