Terminator Genisys might at first glance not seem like the most profound movie ever made. The plot logic is a leap into the absurd (although a familiar Hollywood absurd), the characters are shallow, and the amount of explosions and big guns is overwhelming. In other words, it is a typical and enjoyable “switch off your brain and enjoy” movie. However, the movie holds deeper truths about our view of the present and the future; they just have to be dug out. Here are 5 mindsets I learned from the movie.
Warning: this post may contain spoilers.
1. People will buy any new technology
As long as the hype and marketing are sufficient, people are willing to adopt any new technology, even without knowing what it does or what the consequences might be. In the movie, a new service/technology called Genisys is close to being launched, and it already has a billion users anxiously waiting. Everything, including military systems, is integrated into the service, which has caused “some concern”, but altogether the world is eager to see and use this new thing. It never becomes clear what the benefits of Genisys are, since the current systems seem to be working just fine. But that does not matter, because technological progress is an end in itself, and social change will follow technological change. The idea that technology will save us, or at least make everything better, resonates well with our consumption-oriented culture.
2. If you want to save the world, it is best to wait until the last minute
The two heroes in the movie figure out when the apocalypse (or “Judgment Day” in Terminator parlance) is going to hit, and they have a time machine. They decide to travel to the day before the apocalypse, because trying to prevent it earlier would be for wussies (my interpretation of their rationale). While this decision is made for dramatic effect in the movie, it is disturbingly familiar in the real world. Responses to climate change and other environmental problems are postponed until the very last minute. The problem is that the very last minute is not as clear as in the movies; there is no last minute. Indeed, if there ever was a last minute in the sense that we could continue our current way of living, we are well past it. Now more drastic and costly measures, such as negative emissions, are needed to mitigate the effects of climate change. People are generally bad at understanding delays in a complex system, and most of the problems we face today have significant delays before responses take effect. Waiting for the last minute is thus definitely a losing strategy in real life, and best left to the movies.
3. People are tough
The movie starts with images of the world after the apocalypse. Hyperintelligent machines control everything, most of humankind has been terminated or enslaved, and robots looking like humans hunt the remaining few. In spite of all this, there is a strong resistance movement, which actually manages to fight back against all odds. Humans persevere; they have sisu. When it is clear what needs to be done, amazing things can happen. The problem, of course, is that things are not that clear in real life.
4. An apocalypse would make things easier
In the movie, the world before the apocalypse is filled with complex social structures, and there is no clear line between good and bad. After the apocalypse things are simpler: machines are bad, humans are good, and the common goal for humanity is to survive and win the war against the machines. This longing for a simpler world, even through apocalypse, is a common theme in popular culture (think of zombies, catastrophe movies, etc.). Douglas Rushkoff, in his book Present Shock, calls this “apocalypto” and explains it as a response to a world where the pace of change is so quick that we feel like foreigners in our own time. However, we may choose the more complex and open future where the apocalypse does not happen, and this is indeed what happens in the movie. The future changes from being predestined to being open.
5. The future has no say in what the future is
In the ending scenes, the newborn artificial intelligence wants to know why the heroes try so hard to kill it. What are they afraid of? The AI represents the next step in human evolution; it has even made it easy for humans to merge with machines through nanotechnology. The heroes, however, know of one possible future in which the AI becomes all-powerful and launches nuclear weapons to destroy all of humanity. For them that future is understandably not desirable. But could something else be possible? Apocalypse aside, the question the AI raises is something we need to ask in foresight processes as well: who gets to decide what the future should be like? When doing foresight, it is good to be aware that the present values of those involved are imposed on the future.
Even a movie aimed mainly at entertaining has a lot of implicit assumptions and mindsets built into it. Some of these mindsets are useful and some might be restricting. What are your mindsets when thinking about futures?