“Westworld” redefines fantasy

The sci-fi drama examines consciousness through remembrance.

Ehiyoanu.com

Robert Heckert, Writer

Although the TV show is only four episodes in, Lisa Joy and Jonathan Nolan have reimagined Michael Crichton’s book by delving into the ethics and foibles of artificial intelligence. “Westworld” centers on human guests who immerse themselves in an artificial world modeled on a frontier settlement in Utah, surrounded by robots called hosts with individualized histories and personalities. A visitor can interact with the hosts, play by the rules and follow them through their preprogrammed storylines, or forget the rules entirely and do whatever they want to the hosts. For the visitors, there are no physical or judicial repercussions. When the guests leave, the people behind the scenes, the programmers, creative directors and maintenance crew, scurry to prepare for the next day by resetting the stage, cleaning up the carnage and erasing the memories of the robots. The hosts wake up the next day, thinking themselves real and going about their programmed routines.

Unshakeable events

The first episode begins like any other day, except that one of the hosts, Peter Abernathy, begins to recall some of the violence that the visitors did to him and his family. He is brought back to the lab for diagnostics, where the mastermind of the park, Robert Ford, played by Anthony Hopkins, sits down to question him. As Ford tries to determine how Abernathy could recall what had happened, the robot snarls, “These violent delights have violent ends.” Abernathy is easily shut down, but the staff cannot shake his words, because the line he quotes was supposedly erased from his memory. It is even more chilling to them because his claim makes it apparent that the debauchery of the park may not be as harmless as its creators and visitors assumed. They are actually doing violence against conscious beings, beings who are just as human as they are, and the hosts remember it.

Ford advances the idea that remembrance is what makes a person human when he articulates the true draw of the park, saying, “They come back because of the subtleties… They come back because they discover something they imagine no one had ever noticed before. Something they fall in love with… They’re here because they want a glimpse of who they could be.” Even though visitors typically use the park to indulge their wildest fantasies, Ford hopes that they can use it to access something humane in themselves. So not only do the robots become more human because they remember something, but the visitors somehow become more human as well. To understand Ford’s purpose for the park and its effect on the visitors, the philosopher Plato might be a helpful guide.

A lesson in philosophy

In his dialogue Meno, Plato develops the idea of anamnesis: the concept that humans learn virtue and truth about the world not by being taught those things, but by remembering something transcendent about themselves. He writes, “And he will know [virtue] without having been taught but only questioned, and find the knowledge within himself.” Perhaps the thing that Ford wants people to remember through experiencing the park is a virtue which they cannot find outside of it.

In Westworld’s thesis of what it means to be human, Ford, like Plato, knows that virtuous behaviors can easily be forgotten the moment something or someone removes moral judgment from the equation. The park may allow visitors to recall a truer form of virtue and thereby become more fully human. However, that comes at the price of the hosts discovering their own humanity.
