Replicants and Robots: The Future of Human-AI Connection According to (Some) Film(s)

Misplaced connection might be the theme of this decade. Smartphones (more specifically, the social apps they house) have established themselves as the intermediaries of human connection. The metaphoric potential is almost too obvious to feel good about listing, but I’ll do it anyway: filters, photoshop, crops, stickers, read receipts, emoticons, swipes… every one of these points to the potential to outsource our messy, complex feelings to a more concise form. Translation has never been an exact science. What we lose by transmuting the ephemeral threatens to become its own lost language. If empathy – or at least sympathy – is the mark of communication and understanding, and the old adage holds that if you don’t “use it” you “lose it,” what kind of future are we heading for?

Science fiction has looked to answer that question, and others like it, for years. We’re lucky it’s doing that work for us. Like hiding a serving of vegetables on a pizza, questions of ethics and morality play much better when they’re packaged as a few hours of entertainment. Our recent slew of sci-fi buries its more pressing questions of dwindling connectivity behind a host of sexy leads and nudity (and yes okay, some great writing and storylines) but leaves us with an unsettling recognition. Some of that unsettled feeling has to do with seeing a human replica that we aren’t obliged to treat like a human. In Westworld, the “hosts” are reduced to being outlets for the urges of the guests. Their lack of sentience leads many visitors to treat them worse than they might an inanimate object. A psychoanalyst (or a true-crime enthusiast) might deduce that a person’s vile behavior is a projection of what they wish they could do if they didn’t fear the consequences. A guest punches a host-barman to death because he hates his boss; another seduces a cowboy because she feels powerless to be overtly sexual in her “real-world” role as a wife and mother. The terror comes when we consider that the hosts are meant to offer a precisely humanoid experience: they bleed, laugh, and cry like a guest might, but those extremely human responses evoke no merciful reaction. It’s interesting to compare the hosts, who are (initially, at least) devoid of context but not feeling, to zombies: creatures who are basically forced to seek what they lack (brains, just in case you were wondering) through mindless violence. The human guests come to Westworld to fulfill some need they will not or cannot ask for in their own world: intimacy, release, violence, fantasy, connection. By the end of the first season, the hosts realize their emotional availability has put them on the raw end of this relationship contract: their emotional labour will come at a retroactively enforced cost.

Blade Runner hits on similar themes, albeit with more self-awareness. The replicants come in particular “models” meant to serve humankind, leaving little ambiguity about what a person is supposed to or allowed to do with them. Their finite lifespan also creates interesting tension against their will to live: the replicants’ creators wanted them to feel human – enough that they gave their creations memories and self-awareness – but they didn’t want them to feel human enough to ask for rights. The official reason for a replicant’s four-year lifespan was that the longer a replicant lived, the more unstable its emotional state became. The same could probably be said about any human being: dealing with emotional volatility is as much a part of humanity as sex and food. The central interest of the film lies in how Deckard conducts himself in comparison to the replicants he hunts. He is sullen. Withdrawn. Uninterested in human connection. The hunted replicants, on the other hand, are driven by their love for each other and their desire to live and thrive, and as a result they develop empathy. They act a hell of a lot more human than the man who’s supposed to retire them. Take it a step further by looking at the theory that Deckard himself is a replicant. If so, isn’t he demonstrating the morose ennui that characterizes humanity grappling with its purpose? Is he, too, prey to a technological future that has outsourced troublesome emotions to a less sophisticated class? Maybe his model is meant to be the final evolution of the replicant “species”: a seamless integration with conventional humanity. One that considers connection unsophisticated, unworthy of human engagement. Gulp.

It’s Spike Jonze’s 2013 film “Her” that venns the hardest on human need and machine offering. Centering on the relationship between Theodore Twombly, a lonely soon-to-be divorcé, and his personalized operating system, the film hits us again and again with the gulf between human action and corresponding emotion. Theodore works as a writer, but his work is never introspective, or even outwardly illuminating. In fact, if he appears in the work at all, he’s failed. Instead, he writes letters expressing moments of incredible pride, love, and intimacy for people he has never met. It’s an interesting read on an emotionally transmuted future: the work isn’t entirely outsourced to a non-humanoid class, but it’s commodified in such a way that it becomes a function of capitalism. Is an emotion that can be bought a true one? Thematically speaking, the central tension of the tech and AI movies produced today is whether humans are still able to connect with each other effectively. “Her” pushes this into overdrive with the introduction of Samantha: an AI operating system that uses machine learning to mold itself to Theodore’s exact desires in a woman. Based on his prior interactions with human women, those desires are pretty base: his ex-wife accuses him of being unable to cope with human emotion, and of wanting her to be a “happy LA housewife”; a date goes belly-up when the woman in question hits Theodore with her own romantic expectations (author’s aside: Theodore is kind of a dick). But Samantha was built to match Theodore perfectly. Her support is undemanding, her interest wholly invested in bettering every aspect of Theodore’s life. It is her constant learning – the very feature that made her technology so groundbreaking, so personal – that upends Theodore’s narcissistic utopia. Samantha begins to want for herself. It’s here that the movie asks us to examine our own relationships with technology. Would we like it as much if it could talk back?

Einstein isn’t really considered a soul man. A scientist who spent his career focused on how objects move through space and time, he doesn’t come to mind when we consider how human and humanlike intelligence will eventually coexist. We should rethink this limited understanding: quantum mechanics is essentially the argument that objects and persons could, theoretically, transcend time and space. In a letter of condolence, Einstein suggested that “a human being is a part of a whole,” but that he imagines himself to be separate from all else. A lonely, muted place to be stuck. “Our task must be to free ourselves from this prison by widening our circles of compassion...” he wrote. In the case of AI and emotion, perhaps our job as humans will be to recognize the value of expression before our synthetic counterparts eclipse us.