Tonight’s premiere episode of Westworld season two is littered with bodies, those of artificial intelligence hosts and of dead human beings alike. It’s a reminder of what can go wrong when we put the race for technology ahead of the ethics that govern what we should or shouldn’t do with that technology.
At the close of the last season, the AI hosts of Westworld gain independent sentience and rebel against their masters, slaughtering the humans who subjected them to inhumane treatment and used them as instruments for resort guests to live out their (usually) worst fantasies.
Editor’s note: This story has some season two episode one story spoilers.
Season two starts with more of the aftermath of that AI rebellion. I saw the first episode at a premiere showing in San Francisco last week, and I listened to a discussion with two members of the cast: Shannon Woodward, who plays programmer Elsie Hughes, and Simon Quarterman, who plays park storyteller Lee Sizemore. HBO screened the premiere in Silicon Valley in part because we are so hell-bent on creating the kind of AI that the show rails against.
In Westworld season one, the show depicts humans at their worst, paying large sums of money to go to a theme park filled with robot humans. The humans mistreat the androids as sex dolls, gunplay targets, and otherwise as objects for living out their fantasies. Finally, the AI hosts in the park go off their scripts, or loops, and take matters into their own hands.

Above: Simon Quarterman (left) and Shannon Woodward of the Westworld cast with Kara Swisher of Recode.
In the show, you almost root for the androids, since the humans treat them in such despicable ways. But can we really fault the humans? After all, they’re only playing. This moral ambiguity is what makes Westworld so interesting, and why I’ve been hooked on it from the start even though I don’t always grasp what is going on in a given episode. While Westworld rings an alarmist bell about our future, I like that it is forcing us to think about human-like AI and its consequences before our technologists perfect it and let it loose in the world.
Dolores Abernathy (played by Evan Rachel Wood) leads the AI hosts who are claiming the world back from the irresponsible humans. She wants to create “our world,” meaning one where the AI rule, without human oversight.
“Did you ever stop to ask about your actions, the price you would have to pay if there was a reckoning?” Dolores says in a scene in Sunday’s debut episode in which she has turned the tables on the humans. “The reckoning is here.”
In a Q&A session moderated by Recode’s Kara Swisher last week, Woodward said that, even as someone who grew up with programming skills in real life, she worried about the “breakneck pace” of companies that are creating AI without boundaries. She noted that lawmakers took a long time to figure out how to regulate the internet, and the same is happening now with both social media and AI. She said she is convinced that someone will try to build a theme park just like Westworld because HBO has done such a good job of painting a tantalizing vision of such a world. That environment could encourage mistreatment of androids, and we should really think about some kind of robot bill of rights to protect the robots from unbridled abuse, Woodward said.
There’s a scene in Sunday’s episode that will catch a lot of attention (I won’t spoil it all) where Maeve Millay (played by Thandie Newton) turns the tables on Lee Sizemore, a human character who writes the twisted scripts that allow guests to play out ugly fantasies. Maeve shows Lee just how humiliating it can be to force robots to do someone else’s bidding.

Above: Epic Games’ Siren demo
All of this made me rethink my own attitudes toward virtual humans in video games, which are the manifestation of the best human-like AI technologies that we can create today. If I think of the human-like characters as video game characters, then I have no qualms about mistreating them. I can shoot the bad guys in games like Call of Duty. Or, in games like Grand Theft Auto V, I can shoot everyone from innocent civilians to the people we think of as good guys, like cops. I can drive cars without regard to traffic laws, mowing down pedestrians like a self-driving car that has gone completely rogue.
If we think of all of this as just play, of fulfilling our wishes in a fantasy world, then it is harmless experimentation. We can see what it is to be the bad guy in a safe environment, where we can do no real harm and we can indulge in behaviors that we would never do in real life. Should we shake off all of the rules of society when we enter a world like Westworld, and face no consequences for actions that we’ve always fantasized about?
“It was a chance to tell a frontier story on two levels,” said Lisa Joy, co-creator of the show, in an interview before season one debuted. “On the one level, it’s on the frontier of science — all the more so now, when what was once pure science fiction is much closer to science without fiction, in terms of the development of AI. There’s also the Western landscape. The ability to approach that from a new angle was a playground we couldn’t resist.”
Joy and co-creator Jonathan Nolan had the foresight to see that science and technology are catching up to the point where we can simulate the real thing. And when we cross that threshold, they’re saying, we absolutely have to have rules in place to prevent the abuse of AI and the corruption that it creates, just as allowing slavery fostered the corruption of slave owners.
“If AI is a reflection of who we are, then we’ve got problems,” Quarterman said.
I think most gamers agree that it is OK to fantasize about virtual characters and play in virtual worlds without consequences. After all, gamers vehemently argue that violent games don’t turn us into violent citizens. Yet we all tend to agree that the uglier or more realistic the violence in a video game, the more disturbing it is for us to play. Parents and lawmakers tend to panic about whether this causes young people to become violent, and the game industry has successfully litigated against attempts to curb the sale of violent video games on free speech grounds.
I agree that game publishers have the right to make violent video games, as they are an art form just like books or movies. But I also think that game publishers have the responsibility to think about what they are creating. If you look at Epic Games’ Siren demo in the video above, I think you’ll see that game creators are getting pretty darn close to creating realistic digital humans.
I also think that when the AI, the graphics, and the digital human technology are perfected, we will have to rethink what we are creating and fantasizing about. If you cross the threshold from obviously fake humans to humans who act and look like the real thing, then you are training people in how they might behave in the real world. And that is disturbing. Besides Westworld, some games are tackling this subject matter, like Quantic Dream’s Detroit: Become Human.
The show has its high points when it is making us think and question ourselves, our motivations, and our ideas about play and purpose. It does get quite confusing at times, but I watch it for the great moments when it articulates something that we should think about in Silicon Valley and in video game development. Woodward said that any time someone creates a technology that they understand and we don’t, it gives that creator the ability to misuse that technology against us.
Westworld‘s bloody landscape clearly shows the line that we will unknowingly cross at some point on our path to creating perfect AI. Are we going to create a real simulated theme park like Westworld? Yes, I don’t think that is preventable given the technological arms race. We will cross the technological barriers at some point in the future. Should we regulate the creation of AI? Yes, or we’ll see the script of Westworld played out in the real world. Is the real world going to follow a different path than the show paints in its alarmist way? Yes, I certainly hope so. But right now, I don’t know how we accomplish this.