Epic Games’ Tim Sweeney on creating believable digital humans

Epic Games shows off digital human technology with Siren demo.
Image Credit: Epic Games

Above: This is an alien version of actor Andy Serkis.

Image Credit: Epic Games

You’ll see creases and details that you’ve never seen in a digital character that hasn’t been hand-sculpted over months and months for a movie. This is all algorithmically extracted, produced through computer science and mathematics.

The cool thing is that when we present it, it’ll run in real time. It runs on an Nvidia Titan V. We’ll have an iPad that we can use as a virtual camera. Thanks to ARKit, you can use the iPad as a cinematography device. We’re able to walk around and film the virtual Andy to present all the details. What do his teeth look like? What do his eyes look like? All because ARKit gives us camera tracking capabilities that we can feed into a high-end PC to render graphics.
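To make that pipeline concrete, here is a minimal sketch of the PC side of such a setup, assuming the iPad app streams its ARKit-tracked camera pose as JSON over UDP. The port, message format, and the renderer’s set_view_matrix() hook are illustrative assumptions, not Epic’s actual implementation.

```python
import json
import socket

# Minimal sketch of the PC side of an ARKit virtual-camera link, assuming
# the iPad app streams its tracked camera pose as JSON over UDP. The port,
# message format, and renderer.set_view_matrix() hook are all assumptions.

UDP_PORT = 9999  # hypothetical port the iPad app streams to

def listen_for_poses(renderer):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", UDP_PORT))
    while True:
        packet, _addr = sock.recvfrom(4096)
        msg = json.loads(packet)
        # ARKit tracks the device with six degrees of freedom; assume the
        # pose arrives as a flat list of 16 floats (a 4x4 transform).
        pose = msg["cameraTransform"]
        # Treat the tracked device pose as the engine's view matrix, so
        # walking around with the iPad films the digital character.
        renderer.set_view_matrix(pose)
```

The division of labor is the point: the iPad does the hard part, six-degree-of-freedom tracking, and the PC only has to treat each incoming pose as its view matrix.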

Another nice thing is, because of the way 3Lateral builds its animation rigs, they’re standardized. You can take that performance and apply it to a different character. Here’s the same performance, with no tweaking or intervention. It’s just moving the numbers between two characters, and you get something different.
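The retargeting Libreri describes can be illustrated with a toy sketch: if every character exposes the same named rig controls, a captured performance is just a stream of numbers that any of them can play back. The Character class, control names, and character names below are illustrative, not 3Lateral’s actual rig format.

```python
# Toy sketch of retargeting across standardized rigs: a captured frame of
# control values drives any character that exposes the same control names.

class Character:
    def __init__(self, name):
        self.name = name
        self.controls = {}  # control name -> current value

    def apply_frame(self, frame):
        # Because every rig shares the same control set, a captured frame
        # drives any character without per-character tweaking.
        self.controls.update(frame)

# One captured frame of a facial performance: just numbers per control.
performance_frame = {"jawOpen": 0.42, "browInnerUp": 0.18, "mouthSmileLeft": 0.07}

andy = Character("Andy")
alien = Character("Alien")  # a second character; names are illustrative
for character in (andy, alien):
    character.apply_frame(performance_frame)  # same numbers, different look
```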

GamesBeat: Are we done with digital humans now? Or do you think it’s not perfect yet?

Libreri: That system is one where you sit an actor inside a capture volume. That’s not really practical. An actor has to be given space to move around and express themselves. The next phase for us will be taking that, adapting some machine learning technology, and using a simple head-mounted camera to get the same fidelity, backed by a big database of shapes like that. But from a rendering perspective, if you take this and add the raytracing stuff we’ve just done, we’re this close to crossing the uncanny valley.

Sweeney: The interesting question is what that does for everybody across all of these industries. Game developers can now have digital humans that are much more realistic and can be captured more economically. A big part of this is cost. A triple-A game might spend a quarter million dollars per actor to capture their data. This greatly economizes that process. What could it do for TV and movies? It’ll greatly accelerate the movement of all of those pipelines to real time, which is already underway.

Libreri: TV shows are being made on our engine. There’s a kids’ TV show, Zafari, where they make an episode a week, all done in real time with Unreal Engine. There’s going to be a real-time revolution across all industries, not just gaming.

For games, the challenge now is that we’re getting to a fidelity where it looks super real. We have to start thinking about the brain. If it’s a game character, it has to behave like a real human as well. It’s one thing to have a performance recorded. We have to think about how you add logic and reaction capabilities that don’t feel like just a state machine, which is the way games do it now. There’s still tons of research to do.
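For readers unfamiliar with the status quo Libreri is pointing at, here is a toy version of the state-machine approach games typically use for character behavior today; the states and triggers are made up for illustration.

```python
# Toy version of the state-machine behavior Libreri describes as the
# status quo; the states and triggers here are made up for illustration.

TRANSITIONS = {
    ("idle", "player_nearby"): "greet",
    ("greet", "player_speaks"): "converse",
    ("converse", "player_leaves"): "idle",
}

class NPC:
    def __init__(self):
        self.state = "idle"

    def react(self, event):
        # Fixed lookup: the same event in the same state always gives the
        # same response, with no memory beyond the current state.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

npc = NPC()
assert npc.react("player_nearby") == "greet"
assert npc.react("player_speaks") == "converse"
```

The table makes the limitation obvious: nothing in it reacts differently the tenth time you approach the character, which is exactly the gap Libreri says still needs research.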

Above: Epic Games Siren demo

Image Credit: Epic Games

GamesBeat: For the real-time facial animation, are you guys working on that technology, or is it mainly your partners?

Libreri: It’s our partners, but we work very closely with them. On all these collaborations, we meet every day on a video conference to look at the latest results. Cubic Motion provides the real-time capture technology for Siren, the girl in the red dress. We actively help drive them toward better solutions. We like the idea that lots of people are working in this field.

There’s even capability now in the iPhone, where you can record a piece of facial animation and drive it through Unreal Engine. Some time later in the year we’ll probably release something that allows people to do that. If you want to do an emote in a game, just shoot yourself with your iPhone and you’ve got it. We have facial animation rigs for the Fortnite characters. It wouldn’t be that hard for us to hook up iPhone X face recording to the game. We have to prioritize what customers want, but I’d love to do that. To be able to record your own funny emotes, that would be awesome.
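The hookup Libreri is describing is plausible because ARKit on the iPhone X exposes a few dozen named blendshape coefficients per frame; names like jawOpen and browInnerUp are real ARKit identifiers. A sketch of replaying a recording onto a character rig might look like the following, where the mapping table and the apply_morph_target() hook are hypothetical, not Epic’s API.

```python
# Sketch of replaying an iPhone X face recording onto a game character.
# ARKit really does report named blendshape coefficients per frame (e.g.
# "jawOpen", "browInnerUp"), but the rig-side names and the
# apply_morph_target() hook below are hypothetical.

ARKIT_TO_RIG = {
    "jawOpen": "JawOpen",
    "mouthSmileLeft": "Smile_L",
    "mouthSmileRight": "Smile_R",
    "browInnerUp": "Brows_Up",
}

def play_emote(character, recorded_frames):
    """Replay a recorded face capture, frame by frame, as an emote."""
    for frame in recorded_frames:  # one dict of coefficients per frame
        for arkit_name, weight in frame.items():
            target = ARKIT_TO_RIG.get(arkit_name)
            if target is not None:
                # ARKit coefficients are already normalized to 0..1.
                character.apply_morph_target(target, weight)
```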

GamesBeat: When do you think we’ll see games that start using this technology?

Libreri: The super high fidelity digital humans, probably not this year. More likely the year after. But I can’t really say more than that.

GamesBeat: Does something like OctaneRender fit in with what Nvidia is doing?

Sweeney: OctaneRender is a good-quality GPU-accelerated raytracer, but it’s not built for interactive media. It has that progressive quality, where the image refines over a few frames or a few seconds. It’s a bit of a different use case. I’ve heard that Jules has gotten it integrated into Unreal Engine for enterprise customers, but I just don’t think it’s practical for fully interactive experiences. It’s pretty cool, though.
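The progressive refinement Sweeney mentions is typically a running average of noisy sample passes. A minimal sketch of the idea, not OctaneRender’s actual code, looks like this:

```python
import numpy as np

# Sketch of progressive refinement: a path tracer keeps averaging noisy
# sample passes into an accumulation buffer, so the image sharpens over
# frames. render_sample_pass() stands in for one pass of a real renderer.

def progressive_render(render_sample_pass, width, height, max_passes):
    accum = np.zeros((height, width, 3), dtype=np.float64)
    for n in range(1, max_passes + 1):
        sample = render_sample_pass()  # one noisy (height, width, 3) estimate
        # Incremental mean: after n passes, accum averages all samples so far.
        accum += (sample - accum) / n
        yield accum  # each intermediate image is displayable but noisy
```

That accumulation is also why the approach suits enterprise and offline work better than fully interactive play: move the camera and the averaged buffer is invalidated, so the image drops back to noise until enough new passes accumulate.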

GamesBeat: Is this ahead of what Unity is doing right now? How is that competition going, from your perspective?

Sweeney: We don’t see a lot of overlap among customers, other than some flow of Unity developers who are moving to Unreal to develop higher-end games. They remain the engine of choice for indie and mobile developers, especially those building simpler games. We remain the engine of choice for high-end PC and console.

The trend of mobile moving toward high-end games for gamers is going to be interesting for the engine, in that you’ll see a lot of Unreal adoption. A lot of the leading titles pushing high-end mobile in Korea, and now in China, North America, and Europe, are Unreal-powered. There’s ARK. There’s PUBG Mobile, which just came out. Fortnite is leading the way. In Korea there are a lot of amazing console-quality mobile games powered by Unreal.

We can always look at what’s happening in Korea as a leading indicator of the market. Free-to-play was big there for several years before it came here. High-end mobile was big there starting two years ago, and now it’s taking hold here. The future is becoming more about the high end, and therefore it’s a more Unreal-powered future across all platforms. We think that’s a great thing. The more we can make it possible to play all of your games across all platforms, taking your player and your progress and your inventory and anything you’ve paid for with you, the more it enables players to connect with all of their friends.

At that point gaming becomes a phenomenon that’s almost like a social network, with gamers connecting and having shared experiences. It’s not divided up by platforms so much as just groups of real-world friends. We’ve been happy to be able to work with Sony and Microsoft to have the first game that honors everyone’s purchases across iOS, Android, PC, Mac, and the console platforms. Of the 36 combinations of platforms that could theoretically play together, 35 are supported right now.