Epic Games’ Tim Sweeney on creating believable digital humans

Epic Games shows off digital human technology with Siren demo.
Image Credit: Epic Games

Epic Games stunned everyone a couple of years ago with the realistic digital human character Senua, from the video game Hellblade. And today, the maker of the Unreal Engine game tools showed another astounding demo, dubbed Siren, with even more realistic graphics.

CEO Tim Sweeney said technologies for creating digital humans — from partners such as Cubic Motion and 3Lateral — are racing ahead to the point where we won’t be able to tell the real from the artificial in video games and other real-time content.

I caught up with Sweeney and Kim Libreri, chief technology officer at Epic Games, in an interview during a preview session this week at the GDC. We talked about digital humans, Epic’s demos, and the success of its Fortnite battle royale game.

Here’s an edited transcript of our interview.

Above: Kim Libreri (left, CTO) and Tim Sweeney (CEO) of Epic Games.

Image Credit: Dean Takahashi

GamesBeat: I heard a rumor that you will rebrand Epic Games as Fortnite Games, and you will rename the Unreal Engine as the Fortnite Engine. (Laughs). What would you say about Fortnite’s success?

Tim Sweeney: No spreadsheet could have predicted this, right? It’s becoming a social hub. It’s awesome. It’s part of an exciting trend in the industry right now. Mobile gaming was dominated by highly casual games for a long time. That started turning in Korea, where, beginning two years ago, the market began to be dominated by serious games for gamers, both in revenue and in play time. We had some Unreal Engine 4 games.

Now that’s happening on mobile in North America and Europe. Fortnite is one of the games leading the way. It isn’t a game that was just eventually ported to mobile. It’s the same game across iOS, Android, PC, Mac, PlayStation, Xbox. You can play together across most of these platform families. Everything works in a single unified game.

Above: Epic Games’ Siren demo

Image Credit: Epic Games

GamesBeat: I still have to give the mobile version a try.

Sweeney: It’s cool. Ark is on mobile. PUBG mobile just came out. You have this broad set of games that are becoming universal game experiences that run everywhere. That’s going to be a trend shaping the game industry for the next few years. At the end of this year, the number one category on mobile will be serious games for gamers – number one in play time, number one in revenue.

That’ll ultimately be a much larger segment than casual games, especially the ones that are ad-supported or have really greedy monetization. We’ll see the advent of generous games with traditional, console-like business models. It’s awesome.

GamesBeat: With Fortnite, do you notice that women are a big part of the audience?

Sweeney: Yeah, there’s much higher adoption than in other hardcore games. That’s one of the great things about games as social experiences. You play with all your friends across social groups. You see young girls as well as young boys playing. These are kids in school, people in offices, in pubs, all having fun together. We played it on the plane flying over here. Because of its visual style, Fortnite is widely acceptable to just about everyone. It’s open to a much wider audience than a realistic military-style simulation.

GamesBeat: Did you guys hit No. 1 on iOS yet? I saw No. 2 this morning.

Sweeney: We’re No. 1 in the U.S., despite launching with just an invite-only event.

GamesBeat: I don’t think that’s been done before, PC and console and mobile–

Sweeney: Yeah, and it’s the same game experience everywhere. About a month ago we unveiled 60fps across all the console platforms. That was just the result of the optimization work we were doing to get Fortnite running on iOS and Android. We made the rendering and the level of detail support so efficient that the mobile effort benefited console. All these things we’re doing with Fortnite are going straight into the engine to benefit all of our licensees, too. They’re getting a bigger set of improvements there, more sweeping than ever before.

GamesBeat: What’s Wednesday going to be all about?

Sweeney: Let’s show you the Fortnite replay stuff. We’ve been building this Unreal Engine replay system and proving it out through Fortnite. Games have become a social phenomenon. The number one Fortnite streamer, Ninja, had a big rapper join him in a game session and they had 600,000 simultaneous viewers on Twitch. Far bigger than any Twitch stream before. That’s more viewers than a lot of TV shows.

Kim Libreri: We wanted to give everyone tools. All these YouTubers that make awesome videos of our games, or any game running on Unreal Engine, we wanted to give them better tools.

Above: Fortnite in action on iOS.

Image Credit: Epic Games

As you probably know, we had a replay system in Paragon that was pretty cool, but not as fully featured as we wanted. For Fortnite, we really wanted to push it and do something amazing. We contacted one of the more famous YouTubers out there that loves Fortnite, and this is what resulted. We got together with him, played a few matches, and then invited him to go back into the games. You can rewind them, restart them, place cameras, follow the action. We came up with a piece of awesome machinima in almost no time.

Also, you can think about the applications for live viewing of tournaments. The fact that you can go and follow the action, frame the coolest moments, go from player to player—you’ll be able to do, in the game, what only ESPN can do for real sports. There’s great potential for this.

The other big thing for us: you may have seen the Microsoft announcements about their new ray tracing capabilities in DirectX, DXR. We’ve partnered with Nvidia, who have the new RTX ray tracing system, and we thought about how to show the world what a game could look like in the future once ray tracing is added to the core capabilities of a PC, or maybe even a console one day. We teamed up with Nvidia and our friends at Lucasfilm’s ILMxLAB to make a short film that demonstrates the core capabilities of ray tracing in Unreal Engine. It’s an experimental piece, but it shows the kind of features we’ll add to the engine over the next year or so.

Above: Epic Games’ Star Wars demo shows off real-time ray tracing.

Image Credit: Epic Games

We’ve added support for what we call textured area lights, which is the same way we would light movies. You can see multiple reflections. You can see on the character, when she’s carrying her gun, the reflection of the back of the gun in her chest plate. It’s running on an Nvidia DGX-1, which is a four-GPU graphics computer they make. But as you know, hardware gets better every year. Hopefully one day there’s a machine that can do this for gamers as well as high-end professionals. It’s beginning to blur the line between what a movie looks like and what a game can look like. We think there’s an exciting time ahead.
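To make the textured-area-light idea above concrete: a ray tracer can treat the light as an emissive rectangle whose radiance comes from a texture, and estimate its contribution by sampling points on that rectangle. The sketch below is purely illustrative and is not Epic’s implementation; the Lambertian surface, the uniform sampling, and every parameter name are assumptions.

```python
import numpy as np

def sample_textured_area_light(point, normal, albedo,
                               light_origin, light_u, light_v,
                               emission_texture, num_samples=64, rng=None):
    """Monte Carlo estimate of direct diffuse lighting from a textured
    rectangular area light (illustrative sketch, not a production renderer)."""
    if rng is None:
        rng = np.random.default_rng()
    light_normal = np.cross(light_u, light_v)
    area = np.linalg.norm(light_normal)
    light_normal = light_normal / area
    h, w, _ = emission_texture.shape

    total = np.zeros(3)
    for _ in range(num_samples):
        # Pick a random point on the light and read its emitted radiance
        # from the texture -- this is what makes the light "textured".
        u, v = rng.random(), rng.random()
        light_point = light_origin + u * light_u + v * light_v
        emitted = emission_texture[int(v * (h - 1)), int(u * (w - 1))]

        to_light = light_point - point
        dist2 = np.dot(to_light, to_light)
        direction = to_light / np.sqrt(dist2)

        # Geometry term: cosine at the receiving surface and at the light,
        # falling off with squared distance.
        cos_surface = max(np.dot(normal, direction), 0.0)
        cos_light = max(np.dot(light_normal, -direction), 0.0)
        # A real ray tracer would also trace a shadow ray here to check
        # visibility between `point` and `light_point`.
        total += emitted * (albedo / np.pi) * cos_surface * cos_light / dist2

    # Average the samples; uniform area sampling leaves a factor of the light's area.
    return total * area / num_samples
```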

One thing we’ve been interested in over the years is digital humans. Two years ago we showed Senua, the Hellblade character. To this day, that’s pretty much state of the art. But we wanted to see if we could get closer to crossing the uncanny valley. She was great, but you could see that the facial animation wasn’t quite there. The details in the skin and the hair—it was still a fair way from crossing the uncanny valley.

We got together again with Cubic Motion and 3Lateral, the facial capture and rigging specialists, and with Vicon, and also our friends at Tencent Next, the research lab at Tencent, to see if we could do something amazing. We put together this project to build a digital human for a presentation in China with Tencent.

This is the actress who was our subject – we made a digital clone of her – and she’s the one we’re going to use at the Vicon booth to drive the character’s digital body. She’s from Manchester, close to Cubic Motion. She’s able to drive this completely different character, all live in real time. You’ll be able to go by the Vicon booth and see her – Alexis is her name – ask her questions, film her with a camera, and it’ll be a completely different person talking to you. This all runs at 60fps on an Nvidia GPU – I think we’re using a 1080 Ti.

Above: Hellblade: Senua’s Sacrifice is a chilling tale of madness and faith.

Image Credit: Ninja Theory

GamesBeat: How would you say the new character is an improvement on Senua from Hellblade, shown in a demo a couple of years ago?

Libreri: There’s more detail in terms of the resolution of the face. Women actually have a tiny bit of peach fuzz, little hairs all over their faces, which was impossible to render two years ago, but now the GPUs are fast enough that we can do that. There are 300,000 tiny hairs on her ears, on her nose. We also improved the shading technology. The skin shading now supports two layers of specular reflection. It also supports back scatter. If you put a light behind her ear, it glows red like it would in the real world.
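The back scatter Libreri mentions (light behind the ear glowing through the skin) is commonly approximated in real-time skin shaders with a cheap thickness-based transmittance term. The sketch below follows that general pattern; the parameter names, constants, and reddish tint are illustrative assumptions, not Unreal Engine’s actual shading code.

```python
import numpy as np

def saturate(x):
    return float(np.clip(x, 0.0, 1.0))

def skin_back_scatter(light_dir, view_dir, normal, thickness, light_color,
                      subsurface_tint=np.array([1.0, 0.25, 0.15]),
                      distortion=0.3, power=4.0, scale=1.0):
    """Cheap translucency ("back scatter") approximation: light arriving from
    behind the surface leaks through thin regions such as ears and nostrils.

    All direction vectors are unit length and point away from the shaded point;
    `thickness` runs from 0 (paper-thin) to 1 (fully opaque)."""
    # Bend the back-facing light direction toward the surface normal so the
    # glow wraps plausibly around thin features.
    back_dir = -(light_dir + normal * distortion)
    back_dir = back_dir / np.linalg.norm(back_dir)

    # Strongest when the viewer looks toward the light through the surface,
    # and attenuated where the geometry is thick.
    view_term = saturate(np.dot(view_dir, back_dir)) ** power * scale
    transmittance = view_term * (1.0 - thickness)

    # Tint toward red, mimicking light filtered through skin and blood.
    return light_color * subsurface_tint * transmittance
```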

We added another thing called screen space irradiance, which is the ability to bounce light off her face into her eye sockets. It’s surprising. It seems like a subtle thing, but it makes a big difference as far as believing what’s happening in the eyes. It’s a pretty significant improvement compared to where we got with Senua. We’re happy with the results. It’s still lacking some detail in terms of how the flesh moves.
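As a rough picture of what a screen space irradiance pass does, the sketch below gathers already-lit diffuse color from nearby pixels and adds a scaled fraction of it back as bounce light (for example, from lit cheeks and brows into the eye sockets). It is a loose illustration under assumed parameters, not how Unreal Engine implements the feature.

```python
import numpy as np

def screen_space_bounce(lit_diffuse, px, py, radius=8, bounce_strength=0.25):
    """Illustrative screen-space irradiance gather: approximate one bounce of
    light into a pixel (e.g. inside an eye socket) by averaging the already-lit
    diffuse buffer in a small neighborhood and scaling it down.

    `lit_diffuse` is an (H, W, 3) array of lit surface colors."""
    h, w, _ = lit_diffuse.shape
    total = np.zeros(3)
    weight_sum = 0.0
    for y in range(max(0, py - radius), min(h, py + radius + 1)):
        for x in range(max(0, px - radius), min(w, px + radius + 1)):
            dx, dy = x - px, y - py
            d2 = dx * dx + dy * dy
            if d2 > radius * radius or (x == px and y == py):
                continue
            # Nearby lit skin contributes more; weight falls off with distance.
            weight = 1.0 / (1.0 + d2)
            total += lit_diffuse[y, x] * weight
            weight_sum += weight
    if weight_sum == 0.0:
        return np.zeros(3)
    # Added on top of the pixel's own direct shading.
    return bounce_strength * total / weight_sum
```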

Above: Andy Serkis gets fully digitized.

Image Credit: Epic Games

Vladimir, our partner at 3Lateral, has been developing a new face scanner that allows us to not only capture key shapes for different expressions, but to get every single frame of nuance out of her performance. We wanted a test subject that had a lot of dynamic range in their acting ability, and we invited Andy Serkis, of Gollum fame, to go to 3Lateral five weeks ago and get scanned in this new device. Then we were able to get all that data into our engine. We’ll show you that as well.

The thing to bear in mind in this demo is that there is no hand-authored animation, no keyframing. No human animator worked on this. It’s all algorithmically extracted – you saw the gray image with the sparkly representation of Andy’s face. What 3Lateral are able to do is take the key components of Andy’s digital face and fit them to what’s happening in that 4D capture, extracting the animation data from there. If there’s a mismatch and it doesn’t quite hit the pose, they’re able to work out what’s missing and feed that back into the animation system to get even more detail.
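The fitting step described above (taking the key components of the digital face, fitting them to each frame of the 4D capture, and examining the leftover mismatch) maps naturally onto a per-frame least-squares solve over a blendshape basis. The sketch below is a hypothetical illustration assuming a simple linear blendshape model; 3Lateral’s actual rig and solver are certainly more sophisticated.

```python
import numpy as np

def fit_frame_to_capture(neutral, blendshapes, captured_frame):
    """Fit blendshape weights so a rigged face matches one frame of a 4D capture
    (illustrative linear model, not 3Lateral's solver).

    neutral:        (V, 3) rest-pose vertex positions
    blendshapes:    (K, V, 3) per-shape vertex offsets from the neutral pose
    captured_frame: (V, 3) scanned vertex positions for this frame
    Returns the solved weights and the per-vertex residual (the "mismatch")."""
    k, v, _ = blendshapes.shape
    # Each blendshape's offsets become one column of the basis matrix, so that
    # captured - neutral is approximately basis @ weights.
    basis = blendshapes.reshape(k, v * 3).T          # (3V, K)
    target = (captured_frame - neutral).reshape(-1)  # (3V,)

    # Least-squares solve for the weights that best reproduce this frame.
    weights, *_ = np.linalg.lstsq(basis, target, rcond=None)

    # The residual is the detail the rig cannot hit yet; in the workflow above,
    # that mismatch is what gets fed back to refine the animation system.
    residual = (target - basis @ weights).reshape(v, 3)
    return weights, residual

# Tiny synthetic usage example (random shapes, not real capture data):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    neutral = rng.standard_normal((100, 3))
    blendshapes = rng.standard_normal((12, 100, 3))
    frame = neutral + 0.5 * blendshapes[3] + 0.2 * blendshapes[7]
    weights, residual = fit_frame_to_capture(neutral, blendshapes, frame)
    print(np.round(weights, 2), float(np.abs(residual).max()))
```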