Epic CEO on 250 million Fortnite players, digital humans, and $100 million dev fund

Tim Sweeney, CEO of Epic Games, and Kim Libreri, CTO.
Image Credit: Dean Takahashi

Epic Games dumped a bunch of news about the state of the Unreal Engine at its keynote talk at the Game Developers Conference today. We got a preview of the news and were able to interview Tim Sweeney, CEO of Epic, and Kim Libreri, chief technology officer.

Sweeney said Fortnite has reached 250 million players, and that the popular battle royale game hasn’t seen a drop off because of rival EA’s popular Apex Legends. He said the largesse from Fortnite will enable Epic Games to set up a new $100 million fund for Unreal Mega Grants for game developers.

Sweeney also talked about his views on openness and open source technology. He praised Microsoft for its open approach on HoloLens 2 augmented reality technology, and he ultimately believes high-quality AR glasses will replace all of our screens, like smartphones, TVs, and movie screens.

Libreri and developers from Goodbye Kansas showed how much progress the company has made with digital humans — ultra-realistic human faces rendered with real-time ray tracing. They created the believable human imagery of the short film Troll.

We discussed these topics and more ahead of the Epic talk at GDC 2019.

Here’s an edited transcript of our interview.

Above: Troll has some amazing digital human imagery.

Image Credit: Epic Games/Goodbye Kansas

GamesBeat: Can you talk about what’s going on in your keynote?

Tim Sweeney: We have some tech demos that Kim will deliver on physics and digital humans. We’ll be talking about Epic’s online services that we’re opening up to all developers on all platforms and all stores for free. And we’ll be talking about the state of the Epic Games Store. That’s the big news. We’re very close to hitting 250 million Fortnite players. Since Apex Legends came out, we’ve gained an Apex Legends worth of Fortnite players, which is amazing.

GamesBeat: Some people thought that Apex Legends might slow Fortnite down. I guess not.

Sweeney: Fortnite has hit a non-event peak twice since Apex came out. We haven’t seen any visible cut into Fortnite. It’s a funny thing. The only game whose peaks you can see cutting into Fortnite playtime is FIFA. It’s another game for everybody, wildly popular around the world.

What Apex Legends has done is re-energized a lot of shooter players, people who come in and out of shooters depending on what’s popular. It’s awesome to see other games picking up on battle royale, adding their unique spin to it and advancing the state of the industry.

Kim Libreri: The first thing we’re demoing — do you know who Quixel is? They’re a photogrammetry company that makes a lot of assets, a big asset library of rocks and trees and plants and natural phenomena. If you’ve played any triple-A game over the last year or so and you’re going through a big open world environment, chances are that stuff came from Quixel. They contacted us before Christmas saying, “Hey, we have a movie we’re working on, and we’d love for you to open up with it at your GDC keynote.”

So we’re opening up with this movie that was made by three artists using their assets. It looks super photorealistic, and it was made in Unreal Engine 4.21, the current version, as opposed to the to-be-released version coming out next week, 4.22. It looks great. That represents the state of the art of rasterization. This is the Unreal Engine before we did a large overhaul of the graphics systems in the engine.

We’ve now made the engine capable of ray tracing out of the box. We’re no longer talking about prototype or demonstration stuff. We now have an engine that will ray trace. What we’re trying to explain to people is that ray tracing is not just about shiny bottles or glass surfaces or doing a bit of refraction. It’s about subtlety. It brings a quality of lighting that you’ve really only seen in animated movies or live-action photography.

Above: Troll takes advantage of real-time ray tracing.

Image Credit: Epic Games/Goodbye Kansas

To show that off, instead of making a demo ourselves, we wanted to work with a customer that would get early access to 4.22 and make a short movie. It’s based on an old Swedish fairy tale by an author called John Bauer. It’s about a princess and fairies and a troll. They’ve made a little teaser, about one and a half minutes, a cinematic piece, all rendered in real time in front of the audience on an Nvidia RTX 2080 Ti. It’s a movie, so it’s 24 frames per second. It looks awesome.

The star of the short is a digital recreation of Alicia Vikander, the Tomb Raider actress, who was also in Ex Machina a few years ago — an amazing Swedish actress. She’s an amazing digital human, built by 3Lateral. But the real story is a very gentle, beautiful fairy tale with amazing-looking lighting.

We’ll go over the features around ray tracing, and how adding ray tracing forced us to rework the engine. It still has to be backward compatible for existing customers — we can’t change everything to a point where they can’t load up their old scenes. But as a side effect the engine became faster, so rendering in Unreal Engine is now faster than it’s ever been. We’ll go through features we’ve added to deal with cross-platform performance, from Android all the way to high-end workstations, and then we have a gameplay demo that shows off something we felt was a long time coming to the games business.

We’re in the danger, as an industry, of entering into an uncanny valley not around digital humans, but around interactions. You can walk around an environment built in Unreal Engine that looks pretty photographic. With raytracing now it looks very photographic. But if you pick something up or try to fracture a wall or knock down a door, you get met with very simple rigid-body physics, relatively simple collisions — physics is still in a simple state in the game industry.

We wanted to have a go and see if we could learn some lessons from the movie industry over the last 10 years. Can we do large-scale simulation and collision and destruction and particles that bring to the world something like the spectacle of a Michael Bay movie, but in a video game? We made a gameplay demo that shows killer destruction and cool explosions — all the stuff that answers the question, what would it be like to really blow up an environment? That’s the physics demo.

We call it Chaos, the new physics system. It’s all running on an Intel Core i9. Intel was quite instrumental in helping us make sure things were running as fast as they possibly could. It’s mind-blowing. Shit blows up all over the place. You’ll be able to play that demo at our booth. We’ll have two setups that people can go in on, massive screens where you can blow up this robot city.

Magic Leap will also be bringing their Mica demo to our booth, so you’ll be able to interact with a digital human. Vlad from 3Lateral is going to talk a bit about our plans going forward in terms of wanting to ubiquitize the creation of digital humans, so it’s not a super-hard, expensive endeavor. That’s what it is right now for most developers. It’s out of reach. Most small studios cannot put quality digital humans in their games. It’s just too difficult. We’re going to talk a bit about that.