When I first saw the Star Wars scene featuring two First Order stormtroopers and Captain Phasma, I thought it was film footage. But it was entirely digital animation, made possible by a collaboration between Epic Games, Nvidia, and Disney’s ILMxLAB special effects studio.
Epic CEO Tim Sweeney and chief technology officer Kim Libreri showed the demo to a crowd at the Game Developers Conference today in San Francisco. It was the first public demonstration of real-time ray tracing in Unreal Engine 4, which powers many of the highest-quality video games. The result suggests that game engines like Unreal could be used to make films in the future.
Real-time ray tracing is considered a holy grail for creators of high-end cinematic imagery, signifying a leap forward in the convergence of film and games.
At the event, the three companies presented an experimental cinematic demo using characters from Star Wars: The Force Awakens and Star Wars: The Last Jedi.
The demonstration is powered by Nvidia’s RTX technology for Volta graphics processing units (GPUs), available via Microsoft’s DirectX Ray Tracing API (DXR).
An iPad running Apple’s ARKit augmented reality platform was used as a virtual camera to draw focus to fine details in up-close views.
Epic built the computer-generated scene using assets from Lucasfilm’s Star Wars: The Last Jedi, featuring Captain Phasma, clad in her distinctive armor of salvaged chromium, and two stormtroopers who run into her in an elevator aboard the First Order ship.
In the tech demo, lighting is moved around the scene interactively as ray-traced effects, including shadows and photorealistic reflections, render in real time. Sweeney said this level of image fidelity for highly reflective surfaces and soft shadows has never before been achieved in Unreal Engine.
“Ray tracing is a rendering process typically only associated with high-end offline renderers and hours and hours of computer processing time,” said Sweeney. “Film-quality ray tracing in real time is an Unreal Engine first. This is an exciting new development for the media and entertainment linear content worlds—and any markets that require photorealistic visualization.”
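To illustrate why ray tracing has traditionally been confined to offline renderers, here is a minimal, self-contained sketch of the per-pixel work involved: cast a primary ray from the camera, shade the hit point against a light, and compute a reflection direction. This is a conceptual toy in C++, not the Unreal Engine, DXR, or Nvidia RTX implementation; the names (Vec, hitSphere) and the single-sphere scene are illustrative assumptions.

```cpp
// Toy ray tracer: one sphere, one light, ASCII output.
// Illustrates the per-pixel ray work (visibility, shading, reflection),
// not the actual Unreal Engine / RTX code path.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec norm(Vec a) { return mul(a, 1.0 / std::sqrt(dot(a, a))); }

// Ray-sphere intersection: distance along the ray, or -1 on a miss.
static double hitSphere(Vec origin, Vec dir, Vec center, double radius) {
    Vec oc = sub(origin, center);
    double b = dot(oc, dir);
    double disc = b * b - (dot(oc, oc) - radius * radius);
    if (disc < 0) return -1.0;
    double t = -b - std::sqrt(disc);
    return t > 1e-4 ? t : -1.0;
}

int main() {
    const Vec sphere = {0, 0, -3};
    const double radius = 1.0;
    const Vec light = {2, 2, 0};
    const int W = 48, H = 24;

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Primary ray from the camera (at the origin) through this pixel.
            Vec dir = norm({(x - W / 2) / (double)W, (H / 2 - y) / (double)H, -1});
            double t = hitSphere({0, 0, 0}, dir, sphere, radius);
            if (t < 0) { std::putchar('.'); continue; }

            // Simple diffuse shading toward the light; a full tracer would
            // also fire a shadow ray here to test occlusion.
            Vec p = mul(dir, t);
            Vec n = norm(sub(p, sphere));
            double diffuse = std::fmax(0.0, dot(n, norm(sub(light, p))));

            // Mirror-reflection direction: a full tracer would recurse on it,
            // tracing yet another ray per bounce for reflective surfaces.
            Vec refl = sub(dir, mul(n, 2.0 * dot(dir, n)));
            (void)refl;

            std::putchar(diffuse > 0.5 ? '#' : diffuse > 0.1 ? '+' : '-');
        }
        std::putchar('\n');
    }
    return 0;
}
```

Even this toy fires multiple rays per pixel; a film-quality renderer traces many more for soft shadows, multiple bounces, and antialiasing, which is why ray tracing has historically taken hours per frame offline rather than milliseconds in real time.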
Epic Games worked closely with Nvidia to support the Nvidia RTX technology available through the DXR API. The demo ran on an Nvidia DGX Station.
“Real-time ray tracing has been a dream of the graphics and visualization industry for years. It’s been thrilling to work with the talented teams at Epic and ILMxLAB on this stunning real-time ray tracing demonstration,” said Tony Tamasi, senior vice president of content and technology at Nvidia, in a statement. “With the use of Nvidia RTX technology, Volta GPUs and the new DXR API from Microsoft, the teams have been able to develop something truly amazing that shows that the era of real-time ray tracing is finally here.”