Delivering an experience that rivals tethered VR on completely standalone headsets is about to become easier for developers: Seurat, a Google pre-rendering tool teased by the company at last year's I/O conference, is going open source. Google describes Seurat as a "scene simplification technology" with particular benefits for mobile VR devices, but the technology could improve performance on more powerful VR hardware, too.
Seurat radically reduces VR's processing demands by pre-computing, from high-quality source assets, every perspective a user could possibly see within a given scene, then discarding all geometry that can never be seen. As Google explains, Seurat "takes RGBD images (color and depth) as input and generates a textured mesh, targeting a configurable number of triangles, texture size, and fill rate to simplify scenes beyond what traditional methods can achieve." The dramatic reduction in polygon complexity lets performance-constrained standalone headsets display 3D scenes that look real and offer six degrees of freedom (6DoF) head tracking without hiccups.
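To make that workflow concrete, here is a minimal Python sketch of a Seurat-style capture step. Google's description only specifies RGBD inputs and a configurable triangle budget, so the headbox sampling, manifest keys, and file names below are illustrative assumptions rather than Seurat's actual input schema.

```python
import itertools
import json

# Illustrative sketch, not Seurat's real format: the tool consumes RGBD
# (color + depth) renders of a scene captured from viewpoints inside a
# "headbox" -- the small volume the user's head is expected to occupy.

HEADBOX_SIZE_M = 1.0   # assumed: a 1 m cube around the seated user
SAMPLES_PER_AXIS = 2   # assumed: capture from the 8 corners of the headbox


def headbox_viewpoints(size=HEADBOX_SIZE_M, n=SAMPLES_PER_AXIS):
    """Enumerate capture positions inside the headbox (its corners for n=2)."""
    ticks = [size * (i / (n - 1) - 0.5) for i in range(n)]
    return list(itertools.product(ticks, repeat=3))


def build_manifest(viewpoints, triangle_budget=307_000):
    """Pair each viewpoint's color and depth renders in a JSON manifest,
    along with the triangle budget the simplifier should target.
    Keys and file names here are hypothetical placeholders."""
    return {
        "triangle_budget": triangle_budget,
        "views": [
            {
                "position": pos,
                "color_image": f"capture_{i:03d}_rgb.exr",
                "depth_image": f"capture_{i:03d}_depth.exr",
            }
            for i, pos in enumerate(viewpoints)
        ],
    }


if __name__ == "__main__":
    manifest = build_manifest(headbox_viewpoints())
    print(json.dumps(manifest, indent=2))
```

The key point is that the triangle budget, not the source scene's complexity, bounds the output mesh, which is what puts film-quality scenes within reach of a mobile GPU.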

Above: Google’s Seurat pre-calculates the user’s perspectives of a given scene
Since last year, Google has been showing off Seurat's ability to bring complex scenes from films to mobile devices, though it kept the software tricks to itself. At I/O 2017, Seurat was demonstrated enabling a mobile headset to display a nearly photorealistic, ILM-developed Star Destroyer hangar scene from Rogue One: A Star Wars Story.
This year, Google announced that Seurat was used in the just-released game Blade Runner: Revelations. The company says Seurat reduced a 46.6 million triangle scene "down to only 307,000, improving performance by more than 100 times with almost no loss in visual quality"; that works out to roughly a 150-to-1 cut in triangle count. While the video clip doesn't show a Seurat before-and-after comparison, the complexity of the scene far exceeds what anyone would normally expect to see on a mobile VR headset.
Google is offering the tool on a new Seurat GitHub page with documentation and source code. The hope is to give developers the ability to "customize the tool for [their] own workflows" and to inspire a new wave of powerful mobile VR experiences.