Like almost everyone who writes about virtual reality, I struggle on a daily basis with the knowledge that the new medium defies easy description: it's a bit like trying to explain the appeal of television over the radio. Even "good" VR is more immersive than TV and can't be replicated on a 2D screen, so if I say something has stepped past "good" into "great" territory, the only visual evidence I can offer most readers is a flat screenshot or a video.
Over the past month, the two most impressive things I’ve seen on the Oculus Quest can’t really be captured by screenshots, but they make more of a difference in actually using the VR headset than anything I’ve experienced since it launched three months ago.
So I’m going to tell you about them. And try to show them to you.
The first one arrived this week courtesy of Vertical Robot, a Spanish developer that has boasted, accurately, that it has "achieved console-like graphics on mobile VR" by partially rewriting and locally compiling Unreal Engine's mobile shaders. Without diving deeply into its showcase title, Red Matter, I will say that the company has proved that Quest's graphics hardware can be pushed toward visual realism well beyond that of Oculus' earliest launch titles.
Most importantly, glass effects and surface reflections make Red Matter's environments look realistic as light bounces off them, something that happens naturally all the time as you move. Multiple lights can be used within rooms, and diffracting glass can obscure glowing objects, as seen in the video or image at the top of this page. These improvements can't be meaningfully captured in a single photo, and even 2D videos don't do them full justice, but as you move through a 3D world and see light reacting realistically to shiny and transparent surfaces, you start feeling immersed and stop feeling like you're in a cartoon.
While the obvious initial impact will be upon games, Unreal Engine is now powering all sorts of applications — The Weather Channel has been using it to augment real-world footage with photorealistic digital assets, and other non-gaming developers are using it as well. So you can expect similar tricks to pop up in Quest social apps, shopping apps, and others when developers implement them.
It’s worth underscoring that this isn’t a trivial accomplishment. Trying to squeeze more graphics performance out of a 2017-vintage Snapdragon 835 phone chip isn’t easy at this stage, and doing so while pushing pixels to two separate high-resolution displays is only a little short of magic. Oculus hinted earlier this year that the Quest hardware wasn’t as polygon-constrained as some developers expected, and that optimizations would make a big difference in delivering superior results. That’s turning out to be accurate, and I’m excited to see what’s next.
The second improvement is more subtle: Quest's Guardian and inside-out tracking keep getting better. Guardian is the Quest feature that magically scans the room you're in and lets you define one or more physically safe spaces for head, arm, and possibly leg motion. If you're planning to remain stationary, it creates a tube-like cage around you that stays invisible in-app until you approach the edge, at which point it pops up only as much as necessary to guide you away.
Once again, Guardian can't be done justice in a photo. The cage isn't shared through Quest's screensharing/casting feature, so I could only capture it by snapping an image directly through the headset's lens. It doesn't look like much, but it enables almost everything that makes Quest special. Virtually every person who uses Quest, regardless of technical knowledge, understands and somehow verbalizes how amazing it is that the headset lets you move physically and safely within VR spaces.
The fact that Guardian can remember multiple locations and different cages within locations is astonishing enough, but the tracking of your motions within the space — aided by Quest’s inside-out camera design — is also impressive. Over the past month, it’s evolved thanks to firmware updates that make close-to-body controller tracking more reliable and ease Guardian customization and location memorization. Across multiple apps, I’ve found the controllers more accurate at tracking hand motions than before, a great example of a seemingly small software change that benefits lots of apps at once.
Tweaks to Quest's control scheme and graphics capabilities aren't the sorts of things Oculus or its third-party developers can meaningfully advertise. It's all but impossible to explain to someone who hasn't already used Quest that various half-step software changes are making the hardware much more valuable. So I can tell you that Quest now feels more natural to control, or more realistic when viewing certain apps, but if you don't experience the improvements in person, they might seem trivial or abstract.

Above: Red Matter’s graphics improvements are impossible to see when captured from the Quest’s screencasting feature, even when the image is brightness-adjusted.
Worse yet, Quest's official screencasting feature does justice to neither of these improvements. Images I captured from Red Matter were unusably dark, even after manually bringing the levels up in a photo editing program, and Guardian disappears entirely over the screencasting connection.
That means you'll really have to see Quest for yourself to experience what makes it special. With Oculus Connect 6 coming up next month, I'm hoping to hear that the company has a plan to get more demo units out there, making it easier to try the headset even if a friend or family member doesn't have one. Until that's an option, videos such as Vertical Robot's may be the closest you'll get to seeing what's making this VR experience more compelling by the month.