Adobe’s annual Max conference is one of the year’s largest gatherings of creative professionals — people whose artistic, photographic, and graphic design skills improve the images we see and the materials we read every day. At this week’s event, Adobe is taking steps to embrace augmented reality as a new opportunity for creators, though the technology is fighting for attention in a quiet corner of the Los Angeles Convention Center’s exhibition area.
Ahead of the event, Adobe signaled that its biggest AR news would be about Aero, an iPad app designed to ease the process of developing AR experiences for novice creators. Demo stations at the event walked visitors through using the app to identify a real-world space’s surfaces and place a 3D model at the correct size and location, seamlessly blending it with the environment so it could be viewed from multiple angles on the iPad as if it were real.
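For the curious, the core of that placement step is a simple piece of geometry: intersecting a ray from the camera with a detected surface to find a world-space anchor for the model. Aero’s actual implementation isn’t public, so the sketch below is purely illustrative, with every name and value assumed:

```python
# Illustrative sketch of an AR "hit test": converting a tap into a
# world-space anchor by intersecting the camera ray with a detected
# plane. Not Aero's implementation -- all names here are assumptions.
import numpy as np

def hit_test(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect a camera ray with a detected surface plane.

    ray_origin: camera position in world space, shape (3,)
    ray_dir: normalized direction of the tapped screen ray, shape (3,)
    plane_point: any point on the detected surface, shape (3,)
    plane_normal: the surface's unit normal, shape (3,)
    Returns the world-space anchor point, or None if the ray misses.
    """
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-6:   # ray is parallel to the plane
        return None
    t = np.dot(plane_normal, plane_point - ray_origin) / denom
    if t < 0:               # intersection is behind the camera
        return None
    return ray_origin + t * ray_dir

# Example: a ray cast from eye height toward a floor plane at y = 0.
anchor = hit_test(
    ray_origin=np.array([0.0, 1.6, 0.0]),     # camera at eye height
    ray_dir=np.array([0.0, -0.707, -0.707]),  # looking down and forward
    plane_point=np.array([0.0, 0.0, 0.0]),
    plane_normal=np.array([0.0, 1.0, 0.0]),
)
print(anchor)  # -> [0, 0, -1.6], where the 3D model would be anchored
```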
In the primary demo of Aero, users saw the most basic type of AR experience — one we’ve now seen in handheld form for years, thanks largely to games like Pokémon Go — though Adobe’s premise was to show creators how they could bring self-developed 3D assets into mixed reality for marketing purposes. One gets the sense that Aero will help brands create more apps like the Jack Daniel’s AR Experience I tested earlier this year, which turned any of the company’s whiskey bottles into a pop-out storybook using animated 3D assets and voiceovers.
Thankfully, Adobe didn’t stick to completely dry demonstrations. Though you have to seek it out in a far corner of the LACC, the company is also using Aero for a demonstration called the Terminator Max Experience, drawing on assets from the just-released film Terminator: Dark Fate. Attendees are given a series of cues to follow in acting out a scene with a virtual Terminator, which emerges menacingly from an energy bubble on a rubble-filled set.
Acting as both the camera and the 3D compositing engine for the experience, the iPad running Aero films the scene as digital assets are merged with the actual people and objects on the set. While nothing at the booth explains how an end user would employ Aero or an Aero app this way, the implication is that creatives could build cinematic-class 3D experiences for users, then deploy them either at specific locations or in downloadable apps for marketing purposes.
Adobe had a computer hooked up to the iPad running Aero to record the Terminator Max Experience videos, and promised to email them to attendees. I’m still waiting for my video, but you can see a sample clip of how the experience looked here.
The corner devoted to AR was actually a shared space for both 3D and AR — in that order — with most of the demos focused on a variety of mostly already-announced 3D developments that touched on AR only lightly. Nvidia, for instance, was showing off RTX Studio graphics technology running on a laptop, illustrating how even portable machines can now raytrace realistic (if not photorealistic) reflections and shadows in 3D scenes.
When will RTX-caliber raytracing make its way into mobile devices? Because of the rendering technology’s high processing demands, Nvidia said the only currently viable solutions would be either to use GeForce Now to stream fully server-rendered scenes to mobile screens, or to render the scene on the mobile device while a server streams the raytraced reflections for compositing, an approach that might be hard to sync in real time due to latency.
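To make that sync problem concrete, here’s a rough sketch of what the second approach implies for a client: buffer incoming reflection passes, match each locally rendered frame to the nearest pass by timestamp, and fall back to the un-augmented frame when the network lags. The 50 ms skew budget and the additive blend are my assumptions for illustration, not details Nvidia provided:

```python
# Hypothetical client-side compositor for hybrid rendering: the device
# renders the base scene; a server streams raytraced reflection passes.
# The skew budget and blend mode are assumptions, not Nvidia's design.
import numpy as np

class ReflectionCompositor:
    def __init__(self, max_skew_ms=50):
        self.max_skew_ms = max_skew_ms
        self.server_passes = {}  # timestamp_ms -> reflection pass (H, W, 3)

    def on_server_pass(self, timestamp_ms, reflection_pass):
        """Buffer a raytraced reflection pass as it arrives from the server."""
        self.server_passes[timestamp_ms] = reflection_pass

    def composite(self, local_frame, frame_time_ms):
        """Blend the nearest-in-time server pass over the locally rendered
        frame; fall back to the plain frame if the network is too far behind."""
        if not self.server_passes:
            return local_frame
        nearest = min(self.server_passes, key=lambda t: abs(t - frame_time_ms))
        if abs(nearest - frame_time_ms) > self.max_skew_ms:
            return local_frame  # latency exceeded budget: skip reflections
        reflections = self.server_passes.pop(nearest)
        return np.clip(local_frame + reflections, 0.0, 1.0)  # additive blend

# Example: a local frame paired with a pass that arrived 30 ms earlier.
comp = ReflectionCompositor()
comp.on_server_pass(1000, np.full((720, 1280, 3), 0.1))
frame = comp.composite(np.full((720, 1280, 3), 0.5), frame_time_ms=1030)
```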
Building on its Z by HP lineup for creative professionals, HP was on hand to show off Project Captis, a pre-production 3D material scanning tool that uses photometry to transform real textures and objects into digital ones. Not yet priced and launching as a pilot for creatives to test, the hardware and software combination places multiple 4K cameras inside a large box, capturing objects from multiple angles in high detail.
The resulting scans can be used to apply one material’s textures to another, such as giving a pair of 3D-printed sunglasses a leather-like appearance, or a pair of sneakers a complex mesh that looks woven. As with many of the other demos in the 3D & AR area, there’s obvious potential for this scanning technology to create ultra-realistic assets for future AR applications.
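HP hasn’t detailed Captis’ pipeline, but the textbook technique for this kind of capture is photometric stereo: photograph a surface under several known light directions and solve for per-pixel surface normals and albedo. A minimal sketch, assuming calibrated lights and a matte (Lambertian) surface:

```python
# Classic Lambertian photometric stereo -- the textbook version of
# photometric material capture, not HP's actual (unpublished) pipeline.
import numpy as np

def photometric_stereo(images, light_dirs):
    """Recover per-pixel normals and albedo from multiple lit shots.

    images: (k, H, W) grayscale captures, one per light direction, k >= 3
    light_dirs: (k, 3) unit vectors pointing toward each light
    Returns normals (H, W, 3) and albedo (H, W).
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)  # stack pixels: (k, H*W)
    # Lambertian model: I = L @ g per pixel, where g = albedo * normal.
    # Solve the least-squares system for all pixels at once.
    g, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # (3, H*W)
    albedo = np.linalg.norm(g, axis=0)                   # (H*W,)
    normals = (g / np.maximum(albedo, 1e-8)).T           # (H*W, 3), unit length
    return normals.reshape(h, w, 3), albedo.reshape(h, w)
```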
One of the most interesting announcements at Max — but one that wasn’t in the 3D & AR area — is Adobe Photoshop Camera, now being offered to a limited number of people in beta form. The photography app will be completely free to iOS and Android users, Adobe says, without requiring a Creative Cloud subscription or in-app purchases. Users will be able to select from a large number of artist-developed Photoshop filters that can be applied to camera input in real time, adding still or animated imagery as they shoot.
What’s impressive about Photoshop Camera is how the filters actually work. Adobe uses cloud-based Sensei AI to scan the camera’s input in real time for certain types of content — skies, faces, and food, to name a few — then selectively applies still or animated filters to the AI-identified objects. In the images above, the AI identifies a landscape, transforming it from day into night (complete with a moving moon) or adding a Japanese-style rising sun overlay. Another example showed the app automatically warming and adjusting the contrast of a food photo. You can see the app in action here.
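Conceptually, that pipeline is segmentation followed by masked filtering. The sketch below substitutes a crude color threshold for Sensei’s cloud-based recognition (Adobe’s actual models and filters are proprietary) and applies a toy day-to-night grade only to the pixels flagged as sky:

```python
# Toy illustration of selective filtering: segment a region, then
# filter only the masked pixels. The mask and grade are stand-ins,
# not Adobe's Sensei models or Photoshop Camera filters.
import numpy as np

def sky_mask(rgb):
    """Crude stand-in for semantic segmentation: treat bright,
    blue-dominant pixels as sky. rgb is (H, W, 3), floats in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (b > 0.5) & (b > r + 0.1) & (b > g + 0.05)

def night_grade(rgb):
    """Toy day-to-night filter: darken and shift toward blue."""
    return np.clip(rgb * np.array([0.25, 0.3, 0.55]), 0.0, 1.0)

def filter_sky_only(rgb):
    """Apply the grade only where the mask says 'sky'. A real pipeline
    would feather the mask edges to hide the seam."""
    mask = sky_mask(rgb)[..., None]  # (H, W, 1) for broadcasting
    return np.where(mask, night_grade(rgb), rgb)

# Example on a synthetic frame: blue sky on top, green field below.
frame = np.zeros((480, 640, 3))
frame[:240] = [0.4, 0.6, 0.9]   # sky gets the night grade
frame[240:] = [0.2, 0.5, 0.2]   # foreground is left untouched
out = filter_sky_only(frame)
```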
I’m not going to tell you that the world needs yet another photo filtering app, or that Photoshop Camera will somehow supplant the integrated camera apps on Android or iOS devices. But watching the technology work, there’s clearly something compelling about using AI and machine learning to augment photography in real time. The underlying recognition technology is powerful, and in the hands of the right artists, the results could be striking.
One thing that was conspicuously missing from Max — at least, I couldn’t find it when I looked around — was any wearable AR hardware. Though there were a few VR headsets scattered around the exhibition floor, almost everything on display was being shown off on computer or tablet screens, suggesting that Adobe and related vendors are still awaiting viable AR headsets. Consequently, the “3D & AR Village Theatre” (above) was strictly a 2D affair.
Adobe Max is a fairly large show, and there’s no question that AR is neither the primary nor a major secondary focus of the event — Aero and related initiatives had only small parts in the extended keynote and show flow. That said, it’s clear that creatives are beginning to prepare for a near-term future where mixed reality is a bigger part of everyday life, spanning both consumer and professional apps. The challenge at this stage is to find ways for AR to move past small demos and gimmicks into practical applications that consumers actually want to use. I suspect that will begin to happen in 2020, likely fueled by a combination of affordable hardware and greater developer interest.