Aleissia Laidacker is one of the biggest evangelists for the mixed reality experiences and games that are being built on futuristic “spatial computing” platforms, or those that take physical space into account, like the Magic Leap One Creator Edition.
As a technologist, she has also been working on the kind of tools that Magic Leap wants developers to use to build their augmented reality experiences. I went to Magic Leap’s recent event in Los Angeles, where I saw cool demos of AR and mixed reality technology such as Mica, a digital human; Dr. Grordbort’s Invaders, a zany game from Weta Workshop; and Angry Birds FPS.
Laidacker has 20 years of experience in product design, artificial intelligence, and game development. She worked on titles such as Assassin’s Creed while a programmer at Ubisoft. She is the interaction director at Magic Leap, which now has the tough challenge of convincing developers to support a platform with a $2,295 AR headset. Laidacker is also a big fan of immersive theater and escape rooms, and of how AR could expand the experience of fun across both the physical and digital worlds.
She believes that VR developers are itching to move beyond VR into mixed reality (MR).
“In reality, what a lot of those developers want to do is bring the interaction and the digital side of things to the physical world, because their intention is really–they want to interact with people,” she said. “They want to interact with the environment. Especially on the location-based side, we’re starting to see developers flock to the MR side. That’s the dream of what they wanted from the beginning, but they got to do a lot of their R&D on the VR side because that platform existed first.”
I caught up with her at Greenlight Insight’s Virtual Reality Strategy event in San Francisco.
Here’s an edited transcript of our interview.

Above: Magic Leap One will make possible “spatial computing.”
GamesBeat: How did you soak in all the feedback from the event you guys had? What did you learn from the people who went?
Aleissia Laidacker: A lot of people were surprised by the scale, the amount of content and information we shared. Also, considering that Magic Leap has a reputation for withholding information–at the conference, not only did we have the keynote, where almost every partner was highlighted and we showed a lot of future work, but there were 20 sessions at an hour each. We really dove into the technical side, how things work in the backend, the future tech that’s coming online. We were very transparent with a lot of those things.
Some of the feedback I got personally, especially from developer-creator friends, is that if you compare this to GDC or VRDC, the types of creators that were attending were very diverse. Magic Leap is working with the entertainment industry, but also the medical field and lifestyle. People had a good time comparing stories and tips and tricks with each other. It wasn’t just the typical video game community.
GamesBeat: What is your role in a lot of this? In your talks you’ve been going into how to think about making mixed reality games. Is that your job, or do you have other responsibilities?
Laidacker: I always say it’s twofold. Part of my job is on the creative design side. Since I’ve started, my role is to work with a lot of our partners to think about, what is mixed reality? What type of experiences can they build on the platform? But because my background is as a technology engineering lead–my team within Magic Leap is primarily a group of engineers and designers. We work on a lot of the developer-facing tools, samples, and toolkits. After all the feedback we’ve gotten in terms of how we can enable creators, a lot of what’s been shared and developed is what my team has worked on.
GamesBeat: You’ve worked on escape rooms, I understand? Physical games, in-person games. I can see how that becomes enhanced by augmented reality.
Laidacker: Oh, for sure. I’ve done escape room design and immersive experience design, just on the personal side. I’m really passionate about immersive theater. By working in that space, especially with creators–there are so many limitations there. They’re constrained by what they can do in the physical world. That’s why we’re getting a lot of creators like Meow Wolf and even ILM xLAB coming in, people who are looking at location-based experiences. They see how being able to blend the digital with the physical can enable their creativity and art. They’re able to do so much more in a physical space.

Above: Mica is a digital human demo for the Magic Leap One.
GamesBeat: Have you also come to figure out what kind of games might work more as VR in location-based entertainment, as opposed to AR? Or mixed reality, as you call it.
Laidacker: For VR, yes, I always like to say that VR to me is 100 percent escapism. If you want to bring someone specifically into a different digital world, with no importance to the real world around them, then yes, VR is the medium a developer should be using. But so many VR experiences were done on the VR platform because that’s just the first type of hardware that existed.
In reality, what a lot of those developers want to do is bring the interaction and the digital side of things to the physical world, because their intention is really–they want to interact with people. They want to interact with the environment. Especially on the location-based side, we’re starting to see developers flock to the MR side. That’s the dream of what they wanted from the beginning, but they got to do a lot of their R&D on the VR side because that platform existed first.
GamesBeat: You wind up getting the Star Wars: Secrets of the Empire experience in VR. But I could see very interesting things like Dr. Grordbort’s Invaders in a location-based setting with all the cool things they designed there for the room.
Laidacker: The physical world, especially if you take something like Disney or Universal–those people are experts at set design. Set design is so important for world-building and storytelling. A lot of the work we’ve done with Magic Leap is to show the possibility space of interacting with physical worlds, not just with digital content.
For example, one of our developers gave a talk at LeapCon where he talked about experimental inputs. He was showing something where we weren’t using any digital content at all. It was just a bunch of clocks on a wall. People would be able to put up their hands and control the speed of time. People were saying, “You can do that in mixed reality?” It’s using inputs from mixed reality, but communicating with Arduinos and Raspberry Pis.
In another example we had smart lights with a bunch of LEDs reacting to the inputs. We were able to change the color and emit particles from the physical lights. Every time I show this demo to someone, people actually lift up their glasses. They think the whole thing is fake. “Are those real lights?!” Yeah, they’re smart lights. It’s another example of combining the physical with the digital.
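The pattern behind both demos is the same: read an input signal on the headset side, then forward a command to a microcontroller driving the physical props. A minimal sketch of that bridge in Python, where the hand-openness value, the color blend, and the command format are all invented for illustration (in the real demo the string would go out over a serial port or socket to the Arduino or Raspberry Pi):

```python
def hand_to_command(openness: float) -> str:
    """Map a normalized hand-openness value (0.0 = closed fist,
    1.0 = fully open) to a hypothetical LED command string that a
    microcontroller sketch could parse, e.g. "RGB 255 0 0"."""
    if not 0.0 <= openness <= 1.0:
        raise ValueError("openness must be in [0, 1]")
    # Blend from red (closed fist) to green (open hand); blue stays off.
    red = round(255 * (1.0 - openness))
    green = round(255 * openness)
    return f"RGB {red} {green} 0"

print(hand_to_command(0.0))  # closed fist -> "RGB 255 0 0"
print(hand_to_command(1.0))  # open hand  -> "RGB 0 255 0"
```

The headset only supplies the input; the smart lights do the rest, which is why viewers lifting their glasses still see the effect.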
GamesBeat: The ILM folks showed the Porgs with the smart speakers.
Laidacker: My team worked on that. I was very proud of that demo.
GamesBeat: It seems like there’s a lot of cross-pollination or linkage going on between these MR experiences and escape rooms and things like that. It seems like that could be the best way to get your tech in front of a lot of people more quickly.
Laidacker: It’s a way to make it accessible, at least in the beginning, so that more and more people can try out the technology and see those possibilities, based on what mixed reality can be. But on the flip side we’re seeing a lot of developers come in and ask, “How can I use mixed reality to better people’s lives? How can I use it in the medical space?”
Or even on the entertainment side, I think a lot of the game devs who’ve come to this platform–there’s a lot more focus on interaction with digital characters and creating these intelligent characters that have agency and that adapt to the real world around you. You build this relationship, which is something you don’t necessarily see in traditional video games. Usually you have your weapon and you go out and do a bunch of fighting. The interaction space with mixed reality, because it’s in your home and feels much more personal, is making designers think about experiences in a more humanistic way.
GamesBeat: I was especially impressed by the quality of Dr. Grordbort’s Invaders, and then the Mica demo. Mica made more sense in the context of the digital assistants talk they did on stage.
Laidacker: What I liked when they described it is they said it’s not necessarily an assistant. When you think of it as an assistant, that’s a character who’s kind of there to serve you. We’re looking at it more as a companion. What I loved with Mica is she had agency. She was not there just to serve you in the experience. If anything, you really want to see how to bond with her and interact with her, but she has full agency.
On the flip side, what I love with Mica — and we stressed this with digital characters in general — is focusing on the awareness and the reactions. Are you making eye contact? If someone else walks into the room and I start having a conversation, the magic is when the digital character is aware of these environmental changes and reacting to them.
I always say that the new uncanny valley for mixed reality is not going to be around visual fidelity. We’ve solved that. If anything, people were blown away by the volumetric capture of Mica. But it’s really on that agency and AI reaction side. That’s where we really want developers to focus. Which is refreshing, because when you compare to game characters, game characters are kind of zombies. They only react when something happens.

Above: Star Wars Project Porg comes to Magic Leap One.
GamesBeat: Do you think it’s more believable when it’s AR as opposed to VR? VR scenes, to me–you get things like eyes looking at you. But you also often get the sense that people are too close to you. It feels like they’re crossing into your space. Whereas with Mica, if she’s in your house, sitting on your furniture, and she’s occluded properly, it almost feels more like a real person.
Laidacker: She’s grounded in reality. That’s the bit that makes things more believable. You mentioned something interesting. She’s in your house. It’s very different from–let’s say I watch a character on television. I’m watching them in their world. But say I had that actor or digital character come to my home. The way they interact with me and my space should be very different, because they’re a guest.
One thing I love with Mica is this wasn’t just programmed by a bunch of AI developers. We brought in a behavioral scientist, someone who’s on the human instinct and behavior side, to design how Mica should be developed. That’s informing a lot of the tools we want to expose as far as how digital characters should behave in the real world.
GamesBeat: In the presentation, we got to see all these dimensions of the companion. The dragon sitting on your shoulder, or the Grordbort characters flying around with you. But if you think of something that’s more for games, do other ideas come to mind? How would you use Mica in a game?
Laidacker: What ILM was starting to do with the Porgs is a good example of that. It’s characters, but it’s these mischievous–I’m here to cause trouble and have fun with you in your environment, that type of thing. I know what they’re looking at, thinking about–there’s these characters who have AI and awareness, but then what are all the different interactive games we can play together?
It’s almost like me coming home to play with my pet, play with my kids, something like that. They’re designing around the kind of games you would play with a pet. Whereas with Mica, I would think about what I’m going to do with a friend when they come over. Maybe we’d play a tabletop game. Maybe we’ll watch something and interact with each other in different ways. All the design themes we’re trying to get developers to think about–if this was the real world, if this was a human, if this was a pet or something like that, what would you do? That informs a lot of the interaction elements in play.
GamesBeat: Do you actually want to see something like an escape room in AR?
Laidacker: I would love to see escape rooms in AR. We’ve had a few escape room developers come in and talk to us. This was before the release. One of the reasons is, they always talk about how in an escape room, you have to have someone watching on a camera the whole time. They come on a microphone saying, “That is not interactive. Please do not try to take that puzzle off the wall.” When people get stuck, someone has to be watching and saying, “Move on! This is your clue.”
All of these things are solved in video games. It’s all solved by giving haptic feedback, digital feedback, clues of different kinds. I know that for the escape room side, a lot of designers would love to be able to incorporate mixed reality, mostly just to help guide their users. It takes away that need for a physical piece.
GamesBeat: Hey, what’s that glowing red thing on the wall there?
Laidacker: Exactly! It takes away a lot of the physical build-out and resetting at the end of each stage. It also opens up the space toward what we do in video games, where games aren’t necessarily played out just one time, in one way. Especially systemic games, which you can play in many different ways. It opens up a large space for escape room designers to think about the replayability, where people would want to come back and interact with the digital elements again in different ways.

Above: Image from Magic Leap’s Tonandi app.
GamesBeat: I’ve talked to some people who were wondering if Magic Leap will spend a lot of money on developers to get things going, to commission whatever projects, the same way Facebook has poured so much money into Oculus content. Have you figured out what kind of content strategy you might have in the context of that sort of thing, to get people to your platform?
Laidacker: There was the indie fund that was announced last week. That was something that came out of what myself and a few others have been saying for the last year or so. All these other companies have amazing funding to get indie devs building on their platform. To me, indies are the ones who really focus on innovation. Triple-A is where they can iterate on that innovation and bring it to the masses. That’s why it was so important, and I’m happy that we’ve announced it. We’ll be giving hardware, funding, marketing, engineering and design support, the whole caboodle.
At the same time, we want to be mindful with the creators that will get chosen. If I think about the last two years, our early access partner pool is very diverse as far as types of content. We want to continue to spread our content across the different types of initiatives we’re doing. Hopefully we’ll reach out to indies across a number of different spectrums.
GamesBeat: The other thing developers seem to ask for–what’s the road map to the consumer market?
Laidacker: During the keynote we saw, at a very high level, what’s coming. When I’ve talked about this with other developers, because I worked on the session side of things–there was a lot of information shared during the con. Some of it might have fallen through the cracks. For example, what was talked about in the keynote and one of our really large features is persistence and our passable world technology.
What that’s starting to enable for developers is, one, thinking about how content can persist over multiple sessions. Create just came out with an update last week, so now, when people play Create and restart a session, they can build on what they made before and have content that persists in the world around them. But what that technology also enables, and what’s coming online with the passable world tech, is being able to have shared experiences. That’s very important for developers.
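At its core, session-to-session persistence of this kind amounts to serializing each piece of content against a stable spatial-anchor identifier and restoring it on the next launch. A rough sketch, where the anchor IDs and content schema are invented for illustration (this is not the actual Magic Leap persistence API):

```python
import json

def save_session(content: dict, path: str) -> None:
    """Persist content keyed by a stable spatial-anchor ID."""
    with open(path, "w") as f:
        json.dump(content, f)

def restore_session(path: str) -> dict:
    """Reload content. Entries whose anchors the platform
    re-recognizes would be re-placed at the same real-world spot."""
    with open(path) as f:
        return json.load(f)

# Hypothetical anchors and objects built in a Create-like app.
scene = {
    "anchor-kitchen-wall": {"type": "drawing", "color": "blue"},
    "anchor-coffee-table": {"type": "knight", "scale": 0.5},
}
save_session(scene, "session.json")
assert restore_session("session.json") == scene
```

Shared experiences extend the same idea: two headsets that resolve the same anchor IDs can place the same content in the same physical spot.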

Above: Magic Leap’s CEO Rony Abovitz; John Gaeta, Magic Leap’s head of creative strategy; and Neal Stephenson, Magic Leap’s chief futurist, on stage at L.E.A.P. on October 10, 2018.
Whenever we talk about mixed reality, we always talk about how the real magic of why people do MR is to experience it in the real world, but also with people around them. Everything coming online with passable world–I don’t want to mix anything up as far as what quarter it’s coming out. But it’s enabling everything with passable world and persistence. All of this is also working to inform our object recognition technology as well, which I know is something that’s extremely important for developers.
Why it’s important is because object recognition starts to provide more semantic recognition about the real world around us. There are short-term things we’re working on that weren’t in the keynote, but were announced in the sessions. For example, there’s Environment Toolkit. That’s the first step of what will lie on top of object recognition. It’s starting to provide semantic info about–instead of just saying, “This is a plane and this is mesh,” it’s actually saying, “This is a table. These are seating locations. Here are hiding locations. Here’s a corner of a room.”
Instead of focusing on placement of content, developers can use that contextual information to create content that feels way more intelligent and alive. That’s a stepping stone toward the object recognition that’s coming online after, which was talked about in the road map. Voice commands, natural language processing, that’s something we also talked about in the experimental inputs talk, where they dove into the use space of what we’re doing with that. We’re doing a lot of experimentation and debugging on our side, and that’s an API that will go online as well.
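The difference between raw geometry and semantic labels can be made concrete with a small placement helper. The surface schema and label names below are invented for illustration, not the actual Environment Toolkit API:

```python
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class Surface:
    label: str       # e.g. "table", "seat", "floor", "wall"
    position: tuple  # (x, y, z) center, in meters
    area: float      # usable area, in square meters

def pick_placement(surfaces: List[Surface], label: str,
                   min_area: float = 0.0) -> Optional[Surface]:
    """Choose the largest surface carrying the requested semantic
    label, instead of dropping content on an arbitrary plane."""
    candidates = [s for s in surfaces
                  if s.label == label and s.area >= min_area]
    return max(candidates, key=lambda s: s.area, default=None)

room = [
    Surface("floor", (0, 0, 0), 12.0),
    Surface("table", (1.2, 0.7, 0.4), 0.8),
    Surface("seat",  (2.0, 0.5, 1.0), 0.3),
]
# A board game wants a table, not just "a horizontal plane."
spot = pick_placement(room, "table", min_area=0.5)
print(spot.position)  # -> (1.2, 0.7, 0.4)
```

With labels like “seat” or “hiding location” in hand, a character can sit down or duck behind furniture rather than merely avoid colliding with mesh.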
Right now there’s a lot of work. I know it was not all talked about in depth in the keynote. But within the sessions, pretty much everything that was mentioned on that road map, we did a one-hour deep dive on exactly what’s going to be coming for developers.
GamesBeat: Oculus now has this low-middle-high set of developer targets. They have Rift, Quest, and Go, and even the phones. Do you have a feel yet for what your high is? There’s a middle and a low in AR, these hundred-dollar AR experiences or smartphone experiences, but you guys aren’t doing that. If someone comes to you with an idea, do you have a sense as to whether that’s a Magic Leap idea, as opposed to someone else’s AR platform?
Laidacker: It’s hard to say as far as comparing different types of AR content. Definitely, on the VR side–there’s a difference between the VR and MR sides of things. With the different AR platforms today it depends on the type of content you want to create. Do you need to be hands-free? Do you want to have something where you can walk around the world and be able to touch things and pick up things? If that’s the case, an HMD is the way you want to go.
If that isn’t as important, or having as much depth information about the environment around you, yes, there are more lightweight AR capabilities. But I know that for us, for Magic Leap, we really want to be pushing the full spectrum of having the environmental information, having access to your hands, focusing on the HMD side of things. But as far as different types of Magic Leap platforms, that I can’t comment on.

Above: Magic Leap One will make possible “spatial computing.”
GamesBeat: Is this a real 6DOF experience now, do you think, or is it some kind of subset?
Laidacker: As far as the hand interactions–this is feedback we see from developers, and even internally. It’s the beginning days. Right now, yes, we have Totem, which is full 6DOF, and it does enable really novel interactions. One example that I really love is using the controller to fly objects through the air. That’s much more novel and intuitive because of the full 6DOF. With gestures, I think what people are really interested in is not necessarily just the eight exact gestures we’re supporting today. But developers are excited that we have keypoint tracking. Developers can use that keypoint tracking for whatever is specific to their application.
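Keypoint tracking makes that kind of application-specific gesture straightforward: instead of waiting for a built-in gesture event, a developer can derive their own from raw hand keypoints. A minimal pinch detector, where the (x, y, z) coordinate convention and the 2 cm threshold are illustrative assumptions rather than platform specifics:

```python
import math

def is_pinch(thumb_tip: tuple, index_tip: tuple,
             threshold_m: float = 0.02) -> bool:
    """Treat thumb tip and index fingertip keypoints closer than
    ~2 cm as a pinch. Positions are assumed (x, y, z) in meters."""
    return math.dist(thumb_tip, index_tip) < threshold_m

# Fingertips 1 cm apart -> pinch detected.
print(is_pinch((0.10, 0.20, 0.30), (0.11, 0.20, 0.30)))  # True
# Fingertips 10 cm apart -> no pinch.
print(is_pinch((0.10, 0.20, 0.30), (0.20, 0.20, 0.30)))  # False
```

The same distance-and-threshold approach extends to any custom gesture a specific application needs, which is the flexibility she describes developers being excited about.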
In terms of where we know we need to go in the future–we want to be able to do gestures that are outside of that tracking zone, using gestures that are natural, instead of having my hands in front. Though we are providing developers with a lot of best practices about how to do more natural things within the field of view. Haptic feedback is also a big one that we’re looking into for future forms of technology.
There are definitely things where we’ve worked with–how do we trick people, almost, into making them think that interacting with something digital has that feedback? Thinking about grounding your digital content to something that’s physical. Maybe having a digital button on a table. That feels much more tactile when you press it, compared to poking into the void and wondering if you really interacted with something. We’re doing research there, at least in terms of sharing best practices. But we know that haptic feedback is something we really want to focus on for the future.
GamesBeat: After the convention, when you’re going around to visit people now, do you have a very different agenda?
Laidacker: We saw a bit about this in the keynote, this focus on the Magicverse and what that is. Our focus is twofold. We still want to work a lot with our developers to see what features and different APIs and technologies we need to provide to them to create compelling experiences for indoors, for living room use, for offices and all of that. It comes down to that semantic information. But at the same time, we’re also starting to see–what does it mean to have these experiences in the real world?
When I think of it from a use case perspective–think about being at home. What do I like to do? I like to relax. I like to play with my kids. I like to cook. That’s where a lot of those experiences are centered, around the types of interactions I do in my home. But when I think about going out into the real world, the use cases are very different. I’m often just trying to get from point A to point B. I’m shopping for groceries. I’m hailing a cab. I’m flying somewhere.
We’re starting to look at those use cases, and still taking that same philosophy of how we got Magic Leap One out the door. Use cases inform features, which inform systems, which inform technologies. Especially for me, coming from a video game background, where my whole focus was building open-world AI systems, it makes me excited to see the similarities in the technologies that are going to transcend how we bring experiences to the real world around us. That’s why so much of the work we’re doing now with passable world, persistence, and shared experiences is key to how we’ll be able to bring those experiences to the real world around us.