NR50: Next Reality's 50 People to Watch: Timoni West

May 16, 2017 04:59 PM

If you're a developer in the augmented and mixed reality space, there's a high probability that you're intimately familiar with the 3D application and game engine Unity. In May, at VisionSummit 2017, Microsoft announced that 91% of all HoloLens applications have been made with the software. But there's a section of Unity that you may not be familiar with, which has become very important to augmented, mixed, and virtual reality (known collectively as XR, for "extended reality") — Unity Labs.

Unity Labs is the research and experimentation camp inside Unity, whose immersive Authoring Tools group released EditorVR near the end of 2016. Timoni West, a George Mason University graduate and principal designer at Unity Labs in charge of the Authoring Tools group, is what one might call a UX specialist. The ideas she and her team have for taking Unity from a 3D simulation on a 2D screen into an actual 3D environment will no doubt have a lasting effect on this new type of workflow for years to come.

EditorVR is a great tool for 3D world building in an immersive environment like VR; in fact, it's the very reason I bought an HTC Vive. Seriously. As a former 3D game level designer turned mixed reality developer, my curiosity tends to pull me toward exploring these design metaphors on the less immersive end of the mixed reality spectrum. Can these same ideas work there? Does MR, which tends to be far less static in nature, need a whole new set of rules and ideas to realize its full potential?

West was willing to take some time and talk to me about some of her ideas on UX/UI design, as well as paint a picture of how Unity sees the XR space as a whole.


Timoni West

NextReality: Has your team put much thought into approaches for non-occluded immersive development with Unity? There seems to be a good deal of crossover between the environments.

Timoni West: Yes, we are designing all of our current immersive creation products with the idea that VR will likely gracefully move into AR once the consumer hardware has caught up. We also use VR for prototyping augmented experiences.

We tend to think of this space holistically, to the point we refer to it all as 'XR.'

NR: While it's a much larger problem to address, has any effort been put into ideas for world-scale development?

TW: Yes, a spec has been submitted for mixed reality geocoordinates mapping to virtual space: https://mixedrealitysystem.org
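For a sense of what such a mapping involves, here is a minimal sketch (my own illustration, not taken from the spec linked above) of anchoring virtual-space positions to real-world coordinates using an equirectangular approximation, which is accurate enough at the room-to-city scales mixed reality anchoring typically deals with:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def geo_to_local(lat, lon, anchor_lat, anchor_lon):
    """Convert a (lat, lon) pair into flat east/north offsets in meters
    from an anchor point, using an equirectangular approximation."""
    d_lat = math.radians(lat - anchor_lat)
    d_lon = math.radians(lon - anchor_lon)
    north = d_lat * EARTH_RADIUS_M
    # Longitude lines converge toward the poles, so scale by cos(latitude)
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(anchor_lat))
    return east, north

# One arc-second of latitude is roughly 31 meters north of the anchor
east, north = geo_to_local(1 / 3600, 0.0, 0.0, 0.0)
```

A real spec would also have to handle altitude, heading, and drift correction, but the core idea is the same: pick an anchor, then express everything else as local offsets a game engine can work with.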

NR: As a UX specialist, are there any ideas or approaches that you have seen recently that really sparked your imagination?

TW: EXA VR just came out today, and it's a really fascinating and, I think, forward-thinking way of making music given no physical constraints.

I tend to favor approaches that are thoughtful about how to make the best use of all dimensions, as opposed to giant billboards and laser pointers. Cosmic Trip is a great example of fun in-game VR UI, and My Lil' Donut has some fascinating experimental work that I think will become standard button types through the years.

NR: With the science/art of UX in VR/AR being so new and open, how often do you come across ideas that make you re-think your own approaches to a problem?

TW: By trade, designers are constantly reevaluating how and why they do things, why objects are designed the way they are, and how processes can be improved. XR is fun in that when we consider ideal workflows or best practices, we can do so independently of physics. Anything can float; anything can be any size; anything can follow you around, be moved somewhere else, or easily dismissed. So the answer is — constantly.

NR: Do you have any advice for developers who want to start developing, or who want to improve their UX and UI design sense and skill set?

TW: I always recommend Donald Norman's The Design of Everyday Things; the lessons in it, especially around physical design, are even more important in VR.

My colleague Omer Shapira at NVIDIA Labs has developed a technique he calls 'brown boxing,' in which users close their eyes in VR and act out the action they want to do. They capture and analyze the resulting motion data to get realistic sizes and distances needed for the UI. I think that's brilliant.
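As a rough illustration of the analysis step (my own sketch, not Shapira's actual pipeline): given hand positions sampled while a user acts out an interaction, you can derive the reach envelope and a comfortable interaction distance for placing UI elements:

```python
import math

def motion_to_ui_hints(samples):
    """Derive rough UI placement hints from captured hand positions.

    samples: list of (x, y, z) hand positions in meters, with the
    user's head at the origin, recorded while they act out a gesture.
    """
    xs, ys, zs = zip(*samples)
    # Reach envelope: the bounding box the gesture actually occupied
    envelope = ((min(xs), max(xs)), (min(ys), max(ys)), (min(zs), max(zs)))
    # Comfortable interaction distance: median distance from the head
    dists = sorted(math.dist((0, 0, 0), p) for p in samples)
    median_reach = dists[len(dists) // 2]
    return envelope, median_reach

samples = [(0.10, -0.30, 0.40), (0.20, -0.20, 0.50), (0.00, -0.25, 0.45)]
envelope, reach = motion_to_ui_hints(samples)
```

In practice you'd feed in hundreds of samples per gesture and aggregate across users, but the principle holds: size and place UI from what people actually do, not from guesses.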

NR: Are there any specific pitfalls that developers should avoid?

TW: I've noticed a trend of moving the action, the fun part, far away from the user, especially in mobile VR. Instead, I recommend you put the user front-and-center in the action. XR allows you to go places you normally couldn't, without being destructive or disruptive. When I see experiences that artificially constrain the user's motion or ability to interact with the world, it makes me sad: such a missed opportunity to create a genuinely new experience.

Remember folks — the rules of the real world, where paintings are behind glass and secure areas are roped off, don't need to apply in VR. Let your users interact as much as possible — even though it takes longer to build — and your experience is that much more likely to be wonderful.

A big thank you to Timoni West for taking the time to talk to us about the problems that will soon be facing developers in the XR space. It's good to know that Unity and Unity Labs are on the job. Future UX designers, listen to her advice; I keep a copy of The Design of Everyday Things with me whenever I travel, and it's my go-to reading during most flights.

Cover image via Unity Labs/YouTube

