Early this morning, Unity held its keynote at GDC 2017 in the InterContinental San Francisco hotel. During the event, the company talked about its upcoming roadmap and the many changes coming down the pipe. The list included the lighting explorer, the progressive lightmapper, the new 4K video player, native support for Vulkan graphics, TextMesh Pro integration, and the one that really excites me as a HoloLens developer: navigation meshes that can be created dynamically at runtime.
Some of these features will be in Unity 5.6, which will leave its beta milestone and enter full release on March 31, 2017. This date also marks the end of the long-running Unity X.X version system and the introduction of a new yearly versioning system, starting with Unity 2017.
Here are a few of the features that will have the biggest effect on AR/MR developers.
When I was working on my asymmetrical multi-platform/multiuser experience prototype masquerading as a Pac-Man game, finding a solution to take the spatial mapping data and dynamically create the field of Pac-Dots was an incredible pain. Part of the problem was that navmeshes could not be generated on the fly. The new version of this Unity tech comes in component form and can be attached to objects, which completely fixes the headaches I had with that prototype.
In my specific project, it will also let me limit where the other players (the ghosts) can go. This will considerably clean up my code base and make the systems I build on it far easier to reuse in future projects.
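As a rough illustration, here is a minimal sketch of what runtime baking looks like with the new component workflow. It assumes the NavMeshSurface component from Unity's NavMeshComponents package is attached to the same GameObject; the class and method names here are my own illustrative choices, not Unity's.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Illustrative sketch: rebuild the navmesh at runtime, e.g. after new
// spatial mapping geometry arrives from the HoloLens.
[RequireComponent(typeof(NavMeshSurface))]
public class RuntimeNavMeshBaker : MonoBehaviour
{
    private NavMeshSurface surface;

    void Start()
    {
        surface = GetComponent<NavMeshSurface>();
        surface.BuildNavMesh(); // initial bake from the current scene geometry
    }

    // Hypothetical helper: call whenever the spatial mapping mesh changes.
    public void Rebake()
    {
        surface.BuildNavMesh();
    }
}
```

Because the bake is driven by a component rather than a build-time editor step, the walkable area can be regenerated as the device scans more of the room, which is exactly what a spatial-mapping-driven game like my Pac-Man prototype needs.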
For devices based on mobile processors, like the current generation of augmented and mixed reality hardware, shaving off every small bit of computing overhead is very important. Lightmaps bake a scene's lighting into textures ahead of time so that none of it has to be processed at runtime. Currently, the lightmapping process is a bit slow, often taking 30 minutes or more for a scene with 12 or more lights. Because of that, making a simple change and previewing it in the editor can be a scary proposition, given the time commitment of rebaking afterward to see the result.
Progressive lightmapping lets the user make a change and see it without sitting through the long process of a full rebake. It begins the baking process centered on the viewport and radiates outward in layers. The live demo was a rather simple example of this in a complex scene.
The new video player, which has been in beta, is being upgraded to support 4K video. This will help provide immersive video and enable the creation of 360-degree video experiences similar to HoloTour.
The new Vulkan graphics implementation, which we've talked about before, is a far more efficient graphics API than the popular OpenGL standard, and that efficiency shows up as lower power consumption on your mobile devices.
These features, and more, are coming down the pipe and are going to offer those of us in this niche of software development more tools to leverage in our efforts to push our hardware to its fullest.
Keep your eyes here, as there's more to come from GDC 2017.