In its push to bring the visual quality of real-time rendering to a new level, Unity is starting the new year off right with a sneak peek at its upcoming rendering improvements: a short, three-minute first-person interactive demo called "Book of the Dead."
Utilizing a new feature called the "Scriptable Render Pipeline," which was introduced to the Unity beta program in the recently released Unity 2018.1, the video preview (below) shows a collection of complex and organic scenes. The sheer complexity of the demo should be enough for any developer to stop and take notice — not just Unity developers.
Long before it reached critical mass in popularity, Unity was considered a comparatively easy platform for new game developers to start on versus other tools. And, historically, the common assumption was that once a developer reached a high enough level of sophistication, they would move on to something like the Unreal Engine. The primary reason for that assumption was that while Unity was accessible to developers of every level, its rendering quality, particle systems, and other features did not match those of its competitors.
In an effort not only to hold onto the developers who start with Unity but also to draw in veterans, the Unity team has, over the last few years, made a major push to update the platform's rendering pipeline. That work is best represented in the demo video, which reaches near photorealism at points and closes the gap between Unity and its competitors.
Achieving the feelings a forest evokes is one of the hardest challenges when working within real-time graphics. We set out to tackle this with 'Book of the Dead' to see how much a small team like ours could accomplish working within Unity ... We use new features when they're introduced early in their development cycle, and we partner closely with our tech teams to push new features to their limits, so that the innovations we uncover together will make a better product for our developers. When a demo is done, we know that the engine has achieved new functionality; and we are excited to see developers use this tech for their next creations.
One could say that photorealism was, at one time, the ultimate goal of all rendering engines. And while it is easily accomplished with 3D rendering software packages like Maya or 3ds Max (both of which can eat up hours rendering a single frame), "real-time" photorealism, which requires a minimum of 24 frames per second, has always seemed just out of reach.
But with the increase in computing power in recent years, real-time rendering of simple scenes became a milestone reached by the Unreal Engine. Those scenes are typically set in interior rooms and, to minimize the polygon count of the 3D meshes, usually feature many square or boxy objects. Against that usual approach, the organic nature and complexity of the real-time scenes in "Book of the Dead" are mind-blowing and a major step forward in the quality race (maybe even a step ahead of everyone).
Improvements in rendering quality aside, a few other major features being introduced in the Unity 2018.1 beta include the C# Job System and a visual shader programming system called Shader Graph. What do those features mean for developers? I'll explain.
C# Job System
This is a highly optimized system for safely spreading game code across multiple CPU cores, allowing thousands of instances of an object to be rendered with little effect on the user's frame rate. The feature was demonstrated at Unite 2017 in a scene featuring more than 100,000 objects, all while maintaining more than 30 frames per second (an amazing feat).
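To give a feel for the pattern, here is a minimal sketch in Unity C# of a parallel job that updates many object positions at once. The struct name, array size, and batch count are illustrative; `IJobParallelFor`, `NativeArray`, and `JobHandle` come from the new Unity.Jobs and Unity.Collections APIs, and this fragment assumes it runs inside a Unity 2018.1 project.

```csharp
using Unity.Collections;
using Unity.Jobs;
using UnityEngine;

public class JobSystemSketch : MonoBehaviour
{
    // A job the engine can split across worker threads.
    struct MoveJob : IJobParallelFor
    {
        [ReadOnly] public NativeArray<Vector3> velocity;
        public NativeArray<Vector3> position;
        public float deltaTime;

        // Called once per index, potentially on different threads.
        public void Execute(int i)
        {
            position[i] = position[i] + velocity[i] * deltaTime;
        }
    }

    void Update()
    {
        var position = new NativeArray<Vector3>(100000, Allocator.TempJob);
        var velocity = new NativeArray<Vector3>(100000, Allocator.TempJob);

        var job = new MoveJob
        {
            position = position,
            velocity = velocity,
            deltaTime = Time.deltaTime
        };

        // Schedule the work in batches of 64 indices, then wait for it.
        JobHandle handle = job.Schedule(position.Length, 64);
        handle.Complete();

        position.Dispose();
        velocity.Dispose();
    }
}
```

The key idea is that the job touches only blittable data in `NativeArray` buffers, which is what lets Unity distribute the 100,000 updates across cores safely instead of running them one by one on the main thread.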
For those of us developing mobile augmented reality applications, there is a second point worth making: the extra optimization that comes with this new system uses far less processor power, which translates into better battery life on mobile devices. That could be a major game changer.
Shader Graph
Shader programming is a very different beast from the basic coding of a game, and it's a beast many programmers hate passionately. With the introduction of the new visual shader programming tool Shader Graph, developers can now wire together a series of nodes with various functionalities to create the types of shaders they need. A few third-party Unity tools have offered this capability before, but it will now be a native part of the engine, right out of the box.
And while producing photorealism in AR experiences is not likely a goal for most developers (at least, not at this point), anyone with a grasp of the hardware we're working with can see that it's only a matter of time before developers want it, too. Regardless of Unity's one-time reputation as a gateway game engine, it has become the premier 3D engine for AR development. I genuinely look forward to seeing what AR developers can produce with Unity and these new features in the near future.
If you'd rather dodge potential bugs, wait a bit: the Unity 2018.1 beta will likely move into a full release sometime in March. If you're more interested in seeing what's coming right now, you can download the beta today.