If we were to assign a theme for the 2019 edition of the Next Reality 30 (NR30), it might be something along the lines of, "What have you done for me lately?"
Many of the top names in the industry from the 2018 edition remain the same, but their positions have shuffled. The people who rose in rank have demonstrated clear successes, such as platform growth and new products. Those who have fallen are still very relevant, but their accomplishments are relatively stagnant compared to their 2018 output.
Nineteen members of the 2018 edition dropped off the list in 2019. In some cases, their companies folded. In other cases, they left their companies or the industry altogether. Some of those 2018 members who have left AR were replaced by colleagues, as their companies' AR efforts continued to flourish. Others have been surpassed by individuals whose contributions have simply been stronger in the last year.
Ultimately, augmented reality is still a volatile, experimental field where few use cases have broken through to mainstream success. For all the progress in hardware and software the industry has made over the past three years, a gap remains between the utopian expectation of what augmented reality should be and what it actually delivers today.
The "hype cycle" methodology for evaluating emerging technologies may explain the swings in position, as some sectors are riding a wave of rising hype while others slide into the trough of disillusionment.
For instance, Gartner has identified the AR cloud as one of 2019's emerging technologies on the upswing of the hype cycle curve. However, Gartner also predicts that it will take five to 10 years before AR cloud platforms reach the "plateau of productivity," meaning a rise and subsequent fall in hype is in the offing.
Conversely, last year's revolutionary AR development tools, including the ARKit and ARCore toolkits, are facing a moment of truth. As adoption of these tools has slowed, some competitors have stepped in with more innovative offerings.
While we finally see smartglasses with some consumer appeal, wearables makers are still sacrificing some of the form, function, or price of these products to deliver them now. Even the HoloLens 2 — the most anticipated augmented reality product of the year and the catalyst behind Microsoft landing four individuals on the NR30 for 2019 — has been cordoned off to the realm of enterprise customers and experimental developers. And several months after its debut at MWC, the device hasn't even launched. Meanwhile, Magic Leap, a year after its much-hyped launch of the Magic Leap One, has spent the year pushing developers to build content for its device.
Nonetheless, 2019 may very well set the stage for a massive 2020. So we can expect the changes between the 2019 and 2020 editions of the NR30 to show the same fluidity, driven by the successes and failures of the industry's players.
A couple of years ago, Snapchat's app was bleeding users, watching major executives depart, and suffering from a shaky stock price. It was beginning to look like the app's creator Evan Spiegel might go down in tech history as the man who foolishly turned down a multibillion-dollar acquisition offer from Facebook. Now, in 2019, it's looking like Spiegel was right to bet on himself and his team instead.
Reversing a 2018 downward trend, Snap recently reported a significant increase in Snapchat users, with a tally of 203 million daily actives, along with healthy Q2 revenue of $388 million. But the numbers only tell part of the story. The real excitement around the company, at least among industry onlookers, is Snap's deepening focus on AR.
What started as a few cute filters is turning into an engagement engine. Thousands of users have begun making their own Snapchat filters using the company's Lens Studio software suite. The advertising and marketing world has started to embrace the possibilities inherent in Snapchat's platform. The app allows brands to deploy AR experiences and facilitates e-commerce with its Shoppable AR tool.
The latest, and perhaps most exciting, AR development to come from Snap is its Landmarkers, which transform real-world structures into fantastical AR-enhanced versions of themselves.
As usual, Facebook seems to be shadowing Snap's every move, rolling out copycat features as fast as Snap can originate them. But Facebook's now several years-long bad run of publicity has tarnished the larger social network in the eyes of many users. Facebook's stumbles have afforded Snap the daylight it needed to survive the social media behemoth's unrelenting onslaught of move-fast-and-take-things feature cloning.
Of course, the one area that continues to stoke interest in Snap is how, exactly, it plans to turn its AR wizardry on smartphones into something truly immersive. Its Spectacles wearables are still just camera glasses. When the first version came out in 2016, there were long lines to pick up a pair for $129. The second-generation version was a bit less flashy and cost $149. But despite the steady usage of Snapchat and the initial excitement around the wearable, public sightings are few and far between. Then, in 2018, the company released a fashion-forward version of Spectacles (Nico and Veronica models) for $200. But, for some reason, the release was done with little marketing, and, predictably, very few people seem to know the Nico and Veronica models (easily the best of the bunch) exist. Finally, just last month, Snap released Spectacles 3, a dual-camera wearable that looks like the company decided to say, "screw the subtlety, we're diving right into the future."
Although the new wearable doesn't offer AR, its dual-camera setup lets users capture photos that convey a sense of depth, and AR effects can be added after transferring Spectacles 3 footage to the app. It's a clear indication of where Snap is going with its hardware ambitions. Now, it's just a matter of how long it will take to get there, and whether Snap will beat Facebook to the punch.
Back in 2017, ARKit and ARCore were supposed to be the future of mobile AR, and they still might be. But for now, Snap is making the biggest impact in the most widespread hardware vector on the planet. So Spiegel wears the mainstream AR crown this year, even as dozens of players are reaching to snatch it off his deceptively chill brow.
At the start of 2019, Microsoft finally unveiled the HoloLens 2, the follow-up to its groundbreaking augmented reality headset. Leading the charge for the update was HoloLens inventor Alex Kipman, who gave the world its first real look at the device on stage at the annual Mobile World Congress in Barcelona, Spain.
But instead of following the path of more recent augmented reality competitors like Magic Leap, the company steered hard away from showing off anything remotely consumer-oriented and instead leaned even harder into its existing traction with the enterprise space. Along with the device's far more comfortable design and better hand tracking (ditching that awkward HoloLens 1 pinch gesture), the company also promoted what is clearly going to be its version of the "AR cloud" — Azure.
Part of Microsoft's decision to focus on enterprise customers has already paid off as the company won a highly coveted military contract out from under Magic Leap. As history has shown, before technologies go mainstream, we often see them emerge first in space and military environments. Therefore, the U.S. military's embrace of the HoloLens makes perfect sense.
While the HoloLens 2 is looking like the de facto AR solution for many enterprise customers, the device is so much better than its predecessor that we fully expect some developers to bring their immersive gaming interests to it as well.
Over the years, Kipman has sometimes seemed like he was fighting a dubious battle as Microsoft's leadership appeared to focus on more traditional paths to tech expansion. But as immersive computing has matured, Kipman's hard work and faith have paid off. What looked like a Microsoft side project/experiment is now becoming a key cog in the company's immediate and long-term future.
The on-stage debut for Aparna Chennapragada, Google's vice president of product management for Google Lens & AR, didn't occur at a Pixel launch event or Google I/O. Instead, it was in 2002 as part of the Harvard University South Asian Association's production of Interrogations. Working for cloud computing company Akamai Technologies in Cambridge, Massachusetts at the time, Chennapragada, despite no previous theatrical experience, answered a casting call at a grocery store.
Now, that early introduction to public speaking is paying off, as Chennapragada has emerged as the face of Google's AR software products.
Since the introduction of AR Stickers (now known as Playground) for Pixel smartphones during the Made by Google product launch event on Oct. 4, 2017, Chennapragada has served as Google's AR presenter, a role formerly occupied by Clay Bavor, Google's vice president of AR/VR (and previous NR30 honoree). She has also provided the tech world its first glimpse of Live View, the AR walking navigation feature in Google Maps, and AR content in Google Search.
While she did not have the privilege of introducing Google Lens (that responsibility fell to Google CEO Sundar Pichai at Google I/O 2017), she has been the one to unveil new features in the visual search engine at every subsequent Pixel launch event and Google I/O keynote. She has also penned several blog posts introducing new AR products (or, at the very least, has had those blog posts attributed to her).
After earning her undergraduate degree in computer science at the Indian Institute of Technology in Madras, Chennapragada moved to the U.S., where she earned her master's in computer science at the University of Texas at Austin and completed a software internship with Oracle.
While at Akamai, she continued her studies at the prestigious Massachusetts Institute of Technology (MIT). At MIT, she earned a master's degree in management and engineering as a graduate fellow in the System Design and Management program. During her time at MIT, she took third place in the Venture Capital Competition, earning $1,000 for her business plan. She also scored a 770 out of 800 on her Graduate Management Admission Test (GMAT) and graduated with a 4.9 GPA.
Immediately after earning her degree at MIT, Chennapragada joined Google as a product manager. At Google, she led a team of 30 scientists and engineers with a mission of turning research into products for Google and YouTube. In less than two years in the position, she had a measurable impact, taking the team from zero to more than 20 launched features. At the same time, she killed off several low-yield projects to focus on "big bets."
She continued to rise through the ranks, adding Google Search and Google Now to her product portfolio and joining the company's executive speaker bureau. She eventually ascended to the position of technical assistant to the CEO, where she ran company-wide product strategy reviews and advised on areas of planning, hiring, development, and even product cutting.
In October 2017, Chennapragada accepted her current position, the same month she first took to the stage to show the world the new AR features on the Google Pixel 2.
Her deft management of Google's products has translated into positions of power outside of Google as well. In 2018, she joined the board of directors of Capital One. She has also served as an angel investor and advisor to various tech startups.
Based on her track record, Chennapragada is the ideal executive to lead Google's AR efforts. AR itself is still a relatively young technology, but it is evolving at a rapid pace. Fittingly, Chennapragada has demonstrated the ability to shepherd multiple AR products to launch in less than two years. Moreover, she's shown that she isn't afraid to cut bait on products that come up short. And, at this stage, not everything will come up roses.
Apple CEO Tim Cook rose to the top of the AR industry on the strength of ARKit, a software platform that enables developers to add AR features to their mobile apps. Before ARKit, Apple's contribution to the technology consisted mostly of unconfirmed reports of work on smartglasses and the assembly of an AR team through high-profile hires and (sometimes surreptitious) acquisitions.
Since the first version of ARKit, Apple has introduced several new, innovative features to its mobile AR toolkit, such as image recognition, multiplayer experiences, persistent content, object recognition, people occlusion, and motion capture, which make it arguably the most powerful mobile AR platform on the market. At the same time, Cook has continued to strengthen Apple's AR army. And, with the public release of iOS 13 in the fall of 2019, Apple will introduce additional new tools, namely RealityKit and Reality Composer, which promise to make it easier for developers to add AR to their apps.
Despite a relative boom in AR apps with the original iteration of ARKit, there hasn't been a flood of apps taking advantage of ARKit's newest features. The company has been making overtures to nurture its developer community through an app accelerator in China and AR art installations. But unsubstantiated rumors recently emerged that Apple had shelved its smartglasses project, just as iconic design chief Jony Ive exited the company.
Cook, who succeeded the late Apple co-founder and tech legend Steve Jobs as CEO in August 2011, also sits on the company's board of directors. He was promoted from chief operating officer, a role in which he oversaw Apple's supply chain, sales, service, and support worldwide, as well as the company's Macintosh division.
After earning his Bachelor of Science degree in industrial engineering from Auburn University in his home state of Alabama, Cook continued his education at Duke University, where he earned his MBA and attained the title of Fuqua Scholar, an honor reserved for the top 10% of Duke's business school graduates.
For the first 12 years of his career, Cook ascended through the manufacturing and distribution ranks at IBM, topping out as director of North American fulfillment. He then served in executive roles at Intelligent Electronics and Compaq before joining Apple.
In recent years, Apple's chief executive has leveraged his leadership position for humanitarian pursuits as well. He is an outspoken advocate of consumer privacy rights, equality, and LGBTQ rights. He received the 2018 Human Rights Award from the Birmingham chapter of the Southern Christian Leadership Conference and the 2019 Champion Award from the Gay, Lesbian & Straight Education Network for his advocacy efforts.
Back in 2017, Cook said that "AR is going to change everything." Given his position as a proponent for cultural change, it's easy to back Cook's vision of AR as a force for good. Should he and Apple fulfill that prediction, his return to the top of the AR industry should be swift.
Most of the action for Magic Leap this year has been on the software side, with a wide variety of independent developer shops and major players taking on the challenge of pulling users into the burgeoning "magicverse." Part of that effort was driven by CEO Rony Abovitz's decision to launch the Independent Creator Program. The program, which gives $20,000 to $500,000 to a developer or team, is an initiative designed to support independent developers interested in adopting the Magic Leap platform.
Soon after, Epic Games added more fuel to the developer fire by offering 500 Magic Leap One headsets (which cost $2,295 each) to select developers building for the spatial computing device with Unreal Engine.
And while it's still fairly early in the life of the Magic Leap One at just over one year on the market, the Independent Creator Program is already bearing fruit. One recent standout among the program's grantees is UK-based Magic Lines, a company marrying analog techniques with AR to assist Parkinson's patients in enjoying more mobility.
Along those lines — despite the company's less than successful efforts to secure major military contracts — Magic Leap has decided to hang a good portion of its potential future on the medical space. In recent weeks, the company announced a major push to work with organizations including BrainLab, XRHealth, SyncThink, The Dan Marino Foundation, and the Lucile Packard Children's Hospital Stanford to bring immersive computing into the mainstream of medical tech. Leading that charge will be Jennifer Esposito, a veteran of the tech-meets-health space with previous stints at Intel as the head of the Health and Life Sciences Group and at GE Healthcare as a general manager. In her new role, Esposito will oversee the myriad ways in which Magic Leap might add its technical innovation to the healthcare space.
Some have tried to dismiss the Magic Leap One, which still hasn't quite caught on with the general public despite retail availability and in-store demos via AT&T, as well as several high-profile entertainment franchise titles. But those naysayers are apparently being ignored by one very key demographic: the small but growing cadre of developers building apps and experiences for the device.
Just six months ago, checking the Magic Leap app store once every month or two was enough to stay current, but in recent months, it seems like every week something new is either released on the app store or announced at an event as part of a location-based experience. There's no guarantee that the Magic Leap story will work out, but the company's funding war chest bought it some time, and it's beginning to look like at least some of that patience is paying off.
As for Abovitz, he has remained active as the company's chief evangelist and dealmaker, frequently tracking the passionate debates on social media among AR insiders and weighing in whenever the topic of Magic Leap comes up. That might sound odd in the context of most companies with multibillion-dollar valuations, but Abovitz has wisely decided to forego the aloof approach to social media some in his position have adopted and instead is in the trenches of the conversation around the immersive computing space. That decision to remain engaged with the community will likely keep him more connected and nimble than some of his competitors in the months and years to come.
Based in Rochester, New York, Vuzix, led by CEO and founder Paul Travers, is a relative outsider compared to the Silicon Valley tech giants that are scheming to produce consumer-grade smartglasses.
But Travers and company just so happen to be ahead of most of them. The Vuzix Blade is one of the first smartglasses to hit the market with a form factor that might appeal to the average user for everyday wear. And while the device falls far short of the likes of HoloLens and Magic Leap One in terms of 3D immersive content, it delivers on the promise that Google made with the Glass Explorer Edition.
And, now that it is available for purchase through Amazon, the Blade's $699 price tag comes off as relatively inexpensive compared to current top-tier smartphones.
Travers got his start in the tech world at a time when Rochester was the Silicon Valley of imaging technology. He graduated from Clarkson University with a Bachelor of Science degree in electrical and computer engineering, a calendar year before Apple launched the original Macintosh and before Facebook co-founder Mark Zuckerberg was born. He initially worked in the well-funded research and development department at Eastman Kodak, but he became disillusioned with seeing his work continually shelved, so he quit to start his own company in 1989, a year before Snapchat founder Evan Spiegel was born.
His first company, Forte Sound, built sound cards for PCs. After selling Forte Sound to Advanced Gravis, he started two other companies in 1992, two years before Jeff Bezos founded Amazon. One of those companies, Etek Labs, made USB connectivity technology and was eventually acquired by Belkin. The other, Forte Technology, produced VR headsets. Finally, in 1997, Travers founded Interactive Imaging Systems, the company that would eventually do business as Vuzix.
Interactive Imaging Systems initially started building night vision goggles for military customers, evolving into the consumer space in 2001 with the iCOM Wireless Personal Internet Browser, which was essentially a web-connected PDA with a full-color screen (at the time, a groundbreaking achievement). The company then began making personal video displays, rebranding as Vuzix in 2007.
In 2010, two years before Google introduced Glass to the world via a bombastic skydiving stunt, Vuzix launched its first augmented reality wearable. The STAR-1200 was a gaudy pair of smartglasses with see-through displays, a bulky 1080p camera mounted atop its frames, and head tracking at three degrees of freedom.
The Google Glass Explorer edition came and went, but Vuzix continued making smartglasses for enterprises. It turns out that Vuzix has made enough of a splash in the sector for Google to follow Vuzix's lead, reviving Glass for enterprises in 2017.
Now, Travers and Vuzix can lay claim to being leaders in consumer-grade smartglasses with the Blade, despite not necessarily being a household name, but they'll have to remain innovative in the face of what promises to be fierce competition. And the company appears to be up to the task. After North burst onto the scene with the fashionable Focals, Vuzix prepared its own take on stylish smartglasses. Considering that Travers found success producing wearables before Google was even founded, it would be foolish to count him out.
If you haven't already heard, 2019 has been the year that AT&T and other mobile carriers have geared up to start their 5G network rollouts in earnest. For its part, AT&T has promised nationwide 5G coverage by 2020.
Part of the AT&T 5G rollout is making the case to consumers and businesses for technologies that will take advantage of the leap forward in data speeds afforded by 5G. These faster speeds could be key to executing more advanced augmented reality applications with robust 3D content and interactive multi-user experiences.
That's why AT&T hitched Magic Leap's wagon to its 5G train last year with a strategic investment and a deal to be the exclusive retail launch partner for the Magic Leap One.
Until recently, Jeff McElfresh, president of AT&T Technology Operations, has been the executive in charge of AT&T's 5G rollout, a role he took on in August 2018. As of Oct. 1, though, McElfresh will fill a new role as CEO of AT&T Communications, Inc., which includes mobile, broadband, and pay-TV units.
The AT&T veteran prepared for his role through both technical and business education. He earned an electrical engineering degree from the University of Florida and later completed an MBA at Northwestern University's Kellogg School of Management.
McElfresh has more than 20 years of tenure at AT&T in various roles. Prior to serving as president of AT&T Technology Operations, McElfresh was CEO of Vrio, a holding company for DirecTV Latin America and Sky Brasil. He was also president of AT&T's operations in Mexico, where he had seats on the board of directors and executive committee for Telmex and America Movil. He has also served as vice president of product development for AT&T's emerging devices group, vice president and general manager of AT&T Mobility, and chief of staff to the chief operating officer for the merger integration of Cingular and AT&T Wireless.
Since the launch of Magic Leap One, AT&T has leveraged the partnership in earnest. At last year's L.E.A.P. conference, AT&T announced its DirecTV app for Magic Leap One and revealed that it would set up a 5G test area at Magic Leap's headquarters in Plantation, Florida. AT&T has also used the Magic Leap One to promote the final season of Game of Thrones and the latest installment in the Harry Potterverse film franchise.
In addition, AT&T has expanded its partnership with Magic Leap to showcase enterprise augmented reality apps, which will help Magic Leap compete more directly with the HoloLens 2, a device Microsoft is pitching as purely an enterprise tool. The company is also helping Magic Leap engage the development community by sponsoring Magic Leap hackathons.
AT&T has now begun selling Magic Leap One at select stores and through its website. But this is just the beginning, as we can expect AT&T to promote Magic Leap further as its 5G rollout continues. With his expanded leadership position, McElfresh plays a key role in fulfilling this partnership.
Niantic, led by CEO John Hanke, is best known as the developer of the most successful augmented reality game to date, Pokémon GO. But, before that, Hanke had a knack for building apps that immerse users in the real world even without the benefit of AR.
After earning his undergraduate degree at the University of Texas in 1989, Hanke continued his studies at the University of California, Berkeley, earning his MBA in 1996. While at Berkeley, Hanke worked on Meridian 59, one of the first multiplayer online games, and continued on with the game when it was acquired by 3DO. After two years at 3DO, Hanke left to launch another gaming startup, Big Network, that was itself acquired by eUniverse in 2000.
Hanke's next move left gaming behind for a different pursuit, though both areas would eventually serve as the foundation for his work at Niantic. In 2001, Hanke co-founded Keyhole, a mapping company that would go on to build a digital 3D globe of the Earth. And, if you're following the pattern that marks Hanke's early career, you can guess what happened next, as Google acquired Keyhole in October 2004.
At Google, Hanke and the Keyhole team re-launched the technology as Google Earth. As vice president for Google's geography products, Hanke continued to advance Earth along with Google Maps, Street View, and Google Local (products that he now, ironically, competes with) until Dec. 2010. In Jan. 2011, Hanke took on a new role as vice president of product for Niantic Labs, an experimental team that focused on building apps on top of Google's geo products.
Niantic's first app was Field Trip, which is actually shutting down in 2019. Using a mobile device's location, the Field Trip app served up points of interest, such as historic markers, parks, and libraries, within the user's immediate surroundings. These location points would serve as the foundation for Niantic's location-based gaming empire.
The first game from Niantic to take advantage of that location data was Ingress, the cyberpunk-influenced game that pitted two factions against each other in what amounts to a massive game of Capture the Flag. Ingress amassed a cult following that would set the stage for Niantic's next project.
In 2014, one of Google's infamous April Fools pranks would inspire the game that would make Niantic a household name. A Google engineer (and now a Niantic employee) inserted Pokémon characters into Google Maps. Hanke recognized that there was a whole game there.
And, as we all know, Niantic did, in fact, build a game based on that joke. However, as development was underway, Google decided to spin off Niantic into an independent company in 2015. Soon after the spin-off, Google participated in a $30 million funding round alongside The Pokémon Company and Nintendo, Niantic's partners on Pokémon GO. Niantic continued in 2016 with the launch of Pokémon GO, which took the world by storm and remains one of the most popular mobile apps today.
Fast-forward to 2019, and Niantic has now launched another highly anticipated game based on a popular media franchise: Harry Potter: Wizards Unite. While the game has not reached the level of success that Pokémon GO enjoyed, Niantic's aims for future growth are not necessarily tied to one game. Instead, its future fortunes are tied to the Niantic Real World Platform, an AR cloud platform that enables multiplayer gaming, persistent content, and object occlusion, and could potentially usher in a new era of AR gaming. The company closed a $245 million funding round earlier this year to build out the platform, which will serve as the foundation for Niantic's games as well as a tool for other developers.
But Hanke's ambitions are more grounded than building games and platforms. Really, he just wants to take the games that people play sitting on their couches and bring them outdoors. By that measure, Hanke has already succeeded. But with the Real World Platform, he and Niantic could potentially expand the number of people playing AR games outside even further.
As a founding member of the Android team, one of the top operating systems for mobile devices in the world, Ficus Kirkpatrick, engineering director at Facebook, is an ideal choice to lead Facebook's AR camera platform, which is now known as Spark AR.
Kirkpatrick's role in founding Android came through his working relationship with Android founder Andy Rubin on the Danger Hiptop (better known as the T-Mobile Sidekick). As a result, Kirkpatrick joined Google in 2005 when the company acquired Android.
During his tenure at Google, Kirkpatrick led the Google Play Store efforts, including the Instant Apps project. He turned in his Google badge on Oct. 20, 2016 and headed for Facebook. Perhaps it's no coincidence that at least three other former Googlers have joined Facebook's AR team since Kirkpatrick came on board.
At the F8 developer conference in 2017, Facebook CEO Mark Zuckerberg threw down the gauntlet for the company's mobile augmented reality offerings, introducing its Camera Effects Platform, with Kirkpatrick penning the announcement for Facebook's developer blog. By Dec. 2017, Kirkpatrick and his team had launched Facebook's AR Studio, the development tool that enabled creators to build their own Camera Effects, as an open beta, along with extending Camera Effects to Facebook Messenger.
Since its launch, Kirkpatrick and the AR engineering team have continued to ship iterative updates to the platform. At the 2018 edition of F8, the team added Sketchfab integration and a host of new AR capabilities, including hand and body tracking and background segmentation. In August 2018, Kirkpatrick and his team added multiplayer AR games to Messenger's group video chats and integrated L'Oreal's Modiface platform into the Facebook Camera. After being renamed as Spark AR, the platform made its way into Facebook's first hardware product, Portal. This year, Kirkpatrick and company's headlining achievement has been expanding the beta of Spark AR for Instagram to all creators.
Having established Spark AR as a force to be reckoned with among AR platforms, Kirkpatrick accepted a promotion to vice president of engineering for AR/VR in August 2019, according to a company spokesperson. The move expands his influence on the development of immersive experiences at Facebook.
Despite his team's achievements, Kirkpatrick might be better known as the Facebook employee who tipped the company's hand on its AR hardware plans. In an interview last year, Kirkpatrick confirmed that Facebook is indeed working on augmented reality smartglasses, a level of candidness that is practically taboo in Silicon Valley.
Kirkpatrick likely received a reprimand from the PR team for his loose lips, but you'll have to excuse his eagerness, as AR wearables mean that his team will enjoy a whole new playground for its AR platform.
Microsoft's unveiling of the second-generation HoloLens at Mobile World Congress 2019 was the most anticipated event in the augmented reality industry this year, and no other product unveiling has come close to matching its level of hype.
As a result, Julia Schwarz, a principal software engineer for HoloLens and the chosen one to execute the on-stage demo of the new headset, has become the face of HoloLens 2. One upload of her video presentation to YouTube has nearly two million views to date. Her comfortable stage presence and joyful interaction with AR content during the presentation helped Microsoft deliver on the expectations for the HoloLens 2 at the reveal.
But Schwarz earned her presenting role based on much more than her stage presence. As the technical and user experience lead for instinctual interactions, she was responsible for designing the techniques that enable HoloLens 2 users to manipulate 3D content with their hands.
While studying computer science at the University of Washington, Schwarz worked on three internships in software engineering at Google. She earned her undergraduate degree with a 3.97 grade point average and continued her studies at Carnegie Mellon University, where she earned a doctorate in computer science.
At Carnegie Mellon, she got her first experience with Microsoft via three summer research internships. During one of her internships with Microsoft Research, she developed a probabilistic input toolkit for Kinect, which became one of the most popular internal tools for the sensor at Microsoft. Later, as part of the Kinect for Xbox One team working on skeletal tracking APIs, she developed a prototype for hand gesture detection that contributed to the design of gesture language for the Xbox One and an interaction method that replaced the "wave to engage" gesture.
As she continued her pursuit of her Ph.D., Schwarz co-founded Qeexo to bring her doctoral research to market. She served as the lead for the machine learning team that built FingerSense, an advanced touch recognition platform that can differentiate between fingertip, knuckle, and fingernail presses, a dynamic that has been integrated into the Huawei P8, P8 Plus, and Honor 7. She also developed Windows 8 and Windows Mobile apps along with a friend under the name Electric Squash Studios.
After completing her doctorate work in 2014, Schwarz joined the Microsoft HoloLens team in 2015. During her time on the team, she has built prototypes for the new interaction model, conducted troubleshooting during the production phase, and implemented the hand interactions in the HoloLens shell. She has also filed 20 patents for hand interactions based on her work.
As the HoloLens 2 makes its way toward its shipping date, Schwarz now works on the Mixed Reality Toolkit. In her current role, she is building user experience features that will help developers effectively deploy hand tracking in their apps. So, after showing the world how hand tracking works in the HoloLens 2, she's now showing developers how to put the system to work.
This time last year, while Adobe was on the AR industry radar with its potentially game-changing creation platform, Project Aero, Sebastien Deguy, the company's current vice president of 3D and immersive, was not.
That all changed in January of this year, when Adobe acquired Deguy's company, Allegorithmic.
Deguy founded Allegorithmic in 2002, with its signature product, the Substance suite of 3D painting and texturing tools, launching in 2009. Over the course of its history, Allegorithmic became synonymous with game development, with more than 150 AAA games employing Allegorithmic's software for 3D texturing, including Naughty Dog's blockbuster, Uncharted 4. Deguy also has his name on eight patents related to his work with Allegorithmic.
The birth of Allegorithmic dates back to Deguy's beginnings as a mathematician. Deguy earned his doctorate at the University of Auvergne Clermont-Ferrand. "I decided to create a company, for which I would come up with the name 'Allegorithmic,' on December 22, 2001, the very day after I defended my Ph.D. thesis," said Deguy in a blog post.
In addition to Substance, which allows 3D content creators to render 3D textures in compact package sizes, Deguy occupies a critical role in overseeing the development of Project Aero, which promises to help graphic designers create augmented reality experiences within the familiar suite of Adobe products, such as Photoshop and Illustrator. Moreover, Deguy has the privilege of working with some advanced prototypes, such as the Project Glasswing transparent display.
When reflecting on the Adobe acquisition, Deguy likened the moment to Peter Jackson landing the gig as the director of the Lord of the Rings trilogy, in that this is a career-making opportunity. But the analogy works on another level because, if Adobe's role in the augmented reality industry is comparable to the critical and box office success of Lord of the Rings, then there are great things in store for the future of AR.
The company that is helping to pioneer the dynamic of using a smartphone as the brains for a wearable AR headset also had a successful crowdfunding launch for its latest head-mounted device last month. That company is called Dreamworld and it's the creation of former Meta employee Kevin Zhong.
Zhong is among a growing list of Chinese or China-backed companies storming the AR gates at full speed, looking to be first in line for the mainstreaming of AR rather than play catchup as China did with smartphones back during the dawn of the iPhone. Like another recent Chinese AR startup founder, Zhong faced allegations of theft from his former company.
But the matter was settled, and he went on to debut the DreamGlass AR headset, a device that uses an Android smartphone to power rich, 3D experiences at a much lower price than some of the higher-end options on the market. In addition to 3D AR at that lower price, the device also offers limited hand tracking interactions.
Unlike the DreamGlass AR, the crowdfunded DreamGlass Air only offers 2D imagery, so it seems better suited to watching videos than serving as a robust mobile computing interface. Nevertheless, the company managed to raise over $750,000 for the device on Kickstarter in just a couple of weeks, which indicates there's a hunger somewhere for this particular interaction.
Although it's true that there aren't legions of consumers wearing the previously released DreamGlass device, there's something about Zhong's ability to move fast and iterate quickly that indicates he'll be an important player when AR does hit its mainstream stride.
With his background in artificial intelligence, Zhu Mingming, co-founder and CEO of Rokid, aims to put the smart in AR smartglasses.
In addition to a suite of voice-activated smart speakers that are all the rage these days, Rokid makes sleek augmented reality wearables with a side of artificial intelligence. Rokid Glass is positioned to serve enterprise businesses with remote assistance and workflow guidance software, while Rokid Vision is marketed towards consumers. A third model, the prototype Project Aurora, fits in the mold of tethered smartglasses like the Nreal Light. Each model comes with Rokid's own digital assistant.
Mingming is a bit of a serial entrepreneur. After earning his undergraduate degree in computer science from Zhejiang University and his Ph.D. from the International Computer Science Institute at the University of California, Berkeley, he co-founded a mobile internet software startup called Zhejiang SU2 Technology Ltd. He then started up Mammoth Technologies, which the Alibaba Group acquired in 2010.
While at Alibaba, Mingming experimented with deep learning and natural language processing as the leader of the company's M-team. With his skill sharpened, he left Alibaba to start Rokid in 2014.
So far, Rokid has continued to attract the attention of investors, including Temasek, Credit Suisse, IDG Capital, and CDIB Capital. The company has assembled a team of more than 400 employees, a headquarters in Hangzhou, China, a research and development center (R-Lab) in San Francisco, an algorithm lab (A-Lab) in Beijing, and a new hardware site in Shenzhen.
Rokid is among a growing throng of China-based AR wearable makers emerging at the moment, such as Nreal, Shadow Technologies, and Mad Gaze. Mingming's background in AI and his company's infrastructure and resources give Rokid an advantage over its peers.
Taking a page from the book of Warby Parker, the mission of North is to be the first company to offer truly fashionable smartglasses that bring the notifications and functionality usually found on a smartwatch to your face. Boasting two brick-and-mortar outlets in Toronto, Canada, and New York City, the company is giving us the first real-world example of what a luxury smartglasses brand looks like.
The leader of this grand attempt to beat the rest of the market to the fashionable AR finish line is Canada native Stephen Lake. That name might be familiar to some AR insiders who remember Thalmic Labs, which was founded in 2012 and created the Myo armband, a gesture recognition device that users could wear on their arm to control a PC or smartphone.
That company has since rebranded itself as North and its first product is Focals, an incredibly well-designed pair of glasses (that can be turned into shades via an attachment) that delivers notifications, direction information, and even digital assistance via Amazon Alexa. But there are two challenges that the company currently faces. First, the company requires users to visit one of its storefronts (or, if they're lucky, catch one of the company's rolling pop-up shops) in order to be properly fitted for a pair. Simplifying this process would go a long way toward helping North's prospects.
The other potential issue is price. Although the company wisely reduced the price from $999 to $599, some mainstream users may still hesitate to pay iPad prices for a device they're completely unfamiliar with. When even Snap is having trouble moving $200 Spectacles camera glasses, it's not shocking that a comparatively unknown brand may initially struggle to get consumers to understand the value of their product. Still, we're betting that, in keeping with the fashion-forward aesthetic of the company, landing a couple of major celebrities who are willing to wear the smartglasses around the clock would do wonders for consumer adoption of Focals.
In the meantime, Lake and his team are doing everything possible -- including keeping up an insanely rapid pace of software updates -- to get North Focals into the minds and on the faces of users. Fueled by a new funding round reportedly in the $40 million range a few months ago, it looks like North just secured a bit more runway to find out if its strategy can work.
Component makers play a valuable role in the growth and maturation of AR headsets and smartglasses. Innovations by the makers of AR displays and optics, sensors, and chipsets are necessary to evolve AR wearables.
One of the leading component companies is Lumus, an Israel-based maker of transparent displays and optics for AR headsets. CEO Ari Grobman is charged with leading the company into the bright future of AR hardware as the premier provider of optics to the world's smartglasses makers.
Before his appointment to CEO at Lumus in 2017, Grobman spent 11 years as vice president of business development, where his role centered on growing the company's customer base and increasing overall revenue. This involved pitching the benefits of augmented reality to enterprises and the strengths of the technology offered by Lumus.
As CEO, Grobman continues to form strategic relationships with AR headset makers and enterprise customers while also overseeing the company's technology innovation and product marketing strategies. Before the end of his first year as CEO, the company struck a deal with Quanta, an original design manufacturer whose business relationships include Apple, Google, and Intel, among others. In fact, subsequent reports in 2018 revealed that Google was working with Qualcomm and Quanta to produce an AR headset, which could mean that Lumus displays would be involved should the product come to fruition.
Lumus secured another significant partnership in 2018. At Augmented World Expo, the company announced that it would integrate eye tracking technology from Tobii into its AR development kit.
The relationships secured under Grobman's leadership have begun to pay off. This year, its optics made their way into the new enterprise-focused AR headset from Lenovo, the ThinkReality A6, which will be manufactured by Quanta. Upon the announcement of the Quanta partnership, Lumus predicted that its technology would arrive in an AR headset within 12 to 18 months. The Lenovo headset fulfills that prediction.
Prior to joining Lumus, Grobman was head of North American sales at ImageID, makers of advanced barcode-scanning technology. While at ImageID, Grobman secured high-profile accounts with Ford Motor Company, Philip Morris, and Smithfield Foods. Grobman has also served as a product manager at optical communications company Rogiya Networks and as an investment banker at Yazam. Grobman earned his degree in marketing and management at Touro College, a private Jewish university in New York City. Clearly, Grobman has the experience and training to succeed in his position.
Based on the company's activity in the last two years and the changes since he took over as CEO, Grobman is elevating Lumus into one of the top AR display providers, if not the premier vendor in the industry. With tech giants planning their own AR wearables, Grobman has Lumus in the pole position to capitalize on the new opportunities.
Making a return from last year's NR30 list is Tony Parisi, one of the earliest pioneers of immersive computing via his work on VRML, and now a leading voice in the promotion of AR and VR adoption through his work as the global head of AR/VR ad innovation at Unity.
If enterprise AR is the meat and potatoes of high-end AR, then marketing and advertising is ground zero for mobile AR. Therefore, it makes sense that Unity has one of its best minds locked in on helping brands new to the technology develop their presence in this new virtual landscape full of interactive opportunities.
Some of Unity's AR-meets-VR magic was on display earlier this year when Varjo let Next Reality sample its Volvo experience, which the team created using Unity. In recent months, Unity has also added support for ARKit 3, and unveiled Responsive AR for advertisers.
In addition to his work with the interactive agency world, this year Parisi continued to deliver his insights on the industry as a guest at some of the biggest conferences, including SXSW, AWE, and others. But his most passionate and well-thought-out messages on the AR space usually come in the form of his frequent Medium postings, where he opines on the present and future of immersive computing, including the good and the bad.
In a recent posting, Parisi summed up just why this is such an exciting time in immersive computing. "We are on the cusp of a step change. The computer interface is moving from 2D to 3D — the culmination of decades of innovation," wrote Parisi. "Sooner rather than later, 3D will become the dominant paradigm, with legacy media types like images and video embedded in spatial interfaces that immerse us in information, imbue the physical world with digital magic, and transport us to fantastical other places."
Luckily, companies like Parisi's Unity are working tirelessly to help startups, agencies, and artists leverage the tools of the emerging immersive computing paradigm as they craft this new reality.
For talented women entrepreneurs seeking help in getting their augmented reality dreams off the ground, Amy LaMeyer may be the answer to their prayers.
An early-stage angel investor based in San Francisco, LaMeyer is a partner at the Women in XR (WXR) Fund, which specializes in investing in augmented reality and artificial intelligence startups founded by women. Her role at the WXR Fund also consists of mentoring companies in augmented reality and artificial intelligence.
LaMeyer assists startups with more than just funding. She also serves as a board member and advisor at the ARVR Academy, a professional development organization that endeavors to foster diversity and inclusion in the immersive technology field. But one does not simply walk into the role of AR investment and mentorship. LaMeyer has amassed nearly 20 years of experience in business, with demonstrated success in finance.
After earning her Bachelor of Arts degree in international studies and political science, LaMeyer continued her studies at Boston University's School of Management, where she received her Master of Science in management information systems and a master in business administration degree.
She spent more than 15 years at cloud computing company Akamai Technologies, getting in on the ground floor when the company was still a startup. She began on the technical side of the business, starting as an engineering software release manager before moving into a role as director of development planning and systems, leading initiatives in systems, process, and quality assurance.
LaMeyer made her major impact on the finance and corporate development side of the business as the company grew into a publicly-traded company. After leading process improvements as the director of finance, she was promoted to senior director of corporate development. She spent a decade in the role, where she had a hand in 20 acquisition integrations, divestitures, and strategic partnerships involving companies from India and Israel to Denmark and Scotland and totaling more than $1.5 billion in value. As part of Akamai's partnership with Microsoft Ventures Israel, she led a cybersecurity accelerator where five of the six participating startups raised funds.
Her background in technology and finance equips her to expertly evaluate startups for funding and advise them on how to fulfill their missions. But it is her deep interest in spatial computing that guides her towards AR.
"I am incredibly excited about the potential of spatial computing (VR, AR, MR) particularly in collaboration with artificial intelligence (AI)," she said in a recent blog post. "Seeing these technologies in action reminds me of the early days of mass internet adoption. The potential to transform businesses and enable human interaction is just as great, and just as unknown."
Keeping the faith when so few people understand your vision is tough, but that's exactly what Ori Inbar has done for the past decade through his annual AWE conference in Silicon Valley. What started as a small gathering of immersive computing enthusiasts years ago has grown into the premier event for everything from tiny startups to global corporations looking to display, discover, and discuss the state of the art in AR and VR.
The last event, held in Santa Clara, California, was packed as usual and drew most of the biggest names in AR and VR. But beyond the massive showroom floor, where the future of AR is on full display, AWE has managed to stick to its roots by continuing to offer intimate presentations touching upon bleeding edge subjects like the intersection of AR and blockchain technology, as well as the shifting landscape of marketing in an age when immersive computing can turn nearly any surface or space into a promotional opportunity.
Based in New York City, Inbar not only leads the push to hold his annual event on the west coast, but he also holds AWE events in Europe, Asia, and Israel, his country of birth. Along with AWE, Inbar is also the founder and managing partner of Super Ventures, where he diligently looks for the next big immersive computing idea to invest in.
In an interview with Next Reality earlier this year, Inbar described how he balances the mission of serving the AR community via AWE while also serving the needs of the companies he invests in. "I've been advising and helping thousands of AR startups for free, for the love of the game, and been connecting them and helping them, whether it's in AWE or beyond," said Inbar. "In our minds, we have a very hard wall between what we do to keep growing the community, and helping drive adoption of AR as an industry versus what we do on the investment side."
Although the AWE event recently saw the departure of one of its most active voices, Tom Emrich, there doesn't appear to be anything slowing down the momentum of AWE as it continues to mature in tandem with the AR industry as a whole.
New York hasn't really been considered a hotbed of tech activity since the Dotcom 1.0 days when the Flatiron District was tagged with the nickname Silicon Alley. Sure, there are successful startups in the city today, but it just hasn't been the same — until now.
In the last couple of years, New York City officials have gradually embraced the AR and VR startup community in the city, which has led to funding initiatives and special programs and partnerships. The most notable of those initiatives is the RLab, a location and team based at the historic Brooklyn Navy Yard, dedicated to fostering and supporting the innovation coming out of the immersive computing space.
And while it might seem odd to point out a single-city-centric program in New York City, particularly when other cities are also engaged in similar movements, the fact remains that New York City, along with Los Angeles, is one of the two major media and entertainment hubs of the United States. So when New York City officials devote real governmental resources to AR and VR, two spaces still considered edge technology by many in the mainstream, it's time to pay close attention.
In his role as executive director of the RLab, Justin Hendrix supports a wide range of AR and VR startups that are looking to innovate within the ever-expanding confines of the technology, or partner with existing mainstream concerns. One of the RLab's initiatives is called the XR Beta residency program, a set up that allows immersive computing startups to use the resources of RLab, including office space, to give themselves a boost into the wider business and technology space. The latest additions to the AR component of the program are echoAR (an AR cloud platform), Sensorium (an AR and VR film production team), Unseen Media (an AR gaming startup), Lightframe (a company focused on volumetric capture solutions for clients), and Optoon (an AR avatars app).
Hendrix has a lot of experience leveraging New York's business and community resources as he's also the longtime executive director of the NYC Media Lab, an organization that is described as "a public-private partnership" that drives entrepreneurship by connecting New York City's business and educational communities.
In addition to its accelerator activities, the RLab also offers 16,500 square feet of co-working space, educational programs, and events, and bills itself as "the nation's first city-funded center for research, education and entrepreneurship in virtual and augmented reality and related technologies." Aside from Hendrix, the team administering the RLab includes notables such as Adaora Udoji (previously with WNYC radio and CNN), who is the director of corporate innovation & entrepreneurship, and Alexis Seeley (formerly an associate dean at Columbia University and Barnard College), who serves as the RLab's director of education & opportunity.
The RLab just launched in 2018, so it's still too early to say how impactful it's been, but AR insiders and major business names regularly mention the facility and its team, so the project appears primed to deliver a few of the next big names in the AR space.
Like many of the members of the NR30, Robin Hunicke, CEO of Funomena, comes from the video game industry. Hunicke and Martin Middleton started Funomena in January 2013. The two came together under the shared belief that "games can have a positive impact on the world." And Hunicke has done that in more ways than one.
In the male-dominated video game industry, where 22% of the workforce is female, Hunicke stands apart, not just for her gender but for her roles as a startup co-founder and chief executive of her innovative company.
Aside from her role in the gaming industry in general, Funomena is also a pioneer in augmented reality gaming. The company was one of the few early access partners for Magic Leap's AR headset. Luna: Moondust Garden, the sequel to one of the studio's more successful VR titles, served as Funomena's debut on the Magic Leap One.
Hunicke's contributions extend beyond her company's standing as a leader in AR development. She also serves as the director and founder of the BA in Art + Design: Games & Playable Media program at the University of California at Santa Cruz and maintains a role as an associate professor as well as a researcher at the Ludo Lab Center.
Her role as an academic dates back to the Mechanics, Dynamics, and Aesthetics (MDA) framework that she developed. She developed the framework with colleagues at Northwestern University, where she earned her doctorate in computer science with a focus on AI and game design in 2011. She has applied the framework to her game designs, including the titles Boom Blox, MySims, and The Sims 2 during her tenure at Electronic Arts and the game Journey during her time as executive producer at thatgamecompany. The latter title won the IGN Game of the Year in 2012. She also teaches the MDA framework as part of the Experimental Gameplay Workshop at the Game Developers Conference.
Hunicke also plays an important role in nurturing diversity in the industry. She co-hosts Amplifying New Voices, which also takes place as part of the Game Developers Conference.
For all of her credentials and accomplishments, there is another relatively unexpected bullet point on her resume that distinguishes Hunicke from her colleagues on the NR30: she has been a judge on Project Runway.
The first thing to know about Spatial is that it has two primary co-founders: Anand Agarawala and Jinha Lee (previously an interface whiz for the likes of MIT, Microsoft, and Samsung). The two founders work in tandem to shape the vision and path of this potentially groundbreaking startup, and when you meet them, they frequently finish each other's sentences seamlessly.
However, we're highlighting Agarawala because he's been the company's leading public-facing voice in the past year, carving out a space for the company in the big leagues earlier this year by appearing on stage at Microsoft's HoloLens 2 unveiling event in Barcelona, Spain.
Back in 2010, Bumptop, the company founded by Agarawala, was acquired by Google. Bumptop's product? A 3D multi-touch desktop computing environment that was probably just a bit ahead of its time. Nevertheless, the parallels to AR are obvious and explain why Agarawala decided to pin his hopes to AR, where virtual interfaces come alive in the most interesting and practical ways.
Founded in 2016, the company has landed roughly $8 million in investment funding from Samsung NEXT, Mark Pincus, Kakao Ventures and others. What does the company do? Simply put, Spatial facilitates remote communication and collaboration in AR. But instead of simplifying things like some alternative remote AR communication apps, Spatial actually takes on the challenge of representing your body, face (yes, your real face, not a cartoon illustration), and your position in the real world, all in real-time as you work with other users in remote locations.
It's difficult to explain the sensation unless you've tried it, as we have, but if you're familiar with VR solutions that deliver remote meeting solutions, then you'll have a hint at what it's like. Except, in the case of Spatial, the "room" you're working in isn't virtual, it's the real world. And the avatars you interact with actually look like the real you. Perhaps the most impressive part of the system — and there's a lot to be impressed with — is the simple fact that users without an AR headset can also participate from a desktop computer.
We've tested a number of AR and VR remote collaboration solutions over the past few years, and Spatial is a singular achievement, not just in execution but also in polish. At present, the app is available for the Microsoft HoloLens, but the team is very far along with its work in providing a Magic Leap One version as well.
If you find yourself impressed with the UX building blocks of the Mixed Reality Toolkit, you can thank Dong Yoon Park, the principal user experience (UX) designer for the HoloLens.
With a background in design and engineering, Park joined Microsoft as a UX designer in 2011. Before joining the HoloLens team in 2015, he worked on MSN and Bing apps for Windows 8, Windows Phone 8, Windows 10, iOS, and Android. In this capacity, he steered the company's Metro design language.
He currently leads ecosystem design as part of the cognition design team for HoloLens and Windows Mixed Reality. This includes the open-source Mixed Reality Toolkit (MRTK), the Mixed Reality Design Labs, Mixed Reality Academy, and Mixed Reality Dev Center.
His early experiences with painting software and computer graphics led him to pursue an electrical engineering degree from Korea University, but Park put his studies on pause to work as a web designer and programmer during the dot-com boom. After serving in the Republic of Korea Army and rising to the rank of sergeant, Park returned to Korea University to earn his bachelor's degree in engineering. Upon graduation, Park joined Samsung Electronics, working on GSM/WCDMA user interfaces and prototype mobile phones as a software research engineer in the telecommunications research and development center.
After working alongside Samsung design teams on user interface projects, he decided to shift his career to the design side of tech. He returned to school to study graphic design at the Samsung Art and Design Institute and then the Parsons School of Design in New York.
For his Master of Fine Arts thesis, he developed the original version of his first HoloLens app, Typography Insight, for iOS. The app eventually became the number-two app in the education category of the App Store.
Park has continued to lean on his aptitude for app development and design for the HoloLens, exploring typography with News Space, an app that reimagines news reading apps for augmented reality. His latest app, Type In Space, takes advantage of the new two-handed interactions in the Mixed Reality Toolkit.
Giving users the ability to interact intuitively inside AR headsets and decipher text is critical to ensuring that AR technology takes its natural next step as the heir apparent of personal computing. Park's penchant for design and development is what shapes that experience for the HoloLens.
Newcomers to the AR space may not know Jeri Ellsworth, but they should, because, back in 2013, long before the current AR explosion of activity, she toiled tirelessly to offer an AR device to the public via her startup castAR. Although the company raised over $1 million in funding via Kickstarter, and an additional $15 million from Playground Global, it ultimately folded in 2017, and one bright and shining light leading the way to the future of AR was dimmed.
But the kind of grit that makes founders like Ellsworth get started in the first place is the same toughness that makes her return to the AR space no surprise. Ellsworth's latest venture is called Tilt Five, and it's a project designed to focus primarily on tabletop gaming in AR. Although we haven't had a chance to directly sample the system, what we know is that it consists of a completely original AR headset and wand-style controller, which the company hopes to sell for less than the price of some of the popular standalone VR headsets like Oculus Quest (which sells for about $500).
There are many evangelists and prognosticators within the AR industry who have trusted their gut and stayed on the path toward realizing AR as a business, but Ellsworth is one of the rare individuals who actually put her professional future on the line to try to help create the AR future we've all been dreaming about. In the past, Ellsworth has worked as an electrical engineer at Realtime Associates, NewTek, and, most notably, Valve, as a research and development hardware engineer.
But don't let her rigorous and high-end track record fool you: Ellsworth also uses technology to just have some plain old fun. And if you follow her online, you'll likely learn a few things in the process. To every DIY super geek out there who has ever taken apart an engine or a computer, Ellsworth — who dropped out of high school and taught herself to design computer chips — is the example of someone taking sheer drive, inspiration, curiosity, and talent and shaping their world around them. Now, all that energy and brilliance is back in the AR fold, and the entire community will probably be all the better for it.
They say the second time is the charm and, based on the hot activity in the AR space, Ellsworth's timing might finally be on target. The company plans to launch a Kickstarter for the Tilt Five system this month.
The saying goes: If you build it, they will come. And in the realm of developing augmented reality experiences, building often comes via the Unity 3D engine. According to Unity Technologies, the 3D engine has birthed about 60% of AR and VR experiences, with more than 91% of HoloLens experiences coming from Unity.
The person responsible for facilitating a good deal of augmented reality development in Unity is Timoni West, the director of XR for Unity Labs. That makes her one of the most crucial leaders in the industry.
One of the first tools West worked on after joining Unity as a principal designer in 2015 was EditorXR, the first virtual reality tool for Unity. West was promoted to head of the authoring tools group, where she led a group of engineers and designers in building prototype immersive experiences, as well as shaping Unity's strategy for AR and VR.
She moved into her current role in 2017, where she now leads Unity's efforts in building advanced spatial computing tools. Perhaps the most anticipated of those tools is Project MARS (Mixed Augmented Reality Studio). MARS is a Unity extension that will enable developers to create apps that can interact with the real world. In addition to building practical tools, West's team researches spatial computing.
A magna cum laude graduate of George Mason University, West's resume includes stints at some of the tech world's most high-profile properties. As a senior product designer at Flickr from 2008 to 2011, she redesigned the site's homepage and upload function, among other things. Under the same job title at Foursquare in 2013, she worked on concepts for onboarding redesigns, iPad apps, gamification, and location tracking.
She also enjoys a role as a frequent speaker within the tech space, with at least 40 keynotes and presentations to her credit (and that doesn't even include her appearance at Augmented World Expo (AWE) 2019).
The most succinct insight into West and her role in shaping how augmented reality experiences are built, though, comes in her own words from her own website.
"My interests range from creation tools to privacy and intent, digitally augmented social and physical spaces, applied machine learning, and hardware for new mediums — specifically inputs for spatial computing," says West. "I love hearing about new work in rendering tech, from headsets to holograms, and any company pushing the boundaries on how we can create a future we all want to live in."
Both Apple and Google view web-based augmented reality experiences as a key component of the mobile AR experience, with each company creating its own methods for building AR experiences for mobile browsers.
But startup 8th Wall, led by CEO and co-founder Erik Murphy-Chutorian, beat them both to the punch while breaking through their walled gardens. The company's web-based platform, 8th Wall Web, not only arrived on the scene while Apple's AR Quick Look was available in beta versions of iOS 12 and Google's flavor of web AR was corralled in its experimental edition of Chrome, but also supported web-based experiences on iOS and Android.
Now that Apple and Google have their web-based AR toolkits in the real world, 8th Wall has continued to iterate on its own platform, adding helpful features like AR Camera for quickly previewing 3D content in AR and Image Targets for building the kind of experiences that Snapchat is so fond of.
Murphy-Chutorian is not exactly an outsider in the tech industry, though. Like several other members of the NR30, he cut his teeth with the big players. He started at Google as a software engineering intern between June 2006 and Feb. 2007. He returned to Google in 2008 as a senior staff software engineer, working on image search, earning a promotion in 2012 to engineering manager in charge of Google Photos, with several cool features such as Auto Awesome and Photo Search developed under his supervision. He left Google in 2015 for Facebook, where he worked as an engineering manager for the Messenger product.
After a year and a half at Facebook, Murphy-Chutorian struck out on his own, founding 8th Wall in August 2016 fueled by the belief that "AR is for everyone." Within a year, Murphy-Chutorian and company capitalized on that ethos, launching 8th Wall XR, a cross-platform toolkit that enables developers to build AR apps for devices that lacked ARKit or ARCore support.
8th Wall's prodigious rise is foreshadowed by Murphy-Chutorian's scholarly work as well. After earning his undergraduate degree in engineering at Dartmouth College in 2002, he continued his studies at the Electrical and Computer Engineering school at the University of California, San Diego, earning his master's in electrical and computer engineering in 2005 and his Ph.D. in computer vision in 2009. While there, he was also a member of the Computer Vision and Robotics Research Laboratory (CVRR) and the Complex Systems and Cognition Laboratory (CSCLAB). He has authored or co-authored more than 20 research papers in the fields of computer vision and machine learning, with a particular focus on object recognition and visual tracking.
His accomplishments in the field of computer vision have paid off with 8th Wall, as the company has raised a total of $10.4 million in funding, capped by an $8 million funding round in 2018. While its funding is dwarfed by the rounds closed by the likes of Magic Leap, Niantic, and others working in AR, the company has started to reward the faith of its investors over the past year. To date, 8th Wall Web has earned high-profile brand activations for Sony Pictures, Toyota, and MillerCoors.
Under Murphy-Chutorian's leadership and expertise, 8th Wall has jumped out ahead of the world's most valuable tech companies in the realm of web-based AR. Maintaining that lead will take a herculean effort, but, for now, Murphy-Chutorian has earned a place among the leaders of AR.
In a short period of time, Anjney Midha, co-founder and CEO of Ubiquity6, has crossed paths with a few NR30 mainstays who shaped the direction that led him to launch his AR cloud startup. So it is fitting that he finds himself listed alongside them this year.
After growing up in India and attending high school in Singapore, Midha earned his undergraduate degree as well as his master's in biomedical informatics at Stanford University. While at Stanford, one of his neighbors was NR30 alum Evan Spiegel, who was working on an early prototype of Snapchat. He also met Ankit Kumar, who would provide the computer vision expertise for Ubiquity6.
During his Stanford schooling, he also worked as an intern for venture capital firm Kleiner Perkins Caufield and Byers (or Kleiner Perkins for short), which is now an investor in Ubiquity6. While at Kleiner Perkins, Midha led the firm's investment in Magic Leap in 2013. His experience with Magic Leap would also reignite his interest in AR, which started at Stanford.
In 2015, Midha, along with colleagues Roneil Rumburg and Ruby Lee, founded KPCB Edge, a $4 million early-stage fund at Kleiner Perkins focused on emerging technologies, with Midha focused on companies working in computer vision, augmented reality, and virtual reality. During his time at KPCB Edge, Midha worked with Kleiner Perkins general partner Michael Abbott, a former Twitter executive and current vice president of engineering at Apple, who would act as the founding advisor for Ubiquity6.
Meanwhile, in their spare time, Midha and Kumar began working on the prototype of the Ubiquity6 platform. In July 2017, Midha, Kumar, and Abbott left their respective jobs to officially launch Ubiquity6, with KPCB Edge closing upon Midha's departure. The trio assembled a team of engineers from Metamind, Twitter, Facebook, Tesla, Magic Leap, Electronic Arts, and Zynga to work on the platform.
Ubiquity6's AR cloud platform, like other competing platforms, enables a more connected and immersive version of mobile AR. With Ubiquity6, developers can build multi-user experiences, in which every participant sees the same content in the same space. That content can also be persistent, meaning it stays anchored in the same place over time, and occluded, meaning virtual objects can occupy space and appear in front of and behind physical ones.
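To make those concepts concrete, here is a minimal, purely hypothetical sketch of the core AR cloud idea: content is attached to a shared, persistent anchor in a common world coordinate frame, so any session that resolves the same anchor sees the same content in the same place. None of the names below come from Ubiquity6's actual API; they are illustrative stand-ins.

```python
# Hypothetical sketch of an AR cloud anchor service (not Ubiquity6's API).
from dataclasses import dataclass, field

@dataclass
class Anchor:
    """A world-space pose that outlives any single device session."""
    anchor_id: str
    position: tuple            # (x, y, z) in a shared world coordinate frame
    content: list = field(default_factory=list)

class ARCloud:
    """Toy in-memory stand-in for a cloud anchor store."""
    def __init__(self):
        self._anchors = {}

    def host_anchor(self, anchor_id, position):
        # Persist the anchor so later sessions can resolve it (persistence).
        return self._anchors.setdefault(anchor_id, Anchor(anchor_id, position))

    def resolve_anchor(self, anchor_id):
        # Any device that resolves the anchor gets the same content
        # in the same place (multi-user, shared space).
        return self._anchors.get(anchor_id)

# Session A places content; session B, joining later, sees it in place.
cloud = ARCloud()
a = cloud.host_anchor("museum-lobby", (1.0, 0.0, 2.5))
a.content.append("voxel-sculpture")

b = cloud.resolve_anchor("museum-lobby")  # a different session, same world
print(b.position)  # -> (1.0, 0.0, 2.5)
print(b.content)   # -> ['voxel-sculpture']
```

A real platform must also relocalize each device against a shared 3D map of the space and compute depth for occlusion; this sketch only captures the shared-anchor bookkeeping.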
In the first public demonstration of the technology, hundreds of participants viewed, interacted with, and contributed to a Minecraft-like virtual art installation at the San Francisco Museum of Modern Art.
The platform has also attracted some high-profile support from investors. Ubiquity6 closed a $10.5 million funding round in March 2018 with Gradient Ventures, Google's AI investment arm. In a subsequent funding round, in August 2018, Ubiquity6 raised another $27 million. The company has also attracted the attention of Disney, which selected Ubiquity6 to participate in its accelerator in 2018.
Now, along with Niantic and NR30 alum 6D.ai, Ubiquity6 is among the leading candidates to bring the AR cloud to fruition.
Jesse McCulloch was doing enterprise .NET development when the first HoloLens was announced. But as he watched the live presentation, he knew he had to get his hands on one.
"My first thoughts were, 'There's no way this is real,'" he said on the VRARA podcast.
Eventually, Microsoft gave him his opportunity via a five-question survey form for early access consideration, which asked participants what they planned to do with the HoloLens and how they might change the world with it. McCulloch didn't know what he wanted to do with the HoloLens; he just really, really wanted one. So that's what he wrote down. He submitted his application with low expectations of getting selected.
As Microsoft commenced manufacturing the first devices in waves, the company emailed developers to notify them that their HoloLens was ready if they were prepared to purchase it. McCulloch was one of the first wave of applicants to receive that email.
Luckily, McCulloch had just received his yearly bonus, so he had the disposable income to make the purchase. But he hesitated, not sure how he would generate a return on his investment. He finally decided that if he didn't have an idea, he could at least learn how to develop for it and then sell his services to those who did have app ideas.
When the HoloLens finally arrived, he put the headset on with a bit of apprehension. After powering it on, he soon found that it lived up to all of his expectations. After playing with a helicopter game and watching the virtual interactions, he caught on to the device's propensity to trick the user's mind.
He then dove right into developing for the device, which entailed learning Unity from the ground up. Seeking a community for support and tutelage, he started a Slack channel for HoloLens developers. Eventually, Alex Kipman, the inventor of the HoloLens, caught wind of the community and shared it with his followers on Twitter.
Once he found his footing, McCulloch started his own HoloLens development firm, Roarke Software, while remaining active among the HoloLens developer community.
His efforts in facilitating the needs of the HoloLens developer community eventually attracted the attention of the higher-ups at Microsoft. Now, McCulloch manages the HoloLens community for a living, filling a role as a mixed reality developer relations program manager at Microsoft.
After starting out in the great unknown, McCulloch now serves as the sherpa for others looking to explore the next frontier in augmented reality. With the HoloLens 2 arriving soon, McCulloch can expect to lead many more adventurers along their way.
Before joining Magic Leap, Cathy Hackl had already established herself as an active voice within the augmented reality community.
Hackl's previous professional work was primarily focused on journalism and public relations. She earned her bachelor's degree in journalism from the University of Texas at Austin in 2000 and a double master's in mass communications and international studies in 2004 from Florida International University.
In the sphere of journalism, she has worked as a communicator at CNN, Discovery, and ABC News. Later, her career moved into the practice of public relations, with Atlanta-based agency Spaulding Communications and the University of Florida among her stops. She also earned her Accredited in Public Relations (APR) professional certification in 2015.
She began using her media skills within the immersive technology field in 2016 as the chief communications officer at VR studio Future Lighthouse, where she collaborated on projects for Sony Pictures Entertainment, Oculus, Beefeater, and William Morris Endeavor. Later, she served as VR evangelist for HTC Vive for the launch of its enterprise VR headset and its partnership with Warner Bros. for the film Ready Player One. Then, as the lead futurist at You Are Here Labs, Hackl consulted with brands like AT&T and Porsche regarding their augmented and virtual reality strategies for marketing and training. Hackl has also assisted in marketing efforts for Oculus VR and Mozilla's web-based AR/VR initiatives.
Now at Magic Leap, Hackl is a member of the company's enterprise strategy team, where her role involves executing Magic Leap's go-to-market efforts for spatial computing solutions for manufacturing, architecture, engineering, construction, automotive, education, healthcare, and other industries.
Hackl has also emerged as a passionate voice for immersive technology in general. She co-authored Marketing New Realities, a book on AR/VR marketing, and serves as a global advisor for the VR/AR Association. She has been recognized as one of the top Latina women working in augmented reality by NBC News. Working alongside co-author John Buzzell, Hackl is currently writing her second book, The Augmented Workforce: How AI, AR, and 5G Will Impact Every Dollar You Make, which is scheduled to hit shelves by the end of 2019.
The futurist and author is also using her role to help foster diversity in the technology and communications space as a co-founder of Latinos in VR/AR, and she's the president of the Georgia Association of Latin American Journalists.
While augmented reality is pitched as a new storytelling medium, the industry also needs people who can relay the story of augmented reality itself. Hackl has relished the role of telling AR's story and is putting forth the effort to ensure the world hears it.
Trigger, led by CEO and executive creative director Jason Yim, is responsible for more AR marketing campaigns and branded experiences than perhaps any other agency on Earth.
Let's drop some names. Trigger built the Spider-Man app for the past two live-action films as well as the web-based AR ads for Spider-Man: Into the Spider-Verse. The firm created the Apollo 11 AR experience via the new Time Immersive app, as well as interactive covers for Time, Sports Illustrated, and Entertainment Weekly through the Life VR app. Trigger has also produced AR experiences in the Star Wars app and worked on The Last Jedi AR sticker pack for Google Playground. The agency is also the creative force behind AR apps for Travelocity, the NBA, the PGA Tour, Marvel's Masters of the Sun, ABC News, and Moviebill. And that's just scratching the surface.
Trigger was an early adopter of the leading AR development platforms, including ARKit and ARCore, Magic Leap's Lumin OS, Amazon Sumerian, Lens Studio, Spark AR, and 8th Wall Web.
In total, Yim has led more than 125,000 hours of development for augmented reality experiences. He also has been awarded four patents for immersive technology with other patents pending. Partnering with Honda Advanced Design, Yim won back-to-back LA Auto Show Design Challenges.
Yim also serves as an evangelist for AR. He's been a featured speaker at top technology and industry conferences, taking the stage in New York, Los Angeles, San Francisco, Singapore, Shanghai, Berlin, Tokyo, Copenhagen, and London. He has also delivered a TEDx Hong Kong presentation on using computer vision in toys and appeared in Apple's first TV show, Planet of the Apps.
Born in Singapore and raised in Hong Kong, Yim earned his undergraduate degree in graphic design from the University of California at Los Angeles in 1995. Later, along with friends from college, he started a web design firm. In 1996, one of their first customers, Academy Award-nominated composer Hans Zimmer, poached Yim to serve as president and creative director of Media Revolution, Zimmer's marketing firm.
Yim left Media Revolution in 2005 to start Trigger, working out of his home and calling on former clients for work. Within four years, Trigger was generating $5 million a year in revenue. And that was before the firm got into augmented reality.
Marketing is one of the first consumer-focused use cases of augmented reality to take hold, and Yim and company have jumped ahead as the premier firm practicing the art of AR marketing.
Last year, as The New York Times was rolling full steam ahead with AR storytelling, USA Today had just started feeling out the possibilities of AR, earning a spot among the most innovative companies in AR for its efforts.
This year, with the Times taking a step back from AR to recalibrate, USA Today has forged ahead to fill its space as the leader in AR news coverage.
Roy Soto, director of emerging technology at USA Today Network, leads the media company's AR efforts. In his role, he sets the creative direction for the AR experiences and oversees and assembles teams of artists, designers, engineers, and developers who work with editorial, product, marketing, and sales counterparts to construct AR news experiences.
Soto's background is (mostly) in video game development. After working as an analyst for the Department of Defense for six years, he earned his bachelor's degree in fine arts from the Art Institute of Washington in 2003. He started out creating 3D assets at Dynamic Animation Systems, makers of training simulations, before landing at Electronic Arts (EA), where he built game and environmental assets for top-tier games. After five years at EA, Soto took a position as art lead at NCSoft, a South Korea-based game developer, but returned to EA after a year to serve as associate art director and outsource manager.
In 2014, he accepted his current position at the USA Today Network. Some of the earlier projects in his tenure included a VR exploration of the USS Eisenhower and the Pulitzer Prize-winning multimedia dissection of the controversial US border wall.
The AR coverage from the USA Today Network has truly taken off under Soto's direction. His very first AR project for USA Today told the story of corruption in Chicago, and the coverage gained steam in 2018 with the 321 Launch app for Florida Today for viewing SpaceX rocket launches.
In 2019, USA Today has covered Oscar-nominated costumes, the NHL, the fire at the Cathedral of Notre Dame, the world's tallest skyscrapers, and the USA Women's Soccer Team. The media organization has also used AR to explore the history of the Apollo 11 moon landing and the 17th century slave trade.
Soto's direction has built momentum behind the AR coverage at USA Today, with no signs of slowing down. So far, the team's approach toward AR experiences, which combines 3D content with spatial audio, written narrative, and even gameplay to immerse users in the story, has proven to be a winning formula.
Congratulations to the members of the 2019 edition of the NR30! The varied talent and backgrounds of this year's honorees are a testament to the rich talent pool and overall vibrancy of the growing immersive computing space. If this list is any indication, the future of AR is so bright, we're gonna need (and want) smartglasses shades.
Reporting by Adario Strange & Tommy Palladino