AR Spotlight: How Snapchat Lens Creator Audrey Spencer Is Bending Reality on Your Smartphone
The rapid advance of Lens Studio as a platform for easily developing augmented reality experiences is just one indicator that immersive computing is becoming the norm.

But one layer above the technology tools are the users, the real stars of any platform. Without passionate users, your platform is really just a great idea without the footprint of real-world usage and momentum. That's part of why we decided to begin looking closer at some of the people who are driving the most mainstream version of AR forward via mobile apps.

First up is Audrey Spencer, a prolific Snapchat Lens creator (you can follow her on Snapchat here) who began years ago as a casual user, and ended up developing some of the most engaging AR experiences on the app. That experience has led Spencer to begin investigating taking her work to the next level by learning to develop experiences in the world of Unity.

Image via Audrey Spencer

In addition to Spencer's work in smartphone-based AR, she's also currently working with San Francisco-based AR startup Kura Technologies as its lead industrial designer. If you're looking to get a peek into one of the most active minds leading the mainstreaming of AR, start here; this is just the first of many such explorations with AR creators.

Next Reality: Where are you from? And what was your first entry into any kind of immersive media, be it VR, AR, mobile AR, whatever.

Audrey Spencer: I'm from the Boston area, 20 minutes outside of Boston. I went to school (Massachusetts College of Art and Design) for industrial design, but I also did a lot of video work and audio, as well as a lot of multimedia projects. Storytelling is what has driven me throughout my whole career, whether it's industrial design or creating stuff for the internet. In 2014, I started making content on Snapchat. It wasn't AR or VR, but you know, mobile stuff, using the drawing tools, and some of the other tools to create stories in a different way.

Next Reality: That's around the same time I started dabbling with Snapchat. Were you playing with any other kind of social media?

Spencer: I was very used to drawing on my screen on my phone, that's kind of what led to drawing in Snapchat. I actually didn't really like social media all that much. I had a Facebook account, I didn't really post on it, but there was just something about Snapchat's disappearing messages. You could try things out. It was more freeing and less like YouTube, where you have to produce everything.

Next Reality: Do you remember the first time you saw augmented reality pop up on Snapchat?

Spencer: I think it was around 2016. Maybe it was that rainbow-barfing Lens. My first reaction to it was: I wonder what else I could get it to track? Some of the filters worked on one of my cats, and that was very exciting. That was my first introduction to it. I always thought they were really neat, but how they were made was kind of a mystery to me. Like, how does it know what my face is?

Next Reality: How did you eventually begin to figure out what they were doing behind the scenes to get the AR to work in terms of tracking, etc. before you even touched Lens Studio?

Spencer: That's a good question. So a few of my friends who I met through doing artwork on Snapchat had transitioned from just doing art and storytelling on Snapchat to making Lenses. And they were like, "Oh, you work in 3D, you should totally do this. It's right up your alley." It probably took me two years of just fiddling with it to finally get the hang of it.

Next Reality: Talk a bit more about your experience with Lens Studio. I think you said you first started using it in 2018?

Spencer: Around 2018, I did this video series. I had this photo of my cat Oscar where he was sitting facing you. And it kind of looked like a snowball. Like you couldn't see his body or legs, just his beard and his face. And it kind of looked like an orb. So I made all these videos of him just kind of hovering in different locations. And then another one where he was an ice cream cone, and then an orb.

So the first thing I did was I took the 2D image of Oscar as an orb and I put it on a grounded plane that you could move around. Like, here, you can have your own orbs floating in your house. That was my first. Back then, Lens Studio was intimidating; it wasn't as simple as it is now. I didn't know the program, but I did have a starting point for understanding it [via experience with Photoshop, Final Cut, and other apps].

Next Reality: Your experience seems unique. Most people didn't have a background in nonlinear editing and advanced programs like Adobe After Effects. Do you think it still requires some of that kind of mental framework in terms of quick, easy AR creation within Lens Studio?

Spencer: I think, having started using Unity, I have a great respect for how Lens Studio has taken those same tools and kind of molded them into something that everyone can use. Some of the more complex stuff requires coding or using their tutorials. But I think at an entry level, if you just want to put a sprite on your forehead, add some particle effects, or put some makeup on a face, it's a really great tool to get people started thinking in that way, because they walk you through it well in lots of tutorials, and the pre-built stuff is easy to understand, for the most part.

Next Reality: I feel like I've seen more music artists using mobile AR and fewer traditional visual artists taking the leap into mobile AR. Specifically, with regard to Lens Studio, I often liken it to Photoshop for AR, so do you have any sense as to why we're not seeing even more traditional artists adopting AR?

Spencer: I think it's because there's still like a mystique around AR…how it's made, and not knowing how accessible it is. I think we'll probably see more AR work as people get more involved and have a better understanding of the technology.

Image via Audrey Spencer

Next Reality: What is your reaction regarding Snapchat and how they're responding to TikTok AR filters and Facebook's Spark AR creation tool? What's your take on the mobile AR space in general? Do you dabble in Spark AR and TikTok?

Spencer: I have a TikTok account, which I haven't used or posted on in a number of years. I like watching TikTok videos on Reddit, but I don't really use the app too much. In terms of Spark, actually, the first Lens I made that wasn't just a floating orb was something complicated that I did in Spark AR, for an HBO Watchmen contest.

It's been a long time since I've done anything in Spark AR. But one of the big differences for me, at least visually, is just the materials, the lighting, and the textures in Lens Studio are so much better. I don't know what they do, but everything just looks nicer and better lit. Objects look nicer and attach to the head better. Basically, on the back end, they have a lot better logic and processing, in my opinion.

Next Reality: From a passion standpoint, not technical, what is it about AR that fascinates you? And how do you describe the space when you talk to people who aren't that familiar with AR?

Spencer: I think AR is the future of computing. There's so much you can do, and it can simplify your life. And as a designer, it excites me. Communicating with people in AR across the world, and just making life more seamless. I think that's the insane part for me. As a Lens creator, and just as a creator in general, what excites me the most is creating stories that people can interact with on their own terms. Instead of just watching my Snapchat, they can interact with a Lens.

Use this Snapcode to follow Audrey Spencer on Snapchat

Next Reality: We talked previously, and I was surprised that you didn't seem that optimistic about AR glasses when discussing the future of AR in terms of everyone walking down the street using them.

Spencer: I want to clarify. I am optimistic about it. Especially because the price will come down and they'll become lighter as technology gets better. And I think that they'll become more akin to how we feel about phones now. We know some of the big problems with AR glasses that some of the mainstream companies are facing now are (lens) transparency and (image) brightness.

So they have the lenses tinted so that you can see the images, which means you also can't see people's eyes as well. And then they're not very bright, so you can't use them outside. And they're heavy. So right now, there's a lot of issues. But people will find ways to fix them, including some folks that I work with. My point is, I'm optimistic, it's just that current products need to improve.

Cover image via Audrey Spencer
