
Snap London Director Qi Pan On The Snap Partner Summit 2021

Author: Robyn Foster

Our Content Partner site Women Love Tech talks to Qi Pan, Director of Computer Vision Engineering at Snap London, about the future of AR and the Snap Partner Summit 2021.

What does the future hold with AR for Snapchat?

Augmented Reality has been a key part of the Snap journey from day one. Starting with entertaining Lenses, we’re now seeing AR move from fun to function. We believe that the long-term success of AR depends on a diverse creator community, and we are continuing to invest in Lens Studio to support this growth.

The Snap AR platform continues to evolve, and we’re excited to expand it by partnering with brands and organisations to create new capabilities and possibilities. While we are developing tools to meet current user and industry interest in AR, such as improved AR eCommerce and try-on, we’re also making the camera smarter: creating the building blocks of a future where the camera can understand the world around it, both inside and outside, and offer additive experiences, whether they are around communication, education, entertainment, shopping or other utility.

How are you involved in the process and what are you excited about?

My team is working together to build the future of AR at Snap, from mapping the world around us, through Landmarkers and Local Lenses and recently Connected Lenses, to understanding how the Snap camera sees and understands objects and people, e.g. sky and ground segmentation, our partnership with Apple around LiDAR, 3D Body Mesh, and more.

I’m particularly excited about Connected Lenses, because for the first time friends will be able to interact with each other through Lenses from anywhere in the world. Up until now, Snapchat’s AR experiences have been solo or have relied on you being in the same physical space to have a shared experience, so introducing communal experiences that allow for real-time collaboration between multiple people, wherever they are in the world, feels like a really important and exciting step forward.

We’ve launched this technology with LEGO’s ‘Rebuild the World’ Lens, which enables users to build LEGO in shared virtual spaces in different physical locations, and we also announced co-located Connected Lenses, where Snapchatters can see the same thing in the same physical environment.

And we can’t wait to see what our community makes of these new capabilities. Our AR creator ecosystem is booming with over 200,000 Lens Creators around the world, who have made nearly 2 million Lenses that have been viewed by Snapchatters more than 2 trillion times. With our newest Lens Studio 4.0, it’s exciting to open up all our latest AR technology, including 3D Body Mesh, Connected Lenses and Cloth Simulation to everyone. Both Connected Lenses and Body Mesh in Lens Studio have an amazing understanding of people and the scene around you, so while we’ve started with LEGO, Lens Studio 4.0 allows anyone to create these experiences.

What is it that people love about AR and in what way will it become ubiquitous to their lives?

Snap has long been a pioneer in AR; currently, over 200 million Snapchatters engage with AR every single day. With the exciting features we’ve just announced, we hope to see continued growth.

For the majority of Snapchatters, the beauty of AR on the platform is that it just works. Most Snapchatters wouldn’t think ‘I am using AR’ but instead immediately see the fun or value in using the camera this way. Snapchatters are growing more familiar and more comfortable with AR. The huge engagement has allowed us to learn and iterate at a rapid speed, and we have started to see AR transcend play or entertainment, with Snapchatters now adopting the technology for everything from helping with their workout, to trying on makeup or shoes, to scanning a nutrition label and even helping with a math problem.

The AR market is expected to see a 10-fold increase in value by 2023. Though today the technology is used primarily for entertainment, experts predict it will be applied in a much wider variety of industries in the coming years – from marketing and education to construction and agriculture – to streamline processes, reduce human error and support training. AR on Snap has proven its use case and demonstrated how it can help make life easier, and it will continue to be an important part of Millennial, Gen Z and soon Generation Alpha’s lives.

For the smartphone generation, the camera is becoming the interface to the internet, the same way the search bar is for the computer generation.

How AR has become ubiquitous

  • Interactive lenses. For the first time, friends can interact with each other through Lenses, from across the room or across the world. Snap and the LEGO Group have created the first Connected Lens, so starting today, friends can collaboratively build with LEGO bricks on Snapchat.

  • Snapchat’s camera can ‘scan’ outfits to help you shop – Screenshop is a new Scan feature: when you Scan a friend’s outfit or your own saved photos, it helps you shop similar looks with recommendations from hundreds of brands. This builds on our existing Scan capabilities, which use the Snapchat camera to identify plants, dog breeds, wine bottles, car models, songs and more through partner integrations.

  • Voice and gesture-enabled AR shopping on Snapchat (including example from Prada) — New ‘voice-enabled commands’ let you verbally control our AR Lenses, and over 40 commands like ‘next’ or ‘take a Snap’ let you go hands-free, even from across the room. New ‘gesture-enabled controls’ let you physically signal Lenses to take an action, like showing you a different item. Prada is tapping into new gesture recognition capabilities that let shoppers signal to the camera when they want to try on another item.

Article courtesy of our Content Partner Site - Women Love Tech

