Snapchat is taking a significant step toward solidifying its role in the augmented reality (AR) landscape with the unveiling of its next-generation Lens Studio at Lens Fest 2025. The update marks a strategic push by Snap to merge AI with AR development, making content creation more accessible while building a sustainable monetization system for creators.
At the center of this initiative is the redesigned Lens Studio platform, which now integrates generative AI tools that allow creators to produce visual effects, manipulate Bitmoji avatars, and remix popular templates with minimal effort. Instead of relying solely on traditional coding and asset design, users can now build and refine Lenses through natural text prompts. The updated platform is available across both mobile and web, aiming to simplify how creators build and publish AR experiences.
A major addition is “Blocks,” Snap’s new modular framework for AR creation. Blocks serve as reusable, prebuilt components—bundles of scripts, visual assets, and effects—that streamline Lens development. Accessible through Lens Studio AI, these components are designed to speed up experimentation while maintaining creative flexibility. This structure could position Lens Studio as one of the more efficient AR creation tools available to independent developers and brands alike.
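To make the idea concrete, here is a minimal sketch of what a "block"-style reusable AR component could look like. This is purely illustrative: none of the types or names below reflect Snap's actual Lens Studio API, and the asset paths are invented. It only shows the bundling pattern the announcement describes, where scripts, assets, and effects travel together as one composable unit.

```typescript
// Hypothetical sketch of a "block"-style reusable AR component.
// These types do NOT mirror Snap's real Lens Studio API; they only
// illustrate bundling scripts + assets + effects into one unit.

interface AssetRef {
  kind: "texture" | "mesh" | "material";
  path: string; // path within the block's bundle (illustrative only)
}

interface Block {
  name: string;
  assets: AssetRef[];
  // Per-frame behavior the host lens invokes; dtSeconds = time since last frame.
  update(dtSeconds: number): void;
}

// Example block: a simple spin effect a creator could drop into any lens.
function makeSpinBlock(degreesPerSecond: number): Block & { angle: number } {
  return {
    name: "spin",
    assets: [{ kind: "mesh", path: "meshes/prop.glb" }], // invented path
    angle: 0,
    update(dtSeconds: number) {
      this.angle = (this.angle + degreesPerSecond * dtSeconds) % 360;
    },
  };
}

// A lens composes blocks and drives them from its frame loop.
class Lens {
  private blocks: Block[] = [];
  add(block: Block) { this.blocks.push(block); }
  tick(dtSeconds: number) { for (const b of this.blocks) b.update(dtSeconds); }
}

const lens = new Lens();
const spin = makeSpinBlock(90); // 90 degrees per second
lens.add(spin);
lens.tick(0.5); // simulate half a second of frames
console.log(spin.angle); // 45
```

The design point is composition: the lens knows nothing about what a block does internally, so prebuilt components can be mixed, remixed, and swapped without rewriting the host experience.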
Beyond creative tools, Snap is expanding its AR monetization ecosystem. The existing Lens Creator Rewards program will now include Lens+ Payouts, which compensate developers based on engagement from Snapchat+ premium subscribers. The update reflects Snap's ongoing effort to make AR development financially viable rather than purely experimental. The company also announced that its Camera Kit, used to integrate Snap's AR technology into external apps and websites, will no longer carry a mandatory branding requirement. That change gives developers more freedom to deploy Snap's AR framework in professional contexts without overt platform ties.
This renewed focus on AR tools comes as Snap prepares for the next release of its Spectacles smart glasses, which will run on Snap OS 2.0. The new model promises deeper AR integration with features like Travel Mode, EyeConnect, and real-time social sharing capabilities. Perhaps most notably, Spectacles will debut the new Commerce Kit, allowing in-Lens payments and shopping experiences — an early step toward frictionless AR-based commerce.
Supporting all of this is Snap Cloud, a back-end infrastructure designed to handle high-performance AR workloads at scale. It offers fast APIs, real-time capabilities, and secure storage for developers building experiences that blend virtual content with real-world interactivity. From location-based adventures to multiplayer AR games, Snap Cloud is meant to give creators enterprise-grade reliability within the Snapchat ecosystem.
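To illustrate the kind of real-time shared state a multiplayer AR backend has to manage, here is a toy, in-memory sketch. It is not Snap Cloud's API, and every name in it is invented for illustration; it only demonstrates one common pattern in this space, last-write-wins updates with staleness filtering, that keeps out-of-order network packets from rewinding a player's position.

```typescript
// Illustrative sketch only: a toy in-memory store showing the kind of
// real-time shared state a multiplayer AR backend manages. This is NOT
// Snap Cloud's API; all names here are invented for illustration.

type PlayerId = string;

interface Pose {
  x: number; y: number; z: number; // world-space position of a player's anchor
  updatedAtMs: number;             // last-write timestamp for staleness checks
}

class SharedWorldState {
  private poses = new Map<PlayerId, Pose>();

  // Accept a pose update only if it is newer than what we already hold,
  // so packets arriving out of order cannot rewind a player's position.
  upsert(id: PlayerId, pose: Pose): boolean {
    const current = this.poses.get(id);
    if (current && current.updatedAtMs >= pose.updatedAtMs) return false;
    this.poses.set(id, pose);
    return true;
  }

  // Snapshot for broadcasting to clients, dropping entries older than maxAgeMs
  // (e.g. players who disconnected mid-session).
  snapshot(nowMs: number, maxAgeMs: number): Record<PlayerId, Pose> {
    const out: Record<PlayerId, Pose> = {};
    for (const [id, pose] of this.poses) {
      if (nowMs - pose.updatedAtMs <= maxAgeMs) out[id] = pose;
    }
    return out;
  }
}

const world = new SharedWorldState();
world.upsert("ana", { x: 0, y: 1, z: 2, updatedAtMs: 100 });
world.upsert("ana", { x: 5, y: 1, z: 2, updatedAtMs: 90 }); // stale, ignored
console.log(world.snapshot(200, 1000)["ana"].x); // 0
```

Whatever Snap Cloud's actual interface looks like, the value proposition the company describes is exactly this category of problem handled at scale: consistent shared state, low-latency fan-out, and durable storage, so individual creators don't have to build that infrastructure themselves.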
The company is also updating its Lens Games platform with live multiplayer matchmaking integrated directly into Chat, as well as improved developer tools for animation and camera control. These enhancements point to Snap's broader aim: to make Snapchat not just a social platform, but a full-fledged creative and commercial ecosystem for interactive AR.
While Snap’s heavy investment in AI-driven AR tools underscores its confidence in the medium, the question remains whether these innovations can drive user engagement beyond novelty. With competitors like Meta, Apple, and Niantic racing to dominate spatial computing, Snap’s success will depend on whether it can turn creative experimentation into long-term user habits — and consistent income for the developers who power its platform.