
Snap to Launch New Lightweight, Immersive Specs in 2026

We believe the time is right for a revolution in computing that naturally integrates our digital experience with the physical world. That’s why we’ve spent 11 years and more than $3 billion to invent a new type of computer for augmented reality, designed to enhance the physical world with digital experiences. We call them Specs: ultra-powerful wearable computers built into a lightweight pair of glasses with see-through lenses, and they’re coming next year.

Specs understand the world around you with advanced machine learning, bring AI assistance into three-dimensional space, enable shared games and experiences with friends, and provide a flexible and powerful workstation for browsing, streaming, and more.
We’ve always built our products and services in an attempt to make technology feel more human. When we first started Snapchat, people still believed that the digital world was separate from the “real” world. Digital services rarely reflected human nature. Social media was a popularity contest, conversations were recorded forever, and apps opened into endless feeds of content.
Snapchat changed all of that. We introduced ephemeral messaging so conversations felt more like talking face to face. We built vertical video so people could hold their phones comfortably while watching full screen. We designed Stories to put content in chronological order, the way stories have always been told, rather than in a reverse-chronological feed. And we’ve designed Snapchat to open into the camera, to inspire creativity and self-expression.
But we’ve always longed for something different. The tiny smartphone limited our imagination. It forced us to look down at a screen, instead of up at the world. It required our fingers to hold, swipe, and tap when we really wanted to live hands-free. It kept content confined to a small 2D rectangle, when we really wanted to experience life in all of its three-dimensional splendor.
And now at a time when computers are learning to think and act like humans with artificial intelligence, it’s clear that today’s devices and user interfaces are woefully inadequate to realize the full potential of AI. Chatbots will soon give way to immersive experiences that bring AI into the world through augmented reality, empowering us to express ourselves, live in the moment, learn about the world, and have fun together.
Developers are already building new experiences for Spectacles, our fifth generation of glasses, released in 2024 and designed to help developers prepare for the public launch of Specs next year. We’ve seen incredible innovation from all around the world, including:
Super Travel from Gowaaa helps global travelers translate signs, menus, and receipts and convert currencies.
Drum Kit from Paradiddle teaches new drummers how to play by overlaying cues on a real drum set and listening to the notes.
Pool Assist from Studio ANRK helps players make better shots in pool.
Cookmate from Headraft finds recipes based on available ingredients and provides step-by-step cooking guidance in the kitchen.
Wisp World from Liquid City brings you on whimsical adventures to playfully explore the world around you.
We’re also announcing major updates to Snap OS, building on feedback and suggestions from our developer community:
Deep Integrations with OpenAI and Gemini on Google Cloud: We now enable developers to build multimodal AI-powered Lenses and publish them for the Spectacles community. For example, developers are using AI to provide text translation and currency conversion (Super Travel), suggest recipes (Cookmate), and take you on whimsical adventures (Wisp World) based on what you see, say, or hear while wearing Spectacles. We offer camera access designed with privacy in mind through our proprietary Remote Service Gateway.
Depth Module API: Translates 2D information from large language models to anchor AR content accurately in three dimensions, unlocking a new paradigm for spatial intelligence.
Automated Speech Recognition API: Enables real-time, high-accuracy transcription with support for 40+ languages, including non-native accents.
Snap3D API: Lets developers generate 3D objects on the fly inside Lenses; a sketch of how these APIs might fit together follows this list.
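To make the shape of these APIs concrete, here is a minimal sketch in Lens Studio’s TypeScript component style. It assumes an ASR module that streams transcripts to a callback and a Snap3D call that turns a text prompt into a scene object; the names AsrModule, startTranscribing, and Snap3D.generate are illustrative assumptions, so check the Lens Studio documentation for the actual interfaces.

```typescript
// Hypothetical sketch: listen for speech, then generate a 3D prop from the
// final transcript. The component structure follows Lens Studio's TypeScript
// scripting style; the ASR and Snap3D names are assumptions, not confirmed APIs.
@component
export class SpeakToSpawn extends BaseScriptComponent {
  // Assumed input: an ASR module asset assigned in the Inspector.
  @input asrModule: AsrModule;

  onAwake() {
    // Assumed signature: stream interim and final transcripts to a callback.
    this.asrModule.startTranscribing({
      onTranscription: (text: string, isFinal: boolean) => {
        if (isFinal) {
          print("Heard: " + text);
          this.spawnProp(text);
        }
      },
    });
  }

  private spawnProp(prompt: string) {
    // Assumed call: Snap3D generates a mesh from a text prompt on the fly
    // and resolves with a SceneObject we can parent into the scene.
    Snap3D.generate(prompt).then((generated: SceneObject) => {
      generated.setParent(this.getSceneObject());
    });
  }
}
```

In a production Lens, the transcript could instead be routed through the Remote Service Gateway to OpenAI or Gemini for interpretation before anything is spawned.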
And, we’re launching new tools specifically for developers building location-based experiences, making it easier to bring monuments, museums, and more to life:
Fleet Management app: Enables developers to remotely monitor and manage multiple pairs of Specs.
Guided Mode: Developers can configure Specs to launch directly into a single Lens, single-player or multiplayer, for a seamless visitor experience.
Guided Navigation: This feature makes it easy to build AR-guided tours that direct people through a series of landmarks at events or museums; a data-model sketch follows this list.
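Under the hood, a guided tour is naturally an ordered list of geo-anchored stops. The following is a minimal, self-contained TypeScript sketch of that data model and an advance-on-arrival check; every type and field name here is hypothetical, not taken from Snap’s tooling.

```typescript
// Hypothetical data model for an AR-guided tour: an ordered list of
// geo-anchored waypoints a Lens walks visitors through in sequence.
interface TourWaypoint {
  name: string;           // e.g. "Main Hall Mural"
  latitude: number;       // WGS84 coordinates for the anchor
  longitude: number;
  narration: string;      // text shown or spoken on arrival
  arrivalRadiusM: number; // how close the visitor must get, in meters
}

const museumTour: TourWaypoint[] = [
  { name: "Entrance",  latitude: 41.8796, longitude: -87.6237,
    narration: "Welcome! Follow the arrows ahead.", arrivalRadiusM: 5 },
  { name: "Main Hall", latitude: 41.8799, longitude: -87.6240,
    narration: "Look up at the ceiling.", arrivalRadiusM: 8 },
];

// Advance to the next waypoint once the wearer is inside the arrival radius;
// otherwise stay on the current stop.
function nextWaypoint(tour: TourWaypoint[], index: number, distanceM: number): number {
  return distanceM <= tour[index].arrivalRadiusM
    ? Math.min(index + 1, tour.length - 1)
    : index;
}
```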
These tools support developers like Enklu, which operates the holographic theater Verse Immersive in more than a dozen locations across the US. Now, Verse Immersive customers in Chicago can use Spectacles to play Enklu’s new game SightCraft with friends, and it will roll out to more locations this year.
And coming soon:
Niantic Spatial VPS: We’re partnering with Niantic Spatial to bring their Visual Positioning System to Lens Studio and Specs, helping build a shared, AI-powered map of the world.
WebXR Support in the browser: Will enable developers to build, test, and access WebXR experiences; a minimal example follows below.
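WebXR itself is an open standard, so the shape of such an experience is already well defined. The sketch below uses the standard WebXR Device API to request an immersive AR session and drive a render loop; the any casts are only because stock TypeScript DOM typings may not include WebXR, and whether the Specs browser supports these exact session features is an assumption.

```typescript
// Standard WebXR Device API sketch: request an immersive AR session and run
// a per-frame render loop. Not Specs-specific; feature support is assumed.
async function startAR(canvas: HTMLCanvasElement): Promise<void> {
  const xr = (navigator as any).xr; // cast: DOM typings may lack WebXR
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.log("immersive-ar is not supported in this browser");
    return;
  }

  const session = await xr.requestSession("immersive-ar", {
    requiredFeatures: ["local-floor"], // anchor content relative to the floor
  });

  // Bind a WebGL context to the session so the browser can composite AR.
  const gl = canvas.getContext("webgl2", { xrCompatible: true }) as any;
  session.updateRenderState({
    baseLayer: new (window as any).XRWebGLLayer(session, gl),
  });

  const refSpace = await session.requestReferenceSpace("local-floor");
  session.requestAnimationFrame(function onFrame(_time: number, frame: any) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // Render one frame per view (one per eye on glasses) using pose.views.
    }
    session.requestAnimationFrame(onFrame);
  });
}
```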
If you’re interested in building for Specs before launch, you can join our developer program here: www.spectacles.com/lens-studio.
Get In Touch
For press requests, email press@snap.com.
For all other inquiries, please visit our Support site.