SNAP’S FIRST AR SPECTACLES ARE AN AMBITIOUS, IMPRACTICAL START
It doesn’t take long to realize why Snap’s first true AR glasses aren’t for sale. The overall design is the highest quality of any standalone AR eyewear I’ve tried, and they make it easy to quickly jump into a variety of augmented-reality experiences, from a multiplayer game to a virtual art installation. But the first pair I was handed during a recent demo overheated after about 10 minutes, and the displays are so small that I wouldn’t want to look through them for a long period of time, even if the battery allowed for it.
Snap is aware of the limitations. Instead of releasing these glasses publicly, it’s treating this generation of Spectacles like a private beta. The company has given out pairs to hundreds of its AR creators since the glasses were announced in May and has recently made a few notable software updates based on user feedback. “It was really just about getting the technology out there in the hands of actual people and doing it in a way that would allow us to maximize our learning from their experiences of using it,” Bobby Murphy, Snap’s co-founder and chief technology officer, says of the rollout.
After months of asking for a demo, Snap invited me and a handful of other journalists to try them in conjunction with Lens Fest, Snap’s annual AR creator conference being held virtually this week. Guided by Snap employees in a Los Angeles backyard, I tried a wide range of AR experiences in the glasses, including a zombie chase, a pong game, a Solar System projection, and an interactive art piece that utilized basic hand tracking.
The demos showed me that Snap has an ambitious, long-term vision for where AR is headed. The hardware also highlighted the technical limitations keeping mainstream AR glasses at bay.
Like past versions, these AR Spectacles boast a bold design. The narrow, sharp-edged frame has an aesthetic similar to Tesla’s Cybertruck, something that is not lost on Snap’s product designers, and the glasses come with a sturdy, magnetized case that doubles as a charging stand.
The glasses are light to wear, with flexible sides that can bend out from the head enough to accommodate prescription glasses underneath. (Prescription lenses are available to AR creators who apply and receive a pair.) They include stereo speakers, onboard Wi-Fi, a USB-C port for charging, and two front-facing cameras for capturing video and detecting surfaces.
The biggest limitation I noticed was the battery, which lasts for only 30 minutes of use. Snap did not try to hide this fact and had multiple pairs ready on standby to swap out for me.
The AR effects, which Snap calls Lenses, are projected by dual waveguide displays that sync with Snapchat on a paired mobile phone. Besides the battery, the main drawback to these Spectacles is the small size of the displays, which cover roughly half of the physical lenses. Due to the narrow field of view, the AR effects I tried looked better after the fact, at full size on a phone screen, than they did in the actual glasses. Even so, the WaveOptics waveguides were surprisingly rich in color and clarity. At 2,000 nits of brightness, the displays are clearly visible in sunlight, though that brightness severely impacts battery life.
Since Snap announced these Spectacles earlier this year, it has made several software improvements. To maximize battery life, an endurance mode automatically turns off the displays when an AR Lens, such as a scavenger hunt game, is running but not actively being used. Lenses can be tailored to specific locations based on a GPS radius. A feature coming soon called Custom Landmarkers will let people overlay Lenses onto local landmarks persistently for others wearing Spectacles to see.
Another new software update brings Connected Lenses to Spectacles, letting multiple pairs interact with the same Lens when sharing a Wi-Fi network. I tried a couple of basic multiplayer games with a Snap AR creator named Aidan Wolf, including one he made that lets you shoot orbs of energy at your opponent with the capture button on the side of the frame. The pairing system still needs work since syncing our glasses to play the game took a couple of tries.
None of the Lenses I tried blew me away. But a few showed me the promise of how compelling AR glasses will be once the hardware is more advanced. Rudimentary hand tracking was limited to one Lens I tried that let me cue different parts of a moving art piece with specific gestures. Assuming hand tracking gets better over time, I can see it being a key way to control the glasses. In one of the other more impressive experiences, I placed persistent location markers around the backyard and then raced through them.
Most of the Lenses I tried felt like the basic proofs of concept I’ve seen in other AR headsets over the years, not experiences that would compel me to buy these glasses if they were available for purchase. But for glasses that have been in the wild for less than a year, it’s clear that creators will dream up interesting Lenses as the software and future hardware get better. I’ve seen a few early concepts online that are compelling, including exercise games, utility use cases like seeing information about the city you’re in while traveling, and AR food menus.
Here are a few Lenses I captured during my demo: [embedded video clips]
The glasses’ main visual interface is called the Lens Carousel. A touchpad on the side of the frame uses flick gestures to navigate in and out of Lenses, view recorded footage, and send it to Snapchat friends without removing the glasses. You can also use your voice to cue a Lens. Ways of controlling future Spectacles will likely include eye tracking and more robust hand tracking — technologies Snap is already exploring.
“WE FULLY UNDERSTAND THAT THIS IS STILL A NUMBER OF YEARS AWAY”
A dedicated button on the side of the Spectacles frame is for Scan, Snap’s visual search feature that was recently introduced in the main Snapchat app. I used it to scan a plant on a table, and my glasses recommended a few plant-related Lenses to try. Like Scan in Snapchat, its functionality and ability to recognize objects are fairly limited for now. But if it continues to get better, I could see Scan being a staple feature for Spectacles in the years to come.
Meanwhile, the tech powering Lenses is continuing to get more advanced. At Lens Fest this week, Snap is announcing a slew of new tools for making Lenses smarter, including a library of music from the top music labels and the ability to pull in real-time information from outside partners like the crypto trading platform FTX, AccuWeather, and iTranslate. A new real-world physics engine and software Snap calls World Mesh make Lenses interact more naturally with the world by moving with the laws of gravity, reacting to real surfaces, and understanding the depth of a scene.
Photo by Amanda Lopez for The Verge
Like Meta, Snap sees AR glasses as the future of computing. “We’ve been very interested in and invested in AR for a number of years now because we view AR as the capacity to perceive the world and render digital experiences in much the same way that we naturally observe and engage with our surroundings as people,” Bobby Murphy tells me. “And I think this is really in stark contrast to the way that we use a lot of technology today.”
AN “INSANE DISTRIBUTION SYSTEM OF AUGMENTED REALITY”
Murphy won’t say when AR Spectacles will be ready to sell publicly, but Meta and other tech companies have signaled that consumer-ready AR eyewear isn’t coming any time soon. “We fully understand that this is still a number of years away,” Murphy says, citing battery and display technology as the two key limitations.
While the tech to make quality AR glasses a reality is still being developed, Snap is already betting its future on AR in the mobile phone era. According to Murphy, Snap’s “main priority now as a company is to really support and empower our partners and our community to be as successful as they can be through AR.”
Snap claims to have over 250,000 Lens Creators who have collectively made 2.5 million Lenses that have been viewed a staggering 3.5 trillion times. Three hundred creators have made a Lens that’s been viewed over one billion times. “We’re building this really insane distribution system of augmented reality that doesn’t exist anywhere else,” says Sophia Dominguez, Snap’s head of AR platform partnerships.
Now the company is starting to focus on ways to help Lens Creators make money, including a new marketplace that lets app developers using Snap’s camera technology pay creators directly to use their Lenses. Viewers of a Lens can send its creator an in-app gift, which they can then redeem for real money. And for the first time, a Lens can include a link to a website, allowing a creator to link to something like an online store directly from AR.
What about letting users pay for Lenses directly? “It’s something we’ve certainly given thought to,” says Murphy. He calls NFTs a “very, very fascinating space” and “a good example of digital assets [and] digital art having a kind of a real, tangible value.”
Snap doesn’t like to talk about its future product roadmap, but Murphy is clear that “new updates to our hardware roadmap” will keep coming “fairly often.” In the meantime, the company is making an effort to court AR creators long before its glasses are ready for prime time. While there’s no guarantee that Snap will be a major player when the tech is finally ready, for now it has a head start.