How Meta Ray Ban Smart Glasses Displaced My Smartphone On Opening Day

By Anshel Sag, Patrick Moorhead - May 7, 2024

I already wrote a detailed review of the Meta Ray Ban Smart Glasses shortly after they launched late last year. But just like Meta’s Quest headsets, the Meta Ray Ban Smart Glasses continue to improve through updates and, more importantly, find more uses in my life over time. Because the glasses launched after the 2023 baseball season ended, I never got to experience them at a sporting event the way I naturally would, which brings us to Opening Day of the 2024 MLB season.

The Pageantry Of Opening Day

Part of what makes Opening Day fun is all the pageantry and excitement built up around the first home game of the season. By the time Opening Day arrived in my hometown of San Diego, the Padres had already played their first games away in Seoul, South Korea, splitting a two-game series with the L.A. Dodgers. (I did not fly to South Korea for those games.) For anyone who loves baseball, Opening Day is a big deal. Many of us take the day off, especially since the games are usually played in the middle of the day.

The Meta Ray Ban Smart Glasses take great hands-free photos.
Anshel Sag

For me, Opening Day meant wearing my favorite jersey and hat, but also bringing a few smartphones with me, including the Vivo X100 Pro that I’m currently reviewing alongside my Samsung Galaxy S24 Ultra and iPhone 15 Pro Max. Naturally, I busted out the X100 Pro to grab some telephoto shots of players and snap a few wide-angle photos of the festivities. I also used the iPhone to grab some videos, but for the most part I kept my smartphones in my pockets, which is unheard of from a gadget maven like me. That’s because I was taking pictures and videos using the Meta Ray Ban Smart Glasses.

Meta Ray Ban Smart Glasses At Petco Park

One of my biggest concerns about bringing these glasses to Opening Day was that I wouldn’t be able to bring my charging case with me and the glasses might die. That concern proved unfounded: by the end of the game, I returned home with enough battery left to transfer 30-plus photos and videos from the glasses to my phone. Trying to transfer and post from the game itself probably would’ve been an exercise in futility—or so I thought until I started using the Petco Park stadium Wi-Fi, which seems to be working much better this season.

A moment of silence to honor the late Padres owner Peter Seidler; I was able to take this photo …
Anshel Sag

Opening Day was the most I had ever used the Meta Ray Ban Smart Glasses in such a short period, and the hardest I had ever put them through their paces. This gave me new insights into some of the functionality. One difficulty came when I wanted to take photos or videos using Meta AI voice commands: given the level of background noise, I could never hear the acknowledgement sound or recording/photo-snap sound letting me know that a photo had been taken. In noisy scenarios like this, it would be great if Meta included some kind of haptic feedback to signal the user.

I had already provided Meta feedback that the glasses should have a “blurry” image-detection algorithm that can notify you if one of your lenses is dirty or smudged, which can ruin your photos and videos. (This happened to me a few weeks ago.) That said, I did recently notice that the glasses have a hat-detection algorithm and will warn you when your hat is blocking the glasses from taking a clear photo or video.

On Opening Day, I also noticed that I recorded far more videos than photos. This makes sense to me; if I had wanted to take pictures, I would probably have pulled out the X100 Pro and used its 10x optical telephoto. The glasses still have considerable shutter lag, too. For video, the Meta Ray Ban Smart Glasses capture 1080p first-person footage, which works well for sporting events, especially outdoor daytime games when lighting conditions are ideal. Capturing the fighter-jet flyover right before the first pitch was awesome, and I was excited to have caught it on video from my perspective. That said, it would also have been great to try Meta’s multimodal AI capabilities, which are still in limited beta for U.S. users. These include the ability to translate, identify and describe the real world to the user. I’m hoping the company releases those capabilities broadly soon so users can enhance their experience.

Comparing to Humane’s AI Pin

The Meta Ray Bans have become a regular companion for me on walks with my family, going around town, traveling and going to events. A huge component of that is that the glasses enable me to have a completely hands-free user experience, keeping my phone in my pocket while my hands are occupied changing a baby’s diaper, walking the dog or carrying groceries, all while still being able to make phone calls, respond to text messages or listen to my latest audiobook. This got me thinking about other AI-enhanced wearables that are designed to reduce our reliance on the smartphone—for example, the newly launched AI Pin from Humane.

The Humane AI Pin is a wearable computer designed to be magnetically pinned to a user’s shirt or jacket. It uses a combination of cameras, microphones and a projector to let the user communicate with the pin’s custom AI operating system, called Cosmos. A user can simply press the touchpad on the AI Pin to query the device or give it a command; the device uses the cloud to improve the accuracy and context of its responses. The pin uses an innovative interface that combines a Class 2 laser projector with hand tracking so the user can interact with the pin without needing a screen. Humane’s mission is to make computing more personal and to reduce a user’s dependence on a smartphone, which is why the device has its own cellular connectivity.

If you think about what Humane is trying to do with the AI Pin—shift your attention away from the phone—the Meta Ray Ban Smart Glasses accomplish many of the same tasks and do many of them better. Yes, it’s true that the AI Pin is a standalone device, but the Meta Ray Bans have a much faster processor with better connectivity to your phone. The glasses also have better-quality audio for phone calls, better photo and video quality, and a better perspective for those photos and videos. The Meta Ray Ban Smart Glasses also have better voice-to-text dictation. And while it’s true that Meta AI is lacking for some simple tasks such as search queries, it’s improving on those every day.

The Meta Ray Ban Smart Glasses are the best pairing of AI and AR wearables that I’ve seen, and the reality is that, for at least the next few years, there will be a need to pair these devices to our phones to maximize battery life and performance while reducing weight. I won’t talk about pricing because smart glasses and the AI Pin operate on very different business models, but fundamentally I believe that AR and smart glasses present a better form factor for the kinds of AI assistants that Humane and other companies are trying to create. AI models are going to continue to evolve, and the smartphone is the right platform to handle that innovation, at least for now.

Wrapping Up

Meta’s Ray Ban Smart Glasses enhanced my favorite sporting day of the year, Opening Day. While they didn’t replace any of my smartphones entirely during the experience, I found myself recording considerably less video on my phones and far more on the glasses, and even capturing some photos too. In fact, with these glasses I generally find myself taking pictures and videos that would otherwise be impossible if I had to grab a smartphone, either because both of my hands are occupied or because it would take too long to get the smartphone out and use it as a camera. That said, as mentioned earlier, there is still considerable shutter lag when taking photos that I’d like to see Meta resolve.

The Meta Ray Ban Smart Glasses, while far from perfect, do give us a valuable glimpse into a world of AI wearables that we want to take with us every day. These glasses not only come in a very natural form factor, but they have practical everyday uses that can be extended for special events like Opening Day. While my Opening Day experience with them may have exposed a few weaknesses, it has also shown me that I much prefer to capture hands-free video with these glasses rather than using a smartphone that pulls me out of the experience. The Meta Ray Ban Smart Glasses may be the first device to truly reduce my smartphone use, even though they are still very dependent on smartphones for connectivity and AI processing power.

Anshel Sag
VP & Principal Analyst

Anshel Sag is Moor Insights & Strategy’s in-house millennial with over 15 years of experience in the IT industry. Anshel has extensive experience working with consumers and enterprises across both B2B and B2C relationships, gaining empathy for and understanding of what users really want. Some of his earliest experience goes back as far as his childhood, when he started PC gaming at the ripe old age of 5, built his first PC at 11 and learned his first programming languages at 13.

Patrick Moorhead

Patrick founded the firm based on his real-world technology experience and an understanding of what he wasn’t getting from analysts and consultants. Ten years later, Patrick is ranked #1 among technology industry analysts in terms of “power” (ARInsights) and in “press citations” (Apollo Research). Moorhead is a contributor at Forbes and frequently appears on CNBC. He is a broad-based analyst covering a wide variety of topics including the cloud, enterprise SaaS, collaboration, client computing, and semiconductors. He has 30 years of experience, including 15 years of executive experience at high-tech companies (NCR, AT&T, Compaq, now HP, and AMD) leading strategy, product management, product marketing, and corporate marketing, including three industry board appointments.