The Panel Reacts: Is the Vision Pro for Everyone, or Just Enthusiasts?
Hear from Adobe’s Scott Belsky, Semafor’s Reed Albergotti, and more Big Technology Panel members on the potential ceiling for Apple’s new XR device.
I put on Apple’s Vision Pro headset for the first time this weekend. The technology was even better than I expected. The device’s mixed reality passthrough was near-perfect, blending the physical and digital worlds seamlessly. And it wasn’t too heavy either. If I hadn’t found the $3,500 price tag impossible to justify, I might’ve just been one of the guys using it in the crosswalk.
But after seeing the Vision Pro’s debut, I’ve wondered just who it is for.
Is it for developers? Enterprise users? The general consumer? And what’s its potential exactly? To find out, I emailed the Big Technology Panel, a (growing) group of 30+ tech insiders who share their thoughts (when applicable) on big breaking news in the moment. Here’s what they said:
This is a two-year “public beta trial,” where Apple can observe and develop use cases, refine the hardware and UX, and only then will it make sense to scale into a wider market. It's also great (classic Apple) marketing, and will bring people into stores for demos where they will be sold high-margin wearables and accessories!
Richard Kramer, Analyst at Arete Research Services
I've always thought that the killer app for XR will be productivity, and after a few hours of coding in the Vision Pro I can say that the experience is almost there — once the software supports multiple screens it'll be better than my physical setup. I think the price point is too high for this to go mainstream, and it's still too heavy (I couldn't wear it for multiple hours every day), but I'm very excited for the Vision Pro 2.
Ben Lerner, CEO of Espresso AI
Normally, when Apple starts selling a new product category like a phone, a watch, or a laptop, you start seeing those things in the wild and those early adopters become an extension of Apple's marketing and are often evangelists. But the Vision Pro is a solitary device that you use mainly in the privacy of your own home and it probably goes in a drawer somewhere once you're done using it (Maybe hardcore Apple fans will put it on display). I wonder if Apple tried to quantify how that reality will affect the speed of adoption.
Adding an addendum: After I wrote this, someone sent me this TikTok video. People are walking around with the goggles on, air pinching as they cross the street. But I’m not sure if this will make people want to buy this thing or give up on humanity.
Reed Albergotti, Technology editor at Semafor (you can sign up for Semafor’s tech newsletter here)
I was especially struck by the following elements: (1) Select with your eyes and pinch with your fingers on your lap is the breakthrough gesture innovation, much like "touch" was for the iPhone. (2) Environment selection - it was an incredible sensation to select my environment for my work. (3) Family spatial shots - the ability to capture and "re-live" a moment with your family and friends is powerful. (4) Sports and "seats you can't buy" - I'm no sports fanatic, but I was blown away by the short clips I experienced of an NBA game and an MLB game. In both cases, I was sitting on the court or field and felt completely immersed in the game, to the point where I felt I could get hit with a ball or trampled over.
Scott Belsky, Adobe Chief Strategy Officer, Executive Vice President of Design & Emerging Products (read more from Scott at Implications)
The Vision Pro will be a flop by Apple standards but still be critical in the long term by kickstarting adoption of VR tech in a significant way. Apple remains the best at product marketing, and strapping a computer to your face needs that.
Brian Morrissey, author of
Hands-On With the Vision Pro on Big Technology Podcast, with and WSJ’s
Listen to one of our most entertaining episodes ever on Big Technology Podcast via Apple Podcasts, Spotify, or your podcast app of choice.
Great Quarter, Guys
We talked Tesla, Alphabet, Microsoft, Meta, Apple, and Amazon on Great Quarter, Guys, a new show from The Compound. Tune in for a deep but fun look at where these companies stand and where they're heading with Josh Brown, Michael Batnick, Dan Ives, and myself. Enjoy!
Thanks for reading! We’ll be back on Friday with more!
"I was especially struck by the following elements: (1) Select with your eyes and pinch with your fingers on your lap is the breakthrough gesture innovation much like “touch” was for the iPhone."
This is a great observation. I was a little too harsh in my previous criticism of VR on other blogs I've written. I understand the accessibility benefit, and I think the leap from T9 phones to resistive touchscreens like the Nokia C5-03, and then to capacitive touchscreens, wasn't always an improvement.
"But the Vision Pro is a solitary device that you use mainly in the privacy of your own home and it probably goes in a drawer somewhere once you're done using it (Maybe hardcore Apple fans will put it on display)."
They should add after "in a drawer," "in a secret vault, behind a secret door in a closet with 5 locks behind shoe boxes and down a staircase into a basement lair stocked alongside a number of other paraphernalia."
Reframing the Narrative: Beyond Buzzwords, Why Meta's AI Will Power Deep Connections in the Metaverse
While media headlines scream "Meta pivots to AI," they miss the deeper story. The company's recent investments in Large Language Models (LLMs) aren't a sudden shift, but rather a crucial piece of their long-term vision for a personalized, engaging metaverse. Here's why understanding this connection is key:
From Buzzwords to Building Blocks:
Forget catchy terms like "metaverse" and "AI." Meta's vision goes beyond mere buzzwords. Their LLMs are the building blocks of lifelike AI companions, the heart of their metaverse experience.
Beyond Assistance, Building Bonds:
These advanced AI companions won't just answer questions or complete tasks. They'll learn your personality, adapt to your moods, and offer unwavering support, forging genuine emotional connections within the metaverse.
The Power of LLMs:
Meta's LLMs unlock unprecedented natural language processing and personalization. This allows AI companions to understand your unique way of communicating, respond with empathy, and tailor their behavior to your needs and preferences.
A Vision Evolving, Not Changing:
Don't be fooled by the narrative shift. Meta's core vision remains constant: creating a meaningful virtual world where users can connect, grow, and thrive. LLMs are the tools to achieve that vision, making the metaverse more than just a place, but a platform for personal connection.
Challenges and Considerations:
Building trust and avoiding AI bias are crucial. Meta needs to prioritize ethical development and user control to ensure responsible, meaningful interactions.
Emphasize that AI companions complement, not replace, real-world relationships.
Conclusion:
Meta's investment in LLMs isn't a pivot, but a strategic step toward its long-held vision. By harnessing the power of these models, Meta can create AI companions that go beyond assistance, offering genuine connection and emotional support within the metaverse. By addressing ethical concerns and transparently communicating its vision, Meta can ensure these AI companions become valued presences, fostering a deeper and more meaningful metaverse experience for all.