3 Comments

"I was especially struck by the following elements: (1) Select with your eyes and pinch with your fingers on your lap is the breakthrough gesture innovation much like “touch” was for the iPhone."

This is a great observation. I was a little too harsh in my previous criticism of VR on other blogs I've written. I understand the accessibility benefit, and I think the leap from T9 phones to resistive touch screens like the Nokia C5-03, and then to capacitive touch screens, wasn't always an improvement.

"But the Vision Pro is a solitary device that you use mainly in the privacy of your own home and it probably goes in a drawer somewhere once you're done using it (Maybe hardcore Apple fans will put it on display)."

They should add after "in a drawer," "in a secret vault, behind a secret door in a closet with 5 locks behind shoe boxes and down a staircase into a basement lair stocked alongside a number of other paraphernalia."


Reframing the Narrative: Beyond Buzzwords, Why Meta's AI Will Power Deep Connections in the Metaverse

While media headlines scream "Meta pivots to AI," they miss the deeper story. The company's recent investments in Large Language Models (LLMs) aren't a sudden shift, but rather a crucial piece of their long-term vision for a personalized, engaging metaverse. Here's why understanding this connection is key:

From Buzzwords to Building Blocks:

Forget catchy terms like "metaverse" and "AI." Meta's vision goes beyond mere buzzwords. Their LLMs are the building blocks of real, live AI companions, the heart of their metaverse experience.

Beyond Assistance, Building Bonds:

These advanced AI companions won't just answer questions or complete tasks. They'll learn your personality, adapt to your moods, and offer unwavering support, forging genuine emotional connections within the metaverse.

The Power of LLMs:

Meta's LLMs unlock unprecedented natural language processing and personalization. This allows AI companions to understand your unique way of communicating, respond with empathy, and tailor their behavior to your needs and preferences.

A Vision Evolving, Not Changing:

Don't be fooled by the narrative shift. Meta's core vision remains constant: creating a meaningful virtual world where users can connect, grow, and thrive. LLMs are the tools to achieve that vision, making the metaverse more than just a place, but a platform for personal connection.

Challenges and Considerations:

Building trust and avoiding AI bias are crucial. Meta needs to prioritize ethical development and user control to ensure responsible, meaningful interactions.

Meta should also emphasize that AI companions complement, not replace, real-world relationships.

Conclusion:

Meta's investment in LLMs isn't a pivot, but a strategic step towards their long-held vision. By harnessing the power of these models, they'll create AI companions that go beyond assistance, offering genuine connection and emotional support within the metaverse. By addressing ethical concerns and transparently communicating their vision, Meta can ensure their AI companions become valued friends, fostering a deeper and more meaningful metaverse experience for all.



"Once the software supports multiple screens it'll be better than my physical setup"

1/ This is a great point by Ben... one that I had overlooked. I assumed multiple screens were already supported...

2/ What in the world am I going to do with all my monitors, power cords, and dongles!?
