Inside Meta’s Reality Labs - Hands-On With the Future of the Metaverse

The Future of Avatars: A Metaverse Revolution

My 3D avatar is ready for use on my phone or in VR, and it took just a few hours to generate after my scan. The team is working hard to make that processing faster. I had a conversation with somebody remotely, and it looked decent, although kind of like an animated bust. You know, I felt like they were talking to me, but there was still a little bit of lag. If I had a conversation with someone like that, I would say, "Why aren't you moving very much? What's going on here?" It was a little uncanny.

The Second Demo: A Surprising Experience

The second demo was really surprising. That was Codec Avatars 2, the next-generation moonshot avatar project they're working on. I talked to somebody in Pittsburgh who is building this, and I got to see their head, almost lit up by candlelight. What I was seeing was actually a volumetric representation of their head, face, hair, shoulders, and more. To enable this interaction, a few cameras mounted on the headset tracked my eyes and mouth, allowing me to animate my own avatar in various ways.

I was talking to them in a weird dark room; it almost felt like a PlayStation 5 or Xbox game, where the rendering looks so incredibly detailed that you wonder whether it's photo-real, except it was actually that person talking. I kept thinking, "Is this animated? Am I looking at the actual person?" I got really close to them, close enough that it felt intimate, like I was really talking to them. It looked convincing, and I don't know when that's ever going to become available, but if I saw it in a game or an app, I'd be really curious to try it.

The Third Avatar Demo: A New Era of Avatars

The third avatar demo I saw looked at how Meta is going to try to add legs to avatars, along with clothing. It was a scan of an actor in a room studded with cameras, used to create a quick captured clip of that avatar that I could then walk around. The clothing draped on that person was all virtual, and it looked as good as the avatar itself: the shirt rippled and moved, and nothing felt like it was glued on.

The tour wrapped up with Michael Abrash talking about the big future of where we're going in the metaverse. He feels that this is a real phase change for people, maybe the biggest shift since personal computing and the internet. What it feels like is that a lot of the things we already know are starting to evolve to the next level, and where Meta wants the metaverse to go includes VR and AR, avatars, 3D objects, and more.

The Metaverse Landscape: A Shift in Reality

A lot of companies have been exploring this, including Apple, Google, Microsoft, and Nvidia. Pretty much every player in the tech landscape has been trying to get there. It feels like it could actually start to happen because a lot of companies are willing it to happen. I got to see all of Meta's prototype VR and AR headsets on a wall at its Reality Labs Research facility. Some of them explored adding mixed reality, some added virtual eyes to the outside of the headset, and some were trying to be slimmer.

One of them was exploring how VR glasses could eventually become small enough to almost feel like sunglasses. That's Meta's North Star in a sense: can they make this flawlessly realistic, comfortable to wear all day, and open up that productivity vision? Looking at that wall, I got a sense of how much change is still happening in this landscape.

Looking at Meta's research lab, I got the sense that there's still a lot of work left to be done. However, it was fascinating to get a taste of where things are going, even if some of that stuff is going to take five years or more to arrive.