Inside Meta’s Reality Labs - Hands-On With the Future of the Metaverse

The Future of Avatars: A Metaverse Revolution

The first avatar demo was a pared-down 3D scan designed to be captured with a phone: the demo subject scans her face from different angles with a neutral expression for about 30 seconds, then spends another minute and a half making a variety of expressions. A few hours after the scan, her 3D avatar was ready to use on a phone or in VR, and the team is working to make that processing much faster. I had a remote conversation with somebody using one of these avatars, and it didn't look bad, though it resembled an animated bust. I felt like they were talking to me, but the avatar stayed a little too still. If I had a conversation with someone like that, I would ask, "Why aren't you moving very much? What's going on here?" It was a little uncanny.

The Second Demo: A Surprising Experience

The second demo was really surprising. That was Codec Avatar 2, the next-generation, moonshot avatar Meta is working on. I talked to somebody in Pittsburgh who is building it, and I saw their head rendered as if lit by candlelight. What I was looking at, they explained, was a volumetric representation of their head, face, hair, and shoulders; a few cameras mounted on their headset track their eyes and mouth, letting them animate the avatar in various ways.

I was talking to them in a weird dark room, and it almost felt like a PlayStation 5 or Xbox game, where the rendering is so good you wonder whether it's photo-real, except it was actually that person talking. I kept thinking, "Is this animated? Am I looking at the actual person?" I got really close to them, closer than a normal conversation, and it felt intimate, like I was really talking to them. I asked them to make different expressions and smiles, and it held up well. I don't know when that's ever going to become available, but if I saw it in a game or an app, I'd be really curious to try it.
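
To make the mechanics a bit more concrete, here is a minimal, purely illustrative sketch of how a codec-style avatar call could be structured, based only on what was described above: headset cameras capture the eyes and mouth, a compact expression code is produced, and the far end decodes it into a rendered volumetric head. None of these names or functions are Meta's actual APIs; they are hypothetical stand-ins.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical sketch, not Meta's actual pipeline: one plausible shape for a
# codec-style avatar call. Headset cameras capture eye and mouth crops, an
# encoder compresses them into a small expression code, and the receiving
# side decodes that code into a rendered head. All names are illustrative.

@dataclass
class HeadsetFrame:
    eye_crops: List[bytes]        # frames from the eye-facing cameras
    mouth_crop: bytes             # frame from the mouth-facing camera
    head_pose: Tuple[float, ...]  # position + orientation from headset tracking

def encode_expression(frame: HeadsetFrame) -> List[float]:
    # Stand-in for a learned encoder; in practice this would be a neural
    # network mapping camera crops to a compact latent expression code.
    return [0.0] * 64

def decode_avatar(identity_model: dict, code: List[float],
                  pose: Tuple[float, ...]) -> dict:
    # Stand-in for a learned decoder that renders the person's head, hair,
    # and shoulders for the given expression code and pose.
    return {"identity": identity_model["name"], "code": code, "pose": pose}

def avatar_call_step(identity_model: dict, frame: HeadsetFrame) -> dict:
    # Only the small expression code would need to cross the network; the
    # heavy identity model already lives on the receiving device.
    code = encode_expression(frame)
    return decode_avatar(identity_model, code, frame.head_pose)
```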

The Third Avatar Demo: A New Era of Avatars

The third avatar demo I saw looked at how Meta plans to add legs to avatars, along with clothing. It was a scan of an actor captured in a room studded with cameras, producing a short clip of the avatar that I could walk around. The clothing draped on that person was entirely virtual, and it looked as good as the avatar itself: the shirt rippled and moved, and nothing felt like it was glued on.

The tour wrapped up with Michael Abrash talking about the big future of where we're going in the metaverse. He feels this is a real phase change for people, maybe the biggest shift since personal computing and the internet. It feels like a lot of the things we know are starting to evolve to a next level, and where Meta wants to take it is something that bridges VR and AR, avatars, 3D objects, and more.

The Metaverse Landscape: A Shift in Reality

A lot of companies have been exploring this, including Apple, Google, Microsoft, Nvidia, and Snap; pretty much every player in the tech landscape has been trying to get there. It feels like it could actually start to happen because so many companies are willing it to happen. I got to see all of Meta's prototype VR and AR headsets on a wall at the research lab. Some of them explored adding mixed reality, some added virtual eyes to the outside of the headset, and some were simply trying to be slimmer.

One of them was shooting for VR glasses small enough to almost feel like sunglasses. As Meta framed it, that is their North Star in a sense: can they make this faultlessly realistic, comfortable to wear all day, and open up that productivity vision? Looking at that wall, I got a sense of how much change is still happening in this landscape.

I got the sense looking at Meta's research lab that there's still a lot of work left to be done. However, it was fascinating to get a taste of where things are going – even if some of that stuff is going to take five years or more to get there.

"WEBVTTKind: captionsLanguage: enI sat across from Mark Zuckerberg as he demonstrated moving things around on a screen wearing a motor neuron sensing wristband that he was doing micro gestures with here's what I'm seeing while I'm using this just the gentlest flick of my thumb to check my messages and with another quick movement I can answer while I'm on the move or I can even take a photo is this the future of the metaverse well it's one of several things that meta is counting on and I got to look at these Technologies firsthand at their reality research lab stuff that you won't get your chance to see for years at meta's reality research lab I got to look at the quest Pro which has meta's next VR headset for more on that watch the whole second video that I have but in this video I'm going to also talk about all of the future Tech demos that I got to try I've been curious just like many people about where meta is going to go with the metaverse Beyond VR to AR to what else how is this going to take over some of the stuff that we do on our everyday lives so when the company invited me to come out to their reality research Labs I was super excited also because this is the first time the company has invited journalists out to that facility I had to shoot my own photo and video when I was there except for a few parts where I wasn't allowed to shoot in which places meta shot footage and photos using their own equipment on site meta is based down in Silicon Valley but its future Tech Team the reality research Labs is up in Washington state kind of near Microsoft and when I arrived at this lab facility it was a bunch of non-descript Office Buildings in an office Park near digipen I was super curious what was inside Michael abrash who's the chief scientist of reality research labs and has always been met as far future or futurist was the one who guided us through a lot of these demos so what I saw was kind of a tasting of four different demos that were meant to represent technologies that meta is not able to make happen yet for everyday people but is trying to shoot for to advance different zones one of them was neural inputs and it's kind of the wildest one it's one that meta has talked about before and meta acquired this company called control labs in 2019 these conversations and demos from their reports have shown people using this band to uh to control things with micro gestures and we got to see some people using it just a few feet away from us I didn't get a chance to demo it myself unfortunately but I got to see both Mark Zuckerberg and a couple of people who were trained over a period of time to use it the way this works is it senses individual motor neurons that are firing and can kind of sense a muscle movement in a way that you don't necessarily even have to fully move your fingers according to meta these are the zero ones binary events that your brain is communicating down to your muscles and what Diego has learned how to do is to voluntarily control them so some of the stuff could look like gestures but over time apparently this will start feeling almost microscopic invisible movements that will then control things kind of looking like mind reading but it's actually intent to move your hand the demos that I saw first Mark Zuckerberg had shown us a whole bunch of little things moving icons around and it looked like he was kind of moving his thumb like a mouse and tapping meta has shown some of these demos before it looks almost like I'm like the way you'd use a mouse or some sort of a 
futuristic control device but with no actual device apparently it's not precisely enough yet or fast enough yet for typing but it's meant to eventually be used for things like smart glasses that's where Matt is looking at this technology the most because the idea is that you would not be carrying a controller around you'd want to be able to interact really quickly but right now it's all about trying to prove that it just works and how easy it could be to use it the other demo I saw with it were people sitting down playing this this game that was like you know you move back and forth and try to survive this this Endless Running game and usually it would involve some some hand motions but after a while they kept their hands still and did these micro gestures that I couldn't even tell were happening and they claimed that this was a technique called co-adaptation or co-adaptive learning and the feedback that you get from moving it can eventually be whittled down to something so small that the AI picks that up and you can start making motions that feel really really tiny this is all pretty wild and hard to imagine in everyday use but there are a lot of possibilities not just for General control but possibly for assistive uses people who may not have full use of their limbs or have other motor complications because this technology is not that different from the types of tech that could be used say to create a prosthetic limb so you could potentially use it to operate something even if you didn't have full use of your hand the second demo that I did get to try involved spatial audio now spatial audio has been around in airpods and VR and it's basically 3D audio that you can hear around you can be interesting it can feel gimmicky in VR it can be really useful to try to locate where things are where Matt is trying to go with spatial audio though is to AR to eventually be able to place 3D audio in a real room and make it feel like it's there and the company has been working on technology to not just measure where your audio is coming from in the room but where the Echoes are coming from in the space of a room they took us to an anechoic chamber which is soundproof and showed us this array of dozens of speakers that was designed to create the soundscape that microphones would be used to measure the echoing in a room and also be able to tune to specific ears so what I got to listen to were these two demos after that put microphones inside my ears and then I wore these over ear headphones somebody in the room walked around me and recorded this you know 40 second clip of them making various noises and things like that and then I listened to it again played back foreign the audio even having listened to immersive 3D audio things it was surprisingly convincing at times I kept my eyes closed both times and it really felt like somebody was moving around to the side and Whispering near me and that I thought there was someone behind me so it recreates that soundscape but I had to stay perfectly still the other demo that I tried was in a room with four speakers and I wore these headphones with this um tracking device to allow me to track the audio as I moved and they played back audio both on the speakers and on the headphones to see if I could tell the difference whether it was being projected or real foreign coming from the first loudspeaker okay good I failed the first test because they played identified on the speakers and then I realized Midway through are they tricking me I took off the headphones it was 
all playing on the headphones it was pretty shocking there were over ear headphones that kind of floated a few centimeters over my ears this Tech is something that they're creating specifically for this space so why is that any different than anything else I think again it's that if you could create audio that feels convincing enough that it's in the room with you then eventually if you have holographic avatars you know like the Marvel type things that would appear and beam down and talk to you it could actually sound like they're in the room with you versus just being in your earbud and based on these couple of demos this spatial audio was a lot better than anything that I've ever heard before but it won't be here necessarily anytime soon the other demo I tried involved 3D scanning this is the sort of stuff that I've looked at on iPads and iPhones with lidar and 3D scanning is already a big Trend all across the landscape for VR and AR what's new here is meta sort of showed some of the ways that phone-based 3D scanning could improve they use my shoe for one of the scans so they took took my sneaker off and scanned it so I got a good familiar look of my own shoe in AR that first scan which took a few minutes to make it was good it looked like a better scan than I've seen most times doing it by myself using lidar I could still see some flaws with it though this second demo that they showed with 3D scanning was a lot more interesting looking at something called Radiance Fields they used a technology where they could look at the light patterns around an object and kind of create this 3D scan that would be a lot more detailed but to do it realistically and what I'm basically saying is they showed some 3D objects that they scanned into VR that were very complicated a very fuzzy teddy bear and a very spiky Cactus with tons of little spikes and I thought okay how good is this going to be when I looked at it in VR I could see all the little curly cues of the hair and the spikes of the cactus which were really fine and when I brought it over to a lamp I could see light being reflected off of it a virtual lamp these objects looked really good and crafted but they were 3D scans and when you throw it in the air against a wall or if you bounce it off the ground it's going to respond the same way that the physical object would now that's how good they eventually could make 3D scanning into VR that could be huge because the whole dream of scanning in your furniture your clothing or or other people right now a lot of that stuff looks kind of melted and weird but could it eventually look good enough to not feel like it was glued into the scene I feel like some of those later demos that I saw showed a lot of possibility but those take hours to process right now and aren't ready yet and the final demo I looked at involved avatars we've seen them all the time in VR and they're cartoony and meta has been promising these photorealistic avatars that will start looking like we're really talking with somebody codec avatars are what meta calls them and I've never seen a demo of them before until now I got to look at three different types of Avatar demos one of them was a more boiled down 3D scan Avatar that is meant to be done using a phone something that you eventually could maybe scan yourself and pop yourself in she scans her face from different angles with a neutral expression for about 30 seconds then spends another minute and a half making a variety of expressions that's really all there is to it hi guys my 3D 
avatar is ready for use in my phone or VR it just took a few hours to generate after my scan and the team's working and making that processing a whole lot faster the conversation I had with somebody remotely looked not bad although kind of like an animated bust you know I felt like they were talking to me but but a little bit still and so if I had a conversation with somebody like that I would say why aren't you moving very much what's going on here it was a little uncanny the second demo I had was really surprising and that was um Kodak Avatar 2 which is their next Generation kind of moonshot Avatar that they're that they're building where I talked to somebody in Pittsburgh who's who's building this and I got to see basically their head almost letting Candlelight so what you're seeing actually is a reliable volumetric representation of my head my face my hair my shoulders and men and to enable this interaction there are a few cameras mounted on the headset that I'm wearing they're absorbing my eyes and my mouth and allow me to animate the Avatar in various ways I was talking to them in some weird dark room it almost felt like a PlayStation 5 video game or Xbox video game where you know you look at it and it looks so incredibly rendered that you wonder if it's photo real but it's actually that person talking and so I kept thinking like is this animated am I looking at the actual person I got really close to them I think I was close talking and but I felt like I was intimate like I felt like I was really talking to them and they I wanted to see what the Expressions were like and the smiles I asked them to kind of make different faces it was pretty good I don't know when that's ever going to become available but if I saw that in a game or in an app I'd be really curious to try it the third Avatar demo I saw was looking at how Med is going to try to actually add legs to avatars along with clothing it was a a scan of an actor in this room studied with cameras to create just a quick captured clip of that Avatar that I could then walk around also the clothing that was being draped on that on that person was all virtual that looked as good it as the Avatar you know a lot of um you know the shirts kind of Rippling and moving and nothing felt kind of glued on and a lot of the metaverse has been talking a big game about Commerce and and fashion and a lot of companies going into the space trying to make things for people is that meta trying to flex out to show some of those possibilities I think so the tour wrapped up with Michael abrush talking about this big future of where we're going in the metaverse where he feels that this is a real phase change for people maybe the biggest shift since uh personal Computing and the internet well I think what it feels like is that a lot of things we know are starting to evolve to a next level where Meadow wants it to go to is stuff that Bridges VR and AR and avatars and 3D objects and that is not the only company trying to get to this point Apple has been going there Google's been talking about it Microsoft's been talking about it nvidia's been talking about it you've got snap pretty much every player in the tech landscape has been exploring it which makes it it feel like it could actually start to happen because a lot of companies are willing it to happen I got to see all of meta's prototype VR and AR headsets on a wall at their reality research Labs some of them were looking at things like adding mixed reality some of them were adding virtual eyes to the 
outside of your headset some were trying to be Slimmer I saw one that was shooting for the the way in which VR glasses could eventually be small enough to almost feel like sunglasses and this is our North Star in a sense you know can we make this faultlessly realistic comfortable to wear all day and open up this productivity vision and when you look at that wall I get the sense of how much change is still happening in this landscape I got the sense looking at meta's research lab that there's still a lot of work left to be done but it was fascinating to get a taste of where things are going even if some of that stuff is going to take five years or more to get there thanks a lot for watching and if you have any more questions feel free to fire away make sure to like And subscribe and I'll talk to you soonI sat across from Mark Zuckerberg as he demonstrated moving things around on a screen wearing a motor neuron sensing wristband that he was doing micro gestures with here's what I'm seeing while I'm using this just the gentlest flick of my thumb to check my messages and with another quick movement I can answer while I'm on the move or I can even take a photo is this the future of the metaverse well it's one of several things that meta is counting on and I got to look at these Technologies firsthand at their reality research lab stuff that you won't get your chance to see for years at meta's reality research lab I got to look at the quest Pro which has meta's next VR headset for more on that watch the whole second video that I have but in this video I'm going to also talk about all of the future Tech demos that I got to try I've been curious just like many people about where meta is going to go with the metaverse Beyond VR to AR to what else how is this going to take over some of the stuff that we do on our everyday lives so when the company invited me to come out to their reality research Labs I was super excited also because this is the first time the company has invited journalists out to that facility I had to shoot my own photo and video when I was there except for a few parts where I wasn't allowed to shoot in which places meta shot footage and photos using their own equipment on site meta is based down in Silicon Valley but its future Tech Team the reality research Labs is up in Washington state kind of near Microsoft and when I arrived at this lab facility it was a bunch of non-descript Office Buildings in an office Park near digipen I was super curious what was inside Michael abrash who's the chief scientist of reality research labs and has always been met as far future or futurist was the one who guided us through a lot of these demos so what I saw was kind of a tasting of four different demos that were meant to represent technologies that meta is not able to make happen yet for everyday people but is trying to shoot for to advance different zones one of them was neural inputs and it's kind of the wildest one it's one that meta has talked about before and meta acquired this company called control labs in 2019 these conversations and demos from their reports have shown people using this band to uh to control things with micro gestures and we got to see some people using it just a few feet away from us I didn't get a chance to demo it myself unfortunately but I got to see both Mark Zuckerberg and a couple of people who were trained over a period of time to use it the way this works is it senses individual motor neurons that are firing and can kind of sense a muscle movement in a way 
that you don't necessarily even have to fully move your fingers according to meta these are the zero ones binary events that your brain is communicating down to your muscles and what Diego has learned how to do is to voluntarily control them so some of the stuff could look like gestures but over time apparently this will start feeling almost microscopic invisible movements that will then control things kind of looking like mind reading but it's actually intent to move your hand the demos that I saw first Mark Zuckerberg had shown us a whole bunch of little things moving icons around and it looked like he was kind of moving his thumb like a mouse and tapping meta has shown some of these demos before it looks almost like I'm like the way you'd use a mouse or some sort of a futuristic control device but with no actual device apparently it's not precisely enough yet or fast enough yet for typing but it's meant to eventually be used for things like smart glasses that's where Matt is looking at this technology the most because the idea is that you would not be carrying a controller around you'd want to be able to interact really quickly but right now it's all about trying to prove that it just works and how easy it could be to use it the other demo I saw with it were people sitting down playing this this game that was like you know you move back and forth and try to survive this this Endless Running game and usually it would involve some some hand motions but after a while they kept their hands still and did these micro gestures that I couldn't even tell were happening and they claimed that this was a technique called co-adaptation or co-adaptive learning and the feedback that you get from moving it can eventually be whittled down to something so small that the AI picks that up and you can start making motions that feel really really tiny this is all pretty wild and hard to imagine in everyday use but there are a lot of possibilities not just for General control but possibly for assistive uses people who may not have full use of their limbs or have other motor complications because this technology is not that different from the types of tech that could be used say to create a prosthetic limb so you could potentially use it to operate something even if you didn't have full use of your hand the second demo that I did get to try involved spatial audio now spatial audio has been around in airpods and VR and it's basically 3D audio that you can hear around you can be interesting it can feel gimmicky in VR it can be really useful to try to locate where things are where Matt is trying to go with spatial audio though is to AR to eventually be able to place 3D audio in a real room and make it feel like it's there and the company has been working on technology to not just measure where your audio is coming from in the room but where the Echoes are coming from in the space of a room they took us to an anechoic chamber which is soundproof and showed us this array of dozens of speakers that was designed to create the soundscape that microphones would be used to measure the echoing in a room and also be able to tune to specific ears so what I got to listen to were these two demos after that put microphones inside my ears and then I wore these over ear headphones somebody in the room walked around me and recorded this you know 40 second clip of them making various noises and things like that and then I listened to it again played back foreign the audio even having listened to immersive 3D audio things it was 
surprisingly convincing at times I kept my eyes closed both times and it really felt like somebody was moving around to the side and Whispering near me and that I thought there was someone behind me so it recreates that soundscape but I had to stay perfectly still the other demo that I tried was in a room with four speakers and I wore these headphones with this um tracking device to allow me to track the audio as I moved and they played back audio both on the speakers and on the headphones to see if I could tell the difference whether it was being projected or real foreign coming from the first loudspeaker okay good I failed the first test because they played identified on the speakers and then I realized Midway through are they tricking me I took off the headphones it was all playing on the headphones it was pretty shocking there were over ear headphones that kind of floated a few centimeters over my ears this Tech is something that they're creating specifically for this space so why is that any different than anything else I think again it's that if you could create audio that feels convincing enough that it's in the room with you then eventually if you have holographic avatars you know like the Marvel type things that would appear and beam down and talk to you it could actually sound like they're in the room with you versus just being in your earbud and based on these couple of demos this spatial audio was a lot better than anything that I've ever heard before but it won't be here necessarily anytime soon the other demo I tried involved 3D scanning this is the sort of stuff that I've looked at on iPads and iPhones with lidar and 3D scanning is already a big Trend all across the landscape for VR and AR what's new here is meta sort of showed some of the ways that phone-based 3D scanning could improve they use my shoe for one of the scans so they took took my sneaker off and scanned it so I got a good familiar look of my own shoe in AR that first scan which took a few minutes to make it was good it looked like a better scan than I've seen most times doing it by myself using lidar I could still see some flaws with it though this second demo that they showed with 3D scanning was a lot more interesting looking at something called Radiance Fields they used a technology where they could look at the light patterns around an object and kind of create this 3D scan that would be a lot more detailed but to do it realistically and what I'm basically saying is they showed some 3D objects that they scanned into VR that were very complicated a very fuzzy teddy bear and a very spiky Cactus with tons of little spikes and I thought okay how good is this going to be when I looked at it in VR I could see all the little curly cues of the hair and the spikes of the cactus which were really fine and when I brought it over to a lamp I could see light being reflected off of it a virtual lamp these objects looked really good and crafted but they were 3D scans and when you throw it in the air against a wall or if you bounce it off the ground it's going to respond the same way that the physical object would now that's how good they eventually could make 3D scanning into VR that could be huge because the whole dream of scanning in your furniture your clothing or or other people right now a lot of that stuff looks kind of melted and weird but could it eventually look good enough to not feel like it was glued into the scene I feel like some of those later demos that I saw showed a lot of possibility but those take 
hours to process right now and aren't ready yet and the final demo I looked at involved avatars we've seen them all the time in VR and they're cartoony and meta has been promising these photorealistic avatars that will start looking like we're really talking with somebody codec avatars are what meta calls them and I've never seen a demo of them before until now I got to look at three different types of Avatar demos one of them was a more boiled down 3D scan Avatar that is meant to be done using a phone something that you eventually could maybe scan yourself and pop yourself in she scans her face from different angles with a neutral expression for about 30 seconds then spends another minute and a half making a variety of expressions that's really all there is to it hi guys my 3D avatar is ready for use in my phone or VR it just took a few hours to generate after my scan and the team's working and making that processing a whole lot faster the conversation I had with somebody remotely looked not bad although kind of like an animated bust you know I felt like they were talking to me but but a little bit still and so if I had a conversation with somebody like that I would say why aren't you moving very much what's going on here it was a little uncanny the second demo I had was really surprising and that was um Kodak Avatar 2 which is their next Generation kind of moonshot Avatar that they're that they're building where I talked to somebody in Pittsburgh who's who's building this and I got to see basically their head almost letting Candlelight so what you're seeing actually is a reliable volumetric representation of my head my face my hair my shoulders and men and to enable this interaction there are a few cameras mounted on the headset that I'm wearing they're absorbing my eyes and my mouth and allow me to animate the Avatar in various ways I was talking to them in some weird dark room it almost felt like a PlayStation 5 video game or Xbox video game where you know you look at it and it looks so incredibly rendered that you wonder if it's photo real but it's actually that person talking and so I kept thinking like is this animated am I looking at the actual person I got really close to them I think I was close talking and but I felt like I was intimate like I felt like I was really talking to them and they I wanted to see what the Expressions were like and the smiles I asked them to kind of make different faces it was pretty good I don't know when that's ever going to become available but if I saw that in a game or in an app I'd be really curious to try it the third Avatar demo I saw was looking at how Med is going to try to actually add legs to avatars along with clothing it was a a scan of an actor in this room studied with cameras to create just a quick captured clip of that Avatar that I could then walk around also the clothing that was being draped on that on that person was all virtual that looked as good it as the Avatar you know a lot of um you know the shirts kind of Rippling and moving and nothing felt kind of glued on and a lot of the metaverse has been talking a big game about Commerce and and fashion and a lot of companies going into the space trying to make things for people is that meta trying to flex out to show some of those possibilities I think so the tour wrapped up with Michael abrush talking about this big future of where we're going in the metaverse where he feels that this is a real phase change for people maybe the biggest shift since uh personal Computing and the 
internet well I think what it feels like is that a lot of things we know are starting to evolve to a next level where Meadow wants it to go to is stuff that Bridges VR and AR and avatars and 3D objects and that is not the only company trying to get to this point Apple has been going there Google's been talking about it Microsoft's been talking about it nvidia's been talking about it you've got snap pretty much every player in the tech landscape has been exploring it which makes it it feel like it could actually start to happen because a lot of companies are willing it to happen I got to see all of meta's prototype VR and AR headsets on a wall at their reality research Labs some of them were looking at things like adding mixed reality some of them were adding virtual eyes to the outside of your headset some were trying to be Slimmer I saw one that was shooting for the the way in which VR glasses could eventually be small enough to almost feel like sunglasses and this is our North Star in a sense you know can we make this faultlessly realistic comfortable to wear all day and open up this productivity vision and when you look at that wall I get the sense of how much change is still happening in this landscape I got the sense looking at meta's research lab that there's still a lot of work left to be done but it was fascinating to get a taste of where things are going even if some of that stuff is going to take five years or more to get there thanks a lot for watching and if you have any more questions feel free to fire away make sure to like And subscribe and I'll talk to you soon\n"