Project Tango Interview and Demo at Google I/O 2014

**Project Tango: A New Era in Mobile Gaming**

The Project Tango team is excited to announce that its tracking technology will be integrated into the Unity SDK, allowing developers to create immersive and interactive experiences. The ability to track physical motion and use it to enhance gameplay is a major breakthrough in mobile gaming.

"We've connected that tracking data to the Unity camera," explained one of the researchers. "This is a very simple game where essentially I just pick up the cube and put it on the switch, but to actually finish this game, I um you can see that I can't quite reach the cube from where I am now so I actually have to go walk forward to solve it."

The tracking data lets players move their device through the real world and have that motion translated directly into virtual space. Developers can therefore build games that respond to the player's physical surroundings, making the experience far more immersive.
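To make the idea concrete, here is a minimal sketch of how a stream of tracked poses might drive a virtual camera. None of these names come from the actual Tango or Unity APIs; they are hypothetical stand-ins for whatever pose data the SDK delivers.

```python
# Minimal sketch, NOT the Tango or Unity API: every name here is a
# hypothetical stand-in. It only illustrates mapping a tracked device
# pose onto a virtual camera.

from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in metres, relative to where tracking started
    rotation: tuple  # unit quaternion (x, y, z, w)

class VirtualCamera:
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.rotation = (0.0, 0.0, 0.0, 1.0)

    def apply_pose(self, pose: Pose, scale: float = 1.0):
        # scale = 1.0 gives a 1:1 mapping: one metre of physical walking
        # moves the camera one metre through the virtual scene.
        x, y, z = pose.position
        self.position = (x * scale, y * scale, z * scale)
        self.rotation = pose.rotation

def on_pose_available(camera: VirtualCamera, pose: Pose):
    # Imagined callback, invoked for each new pose estimate the
    # tracker produces.
    camera.apply_pose(pose)
```

That 1:1 mapping is what makes the cube puzzle honest: the only way to reach a cube two metres away in the scene is to walk two metres across the booth.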

"So this basically introduces motion into Mobile gaming," said one of the researchers. "if I pick up this green switch block sorry I can see that the green switch is actually over there on the booth you to over there I'll come back all right and there you go so this so This basically introduces motion into Mobile gaming so if I move forward in the real world I actually move move forward in the virtual space as well."

This technology has huge potential for developers to create new types of games that take advantage of the player's physical space. "Now that I understand where I am in my house or my office space, and I actually have a little bit of geometry data, I can sort of transform my environment into a fantasy world," Johnny explained.
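As a rough illustration of what even "a little bit of geometry data" buys you, the sketch below pins a virtual asset to a detected floor plane. The plane itself (a point on it plus a normal) is assumed to come from some upstream fitting step; the function is invented for illustration, not part of any Tango API.

```python
# Illustrative sketch, assuming a floor plane has already been fitted
# from depth data (a point on the plane plus its normal). Not a real
# Tango API; just the geometry of anchoring an asset to the floor.

import numpy as np

def place_on_floor(xz, floor_point, floor_normal):
    """Find the world position on the floor plane at horizontal (x, z)."""
    p0 = np.asarray(floor_point, dtype=float)
    n = np.asarray(floor_normal, dtype=float)
    n /= np.linalg.norm(n)
    x, z = xz
    # Solve n . (p - p0) = 0 for y at the given (x, z).
    # Assumes the plane is not vertical (n[1] != 0).
    y = p0[1] - (n[0] * (x - p0[0]) + n[2] * (z - p0[2])) / n[1]
    return np.array([x, y, z])

# e.g. spawn a wizard 1.5 m in front of the player, on the floor:
# wizard_pos = place_on_floor((0.0, 1.5), floor_point, floor_normal)
```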

The ability to track physical motion also opens up new ways of interacting with virtual objects. "I'm just using the tablet as a camera controller. I can look at the trees and the stone, but I can also see that on the ground there's a small wizard," Johnny said. "If I crouch down, I can get down to the wizard's level and interact with him."
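Because the tracked pose includes real vertical motion, crouching physically lowers the virtual camera, and the game can react to that. A minimal sketch, assuming a standing eye height calibrated at startup and a hand-picked threshold (both values invented here):

```python
# Minimal sketch with invented values; only the idea matters:
# a big enough drop in tracked device height counts as a crouch.

STANDING_EYE_HEIGHT = 1.4  # metres, assumed calibrated when the game starts
CROUCH_DROP = 0.5          # metres of drop that we treat as crouching

def is_crouching(pose_height: float) -> bool:
    """True once the device has dropped far enough below standing height."""
    return (STANDING_EYE_HEIGHT - pose_height) > CROUCH_DROP

def update_wizard(pose_height: float, wizard) -> None:
    # The wizard only engages when the player gets down to his level.
    if is_crouching(pose_height):
        wizard.start_dialogue()
```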

This technology has the potential to revolutionize mobile gaming. "If we had a game that actually understood the shape of the Moscone Center, you can imagine having multiple people walk around this space competing with each other for territory," said Johnny.

The Project Tango team is also working on more advanced features, including 3D capture of the environment using both tracking data and depth sensing. "It's like I'm painting the environment with my camera," Johnny explained. "I can back up, and I'll scan you into the space."
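Conceptually, the capture works by taking each depth frame, a set of points expressed in the camera's own frame, and using the simultaneous tracking pose to move those points into one shared world frame, so walking around literally paints more of the model. A minimal sketch of that transform (hypothetical names, not the Tango depth API):

```python
# Minimal sketch with hypothetical names: fuse depth frames into one
# world-space point cloud using the pose tracked at the same instant.

import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (x, y, z, w) into a 3x3 rotation matrix."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

world_cloud = []  # the accumulated capture of the space

def on_depth_frame(points_camera, pose_position, pose_rotation):
    """Transform one depth frame into world coordinates and keep it."""
    pts = np.asarray(points_camera, dtype=float)  # (N, 3) camera-frame points
    R = quat_to_matrix(pose_rotation)
    t = np.asarray(pose_position, dtype=float)
    world_cloud.append(pts @ R.T + t)             # p_world = R @ p_cam + t
```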

The technology is still in its early stages, but it gives developers plenty of raw material for new types of games and experiences. "These are sort of the building blocks," Johnny said. "You can imagine allowing developers to start building new user experiences that use both motion as well as physical geometry to create new apps."

**A Future with Project Tango**

During the interview, the Android Authority host asked whether next year's Google I/O, or perhaps 2016's, could feature a game built from the Moscone Center schematic. It's an ambitious target, and the team stopped short of promising it.

"I think that would be a really neat Target to aim for," said one of the researchers. "we'll have to see there's a lot of things that have to sort of improve on both the software side and the hardware side to make it happen."

But for now, the Project Tango team is focused on sharing the technology with developers and starting to build new experiences. "I'm sure it's going to get there," the interviewer added. "You've got some great minds working on it."

**Conclusion**

The Project Tango technology has huge potential for mobile gaming and beyond. By tracking physical motion and using it to enhance gameplay, developers can create immersive and interactive experiences that take advantage of the player's physical space. While there are still challenges to overcome, the team is confident that their technology will revolutionize the way we experience virtual worlds.

As researchers continue to refine and improve the technology, we can expect to see more exciting developments in the future. One thing is certain: Project Tango is a game-changer for mobile gaming and beyond.

**App Demo**

The Project Tango team also showcased an app from a university partner that combines tracking data with depth sensing to create a 3D capture of the space. The app let users scan their environment and rebuild it as a simple mesh in real time.

"We just got this app a few days ago," said one of the researchers. "This is what this is doing right now is actually combining both the tracking data and the depth sensor data to allow us to create a a 3D capture of the space."

As the software and hardware improve in accuracy, the quality of that capture should keep climbing; Johnny noted that storing the data and doing more post-processing offline would already yield much higher quality than the real-time result. For now, the app demonstrates the potential for Project Tango to revolutionize mobile gaming.


"WEBVTTKind: captionsLanguage: enhey it's joshuar from Android authority here at Google IO at the Project Tango booth and I'm here with one of the uh Engineers of Project Tango Johnny and I wanted to get some uh to get some extra information because I know that Tango in some ways might be somewhat of a mystery to some of you out there so I wanted to get it from the actual people themselves what Project Tango is from their word so why don't we get a bit of an explanation about what Project Tango is and we have seen uh gaming uh examples of it but what else can Project Tango do sure the at a high level the goal of Project Tango is to advance 3D sensing and tracking on mobile devices um so we've been working with the hardware ecosystem and the software Partners to advance both the sensors and the algorithms necessary to make these devices track their position and and capture models of their space um so one of the things that we've demoed a lot recently is uh what you can do with just the tracking information where we take that camera and uh control either a Unity uh based 3D environment or a Unreal Engine based uh 3D environment um when we add the data from the depth sensor um this is where it starts to get into new code and new algorithms that aren't quite ready to put in the SDK yet um but also why we haven't had so many demos but we have a few demos here that essentially as you walk through a space you're actually able to capture the geometry of the floor and the walls and the furniture and uh outside of gaming uh well inside of gaming that allows you to have characters that sort of know how to navigate through your hallways or potentially play hide-and-seek in your in your house and because it knows where your closet is um uh but industrial applications and professional applications include real estate where you're trying to estimate the square footage of your home uh we have uh apps from uh Trimble uh which essentially start to explore well how much better could you make room scanning uh because room scanning is ends up being important for shopping for furniture or being able to figure out how much carpet do you need or how many tiles that you want to put on the floor um and then there's industrial inspection so the trumo also has another very early prototype of an experience where we've taken in a 3D model of the moson center and once we recognize the position of the device in the moson center you can essentially get x-ray vision of where are the elevators uh where's the service stations for this particular p panel and without having to walk around the building uh sort of uh looking at the signs on the walls you actually can hold up the tablet and see sort of through the walls um um would uh sorry would uh one one one question that just came to mind um for for various applications I say consumers would be using would the algorithms or the 3D atics would um uh uh let's say would they be made available for the various applications or would the consumers be able to create those schematics themselves so the amount of data that the device can generate from scratch is uh relatively modest it basically is uh just the position of the device as you were holding it and then rough geometry information sort of I can pick out uh sort of rough walls and floors and some furniture uh since it has a 3D sensor to capture that geometry um it does not automatically know about say where does the hot water pipe run through the wall that requires having the cad model uh in advance or from other source so and 
chances are people don't won't have that available in most buildings but for professional and Industrial uh commercial uh structures that data may be available by the Builder okay yeah so the it's a 7-in tablet that has a tiger K1 inside and uh our Custom Design sensor so we have a fishey camera that's designed for motion tracking which is in the middle uh and if you think about human Vision we have a a tremendously wide peripheral vision that that expands out uh uh gives us enormous amount of context about the way the environment is moving and so what this does is gives us a wide field of view of the environment and then we have a set of sensors on here that are depth sensors um these are sensors that basically give us information about the floor and the walls uh to tell us what the geometry is uh in this particular tablet we have a structured light uh depth sensor from a company called mentis uh but we also have a prototype here of a depth sensor from a company called PMD Tech um that has a different principle of operation which is time of flight uh and the time of flight uh has slightly different properties in that it doesn't require a large separation of sensors um and potentially is more robust to sunlight um but as we evolve the hardware will continue to sort of push the push the envelope in terms of the performance of those depth sensors sure um so let me give you a quick tour of the software uh that actually runs on a Project Tango device uh this is our Diagnostics tool which just gives us uh a view of the sensor data coming in so on the left side of the screen you can see this fisheye lens uh in fact I can still see the my fingers just around the tablet uh and so this is a super wide camera that that allows us to understand the motion of the device uh the left those points are being tracked already yeah so those green points are essentially uh the image processing uh doing optical flow so this allows us to understand while everything moved in the world uh in a way that makes sense uh to the way the device was held uh we also have the gyro and accelerometer data right below the image uh that is essentially time synchronized with the with the camera so this allows all the sensor data to come into the same time uh or well time stamped so we can make an estimate of where the device is U but if I actually move the tablet left and right or actually make a circle you can actually see that it's actually tracing out the circle on the screen right over on the uh screen right here yeah yeah so I'm going to just walk around here a little bit and you'll see uh the tracking showing up on the screen at it over here yeah so Johnny's going to go off camera and you'll see the tracking right here is actually taking where he is and there he goes you can see it's actually mapping the entire area that he is walking through using all of those sensors and now we have this sort of this kind of janky figure eight right here oh that's great uh yeah so this is uh tracing out my path so one of the things we can do is uh insert that path into a game environment so this is one of the uh Unity applications that we have will be part of the SDK which is uh how do you actually use this data for end user experience and so what we've done is we've connected that tracking data to the unity uh camera and this is a very simple game where essentially I just pick up the cube and put it on the switch uh but to actually finish this game I um you can see that I can't quite reach the cube from where I am now so I actually 
have to go walk forward to solve and is that using is that using the data that you just tracked uh that's right that's right so so that this ability to track the physical motion of the device is now being piped into Unity um so if I pick up this green switch block sorry I can see that the green switch is actually over there on the booth you to over there I'll come back all right and there you go so this so This basically introduces motion into Mobile gaming so if I move forward in the real world I actually move move forward in the virtual space as well so this allows uh game developers or application developers to start thinking about well how would I actually use my physical room either my my living room or my office space and actually use this inside the game um which is if you can imagine now that I've understand where I am in my house or my environment and I actually have a little bit of geometry data uh I can sort of transform my environment into a fantasy world so again I'm just using the tablet as a camera controller I can look at the trees and the stone but also see on the ground there's a small wizard um so if we Crouch if I crouch down I can get down to the Wizard's level and interact with him um so essentially if I have the ability to uh place these assets I can start transforming parts of my living room into a fantasy environment so I could imagine you know my living room is a fantasy world versus the my bedroom is another sort of Safe Haven or you know an Iceland and you can actually use physical space as part of the game uh if we had game a game that actually understood the shape of the Moscone Center you can imagine actually having multiple people walk around this space competing with each other for territory you think that would not be very dangerous though like uh I actually have a personal desire to see a game at some point that causes two people to actively tackle each other but we'll see if that actually comes to comes to fruition here's an app from a university partner that we just got recently that actually combines the uh real-time data of the tracking with the depth sensing that we have um we'll see if this runs this was a an app that we just got a few days ago so what this is doing right now is actually combining both the tracking data and the depth sensor data to allow us to create a a 3D capture of the space Oh it's oh that's actually Imaging the yeah so this is like I'm painting the environment with my camera and I can back up and I'll scan you into the space okay that's supposed to be me right there yeah so so this is a you know a very simple mesh that we're building in real time um but as the software improves and as the hardware improves in terms of it accuracy uh we can imagine that the quality of this data continues to higher and higher uh Fidelity if we actually stored the data and did more postprocessing on it uh app developers could actually get much much higher quality but this is just what we could process in real time that's wonderful um so these are sort of the building blocks you can imagine allowing developers to start building new user experiences that use both motion as well as physical geometry uh to uh create new apps okay do you think that uh next year maybe for Io 2015 or 2016 you'll be able to create that game using the Moscone Center schematic uh I think that would be a really neat Target to aim for we'll have to see there's a lot of things that have to sort of improve on both the software side and the hardware side to make it happen oh I'm 
and I'm sure it's going to get there I mean you got some great minds working on it and the way the where it is already I'm I'm astounded this is really wonderful like it's great all right well thank you very much and um yeah you know keep it tuned here we have Project Tango as it uh as it happens we're going to be covering it as it comes along it's a wonderful wonderful Suite of technology that we have here and Johnny one of the minds behind it thank you so much for uh for for fiing our questions and for giving us a demo of it thanks very much all right take care all right this Android authority coming to you live from google.io 2014hey it's joshuar from Android authority here at Google IO at the Project Tango booth and I'm here with one of the uh Engineers of Project Tango Johnny and I wanted to get some uh to get some extra information because I know that Tango in some ways might be somewhat of a mystery to some of you out there so I wanted to get it from the actual people themselves what Project Tango is from their word so why don't we get a bit of an explanation about what Project Tango is and we have seen uh gaming uh examples of it but what else can Project Tango do sure the at a high level the goal of Project Tango is to advance 3D sensing and tracking on mobile devices um so we've been working with the hardware ecosystem and the software Partners to advance both the sensors and the algorithms necessary to make these devices track their position and and capture models of their space um so one of the things that we've demoed a lot recently is uh what you can do with just the tracking information where we take that camera and uh control either a Unity uh based 3D environment or a Unreal Engine based uh 3D environment um when we add the data from the depth sensor um this is where it starts to get into new code and new algorithms that aren't quite ready to put in the SDK yet um but also why we haven't had so many demos but we have a few demos here that essentially as you walk through a space you're actually able to capture the geometry of the floor and the walls and the furniture and uh outside of gaming uh well inside of gaming that allows you to have characters that sort of know how to navigate through your hallways or potentially play hide-and-seek in your in your house and because it knows where your closet is um uh but industrial applications and professional applications include real estate where you're trying to estimate the square footage of your home uh we have uh apps from uh Trimble uh which essentially start to explore well how much better could you make room scanning uh because room scanning is ends up being important for shopping for furniture or being able to figure out how much carpet do you need or how many tiles that you want to put on the floor um and then there's industrial inspection so the trumo also has another very early prototype of an experience where we've taken in a 3D model of the moson center and once we recognize the position of the device in the moson center you can essentially get x-ray vision of where are the elevators uh where's the service stations for this particular p panel and without having to walk around the building uh sort of uh looking at the signs on the walls you actually can hold up the tablet and see sort of through the walls um um would uh sorry would uh one one one question that just came to mind um for for various applications I say consumers would be using would the algorithms or the 3D atics would um uh uh let's say would they be made 
available for the various applications or would the consumers be able to create those schematics themselves so the amount of data that the device can generate from scratch is uh relatively modest it basically is uh just the position of the device as you were holding it and then rough geometry information sort of I can pick out uh sort of rough walls and floors and some furniture uh since it has a 3D sensor to capture that geometry um it does not automatically know about say where does the hot water pipe run through the wall that requires having the cad model uh in advance or from other source so and chances are people don't won't have that available in most buildings but for professional and Industrial uh commercial uh structures that data may be available by the Builder okay yeah so the it's a 7-in tablet that has a tiger K1 inside and uh our Custom Design sensor so we have a fishey camera that's designed for motion tracking which is in the middle uh and if you think about human Vision we have a a tremendously wide peripheral vision that that expands out uh uh gives us enormous amount of context about the way the environment is moving and so what this does is gives us a wide field of view of the environment and then we have a set of sensors on here that are depth sensors um these are sensors that basically give us information about the floor and the walls uh to tell us what the geometry is uh in this particular tablet we have a structured light uh depth sensor from a company called mentis uh but we also have a prototype here of a depth sensor from a company called PMD Tech um that has a different principle of operation which is time of flight uh and the time of flight uh has slightly different properties in that it doesn't require a large separation of sensors um and potentially is more robust to sunlight um but as we evolve the hardware will continue to sort of push the push the envelope in terms of the performance of those depth sensors sure um so let me give you a quick tour of the software uh that actually runs on a Project Tango device uh this is our Diagnostics tool which just gives us uh a view of the sensor data coming in so on the left side of the screen you can see this fisheye lens uh in fact I can still see the my fingers just around the tablet uh and so this is a super wide camera that that allows us to understand the motion of the device uh the left those points are being tracked already yeah so those green points are essentially uh the image processing uh doing optical flow so this allows us to understand while everything moved in the world uh in a way that makes sense uh to the way the device was held uh we also have the gyro and accelerometer data right below the image uh that is essentially time synchronized with the with the camera so this allows all the sensor data to come into the same time uh or well time stamped so we can make an estimate of where the device is U but if I actually move the tablet left and right or actually make a circle you can actually see that it's actually tracing out the circle on the screen right over on the uh screen right here yeah yeah so I'm going to just walk around here a little bit and you'll see uh the tracking showing up on the screen at it over here yeah so Johnny's going to go off camera and you'll see the tracking right here is actually taking where he is and there he goes you can see it's actually mapping the entire area that he is walking through using all of those sensors and now we have this sort of this kind of janky figure 
eight right here oh that's great uh yeah so this is uh tracing out my path so one of the things we can do is uh insert that path into a game environment so this is one of the uh Unity applications that we have will be part of the SDK which is uh how do you actually use this data for end user experience and so what we've done is we've connected that tracking data to the unity uh camera and this is a very simple game where essentially I just pick up the cube and put it on the switch uh but to actually finish this game I um you can see that I can't quite reach the cube from where I am now so I actually have to go walk forward to solve and is that using is that using the data that you just tracked uh that's right that's right so so that this ability to track the physical motion of the device is now being piped into Unity um so if I pick up this green switch block sorry I can see that the green switch is actually over there on the booth you to over there I'll come back all right and there you go so this so This basically introduces motion into Mobile gaming so if I move forward in the real world I actually move move forward in the virtual space as well so this allows uh game developers or application developers to start thinking about well how would I actually use my physical room either my my living room or my office space and actually use this inside the game um which is if you can imagine now that I've understand where I am in my house or my environment and I actually have a little bit of geometry data uh I can sort of transform my environment into a fantasy world so again I'm just using the tablet as a camera controller I can look at the trees and the stone but also see on the ground there's a small wizard um so if we Crouch if I crouch down I can get down to the Wizard's level and interact with him um so essentially if I have the ability to uh place these assets I can start transforming parts of my living room into a fantasy environment so I could imagine you know my living room is a fantasy world versus the my bedroom is another sort of Safe Haven or you know an Iceland and you can actually use physical space as part of the game uh if we had game a game that actually understood the shape of the Moscone Center you can imagine actually having multiple people walk around this space competing with each other for territory you think that would not be very dangerous though like uh I actually have a personal desire to see a game at some point that causes two people to actively tackle each other but we'll see if that actually comes to comes to fruition here's an app from a university partner that we just got recently that actually combines the uh real-time data of the tracking with the depth sensing that we have um we'll see if this runs this was a an app that we just got a few days ago so what this is doing right now is actually combining both the tracking data and the depth sensor data to allow us to create a a 3D capture of the space Oh it's oh that's actually Imaging the yeah so this is like I'm painting the environment with my camera and I can back up and I'll scan you into the space okay that's supposed to be me right there yeah so so this is a you know a very simple mesh that we're building in real time um but as the software improves and as the hardware improves in terms of it accuracy uh we can imagine that the quality of this data continues to higher and higher uh Fidelity if we actually stored the data and did more postprocessing on it uh app developers could actually get much much 
higher quality but this is just what we could process in real time that's wonderful um so these are sort of the building blocks you can imagine allowing developers to start building new user experiences that use both motion as well as physical geometry uh to uh create new apps okay do you think that uh next year maybe for Io 2015 or 2016 you'll be able to create that game using the Moscone Center schematic uh I think that would be a really neat Target to aim for we'll have to see there's a lot of things that have to sort of improve on both the software side and the hardware side to make it happen oh I'm and I'm sure it's going to get there I mean you got some great minds working on it and the way the where it is already I'm I'm astounded this is really wonderful like it's great all right well thank you very much and um yeah you know keep it tuned here we have Project Tango as it uh as it happens we're going to be covering it as it comes along it's a wonderful wonderful Suite of technology that we have here and Johnny one of the minds behind it thank you so much for uh for for fiing our questions and for giving us a demo of it thanks very much all right take care all right this Android authority coming to you live from google.io 2014\n"