How VR Works - Frametimes & Warp Misses w/ Tom Petersen

**Understanding How Virtual Reality Works: A Comprehensive Overview**

---

### Introduction to Tom Petersen from Nvidia

Tom Petersen, a prominent figure at Nvidia, joins us today to discuss the intricacies of how virtual reality (VR) works. Known for his stage presentations and for his work on FCAT, Nvidia's frame-time capture and analysis tooling that has seen near-universal adoption among reviewers, Tom brings extensive expertise to the conversation. Today, we delve into the technical aspects of VR, focusing on its frame timeline, its challenges, and future developments.

---

### Overview of How VR Works

On a traditional monitor, each frame must be delivered within a fixed time window, roughly 16.7 milliseconds at a 60 Hz refresh rate. VR adds complexity on top of this: lenses, head motion, and the need for real-time simulation all enter the picture. The process involves two key components, the game application and the runtime, whose division of labor is sketched in code after the list below.

- **Game Application**: Responsible for reading input, running the simulation, and rendering frames. It produces a flat, square texture much like a regular PC game, which is then read by the runtime.

- **Runtime (e.g., Oculus for the Rift, Valve for the Vive)**: Running in parallel with the game, the runtime handles lens distortion correction and reprojection (late warp). These tasks ensure the image aligns with the headset's lenses and stays fluid as the head moves.
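
To make the division of labor concrete, here is a minimal sketch of one refresh interval, with the game rendering an eye texture and the runtime compositing it. This is illustrative Python, not any vendor's actual API; the function names, the `EyeTexture` type, and the single-float "head pose" are all simplifications invented for the example.

```python
import time
from dataclasses import dataclass

REFRESH_HZ = 90
FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~11.1 ms per refresh at 90 Hz

@dataclass
class EyeTexture:
    """Flat, square texture rendered by the game, before lens correction."""
    frame_id: int
    head_pose_at_render: float  # simplified: a single yaw angle in radians

def sample_head_pose() -> float:
    # Stand-in for reading the headset's tracker.
    return (time.perf_counter() % 1.0) * 0.01

def game_render(frame_id: int) -> EyeTexture:
    """Game's job: read input, simulate, render a flat texture."""
    return EyeTexture(frame_id, sample_head_pose())

def runtime_compose(tex: EyeTexture) -> str:
    """Runtime's job: lens correction plus late warp (reprojection)."""
    drift = sample_head_pose() - tex.head_pose_at_render
    return f"lens-corrected frame {tex.frame_id}, late-warped by {drift:+.5f} rad"

# Each vsync interval, the game renders and the runtime composites,
# and both must finish inside the same ~11.1 ms window.
for frame_id in range(3):
    deadline = time.perf_counter() + FRAME_BUDGET
    scanout = runtime_compose(game_render(frame_id))
    print(scanout, "| made vsync:", time.perf_counter() <= deadline)
```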

---

### The Role of Runtime in VR

The runtime processes frames rendered by the game, applying necessary corrections before they are displayed on the headset. This includes:

- **Lens Correction**: Pre-distorting the flat rendered image so that, after passing through the headset's lenses, it looks correct to the eye. The lenses let the eye relax and focus on a display sitting only a couple of inches away, but they distort the image in the process (a minimal sketch appears at the end of this section).

- **Reprojection (Late Warp)**: Re-warping the most recent frame with the latest head-position data just before scanout, keeping the view locked to head motion and helping to prevent motion sickness.

The 90 Hz refresh rate leaves a window of roughly 11 milliseconds per frame (1000 ms / 90 ≈ 11.1 ms), a tight deadline that the game and the runtime together must meet. The runtime's own pass is designed to be quick, typically a couple of milliseconds, but any delay can still lead to a warp miss or a drop frame.
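
The lens-correction step can be sketched with a simple radial model: headset lenses add pincushion distortion, so the runtime pre-applies the opposite barrel distortion, pulling pixels toward the lens center so the two cancel out. The two coefficients below are made-up values for illustration; real runtimes use calibrated per-headset (and often per-color-channel) distortion meshes.

```python
# Hypothetical radial distortion coefficients; real values are
# calibrated per headset design.
K1, K2 = 0.22, 0.24

def predistort(x: float, y: float) -> tuple[float, float]:
    """Barrel pre-distortion of a normalized image coordinate.

    (x, y) lie in [-1, 1] with (0, 0) at the lens center. Scaling
    points toward the center cancels the pincushion distortion the
    lens will apply, so the eye sees an undistorted image.
    """
    r2 = x * x + y * y
    scale = 1.0 / (1.0 + K1 * r2 + K2 * r2 * r2)
    return x * scale, y * scale

# A point near the edge of the eye texture is pulled inward,
# while the center is left untouched:
print(predistort(0.9, 0.0))  # ~(0.674, 0.0)
print(predistort(0.0, 0.0))  # (0.0, 0.0)
```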

---

### Challenges in VR Frame Delivery

In ideal scenarios, each frame is rendered and processed within the allocated time window. However, real-world conditions often pose challenges:

- **Warp Miss**: Occurs when the runtime gets neither a new frame nor a reprojected one out in time, so an older frame is simply replayed unmodified, with no updated head tracking. This is the VR equivalent of desktop stutter and is the most disruptive failure.

- **Drop Frame**: Occurs when the runtime misses a new game frame but manages to reproject a prior frame using the latest head-position data. Head tracking stays fluid, so it is far less nauseating than a warp miss, but the animation within the frame is a step behind, producing visible judder.

These issues highlight the importance of efficient rendering and processing to ensure a seamless VR experience.
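
To make the taxonomy concrete, here is a toy classifier for what reaches the display at each vsync. The boolean inputs are simplifications invented for the example; a real runtime makes this decision from actual submission timestamps against the hardware deadline.

```python
from enum import Enum, auto

class FrameOutcome(Enum):
    NEW_FRAME = auto()   # game frame arrived in time; the ideal case
    DROP_FRAME = auto()  # prior frame reprojected with a fresh head pose
    WARP_MISS = auto()   # prior frame replayed unmodified; visible stutter

def classify_refresh(game_frame_ready: bool, reprojection_done: bool) -> FrameOutcome:
    """Classify one refresh interval under the warp-miss/drop-frame taxonomy."""
    if game_frame_ready:
        return FrameOutcome.NEW_FRAME
    if reprojection_done:
        return FrameOutcome.DROP_FRAME  # stale animation, fresh tracking
    return FrameOutcome.WARP_MISS       # stale animation AND stale tracking

print(classify_refresh(game_frame_ready=False, reprojection_done=True))
# FrameOutcome.DROP_FRAME
```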

---

### Strategies for Handling Missed Frames

To mitigate missed frames, both game developers and runtime providers employ strategies:

- **Dynamic Quality Adjustment**: Lowering graphics settings (most commonly render resolution) in real time when the system is struggling to meet the frame deadline, trading some image quality for consistently smooth frame delivery.

- **Reprojection Techniques**: Reusing prior frames, warped with fresh head-pose data, to maintain fluidity without severe visual artifacts.

While dynamic adjustment is the ideal answer, it requires cooperation between the game and the runtime (for example, the runtime signaling that a frame arrived late and suggesting lower quality settings), and that cooperation is still evolving. A toy version of such a feedback loop is sketched below.
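
As a sketch of the idea, the controller below watches recent GPU frame times and nudges a render-resolution scale up or down against the ~11.1 ms budget. The thresholds, step sizes, and class name are illustrative guesses, not values or APIs from any shipping runtime.

```python
from collections import deque

FRAME_BUDGET_MS = 1000.0 / 90.0  # ~11.1 ms at 90 Hz

class RenderScaleController:
    """Toy dynamic-quality loop: shrink render resolution when recent
    GPU frame times threaten the vsync deadline, grow it back when
    there is headroom."""

    def __init__(self):
        self.scale = 1.0                  # fraction of native resolution per axis
        self.history = deque(maxlen=10)   # last 10 GPU frame times (ms)

    def on_frame(self, gpu_time_ms: float) -> float:
        self.history.append(gpu_time_ms)
        avg = sum(self.history) / len(self.history)
        if avg > 0.9 * FRAME_BUDGET_MS:    # danger zone: about to miss vsync
            self.scale = max(0.65, self.scale - 0.05)
        elif avg < 0.7 * FRAME_BUDGET_MS:  # plenty of headroom: restore quality
            self.scale = min(1.0, self.scale + 0.02)
        return self.scale

ctrl = RenderScaleController()
for t in [9.0, 10.5, 11.0, 12.0, 10.8]:
    print(f"gpu={t}ms -> render scale {ctrl.on_frame(t):.2f}")
```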

---

### Future Considerations in VR

Looking ahead, improvements in hardware and software will continue to enhance VR. Since VR effectively runs with vsync hardwired on, progress is likely to come less from new display-sync modes and more from better measurement (a VR analogue of desktop frame-time analysis that also accounts for warp misses and drop frames), smarter reprojection, and tighter game-runtime cooperation. As VR continues to evolve, understanding frame timing and delivery mechanisms will be crucial for both developers and users.

---

### Conclusion

VR's complexity lies in its need for real-time simulation, motion handling, and efficient image processing. While challenges like warp misses and drop frames persist, ongoing advancements aim to deliver more immersive experiences. By comprehending the role of runtimes, frame timing, and future innovations, we can better appreciate the technological marvel that VR represents.


"WEBVTTKind: captionsLanguage: enhey everyone I am joined by Tom Peterson from Nvidia hey and you may know Tom from doing a lot of the stage presents I do a lot of stage presents right a lot of giveaways Tom actually also worked on FCAT I did for the frame Time Capture stuff that you've seen pretty much everyone use at this point but today we're talking about how VR works so the the question here is if we start with a timeline of VR you know for a monitor you have X milliseconds y seconds the frame needs to be delivered in that time what does it look like for VR well uh VR is a little bit more complicated of course because there's more going on there's the game of course which is the application that you're running like say raw data or something like that and the game's job is to take physical input to do a simulation and then to generate a new frame but because it's VR there's lenses and there's head motion all that kind of stuff so the uh runtime providers folks like Oculus or valve with Vive are providing another program that I just call the runtime and the runtime runs in parallel with the game so the game is rendering a texture that's Square effectively just like a regular PC game but then that gets read by the runtime and the runtime does a couple things it does effectively lens correction which is making the image uh work with the lenses that are in the headset and it's also doing um something that's called late warp or reprojection and the reason that's happening is to kind of deal with the fact that there's a fixed 11 Mill second window because of 90 HZ they're trying to hit all the time and the the lenses are there of course because without the lenses you're got something right in front of your eyes of course can't really focus on it yeah if you imagine looking at of your headset you've got that that display about 2 inches from your head so the the lenses are there to allow your eye to relax and still see what's on the the image but those lenses cause a distortion when it's on your lens so they actually do undistortion kind of U modifying the square images to make them a little bit more curved to get for the lenses so the big challenge with VR is how do you how do you make all this work so again you're looking at an 11 millisecond window might help if I just draw a picture okay let's try try this in real time totally impromptu totally impromptu didn't know you were going to do this all right so this is my uh my version of time right and these blue lines represent every time you're going to draw a new image on the VR headset so you've got 0 11 22 milliseconds 33 milliseconds yeah so basically since it's 90 Hertz there's an 11 millisecond step and it happens over and over and over so think of it as you've got this window just a small time right before that next cycle that you have to be done with your image so since there's two people involved you really kind of never know what's going to happen so the way it works is the game might start rendering so let's call that game and it's looking at things like you know taking a headset position and it's calculating animation and it might be doing doing some Network stuff but it's figuring out what's the image that uh I'm about to put up on the screen and the next thing that happens is the runtime which is actually going to read that texture and the runtime is going to do things like warping for the lens or lens warp and it's also going to do a reprojection which is uh sort of retiming it right um and it has to get all done in time to make 
that next refresh so on the headset this would be you know image number one so that's our frame yep that's what you see and this would be game frame number one right so that's the way VR is supposed to work the problem is let's say your GPU is running slower or you have like CPU get busy sometimes this game runs too long or sometimes that runtime long runs too long so what it may have actually looked like is the game ran too long so let's call that game two but now you've already missed the next interval so the runtime still down here doing its stuff but it to make a decision what does it want to show in the next frame so the are the headsets aware of when the run time usually occurs and how long it takes yeah the the uh headset manufacturers have a whole software architecture that is effectively defined in the runtime so think of it this runtime as Oculus or Vibes or any other headsets Sky it's their secret sauce of how do they give a great experience but no matter what their algorithm is they've got a fundamental problem which is the game did not get a frame rendered in time but they still have to put something on the screen because your headset's you know running at 11 milliseconds so there's a couple different strategies right one is I just take the old rendered frame and I reh you know I reshow it you can do reprojection where you basically take this uh a frame that was rendered by the game earlier and you modify it to put it on the screen again um you can also just do nothing and take uh the unmodified frame from last time and reshow it so all of these different defects have different performance impacts and it's really kind of complex as to how do you how do you represent the performance of VR right so with with traditional monitors non non VR stuff uh whether you have vsync or gsync or none of those uh we have issues like stuttering tearing what is the VR equivalent of those oh okay so since um it's well known that if you have tearing in VR it's absolutely really really horrible experience yeah so so the first thing is that VR is almost always vsync on okay and that's that's hardwired right um and so what that means is the real decision is what do you do at every refresh interval and you kind of when things are working right you just show the new frame and everybody's happy if you don't show a new frame you can either reproject an old frame which is kind of synthesizing a new frame right it's it's the runtime creating something to show that they didn't get from the game um or you do nothing right so when you do nothing which is like in this case I call that a warp Miss okay so a warp Miss means that you replayed an old frame just like on desktop when you stutter it's exactly the same thing so is it does everything stay where it was or is there still a head tracking everything stays where it was because when you have a warp miss the runtime didn't get a new frame done in time so the driver just replays an old frame the other thing that could happen is what I call a drop frame and I I think these terms are still settling you know everybody's got a different name for all these things so when there's a drop frame what that means to me is the runtime was able to take uh some version of a prior frame and then modify it and get that thing out in time using the latest head position okay so as long as you use a current head position and you adjust or reproject a prior frame you get a reason good experience right but the animation in this Frame that's reprojected is actually coming from you 
know an older frame so it looks like a dropped frame from an animation perspective but in terms of fluidity with head tracking you're not getting sick from it right you're not getting sick so I my experience has been when you're reprojecting frames it's a better experience than if you're doing nothing and you're dropping frames or warping missing um but you can definitely see the difference between a dropped frame which is the these reprojected frames and Native frames native frames that are running at 11 milliseconds and everybody's happy so I think it's important as we start figuring out how should we represent you know all of this stuff that we're going to we're going to comprehend you know how long did the game take to render and that's like the New Concept that's sort of like frame time and there's some questions about how that should be measured but we'll get all that worked out and then so you got frame time which is kind of like it is on desktop but then there's this concept of a warp Miss which is the Run time did not have the time to put a new frame out and then there's a concept of a drop frame that is animation is from a prior frame right uh now when looking at the the games that are coming out I know it's possible for dynamic quality changing valve has certainly talked about it from what I understand not a lot of games currently do it we know the the tech demo does the portal Tech demo yep so in theory can the games or does the API or what sees this happening and says I'm going to miss that frame I need to lower the quality what's what's going on there um I would expect most of that to be done by the runtime so it could be in the game but it and it's not within nvidia's control but I would expect the runtime to do something like say hey the frame I just got was a little bit late and because of that I suggest that you lower the quality settings of your render of the game and the the API would be something between the runtime and the game where they're cooperating on the technology to red quality the runtime kind of needs to know about it um but um you know I think it's a little unclear right so recapping the major points here uh the runtime sits sort of at the end of the pipe for delivering the frame y uh we were talking previously about sort of how long that normally takes now with the Vive and the rift do they take a different it's very similar I mean uh if the the runtime is designed to be very quick and so you want to do you're not doing a full rendered of a frame you're you're basically doing a quick sh on an existing rendered image and then um showing it so it's not it's not meant to take as long as the games take so it's a couple milliseconds typically and then outside of that the items look out for in the future uh we've got warp misses and then drop frames yep and then the difference is basically a totally still uh output from a previous render I guess render pass versus one with no animation yeah the way I think about it is when you have a warp Miss you're you're going to get a stuttery experience and it can be pretty bad and when you have a drop frame you're going to be um missing animation steps so you'll see some jutter and animation but it's better than a warpness sure yeah sure well very cool that's a good overview I think for the basics of how VR works yeah and I'm sure we'll have more stuff to talk about at some point in the future I hope so I haven't gotten into a whole lot of VR yet I'm sure our readers know we've looked at it a million times at all the the 
tours uh but we'll see we'll see see where expansion options are so Tom thank you for the walk through the Glorious CH the Glorious chart with the engineer let say yes and we'll get a little smiley face yeah Hearts to RDU right my woohoo but thank you for joining yeah good to see see I'll see you next time yeahhey everyone I am joined by Tom Peterson from Nvidia hey and you may know Tom from doing a lot of the stage presents I do a lot of stage presents right a lot of giveaways Tom actually also worked on FCAT I did for the frame Time Capture stuff that you've seen pretty much everyone use at this point but today we're talking about how VR works so the the question here is if we start with a timeline of VR you know for a monitor you have X milliseconds y seconds the frame needs to be delivered in that time what does it look like for VR well uh VR is a little bit more complicated of course because there's more going on there's the game of course which is the application that you're running like say raw data or something like that and the game's job is to take physical input to do a simulation and then to generate a new frame but because it's VR there's lenses and there's head motion all that kind of stuff so the uh runtime providers folks like Oculus or valve with Vive are providing another program that I just call the runtime and the runtime runs in parallel with the game so the game is rendering a texture that's Square effectively just like a regular PC game but then that gets read by the runtime and the runtime does a couple things it does effectively lens correction which is making the image uh work with the lenses that are in the headset and it's also doing um something that's called late warp or reprojection and the reason that's happening is to kind of deal with the fact that there's a fixed 11 Mill second window because of 90 HZ they're trying to hit all the time and the the lenses are there of course because without the lenses you're got something right in front of your eyes of course can't really focus on it yeah if you imagine looking at of your headset you've got that that display about 2 inches from your head so the the lenses are there to allow your eye to relax and still see what's on the the image but those lenses cause a distortion when it's on your lens so they actually do undistortion kind of U modifying the square images to make them a little bit more curved to get for the lenses so the big challenge with VR is how do you how do you make all this work so again you're looking at an 11 millisecond window might help if I just draw a picture okay let's try try this in real time totally impromptu totally impromptu didn't know you were going to do this all right so this is my uh my version of time right and these blue lines represent every time you're going to draw a new image on the VR headset so you've got 0 11 22 milliseconds 33 milliseconds yeah so basically since it's 90 Hertz there's an 11 millisecond step and it happens over and over and over so think of it as you've got this window just a small time right before that next cycle that you have to be done with your image so since there's two people involved you really kind of never know what's going to happen so the way it works is the game might start rendering so let's call that game and it's looking at things like you know taking a headset position and it's calculating animation and it might be doing doing some Network stuff but it's figuring out what's the image that uh I'm about to put up on the screen and the next 
thing that happens is the runtime which is actually going to read that texture and the runtime is going to do things like warping for the lens or lens warp and it's also going to do a reprojection which is uh sort of retiming it right um and it has to get all done in time to make that next refresh so on the headset this would be you know image number one so that's our frame yep that's what you see and this would be game frame number one right so that's the way VR is supposed to work the problem is let's say your GPU is running slower or you have like CPU get busy sometimes this game runs too long or sometimes that runtime long runs too long so what it may have actually looked like is the game ran too long so let's call that game two but now you've already missed the next interval so the runtime still down here doing its stuff but it to make a decision what does it want to show in the next frame so the are the headsets aware of when the run time usually occurs and how long it takes yeah the the uh headset manufacturers have a whole software architecture that is effectively defined in the runtime so think of it this runtime as Oculus or Vibes or any other headsets Sky it's their secret sauce of how do they give a great experience but no matter what their algorithm is they've got a fundamental problem which is the game did not get a frame rendered in time but they still have to put something on the screen because your headset's you know running at 11 milliseconds so there's a couple different strategies right one is I just take the old rendered frame and I reh you know I reshow it you can do reprojection where you basically take this uh a frame that was rendered by the game earlier and you modify it to put it on the screen again um you can also just do nothing and take uh the unmodified frame from last time and reshow it so all of these different defects have different performance impacts and it's really kind of complex as to how do you how do you represent the performance of VR right so with with traditional monitors non non VR stuff uh whether you have vsync or gsync or none of those uh we have issues like stuttering tearing what is the VR equivalent of those oh okay so since um it's well known that if you have tearing in VR it's absolutely really really horrible experience yeah so so the first thing is that VR is almost always vsync on okay and that's that's hardwired right um and so what that means is the real decision is what do you do at every refresh interval and you kind of when things are working right you just show the new frame and everybody's happy if you don't show a new frame you can either reproject an old frame which is kind of synthesizing a new frame right it's it's the runtime creating something to show that they didn't get from the game um or you do nothing right so when you do nothing which is like in this case I call that a warp Miss okay so a warp Miss means that you replayed an old frame just like on desktop when you stutter it's exactly the same thing so is it does everything stay where it was or is there still a head tracking everything stays where it was because when you have a warp miss the runtime didn't get a new frame done in time so the driver just replays an old frame the other thing that could happen is what I call a drop frame and I I think these terms are still settling you know everybody's got a different name for all these things so when there's a drop frame what that means to me is the runtime was able to take uh some version of a prior frame and then 
modify it and get that thing out in time using the latest head position okay so as long as you use a current head position and you adjust or reproject a prior frame you get a reason good experience right but the animation in this Frame that's reprojected is actually coming from you know an older frame so it looks like a dropped frame from an animation perspective but in terms of fluidity with head tracking you're not getting sick from it right you're not getting sick so I my experience has been when you're reprojecting frames it's a better experience than if you're doing nothing and you're dropping frames or warping missing um but you can definitely see the difference between a dropped frame which is the these reprojected frames and Native frames native frames that are running at 11 milliseconds and everybody's happy so I think it's important as we start figuring out how should we represent you know all of this stuff that we're going to we're going to comprehend you know how long did the game take to render and that's like the New Concept that's sort of like frame time and there's some questions about how that should be measured but we'll get all that worked out and then so you got frame time which is kind of like it is on desktop but then there's this concept of a warp Miss which is the Run time did not have the time to put a new frame out and then there's a concept of a drop frame that is animation is from a prior frame right uh now when looking at the the games that are coming out I know it's possible for dynamic quality changing valve has certainly talked about it from what I understand not a lot of games currently do it we know the the tech demo does the portal Tech demo yep so in theory can the games or does the API or what sees this happening and says I'm going to miss that frame I need to lower the quality what's what's going on there um I would expect most of that to be done by the runtime so it could be in the game but it and it's not within nvidia's control but I would expect the runtime to do something like say hey the frame I just got was a little bit late and because of that I suggest that you lower the quality settings of your render of the game and the the API would be something between the runtime and the game where they're cooperating on the technology to red quality the runtime kind of needs to know about it um but um you know I think it's a little unclear right so recapping the major points here uh the runtime sits sort of at the end of the pipe for delivering the frame y uh we were talking previously about sort of how long that normally takes now with the Vive and the rift do they take a different it's very similar I mean uh if the the runtime is designed to be very quick and so you want to do you're not doing a full rendered of a frame you're you're basically doing a quick sh on an existing rendered image and then um showing it so it's not it's not meant to take as long as the games take so it's a couple milliseconds typically and then outside of that the items look out for in the future uh we've got warp misses and then drop frames yep and then the difference is basically a totally still uh output from a previous render I guess render pass versus one with no animation yeah the way I think about it is when you have a warp Miss you're you're going to get a stuttery experience and it can be pretty bad and when you have a drop frame you're going to be um missing animation steps so you'll see some jutter and animation but it's better than a warpness sure yeah sure well 
very cool that's a good overview I think for the basics of how VR works yeah and I'm sure we'll have more stuff to talk about at some point in the future I hope so I haven't gotten into a whole lot of VR yet I'm sure our readers know we've looked at it a million times at all the the tours uh but we'll see we'll see see where expansion options are so Tom thank you for the walk through the Glorious CH the Glorious chart with the engineer let say yes and we'll get a little smiley face yeah Hearts to RDU right my woohoo but thank you for joining yeah good to see see I'll see you next time yeah\n"