Refresh Rate vs. Frames Per Second

The Difference Between FPS and Refresh Rate: A Beginner's Guide

When shopping for a new TV or computer monitor, you may have come across specifications denoted in Hertz, such as 60 Hz, 120 Hz, or even 240 Hz. But what do these numbers mean, and how do they relate to each other? In this article, we'll delve into the world of frames per second (FPS) and refresh rate to help you understand the difference between them.

Refresh Rate: The Number of Times a Signal Can Be Transmitted to the Screen Per Second

The refresh rate of a monitor or television is essentially the number of times that a signal can be transmitted to the screen per second. In other words, it's how often the display updates with new information. Think of it like a heartbeat – if your heart beats 60 times per minute, that's once per second. Similarly, a monitor's refresh rate is measured in hertz (Hz), which represents the number of times per second the display updates.
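To make the numbers concrete, here's a minimal sketch (the function name and values are illustrative, not from any display API) converting a refresh rate in hertz to the time between screen updates:

```python
# Sketch: a refresh rate in hertz determines how much time the
# display has between consecutive screen updates.
def refresh_interval_ms(hz: float) -> float:
    """Milliseconds between consecutive screen refreshes."""
    return 1000.0 / hz

# Illustrative rates from the article: 60, 120, and 240 Hz.
for hz in (60, 120, 240):
    print(f"{hz} Hz -> one refresh every {refresh_interval_ms(hz):.2f} ms")
```

A 60 Hz display therefore has roughly 16.7 ms to show each new image, while a 240 Hz display has only about 4.2 ms.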

In the case of a computer, the graphics card or integrated graphics processor determines how many frames per second are sent to the monitor. This is known as the frame rate, and it can range from 30 FPS to 240 FPS or even higher. The key point here is that the refresh rate has nothing to do with the number of frames per second that enter the screen from the source – such as a computer, DVD player, or channel service provider.

Interpolating Frames: What Happens When the Monitor Can't Keep Up

Let's assume you have a computer sending 30 FPS to your monitor, which is running at 60 Hz. The monitor can easily keep up with this frame rate, and many displays will interpolate a new frame between each pair of incoming frames to fill all 60 refresh cycles. If your computer instead sends 40 FPS, the monitor might interpolate between every other pair of frames rather than between every pair. Either way, the display creates new frames to fill the gaps, resulting in a smoother and more stable image.
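The arithmetic behind this can be sketched as follows. This is a simplified model with a hypothetical function name – real displays use far more sophisticated motion-estimation logic – but it shows how many frames the display must invent each second:

```python
# Simplified model: if the source sends fewer frames per second than
# the display refreshes, the display must fill the remaining refresh
# cycles with interpolated (invented) frames.
def interpolated_frames_per_second(source_fps: int, refresh_hz: int) -> int:
    """Extra frames the display creates each second to fill the gap
    between the source frame rate and the refresh rate."""
    return max(refresh_hz - source_fps, 0)

print(interpolated_frames_per_second(30, 60))  # one new frame between each pair
print(interpolated_frames_per_second(40, 60))  # a new frame only between some pairs
```

At 30 FPS into 60 Hz the display invents 30 frames per second (one between every pair); at 40 FPS it only needs 20, so it interpolates less often.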

The question then arises: what happens when the frame rate exceeds the refresh rate? For instance, if your computer sends 120 FPS to a monitor running at 60 Hz, the monitor won't be able to keep up. In this case, it will only display 60 frames per second, and the excess frames are simply discarded – you aren't seeing the full picture.
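As a rough sketch of this behavior, assume an idealized display that samples one source frame per refresh cycle and drops the rest (real hardware timing and buffering are more complicated, and this only models the case where the source outpaces the display):

```python
# Idealized model: when the source frame rate exceeds the refresh
# rate, the display shows one source frame per refresh cycle and
# the frames in between are never seen.
def displayed_frames(source_fps: int, refresh_hz: int) -> list[int]:
    """Indices of the source frames shown during one second."""
    shown = []
    step = source_fps / refresh_hz   # source frames per refresh cycle
    t = 0.0
    while t < source_fps:
        shown.append(int(t))
        t += step
    return shown

frames = displayed_frames(120, 60)
print(len(frames))   # 60 frames actually reach the screen
print(frames[:4])    # every other source frame survives
```

With a 120 FPS source on a 60 Hz panel, only every other frame is ever displayed, which is why the extra frames buy you nothing on that monitor.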

Overclocking: Extending the Refresh Rate

A monitor's refresh rate can sometimes be overclocked, but the gains are modest: you'll be lucky to get an extra 15 to 20 Hz on top of the rated specification. Some manufacturers also expose adjustable refresh rates through software or firmware updates, but this is rare and usually limited to high-end or specialized displays.

The Human Eye Can't Tell the Difference: Why Higher Refresh Rates Don't Always Matter

Despite the impressive-sounding numbers, higher refresh rates aren't always necessary for a good viewing experience. In practice, many viewers struggle to discern much difference between 60 FPS and 120 FPS, and most modern TVs and monitors include processing to reduce or eliminate interpolation artifacts, which further narrows the visible gap between refresh rates.

That being said, some viewers may prefer higher refresh rates for specific reasons, such as:

1. Reduced motion blur: Higher frame rates can reduce motion blur, which is especially noticeable in fast-paced content like sports and action movies.

2. Smoother visuals: Interpolating frames can create a more stable image, reducing the perception of artifacts or judder.

3. Enhanced realism: Some viewers believe that higher refresh rates enhance the overall visual fidelity of a display.

Ultimately, whether or not to prioritize higher refresh rates depends on individual preferences and viewing habits.

The Price of Higher Refresh Rates

In recent years, TVs with higher refresh rates have become increasingly popular, especially during Black Friday and Cyber Monday sales. If you've browsed those deals, you've probably noticed that refresh rates correlate directly with price tags: a TV with a higher refresh rate is typically more expensive than its lower-refresh-rate counterpart, so it's essential to weigh the benefits against the costs.

While higher refresh rates offer improved visuals and reduced motion blur, they also come at a premium price tag. It's crucial to consider whether the additional cost is justified by your viewing habits and preferences.

Conclusion

The difference between FPS and refresh rate may seem complex, but understanding these concepts can help you make informed decisions when shopping for a new TV or computer monitor. By grasping how these parameters interact with each other, you'll be better equipped to choose the perfect display for your needs and budget.
