# Thread: FPS Vs. Refresh Rate

1. ## FPS Vs. Refresh Rate

Hi, I just read reviews of "great" graphics cards which churn out frame rates of 300 and 400 fps. I always have this question about frame rates: most of our monitors, at the commonly used resolutions, have refresh rates of around 120 Hz maximum. So what's the point of having a graphics card that puts a new image on the screen every 1/400th of a second, when your monitor puts a new image on the screen only every 1/120th of a second?

2. Hi d,

You are totally confused over the meaning and significance of the terms.

Refresh Rate: It is the rate at which a monitor redraws the image on screen, i.e. the frequency. The greater the frequency, the less flicker you see on the screen. It has nothing to do with fps.

FPS (Frames per Second): This is a measure of how fast the graphics card can process the details associated with the image, i.e. the textures, quality, lighting, anti-aliasing etc. It is purely a number-crunching game.

It is a measure of how much information is used to store and display motion video. The term applies equally to film video and digital video. Each frame is a still image; displaying frames in quick succession creates the illusion of motion. The more frames per second (fps), the smoother the motion appears.

Originally Posted by d
your monitor puts a new image on the screen only every 1/120th of a second?
Refresh rate does not mean the rate at which a new frame is displayed. It is the rate at which the same frame is redrawn.
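To put rough numbers on the distinction, here is a small Python sketch comparing how long a GPU spends per frame with how often the monitor redraws. The fps and Hz values are illustrative assumptions, not measurements:

```python
# Frame time vs. refresh interval -- back-of-the-envelope sketch.

def frame_time_ms(fps: float) -> float:
    """Time the GPU spends producing one frame, in milliseconds."""
    return 1000.0 / fps

def refresh_interval_ms(hz: float) -> float:
    """Time between two monitor redraws, in milliseconds."""
    return 1000.0 / hz

for fps in (30, 60, 120, 400):
    print(f"{fps:3d} fps -> new frame every {frame_time_ms(fps):.2f} ms")

# A 120 Hz monitor redraws roughly every 8.33 ms, regardless of fps.
print(f"120 Hz monitor -> redrawn every {refresh_interval_ms(120):.2f} ms")
```

At 400 fps the card finishes a frame every 2.5 ms, but the 120 Hz monitor only picks one up every ~8.33 ms, so most of those frames are never shown.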

3. I agree with anomit !!! Go and google !!!

cheers !!!!

4. Originally Posted by anomit
You are totally confused over the meaning and significance of the terms.
The OP is asking the question precisely because he is confused.

When a cinema projector or a VCR plays a movie, it shows in sequence individual pictures that were already stored as full frames. But when a graphics card is processing a picture, it has to draw every single pixel dot by dot. On an 800x600 screen that's 480,000 pixels, and each pixel has its own colour and brightness information which the GPU also has to deal with. That's a lot of work, especially when the game is constantly sending a stream of ever-changing information. The more subtle detail the image contains, the more work the GPU has to do and the longer it takes to finish drawing each frame. That's where a slow GPU falters and can draw only a few frames per second (fps), resulting in jerky movement on the screen. A high fps number gives a smooth display of moving objects.
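The pixel arithmetic in that paragraph can be checked in a few lines of Python (the 60 fps figure below is just an assumed example):

```python
# Pixel workload sketch for the 800x600 example from the post.
width, height = 800, 600
pixels_per_frame = width * height            # 480,000 pixels, as stated above

fps = 60                                     # assumed frame rate
pixels_per_second = pixels_per_frame * fps   # pixels the GPU handles per second

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{pixels_per_second:,} pixels per second at {fps} fps")
```

Even at a modest 60 fps that's 28.8 million pixels a second, each with its own colour and brightness data, which is why slow GPUs struggle.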

The refresh rate is the number of times per second the full frame is shown on the monitor screen, no matter how quickly or slowly the pixels are drawn. If the fps is low, the same picture is simply displayed again on the next frame.

d is right to some extent: for the same game at the same settings, it makes little difference whether you get 300 fps or 150 fps. The monitor cannot change what it shows within a single refresh cycle, and the human eye cannot follow rapid changes above a certain frequency. Very high fps values simply show off the capabilities of a video card, and indicate what reserve power it has for higher settings or for future games with ever-increasing requirements.
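That capping effect can be sketched in Python. This is a deliberately simplified model, assuming the monitor shows whichever frame was most recently completed at each refresh; it is not how any real driver schedules frames:

```python
# How many *distinct* frames actually reach the screen in one second,
# given a render rate (fps) and a monitor refresh rate (Hz).

def distinct_frames_shown(render_fps: float, refresh_hz: float) -> int:
    shown = set()
    for tick in range(int(refresh_hz)):        # one second of refreshes
        t = tick / refresh_hz                  # time of this refresh
        latest_frame = int(t * render_fps)     # last frame finished by time t
        shown.add(latest_frame)
    return len(shown)

print(distinct_frames_shown(300, 120))  # -> 120, capped by the refresh rate
print(distinct_frames_shown(60, 120))   # -> 60, limited by the render rate
```

Rendering at 300 fps on a 120 Hz monitor still shows at most 120 distinct frames per second, which is the point being made above.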

5. Higher fps numbers are mostly a marketing tool for graphics cards; they need not bother us.

100-150 fps is good enough for viewing on a PC monitor.
