Question on Monitor Refresh Rates

Trajan

Adept
Will monitor refresh rates have an impact on gaming performance? To put it simply, will there be any difference in gaming performance between the following:

800 x 600@85Hz
1024 x 768@60Hz
1024 x 768@75Hz
1024 x 768@85Hz
 
Theoretically, no... but in practice, the lower the refresh rate, the higher the fps (there's a difference, but it's less than 1%).
 
Refresh rates will absolutely not affect the amount of FPS you get in a game UNLESS Vsync is enabled in the video card drivers...

But the higher the resolution, of course, the lower the FPS will be... so there will be a difference between 800x600 and 1024x768.
 
Well, I once switched to a very high refresh rate and for some reason menus appeared black and other weird stuff happened, so I switched back.
The monitor could support it, my 9600p could support it, but for some weird reason it just didn't work.
 
I think refresh rates matter a lot... Technically, if your refresh rate is 60Hz, then even if your GFX card can do 70 FPS, you will not be able to see more than 60 FPS (resolution being the same, of course)... Correct me if I am wrong...
 
LOL... the human eye cannot tell the difference between ~30 FPS and 60 FPS, but we can notice the difference when FPS drops... something like that, correct me if I'm wrong.

Refresh rates will NOT affect performance. However, if you get 60 FPS and your monitor is at 85Hz, you might get something called "tearing"; this can be stopped by locking the FPS to the refresh rate, which is called "Vsync".

The higher the refresh rate, the easier it is on your eyes :) I will never go below 85Hz.
 
Switch said:
I think refresh rates matter a lot... Technically, if your refresh rate is 60Hz, then even if your GFX card can do 70 FPS, you will not be able to see more than 60 FPS (resolution being the same, of course)... Correct me if I am wrong...

Err, that doesn't make sense, Switch.
So what happens in games like Q3 where frames reach 400 every second (assuming you have a good card, of course)? Refresh rates have nothing to do with performance, except for the tearing that occurs when vertical sync is disabled. The main and only reason you should have a high refresh rate is that there's less strain on your eyes, as the monitor doesn't flicker as much.

@Chaos, there shouldn't be any performance hit (not even 1%) due to higher refresh rates, because the load would fall on the DACs, not on the GPU - or so I think.
 
tracerbullet said:
Err, that doesn't make sense, Switch.
So what happens in games like Q3 where frames reach 400 every second (assuming you have a good card, of course)? Refresh rates have nothing to do with performance, except for the tearing that occurs when vertical sync is disabled. The main and only reason you should have a high refresh rate is that there's less strain on your eyes, as the monitor doesn't flicker as much.

@Chaos, there shouldn't be any performance hit (not even 1%) due to higher refresh rates, because the load would fall on the DACs, not on the GPU - or so I think.
That's why I said that theoretically there should be none, but in practice there is a small penalty. Try running 3DMark 2k3 or 2k5 with different refresh rates and check it out yourself ;)
 
Three terms need to be clarified: refresh rate, frame rate (FPS), and vertical sync.
--------------------------------------------------------------------------
Before going into these, a little background on video buffering:

- the RAM on your graphics card (video-RAM or v-RAM for short) serves as a
high-speed buffer. (DoH!!)

- FRAME BUFFER means a screen-sized buffer (i.e. memory allocated for every
pixel on your screen) where the video card can render images to be
displayed on screen. The front buffer is the one that you see, and
back buffers are used for rendering the next image while the front buffer is
being drawn on screen, thereby saving time.

- DOUBLE BUFFERING involves creating a front buffer and a back buffer in the
v-RAM.

- TRIPLE BUFFERING involves creating 1 front buffer and 2 back buffers in the
v-RAM; this provides smoother playback at the cost of additional v-RAM.
--------------------------------------------------------------------------
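To make the buffer shuffling concrete, here's a tiny Python sketch of it (the class, names, and frame labels are my own illustration, not any real driver API):

class FrameBuffers:
    """Front buffer plus one (double) or two (triple) back buffers."""

    def __init__(self, back_buffers=1):
        self.front = "frame-0"                 # currently scanned out to the monitor
        self.back = [f"frame-{i + 1}" for i in range(back_buffers)]

    def render(self, frame_id):
        # the card renders the new frame into the next back buffer in the queue
        self.back[0] = frame_id

    def swap(self):
        # at swap time the finished back buffer becomes the new front buffer
        self.front, self.back[0] = self.back[0], self.front
        self.back.append(self.back.pop(0))     # rotate the back-buffer queue

bufs = FrameBuffers(back_buffers=2)            # triple buffering
bufs.render("frame-3")
bufs.swap()
print(bufs.front)                              # -> frame-3

With two back buffers, the card can start rendering the next frame immediately instead of waiting for the swap - that's where the smoother playback comes from.
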

1) REFRESH RATE

It is the FIXED number of times per second (Hz) your monitor draws a picture (frame) on your screen.

Dependent on - (a) Monitor
(b) Video card's bandwidth

Independent of - (a) Game engine/Application
(b) Video settings within the game/application

Refresh rates lower than 70Hz often cause eye strain due to flickering; however, this depends on the individual viewing the screen.
Typically, refresh rates of 85Hz or above are preferred.

2) FRAME RATE

It refers to the number of frames a game engine generates per second.

Dependent on - (a) Game/Application
(b) Hardware such as the processor, GPU, etc.
(c) Video settings in the game/application

Independent of - Monitor

3) VERTICAL SYNC

Monitors use an electron gun (EG) to "draw" horizontal lines of pixels (called
scan lines) from left to right. After finishing one such line, the EG is repositioned from the rightmost end of one line to the leftmost end of the next, lower line. When it reaches the rightmost pixel in the last line of the screen, the current frame is complete, and the EG is repositioned to the top-left of the screen.
This is called vertical retrace, and it takes some time (we're still talking millisecond range here) to complete.
During this interval, the back buffer is swapped with the front buffer, and the card renders into the new back buffer while the front buffer is being "drawn".
This is called "Vertical Sync".

In the alternative approach, without VSync, the graphics card just draws frames as fast as possible, swapping the front and back buffers ASAP.
The famous "tearing" seen with VSync off is a result of the front frame buffer getting updated while it is being drawn.
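
Here's a toy Python timing sketch of the two approaches (all numbers are illustrative; this models only the swap timing, not real hardware):

def simulate(render_fps, refresh_hz, vsync):
    # times at which each rendered frame finishes, over one second
    frame_times = [i / render_fps for i in range(render_fps)]
    shown, tears = set(), 0
    for r in range(refresh_hz):                # one scan-out per refresh
        start, end = r / refresh_hz, (r + 1) / refresh_hz
        if vsync:
            # swap only during retrace: show the newest frame finished before it
            ready = [i for i, t in enumerate(frame_times) if t <= start]
            if ready:
                shown.add(ready[-1])
        else:
            # swap ASAP: every frame finished mid-scan-out overwrites the front
            # buffer, and each extra swap leaves one visible tear line
            during = [i for i, t in enumerate(frame_times) if start <= t < end]
            shown.update(during)
            tears += max(0, len(during) - 1)
    return len(shown), tears

print(simulate(400, 60, vsync=True))     # -> (60, 0): display capped at the refresh rate
print(simulate(400, 60, vsync=False))    # -> (400, 340): all frames shown, but in torn slices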

--------------------------------------------------------------------------

As the monitor is what displays the frames, the frame rate always plays second fiddle to the refresh rate.
So, IMHO, Switch is correct: the actual FPS you see is less than or equal to your refresh rate.

So what happens in games like Q3 where frames reach 400 every second?

AFAIK, extra frames are "melded" together - the top half of one and the bottom half of the next frame, or something like that.

What is certain is that the higher the frame rate, the better the game's visual quality.
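
As a rough back-of-the-envelope Python sketch of that "melding" (the numbers are made up for illustration):

REFRESH_HZ = 60      # monitor refresh rate
RENDER_FPS = 400     # what the game engine is pumping out
LINES = 768          # scan lines per screen (1024x768)

def source_frame(line):
    t = line / (LINES * REFRESH_HZ)    # moment this scan line is drawn
    return int(t * RENDER_FPS)         # newest rendered frame in the front buffer then

print(sorted({source_frame(line) for line in range(LINES)}))
# -> [0, 1, 2, 3, 4, 5, 6]: one on-screen "frame" is a patchwork of slices
#    from about 7 different rendered frames

Each slice is a little newer than it would be if the swap were held back, which is one way the extra frames still improve what you see.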

--------------------------------------------------------------------------

Refresh Rates as they pertain to LCDs:

LCD technology differs at a fundamental level from CRT technology.
While pixels on CRTs need to be continuously refreshed to maintain a constant picture, LCD pixels need only to be set initially - they will retain their state (and hence the overall picture) until the pixels are changed.
Effectively, an LCD will maintain a constant picture display with a 0 Hz refresh rate.

The maximum number of different frames an LCD can display per second - its refresh rate - depends on how quickly the LCD can change the value of its pixels, thereby displaying a new frame. This is referred to as the response time.

Everyone knows that the lower the response time of an LCD, the better it is for gaming/movies/other apps involving motion. The standard explanation is that lower response times eliminate "ghosting" - pixels which are unable to change quickly enough to match the front buffer contents.

What a lot of people don't realise is that response time is inversely proportional to refresh rate: a low response time increases the number of frames which can be displayed.

Converting response time into refresh rate involves taking the inverse of the response time.
A simple calculation yields the following results (a quick sketch to verify them follows the table):

16 ms >> 62.50 Hz
12 ms >> 83.33 Hz
10 ms >> 100.0 Hz
8 ms >> 125.00 Hz
6 ms >> 166.67 Hz
4 ms >> 250.00 Hz
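
As a quick Python check (the loop is mine, the formula is just the inverse described above - refresh (Hz) = 1000 / response time (ms)):

for ms in (16, 12, 10, 8, 6, 4):
    print(f"{ms:2d} ms >> {1000 / ms:6.2f} Hz")    # reproduces the table above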
-----------------------------------------------------------------------
Dependent on - (a) LCD display response time
(b) Video card

Independent of - (a) Game engine/Application
(b) Video settings within the game/application
-----------------------------------------------------------------------
 
Ein said:
the monitor is what displays the frames, the frame rate always plays second fiddle to the refresh rate.
So, IMHO, Switch is correct: the actual FPS you see is less than or equal to your refresh rate.

AFAIK, extra frames are "melded" together - the top half of one and the bottom half of the next frame, or something like that.

What Switch said is exactly what I always wondered: what's the point of doing 400 fps when your monitor can only do 60?

Now, what Ein is saying is true, I think, because the gun will draw 60 frames per second, but in that time the game engine has flushed out 400 frames. So effectively, by the time the electron gun has completed one frame, the actual game frame has changed 6-7 times. We know that the bottleneck is always the slowest entity.

Now, what I don't know is whether the frame is updated continuously and thus merged as Ein said - the top of one, the middle of another, and the bottom of the last. Because the signal that is fed to the gun is sequential and won't change over to some other part of some other frame until it has completed the frame it has started drawing.
 
kev182 said:
The higher the refresh rate, the easier it is on your eyes :) I will never go below 85Hz.

Yeah... if I lower my refresh rate from 85 to 60, my eyes will start watering and hurting...
 
Switch, what you mentioned only happens if Vsync is on.

And for FPS games, refresh rate is the only thing that matters,

because as you change the refresh rate, so does the sensitivity. For multiplayer games like CS/Q3 and others, this is really important.

Your sensitivity will be different when you play the game at 85/100/120Hz.

I use these settings:
640 - 120Hz
800 - 120Hz
1024 - 100Hz
1152 - 85Hz
1280 - 75Hz
1600 - 60Hz (sucks, man - can't use 1600 because 60Hz is too low)

And in any other game too, the refresh rate makes a big difference, because a low rate strains the eyes.

XP has a bug with refresh rates in OpenGL games and limits them to 60Hz. This can be changed by using a refresh rate override, or for ATI cards you can use ReForce.
 
I think the slight difference in sensitivity and responsiveness people experience only happens because at different resolutions your mouse has to scan and report varying amounts of screen area, and it has nothing to do with your monitor or video card.

A chain is only as strong as its weakest link; what's the use of a screen-tearing 500 FPS when 85 FPS @ 85Hz feels a LOT smoother?
Maybe when monitors get good enough to deliver 500Hz; till then, a decent refresh rate + Vsync + triple buffering is the best setup.
 
Well, about the refresh bugs... Q3 has a bug due to which you can jump the farthest if your FPS is a constant 125. This is the reason most people cap Q3 frame rates to 125 and enable Vsync with the refresh rate set to 120Hz (the closest possible to 125). Also, the default USB report rate is 125Hz as well (or its multiples if you overclock the USB port like me :D). So 125 is basically the magic figure as far as Q3 is concerned. Don't know if such a thing exists for CS as well.
 