Is it pointless to draw 512 times/sec ...

Started by
10 comments, last by Extrarius 19 years, 4 months ago
Hi, is it pointless to draw 512 times/second when a monitor is set to, let's say, 120 Hz? If yes, does anyone do that? I see games where the framerate is shown and it's higher than the refresh rate set on my monitor. I'm new to graphics programming. Thanks in advance!
To answer your question: most of the time it isn't necessary. Those games are probably showing how many times they update per second, not how many times they draw per second.

Since most people call the update and render code at the same time, updating is slowed down to how fast you can render per second. The best solution to get a higher update rate would be to separate the updating and rendering code.

EDIT:
By separating updating and rendering code I mean that you should call the updating code as often as possible, and only call the rendering code, say, 60 times per second.
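To make that concrete, here's a back-of-the-envelope sketch (in Python, with invented per-step costs) of why a coupled loop caps updates at the frame rate, while a decoupled loop frees up time for many more updates:

```python
# Assumed costs, purely for illustration: drawing one frame takes 4 ms,
# one game-logic update takes 1 ms, and we have one second of wall time.
RENDER_MS = 4
UPDATE_MS = 1
SECOND_MS = 1000

# Coupled loop: one update per rendered frame, so updates are capped
# at the frame rate.
coupled_updates = SECOND_MS // (UPDATE_MS + RENDER_MS)  # 200 updates (and 200 frames)

# Decoupled loop: cap rendering at 60 frames per second and spend the
# remaining time on updates.
frames = 60
decoupled_updates = (SECOND_MS - frames * RENDER_MS) // UPDATE_MS  # 760 updates

print(coupled_updates, decoupled_updates)
```

The exact numbers are made up, but the shape of the result holds: once rendering is capped near the refresh rate, the time saved goes into extra updates.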
You first need to know how displaying a rendered frame works.

There is RAM on the graphics card (the framebuffer) whose contents get transmitted to the monitor row by row. The refresh rate of the monitor is the number of times per second the complete framebuffer is sent to the monitor.

Now you have two options: either you only change the contents of that RAM at the exact moment a transmission finishes (= VSYNC on), or you just overwrite it any time you want (= VSYNC off).

If you overwrite it during transmission, the monitor displays part of the old content together with part of the new content (tearing).
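A toy simulation of that mid-transmission overwrite (Python, with a hypothetical 8-row framebuffer):

```python
ROWS = 8
old_frame = ["old"] * ROWS   # the frame currently being scanned out
new_frame = ["new"] * ROWS   # the frame we finish rendering mid-scanout

framebuffer = old_frame[:]
displayed = []

for row in range(ROWS):
    # With VSYNC off, nothing stops us from overwriting the framebuffer
    # while the monitor is still partway through reading it.
    if row == ROWS // 2:
        framebuffer = new_frame[:]
    displayed.append(framebuffer[row])

print(displayed)  # top half 'old', bottom half 'new': a visible tear
```

The monitor ends up showing half of each frame, which is exactly the horizontal tear you see in games with vsync disabled.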
Short answer: yes it's pointless.

Longer answer: it's not uncommon for games to run at greater than the refresh rate, mainly as a benchmarking aid. If you want to know how powerful a particular graphics card is, or whether the latest optimization you've made has actually helped speed rendering up, you want to render frames as fast as possible. If you always locked the frame update to the refresh rate of the monitor, you wouldn't be able to tell whether you were rendering everything with cycles to spare or barely keeping up with the refresh rate. For actual gameplay you'll typically want to sync to the refresh rate, as it makes for smoother visuals with no tearing, and there's no benefit to the player in rendering faster than their monitor refresh.

Game Programming Blog: www.mattnewport.com/blog

I've read that some players of first-person shooters don't sync and turn the graphics detail down a lot to get a high frame rate, presumably because they play with high-sample-rate input devices (input devices typically get sampled once per frame), but I don't know of any mice that sample higher than 120 Hz. However, I think the professional players want to guarantee that the minimum frame rate (during periods of intense action) is at least the input device's sample rate, so that they get the best input response possible.

Syncing does stop tearing, but it can also mean abrupt changes in the frame rate from monitor_refresh to monitor_refresh/2 to monitor_refresh/3, etc. (e.g. 90, 45, 30 Hz) depending on the current load, which can be annoying.
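That stepping falls out of the arithmetic: with vsync on, a finished frame is held until the next vertical blank, so the effective rate is the refresh rate divided by a whole number. A small sketch (Python; the frame times are invented):

```python
import math

def vsync_fps(render_ms, refresh_hz):
    """Effective frame rate when each frame must wait for the next vblank."""
    interval_ms = 1000.0 / refresh_hz          # time between vblanks
    intervals_waited = math.ceil(render_ms / interval_ms)
    return refresh_hz / intervals_waited

# On a 90 Hz monitor (one interval is ~11.1 ms):
print(vsync_fps(10, 90))  # 90.0 -> the frame fits in one interval
print(vsync_fps(12, 90))  # 45.0 -> just misses, waits for the 2nd vblank
print(vsync_fps(24, 90))  # 30.0
```

So a frame time that creeps just past the refresh interval halves the displayed rate, which is why the drops feel so abrupt under load.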

It should always be a user option.
I would have stopped playing UT years ago if it insisted on limiting the framerate to the monitor refresh.
There *is* a difference, despite a lot of people quoting human persistence of vision and perception limits.
Umm... images that don't get drawn don't make a difference. If you're rendering more frames than your monitor can present per second, then all those extra frames are either never seen or mixed with another image (tearing). So things that you don't see, because they aren't drawn on the screen at all, don't make a difference.
I can see the fnords.
Quote:Original post by Aph3x
I would have stopped playing UT years ago if it insisted on limiting the framerate to the monitor refresh.
There *is* a difference, despite a lot of people quoting human persistence of vision and perception limits.


Any links? I play, and I can absolutely tell the difference between 60/70/90, especially in an alpha state.
- The trade-off between price and quality does not exist in Japan. Rather, the idea that high quality brings on cost reduction is widely accepted.-- Tajima & Matsubara
Well, if the monitor refresh rate is, say, 60 Hz, there could be a noticeable difference between, say, 60 and 70 FPS, because when you're updating the screen faster than your refresh rate you'll get various artifacts, but these are hardly desirable.
Rendering faster than the refresh rate is pointless. However, doing physics/input updates faster than the refresh rate is very much a good thing. The problem with VSync is that it blocks until the refresh happens, which means the whole application stalls until it can swap the buffers.

The 'correct' way to do it would be to keep track of time internally in your application and only call Render() 60 times a second, but let AI/physics and especially input happen as fast as possible (or maybe limit those to happening 120 times a second).
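A deterministic sketch of that loop (Python, using simulated millisecond ticks instead of a real clock, and an assumed 2 ms update cost; render cost is ignored to keep the scheduling visible):

```python
SECOND_MS = 1000
RENDER_INTERVAL_MS = 1000 // 60   # aim for roughly 60 frames per second
UPDATE_MS = 2                     # assumed cost of one AI/physics/input step

t_ms = 0
next_render_ms = 0
updates = frames = 0

while t_ms < SECOND_MS:
    t_ms += UPDATE_MS             # input/AI/physics run every iteration
    updates += 1
    if t_ms >= next_render_ms:    # Render() only when its time slot comes up
        frames += 1
        next_render_ms += RENDER_INTERVAL_MS

print(updates, frames)  # 500 updates vs ~60 frames (16 ms rounds a bit over 60)
```

In a real loop t_ms would come from a high-resolution timer and the render branch would actually draw; the point is just that the input/physics code ticks far more often than Render().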
Waramp. Before you insult a man, walk a mile in his shoes. That way, when you do insult him, you'll be a mile away, and you'll have his shoes.

This topic is closed to new replies.
