Norman Barrows

no vsync means double buffering to avoid tearing, right?



single buffering = not possible on modern OS's

 

Sometimes I wish it was possible, if only to create less laggy GUIs...

 

single buffering = not possible on modern OS's

 

Sometimes I wish it was possible, if only to create less laggy GUIs...

 

Single buffering w/out VSync is the best way to reduce latency, but it will be glitchy as hell.

Double Buffer w/ VSync often achieves lower latency than Single Buffering w/ VSync.
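In Direct3D terms that trade-off is just the sync-interval argument passed to Present. A minimal sketch, assuming an already-created D3D11/DXGI swap chain (the swapChain variable here is hypothetical) and a frame already rendered to the back buffer:

```cpp
#include <windows.h>
#include <dxgi.h>

// Present the finished frame, choosing between vsync'd and immediate presentation.
void PresentFrame(IDXGISwapChain* swapChain, bool vsync)
{
    // SyncInterval = 1: wait for the next vertical blank -> no tearing,
    //   but the CPU/GPU may sleep if the frame missed the blank.
    // SyncInterval = 0: present immediately -> lowest latency, may tear.
    swapChain->Present(vsync ? 1 : 0, 0);
}
```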


single buffering = not possible on modern OS's

 
Sometimes I wish it was possible, if only to create less laggy GUIs...

Maybe these could help you (on Direct3D) if you render the GUIs separated from the rest:

https://msdn.microsoft.com/en-us/library/ff471334.aspx
https://msdn.microsoft.com/en-us/library/dn448914.aspx (unfortunately DXGI 1.3 and greater only)
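For what it's worth, if the second link is the DXGI 1.3 "waitable swap chain" material (an assumption on my part, I can't check the pages here), the idea looks roughly like this sketch: create the swap chain with the frame-latency-waitable flag, cap the queued frames at one, and block at the top of each frame until the swap chain is ready. Names like device, factory and hwnd are assumed to already exist; error handling is omitted.

```cpp
#include <windows.h>
#include <d3d11.h>
#include <dxgi1_3.h>

// Create a flip-model swap chain whose queue depth we can control and wait on.
IDXGISwapChain2* CreateLowLatencySwapChain(ID3D11Device* device,
                                           IDXGIFactory2* factory,
                                           HWND hwnd)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.BufferCount      = 2;                                // double buffered
    desc.Format           = DXGI_FORMAT_B8G8R8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.SwapEffect       = DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL; // flip model required
    desc.Flags            = DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT;

    IDXGISwapChain1* swapChain1 = nullptr;
    factory->CreateSwapChainForHwnd(device, hwnd, &desc, nullptr, nullptr, &swapChain1);

    IDXGISwapChain2* swapChain2 = nullptr;
    swapChain1->QueryInterface(__uuidof(IDXGISwapChain2),
                               reinterpret_cast<void**>(&swapChain2));
    swapChain1->Release();

    // Allow at most one queued frame so input stays as fresh as possible.
    swapChain2->SetMaximumFrameLatency(1);
    return swapChain2;
}

// One iteration of the render loop: wait, then update/render, then present.
void RunFrame(IDXGISwapChain2* swapChain, HANDLE frameLatencyWaitable)
{
    // frameLatencyWaitable comes from GetFrameLatencyWaitableObject(), fetched once.
    WaitForSingleObjectEx(frameLatencyWaitable, 1000, TRUE);

    // ... poll input, update the GUI/scene, draw to the back buffer ...

    swapChain->Present(1, 0); // vsync on, no tearing
}
```

The point of waiting before sampling input is that the work you queue is presented as soon as possible afterwards, which is where the latency win for GUIs would come from.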


I suppose that most future monitors and graphics cards will support FreeSync or G-Sync, which will probably make this issue go away in about 5-7 years.



Tearing occurs whenever the drawing of the image crosses the point where the monitor refreshes its image.

 

yeah, that makes sense. been a long time since i wrote directly to vidram...   what was it 0x0A000:0000 or something like that?  just looked it up. 0xA0000 was dos vidram address. set a pointer and let's party on the bitmap! <g>.

 

 


single buffering = not possible on modern OS's
double buffering = mandatory! Screen tearing will still be visible without vsync.
double buffering w/ vsync = no tearing, but CPU/GPU sleeps occur (waiting for vblank) if frame-time doesn't line up with refresh rate nicely
triple buffering = greater latency...
triple buffering w/ vsync = no tearing, greater latency, but fewer CPU sleeps.

 

so double buffer with vsync is the best i can get with no tearing and low latency eh? thanks for the tip. seems some things never change. i remember implementing a double buffer system for the game library - might have been back in my pascal days (late 80's). It did sprites and lines and rectangles and such. The C++ version of that module was actually used in Caveman 1.0 in 2000 to do 2D sprites by adding a "color keyed alpha test blit to d3d backbuffer" method to the double buffer module. Caveman 2.0 (circa 2008) and Caveman 3.0 use d3dx sprites and/or d3dx fonts.
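If it helps to see it spelled out for that D3D9-era setup, here is a rough sketch of the present parameters that give "double buffer w/ vsync" (the d3d and hwnd names are placeholders, error handling omitted):

```cpp
#include <windows.h>
#include <d3d9.h>

// Create a D3D9 device set up for "double buffer w/ vsync":
// one back buffer plus the front buffer, presented on the vertical blank.
IDirect3DDevice9* CreateDoubleBufferedVsyncDevice(IDirect3D9* d3d, HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;
    pp.hDeviceWindow        = hwnd;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // match the desktop format
    pp.BackBufferCount      = 1;                       // 1 back buffer = double buffering
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // vsync on, no tearing
    // D3DPRESENT_INTERVAL_IMMEDIATE here would turn vsync off and allow tearing.

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}
```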

Personally I'd use triple buffering w/ vsync. Of course it depends on a lot of factors; in my case I had unnecessarily long frame times without triple buffering.
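For comparison with the double-buffered sketch above, the triple-buffered w/ vsync variant is just one more back buffer, which is also roughly where its extra frame of latency comes from:

```cpp
#include <windows.h>
#include <d3d9.h>

// Fill D3D9 present parameters for "triple buffering w/ vsync":
// two back buffers let the GPU start the next frame while a finished
// one waits for the vblank, trading one extra frame of latency for
// fewer stalls when frame times don't line up with the refresh rate.
void FillTripleBufferedVsyncParams(D3DPRESENT_PARAMETERS* pp, HWND hwnd)
{
    ZeroMemory(pp, sizeof(*pp));
    pp->Windowed             = TRUE;
    pp->hDeviceWindow        = hwnd;
    pp->SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp->BackBufferFormat     = D3DFMT_UNKNOWN;
    pp->BackBufferCount      = 2;                       // 2 back buffers = triple buffering
    pp->PresentationInterval = D3DPRESENT_INTERVAL_ONE; // vsync on
}
```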
