Norman Barrows

no vsync means double buffering to avoid tearing, right?


Matias Goldberg    9582

 

single buffering = not possible on modern OS's

 

Sometimes I wish it were possible, if only to create less laggy GUIs...

 

Single buffering w/out VSync is the best way to reduce latency, but it will be glitchy as hell.

Double Buffer w/ VSync often achieves lower latency than Single Buffering w/ VSync.
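
In D3D9 the choice comes down to the presentation interval. A minimal sketch, assuming the rest of the device setup (formats, window handle, etc.) is done elsewhere:

    D3DPRESENT_PARAMETERS pp;
    ZeroMemory(&pp, sizeof(pp));
    pp.BackBufferCount = 1;                              // one back buffer = double buffering
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;   // wait for vblank (vsync on)
    // pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;  // present at once (vsync off, may tear)

Either way, the driver control panel can still override whichever one you pick.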

Alessio1989    4634

single buffering = not possible on modern OS's

 
Sometimes I wish it were possible, if only to create less laggy GUIs...

Maybe these could help you (in Direct3D) if you render the GUIs separately from the rest:

https://msdn.microsoft.com/en-us/library/ff471334.aspx
https://msdn.microsoft.com/en-us/library/dn448914.aspx (unfortunately DXGI 1.3 and greater only)
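
If I remember right, the second link covers the waitable swap chain added in DXGI 1.3. Roughly (a sketch only; it assumes the swap chain was created via CreateSwapChainForHwnd with DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT set, and error handling is omitted):

    // query the DXGI 1.3 interface from an existing swap chain
    IDXGISwapChain2* swapChain2 = nullptr;
    swapChain1->QueryInterface(IID_PPV_ARGS(&swapChain2));
    swapChain2->SetMaximumFrameLatency(1);                    // keep at most one frame queued
    HANDLE waitHandle = swapChain2->GetFrameLatencyWaitableObject();

    // each frame: block until DXGI is ready for a new frame *before* building it,
    // so input is sampled as late as possible
    WaitForSingleObjectEx(waitHandle, 1000, TRUE);
    // ...poll input, draw the GUI...
    swapChain2->Present(1, 0);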

Tispe    1468

I suppose most future monitors and graphics cards will support FreeSync or G-Sync, which will probably make this issue go away in about 5-7 years.

Norman Barrows    7179


Tearing occurs whenever the drawing of the image crosses the point where the monitor refreshes its image.

 

yeah, that makes sense. been a long time since i wrote directly to vidram... what was it A000:0000 or something like that? just looked it up. 0xA0000 was the dos vidram address. set a pointer and let's party on the bitmap! <g>.

 

 


single buffering = not possible on modern OS's
double buffering = mandatory! Screen tearing will be visible.
double buffering w/ vsync = no tearing, but CPU/GPU sleeps occur (waiting for vblank) if frame-time doesn't line up with refresh rate nicely
triple buffering = greater latency...
triple buffering w/ vsync = no tearing, greater latency, but fewer CPU sleeps.

 

so double buffer with vsync is the best i can get with no tearing and low latency eh? thanks for the tip. seems some things never change. i remember implementing a double buffer system for the game library - might have been back in my pascal days (late 80's). it did sprites and lines and rectangles and such. the C++ version of that module was actually used in Caveman 1.0 in 2000 to do 2D sprites by adding a "color keyed alpha test blit to d3d backbuffer" method to the double buffer module. Caveman 2.0 (circa 2008) and Caveman 3.0 use d3dx sprites and/or d3dx fonts.

Brain    18906


been a long time since i wrote directly to vidram... what was it A000:0000 or something like that? just looked it up. 0xA0000 was the dos vidram address. set a pointer and let's party on the bitmap!

 


i remember implementing a double buffer system for the game library - might have been back in my pascal days (late 80's).

 

Yay, another fan of turbo pascal from back in the day! Looks like we were both doing the same things about ten years apart. Turbo pascal was the de facto language for me in college, and i remember implementing a sprite blitter with scaling and rotation in mode 13h whilst the course was talking about crappy text mode database programs. Those were the days eh :)

 

Still got that source code, it's pretty useless these days I guess...

Norman Barrows    7179


Sometimes I wish it were possible, if only to create less laggy GUIs...

 

one thing i noticed with directx is that d3dx fonts are slow compared to d3dx sprites.

drawing the debugging HUD info in Caveman with d3dx fonts took like 3ms!

i implemented my own fonts using d3dx sprites and got those clock cycles back.

i later went to a minimalist UI, so most of the time there's no UI text or graphics on the screen at all, not even crosshairs.

but all my stats and inventory screens are quite responsive with the fast sprite based fonts.
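
the idea is basically this (rough sketch, not my actual code - the 16x16 ascii atlas layout and names here are made up):

    // draw text as textured quads with ID3DXSprite instead of ID3DXFont.
    // assumes 'atlas' is a texture laid out as a 16x16 grid of fixed-size
    // glyphs in ascii order.
    void draw_text(LPD3DXSPRITE sprite, LPDIRECT3DTEXTURE9 atlas,
                   const char* s, float x, float y)
    {
        const LONG GW = 8, GH = 8;                 // glyph cell size (made up)
        sprite->Begin(D3DXSPRITE_ALPHABLEND);
        for (; *s; ++s, x += GW)
        {
            unsigned char c = (unsigned char)*s;   // index into the ascii grid
            RECT src = { (c % 16) * GW, (c / 16) * GH, 0, 0 };
            src.right  = src.left + GW;
            src.bottom = src.top + GH;
            D3DXVECTOR3 pos(x, y, 0.0f);
            sprite->Draw(atlas, &src, NULL, &pos, 0xFFFFFFFF);
        }
        sprite->End();
    }

batching all the glyphs inside one Begin/End pair is where the speed comes from.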

Norman Barrows    7179


Also bear in mind it is not possible to enforce vsync or no vsync. The graphics card settings app can normally override whatever your application requests.

 

yes, that is a concern. what can be done about this? nothing? if the user is dumb enough to disable vsync and allow tearing, are we screwed?

Norman Barrows    7179

Still got that source code, it's pretty useless these days I guess...

 

 someday some guy like us will crack the realtime raytracing barrier.

 

i don't think it will be me though. while i enjoy graphics programming, it's still mostly a means to an end for me.

 

heck, i'm afraid to step up to dx11 for fear i'll become so addicted to shader coding i'll never finish another game! <g>.

 

just the thought of being able to party on the bitmap again with such power is - intoxicating.

 

i've actually started porting my graphics library to dx11, and can clear the screen. but i must be strong and not give in to temptation. dx9.0c fixed function is sufficient for the needs of the next three planned titles. i don't need to build a new maserati, the old honda i have in the garage will do just fine.

Aardvajk    13207

Also bear in mind it is not possible to enforce vsync or no vsync. The graphics card settings app can normally override whatever your application requests.

 
yes, that is a concern. what can be done about this? nothing? if the user is dumb enough to disable vsync and allow tearing, are we screwed?


Well, the obvious answer is to design in such a way that the program is not affected by the graphical frame rate, Norman. By, for example, decoupling the render rate from the physics rate and tweening to get rid of temporal artifacts. But we've had this conversation at length :)
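
A bare-bones sketch of that decoupling, for reference (the function names here are placeholders, not from any particular engine):

    // fixed-rate simulation decoupled from render rate, with tweening.
    const double DT = 1.0 / 30.0;          // simulation step (30 Hz here, arbitrary)
    double accumulator = 0.0;
    double prev = now_seconds();           // e.g. wraps QueryPerformanceCounter

    for (;;)
    {
        double cur = now_seconds();
        accumulator += cur - prev;
        prev = cur;

        while (accumulator >= DT)          // run as many fixed steps as needed
        {
            update(DT);                    // game state advances at a constant rate
            accumulator -= DT;
        }

        double alpha = accumulator / DT;   // 0..1, how far into the next step we are
        render(alpha);                     // tween: lerp(prevState, curState, alpha)
    }

The render rate can then be anything the card and driver settle on, with no effect on the simulation.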

But yes, you can of course just tell users that if they disable or force vsync, whichever way you are relying on, the game will not perform correctly. When I was doing 2D sprite based games with D3D, I found there was a set of graphics card settings to do with antialiasing that I could set on my card that completely screwed all my views up and had to just accept that.

Personally, if I had a game that was telling me I had to leave vsync enabled on my card to play it, I would think "Oh, they've implemented that wrong then."

Norman Barrows    7179


Well, the obvious answer is to design in such a way that the program is not affected by the graphical frame rate

 

given the following present parameters, what is the behavior of directx with vsync off?

 

void Zsetparams(int w, int h)
{
    ZeroMemory(&Zparams, sizeof(D3DPRESENT_PARAMETERS));
    Zparams.AutoDepthStencilFormat = D3DFMT_D24X8;
    Zparams.BackBufferCount = 1;                             // one back buffer = double buffering
    Zparams.BackBufferFormat = D3DFMT_A8R8G8B8;              // explicit 32-bit format, required for fullscreen
    Zparams.BackBufferWidth = (unsigned)w;                   // width of the buffer
    Zparams.BackBufferHeight = (unsigned)h;                  // height of the buffer
    Zparams.EnableAutoDepthStencil = TRUE;                   // automatically run the z-buffer for us
    Zparams.Flags = 0;
    Zparams.FullScreen_RefreshRateInHz = D3DPRESENT_RATE_DEFAULT;  // note: was D3DPRESENT_INTERVAL_DEFAULT - wrong constant for this field, though both happen to be 0
    Zparams.hDeviceWindow = Zprogram_window_handle;
    Zparams.MultiSampleQuality = 0;
    Zparams.MultiSampleType = D3DMULTISAMPLE_NONE;
    Zparams.PresentationInterval = D3DPRESENT_INTERVAL_DEFAULT;    // wait for vblank at default timer resolution
    Zparams.SwapEffect = D3DSWAPEFFECT_COPY;                 // D3DSWAPEFFECT_DISCARD is the usual choice
    Zparams.Windowed = FALSE;                                // TRUE for windowed, FALSE for fullscreen
}
 

 

D3DPRESENT_INTERVAL_DEFAULT supposedly waits for vsync and uses the default system timer resolution

 

but no mention is made of the behavior when vsync is disabled.

 

guess it would just present immediately and return, huh?

 

so it's like vsync doesn't exist, and you get tearing. but as long as you control the speed of the game some other way, such as a framerate limiter or fix-your-timestep, you're ok - you just get tearing.
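
the limiter amounts to something like this (sketch, not my actual code - just QueryPerformanceCounter and a spin):

    // call once per frame, after Present(); spins until this frame's
    // time slice has elapsed.
    void limit_framerate(double target_fps)
    {
        static LARGE_INTEGER freq = {}, last = {};
        if (freq.QuadPart == 0) { QueryPerformanceFrequency(&freq); QueryPerformanceCounter(&last); }

        LONGLONG ticks_per_frame = (LONGLONG)((double)freq.QuadPart / target_fps);
        LARGE_INTEGER now;
        do {
            QueryPerformanceCounter(&now);   // spin (or Sleep(0) to yield) until it's time
        } while (now.QuadPart - last.QuadPart < ticks_per_frame);
        last = now;
    }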

Aardvajk    13207
Looking at the code there, the behaviour is not possible to guarantee. The user can set a setting on their graphics card to force vsync off or force it on.

I think that forcing it off makes the card just return the signal that the vertical blank period has been entered immediately, while it is still drawing the current frame. Forcing it on makes the card block on a query until the period is entered. Hardware gurus can correct me on this.

If the user forces vsync off then the user has accepted that they will get tearing, but they will speed up the framerate of games that don't provide an option to turn vsync off. Lots of users like to post on sites boasting about their benchmark framerates for example.

But look, it is fine to have your game perform incorrectly if the user has dicked around with the card settings, no problem. I'm just saying that there is no need for that to be an issue (apart from tearing) with vsync because there is no need to write a game that requires a particular refresh rate.

What will you do if the hardware doesn't support, say, 60 fps vsync but can only do 65 or 70?

Norman Barrows    7179

What will you do if the hardware doesn't support, say, 60 fps vsync but can only do 65 or 70?

 

i use QueryPerformanceCounter, not vsync, to control the update rate.

 

i was just wondering if there was a way to make games more robust so they didn't tear with vsync off.

 

but now that i've dusted the cobwebs off that corner of my brain that stores how vidram and the DAC (well, there used to be a DAC! <g>) work, i realize there's no solution possible in software, unless you can somehow query the vertical retrace signal and do your own "vsync".
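
and come to think of it, d3d9 does expose that query: IDirect3DDevice9::GetRasterStatus tells you whether the output is in vblank. something like this could fake a "vsync" (rough sketch - it busy-waits, and the present isn't guaranteed to land inside the blank, so it's not a reliable fix for tearing):

    // poll the vertical retrace yourself with d3d9.
    // device is an IDirect3DDevice9*; swap chain 0 is assumed.
    D3DRASTER_STATUS rs;
    do {
        device->GetRasterStatus(0, &rs);   // fills in InVBlank and ScanLine
        // Sleep(0);                       // optionally yield instead of pure spinning
    } while (!rs.InVBlank);
    device->Present(NULL, NULL, NULL, NULL);  // present while the beam is in the blank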

