no vsync means double buffering to avoid tearing, right?

Also bear in mind it is not possible to enforce vsync or no vsync. The graphics card settings app can normally override whatever your application requests.


been a long time since i wrote directly to vidram... what was it, A000:0000 or something like that? just looked it up: 0xA0000 was the dos vidram address (segment A000:0000). set a pointer and let's party on the bitmap!


i remember implementing a double buffer system for the game library - might have been back in my pascal days (late 80's).

Yay, another fan of Turbo Pascal from back in the day! Looks like we were both doing the same things about ten years apart. Turbo Pascal was the de facto language for me in college, and I remember implementing a sprite blitter with scaling and rotation in mode 13h while the course was talking about crappy text-mode database programs. Those were the days, eh :)

Still got that source code, it's pretty useless these days I guess...


Sometimes I just wish it were possible to create less laggy GUIs...

one thing i noticed with directx is that d3dx fonts are slow compared to d3dx sprites.

drawing the debugging HUD info in Caveman with d3dx fonts took like 3ms!

i implemented my own fonts using d3dx sprites and got those clock cycles back.

i later went to a minimalist UI, so most of the time there's no UI text or graphics on the screen at all, not even crosshairs.

but all my stats and inventory screens are quite responsive with the fast sprite based fonts.
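for reference, here's a minimal sketch of that sprite-based font idea, assuming a 16x16-glyph ASCII texture atlas and an ID3DXSprite created with D3DXCreateSprite() - the names are illustrative, not Caveman's actual code:

#include <d3dx9.h>

const int CELL_W = 16;   // width of one glyph cell in the atlas texture
const int CELL_H = 16;   // height of one glyph cell

void DrawFastText(ID3DXSprite* sprite, IDirect3DTexture9* fontTex,
                  const char* text, float x, float y, D3DCOLOR color)
{
    sprite->Begin(D3DXSPRITE_ALPHABLEND);        // batch every glyph quad in one Begin/End
    for (const char* p = text; *p; ++p)
    {
        unsigned char c = (unsigned char)*p;
        RECT src;                                // source rect of this glyph in the atlas
        src.left   = (c % 16) * CELL_W;
        src.top    = (c / 16) * CELL_H;
        src.right  = src.left + CELL_W;
        src.bottom = src.top  + CELL_H;
        D3DXVECTOR3 pos(x, y, 0.0f);
        sprite->Draw(fontTex, &src, NULL, &pos, color);
        x += (float)CELL_W;                      // fixed-pitch advance
    }
    sprite->End();                               // one submission for the whole string
}

the win over ID3DXFont is that all the glyph quads go out as batched sprite draws instead of per-call font machinery.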

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php


Also bear in mind it is not possible to enforce vsync or no vsync. The graphics card settings app can normally override whatever your application requests.

yes, that is a concern, what can be done about this? nothing? if the user is dumb enough to disable vsync and allow tearing, are we screwed?

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php


Still got that source code, it's pretty useless these days I guess...

someday some guy like us will crack the realtime raytracing barrier.

i don't think it will be me though. while i enjoy graphics programming, it's still mostly a means to an end for me.

heck, i'm afraid to step up to dx11 for fear i'll become so addicted to shader coding i'll never finish another game! <g>.

just the thought of being able to party on the bitmap again with such power is - intoxicating.

i've actually started porting my graphics library to dx11, and can clear the screen. but i must be strong and not give in to temptation. dx9.0c fixed function is sufficient for the needs of the next three planned titles. i don't need to build a new Maserati, the old Honda i have in the garage will do just fine.

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

Also bear in mind it is not possible to enforce vsync or no vsync. The graphics card settings app can normally override whatever your application requests.


yes, that is a concern, what can be done about this? nothing? if the user is dumb enough to disable vsync and allow tearing, are we screwed?


Well, the obvious answer is to design in such a way that the program is not affected by the graphical frame rate, Norman. By, for example, decoupling the render rate from the physics rate and tweening to get rid of temporal artifacts. But we've had this conversation at length :)
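A minimal sketch of that decoupling, assuming a fixed 60 Hz simulation step; Update(), Render(), Lerp() and Now() are placeholders for whatever the engine actually provides:

const double STEP = 1.0 / 60.0;                 // simulation always advances in fixed steps

double accumulator = 0.0;
double previous = Now();                        // Now() = seconds from a high-resolution timer

while (running)
{
    double current = Now();
    accumulator += current - previous;
    previous = current;

    while (accumulator >= STEP)                 // run as many fixed steps as real time demands
    {
        prevState = currState;                  // remember the last state for tweening
        Update(currState, STEP);                // physics/game logic at a constant dt
        accumulator -= STEP;
    }

    double alpha = accumulator / STEP;          // 0..1, how far we are into the next step
    Render(Lerp(prevState, currState, alpha));  // draw a blend, so any refresh rate looks smooth
}

The renderer can then run at whatever rate the card and the vsync setting allow without affecting the simulation at all.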

But yes, you can of course just tell users that if they disable or force vsync, whichever way you are relying on, the game will not perform correctly. When I was doing 2D sprite-based games with D3D, I found a set of antialiasing settings on my card that completely screwed up all my views, and I just had to accept that.

Personally, if I had a game that was telling me I had to leave vsync enabled on my card to play it, I would think "Oh, they've implemented that wrong then."


Well, the obvious answer is to design in such a way that the program is not affected by the graphical frame rate

given the following present parameters, what is the behavior of directx with vsync off?

void Zsetparams(int w, int h)
{
    ZeroMemory(&Zparams, sizeof(D3DPRESENT_PARAMETERS));
    Zparams.AutoDepthStencilFormat = D3DFMT_D24X8;
    Zparams.BackBufferCount = 1;
    Zparams.BackBufferFormat = D3DFMT_A8R8G8B8;           // 32-bit back buffer (must be set explicitly for fullscreen)
    Zparams.BackBufferWidth = (unsigned)w;                // back buffer width
    Zparams.BackBufferHeight = (unsigned)h;               // back buffer height
    Zparams.EnableAutoDepthStencil = TRUE;                // let d3d manage the z-buffer
    Zparams.Flags = 0;
    Zparams.FullScreen_RefreshRateInHz = D3DPRESENT_RATE_DEFAULT; // note: the right constant here is D3DPRESENT_RATE_DEFAULT, not D3DPRESENT_INTERVAL_DEFAULT (both happen to be 0)
    Zparams.hDeviceWindow = Zprogram_window_handle;
    Zparams.MultiSampleQuality = 0;
    Zparams.MultiSampleType = D3DMULTISAMPLE_NONE;
    Zparams.PresentationInterval = D3DPRESENT_INTERVAL_DEFAULT;   // wait for vsync, default timer resolution
    Zparams.SwapEffect = D3DSWAPEFFECT_COPY;              // was D3DSWAPEFFECT_DISCARD
    Zparams.Windowed = FALSE;                             // TRUE for windowed, FALSE for fullscreen
}

D3DPRESENT_INTERVAL_DEFAULT supposedly waits for vsync and uses the default system timer resolution

but no mention is made of the behavior when vsync is disabled.

guess it would just present immediately and return, huh?

so it's like vsync doesn't exist, and you get tearing. but as long as you control the speed of the game some other way, such as a framerate limiter or f-y-t, you're ok - you just get tearing.
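a minimal sketch of the kind of framerate limiter i mean, using QueryPerformanceCounter to cap the loop at a target rate whether or not the driver honors vsync (the names are illustrative, not Caveman's actual code):

#include <windows.h>

const double TARGET_FPS = 60.0;

void GameLoop()
{
    LARGE_INTEGER freq, last, now;
    QueryPerformanceFrequency(&freq);                // counter ticks per second
    QueryPerformanceCounter(&last);
    const LONGLONG ticksPerFrame = (LONGLONG)(freq.QuadPart / TARGET_FPS);

    bool running = true;
    while (running)
    {
        // ... update, render, Present() here ...

        // busy-wait (or Sleep(0) inside the loop) until one full frame period has passed
        do {
            QueryPerformanceCounter(&now);
        } while (now.QuadPart - last.QuadPart < ticksPerFrame);

        last.QuadPart += ticksPerFrame;              // advance by exactly one period to avoid drift
    }
}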

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

Looking at the code there, the behaviour is not possible to guarantee. The user can set a setting on their graphics card to force vsync off or force it on.

I think that forcing off makes the card just return the signal that the vertical blank period has been entered immediately, while it is still drawing the current frame. Forcing it on makes the card block on a query until the period is entered. Hardware gurus can correct me on this.

If the user forces vsync off then the user has accepted that they will get tearing, but they will speed up the framerate of games that don't provide an option to turn vsync off. Lots of users like to post on sites boasting about their benchmark framerates for example.

But look, it is fine to have your game perform incorrectly if the user has dicked around with the card settings, no problem. I'm just saying that there is no need for that to be an issue (apart from tearing) with vsync because there is no need to write a game that requires a particular refresh rate.

What will you do if the hardware doesn't support say, 60 fps vsync but can only do 65 or 70?

What will you do if the hardware doesn't support say, 60 fps vsync but can only do 65 or 70?

i use query performance counter, not vsync to control update rate.

i was just wondering if there was a way to make games more robust so they didn't tear with vsync off.

but now that i've dusted the cobwebs off that corner of my brain that stores how vidram and the DAC (well, there used to be a DAC! <g>) work, i realize there's no solution possible in software, unless you can somehow query the vertical retrace signal and do your own "vsync".
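for what it's worth, d3d9 does expose something like that: IDirect3DDevice9::GetRasterStatus reports whether the raster is in the vertical blank, on hardware that sets the D3DCAPS_READ_SCANLINE cap. a rough sketch (g_device is assumed to be the already-created device, and even this can't guarantee the Present finishes inside the blank):

#include <d3d9.h>

bool CanReadScanline(IDirect3DDevice9* dev)
{
    D3DCAPS9 caps;
    dev->GetDeviceCaps(&caps);
    return (caps.Caps & D3DCAPS_READ_SCANLINE) != 0;    // driver can report raster position
}

void WaitForVBlank(IDirect3DDevice9* dev)
{
    D3DRASTER_STATUS rs;

    // if we're already inside a blank, wait for it to end so we catch the start of the next one
    do { dev->GetRasterStatus(0, &rs); } while (rs.InVBlank);

    // now spin until the raster enters the vertical blank period
    do { dev->GetRasterStatus(0, &rs); } while (!rs.InVBlank);
}

// usage: if (CanReadScanline(g_device)) { WaitForVBlank(g_device); }  then Present()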

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

