TheChronoTrigger

D3D init... Need help getting things straight

Recommended Posts

I've been experimenting with DirectX, attempting to write a class that manages the initialization and reset of the Direct3D device. My hope is to set the requirements of my target application, populate the available modes, and then call a function that automatically returns a settings structure for an adapter that meets my app's requirements. I've been following the MS sample framework, and have developed a class pretty similar to the non-callback version of its enumeration class. While I've been able to follow and recreate the framework enumeration class, I don't think I totally understand what to do with all these settings I now know about.

I think the user should be able to select D3DDISPLAYMODEs, but I'm pretty sure my app needs to choose the display/backbuffer/depth/stencil formats on its own. For example, should my application usually grab the display settings with the highest pixel depth, or is it best to grab the smallest pixel depth (same question for the depth/stencil format)? Besides not having the richest colors, are there any consequences I need to be aware of if I pick a 16-bit display/backbuffer instead of a 32-bit one (same question for the depth/stencil format)? The only consequence I can think of is that if I need to lock the backbuffer, I'm thinking I might need to know what the pixel depth is. Any other bit-depth situations I need to be aware of?

Another thing: should all my settings have the same depth? That is, should I only support backbuffers of the same depth as my display mode, and should my depth/stencil buffer be the same pixel depth as both my display and backbuffer?

Other than display/backbuffer formats, depth/stencil formats, multisample types/qualities and depth/stencil conflicts, HW/SW vertex capabilities, and the various D3DDISPLAYMODEs, is there anything else that is really necessary in a general-purpose D3D device init class?
Sorry for so many questions, but I just want to develop this class so I can experiment with different D3D techniques and also use it when I do create a game. Thanks in advance!

Bonehed316
I think a good answer here depends on what YOU want from YOUR framework.

For instance, are you writing for DX9? If you are, then why bother supporting 16-bit at all? Simply exit with a message stating that 16-bit is not supported. Since MOST cards use 32-bit, this is not unreasonable.

As for the rest, I currently have a set of macros set up for holding settings in an INI file (it will need a revamp one day, but it's good enough for now). Those settings are loaded and then checked against the device caps. My device init code doesn't actually enable any settings; it only disables them if they're not supported.

As for depth/stencil, only use stencil if you really need it, and probably stick to 16-bit depth by default. You can expose this as an option if you get such a setup working.

Another thing to consider: when in windowed mode, try to keep the same color depth/refresh rate as the desktop already has. You can get that information like this:

HWND hwndDesktop = GetDesktopWindow();
HDC hdcDesktop = GetDC(hwndDesktop);
int DesktopColorBits = GetDeviceCaps( hdcDesktop, BITSPIXEL );
ReleaseDC( hwndDesktop, hdcDesktop ); // release the DC when you're done with it

You can use GetDeviceCaps in any win32 environment, and can get a lot of information about the desktop this way.

Muhammad Haggag
Quote:
Original post by TheChronoTrigger
I think the user should be able to select D3DDISPLAYMODEs, but I'm pretty sure my app needs to choose the display/backbuffer/depth/stencil formats on its own. For example, should my application usually grab the display settings with the highest pixel depth, or is it best to grab the smallest pixel depth (same question for the depth/stencil format)? Besides not having the richest colors, are there any consequences I need to be aware of if I pick a 16-bit display/backbuffer instead of a 32-bit one (same question for the depth/stencil format)? The only consequence I can think of is that if I need to lock the backbuffer, I'm thinking I might need to know what the pixel depth is. Any other bit-depth situations I need to be aware of?

- What I do is allow selection of resolution. The game code then passes that, along with the minimum requirements for depth and alpha (which I hard-code, because I know what the game needs), to a component that enumerates the available modes and finds the highest mode that fits the requirements. For example, if a minimum of 1 bit of alpha is required and an 8-bit-alpha format exists, it'll select that.

You can expose some "low quality" option which would select the tightest fitting mode instead, because this can save you some video memory on low-end cards.

For multisampling, I can think of 2 methods:
1) After you've determined the backbuffer format, you proceed to enumerate the multisampling options and show them to the user.

2) Do it the other way round. The user specifies a minimum multisampling quality, and you choose the format that satisfies this.

Of course you can go out and do all the crazy enumeration and give the user full control (like the samples do), but I - personally - find this too much work for nothing, and thus I take shortcuts as noted above.

- You can determine the format of the backbuffer at any time, so locking it isn't really an issue even if you gave the user the option to choose.

Quote:
Another thing: should all my settings have the same depth? That is, should I only support backbuffers of the same depth as my display mode, and should my depth/stencil buffer be the same pixel depth as both my display and backbuffer?

The backbuffer format should match the display mode in windowed mode only. Other than that, you can do what you want. You don't have to match your depth/stencil buffer with your backbuffer either. If you don't need stencil, or don't need 24-bit z-buffering, then there's no need for a 32-bit depth/stencil.

Quote:
Other than display/backbuffer formats, depth/stencil formats, multisample types/qualities and depth/stencil conflicts, HW/SW vertex capabilities, and the various D3DDISPLAYMODEs, is there anything else that is really necessary in a general-purpose D3D device init class?

As I said above, I'd stay away from a "general-purpose device init class". Unless you're developing an engine, that is. If you're developing a game, focus on what's important. Which is most definitely not giving the user advanced graphics options that only hardcore players care about anyway.

Of course that's my humble opinion [smile]

