NickGravelyn

How To Support D3D9 and D3D10 In One Game?


This may be a funny question coming from a guy with DirectX MVP in his title, but that's because I got my MVP (and spend most of my time) working with XNA Game Studio. I've recently started working on native Direct3D programming as a way to get better with C++ and to hopefully make the DirectX part of my MVP seem less silly. I started out with D3D10 since I just wanted to poke around and had no intention of releasing what I was working on. I just did the basics of getting a window open with a fun little rotating cube. Nothing special, but enough to get me going.

Now I'm thinking it might be fun to work on something that I could potentially release, which to me means supporting D3D9, since Vista and Win7 still are not market leaders (and beyond that, many people running those still can't run D3D10 well or at all). But I still like the idea of supporting fancy D3D10 effects for people who can run them. I'm sure supporting both of these will be a huge pain, but I think it'll be worth it in the end.

I tried doing some Google searches, but everything was just consumer-aimed stuff comparing D3D9 and D3D10. Are there any good references on how to build a common interface so that I can have as few branched code paths as possible and support both systems? Is there something already in D3D10 that allows it to run as a D3D9 device without me doing too much work?

I'm not a big fan of massive if/else (or #if/#else) blocks to determine which API to use. Using preprocessor #if/#else leads to messy and hard-to-maintain code IMO, and runtime if/else blocks are the same except they incur an unnecessary runtime overhead.

I very much prefer a system based on polymorphism.

You need to identify what functionality your renderer needs, and build an interface around that. It also means that you're pretty much going to have to abstract away all the specific details - your renderer should be self-contained and nothing outside of it should need to know what API it's using under the covers.

I'll describe two different ways of implementing your hierarchy using polymorphism. I'm not sure if this is exactly the information you're looking for, but if not maybe someone else will find it useful.

1. Using runtime polymorphism

Have a base renderer class, let's call it IRenderer. It's an abstract base class that provides the interface for the renderer that the rest of your application uses. Two classes, D3D10Renderer and D3D9Renderer, inherit from IRenderer and provide implementations of the interface. Things like textures are wrapped similarly (ITexture, with derived classes D3D9Texture and D3D10Texture).

So it might look a little like this:

class ITexture
{
public:
virtual int GetHeight() const = 0;
virtual int GetWidth() const = 0;

/*...*/
};

class IRenderer
{
public:
virtual shared_ptr<ITexture> CreateTexture(const std::string& path) = 0;
virtual void DrawSprite(const shared_ptr<ITexture>& texture, const Vector2& pos) = 0;

/*...*/
};


class D3D9Texture : public ITexture
{
public:
D3D9Texture(const CComPtr<IDirect3DTexture9>& d3dTex);

int GetHeight() const { return m_height; }
int GetWidth() const { return m_width; }

const CComPtr<IDirect3DTexture9>& GetD3DTexture() const { return m_d3dTex; }

private:
CComPtr<IDirect3DTexture9> m_d3dTex;
int m_width;
int m_height;
};

class D3D9Renderer : public IRenderer
{
public:
shared_ptr<ITexture> CreateTexture(const std::string& path)
{
CComPtr<IDirect3DTexture9> d3dTex;
D3DXCreateTextureFromFile(m_d3dDev, path.c_str(), &d3dTex);
return shared_ptr<ITexture>(new D3D9Texture(d3dTex));
}

void DrawSprite(const shared_ptr<ITexture>& texture, const Vector2& pos)
{
shared_ptr<D3D9Texture> d3d9Texture = static_pointer_cast<D3D9Texture>(texture);

// draw the sprite using d3d9Texture->GetD3DTexture() to get the IDirect3DTexture9*
}

private:
CComPtr<IDirect3DDevice9> m_d3dDev;

/*...*/
};



That's the general gist of it, anyway. There are several problems with this design:


  1. It relies on runtime polymorphism via virtual function calls. Every call to the renderer will entail a virtual function call - which is completely unnecessary. You know at compile-time (or, maybe, load-time) which renderer you're going to use. So runtime polymorphism is just an unnecessary drain on resources here.
  2. Every function like DrawSprite() requires a downcast from the ITexture interface to the specialised D3D9Texture interface so that it can gain access to the D3D9-specific data (in this case, the IDirect3DTexture9). This alone should raise some red flags. You also have to pray that the ITexture you got is really a D3D9Texture unless you use RTTI. In other words, this is not ideal.


2. Using static polymorphism

You can have a base renderer interface, and two implementation classes. Since you don't need dynamic dispatch (your game is only ever going to be running with one renderer, known either at compile or load-time) we can use something like the Curiously Recurring Template Pattern, which allows for static polymorphism so you don't need to pay the cost of a virtual function call every time you make a call to the renderer.

So your interface might look something like this:

template<typename Impl>
class Texture
{
public:
int GetHeight() const
{
return static_cast<const Impl*>(this)->GetHeight();
}
int GetWidth() const
{
return static_cast<const Impl*>(this)->GetWidth();
}

/*...*/
};

template<typename RendererImpl, typename TextureImpl>
class Renderer
{
public:
shared_ptr< Texture<TextureImpl> > CreateTexture(const std::string& path)
{
return static_cast<RendererImpl*>(this)->CreateTexture(path);
}
void DrawSprite(const shared_ptr< Texture<TextureImpl> >& texture, const Vector2& pos)
{
static_cast<RendererImpl*>(this)->DrawSprite(texture, pos);
}

/*...*/
};


class D3D9Texture : public Texture<D3D9Texture>
{
public:
D3D9Texture(const CComPtr<IDirect3DTexture9>& d3dTex);

int GetHeight() const { return m_height; }
int GetWidth() const { return m_width; }

const CComPtr<IDirect3DTexture9>& GetD3DTexture() const { return m_d3dTex; }

private:
CComPtr<IDirect3DTexture9> m_d3dTex;
int m_width;
int m_height;
};

class D3D9Renderer : public Renderer<D3D9Renderer, D3D9Texture>
{
public:
shared_ptr< Texture<D3D9Texture> > CreateTexture(const std::string& path)
{
CComPtr<IDirect3DTexture9> d3dTex;
D3DXCreateTextureFromFile(m_d3dDev, path.c_str(), &d3dTex);
return shared_ptr< Texture<D3D9Texture> >(new D3D9Texture(d3dTex));
}

void DrawSprite(const shared_ptr< Texture<D3D9Texture> >& texture, const Vector2& pos)
{
shared_ptr<D3D9Texture> d3d9Texture = static_pointer_cast<D3D9Texture>(texture);

// draw the sprite using d3d9Texture->GetD3DTexture() to get the IDirect3DTexture9*
}

private:
CComPtr<IDirect3DDevice9> m_d3dDev;

/*...*/
};



This is a little better in that you know the casts you make are safe, and there are no unnecessary virtual function calls. But there's a downside: you have to use templates everywhere, which may or may not be acceptable to you.

Note: I've not actually tried implementing this system using static polymorphism before. Sounds good on paper, but I have no clue whether it truly works in real life. [grin]

Quote:
Is there something already in D3D10 that allows it to run as a D3D9 device without me doing too much work?

Not in Direct3D10, no. This is a feature that's coming in DX11 - previous versions of D3D (including 9 and 10) will be able to use the same DX11 API.

In my opinion D3D10 isn't really worth supporting anymore. I've thought about this a bit, mostly in the context of managed programming, and I believe it breaks down to which platforms get supported, so you'd end up with a different executable for each OS. No need for fancy runtime differentiation.

For example, if I had lots of free time I might spend it writing an engine that would publish three separate binaries: SlimDX.Direct3D11 for Windows 7 and Vista, XNA for Windows XP and the Xbox360, and OpenGL for Mac/Linux/whatever.

You'd compile each one separately, and although they'd largely share the bulk of their code, you don't want them to be truly interchangeable, because at that point you lose the ability to take advantage of the specialties of each API.

While I am in no way qualified to answer your question, the problem is (in my opinion) essentially the one the Adapter design pattern addresses.

http://en.wikipedia.org/wiki/Adapter_pattern

I'd say at some point you have to know the configuration of your client: is he running Linux/Windows, and can he provide OpenGL/DirectX 9/10 support? From there you can instantiate the right Renderer via a class such as:

OpenGLRendererAdapter
DirectX9RendererAdapter
DirectX10RendererAdapter

If you use polymorphism, all of your adapters must use the renderer interface (as pointed out by the poster two posts up), which does limit you somewhat.

This pattern is best understood with an SQL connector class example. If you have to provide the following functions: connectDB(name, ip), disconnect(), select(query), etc., you implement MySQLAdapter, OracleSQLAdapter, and MSSQLAdapter against the interface, and you can pick at run-time which class you want to instantiate.

In Java: InterfaceDBConnector myDBConnector = new MySQLAdapter();

Thanks for all the suggestions. That's definitely a lot to think about.

Quote:
Original post by Mike.Popoloski
In my opinion D3D10 isn't really worth supporting anymore. I've thought about this a bit, mostly in the context of managed programming, and I believe it breaks down to which platforms get supported, so you'd end up with a different executable for each OS. No need for fancy runtime differentiation.

For example, if I had lots of free time I might spend it writing an engine that would publish three separate binaries: SlimDX.Direct3D11 for Windows 7 and Vista, XNA for Windows XP and the Xbox360, and OpenGL for Mac/Linux/whatever.

You'd compile each one separately, and although they'd largely share the bulk of their code, you don't want them to be truly interchangeable, because at that point you lose the ability to take advantage of the specialties of each API.
That's definitely an interesting route, and for sure I could utilize a lot of shared code that would be separate from graphics code. And with D3D11 supporting D3D9 and D3D10 runtimes, I could make that version be able to fall back (or allow the user to choose) to use those older runtimes so that it would work on lower end hardware running Vista or Windows 7. I might have to look into something like that.

In my engine, I have support for both D3D9 and D3D10/D3D11. They are just separate renderers, sharing a common base class. There is no fancy code to automatically select which one should be used or anything like that - but I can choose to use either one, or both, in the same application.

By having them in the same library, all of my standard classes are shared and only the rendering classes are unique. There really isn't that much more work to get both renderers running at approximately the same functionality level, and you can keep them isolated from each other to allow incremental updates to one or the other.

Quote:
Original post by Jason Z
In my engine, I have support for both D3D9 and D3D10/D3D11. They are just separate renderers, sharing a common base class. There is no fancy code to automatically select which one should be used or anything like that - but I can choose to use either one, or both, in the same application.

By having them in the same library, all of my standard classes are shared and only the rendering classes are unique. There really isn't that much more work to get both renderers running at approximately the same functionality level, and you can keep them isolated from each other to allow incremental updates to one or the other.
So there are no issues with having a single DLL with all that code running on XP? My concern right now is that if I try to have it all in a single DLL or EXE that it will fail on XP where there is no D3D10 or D3D11. But that might just be my managed coder brain talking, so I wanted to make sure.

I'd say that the question is how important the XP market is to you. The hardware itself doesn't matter much, since DX10/11 has feature levels that support DX9 hardware (this feature is in beta, but should be released in time for Windows 7).

The Steam survey shows that Vista/Win7 hold about 40% of the market currently. My guess is that after Win7's release, Vista/Win7 will have the majority of the gaming market. The question is, what is your expected release date? If you're only going to release your game in a year or two, I expect that XP users will be a minority, and not even a very large one. Perhaps still significant, but I think they'll be used by that time to not being able to run all games.

If you still decide to support XP, I'd suggest designing your engine for DX10 and then porting it to DX9. The reason is that DX9 is more flexible, so it's better to start from the limitations of DX10 (state changes, constant handling), create something that works well within them, then convert to DX9. I also wouldn't suggest going the DX11 route in this case, though I would if you don't intend to support XP. Well, you could without using the new features, but I don't see much point in it.

Regarding loading DX10 on XP, I think that Simon's answer here is a good one.

Quote:
Original post by ET3D
The Steam survey shows that Vista/Win7 hold about 40% of the market currently. My guess is that after Win7's release, Vista/Win7 will have the majority of the gaming market. The question is, what is your expected release date? If you're only going to release your game in a year or two, I expect that XP users will be a minority, and not even a very large one. Perhaps still significant, but I think they'll be used by that time to not being able to run all games.
Oh wow. That's definitely interesting. Anything I work on likely won't release for a year or two since I don't have much time, so by then I would expect Vista/Win7 to have a good majority. I think I might just go supporting D3D11 since it sounds like it already supports all hardware from D3D9 on in one interface, which pretty much eliminates the work of supporting that range of hardware. The only requirement then is that the user has Vista or Windows 7, which is a requirement I can live with in a year or two given that Steam survey stat.

As long as you remember that it's not 40% of all internet users.

Nor is it 40% of all gamers; it's 40% of all Steam users.
What kind of market/gamer are you aiming for?
Not every type of gamer is in favor of Steam; are yours?

W3C's May 2009 statistics show that 67% of average internet users run Windows XP.

Quote:
Original post by NickGravelyn
So there are no issues with having a single DLL with all that code running on XP? My concern right now is that if I try to have it all in a single DLL or EXE that it will fail on XP where there is no D3D10 or D3D11. But that might just be my managed coder brain talking, so I wanted to make sure.


I had issues compiling all the code into one single file because of the dependencies.
After a lot of trial and error I finally found a pretty good way to solve this issue.

First of all I made a set of pure virtual classes (the framework).
Then I made two DLL files, RenderD3D9.dll and RenderD3D10.dll, which are dynamically loaded by the client (LoadLibrary).

For example:

Framework class

class IRenderDevice
{
public:
virtual void Clear() = 0;
virtual void Present() = 0;
virtual void Release() = 0;

... methods for creating resources/rendering objects

protected:
virtual ~IRenderDevice() {}

};





Render library class

class CRenderDevice : public IRenderDevice
{
private:
friend HRESULT WINAPI CreateRenderDevice( const DWORD dwSdkVersion, HWND hWnd, LPCWSTR lpIniFileName, IRenderDevice** ppOut );

public:
CRenderDevice();
~CRenderDevice();

public:
void Clear();
void Present();
void Release();

... methods for creating resources/rendering objects

private:
HRESULT InitInstance( HWND hWnd, LPCWSTR lpIniFileName );

};





Render library exported method

HRESULT WINAPI CreateRenderDevice( const DWORD dwSdkVersion, HWND hWnd, LPCWSTR lpIniFileName, IRenderDevice** ppOut )
{
HRESULT hr;

// Validate the sdk header version
if( NP_SDK_VERSION != dwSdkVersion )
return E_INVALIDARG; //TODO: Customize return code

// Validate the parameters
if( NULL == hWnd )
return E_INVALIDARG;

// Create the class instance (nothrow form, so allocation failure
// yields NULL and can be reported as E_OUTOFMEMORY)
CRenderDevice* pInstance = new(std::nothrow) CRenderDevice();
if( NULL == pInstance )
return E_OUTOFMEMORY;

// Initialize the class instance
if( FAILED( hr = pInstance->InitInstance( hWnd, lpIniFileName ) ) )
{
delete pInstance;
return hr;
}

// Operation successful
*ppOut = pInstance;
return S_OK;

}





Finally the client can load the D3D9 or D3D10 render library like this:

typedef HRESULT (WINAPI* LPCREATERENDERDEVICE)( const DWORD, HWND, LPCWSTR, IRenderDevice** );

// Create the render device
g_hRenderModule = LoadLibrary( L"RenderD3D10.dll" );
LPCREATERENDERDEVICE lpCreateRenderDevice = (LPCREATERENDERDEVICE)GetProcAddress( g_hRenderModule, "CreateRenderDevice" );
lpCreateRenderDevice( NP_SDK_VERSION, g_hWnd, L"Sandbox.ini", &g_pRenderDevice );






Now the bad thing about this code is that none of the D3D code is directly available to the client application, so the client is fully bound to the features supported by the engine.
In theory, this system should also allow having a RenderOGL.dll to support OpenGL.

edit: note that I don't pass any render config parameters when creating the render device, but rather pass the path to an ini file from which each render module can read its settings internally. This gives a bit more freedom in configuring the render device than passing a static set of parameters.
My ini file contains a separate section for each of D3D9 and D3D10:

; General render config
[Engine.Render]
DimensionX=800
DimensionY=600
Windowed=1

; D3D9 specific config
[Engine.Render.D3D9]
...

; D3D10 specific config
[Engine.Render.D3D10]
...

For D3D9, each parameter of the present parameters structure is read from the ini file; the same goes for D3D10, where all parameters of the swap chain description are read from the ini file.

Quote:
Original post by NickGravelyn
My concern right now is that if I try to have it all in a single DLL or EXE that it will fail on XP where there is no D3D10 or D3D11. But that might just be my managed coder brain talking, so I wanted to make sure.


Visual Studio has a nice feature called delay-loaded DLLs. This means the external DLL won't be loaded until you actually call a function from it. So if you don't use any DX10 functions, no DX10 DLL will be loaded, and your program will also work on XP.

Of course, you can do it manually with the LoadLibrary and GetProcAddress functions, but that's a lot of manual and unneeded work.

