

Using both D3D9 and D3D11


#1 incertia   Crossbones+   -  Reputation: 779


Posted 19 August 2013 - 05:56 PM

So in my own personal renderer, I want it to be able to support D3D9 and D3D11. To do this, I thought about having a master renderer interface that two classes will derive from. One will implement the D3D9 renderer and the other will implement the D3D11 renderer. However, this requires that I statically link to both D3D9 and D3D11, which will (I think) require that both dlls be present at run time. Is there any way I can avoid this?


what


#2 siri   Members   -  Reputation: 233


Posted 19 August 2013 - 06:32 PM

This is probably a very stupid question and I apologise if it is, but can't you just use the D3D9 feature levels in DX11 for the D3D9 renderer?



#3 Migi0027 =A=   Crossbones+   -  Reputation: 1921


Posted 19 August 2013 - 10:49 PM

Agree with Siri.

 

With DirectX 11 you pass a feature level when creating the device; for Direct3D 11 it is usually D3D_FEATURE_LEVEL_11_0, but that can be changed!

 

Taken from MSDN:

 

 

With Direct3D 11, a new paradigm is introduced called feature levels. A feature level is a well-defined set of GPU functionality. For instance, the 9_1 feature level implements the functionality that was implemented in Microsoft Direct3D 9, which exposes the capabilities of shader models ps_2_x and vs_2_x, while the 11_0 feature level implements the functionality that was implemented in Direct3D 11.

When you create a device, you can attempt to create it at the feature level you want to request. If the device creation works, that feature level is supported; if not, the hardware does not support that feature level. You can either try to recreate the device at a lower feature level or choose to exit the application. For more info about creating a device, see the D3D11CreateDevice function.

Using feature levels, you can develop an application for Direct3D 9, Microsoft Direct3D 10, or Direct3D 11, and then run it on 9, 10 or 11 hardware (with some exceptions, of course; new Direct3D 11 features will not run on an existing Direct3D 9 card, for example). Here are a couple of other basic properties of feature levels:

  • A GPU that allows a device to be created meets or exceeds the functionality of that feature level.
  • A feature level always includes the functionality of previous or lower feature levels.
  • A feature level does not imply performance, only functionality. Performance is dependent on hardware implementation.
  • Choose a feature level when you create a Direct3D 11 device.
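
Not from the quote above, just a minimal sketch of the fallback it describes: pass an array of feature levels to D3D11CreateDevice and let the runtime pick the highest one the hardware supports.

// Sketch: request 11_0 first, then fall back through the 10.x and 9.x levels.
#include <d3d11.h>

const D3D_FEATURE_LEVEL requested[] =
{
    D3D_FEATURE_LEVEL_11_0,
    D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0,
    D3D_FEATURE_LEVEL_9_3,
    D3D_FEATURE_LEVEL_9_1,
};

ID3D11Device*        device  = nullptr;
ID3D11DeviceContext* context = nullptr;
D3D_FEATURE_LEVEL    obtained;

HRESULT hr = D3D11CreateDevice(
    nullptr,                  // default adapter
    D3D_DRIVER_TYPE_HARDWARE,
    nullptr,                  // no software rasterizer module
    0,                        // creation flags
    requested,
    ARRAYSIZE(requested),
    D3D11_SDK_VERSION,
    &device,
    &obtained,                // the feature level actually granted
    &context);

if (FAILED(hr))
{
    // None of the listed feature levels is supported; bail out or fall back to a D3D9 renderer.
}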

Hi! Cuboid Zone
The Rule: Be polite, be professional, but have a plan to kill everyone you meet, ohh, AND STEAL ALL ZE TRIANGLES FROM ZHEM!

#4 Tape_Worm   Crossbones+   -  Reputation: 1793


Posted 19 August 2013 - 10:49 PM

This is probably a very stupid question and I apologise if it is, but can't you just use the D3D9 feature levels in DX11 for the D3D9 renderer?

It's not a stupid question at all; it's definitely something developers should look into if they wish to support Direct3D 9 video devices. However, there are caveats. For example, if the dev wanted to run their application on Windows XP, they couldn't use the Direct3D 11 runtime because it won't work on XP.
 
Also, from personal experience, the D3D9 feature levels are a pain in the ass to work with. I've hit several little "gotchas" when dealing with the D3D9 feature level (some of which have little or no documentation), to the point that I've nearly given up on caring whether people can use it.
 
An example of one of the issues I ran into was copying a texture into a staging texture. Apparently, under feature level 9 you can't copy a resource into a staging resource if the source resource is in GPU memory and has a shader view attached to it. So, if you created a texture with a shader view, you're out of luck. So, create one without a shader view, right? Unfortunately, no; that leads to another set of issues (e.g. you can't bind it to a pixel shader, which is kind of necessary). There was that and a few other small issues I ran into (multi-monitor was especially painful).
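
For context, here is a rough sketch of the kind of staging readback being described (the variable names are mine, not Tape_Worm's code); under feature level 9_x, the CopyResource call below is what can fail when the source texture was created with a shader resource view:

// gpuTexture: the DEFAULT-usage source texture; device/context: the D3D11 device and immediate context.
D3D11_TEXTURE2D_DESC desc;
gpuTexture->GetDesc(&desc);
desc.Usage          = D3D11_USAGE_STAGING;
desc.BindFlags      = 0;                       // staging resources cannot be bound to the pipeline
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.MiscFlags      = 0;

ID3D11Texture2D* staging = nullptr;
if (SUCCEEDED(device->CreateTexture2D(&desc, nullptr, &staging)))
{
    context->CopyResource(staging, gpuTexture);    // the copy that hits the 9_x restriction described above

    D3D11_MAPPED_SUBRESOURCE mapped;
    if (SUCCEEDED(context->Map(staging, 0, D3D11_MAP_READ, 0, &mapped)))
    {
        // ... read mapped.pData row by row using mapped.RowPitch ...
        context->Unmap(staging, 0);
    }
    staging->Release();
}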

 

Another issue with feature level 9 is that the highest vertex/pixel shader model supported is SM2_x. If you want to support SM3, you're out of luck.
 

So in my own personal renderer, I want it to be able to support D3D9 and D3D11. To do this, I thought about having a master renderer interface that two classes will derive from. One will implement the D3D9 renderer and the other will implement the D3D11 renderer. However, this requires that I statically link to both D3D9 and D3D11, which will (I think) require that both dlls be present at run time. Is there any way I can avoid this?

You could design a system to load your renderer dynamically as a DLL. This way the DLL would be the only thing linking to the libraries in question; your host application/API would have the interface abstracted, so it wouldn't care about the dependencies. So, for example, you can detect whether the user is running XP and force-load the D3D9 renderer DLL, or, if they're not, load the D3D11 renderer DLL.
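
A rough sketch of what that selection could look like (all names invented; IRenderer is the shared abstract interface and CreateRenderer the factory each renderer DLL would export):

#include <windows.h>

struct IRenderer;                             // abstract renderer interface shared by both DLLs
typedef IRenderer* (*CreateRendererFn)();     // factory function exported by each DLL

IRenderer* LoadRenderer()
{
    OSVERSIONINFO os = { sizeof(os) };
    GetVersionEx(&os);                        // Windows XP reports major version 5, Vista and later 6+
    const bool isXP = (os.dwMajorVersion < 6);

    HMODULE dll = LoadLibraryA(isXP ? "RendererD3D9.dll" : "RendererD3D11.dll");
    if (!dll)
        return nullptr;

    CreateRendererFn create =
        (CreateRendererFn)GetProcAddress(dll, "CreateRenderer");
    return create ? create() : nullptr;
}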
 
If you put both renderers in the same DLL then yes, you'll need to link against both. Whether the end user would require both D3D11 and D3D9 installed to run it, I can't answer with any degree of certainty because it's been an incredibly long time since I've dealt with that stuff, but I'm going to guess that yes, they would require both.


Edited by Tape_Worm, 19 August 2013 - 10:59 PM.


#5 nonoptimalrobot   Members   -  Reputation: 416


Posted 19 August 2013 - 10:50 PM

You can statically link to d3d9.lib and d3d11.lib in the same project without problems (it's not recommended, though). The d3d9.dll and d3d11.dll don't get loaded until you call Direct3DCreate9 or D3D11CreateDevice respectively (at least that's my understanding). Having the two DLLs loaded at the same time shouldn't be a problem, but I would expect a storm of horrendously difficult-to-diagnose bugs to result from having a D3D9 device and a D3D11 device alive at the same time.

 

You have the right idea about creating a renderer interface that abstracts a D3D9 or D3D11 backend. Being able to switch between them at runtime is sort of a novelty that's not worth the effort, in my opinion. I've never worked on a game that didn't simply build a separate exe for the D3D9 and D3D11 distributions. My preference is to create multiple static libraries that implement the renderer interface in whatever API. The code in the actual exe project simply uses the renderer interface and is linked to the appropriate static lib at compile time based on a build configuration. Instantiating the appropriate renderer implementation is done with some #defines in your main.cpp or wherever else. The only problem with this setup is that you have to be vigilant about not injecting unnecessary #defines in other places, and you end up with multiple exes.

 

Also...Siri raises a good point. 



#6 imoogiBG   Members   -  Reputation: 1218


Posted 20 August 2013 - 01:28 AM

There is a HUGE difference between feature level and DirectX version...



#7 MJP   Moderators   -  Reputation: 11567


Posted 20 August 2013 - 01:30 AM

As far as I know there's nothing inherently dangerous about having your app load both the D3D9 and D3D11 DLLs. I've been doing this for years, since the PIX marker functions are exported from D3D9.DLL, and I don't believe it's been the cause of any problems. The old samples from the SDK also did the same thing.



#8 Hodgman   Moderators   -  Reputation: 30884


Posted 20 August 2013 - 01:38 AM

So in my own personal renderer, I want it to be able to support D3D9 and D3D11. To do this, I thought about having a master renderer interface that two classes will derive from. One will implement the D3D9 renderer and the other will implement the D3D11 renderer. However, this requires that I statically link to both D3D9 and D3D11, which will (I think) require that both dlls be present at run time. Is there any way I can avoid this?

I compile my game as two different executables - one for XP/D3D9, and one for D3D11.

Instead of using inheritance, I select the appropriate renderer classes at compile time, e.g.
#if defined(BUILD_WITH_D3D9)
  #include "d3d9renderer.h"
  typedef Renderer_D9 Renderer;
  typedef Texture_D9 Texture;
#elif defined(BUILD_WITH_D3D11)
  #include "d3d9renderer.h"
  typedef Renderer_D11 Renderer;
  typedef Texture_D11 Texture;
#else
  #error "no renderer type defined"
#endif


#9 Jason Z   Crossbones+   -  Reputation: 5147


Posted 20 August 2013 - 04:55 AM

As the others have already mentioned, there isn't any issue with linking to both statically.  You can load your renderer as part of a DLL and selectively choose which one to load on which platform.  You can even use them both at the exact same time in one application - I experimented with this back when D3D11 first came out: journal post.



#10 incertia   Crossbones+   -  Reputation: 779


Posted 20 August 2013 - 07:56 AM

You could design a system to load your renderer dynamically as a DLL. This way the DLL would be the only thing linking to the libraries in question; your host application/API would have the interface abstracted, so it wouldn't care about the dependencies. So, for example, you can detect whether the user is running XP and force-load the D3D9 renderer DLL, or, if they're not, load the D3D11 renderer DLL.

I don't think I will want to use this because, from what I've read, LoadLibrary and GetProcAddress don't really work well with C++ name mangling, so I'll have to extern "C" a lot of things, and I don't think that will work very well with namespaces. Although if I were to do this, what would be the best way to avoid rewriting declaration code (i.e. CreateDevice())? Should I just have a global header that doesn't belong to any specific Visual Studio project? Also, how would I load virtual functions from interfaces?
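
For reference, the usual shape of this (sketched here with made-up names) is one shared header that declares the abstract interface, and each DLL exporting a single extern "C" factory; only the factory needs an unmangled name, because the virtual functions are reached through the object's vtable rather than GetProcAddress:

// Renderer.h - shared header included by the host exe and every renderer DLL.
struct IRenderer
{
    virtual bool Initialize(void* windowHandle) = 0;
    virtual void RenderFrame()                  = 0;
    virtual ~IRenderer() {}
};

// RendererD3D11.cpp - built into the D3D11 DLL (the D3D9 DLL is analogous).
#include "Renderer.h"

class RendererD3D11 : public IRenderer
{
public:
    bool Initialize(void* windowHandle) { /* D3D11CreateDevice, swap chain, ... */ return true; }
    void RenderFrame()                  { /* issue draw calls */ }
};

// The only symbol resolved with GetProcAddress; extern "C" keeps the name unmangled.
extern "C" __declspec(dllexport) IRenderer* CreateRenderer()
{
    return new RendererD3D11();
}

One caveat with this layout: if the exe and the DLL end up using different CRTs, deleting the object on the exe side is risky, so it's common to also export a matching DestroyRenderer (or give the interface a Release method) so the module that allocated the object also frees it.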
 

I compile my game as two different executables - one for XP/D3D9, and one for D3D11.

If I were to use this method, I would need to build two versions of my DLL, one for D3D9 and one for D3D11. Additionally, I would need two versions of my executable, correct?


Edited by incertia, 20 August 2013 - 10:05 AM.

what

#11 Adam_42   Crossbones+   -  Reputation: 2561


Posted 21 August 2013 - 05:36 PM


If I were to use this method, I would need to build two versions of my DLL, one for D3D9 and one for D3D11.

 

You need zero DLLs. You can statically link the rendering code instead of dynamically linking it.

 

You'd probably also want three executables: the main one is the "Launcher", which probably has an options menu and a "launch game" button. The launch game button kicks off one of the other two exes (D3D9 or D3D11), depending on the settings and what the PC supports.
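
A bare-bones sketch of that launcher, with invented executable names:

#include <windows.h>

// Start the renderer-specific build chosen from the options screen / hardware detection.
bool LaunchGame(bool useD3D11)
{
    const char* exe = useD3D11 ? "Game_D3D11.exe" : "Game_D3D9.exe";

    STARTUPINFOA        si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};

    if (!CreateProcessA(exe, nullptr, nullptr, nullptr, FALSE, 0,
                        nullptr, nullptr, &si, &pi))
        return false;

    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return true;
}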



#12 andur   Members   -  Reputation: 613


Posted 21 August 2013 - 09:55 PM

Couldn't you just set the D3D9/D3D11 DLLs to be delay-loaded in your project settings? Then they will only be loaded when your application first tries to use a function from one of them.

 

You'd still have to check for their existence before initializing your renderer, or it'll crash when it tries to use one that doesn't exist.

 

(Note: I've never actually tried this with the D3D DLLs, but I don't see why it wouldn't work.)
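
For reference, a sketch of how that could look (I haven't tried it against the D3D DLLs either, and the Init functions are placeholders): the DLLs are marked for delay loading in the linker settings (e.g. /DELAYLOAD:d3d9.dll /DELAYLOAD:d3d11.dll plus delayimp.lib), and the app probes for them before initializing the matching renderer:

#include <windows.h>

// Returns true if the named DLL is present on the system.
bool IsDllAvailable(const wchar_t* name)
{
    // LOAD_LIBRARY_AS_DATAFILE avoids running the DLL's initialization just for the probe.
    HMODULE module = LoadLibraryExW(name, nullptr, LOAD_LIBRARY_AS_DATAFILE);
    if (!module)
        return false;
    FreeLibrary(module);
    return true;
}

// Decide before any delay-loaded D3D function is called:
// if (IsDllAvailable(L"d3d11.dll"))      InitD3D11Renderer();
// else if (IsDllAvailable(L"d3d9.dll"))  InitD3D9Renderer();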


Edited by andur, 21 August 2013 - 09:57 PM.


#13 Krohm   Crossbones+   -  Reputation: 3167


Posted 26 August 2013 - 12:19 AM


For example, if the dev wanted to run their application on Windows XP, they couldn't use the Direct3D 11 runtime because it won't work on XP.  
That's a big IF about now.

Personally, most computers I've dealt with that were still running XP had at least one problem that would make me run away screaming. I've fried an old XP machine; apparently I loaded it too much, causing... perhaps the voltage regulators to blow up?

If you really want to run on XP, I strongly suggest making sure everybody understands there's going to be no support involved. Personally, I really don't want to have to deal with anyone still running XP.



#14 Burnt_Fyr   Members   -  Reputation: 1245


Posted 26 August 2013 - 10:07 AM

 

You could design a system to load your renderer dynamically as a DLL. This way the DLL would be the only thing linking to the libraries in question; your host application/API would have the interface abstracted, so it wouldn't care about the dependencies. So, for example, you can detect whether the user is running XP and force-load the D3D9 renderer DLL, or, if they're not, load the D3D11 renderer DLL.

I don't think I will want to use this because, from what I've read, LoadLibrary and GetProcAddress don't really work well with C++ name mangling, so I'll have to extern "C" a lot of things, and I don't think that will work very well with namespaces. Although if I were to do this, what would be the best way to avoid rewriting declaration code (i.e. CreateDevice())? Should I just have a global header that doesn't belong to any specific Visual Studio project? Also, how would I load virtual functions from interfaces?

 

To see an example of this, look for the book 3D Game Engine Programming by Stefan Zerbst. You can also look at the ZFX Community Engine, which is based on the one presented in the book (https://github.com/kimkulling/zfxce2).





