Do game developers still have any reason to support Direct3D 10 cards?


The D3D9-vs-D3D10/11 comparison and the D3D10-vs-D3D11 comparison are not exactly the same situation.

Supporting multiple DX versions means we need to aim for the lowest common denominator. This rules out performance optimizations that the oldest path cannot support (unless we spend a disproportionate amount of resources maintaining two completely different code paths).

This means a game designed specifically for D3D11 will be significantly more efficient than a game that also has to run on D3D10 and D3D9.

The D3D9 to D3D10 shift was a very peculiar one. It wasn't just performance: D3D10 introduced a few improvements that were very handy and easy to use for adding "extras" (note: these extras could easily have been backported to D3D9, but MS just wasn't interested).
For example: Z-buffer access, the Z-out semantic, sending to multiple RTTs via the geometry shader, access to individual MSAA samples, separate alpha blending, dynamic indexing in the pixel shader, real dynamic branching in the pixel shader.
All stuff that made certain post-processing effects much easier. It therefore became possible to offer DX10-only effects, like the ones you see in Bioshock, that could be turned on and off just to "spice up" the experience when you had a recent GPU running on Vista.
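To make the first of those concrete: here's a minimal D3D11-style sketch (D3D10 is nearly identical) of the "Z-buffer access" idea, where one typeless resource gets both a depth-stencil view for rendering and a shader-resource view for post-processing. The helper name and parameters are made up for illustration.

```cpp
#include <d3d11.h>

// Hypothetical helper: create a depth buffer that can also be sampled
// as a texture in later passes. The typeless format lets a depth-stencil
// view and a shader-resource view share one resource.
void CreateReadableDepth(ID3D11Device* device, UINT width, UINT height,
                         ID3D11DepthStencilView** outDSV,
                         ID3D11ShaderResourceView** outSRV)
{
    D3D11_TEXTURE2D_DESC td = {};
    td.Width            = width;
    td.Height           = height;
    td.MipLevels        = 1;
    td.ArraySize        = 1;
    td.Format           = DXGI_FORMAT_R24G8_TYPELESS; // typeless: both views work
    td.SampleDesc.Count = 1;
    td.Usage            = D3D11_USAGE_DEFAULT;
    td.BindFlags        = D3D11_BIND_DEPTH_STENCIL | D3D11_BIND_SHADER_RESOURCE;

    ID3D11Texture2D* tex = nullptr;
    device->CreateTexture2D(&td, nullptr, &tex);

    // View used for writing depth while rendering the scene.
    D3D11_DEPTH_STENCIL_VIEW_DESC dsvd = {};
    dsvd.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsvd.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2D;
    device->CreateDepthStencilView(tex, &dsvd, outDSV);

    // View used for reading depth in a post-processing shader.
    D3D11_SHADER_RESOURCE_VIEW_DESC srvd = {};
    srvd.Format              = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvd.ViewDimension       = D3D11_SRV_DIMENSION_TEXTURE2D;
    srvd.Texture2D.MipLevels = 1;
    device->CreateShaderResourceView(tex, &srvd, outSRV);

    tex->Release(); // the views keep the texture alive
}
```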

But moving from D3D10 to D3D11... there weren't many features introduced, but the few there were... oh boy, they were critical. Take Assassin's Creed Unity as an example: its frustum culling and dispatch of draws live in a compute shader! We're not talking about an effect you can turn on and off; we're talking about the bare bones of its rendering infrastructure depending on a feature unavailable on D3D10 cards. Supporting D3D10 cards could easily mean rewriting 70% or more of the entire rendering engine, which would also likely affect the asset pipeline and the map layout.
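Roughly, that pattern looks like this in D3D11 (a minimal sketch, not Unity's actual code; the shader and buffer names are illustrative):

```cpp
#include <d3d11.h>

// Minimal sketch of GPU-driven draw submission: a compute shader culls
// and writes the draw arguments into a GPU buffer, and the CPU issues
// the draw without ever reading the results back. "cullCS" and the
// buffer contents are hypothetical.
void DispatchCulledDraws(ID3D11DeviceContext* ctx,
                         ID3D11ComputeShader* cullCS,
                         ID3D11UnorderedAccessView* argsUAV,
                         ID3D11Buffer* argsBuffer)
{
    // Each thread group tests a batch of instances against the frustum and
    // writes {VertexCountPerInstance, InstanceCount, StartVertexLocation,
    // StartInstanceLocation} into argsBuffer through argsUAV.
    ctx->CSSetShader(cullCS, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &argsUAV, nullptr);
    ctx->Dispatch(64, 1, 1);

    // Unbind the UAV so the buffer can be consumed as draw arguments.
    ID3D11UnorderedAccessView* nullUAV = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUAV, nullptr);

    // The draw call reads the GPU-written arguments directly. The buffer
    // must have been created with the
    // D3D11_RESOURCE_MISC_DRAWINDIRECT_ARGS misc flag.
    ctx->DrawInstancedIndirect(argsBuffer, 0);
}
```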

There are only a few D3D11-only things that can be used to spice up the graphics while still allowing them to be turned off for D3D10; tessellation comes to mind.
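Such a toggle can hinge on the device's feature level (a minimal sketch, assuming a D3D11 device that may have been created at a 10_x level; the shader objects are hypothetical):

```cpp
#include <d3d11.h>

// Minimal sketch: bind the optional tessellation path only when the device
// was created at feature level 11_0 or above; D3D10-class hardware falls
// back to plain triangles.
void BindGeometryPath(ID3D11Device* device, ID3D11DeviceContext* ctx,
                      ID3D11HullShader* hs, ID3D11DomainShader* ds)
{
    if (device->GetFeatureLevel() >= D3D_FEATURE_LEVEL_11_0)
    {
        ctx->HSSetShader(hs, nullptr, 0);   // hull stage on
        ctx->DSSetShader(ds, nullptr, 0);   // domain stage on
        ctx->IASetPrimitiveTopology(
            D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
    }
    else
    {
        ctx->HSSetShader(nullptr, nullptr, 0);  // tessellation off
        ctx->DSSetShader(nullptr, nullptr, 0);
        ctx->IASetPrimitiveTopology(
            D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    }
}
```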


The D3D9 to D3D10 shift was a very peculiar one. It wasn't just performance: D3D10 introduced a few improvements that were very handy and easy to use for adding "extras" (note: these extras could easily have been backported to D3D9, but MS just wasn't interested).
For example:
1) Z-buffer access, 2) the Z-out semantic, 3) sending to multiple RTTs via the geometry shader, 4) access to individual MSAA samples, 5) separate alpha blending, 6) dynamic indexing in the pixel shader, 7) real dynamic branching in the pixel shader.

You're misremembering how groundbreaking D3D10 was. :)
Uptake of D3D10 was excruciatingly slow, as it didn't really introduce any killer new feature (geometry shaders did not live up to the hype), and it came with the lack of XP compatibility, which was a big deal at the time. In my experience, a lot of people seem to have stuck with D3D9 until D3D11 came out. Most of the stuff you mention is accessible from the D3D9 API:
1) D3D9-era GPUs had multiple different vendor-specific extensions for this, which was painful. D3D10-era GPUs all support the "INTZ" extension, sketched in code after this list (side note: you see a LOT of games that use the D3D9 API but list the GeForce 8800 -- the first NV D3D10 GPU -- as their min-spec; my guess is that being able to reliably read depth values is kinda important).
2/5/7) Are in the D3D9 core API.
3) is a D3D10 API feature, but its performance wasn't that great...
4) is a D3D10.1 API feature (requiring a compatible GPU), but wasn't put to great use until proper compute shaders appeared in D3D11. :)
6) is emulatable in D3D9, but requires a tbuffer-style texture fetch instead of a cbuffer (as shader constants weren't backed by buffers in D3D9).
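For reference, the INTZ hack from (1) looks roughly like this through the D3D9 API (a minimal sketch; dimensions and names are illustrative):

```cpp
#include <d3d9.h>

// Minimal sketch of the INTZ hack: create the depth buffer as a texture
// with the INTZ FOURCC format so it can later be sampled like any other
// texture.
void SetupReadableDepth(IDirect3DDevice9* device)
{
    const D3DFORMAT INTZ_FMT = (D3DFORMAT)MAKEFOURCC('I', 'N', 'T', 'Z');

    IDirect3DTexture9* depthTex = nullptr;
    device->CreateTexture(1280, 720, 1, D3DUSAGE_DEPTHSTENCIL, INTZ_FMT,
                          D3DPOOL_DEFAULT, &depthTex, nullptr);

    // Bind as the depth-stencil target while rendering the scene...
    IDirect3DSurface9* depthSurf = nullptr;
    depthTex->GetSurfaceLevel(0, &depthSurf);
    device->SetDepthStencilSurface(depthSurf);

    // ...and later read the very same resource in a post-process pass.
    device->SetTexture(0, depthTex);
}
```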

I've actually been toying with the idea of doing a D3D9/WinXP build of my current game, as a ten-years-too-late argument against all the D3D10 hype of the time. :D
You can actually do a damn lot with that API, albeit a lot less efficiently than the D3D11 API does it! I'd be able to implement a lot of the D3D11-era techniques... but with quite significant RAM and shader-efficiency overheads. Still, it would be fun to see all those modern techniques running badly on Windows XP (or as an XP app emulated in Wine!).


Actually, I did. I was going by the screenshots of Saints Row & Bioshock DX10-only features he posted:

  • Reflections. I guess they used a GS to write to multiple RTTs at once; otherwise it doesn't make sense for this to be DX10-only (from a technical point of view). While geometry shaders never lived up to their hype, that doesn't mean people didn't try; they probably didn't gain performance. But porting it to DX9 would mean creating two codepaths (one single-pass using a GS, another multi-pass without a GS). Note, however, that hybrids did actually improve performance: a hybrid would use instancing to multiply the geometry and a geometry shader to output to multiple RTTs while still being multi-pass. Instead of writing to all 6 faces in one pass, write to 3 faces in 2 passes, or 2 faces in 3 passes. This kind of leverage made it possible to find a sweet spot of performance improvement (see the cubemap setup sketched after this list).
  • Ambient occlusion: Clearly they're doing a dynamic loop in the pixel shader, which would explain why they'd need DX10. Or maybe they wanted Z-buffer access and didn't bother with the INTZ hack.
  • DirectX 10 detail surfaces: I suspect they mean multiple diffuse/normal textures overlaid on top of each other, taking advantage of texture arrays. Or maybe they enabled geometry shaders somewhere for extra effects, like on a wall or something.
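For the reflections bullet, the single-pass cubemap setup being guessed at would look roughly like this (a D3D11-style sketch, as D3D10 uses the same structures; names are illustrative):

```cpp
#include <d3d11.h>

// Hypothetical sketch: one render-target view spans all six cube faces,
// so a geometry shader can route each triangle to a face via
// SV_RenderTargetArrayIndex and render the whole cubemap in one pass.
ID3D11RenderTargetView* CreateCubeRTV(ID3D11Device* device, UINT size)
{
    D3D11_TEXTURE2D_DESC td = {};
    td.Width            = size;
    td.Height           = size;
    td.MipLevels        = 1;
    td.ArraySize        = 6;   // the six cube faces
    td.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    td.SampleDesc.Count = 1;
    td.Usage            = D3D11_USAGE_DEFAULT;
    td.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
    td.MiscFlags        = D3D11_RESOURCE_MISC_TEXTURECUBE;

    ID3D11Texture2D* tex = nullptr;
    device->CreateTexture2D(&td, nullptr, &tex);

    D3D11_RENDER_TARGET_VIEW_DESC rtvd = {};
    rtvd.Format                         = td.Format;
    rtvd.ViewDimension                  = D3D11_RTV_DIMENSION_TEXTURE2DARRAY;
    rtvd.Texture2DArray.FirstArraySlice = 0;
    rtvd.Texture2DArray.ArraySize       = 6; // GS picks the slice per primitive
    ID3D11RenderTargetView* rtv = nullptr;
    device->CreateRenderTargetView(tex, &rtvd, &rtv);

    tex->Release(); // the view keeps the texture alive
    return rtv;
}
```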

All of these features can definitely be done on DX9. But you have to admit that, if you're on the lazy side, they're much easier to implement on DX10 (or, like you said, doing them in DX9 would require more RAM or some other kind of overhead).

Like you said, DX10 wasn't that groundbreaking; but the features it added (which could easily have been backported to DX9 but weren't, save for vendor hacks like INTZ) allowed games to include "turn on / turn off" kinds of effects when running in DX10 mode.

Hi!

An interesting thread.

I use the Direct3D11 API, but with a minimum feature level of 9_3.

One comment: the 9_1 through 9_3 feature levels are the maximum available on some tablets.

For such devices it is a good idea to be backward compatible.
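A minimal sketch of that setup, assuming nothing beyond the standard D3D11 headers: list the feature levels in preference order and let the runtime pick the highest one the hardware supports.

```cpp
#include <d3d11.h>

// Minimal sketch: create a D3D11 device that falls back to feature
// level 9_3 on hardware that can't do more.
HRESULT CreateDeviceDownTo9_3(ID3D11Device** device,
                              ID3D11DeviceContext** context,
                              D3D_FEATURE_LEVEL* granted)
{
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,   // tablet-class minimum
    };

    // "granted" reports the level actually obtained; branch optional
    // features (e.g. compute, tessellation) on it at runtime.
    return D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        levels, sizeof(levels) / sizeof(levels[0]), D3D11_SDK_VERSION,
        device, granted, context);
}
```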

