
TLM : The Falling out with D3D...

_the_phantom_

It was going so well: things were compiling, I had a plane on screen and no resource leaks. OK, so my shaders weren't working, but that was a minor problem (well, a typo, as it turns out).

But, as I've used it I've started to fall out with D3D; the biggest problem is that whatever I'm doing is broken and I've no idea why.

I've got "if(FAILED()) { }" around ALL my drawing calls and nothing is turning up. Switching D3D to debug mode with all boxes checked and max debug output gives me a couple of warnings about a VB not being write-only and two informational messages about my index buffers not being supported in hardware or some such. Neither of these is a show-stopping error, yet things still aren't doing what I expect.

The theory is that I'm pulling two vertex streams from two blank textures (created, bound as render targets, cleared and then unbound again); the practice is that instead of the lovely flat plane I was expecting (from a height map which should be zero) I get a jagged mess. Swapping it to visualise as colours by writing to the red channel gives me a series of red smudges which get bigger as you move from bottom left to top right. Visualising the normal buffer, which should also be black, as three colour components gives a black bottom left with the top right being white and the top and left sides being green and purple (iirc).

There is no way in hell this makes any sense, so it's cheesed me off somewhat, as I've no errors but it's not remotely playing ball. I guess I'll have to try visualising it as a normal texture (I'm producing texture coords as well as vertex coords) and see what I get; if that comes out black then I might just cry.

I'm also not getting along too well with various D3D concepts:
- The render targets seem like a joke when coming from OGL's FBO.
- .fx/HLSL and me have had a barney due to a slight case of unclear docs. Oh, and the semantics crap is just that: crap. GLSL's 'bind stream to varying' system is MUCH better imo. It also doesn't lead to dumb situations where the compiler can't tell the difference between an incoming stream and an outgoing one (POSITION semantic on SM3.0, I'm looking at you...) because you have certain things you have to write to.
- Vertex declarations make me cry. The fact you have to create them separate from the point of use just makes fiddling a small nightmare. OK, great, I've said that stream 0 will have two vec2s and that the 2nd has a certain offset in the buffer; now explain to me why, having declared this earlier and set said declaration as being used, I then have to resubmit the stride for the thing? Surely this could have been set at declaration creation time? Heck, an overload which does the work for me based on the current declaration would have been nice. But no, if I want to change the vertex format for testing I have to remember to edit one extra place.

Knock OGL's bind system all you like; at least you don't end up doing things in two locations and repeating yourself with it.

That said, some of the stuff behind the .fx system I do like, and an .fx-a-like for OGL would be nice... just preferably without the nonsense I encountered when trying to use it with HLSL.

I'm going to have one, maybe two, more attempts at working out how to debug this (anyone know of a system whereby, each frame, a copy of all the textures and the framebuffer can be grabbed? I'm thinking something like glIntercept but for D3D) and if I can't get it to work then I'm running back to OGL and my own windowing framework; at least I know how that works.

D3D was only winning over OGL for this project because of its:
- R2VB
- DXUT GUI
- Novelty factor of learning a new API

However, ATI now have PBO in their drivers, which nullifies the R2VB advantage, and the novelty factor is all well and good when you don't have a deadline and things aren't throwing strangeness at your head. The lack of a GUI for the final product is a tad annoying but, worst comes to worst, I'll just do what the previous person did and embed the renderer into a Win32 app, or make a second window to control it... assuming I can't make one of the free GUIs play ball in a sane amount of time.

My rantings are done, D3D can bite me, I'm going to bed.


5 Comments


Recommended Comments

You'll want to take a look at PIX. It can capture all calls to D3D, and more importantly play them back one at a time afterwards. During which you can view pretty much everything in VRAM (buffers, textures, state, etc).

Ah yes, the same issues with DX I had when I was tempted by the dark side! I think DX is OK, but not as good as GL, and with the new version of GL coming, GL is going to so rock and slap DX to the ground. I agree 100% with all the ranting phantom stated. .fx files would be nice for GL, but not a huge issue. I hated the vertex streams vs. the GL style too. I also hate HLSL syntax vs. GLSL, and I don't care for all the matrix ideas you have to get your head around vs. GL's just-call-a-function approach. GL just gets to the point for you. Well, I hope you jump back to GL! :) Hey, if you want a GUI, check out JavaCoolDude's GUI he used for his demos at Nvidia...

How could I let this opportunity pass by [grin]

Use PIX to step through some iterations of your PS - it's not really an error to sample the wrong location, but it does generate many of the results you're describing. The compiler and debug runtimes won't pick this sort of thing up...

Quote:
the render targets seem like a joke when coming from OGL's FBO
I've no knowledge of FBOs (haven't gotten around to reading your article [oh]); RTTs can be a bit verbose with the defensive copies and so on, but I don't see the problem... They're sufficiently simple that nothing much could go wrong [smile]

Quote:
Vertex declarations make me cry
Yeah, they ain't so pretty at times. Try FVFs though [evil]

On the plus side, with a decent FX system you can abstract out the decls such that you code it once and ignore them...

Cheers,
Jack
