The DX Question....

... or "Why one tool isn't always everything".

Following my last entry, Mr Mars999 seems a bit... shocked? at my pronouncement that I might be looking at DX at some point in the future.

Now, I'm reasonably well known for using OpenGL for my 3D needs, and don't get me wrong: I like it, and I like the direction the new API is taking. However, it never hurts to see what other tools can do for you.

I've never been an "OpenGL or death!" kind of person; in fact my differences with the API are mostly cosmetic (all the long capitalised names make me twitch slightly), and of course there was the cross-platform angle: at the time I started learning, I had this crazy idea to make Linux games as well as Windows ones...

I had, however, always said I'd learn D3D if I had a compelling reason to do so... or maybe just for the hell of it [wink], and the chance to work on the Xbox 360 hardware is a pretty compelling reason [grin]

I've got some limited experience with D3D; as I said in my previous entry, a D3D=>OGL conversion did give me some understanding, at least with basic drawing (it also threw up some odd speed issues, but that's an aside), and as I've got a solid grounding in 3D concepts, such a transition shouldn't be that hard.

Mr Mars999 also asks about the geometry shaders coming to OpenGL. Well, I know what you guys know (if you've been reading the presentations): that it's "coming soon" and on the list NV gave us at SIGGRAPH this year.

In fact, talking of which, aside from extending GLSL to support geometry shaders (which will be no use until D3D10-class hardware appears anyway), a couple of important things are on the horizon:

- conditional rendering
This concept rocks. My basic understanding is that you perform an occlusion query and then pass the result into the stream of data to render. If, when the hardware hits the query, the result is false, it'll skip the following section of stuff to be rendered and pick up at the next block. If the result is 'true', or still undefined by the time it renders, then it renders anyway. How can you not be looking forward to this functionality? It increases parallelism a fair amount, as you can coarsely sort your dataset and then let the gfx card work out what needs to be rendered and what doesn't.
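To make the idea concrete, here's a rough sketch of how this could look once it's exposed in GL, based on how occlusion queries already work (the conditional-render entry points and `drawBoundingBox`/`drawDetailedModel` helpers are assumptions on my part, not a shipped API):

```cpp
GLuint query;
glGenQueries(1, &query);

// Pass 1: draw a cheap proxy (e.g. a bounding box) with an
// occlusion query active, counting the samples that pass.
glBeginQuery(GL_SAMPLES_PASSED, query);
drawBoundingBox();                    // hypothetical helper
glEndQuery(GL_SAMPLES_PASSED);

// Pass 2: the expensive draw is predicated on the query result.
// "No wait" matches the behaviour described above: if the result
// isn't available by the time the hardware gets here, it renders
// anyway rather than stalling.
glBeginConditionalRender(query, GL_QUERY_NO_WAIT);
drawDetailedModel();                  // hypothetical helper
glEndConditionalRender();
```

The nice part is that nothing round-trips to the CPU: the query result stays on the card, so the skip/draw decision costs the app nothing.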

- Record transformed verts into buffers
I suspect this will come with D3D10 cards: the ability to basically grab the transformed vertices and resubmit them again, bypassing a whole transform stage. For multipass systems this will rock. To a degree it looks like a more modern compiled-vertex-array extension, in that once the data is transformed and lit the first time, you can cache the results for later.
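Again, a sketch of how I'd imagine the capture-and-resubmit flow working; `captureBuf` and `vertCount` are assumed to be set up elsewhere, and the specific entry points are my guess at the shape of the thing, not a published spec:

```cpp
// Pass 1: run the vertex pipeline, but capture the transformed
// vertices into a buffer object instead of rasterising them.
glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, captureBuf);
glEnable(GL_RASTERIZER_DISCARD);      // transform only, no pixels
glBeginTransformFeedback(GL_TRIANGLES);
glDrawArrays(GL_TRIANGLES, 0, vertCount);
glEndTransformFeedback();
glDisable(GL_RASTERIZER_DISCARD);

// Subsequent passes: resubmit the already-transformed vertices,
// skipping the whole vertex-transform stage.
glBindBuffer(GL_ARRAY_BUFFER, captureBuf);
// ... point the vertex attributes at captureBuf ...
glDrawArrays(GL_TRIANGLES, 0, vertCount);
```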

There are some other cool things coming, not to mention the OGL3.0 overhaul, so while I've got one eye on D3D I'm looking forward to OGL's future as well [smile]

GDNet.net London meet, T-13h 30mins and counting...
Recommended Comments

So you're finally coming over to the right side [grin]

Those last two things you mention as on the horizon for OpenGL sound very similar to "predicated rendering" and "stream out" for D3D10. Guess that's not too surprising - wouldn't make much sense for Nvidia/ATi not to expose the full feature set regardless of API...
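For comparison, the D3D10 side of predicated rendering looks something like the following sketch (error handling omitted; `device`, `DrawBoundingBox` and `DrawDetailedModel` are assumed to exist, and the exact predication semantics are from memory, so treat it as illustrative):

```cpp
// Create an occlusion predicate: its data is a BOOL saying whether
// any samples passed the depth/stencil tests.
D3D10_QUERY_DESC desc = {};
desc.Query = D3D10_QUERY_OCCLUSION_PREDICATE;
ID3D10Predicate *predicate = NULL;
device->CreatePredicate(&desc, &predicate);

// Pass 1: issue the cheap bounding-volume draw inside the predicate.
predicate->Begin();
DrawBoundingBox();                    // hypothetical helper
predicate->End();

// Pass 2: the expensive draw is conditioned on the predicate; the
// FALSE argument follows the usual skip-if-occluded pattern.
device->SetPredication(predicate, FALSE);
DrawDetailedModel();                  // hypothetical helper
device->SetPredication(NULL, FALSE);  // clear predication state
```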


Original post by jollyjeffers
So you're finally coming over to the right side [grin]


LOL, I tried it and found it tempting, but like I said in my journal posting, DIP() sucks, period. I hope it's better in DX10; hell, ATI and NV don't even look at the first two parameters, IIRC... Plus, is MS going to keep .x, move on, or drop it altogether? The main reason I wanted to switch was that DX was supposed to do so much for me; well, after I used it, everyone said "ah, no, make/use your own format for this and that, don't use .x". Well hell, now I'm down to just the math library of DX. So I grabbed a math lib and added a few things I wanted to it, and now things are great, seeing as I have to make up all my own anyway. Speed-wise, sorry, but my GL version is faster. Like I said, when DX10 comes out I may try it again and see if it's as fast, but DX9 isn't. What may end up happening (long shot) is that GL 3.0 could overtake DX someday for gaming. ATI/NV are saying GL will be faster on Vista! [grin]

