HardwareVertexProcessing vs. SoftwareVertexProcessing - What's the difference?

Can anyone explain the difference between the various types of D3D devices we can create? What is the difference between creating a device of DeviceType.Hardware vs. DeviceType.Software? What is the difference between CreateFlags.SoftwareVertexProcessing, CreateFlags.HardwareVertexProcessing, and CreateFlags.MixedVertexProcessing? It seems that one would be standard TnL (software vp?), another would be using the programmable pipeline (hardware vp?), and mixed would be a combination of both.

ATS
Device types:

Software - None exists yet, but apparently someone is making one.
Hardware - These exist for each card. This is the normal setting.
Reference - A software implementation that ships only with the SDK and supports every feature. If it behaves differently from your hardware (for features your hardware supports), then the card or driver has a bug. Very, very slow; used only to test for possible driver/hardware bugs.

Vertex Processing:

Software - Hardware rasterization, but transformation, lighting, and shader emulation run on the CPU.
Hardware - Used when the hardware supports every shader you need (if you need any) and has hardware T&L (check Caps.DeviceCaps for the hardware T&L flag - see the sketch after this list).
Mixed - Some of your shaders can run in hardware, but some require software emulation. You switch with a render state in DX8, and with a function call in DX9. Switching is slow, so you should draw all hardware things first, then all software things, instead of switching many times per frame. It's better if you can find a way to use hardware only, even if that means settling for an approximate result on older cards.
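
For illustration, here is roughly what that caps check and device creation might look like in Managed DirectX - a minimal sketch, not production code; the adapter number, the present parameters, and the renderWindow control are placeholder assumptions:

    using System.Windows.Forms;
    using Microsoft.DirectX.Direct3D;

    static Device CreateDevice(Control renderWindow)
    {
        // Query the capabilities of the hardware device on the default adapter.
        Caps caps = Manager.GetDeviceCaps(0, DeviceType.Hardware);

        // Request hardware vertex processing only if the card reports
        // hardware T&L; otherwise fall back to software vertex processing.
        CreateFlags vpFlags = caps.DeviceCaps.SupportsHardwareTransformAndLight
            ? CreateFlags.HardwareVertexProcessing
            : CreateFlags.SoftwareVertexProcessing;

        PresentParameters presentParams = new PresentParameters();
        presentParams.Windowed = true;
        presentParams.SwapEffect = SwapEffect.Discard;

        return new Device(0, DeviceType.Hardware, renderWindow, vpFlags, presentParams);
    }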
name... - thanks for the info. I have one question though. I have been creating my device using hardware vertex processing. You say that hardware VP "...has hardware T&L..." - does this mean that I do not have to have a basic vertex shader which transforms vertices by concatenating view, projection, etc. in order to render properly (e.g. I can use device.Transform.Projection etc. to set up how vertices are transformed instead of doing all that in a VS)?

Basically my problem is this - I am attempting to render a texture-mapped quad using a triangle fan, and I am able to get the quad to be rendered, but the texture does not show up. I am pretty sure I am loading the texture properly and setting the texture state, so I was thinking maybe it was because hardware VP required that I write a shader which applies the texture. I am thinking this is not required. Correct? Thanks again.

ATS
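
For reference, the fixed-function route the question describes might look roughly like this in Managed DirectX - a minimal sketch, assuming device is your initialized Device; the camera and projection values are made up:

    // No vertex shader needed - the fixed pipeline performs the transforms.
    // Matrix and Vector3 live in the Microsoft.DirectX namespace.
    device.Transform.World = Matrix.Identity;
    device.Transform.View = Matrix.LookAtLH(
        new Vector3(0.0f, 0.0f, -5.0f),   // camera position
        new Vector3(0.0f, 0.0f, 0.0f),    // look-at target
        new Vector3(0.0f, 1.0f, 0.0f));   // up vector
    device.Transform.Projection = Matrix.PerspectiveFovLH(
        (float)Math.PI / 4.0f,            // vertical field of view
        1.0f,                             // aspect ratio
        1.0f, 100.0f);                    // near and far planes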
Do the vertices for your quad have u,v coordinates?
If so, I'm assuming you gave those u,v coordinates proper values for your quad?
I just realized the reason the texture wasn't visible was because lighting is on by default and I had no lights placed. Doh!

ATS
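
For anyone hitting the same problem: with lighting enabled but no lights defined, geometry renders black. A quick fix, sketched under the assumption that device is your initialized Device:

    // Simplest fix: disable lighting so vertex/texture colors are used directly.
    device.RenderState.Lighting = false;
    // (Keeping lighting on instead requires defining at least one light and a material.)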
Glad you found your problem. As to your question... No, you don't need a shader.

A series of RenderState, TextureStageState, and Lighting options have the same effect as vertex and pixel shaders.
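
As a rough sketch of what that means in practice - assuming device is your Device and texture is a Texture you have already loaded - a basic unlit, textured setup could be:

    // Fixed-function texturing: modulate the texture color with the vertex
    // diffuse color (this modulate setup is the fixed-function default).
    device.RenderState.Lighting = false;
    device.SetTexture(0, texture);
    device.TextureState[0].ColorOperation = TextureOperation.Modulate;
    device.TextureState[0].ColorArgument1 = TextureArgument.TextureColor;
    device.TextureState[0].ColorArgument2 = TextureArgument.Diffuse;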

On early T&L cards, such as the original GeForce256, there were no shaders, but there was hardware support for the fixed pipeline operations. Hardware would perform all vertex operations (lighting, UV transforms, UV generation, World, View, Projection, fogging, interpolation) that were previously handled on the CPU.

The introduction of shaders meant that you could have non-standard lighting, non-standard morphing, bone animation, custom UV mappings (e.g. using light intensity as a UV coordinate for toon shading), and more. Not only were things hardware accelerated, but you could decide how everything worked, within hardware limitations (instruction set, instruction limit, texture limits, color and UV coordinate interpolator limits, etc.).

Anyway, nope, you don't need a shader for hardware processing.

