# Help with DirectX/HLSL


## Recommended Posts

This is a crosspost from the .NET forum, but I can easily convert a DirectX solution to a Managed DirectX solution. I'm going nuts trying to get pixel shaders working correctly. I've searched all over the internet and even found some nice video tutorials, but I'm still having trouble.

Here's the situation: I have a game I'm working on, and I'm trying to add some fancy pixel shaders to make it look super cool. For instance, I have a shader file containing three techniques that each do something different to a model's color. It came straight out of my book, but I changed it to work with my own mesh. Here is the file: http://www.ethangar.com/shaders/shade.fx

The problem arises when I try to use two of the techniques. "TransformTextureNoBlue" works great, but "TransformTexture" and "TransformInverseTexture" make my entire mesh disappear. What is going on? Is my mesh broken? Am I forgetting some device parameters? This computer has a GeForce 6800 Ultra, so I know the card is perfectly capable of all these shaders, and I am running in Pure mode. Anyone have any hints as to what might be up? (Also, I once tried to bump map a box and ended up with a crazy semi-transparent mess of a box.)

Here is my messy code for the initialization of my device:
```csharp
this.AssociatedSprites = new ArrayList();
this.AssociatedBackgrounds = new ArrayList();
this.AssociatedInput = new ArrayList();
this.AssociatedMeshes = new ArrayList();

Direct3D.PresentParameters presentParams = new Direct3D.PresentParameters();
presentParams.Windowed = true;
presentParams.EnableAutoDepthStencil = true;
presentParams.AutoDepthStencilFormat = Direct3D.DepthFormat.D16;

DirectX.Direct3D.Caps hardware = DirectX.Direct3D.Manager.GetDeviceCaps(0, DirectX.Direct3D.DeviceType.Hardware);
if (hardware.VertexShaderVersion >= new Version(1, 1))
{
    // Default to software processing
    DirectX.Direct3D.CreateFlags flags = DirectX.Direct3D.CreateFlags.SoftwareVertexProcessing;

    // Use hardware if it's available
    if (hardware.DeviceCaps.SupportsHardwareTransformAndLight)
        flags = DirectX.Direct3D.CreateFlags.HardwareVertexProcessing;

    // Use pure if it's available
    if (hardware.DeviceCaps.SupportsPureDevice)
        flags |= DirectX.Direct3D.CreateFlags.PureDevice;

    // Create our device
    device = new DirectX.Direct3D.Device(0, DirectX.Direct3D.DeviceType.Hardware, this, flags, presentParams);
}
else
{
    // Create a reference device
    device = new DirectX.Direct3D.Device(0, DirectX.Direct3D.DeviceType.Reference, this,
        DirectX.Direct3D.CreateFlags.SoftwareVertexProcessing, presentParams);
}

device.RenderState.AlphaTestEnable = true;
device.RenderState.ReferenceAlpha = 0x08;
device.RenderState.AlphaFunction = Direct3D.Compare.GreaterEqual;

device.SetTextureStageState(0, Direct3D.TextureStageStates.AlphaOperation, true);
```

Also, another shader I'd really like to get working is one I found in RenderMonkey that has a nice ooze effect. You can find the stuff here: http://www.ethangar.com/shaders/ooze.rar Basically, I want to replace the sphere mesh that the ooze is being rendered on with a simple box mesh or a flat surface (an ocean of ooze, as it were). If anyone could help me modify it for that and get it working in a Managed DirectX project, I would really appreciate it.

##### Share on other sites
1) In your device setup, you have alpha testing enabled and have the comparison set to only pass pixels with an alpha value of greater than or equal to 8. Alpha testing operates on the alpha value returned from your pixel shader.

2) If the alpha channel of your texture is 0, then the "TextureColor" pixel shader will output an alpha value of 0 and so will fail the alpha test and not appear.

Your "TextureColorOnlyBlue" pixel shader explicitly sets the alpha channel of the output colour to 0, so will also fail the alpha test and the mesh won't appear.

If the alpha channel of the texture being passed to the "InverseTextureColor" pixel shader is 1.0, or the texture doesn't have an alpha channel (the GPU sets a=1 for textures without alpha channels), then the shader will compute 1.0f - 1.0f = 0 for alpha, again failing the alpha test, so the mesh won't appear.

Finally, the shader that does work, "TransformTextureNoBlue", will only work if the texture used with it has its alpha channel set to 1, or has no alpha channel at all (in which case the GPU automatically sets a=1).

If your bump mapped box used a four-component dot product, then the alpha value output from the shader probably ended up dependent on the amount of light received by that pixel - which would once again cause strange looking results with alpha test enabled.

3) So you either need to disable alpha test if you don't need it - or be more explicit with the alpha values you output from your pixel shaders!

4) Although the ooze effect specifies a mesh to load for RenderMonkey, you should be able to simply load your own mesh through code and then just apply the effect in the same way you apply your own effects; the only things that probably won't work are RenderMonkey specific UI settings. If I get a chance tonight, I'll take a look at the shader in more detail to see if there are any things to watch out for.
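To illustrate point 3, here's a hedged sketch in HLSL of forcing the alpha output explicitly (the sampler and function names are made up for illustration - they don't come from the actual shade.fx):

```hlsl
// Sketch only: names are illustrative, not from shade.fx.
sampler TextureSampler;

float4 InverseTextureColor(float2 texCoord : TEXCOORD0) : COLOR0
{
    float4 color = tex2D(TextureSampler, texCoord);

    // Invert the colour channels...
    float4 result = 1.0f - color;

    // ...but force alpha to 1 so the pixel always passes an
    // alpha test of "alpha >= reference value".
    result.a = 1.0f;
    return result;
}
```

The alternative fix is on the application side: simply don't set `device.RenderState.AlphaTestEnable = true;` (or set it to `false`) if you don't actually need alpha testing.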

##### Share on other sites
Ah, thanks. I'll try explicitly setting the alpha channel to 1 when messing with the colors from now on.

Secondly, I am going over the HLSL scripts in detail, and my lack of understanding of shaders is really coming out. The RenderMonkey-generated HLSL seems to define a bunch of strings that point to the actual meshes but are seemingly not used anywhere. Also, the meshes used are 3ds files; I am used to exporting the mesh from 3D Studio Max to the DirectX .X format.

- Is there any way for HLSL to just load and use the 3DS files as statically defined in the .fx file?

- If I loaded a mesh in the file, what would my render loop look like? I'm used to starting the pass loop, beginning the effect pass, drawing the mesh subset, ending the pass, and so on. If I have no mesh subset to draw, what happens?

- The same question applies to textures. In theory, couldn't I texture a mesh entirely from HLSL instead of the mesh file itself pointing to a texture? I tried this, but it is messing something up.
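For reference, the pass loop I'm used to looks roughly like this in Managed DirectX (the `effect` and `mesh` variable names are placeholders from my own code, so treat this as a sketch, not a definitive implementation):

```csharp
// Sketch of a typical Managed DirectX effect-pass loop.
// "effect" is a loaded Direct3D.Effect, "mesh" a loaded Direct3D.Mesh;
// the subset index is illustrative.
int passes = effect.Begin(0);
for (int pass = 0; pass < passes; pass++)
{
    effect.BeginPass(pass);
    mesh.DrawSubset(0);   // what do I draw here if the mesh is defined in the .fx file?
    effect.EndPass();
}
effect.End();
```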

Edit: One more thing - there is a variable I need to define in the RenderMonkey ooze fx file. It is:

"float4x4 matViewProjection : ViewProjection;"

Is that my view matrix multiplied by my projection matrix, or is it just a misnomer for View * Proj * World, as is common? I'm so confused!