Questions Implementing RenderMonkey DX9 Effect

My goal: apply a RenderMonkey effect (ooze.rfx) to a cube or some other mesh.

My status: SNAFU.

The files: http://www.ethangar.com/jacob/Ooze2.zip (1.1 MB; contains the .fx file and the .rfx file, along with the textures and mesh).

I'm in way over my head, but I really need some help implementing this effect. If anyone would go above and beyond the call of duty and actually implement this effect in a Managed DirectX project, I would be more than happy to compensate you for the time spent tutoring me (contact information is in my profile). If that is too much work, perhaps the following questions can be answered:

1. There are really only two matrices that RenderMonkey leaves as global values to assign. One is "matView", which I assume is the view matrix, and the other is "matViewProj", which I am assuming is the view matrix multiplied by the projection matrix. Is that correct?

2. If so, why don't I need a world matrix to set the location of my object? Why does everything else use matWorldViewProj while this effect uses only view and view-projection?

3. I have a few textures I need to set as values in my .fx file. How do I properly load these textures in Managed DirectX? The first one is a DDS file named "NoiseVolume.dds". The other two seem to be generated by RenderMonkey. One is a height map that is annotated by RenderMonkey like this:

texture heightTexture_Tex : RenderColorTarget
<
   float2 RenderTargetDimensions = {128,128};
   string Format="D3DFMT_X8R8G8B8";
   float  ClearDepth=1.000000;
   int    ClearColor=-16777216;
>;
The other is a bump map that is annotated the same way. How do I generate those in DirectX?

4. My book on shaders mentions the availability of software-emulated pixel and vertex shader 3.0 profiles. Is it safe to use those as a fallback should someone have an older video card, or do they murder the CPU?

When I get home, I'll try to post some screenshots of the horrible Frankenstein creation I got from trying to implement this the last time. It may help you all understand some of the places where I am messing up.
I'll try to answer a couple of your questions:
Quote:Original post by Ethangar

1. There are really only two matrices that RenderMonkey leaves as global values to assign. One is "matView", which I assume is the view matrix, and the other is "matViewProj", which I am assuming is the view matrix multiplied by the projection matrix. Is that correct?

Yes, indeed.

Quote:
2. If so, why don't I need a world matrix to set the location of my object? Why does everything else use matWorldViewProj while this effect uses only view and view-projection?

Because you'll need to multiply the vertex position by the combined World*View*Projection matrix to transform it into screen coordinates. You could, of course, send all of these matrices separately to the vertex shader, but then you would have to multiply them together for every vertex.
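
The usual pattern in Managed DirectX, sketched below, is to build that product once per frame on the CPU and upload a single matrix. Note that the parameter name "matWorldViewProjection" here is hypothetical; the ooze effect only exposes matView and matViewProjection.

// Minimal sketch: combine world, view and projection once per frame on
// the CPU, then upload one matrix to the effect.
// "matWorldViewProjection" is a hypothetical parameter name, not one
// the ooze effect actually declares.
Matrix world = Matrix.RotationYawPitchRoll(-0.5f, 0.5f, 0.0f) * Matrix.Translation(0, -150, -500);
Matrix wvp = world * viewMatrix * projMatrix; // MDX matrices multiply left to right
effect.SetValue("matWorldViewProjection", wvp);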

Quote:
3. I have a few textures I need to set as values in my .fx file. How do I properly load these textures in Managed DirectX? The first one is a DDS file named "NoiseVolume.dds". The other two seem to be generated by RenderMonkey. One is a height map that is annotated by RenderMonkey like this:

texture heightTexture_Tex : RenderColorTarget
<
   float2 RenderTargetDimensions = {128,128};
   string Format="D3DFMT_X8R8G8B8";
   float  ClearDepth=1.000000;
   int    ClearColor=-16777216;
>;


The other is a bump map that is annotated the same way. How do I generate those in DirectX?

You need some kind of system which parses semantics and annotations and binds data to them. Look at NVIDIA's developer site; they have an example DXSAS implementation written in C#/MDX. That's one way of doing it.
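
As an illustration of the binding step, here is a minimal Managed DirectX sketch that creates a render target matching the RenderColorTarget annotation above (128x128, X8R8G8B8) and redirects rendering into it. This is one plausible way to honor that annotation, not code from the DXSAS sample.

// Minimal sketch: a 128x128 X8R8G8B8 render-target texture to back the
// "heightTexture_Tex : RenderColorTarget" annotation shown above.
Texture heightTarget = new Texture(device, 128, 128, 1,
    Usage.RenderTarget, Format.X8R8G8B8, Pool.Default);

Surface oldTarget = device.GetRenderTarget(0);
using (Surface rt = heightTarget.GetSurfaceLevel(0))
{
    device.SetRenderTarget(0, rt);
    // ClearColor -16777216 is 0xFF000000, i.e. opaque black.
    device.Clear(ClearFlags.Target, Color.Black, 1.0f, 0);
    // ... draw the pass that fills the height map here ...
}
device.SetRenderTarget(0, oldTarget); // restore the back buffer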
Quote:
4. My book on shaders mentions the availability of software-emulated pixel and vertex shader 3.0 profiles. Is it safe to use those as a fallback should someone have an older video card, or do they murder the CPU?
There is a reference Direct3D device, but it is only available to those who have the SDK installed, and it's really, really slow. It's meant just for testing advanced features when you don't have hardware support for them.
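
For completeness, a minimal sketch of creating the reference rasterizer in Managed DirectX; "renderForm" is an assumed Windows Forms control, and this is only worth doing for debugging:

// Minimal sketch: create the reference (software) device for debugging.
// Far too slow to ship to end users.
PresentParameters pp = new PresentParameters();
pp.Windowed = true;
pp.SwapEffect = SwapEffect.Discard;

Device refDevice = new Device(0, DeviceType.Reference, renderForm,
    CreateFlags.SoftwareVertexProcessing, pp);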

I hope this helps. [smile]
Quote:
Because you'll need to multiply the vertex position by the combined World*View*Projection matrix to transform it into screen coordinates. You could, of course, send all of these matrices separately to the vertex shader, but then you would have to multiply them together for every vertex.


But that is the whole issue: you never pass the world matrix in any form. You don't pass it individually, and there is no WorldViewProj variable to pass it to. Is there some way one is expected to transform it AFTER the shader pass has run?

Quote:
You need some kind of system which parses semantics and annotations and binds data to them. Look at NVIDIA's developer site; they have an example DXSAS implementation written in C#/MDX. That's one way of doing it.


Well, I can use RenderMonkey to create the files, so I guess I don't need to do that. But is there some special way I need to load them? Same with the NoiseVolume: can I simply use the TextureLoader.FromFile(device, "path/file.dds") function and expect it to work, or do I need to load it with special parameters matching those in the annotations? FromFile has a bunch of overloads, and I am not sure which one I need.

Quote:There is a reference Direct3D device, but it is only available to those who have the SDK installed, and it's really, really slow. It's meant just for testing advanced features when you don't have hardware support for them.


Well, there are the devices, but there was also something like ps_3_0_sw and vs_3_0_sw that I can use in the .fx files to target some sort of software shader implementation. Does that really exist?

Quote:I hope this helps. [smile]


Yes, it did; thanks so much for taking the time to reply.
Quote:Original post by Ethangar
But that is the whole issue: you never pass the world matrix in any form. You don't pass it individually, and there is no WorldViewProj variable to pass it to. Is there some way one is expected to transform it AFTER the shader pass has run?

There is no transforming done after the vertex shader; it does all of that. You can pass the world matrix of the currently rendered object to the vertex shader, but you still also need the view and projection matrices. The vertex shader is supposed to transform vertices from model space all the way to screen space, so all of those are needed.

Quote:
Well, I can use RenderMonkey to create the files, so I guess I don't need to do that. But is there some special way I need to load them? Same with the NoiseVolume: can I simply use the TextureLoader.FromFile(device, "path/file.dds") function and expect it to work, or do I need to load it with special parameters matching those in the annotations? FromFile has a bunch of overloads, and I am not sure which one I need.

I meant that you must read those annotations in your application. They are just metadata associated with effects; you can use them to attach any info you want.
For example, if you have
texture diffuse : DiffuseTexture
<
   string filename = "some_texture.dds";
>;

you can use Effect.GetAnnotation to read the "filename" annotation and get its value. Then you can use that value to load your texture map normally. Check the SDK docs for more details.
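
A minimal sketch of what that might look like in Managed DirectX, assuming the parameter name "diffuse" from the snippet above:

// Minimal sketch: read the "filename" annotation from the "diffuse"
// parameter, then load and bind the texture it names.
EffectHandle param = effect.GetParameter(null, "diffuse");
EffectHandle annotation = effect.GetAnnotation(param, "filename");
string fileName = effect.GetValueString(annotation);

Texture diffuseTex = TextureLoader.FromFile(device, fileName);
effect.SetValue(param, diffuseTex);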

EDIT: A couple of days ago there was a thread about annotations here. You may want to check that out, too.
Quote:
Well, there are the devices, but there was also something like ps_3_0_sw and vs_3_0_sw that I can use in the .fx files to target some sort of software shader implementation. Does that really exist?


Those names are used in RenderMonkey; they refer to the reference device's software shaders.
Quote:
There is no transforming done after the vertex shader; it does all of that. You can pass the world matrix of the currently rendered object to the vertex shader, but you still also need the view and projection matrices. The vertex shader is supposed to transform vertices from model space all the way to screen space, so all of those are needed.


Yeah, that is what I thought, but for whatever reason RenderMonkey didn't make me use (or create) a matWorld or matWorldViewProj variable, and I am trying to understand the logic behind that. Would it be possible to render shaders on a mesh without passing a world matrix somehow? Perhaps if I transformed the object before shading it (though I would still think the shader needs to know where the mesh is to calculate normals and such). All of the examples I have ever followed have had one, and the RenderMonkey-generated .fx file doesn't call for a world matrix.
Quote:Original post by Ethangar

Yeah, that is what I thought, but for whatever reason RenderMonkey didn't make me use (or create) a matWorld or matWorldViewProj variable, and I am trying to understand the logic behind that. Would it be possible to render shaders on a mesh without passing a world matrix somehow? Perhaps if I transformed the object before shading it (though I would still think the shader needs to know where the mesh is to calculate normals and such). All of the examples I have ever followed have had one, and the RenderMonkey-generated .fx file doesn't call for a world matrix.

You could use pretransformed vertices, but then you would have to transform all of them manually in software. When using pretransformed vertices, the vertex shader is not executed at all.
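
For reference, a minimal Managed DirectX sketch of pretransformed vertices; positions are given directly in screen space (x, y, z, rhw), so no vertex shader or transform pipeline runs:

// Minimal sketch: pretransformed (screen-space) vertices. X and Y are
// in pixels, Z is the depth-buffer value, Rhw is 1/w.
CustomVertex.TransformedColored[] verts = new CustomVertex.TransformedColored[3];
verts[0] = new CustomVertex.TransformedColored(100f, 100f, 0.5f, 1f, Color.Red.ToArgb());
verts[1] = new CustomVertex.TransformedColored(300f, 100f, 0.5f, 1f, Color.Green.ToArgb());
verts[2] = new CustomVertex.TransformedColored(200f, 300f, 0.5f, 1f, Color.Blue.ToArgb());

device.VertexFormat = CustomVertex.TransformedColored.Format;
device.DrawUserPrimitives(PrimitiveType.TriangleList, 1, verts);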

Here is some more information on my problem. The .fx file I am using can be found here: http://ethangar.com/jacob/ooze/Ooze.fx

Notice how it lacks any sort of call for a world matrix.

Here is an image of what I should be getting:


Here is what I am getting:



Here is where I load the shader:
effect = Effect.FromFile(device, @"../../Ooze/Ooze.fx", null, ShaderFlags.None, null);
effect.Technique = "DX_Ooze";


Here is where I load the mesh and textures:
mesh = Mesh.FromFile("../../Ooze/box.X", MeshFlags.Managed, device);
noiseTexture = TextureLoader.FromFile(device, "../../Ooze/noiseVolume.dds");
heightTexture = TextureLoader.FromFile(device, "../../Ooze/heightTexture.dds");
bumpTexture = TextureLoader.FromFile(device, "../../Ooze/bumpTexture.dds");


Here is where I update the world matrix and set the view and projection matrices:
private void UpdateWorld()
{
    device.Transform.Projection = projMatrix;
    device.Transform.View = viewMatrix;

    worldMatrix = Matrix.RotationYawPitchRoll(-0.5f, 0.5f, 0.0f) * Matrix.Translation(0, -150, -500);
    device.Transform.World = worldMatrix;

    effect.SetValue("heightTexture_Tex", heightTexture);
    effect.SetValue("fTime0_1", angle);
    effect.SetValue("NoiseVolume_Tex", noiseTexture);
    effect.SetValue("bumpTexture_Tex", bumpTexture);
    effect.SetValue("matView", viewMatrix);
    effect.SetValue("matViewProjection", viewMatrix * projMatrix);
}


Finally, the paint event where I draw the mesh:
protected override void OnPaint(System.Windows.Forms.PaintEventArgs e)
{
    device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, Color.Black, 1.0f, 0);
    UpdateWorld();

    device.BeginScene();

    int numPasses = effect.Begin(0);

    for (int i = 0; i < numPasses; i++)
    {
        effect.BeginPass(i);
        mesh.DrawSubset(0);
        effect.EndPass();
    }
    effect.End();

    device.EndScene();
    device.Present();
    this.Invalidate();
}


Where am I going wrong?

