Effect files, and managing textures/materials?

Started by
11 comments, last by Muhammad Haggag 19 years ago
I am currently writing code to use effects with my textures, instead of the old way of doing textures. So my question is, how do you manage and sort all this? I am currently importing my materials from 3ds Max; each material includes an index to a texture, and that texture has a slot for whether it's a bump map, diffuse... you get the idea.

Now my problem is how to use an effect file to handle all these cases: I might have a texture under every slot, or maybe only under a couple. How do I tell the effect file which texture to use under which slot (bump map, diffuse, etc.) when they can all be different? I know how to manually do each one, but I CAN'T manually create 8^8 effect files to handle every case, or am I gunna hafta? So is there a flag I can send to an effect file to tell it what type of texture it is, so I know how to handle it? Thanks --Brad
--X
There is no magical way to tell your shaders that Sampler #1 is a diffuse texture and that Operation X needs to be performed with it, and that Sampler #2 is a bumpmap texture and that Operation Y needs to be performed with it. So you do need to explicitly code (or use the FragmentLinker for) each case.

However, I think you should only hit the big ones, like diffuse + bumpmap, etc. You shouldn't code for every single combination, because most of the time you stick to a fairly uniform texture layout. Characters may have a diffuse map, specular map, and normal map, while terrain has multiple diffuse maps and a normal map. Stick to coding what you need, instead of worrying about what you may never use.
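For example, on the app side it can boil down to something like this (a rough sketch; the technique names, parameter names, and Material struct are just made up for illustration, not anything from the SDK):

#include <d3dx9.h>

struct Material
{
    IDirect3DTexture9* pDiffuse;   // may be NULL
    IDirect3DTexture9* pNormal;    // may be NULL
    IDirect3DTexture9* pSpecular;  // may be NULL
};

void ApplyMaterial(ID3DXEffect* pEffect, const Material& mat)
{
    // Pick the technique that matches the maps this material actually has
    if (mat.pNormal && mat.pSpecular)
        pEffect->SetTechnique("DiffuseNormalSpecular");
    else if (mat.pNormal)
        pEffect->SetTechnique("DiffuseNormal");
    else
        pEffect->SetTechnique("DiffuseOnly");

    // Bind whatever textures exist; samplers the technique never uses are simply ignored
    if (mat.pDiffuse)  pEffect->SetTexture("g_DiffuseTex",  mat.pDiffuse);
    if (mat.pNormal)   pEffect->SetTexture("g_NormalTex",   mat.pNormal);
    if (mat.pSpecular) pEffect->SetTexture("g_SpecularTex", mat.pSpecular);
}

You only need one branch per combination you actually ship, not one per combination that could theoretically exist.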
Dustin Franklin ( circlesoft :: KBase :: Mystic GD :: ApolloNL )
That makes a lot of sense, so basically I need to write a way for the (future) artists to be able to export their own through 3ds Max? I've also learned a little about fragmenting; is that the best choice?

Basically I'm kinda stuck between a rock and a hard place. I want to be able to support all of 3ds Max's material options, but I can't without writing for every single case? Is fragmenting slow? What are the stats on that?

Thanks for the help.
--X
Quote:Original post by xsirxx
Is fragmenting slow? whats the stats on that?

No, fragment linking isn't slow, because the only real time you are performing fragment-related routines is at load time, when you are creating your shaders. There is no performance difference once you get them linked. Some apps, like the sample in the SDK, even link shaders while the app is rendering, without any noticeable performance hit. This is cool, because you can create shaders on the fly, as you find out that you need them.

I'd suggest checking out the FragmentLinker sample, and see if it suits your needs.
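The load-time part is roughly this (a sketch from memory, assuming your fragments live in a file called fragments.hlsl; the exact overloads may differ slightly):

#include <d3dx9.h>

HRESULT GatherMyFragments(IDirect3DDevice9* pDevice, ID3DXFragmentLinker** ppLinker)
{
    ID3DXBuffer* pCompiledFragments = NULL;

    // Gather (compile) every fragment in the file into one buffer
    HRESULT hr = D3DXGatherFragmentsFromFile("fragments.hlsl", NULL, NULL, 0,
                                             &pCompiledFragments, NULL);
    if (FAILED(hr))
        return hr;

    // Create the linker and hand it the compiled fragments; linking into
    // actual shaders happens later, whenever you need a new combination
    D3DXCreateFragmentLinker(pDevice, 0, ppLinker);
    hr = (*ppLinker)->AddFragments((const DWORD*)pCompiledFragments->GetBufferPointer());

    pCompiledFragments->Release();
    return hr;
}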
Dustin Franklin ( circlesoft :: KBase :: Mystic GD :: ApolloNL )
I did a search on here for fragment shaders and not much appears. Why aren't people getting into fragmenting? It seems to be the best method, allowing your artists to still create shaders on the fly without a performance hit?

This way I can run a bunch of checks to build them, have it build the fewest number of shaders possible, and have each fragment also hold a lower-version variant for distance checking, all within the shaders. I can't see why any programmer would go another direction with this. OK, so anyhow, I need to check the SDK examples; is there any book or anywhere else to look? Thanks

BTW, I bought ShaderX3; it has a chapter on fragmented shaders. I'll post if it's any good...

[Edited by - xsirxx on March 30, 2005 6:38:01 PM]
--X
I have the exact same problem as you, except that I'm using Maya.

What I am trying at the moment is reading the .fx file semantics and looking at each "ParameterDescription" with this code:

// Get the j-th top-level parameter in the loaded Effect, then its description
EffectHandle handleParam = effect.GetParameter(null, j);
ParameterDescription desc = effect.GetParameterDescription(handleParam);

The description returns the "Type" and "Name" of the semantic/annotation.


With this info I'll be able to compare against the semantic names given by Maya in the .x file. Then I'm hoping to plug my .x file values into the .fx file annotations and maintain the same "Type".
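The native D3DX version of that enumeration looks roughly like this (a sketch in C++ rather than MDX, but the idea is the same; assumes the effect has already been created from the .fx file):

#include <d3dx9.h>

// Walk every top-level parameter of an ID3DXEffect and look at its description
void DumpEffectParameters(ID3DXEffect* pEffect)
{
    D3DXEFFECT_DESC fxDesc;
    pEffect->GetDesc(&fxDesc);

    for (UINT j = 0; j < fxDesc.Parameters; ++j)
    {
        D3DXHANDLE hParam = pEffect->GetParameter(NULL, j);

        D3DXPARAMETER_DESC desc;
        pEffect->GetParameterDesc(hParam, &desc);

        // desc.Name, desc.Semantic, and desc.Type are what you compare
        // against the names/types coming out of the exporter
    }
}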

V.
Well, I looked over the fragment code in the DirectX sample. It's confusing as all hell, BTW. And it looks like they run EVERYTHING all the time, just with dummy D3DXCOLOR(1.0, 1.0, 1.0, 1.0) defaults? That would be a total waste of GPU bandwidth, wouldn't it? So how do you compile only what you want out of fragments? For instance, their example has a Diffuse(), Specular(), etc., so how do I state that I want Diffuse() and Specular() from the .fx file to run?

Also, with fragments the inputs multiply depending on how many fragments you want: they declare ALL the inputs that only SOME fragments will use, so that the full combination of fragments has everything it needs. This is such a waste; I can start to see why people don't use them.

Of course, maybe I just freakin' suck at reading the heavily overloaded samples?
--X
Create a new mesh with the EffectInstance overload; this will contain some param info taken from the .x file.

With the info you just got, you can get the name of each param and the data corresponding to it. This works for any type as far as I know. It'll probably be an array of floats (matrix or vector) or a string... just .x file stuff.

You can do something similar with an .fx file. The major difference is that the param info will be in the TYPE you want, like Matrix, Texture, etc. (as opposed to a simple array of floats that has to be converted to a Matrix). Each parameter can carry a semantic and a set of annotations, e.g.:

texture diffuseTexture : DIFFUSE
<
string ResourceName = "nothing.jpg";
string ResourceType = "2D";
>;

here "texture" is the semantic and "ResourceName" and "ResourceType" are the annotations. The annotations is where you plug your data from the .x (notice all annotations are of type floats and string just like the data from an .x file. good)

Hope that clarifies things a little. BTW, I don't even know what fragments are... :D
Right, that will getcha a working shader filled with the correct info, but the problem is I'd have to create 8^8 shaders to handle what 3ds Max has BUILT-IN. OR I can use something like fragmented shaders and piece them together at run time to take care of only what is being used during that chunk load... This will also allow the artists to create and import as many as they want to use, and it should never get out of control. The only problem is, only one person on here knows about fragments :), and I can't seem to figure out Microsoft's sample well enough to build a working shader from fragments. I read the ShaderX3 chapter on fragmenting and they seem to build them from scratch, although Microsoft has already built most of it for us, if I can get past all the garbage in their samples down to the actual technique.

UPDATE:

I found where they fill the pData in the handles, so it seems they pick which parts to use by grabbing the handles they want from the FX file, then doing a LinkShader(...) with those handles only? Seems a little easier than I originally thought after grabbing a glass of wine.
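Something like this, I'd guess (a rough sketch continuing from the gather/AddFragments step earlier; pLinker is the linker created at load time, and the fragment names and vs_2_0 profile are just placeholders):

// Pick only the fragments this material actually needs...
D3DXHANDLE handles[2];
UINT count = 0;
handles[count++] = pLinker->GetFragmentHandleByName("Diffuse");
handles[count++] = pLinker->GetFragmentHandleByName("Specular");

// ...and link just those into a single vertex shader
IDirect3DVertexShader9* pShader = NULL;
pLinker->LinkVertexShader("vs_2_0", 0, handles, count, &pShader, NULL);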
--X
I think this is what you are looking for.

http://download.developer.nvidia.com/developer/SDK/Individual_Samples/DEMOS/Direct3D9/Dxsas.zip


The app handles many shaders on a mesh, and source is included. It's better than Microsoft's.

This topic is closed to new replies.
