best way to load a texture

meshTextures = TextureLoader.FromFile(dev, @"..\..\xfiles\" + materials.TextureFilename, D3DX.Default, D3DX.Default, D3DX.Default, 0, Format.Unknown, Pool.Default, Filter.Point, Filter.Point, 0);

Hi guys, above is how I load my textures. The problem is that I am getting bad resolution in my game, and I am not getting soft edges on my characters. The device is configured properly, like in the SDK samples, so I think the problem is in my texture loading. How can I determine the best way to load textures, with the best parameters? Above I used the default values, point filtering, Format.Unknown and the default pool. Also, one more thing: when I use Pool.Managed the memory usage becomes very high, while with Pool.Default the memory is normal. What is the difference between them?
Wow, I just load using:

this.texture = TextureLoader.FromFile(device, filename);

Everything looks fine.
Quote:Original post by jad_salloum
Also, one more thing: when I use Pool.Managed the memory usage becomes very high, while with Pool.Default the memory is normal. What is the difference between them?

Placing the texture in the managed pool causes an uncompressed(?) copy of the texture to reside in system memory. This behaviour enables Direct3D to transparently re-create the texture if the device is lost (i.e. the data is transferred from system/AGP memory back to video memory and does not need to be reloaded from disk).
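Roughly, the difference looks like this (an untested sketch against MDX 1.1, same parameter order as the call in the first post; "path" is just a placeholder for your texture filename):

// Pool.Managed: Direct3D keeps a system-memory backup of the texture, so total
// memory use is higher, but the texture survives a lost device automatically.
Texture managed = TextureLoader.FromFile(dev, path, D3DX.Default, D3DX.Default, D3DX.Default, 0, Format.Unknown, Pool.Managed, Filter.Point, Filter.Point, 0);

// Pool.Default: the texture lives in video memory only, so less system memory
// is used - but you must Dispose() it and reload it yourself after a device reset.
Texture unmanaged = TextureLoader.FromFile(dev, path, D3DX.Default, D3DX.Default, D3DX.Default, 0, Format.Unknown, Pool.Default, Filter.Point, Filter.Point, 0);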


I'm not entirely sure about the managed pool, but looking at the Filter.Point you have there, I would try switching the Filter parameters to Filter.Linear. Using point filtering will make the texture look pixelated.
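In the call from the first post that would be something like this (untested - just the two Filter.Point arguments swapped for Filter.Linear):

meshTextures = TextureLoader.FromFile(dev, @"..\..\xfiles\" + materials.TextureFilename, D3DX.Default, D3DX.Default, D3DX.Default, 0, Format.Unknown, Pool.Default, Filter.Linear, Filter.Linear, 0);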

Also, from the DirectX Docs:
Quote:Managed - Resources are copied automatically to device-accessible memory as needed. Managed resources are backed by system memory and do not need to be re-created when a device is lost. See Managing Resources.
Quote:Original post by Flimflam
I'm not entirely sure about the managed pool, but looking at the Filter.Point you have there, I would try switching the Filter parameters to Filter.Linear. Using point filtering will make the texture look pixelated.

The filtering params specified in TextureLoader.FromFile() and D3DXCreateTextureFromFileEx() are for D3DX only - they don't have any bearing on the actual rendering. They exist for cases where the final dimensions have to differ from those of the source file - the filtering affects how the image is resized on the way from file to memory.
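If the characters look pixelated at draw time, the filtering that matters is the sampler state you set on the device before rendering - something like this (MDX 1.1, from memory, so double-check the docs):

// Load-time filters only affect the file -> texture resize; these states
// control how the texture is sampled when you actually draw with it.
device.SamplerState[0].MinFilter = TextureFilter.Linear;
device.SamplerState[0].MagFilter = TextureFilter.Linear;
device.SamplerState[0].MipFilter = TextureFilter.Linear; // needs a mip chain to do anything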

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

Quote:Original post by jollyjeffers
The filtering params specified in TextureLoader.FromFile() and D3DXCreateTextureFromFileEx() are for D3DX only - they don't have any bearing on the actual rendering. They exist for cases where the final dimensions have to differ from those of the source file - the filtering affects how the image is resized on the way from file to memory.

hth
Jack


Well, I was pretty confused by the lack of dimensions specified. Do I take it that the managed texture loader thing automatically uses the image's dimensions? Suppose I was just confused a bit. :)
Quote:Original post by Flimflam
Well, I was pretty confused by the lack of dimensions specified. Do I take it that the managed texture loader thing automatically uses the image's dimensions? Suppose I was just confused a bit. :)

My understanding is that for the shorter form (TextureLoader.FromFile(device, filename)) it'll effectively call the extended form with D3DX.Default for the height/width.

In most cases it will result in a texture that is the same size in memory as it was when stored on disk. However, on devices that require power-of-two (2^n) dimensions it'll get resized.

If it's likely to cause a problem, the best thing to do is to either request the image information (not sure of the MDX code) from the load call, or to query the top-level surface dimensions of the returned texture.
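For example, something like this (an untested MDX sketch - the exact method names may be slightly off, so check the docs):

// Read the on-disk dimensions without creating the texture...
ImageInformation info = TextureLoader.ImageInformationFromFile(filename);
int fileWidth = info.Width;
int fileHeight = info.Height;

// ...or ask the created texture what dimensions it actually ended up with.
SurfaceDescription desc = meshTextures.GetLevelDescription(0);
int texWidth = desc.Width;
int texHeight = desc.Height;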

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

Quote:Original post by jollyjeffers
the best thing to do is to either request the image information (not sure of the MDX code) from the load call, or to query the top-level surface dimensions of the returned texture.

Thanks guys for replying. I am not using Pool.Managed because it takes a lot of memory, as you all said above, and I am playing with the width and height of the texture to get extra free memory. What I want to know is how to determine the appropriate parameters to pass to TextureLoader in order to get the best resolution and make my meshes look smooth, not pixelated.

Does anybody know how to get the image information and use it, like "jollyjeffers" said?
I came here to post and then I saw this appears to be about Managed DirectX (I think). People have started indicating this in the title - i.e. adding [MDX] or MDX to the title.
Helps us dinosaurs know we can't be of use!
I used MagFilter and MinFilter and the texture became better. What can I do to improve it more?
