gabdab

OpenGL Devil sdl texture mapping wrong


Hi, I have a problem texturing with the DevIL image library and OpenGL: textures on 3D models look wrong. I guess it is the common pitfall of SDL having its own pixel format, or something else. Can anyone help? Gab.

I haven't noticed any problems with DevIL/SDL/OpenGL; it works like a charm for me. What does "looks wrong" mean? It could be your texture coordinates. Can you describe it more, or show an image?

Hi,
here are the two meshes side by side, the left one textured with DevIL and the right one with a texture loader from the web (correct):
http://img503.imageshack.us/img503/1540/screenshotxga9.jpg
As I mentioned, I am using an SDL viewer.
I read here http://twomix.devolution.com/pipermail/sdl/2002-September/049078.html that texture coordinates start from the bottom left in SDL but from the upper left in OpenGL. OK, but that doesn't explain much.
How do I flip the image coordinates at the source (texture load)?

Gab-
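For reference, "flipping at the source" just means reversing the row order of the pixel buffer after loading and before handing it to glTexImage2D. A minimal sketch, assuming a tightly packed buffer; flipVertical is a hypothetical helper, not part of DevIL or SDL:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Flip a tightly packed pixel buffer vertically in place: row 0 swaps with
// the last row, row 1 with the second-to-last, and so on. bytesPerPixel is
// 3 for RGB, 4 for RGBA.
void flipVertical(uint8_t* pixels, int width, int height, int bytesPerPixel)
{
    const int rowSize = width * bytesPerPixel;
    std::vector<uint8_t> tmp(rowSize);
    for (int y = 0; y < height / 2; ++y) {
        uint8_t* top    = pixels + y * rowSize;
        uint8_t* bottom = pixels + (height - 1 - y) * rowSize;
        std::memcpy(tmp.data(), top, rowSize);
        std::memcpy(top, bottom, rowSize);
        std::memcpy(bottom, tmp.data(), rowSize);
    }
}
```

Calling this on the loaded pixels (or equivalently using the library's own flip, like iluFlipImage in DevIL) converts between the upper-left and lower-left origin conventions.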

Not working.
I set this in my initGL():

FileName = "blackman.png"; // Hard-coded image file name.

// Needed to initialize DevIL.
ilInit ();

// GL cannot use palettes anyway, so convert early.
ilEnable (IL_CONV_PAL);

ilEnable (IL_ORIGIN_SET);
//ilOriginFunc (IL_ORIGIN_LOWER_LEFT);
ilOriginFunc (IL_ORIGIN_UPPER_LEFT);

// Gets rid of dithering on some nVidia-based cards.
ilutEnable (ILUT_OPENGL_CONV);

// Generate the main image name to use.
ilGenImages (1, &ImgId);

//iluFlipImage();

// Bind this image name.
ilBindImage (ImgId);

// Load the image specified by FileName into the ImgId image.
if (!ilLoadImage (FileName)) {
    HandleDevILErrors ();
}

// Lets ILUT know to use its OpenGL functions.
ilutRenderer (ILUT_OPENGL);

// Goes through all steps of sending the image to OpenGL.
TexID = ilutGLBindTexImage ();

// We're done with our image, so delete it.
ilDeleteImages (1, &ImgId);

Ok,
I have substituted DevIL with FreeImage until I get to know DevIL better.
This is the function I am using to load the texture:

#include "FreeImage.h"

GLuint g_textureID = 0;

void loadTexture(void)
{
    // This function has been rewritten to use FreeImage.

    //const char textName[64] = "blackman.tga";
    const char textName[64] = "blackman.png";

    // Get the image file type from FreeImage.
    FREE_IMAGE_FORMAT fifmt = FreeImage_GetFileType(textName, 0);

    // Actually load the image file.
    FIBITMAP *dib = FreeImage_Load(fifmt, textName, 0);
    if (dib == NULL)
        return;

    // There is no guarantee that the loaded image is 24-bit RGB, so force
    // FreeImage to convert it. Unload the original bitmap afterwards,
    // otherwise it leaks.
    FIBITMAP *dib24 = FreeImage_ConvertTo24Bits(dib);
    FreeImage_Unload(dib);
    dib = dib24;

    if (dib != NULL)
    {
        glGenTextures(1, &g_textureID);
        glBindTexture(GL_TEXTURE_2D, g_textureID);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        // FreeImage stores pixels in BGR order. We could use the GL_BGR
        // extension, but here we swap the B and R components ourselves.
        // First, allocate the new bit data for the image.
        BYTE *bits = new BYTE[FreeImage_GetWidth(dib) * FreeImage_GetHeight(dib) * 3];

        // Get a pointer to FreeImage's data.
        // Note: this assumes the rows are tightly packed (no row padding).
        BYTE *pixels = (BYTE*)FreeImage_GetBits(dib);

        // Iterate through the pixels, copying the data
        // from 'pixels' to 'bits' in RGB order.
        for (int pix = 0; pix < FreeImage_GetWidth(dib) * FreeImage_GetHeight(dib); pix++)
        {
            bits[pix * 3 + 0] = pixels[pix * 3 + 2];
            bits[pix * 3 + 1] = pixels[pix * 3 + 1];
            bits[pix * 3 + 2] = pixels[pix * 3 + 0];
        }

        // Upload the RGB pixel data in 'bits' to OpenGL.
        glTexImage2D(GL_TEXTURE_2D, 0, 3, FreeImage_GetWidth(dib), FreeImage_GetHeight(dib), 0,
                     GL_RGB, GL_UNSIGNED_BYTE, bits);

        // Unload the image and free the bit data.
        FreeImage_Unload(dib);
        delete[] bits;
    }
}
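One caveat with the copy loop above: it indexes FreeImage's bits as if each row were exactly width * 3 bytes, but FreeImage aligns rows to a 4-byte boundary, so for widths that aren't a multiple of 4 the texture will shear. A pitch-aware copy, sketched here as a standalone helper (bgrToRgbPacked is a hypothetical name; with FreeImage you would pass FreeImage_GetPitch(dib) as the pitch):

```cpp
#include <cstdint>

// Copy a BGR image whose rows are 'pitch' bytes apart into a tightly packed
// RGB buffer, swapping the blue and red channels. Pitch may be larger than
// width * 3 because of row alignment padding.
void bgrToRgbPacked(const uint8_t* src, int pitch, int width, int height,
                    uint8_t* dst)
{
    for (int y = 0; y < height; ++y) {
        const uint8_t* row = src + y * pitch;   // start of padded source row
        for (int x = 0; x < width; ++x) {
            dst[(y * width + x) * 3 + 0] = row[x * 3 + 2]; // R <- B
            dst[(y * width + x) * 3 + 1] = row[x * 3 + 1]; // G
            dst[(y * width + x) * 3 + 2] = row[x * 3 + 0]; // B <- R
        }
    }
}
```

Alternatively, setting glPixelStorei(GL_UNPACK_ALIGNMENT, 1) and a matching row length avoids the copy entirely for some cases, but the explicit copy is the most portable fix.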

One question: why does it matter what format SDL uses? When you load a texture into OpenGL, SDL has nothing to do with it. I had a similar problem when loading two kinds of textures, e.g. JPG and PNG: the same image appeared different in .png and .jpg, with the .jpg images showing up inverted. That was just because my engine uses a UV coordinate system similar to OpenGL's.

Yes, it can get confusing, so this is how I solved the problem. I loaded the images in an external image editor, then added a conditional iluFlipImage() call so that textures were aligned correctly all the time. I checked this for all the popular formats: tga, png, jpg, and so on.

I put a break-point in the debugger to check the origin of the image and see how things were getting aligned.

For OpenGL

ILinfo ImageInfo;
iluGetImageInfo(&ImageInfo); //<-- break-point here to check!

// OpenGL requires an image that is aligned to the lower left.
// Not all formats store images this way (e.g. JPEG).
if (ImageInfo.Origin == IL_ORIGIN_UPPER_LEFT)
{
    iluFlipImage();
}






The engine can also render using Direct3D, which has a different UV coordinate system, so I did a similar thing there. Now it works for all images and all formats.

For DirectX

ILinfo ImageInfo;
iluGetImageInfo(&ImageInfo); //<-- break-point here to check!

// Direct3D requires an image that is aligned to the upper left.
// Not all formats store images this way.
if (ImageInfo.Origin == IL_ORIGIN_LOWER_LEFT)
{
    iluFlipImage();
}
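The two snippets differ only in which origin the target API wants, so the decision can be reduced to a tiny predicate. A sketch of that logic (needsFlip and the enums are hypothetical helpers, not DevIL types):

```cpp
// Decide whether an image needs a vertical flip for the target API.
// OpenGL expects the first row at the lower left; Direct3D at the upper left.
enum class Origin { LowerLeft, UpperLeft };
enum class Target { OpenGL, Direct3D };

bool needsFlip(Origin imageOrigin, Target target)
{
    const Origin wanted =
        (target == Target::OpenGL) ? Origin::LowerLeft : Origin::UpperLeft;
    return imageOrigin != wanted;  // flip only when origins disagree
}
```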






Hope that helps.

[EDIT:] Sorry, wrong comments in the code. Made changes.

[Edited by - _neutrin0_ on February 2, 2007 7:53:00 AM]

Quote:
Original post by gabdab
// GL cannot use palettes anyway, so convert early.
ilEnable (IL_CONV_PAL);


This is really just an FYI, since I guess the feature won't be around much longer, but OpenGL does support color-indexed textures as well as true color.

Thx guys,
(the comments in the code are from whoever wrote it; my English vocabulary is limited).

I tried using iluFlipImage() with no success. Maybe I was calling it from the wrong routine.

Anyway, thanks,
Gab-
