## Recommended Posts

luke2    100
Alrighty, so I am loading textures with the Glu command Glu.gluBuild2DMipmaps( Gl.GL_TEXTURE_2D, Gl.GL_RGBA, texMap.Width, texMap.Height, Gl.GL_RGBA, Gl.GL_UNSIGNED_BYTE, texData ). Although I've gotten this to 'work', there is a major issue: what happens when the format of the texture is NOT RGBA?! The answer is an image with the wrong colors.

So in an effort to solve that, I figured I would use SdlDotNet, which my project currently uses. My solution was to create a byte[ , , ] (a three-dimensional array in C#) and fill it up with data: the first two dimensions are x and y, and the last one consists of 4 entries, for r, g, b, and a respectively. So basically, I load in the image as an SDL.NET Surface, and then I get an IntPtr to its data (I originally did it in a more user-friendly manner, but it was too slow).

The problem is that this is still too slow, and it doesn't work either. Here is my 'low level' C# code - almost an oxymoron. I suspect that my slowdown originates from the line which sets colorValue and from the 4 (byte) typecasts. As for the incorrect output (nothing is shown on screen regardless of the value of bits - and I don't think bits is getting filled correctly either, nor are any exceptions thrown)... beats me for now.
```csharp
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;
using SdlDotNet;   // Surface (SdlDotNet.Graphics in newer versions)
using Tao.OpenGl;  // Gl, Glu

class GLTextureManager
{
    private static byte[, ,] convertImageFormat( Surface texMap )
    {
        //Represents the image in an RGBA format.
        byte[, ,] bits = new byte[texMap.Width, texMap.Height, 4];
        //Base pointer to the raw surface data.
        IntPtr pixelBase = texMap.Pixels;
        //Offset so the 32-bit read ends on the last byte of the pixel.
        //Note: for the very first pixel this reads before the buffer
        //when BytesPerPixel < 4.
        int surfacePitchOffset = -4 + texMap.BytesPerPixel;
        //Various data of the surface, cached here to avoid repeated
        //calls during the nested loops.
        short pitch = texMap.Pitch;
        //Represents the number of bytes per pixel. Usually 1,2,3,4.
        int bytesPerPixel = texMap.BytesPerPixel;
        //The raw value of the pixel data.
        int colorValue = 0;
        //Channel masks and shifts. These assume a 32-bit RGBA surface;
        //they should really be derived from the surface's pixel format.
        const int rMask = 0x000000FF, rShift = 0;
        const int gMask = 0x0000FF00, gShift = 8;
        const int bMask = 0x00FF0000, bShift = 16;
        const int aMask = unchecked( (int)0xFF000000 ), aShift = 24;

        for ( int x = 0; x < texMap.Width; x++ )
            for ( int y = 0; y < texMap.Height; y++ )
            {
                //Get the current pixel's color.
                //Note that pitch is the number of bytes per row,
                //thus y*pitch gets us the correct row and
                //x*bytesPerPixel gets us the correct column.
                colorValue = Marshal.ReadInt32( pixelBase,
                    y * pitch + x * bytesPerPixel + surfacePitchOffset );
                //Format it into RGBA: mask out each channel, then shift
                //it down so the (byte) cast keeps the correct bits.
                bits[x, y, RED_COMPONENT]   = (byte)( ( colorValue & rMask ) >> rShift );
                bits[x, y, GREEN_COMPONENT] = (byte)( ( colorValue & gMask ) >> gShift );
                bits[x, y, BLUE_COMPONENT]  = (byte)( ( colorValue & bMask ) >> bShift );
                bits[x, y, ALPHA_COMPONENT] = (byte)( ( colorValue & aMask ) >> aShift );
            }
        //Return image data in RGBA format.
        return bits;
    }

    private static Dictionary<String, int> textures = new Dictionary<string, int>( );

    public static int getTextureID( String fileName )
    {
        if ( textures.ContainsKey( fileName ) )
            return textures[fileName];
        System.Diagnostics.Stopwatch watch = new System.Diagnostics.Stopwatch( );
        watch.Start( );
        //Only using an array b/c glGenTextures requires one.
        int[] textureId = new int[1];
        Gl.glGenTextures( 1, textureId );
        Gl.glBindTexture( Gl.GL_TEXTURE_2D, textureId[0] );
        //Now that we have bound our texture, let's set its properties.
        Gl.glTexParameteri( Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_WRAP_S, Gl.GL_REPEAT );
        Gl.glTexParameteri( Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_WRAP_T, Gl.GL_REPEAT );
        Gl.glTexParameteri( Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MAG_FILTER, Gl.GL_LINEAR );
        Gl.glTexParameteri( Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MIN_FILTER, Gl.GL_LINEAR );
        //Set the texture's behavior: GL_MODULATE = lighting & coloring,
        //change to GL_DECAL to get rid of that.
        Gl.glTexEnvf( Gl.GL_TEXTURE_ENV, Gl.GL_TEXTURE_ENV_MODE, Gl.GL_MODULATE );
        //Load the surface, convert it to the proper image format,
        //then upload the actual texture to video memory.
        Surface texMap = new Surface( fileName );
        byte[, ,] texData = convertImageFormat( texMap );

        Glu.gluBuild2DMipmaps( Gl.GL_TEXTURE_2D, Gl.GL_RGBA, texMap.Width,
            texMap.Height, Gl.GL_RGBA, Gl.GL_UNSIGNED_BYTE, texData );

        textures[fileName] = textureId[0];
        watch.Stop( );
        System.Console.Out.WriteLine( watch.ElapsedMilliseconds );
        return textureId[0];
    }

    /// <summary>
    /// Location of the R component in the byte array.
    /// </summary>
    private const int RED_COMPONENT = 0;

    /// <summary>
    /// Location of the G component in the byte array.
    /// </summary>
    private const int GREEN_COMPONENT = 1;

    /// <summary>
    /// Location of the B component in the byte array.
    /// </summary>
    private const int BLUE_COMPONENT = 2;

    /// <summary>
    /// Location of the A component in the byte array.
    /// </summary>
    private const int ALPHA_COMPONENT = 3;
}
```
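
For what it's worth, the per-pixel Marshal.ReadInt32 interop calls and the multidimensional array indexing are the likely slowdown. Here is a sketch of a row-at-a-time alternative; it assumes a 32-bit surface whose bytes are already in R,G,B,A order (no per-channel swizzling needed) and that the gluBuild2DMipmaps overload you call accepts a flat byte[]:

```csharp
//Sketch only: assumes a 32-bit surface already laid out as R,G,B,A.
//One Marshal.Copy per row replaces thousands of per-pixel reads.
private static byte[] convertImageFormatFlat( Surface texMap )
{
    int width = texMap.Width, height = texMap.Height;
    byte[] bits = new byte[width * height * 4];
    IntPtr pixelBase = texMap.Pixels;
    int pitch = texMap.Pitch; //Bytes per source row (may exceed width*4).
    for ( int y = 0; y < height; y++ )
    {
        //Copy one full row of pixels at a time into the managed array.
        Marshal.Copy( new IntPtr( pixelBase.ToInt64( ) + y * pitch ),
                      bits, y * width * 4, width * 4 );
    }
    return bits;
}
```

If the surface isn't already RGBA you would still need a per-pixel swizzle pass, but doing it over the copied managed array avoids one interop call per pixel.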



luke2    100
Well, should I have posted this in a more traveled forum?

ndhb    246

I don't understand what you mean by "What happens when the internal format of the texture is NOT RGBA!?! The answer is an image with the wrong colors.". When you send data, you specify what format the data is in (the format argument, just before the data type) and you specify what format to store the data in internally (the argument before the image width). A conversion between the two formats takes place according to the current GL state (in particular PixelStore/PixelTransfer). In other words, it is perfectly possible to take an image in an external format such as RGBA and store it in some other format internally, for instance RGB or BGRA.
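
For instance, a call like this (a sketch; GL_BGRA requires OpenGL 1.2 or the EXT_bgra extension, and 'pixels', 'width', and 'height' are placeholders):

```csharp
//The driver converts between the external and internal formats.
//Here the data in 'pixels' is laid out as B,G,R,A bytes, but OpenGL
//is asked to store the texture internally as RGB.
Gl.glTexImage2D(
    Gl.GL_TEXTURE_2D,
    0,                    //mipmap level
    Gl.GL_RGB,            //internal format: how GL stores it
    width, height,
    0,                    //border
    Gl.GL_BGRA,           //external format: how 'pixels' is laid out
    Gl.GL_UNSIGNED_BYTE,
    pixels );
```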

kind regards,
Nicolai

Edited by ndhb

luke2    100
That is all true, but my problem is determining what the format of the image itself is, because if I tell it that it is RGB and it is really BGR, then regardless of what OpenGL converts it to, it will be messed up. I haven't figured out a reliable way to determine what the format of an image is (or rather, figure it out and then translate it into a GL constant like GL_RGBA). So my solution for now is to just reformat the image.
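
If SdlDotNet exposes the surface's channel masks (SDL itself carries them in the surface's pixel format), you could map them straight to a GL constant instead of reformatting. A sketch - the redMask and bytesPerPixel values are assumed to have been read from the surface, the exact accessors depend on your SdlDotNet version, and the byte orders in the comments assume a little-endian machine:

```csharp
//Choose the GL external format from SDL's channel masks.
//redMask / bytesPerPixel are hypothetical values read from the
//surface's pixel format; check what your binding actually exposes.
int glFormat;
if ( bytesPerPixel == 4 && redMask == 0x000000FF )
    glFormat = Gl.GL_RGBA;  //bytes in memory: R, G, B, A
else if ( bytesPerPixel == 4 && redMask == 0x00FF0000 )
    glFormat = Gl.GL_BGRA;  //bytes in memory: B, G, R, A
else if ( bytesPerPixel == 3 && redMask == 0x0000FF )
    glFormat = Gl.GL_RGB;   //bytes in memory: R, G, B
else if ( bytesPerPixel == 3 && redMask == 0xFF0000 )
    glFormat = Gl.GL_BGR;   //bytes in memory: B, G, R
else
    throw new NotSupportedException( "Unrecognized surface format" );
```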

xycsoscyx    1149
If you know how to reformat it, then you should know what format it's in to begin with, right? You're actually reformatting it from FormatA to your OpenGL texture, so what is FormatA?