White textures on Windows XP VM using OpenTK / C#

Hello,

While testing a build of my game on an XP machine some weeks back, I found that some of the textures, even those not using alpha blending, were being rendered as white rectangles on the VM. On the physical Windows 7 machine, they displayed perfectly.

From various Google searches, I'm led to believe that white textures generally indicate an error state, and therefore that I'm doing something wrong (which I later confirmed with glGetError).

The game used to work on the XP VM, so I'm at a loss as to what I did to cause this. After I originally got textures working, I haven't really touched that code: it hasn't been necessary, and as I'm still pretty much a beginner at OpenGL I don't want to break things by mistake.

Part of the problem is that I'm not even sure if I really have broken the game, or if it's the VM itself - while testing I noted that if I changed the color depth of the VM from 32-bit to anything lower, the game window displayed nothing but a solid red color. Of course, that could still be a fault with my code ;)

Here are examples of what the game looks like running on Windows 7:

[attachment=11290:good1.png][attachment=11291:good2.png]

And this is what happens on the XP VM:

[attachment=11288:bad1.png][attachment=11289:bad2.png]

What is truly frustrating about this whole issue is that it only affects some graphics, and it's always the same ones.

I added a call to glGetError after calling glTexImage2D, and this is returning InvalidValue. But that doesn't make sense to me - why would it succeed on one machine and not another, and why wouldn't it affect all graphics instead of just some? This is the output of a debug log I added:

15/09/2012 20:04:45: Debug: Assigning texture 1 (splash) [Hint: Linear, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Activated Scene:
15/09/2012 20:04:45: Debug: Assigning texture 2 (gamebackground) [Hint: Nearest, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 3 (bonusbackground) [Hint: Nearest, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 4 (exitbackground) [Hint: Nearest, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 5 (statusbanner) [Hint: Nearest, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 6 (maintitle) [Hint: Linear, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 7 (optionstitle) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 8 (howtotitle) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 9 (creditstitle) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 10 (pausedtitle) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 11 (debug) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 12 (rock-72_0) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 13 (rock-32s_0) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 14 (rock-36s_0) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 15 (uni564-12s_0) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 16 (JewelRush) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError


Clearly it's rejecting quite a few of the graphics, regardless of the filter settings.
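(As an aside, I realise glGetError can queue up more than one error, so strictly I should probably be draining it in a loop rather than making a single call; a minimal sketch of what I mean:)

[source lang="csharp"]// Sketch: glGetError reports one queued error per call,
// so keep polling until it returns NoError.
ErrorCode error;
while ((error = GL.GetError()) != ErrorCode.NoError)
{
    Log.WriteLog(LogLevel.Debug, error.ToString());
}[/source]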

The code is scattered about in different classes; hopefully I've gathered up all the relevant bits here. As mentioned, I'm still a noob when it comes to OpenGL, so I'm currently doing all texture rendering via glDrawArrays.

Initializing texture support:
[source lang="csharp"]GL.Disable(EnableCap.CullFace);
GL.Enable(EnableCap.Texture2D);
GL.BlendFunc(BlendingFactorSrc.SrcAlpha, BlendingFactorDest.OneMinusSrcAlpha);
[/source]

Binding a System.Drawing.Bitmap to OpenGL:

[source lang="c#"] public override void Rebind()
{
BitmapData data;
Bitmap bitmap;
int magnificationFilter;
int minificationFilter;

bitmap = (Bitmap)this.Image;

if (this.TextureId == 0)
{
GL.GenTextures(1, out _textureId);
Log.WriteLog(LogLevel.Debug, "Assigning texture {0} ({1}) [Hint: {2}, Flags: {3}]", this.TextureId, this.Name, this.Hint, this.Flags);
}

OpenGL.BindTexture(this.TextureId);

data = bitmap.LockBits(new Rectangle(0, 0, bitmap.Width, bitmap.Height), ImageLockMode.ReadOnly, System.Drawing.Imaging.PixelFormat.Format32bppArgb);

switch (this.Hint)
{
case TextureHintMode.Nearest:
magnificationFilter = (int)TextureMagFilter.Nearest;
minificationFilter = (int)TextureMinFilter.Nearest;
break;
default:
magnificationFilter = (int)TextureMagFilter.Linear;
minificationFilter = (int)TextureMinFilter.Linear;
break;
}

GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, minificationFilter);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, magnificationFilter);
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba, data.Width, data.Height, 0, OpenTK.Graphics.OpenGL.PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);

Log.WriteLog(LogLevel.Debug, GL.GetError().ToString());

bitmap.UnlockBits(data);
}[/source]
Although not shown in the above code, I experimented by creating a new bitmap with an explicit 32bpp ARGB pixel format and drawing the original image onto it prior to calling LockBits; this had no effect.
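From memory, that experiment looked roughly like this (a sketch, not the exact code):

[source lang="csharp"]// Sketch: draw the source image into a new bitmap with an explicit
// 32bpp ARGB format before locking the bits for upload.
Bitmap converted = new Bitmap(bitmap.Width, bitmap.Height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);

using (Graphics g = Graphics.FromImage(converted))
{
    g.DrawImage(bitmap, new Rectangle(0, 0, bitmap.Width, bitmap.Height));
}

// ...then call LockBits / GL.TexImage2D on 'converted' instead of 'bitmap'.[/source]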

Part of my SpriteBatch class that handles drawing with as few calls to glBindTexture as possible:
[source lang="csharp"]public override void Draw()
{
if (this.Size != 0)
{
if (_requiresBlend)
GL.Enable(EnableCap.Blend);

OpenGL.BindTexture(_textureId);
this.SetupPointers();
GL.DrawArrays(BeginMode.Triangles, 0, this.Size);
this.Size = 0;

if (!_requiresBlend)
{
GL.Disable(EnableCap.Blend);
_requiresBlend = false;
}
}
}

private void SetupPointers()
{
GL.EnableClientState(ArrayCap.ColorArray);
GL.EnableClientState(ArrayCap.VertexArray);
GL.EnableClientState(ArrayCap.TextureCoordArray);

GL.VertexPointer(VertexDimensions, VertexPointerType.Double, 0, _vertexPositions);
GL.ColorPointer<BrColor>(ColorDimensions, ColorPointerType.Float, 0, _vertexColors);
GL.TexCoordPointer<Point>(UVDimensions, TexCoordPointerType.Float, 0, _vertexUVs);
}[/source]

Apologies for the somewhat rambling post; if anyone has any suggestions as to where I'm going wrong, I'd be grateful.

Thanks;
Richard Moss
Is GL_TEXTURE_2D enabled when you try to use it? Some implementations might be super strict, and I *think* you're supposed to have it enabled before you can use it with such, or any, operations.

Other than that, I would personally look into all the glEnable bits that could potentially cause this error. It doesn't look like you are doing anything that is inherently illegal, so that would be my first quest. I don't see anything wrong with the small code you've posted, and from all the errors it seems the problem is indeed in the glTexImage2D call.

edit: it could also be that you aren't specifying the S,T wrap modes of the texture coordinates, such as setting them to CLAMP_TO_BORDER, but I don't think you have to - a wild shot!
For my money, the textures that are giving you GL_INVALID_VALUE are not powers-of-two in size, and the VM's hardware emulation does not support non-power-of-two textures.
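A quick way to confirm: log each texture's dimensions as it loads and test them for power-of-two-ness. A minimal sketch, reusing the log call from your Rebind method:

[source lang="csharp"]// Sketch: a positive value is a power of two if exactly one bit is set.
static bool IsPowerOfTwo(int value)
{
    return value > 0 && (value & (value - 1)) == 0;
}

// e.g. in Rebind, after LockBits:
// Log.WriteLog(LogLevel.Debug, "{0}: {1}x{2} (POT: {3})",
//     this.Name, data.Width, data.Height,
//     IsPowerOfTwo(data.Width) && IsPowerOfTwo(data.Height));[/source]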


Kaptein and mhagain,

Thanks for the responses. I am enabling 2D textures prior to binding anything, so that's not the issue. However, mhagain has hit the nail on the head, it seems. Originally I tried to keep all textures power-of-two, until I read somewhere that it wasn't actually required - so I got lazy for textures used as single images rather than as part of a tilesheet. And this seems to be exactly the problem: I did a quick test changing one of the text graphics to be power-of-two, and it loaded fine in the VM. So I just need to update my tilesheet code (which splits up textures into grids with equally sized cells) to support different sized sub-images, and then I think I'm sorted.
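For anyone else who hits this, one fix (if you can't simply resize the source art) is to pad each bitmap up to the next power of two and scale the maximum UVs to match. A rough, untested sketch, reusing the bitmap variable from my Rebind method:

[source lang="csharp"]// Sketch: round a dimension up to the next power of two.
static int NextPowerOfTwo(int value)
{
    int result = 1;
    while (result < value)
        result <<= 1;
    return result;
}

// Draw the original image into the top-left of a power-of-two bitmap...
Bitmap padded = new Bitmap(NextPowerOfTwo(bitmap.Width), NextPowerOfTwo(bitmap.Height), System.Drawing.Imaging.PixelFormat.Format32bppArgb);

using (Graphics g = Graphics.FromImage(padded))
{
    g.DrawImage(bitmap, 0, 0, bitmap.Width, bitmap.Height);
}

// ...and clamp the UV range so only the original region is sampled.
float uMax = bitmap.Width / (float)padded.Width;
float vMax = bitmap.Height / (float)padded.Height;[/source]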

Thanks very much; I've been trying to fix this for a while, and the fact I'd lazily switched image sizes never occurred to me!

Regards;
Richard Moss
Any GL2.0 or better hardware should have generalized support for non-power-of-two textures, but some older hardware may not be robust (e.g. the driver may advertise support for them but drop you back to software emulation if you actually try to use them - thanks a lot, ARB). GL3.0 or better hardware should be fully generalized and robust; on GL1.x hardware, don't even bother.
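If you'd rather check at runtime than assume, look for the extension in the extension string. A minimal sketch (on pre-GL3 contexts the extensions come back as a single space-separated string):

[source lang="csharp"]// Sketch: query the driver's extension string for NPOT support.
string extensions = GL.GetString(StringName.Extensions) ?? string.Empty;
bool npotSupported = extensions.Contains("GL_ARB_texture_non_power_of_two");[/source]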

If all that you're drawing is 2D sprites, you can check for and use the GL_ARB_texture_rectangle extension. There are a number of restrictions (no mipmaps, clamp modes only, no borders, non-normalized texcoords), but it should be more widely available on older hardware.
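Usage is the same as regular texturing but with the rectangle target, and texcoords are in texels rather than 0..1. A rough sketch, with the OpenTK enum names from memory and 'textureId'/'data' borrowed from the Rebind code above:

[source lang="csharp"]// Sketch: upload to the rectangle target instead of Texture2D.
GL.Enable(EnableCap.TextureRectangle);
GL.BindTexture(TextureTarget.TextureRectangle, textureId);
GL.TexParameter(TextureTarget.TextureRectangle, TextureParameterName.TextureWrapS, (int)TextureWrapMode.ClampToEdge);
GL.TexParameter(TextureTarget.TextureRectangle, TextureParameterName.TextureWrapT, (int)TextureWrapMode.ClampToEdge);
GL.TexImage2D(TextureTarget.TextureRectangle, 0, PixelInternalFormat.Rgba, data.Width, data.Height, 0, OpenTK.Graphics.OpenGL.PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);

// Texcoords are now (0,0)..(width,height) in texels, not (0,0)..(1,1).[/source]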


