OpenGL White textures on Windows XP VM using OpenTK / C#


Recommended Posts

Hello,

While testing a build of my game on a Windows XP virtual machine some weeks back, I found that some of the textures, even those not using alpha blending, were being rendered as white rectangles on the VM. On the physical Windows 7 machine, they displayed perfectly.

From various Google searches, I'm led to believe that white textures are pretty much an error state and therefore I'm doing something wrong (which I then confirmed with glGetError).

The game used to work on the XP VM, so I'm at a loss as to what I did to cause this. After I originally got textures working way back when, I haven't really touched the code: it hasn't been necessary, and as I'm still pretty much a beginner at OpenGL I don't want to break things by mistake.

Part of the problem is that I'm not even sure if I really have broken the game, or if it's the VM itself - while testing I noted that if I changed the color depth of the VM from 32-bit to anything lower, the game window always displayed a solid red color and nothing else. Of course, that could still be a fault with my code ;)

Here are some examples of what the game looks like running on Windows 7:

[attachment=11290:good1.png][attachment=11291:good2.png]

And this is what happens on the XP VM:

[attachment=11288:bad1.png][attachment=11289:bad2.png]

What is truly frustrating about this whole issue is that it only affects some graphics, and it's always the same ones.

I added a call to glGetError after calling glTexImage2D and it is returning InvalidValue. But that doesn't make sense to me - why would it return fine on one machine and not another, and why wouldn't it affect all graphics instead of just some? This is the output of a debug log I added:

15/09/2012 20:04:45: Debug: Assigning texture 1 (splash) [Hint: Linear, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Activated Scene:
15/09/2012 20:04:45: Debug: Assigning texture 2 (gamebackground) [Hint: Nearest, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 3 (bonusbackground) [Hint: Nearest, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 4 (exitbackground) [Hint: Nearest, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 5 (statusbanner) [Hint: Nearest, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 6 (maintitle) [Hint: Linear, Flags: None]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 7 (optionstitle) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 8 (howtotitle) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 9 (creditstitle) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 10 (pausedtitle) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: InvalidValue
15/09/2012 20:04:45: Debug: Assigning texture 11 (debug) [Hint: Linear, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 12 (rock-72_0) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 13 (rock-32s_0) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 14 (rock-36s_0) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 15 (uni564-12s_0) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError
15/09/2012 20:04:45: Debug: Assigning texture 16 (JewelRush) [Hint: Nearest, Flags: Alpha]
15/09/2012 20:04:45: Debug: NoError


Clearly it's rejecting a lot of the graphics, regardless of the hint or flag settings.

The code is scattered about in different classes; hopefully I've gathered up all the relevant bits here. As mentioned, I'm still a noob when it comes to OpenGL, so I'm currently doing all texture rendering via glDrawArrays.

Initializing texture support:
[source lang="csharp"]GL.Disable(EnableCap.CullFace);
GL.Enable(EnableCap.Texture2D);
GL.BlendFunc(BlendingFactorSrc.SrcAlpha, BlendingFactorDest.OneMinusSrcAlpha);
[/source]

Binding a System.Drawing.Bitmap to OpenGL:

[source lang="csharp"]public override void Rebind()
{
    BitmapData data;
    Bitmap bitmap;
    int magnificationFilter;
    int minificationFilter;

    bitmap = (Bitmap)this.Image;

    // Generate a texture name the first time this image is bound
    if (this.TextureId == 0)
    {
        GL.GenTextures(1, out _textureId);
        Log.WriteLog(LogLevel.Debug, "Assigning texture {0} ({1}) [Hint: {2}, Flags: {3}]", this.TextureId, this.Name, this.Hint, this.Flags);
    }

    OpenGL.BindTexture(this.TextureId);

    // Lock the bitmap so the raw pixel data can be handed to glTexImage2D
    data = bitmap.LockBits(new Rectangle(0, 0, bitmap.Width, bitmap.Height), ImageLockMode.ReadOnly, System.Drawing.Imaging.PixelFormat.Format32bppArgb);

    switch (this.Hint)
    {
        case TextureHintMode.Nearest:
            magnificationFilter = (int)TextureMagFilter.Nearest;
            minificationFilter = (int)TextureMinFilter.Nearest;
            break;
        default:
            magnificationFilter = (int)TextureMagFilter.Linear;
            minificationFilter = (int)TextureMinFilter.Linear;
            break;
    }

    GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, minificationFilter);
    GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, magnificationFilter);
    GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba, data.Width, data.Height, 0, OpenTK.Graphics.OpenGL.PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);

    // This is the call that logs InvalidValue on the XP VM
    Log.WriteLog(LogLevel.Debug, GL.GetError().ToString());

    bitmap.UnlockBits(data);
}[/source]
Although not shown in the above code, I experimented by creating a new bitmap object with an explicit RGBA pixel format and drawing the original image onto it prior to calling LockBits, but this had no effect.
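For reference, this is roughly what that experiment looked like (a minimal sketch only; the ConvertToArgb helper name is mine, not from the actual project):
[source lang="csharp"]private static Bitmap ConvertToArgb(Bitmap source)
{
    // Copy the source image into a fresh 32bpp ARGB bitmap so the pixel
    // layout handed to LockBits is guaranteed to be Format32bppArgb
    Bitmap result;

    result = new Bitmap(source.Width, source.Height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);

    using (Graphics g = Graphics.FromImage(result))
    {
        g.DrawImage(source, 0, 0, source.Width, source.Height);
    }

    return result;
}[/source]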

Part of my SpriteBatch class that handles drawing with as few calls to glBindTexture as possible:
[source lang="csharp"]public override void Draw()
{
    if (this.Size != 0)
    {
        // Only enable blending for batches that actually need it
        if (_requiresBlend)
            GL.Enable(EnableCap.Blend);

        OpenGL.BindTexture(_textureId);
        this.SetupPointers();
        GL.DrawArrays(BeginMode.Triangles, 0, this.Size);
        this.Size = 0;

        // Turn blending back off once the batch has been drawn
        if (_requiresBlend)
        {
            GL.Disable(EnableCap.Blend);
            _requiresBlend = false;
        }
    }
}

private void SetupPointers()
{
    GL.EnableClientState(ArrayCap.ColorArray);
    GL.EnableClientState(ArrayCap.VertexArray);
    GL.EnableClientState(ArrayCap.TextureCoordArray);

    GL.VertexPointer(VertexDimensions, VertexPointerType.Double, 0, _vertexPositions);
    GL.ColorPointer<BrColor>(ColorDimensions, ColorPointerType.Float, 0, _vertexColors);
    GL.TexCoordPointer<Point>(UVDimensions, TexCoordPointerType.Float, 0, _vertexUVs);
}[/source]

Apologies for the somewhat rambling post; if anyone has any suggestions as to where I'm going wrong, I'd be grateful.

Thanks;
Richard Moss

Is GL_TEXTURE_2D enabled when you try to use it?
Some implementations might be super strict, and I *think* you're supposed to have it enabled before you can use it with such, or any, operations.
Other than that, I would personally look into all the glEnable bits that could potentially cause this error. It doesn't look like you are doing anything that is inherently illegal, so that would be my first guess.
I don't see anything wrong with the code you've posted, and from all the errors it does seem the problem is indeed in glTexImage2D.

edit: it could also be that you aren't specifying the S,T wrap modes of the texture coordinates, such as setting them to CLAMP_TO_BORDER, but.. I don't think you have to :) a wild shot!

Edited by Kaptein
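If you want to rule both of those out quickly, something like this at texture setup time would do it (a minimal OpenTK sketch, not taken from the project above):
[source lang="csharp"]// Confirm 2D texturing is actually on before uploading or binding anything
if (!GL.IsEnabled(EnableCap.Texture2D))
{
    GL.Enable(EnableCap.Texture2D);
}

// Pin down the S/T wrap modes explicitly rather than relying on driver defaults
// (ClampToBorder, as suggested above, would work the same way)
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapS, (int)TextureWrapMode.ClampToEdge);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapT, (int)TextureWrapMode.ClampToEdge);[/source]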

For my money, the textures that are giving you GL_INVALID_VALUE are not powers of two in size, and the VM's hardware emulation doesn't support that.
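An easy way to test this theory is to log the texture dimensions and check whether the driver even claims NPOT support. A rough sketch, reusing the names from the Rebind method above (the power-of-two helper is hypothetical):
[source lang="csharp"]private static bool IsPowerOfTwo(int value)
{
    // A power of two has exactly one bit set
    return value > 0 && (value & (value - 1)) == 0;
}

// At texture upload time, inside Rebind:
string extensions = GL.GetString(StringName.Extensions);
bool npotSupported = extensions.Contains("GL_ARB_texture_non_power_of_two");

if (!npotSupported && (!IsPowerOfTwo(data.Width) || !IsPowerOfTwo(data.Height)))
{
    Log.WriteLog(LogLevel.Debug, "Texture {0} is {1}x{2}, which this driver may not accept", this.Name, data.Width, data.Height);
}[/source]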

Kaptein and mhagain,

Thanks for the responses. I am enabling 2D textures prior to binding anything, so that's not the issue. However, it seems mhagain has hit the nail on the head. Originally I tried to keep all textures power of two, until I read somewhere that it wasn't actually required - so I got lazy with textures I used as single images rather than as part of a tilesheet. And this seems to be exactly what the problem is: I did a quick test, changing one of the text graphics to be a power of two, and it loaded fine in the VM. So I just need to update my tilesheet code (which splits textures into grids of equally sized cells) to support different-sized sub-images, and then I think I'm sorted.
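One low-effort alternative to reworking the tilesheet code is to pad each bitmap up to the next power of two before uploading and then adjust the UVs to cover only the original region. A sketch of that idea, assuming System.Drawing is available as in the code above (both helper names are hypothetical):
[source lang="csharp"]private static int NextPowerOfTwo(int value)
{
    int result = 1;

    while (result < value)
    {
        result <<= 1;
    }

    return result;
}

private static Bitmap PadToPowerOfTwo(Bitmap source, out float maxU, out float maxV)
{
    // Place the original image in the top-left corner of a power-of-two canvas;
    // maxU/maxV tell the caller how much of the padded texture is actually used
    int width = NextPowerOfTwo(source.Width);
    int height = NextPowerOfTwo(source.Height);
    Bitmap padded = new Bitmap(width, height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);

    using (Graphics g = Graphics.FromImage(padded))
    {
        g.DrawImage(source, 0, 0, source.Width, source.Height);
    }

    maxU = (float)source.Width / width;
    maxV = (float)source.Height / height;

    return padded;
}[/source]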

Thanks very much; I've been trying to fix this for a while, and the fact that I'd lazily switched image sizes never occurred to me!

Regards;
Richard Moss

Any GL2.0 or better hardware should have generalized support for non-power-of-two textures, but some older hardware may not be robust (e.g. the driver may advertise support for them but drop you back to software emulation if you actually try to use them - thanks a lot, ARB). GL3.0 or better hardware should be fully generalized and robust; on GL1.x hardware, don't even bother.

If all you're drawing is 2D sprites, you can check for and use the GL_ARB_texture_rectangle extension. There are a number of restrictions (no mipmaps, clamp wrap modes only, no borders, non-normalized texcoords), but it should be more widely available.
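Checking for the extension from OpenTK is straightforward; a minimal sketch under the same assumptions as the upload code above, with the actual sprite drawing left out:
[source lang="csharp"]// Rectangle textures can be any size, but they take non-normalized (pixel)
// texcoords and don't support mipmaps or repeat wrap modes
string extensions = GL.GetString(StringName.Extensions);
bool rectangleSupported = extensions.Contains("GL_ARB_texture_rectangle");

if (rectangleSupported)
{
    GL.BindTexture(TextureTarget.TextureRectangle, textureId);
    GL.TexImage2D(TextureTarget.TextureRectangle, 0, PixelInternalFormat.Rgba, data.Width, data.Height, 0, OpenTK.Graphics.OpenGL.PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);
}[/source]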
