  • Similar Content

    • By lxjk
      Hi guys,
      There are many ways to do light culling in tile-based shading. I've been playing with this idea for a while and just want to throw it out there.
      Because tile frustums are generally small compared to the light radius, I tried using a cone test to reduce the false positives introduced by the commonly used sphere-frustum test.
      On top of that, I use distance to camera rather than depth for the near/far test (i.e. sliced by spheres).
      This method can be naturally extended to clustered light culling as well.
      The following image shows the general idea (the image has not survived here; see the blog post linked below).
      Performance-wise I get around a 15% improvement over the sphere-frustum test. You can also see how a single light performs, from left to right: (1) standard rendering of a point light; then the tiles that passed (2) the sphere-frustum test, (3) the cone test, and (4) the spherical-sliced cone test.
      I put the details in my blog post (https://lxjk.github.io/2018/03/25/Improve-Tile-based-Light-Culling-with-Spherical-sliced-Cone.html), GLSL source code included! A rough sketch of the two tests follows below.
      Eric
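      Here is a rough C sketch of the two tests described above (this is not the GLSL from the blog post; the names and conventions are illustrative, and everything is in view space with the camera, i.e. the cone apex, at the origin):

      #include <math.h>

      /* Cone test: does the light's bounding sphere touch the tile's cone?
         axis = normalized cone axis, sinA/cosA = sine/cosine of the cone
         half angle, c = sphere center, r = sphere radius. */
      int sphereIntersectsCone(const float axis[3], float sinA, float cosA,
                               const float c[3], float r)
      {
          float along  = axis[0]*c[0] + axis[1]*c[1] + axis[2]*c[2];
          float lenSq  = c[0]*c[0] + c[1]*c[1] + c[2]*c[2];
          float toAxis = sqrtf(fmaxf(lenSq - along*along, 0.0f));
          /* Signed distance from the sphere center to the cone surface;
             exact in front of the apex, conservative behind it. */
          return cosA*toAxis - sinA*along <= r;
      }

      /* Spherical slicing: the near/far test compares distance to the camera
         (not depth) against the tile's min/max distance bounds. */
      int sphereInSlice(const float c[3], float r, float minDist, float maxDist)
      {
          float dist = sqrtf(c[0]*c[0] + c[1]*c[1] + c[2]*c[2]);
          return dist + r >= minDist && dist - r <= maxDist;
      }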
    • By Fadey Duh
      Good evening everyone!

      I was wondering: is there an equivalent of GL_NV_blend_equation_advanced for AMD?
      Basically, I'm trying to find a more widely supported version of it.

      Thank you!
    • By Jens Eckervogt
      Hello guys,
      Please tell me: why doesn't my Wavefront model show up?
      I've already checked, and I don't have any errors.
      using System;
      using System.Collections.Generic;
      using System.Globalization;
      using System.IO;
      using System.Text;
      using OpenTK;

      namespace Tutorial_08.net.sourceskyboxer
      {
          public class WaveFrontLoader
          {
              private static List<Vector3> inPositions;
              private static List<Vector2> inTexcoords;
              private static List<Vector3> inNormals;
              private static List<float> positions;
              private static List<float> texcoords;
              private static List<int> indices;

              public static RawModel LoadObjModel(string filename, Loader loader)
              {
                  inPositions = new List<Vector3>();
                  inTexcoords = new List<Vector2>();
                  inNormals = new List<Vector3>();
                  positions = new List<float>();
                  texcoords = new List<float>();
                  indices = new List<int>();
                  int nextIdx = 0;

                  using (var reader = new StreamReader(File.Open("Contents/" + filename + ".obj", FileMode.Open), Encoding.UTF8))
                  {
                      // Read the whole file; the original returned from inside
                      // the loop, so only the first line was ever processed.
                      string line;
                      while ((line = reader.ReadLine()) != null)
                      {
                          string[] currentLine = line.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
                          if (currentLine.Length == 0) continue;

                          // "vt" and "vn" are their own prefixes; the original
                          // looked for them inside the "v" branch, so texcoords
                          // and normals were never read.
                          if (currentLine[0] == "v")
                          {
                              inPositions.Add(new Vector3(
                                  float.Parse(currentLine[1], CultureInfo.InvariantCulture),
                                  float.Parse(currentLine[2], CultureInfo.InvariantCulture),
                                  float.Parse(currentLine[3], CultureInfo.InvariantCulture)));
                          }
                          else if (currentLine[0] == "vt")
                          {
                              inTexcoords.Add(new Vector2(
                                  float.Parse(currentLine[1], CultureInfo.InvariantCulture),
                                  float.Parse(currentLine[2], CultureInfo.InvariantCulture)));
                          }
                          else if (currentLine[0] == "vn")
                          {
                              inNormals.Add(new Vector3(
                                  float.Parse(currentLine[1], CultureInfo.InvariantCulture),
                                  float.Parse(currentLine[2], CultureInfo.InvariantCulture),
                                  float.Parse(currentLine[3], CultureInfo.InvariantCulture)));
                          }
                          else if (currentLine[0] == "f")
                          {
                              // Each face vertex is "pos/tex/norm" with 1-based
                              // indices; the original always used element 0.
                              for (int v = 1; v <= 3; v++)
                              {
                                  string[] parts = currentLine[v].Split('/');
                                  Vector3 pos = inPositions[int.Parse(parts[0]) - 1];
                                  positions.Add(pos.X);
                                  positions.Add(pos.Y);
                                  positions.Add(pos.Z);
                                  Vector2 tc = inTexcoords[int.Parse(parts[1]) - 1];
                                  texcoords.Add(tc.X);
                                  texcoords.Add(tc.Y);
                                  indices.Add(nextIdx++);
                              }
                          }
                      }
                  }
                  // The using block disposes the reader; no reader.Close() needed.
                  return loader.loadToVAO(positions.ToArray(), texcoords.ToArray(), indices.ToArray());
              }
          }
      }
      I have also tried another method, but the model still doesn't show. I'm getting frustrated, because no OpenTK developers will help me.
      Please help me figure out how to fix this.

      My download (mega.nz) should be the original file, but I tried it with no success...
      - I've added the blend source and the PNG file here, and I have tried repeatedly...

      PS: Why is our community not active? I have been waiting a very long time!
      Thanks!
    • By codelyoko373
      I wasn't sure if this would be the right place for a topic like this, so sorry if it isn't.
      I'm currently working on a project for uni, using FreeGLUT to make a simple solar-system simulation. I've got to the point where I've implemented all the planets and have used a scene graph to link them all together. The issue I'm having now is getting the planets and moons to orbit correctly at their own orbital speeds.
      I'm not really experienced with using matrices for this sort of thing, which is likely why I can't figure out exactly how to get it working. This is where I apply the transformation matrices, as well as push and pop them; it's inside the Render function that every body, including the sun and the moons, runs:

      if (tag != "Sun")
      {
          glRotatef(orbitAngle, orbitRotation.X, orbitRotation.Y, orbitRotation.Z);
      }
      glPushMatrix();
      glTranslatef(position.X, position.Y, position.Z);
      glRotatef(rotationAngle, rotation.X, rotation.Y, rotation.Z);
      glScalef(scale.X, scale.Y, scale.Z);
      glDrawElements(GL_TRIANGLES, mesh->indiceCount, GL_UNSIGNED_SHORT, mesh->indices);
      if (tag != "Sun")
      {
          glPopMatrix();
      }

      The "if (tag != "Sun")" parts are my attempt at getting the planets to orbit correctly, though it likely isn't the way I'm meant to be doing it. So I was wondering if someone would be able to help me, as I really don't have an idea of what to do to get it working. Using the if statement is truthfully the closest I've got to it working, but there are still weird effects, like the planets orbiting faster than they should depending on the number of planets actually being updated/rendered.
    • By Jens Eckervogt
      Hello everyone,
      I have a problem with a texture:

      using System;
      using OpenTK;
      using OpenTK.Input;
      using OpenTK.Graphics;
      using OpenTK.Graphics.OpenGL4;
      using System.Drawing;
      using System.Reflection;

      namespace Tutorial_05
      {
          class Game : GameWindow
          {
              private static int WIDTH = 1200;
              private static int HEIGHT = 720;
              private static KeyboardState keyState;

              private int vaoID;
              private int vboID;
              private int tboID; // buffer for texture coordinates (missing in the original)
              private int iboID;

              private Vector3[] vertices =
              {
                  new Vector3(-0.5f,  0.5f, 0.0f), // V0
                  new Vector3(-0.5f, -0.5f, 0.0f), // V1
                  new Vector3( 0.5f, -0.5f, 0.0f), // V2
                  new Vector3( 0.5f,  0.5f, 0.0f)  // V3
              };

              private Vector2[] texcoords =
              {
                  new Vector2(0, 0),
                  new Vector2(0, 1),
                  new Vector2(1, 1),
                  new Vector2(1, 0)
              };

              private int[] indices = { 0, 1, 3, 3, 1, 2 };

              private string vertsrc = @"#version 450 core
      in vec3 position;
      in vec2 textureCoords;
      out vec2 pass_textureCoords;
      void main(void)
      {
          gl_Position = vec4(position, 1.0);
          pass_textureCoords = textureCoords;
      }";

              private string fragsrc = @"#version 450 core
      in vec2 pass_textureCoords;
      out vec4 out_color;
      uniform sampler2D textureSampler;
      void main(void)
      {
          out_color = texture(textureSampler, pass_textureCoords);
      }";

              private int programID;
              private int vertexShaderID;
              private int fragmentShaderID;
              private int textureID;
              private Bitmap texsrc;

              public Game() : base(WIDTH, HEIGHT, GraphicsMode.Default, "Tutorial 05 - Texturing",
                  GameWindowFlags.Default, DisplayDevice.Default, 4, 5, GraphicsContextFlags.Default)
              {
              }

              protected override void OnLoad(EventArgs e)
              {
                  base.OnLoad(e);
                  CursorVisible = true;

                  GL.GenVertexArrays(1, out vaoID);
                  GL.BindVertexArray(vaoID);

                  GL.GenBuffers(1, out vboID);
                  GL.BindBuffer(BufferTarget.ArrayBuffer, vboID);
                  GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(vertices.Length * Vector3.SizeInBytes), vertices, BufferUsageHint.StaticDraw);

                  // The texcoords were originally uploaded to BufferTarget.TextureBuffer,
                  // which is not a vertex-attribute source; they need their own ArrayBuffer.
                  GL.GenBuffers(1, out tboID);
                  GL.BindBuffer(BufferTarget.ArrayBuffer, tboID);
                  GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(texcoords.Length * Vector2.SizeInBytes), texcoords, BufferUsageHint.StaticDraw);

                  GL.GenBuffers(1, out iboID);
                  GL.BindBuffer(BufferTarget.ElementArrayBuffer, iboID);
                  GL.BufferData(BufferTarget.ElementArrayBuffer, (IntPtr)(indices.Length * sizeof(int)), indices, BufferUsageHint.StaticDraw);

                  vertexShaderID = GL.CreateShader(ShaderType.VertexShader);
                  GL.ShaderSource(vertexShaderID, vertsrc);
                  GL.CompileShader(vertexShaderID);
                  fragmentShaderID = GL.CreateShader(ShaderType.FragmentShader);
                  GL.ShaderSource(fragmentShaderID, fragsrc);
                  GL.CompileShader(fragmentShaderID);

                  programID = GL.CreateProgram();
                  GL.AttachShader(programID, vertexShaderID);
                  GL.AttachShader(programID, fragmentShaderID);
                  // BindAttribLocation only takes effect before linking;
                  // the original called it after LinkProgram.
                  GL.BindAttribLocation(programID, 0, "position");
                  GL.BindAttribLocation(programID, 1, "textureCoords");
                  GL.LinkProgram(programID);

                  // Loading texture from embedded resource
                  texsrc = new Bitmap(Assembly.GetEntryAssembly().GetManifestResourceStream("Tutorial_05.example.png"));
                  textureID = GL.GenTexture();
                  GL.BindTexture(TextureTarget.Texture2D, textureID);
                  GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)All.Linear);
                  GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)All.Linear);
                  GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba, texsrc.Width, texsrc.Height, 0, PixelFormat.Bgra, PixelType.UnsignedByte, IntPtr.Zero);
                  // Lock as 32bppArgb so the memory layout matches the Bgra upload
                  // (the original used Format32bppRgb).
                  System.Drawing.Imaging.BitmapData bitmap_data = texsrc.LockBits(
                      new Rectangle(0, 0, texsrc.Width, texsrc.Height),
                      System.Drawing.Imaging.ImageLockMode.ReadOnly,
                      System.Drawing.Imaging.PixelFormat.Format32bppArgb);
                  GL.TexSubImage2D(TextureTarget.Texture2D, 0, 0, 0, texsrc.Width, texsrc.Height, PixelFormat.Bgra, PixelType.UnsignedByte, bitmap_data.Scan0);
                  texsrc.UnlockBits(bitmap_data);

                  // Point the sampler at texture unit 0. GL.Enable(EnableCap.Texture2D)
                  // is fixed-function state and does nothing for shader-based texturing.
                  GL.UseProgram(programID);
                  GL.Uniform1(GL.GetUniformLocation(programID, "textureSampler"), 0);
              }

              protected override void OnResize(EventArgs e)
              {
                  base.OnResize(e);
                  GL.Viewport(0, 0, ClientRectangle.Width, ClientRectangle.Height);
              }

              protected override void OnUpdateFrame(FrameEventArgs e)
              {
                  base.OnUpdateFrame(e);
                  keyState = Keyboard.GetState();
                  if (keyState.IsKeyDown(Key.Escape))
                  {
                      Exit();
                  }
              }

              protected override void OnRenderFrame(FrameEventArgs e)
              {
                  base.OnRenderFrame(e);

                  // Prepare the background (set the clear colour before clearing).
                  GL.ClearColor(Color4.Red);
                  GL.Clear(ClearBufferMask.ColorBufferBit);

                  // Draw the triangles.
                  GL.BindVertexArray(vaoID);
                  GL.UseProgram(programID);
                  GL.EnableVertexAttribArray(0);
                  GL.EnableVertexAttribArray(1);

                  GL.BindBuffer(BufferTarget.ArrayBuffer, vboID);
                  GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, 0, IntPtr.Zero);
                  // Attribute 1 never had a pointer set in the original.
                  GL.BindBuffer(BufferTarget.ArrayBuffer, tboID);
                  GL.VertexAttribPointer(1, 2, VertexAttribPointerType.Float, false, 0, IntPtr.Zero);

                  GL.ActiveTexture(TextureUnit.Texture0);
                  // The texture was created as Texture2D but was bound here as Texture3D.
                  GL.BindTexture(TextureTarget.Texture2D, textureID);

                  GL.BindBuffer(BufferTarget.ElementArrayBuffer, iboID);
                  GL.DrawElements(BeginMode.Triangles, indices.Length, DrawElementsType.UnsignedInt, 0);

                  GL.DisableVertexAttribArray(0);
                  GL.DisableVertexAttribArray(1);
                  SwapBuffers();
              }

              protected override void OnClosed(EventArgs e)
              {
                  base.OnClosed(e);
                  GL.DeleteVertexArray(vaoID);
                  GL.DeleteBuffer(vboID);
                  GL.DeleteBuffer(tboID);
                  GL.DeleteBuffer(iboID);
              }
          }
      }

      I can't remember where I'm supposed to add GL.Uniform2().

OpenGL Texture reloading when toggling fullscreen, SDL+OpenGL


Recommended Posts

Dear reader,

Today I decided to tackle the problem of reloading my textures whenever the OpenGL context gets lost and recovered (for example when SDL switches to fullscreen or windowed mode, and a new window is created). However, this seems to be harder than I reckoned, because I can't get it to work. Whenever I switch to fullscreen, the simple textured GL_QUADS I render every frame disappears, and when switching back to windowed mode, it doesn't come back.

Before asking any other questions: when I don't texture the quad (but just glColor3f() it) and then switch to fullscreen, the quad disappears just as well. This isn't supposed to happen, right? As far as I know, I need to reload textures because they allocate memory internal to OpenGL, but a simple glBegin(GL_QUADS)/glEnd() structure does not, right? Can anyone point me in the right direction and take a guess as to why my vertices disappear?

Oh, and if it's any help: I clear the screen with a colour determined by the mouse position (so it changes whenever I move my cursor). This works before I switch to fullscreen, but does so too in fullscreen mode and after switching back. I also glEnable(GL_TEXTURE_2D) after (re)setting up SDL's video mode, so that can't be it either.

Thank you for reading (and hopefully replying),
- Stijn Frishert

This is a problem only DX has. OpenGL always internally keeps copies of all objects in system RAM; there's no concept of a "lost context" in GL. But "a new window is created" means that your context stays intact in the previous window. Make SDL (or whatever) not recreate the window; recreating it is pointless. If that can't be fixed, then ditch SDL and spend a few minutes writing your own windowing :)

Quote:
Today I decided to tackle the problem of reloading my textures whenever the OpenGL context gets lost and recovered (for example when SDL switches to fullscreen or windowed mode, and a new window is created). However, this seems to be harder than I reckoned, because I can't get it to work. Whenever I switch to fullscreen, the simple textured GL_QUADS I render every frame disappears, and when switching back to windowed mode, it doesn't come back.

Before asking any other questions: when I don't texture the quad (but just glColor3f() it) and then switch to fullscreen, the quad disappears just as well. This isn't supposed to happen, right? As far as I know, I need to reload textures because they allocate memory internal to OpenGL, but a simple glBegin(GL_QUADS)/glEnd() structure does not, right? Can anyone point me in the right direction and take a guess as to why my vertices disappear?

Oh, and if it's any help: I clear the screen with a colour determined by the mouse position (so it changes whenever I move my cursor). This works before I switch to fullscreen, but does so too in fullscreen mode and after switching back. I also glEnable(GL_TEXTURE_2D) after (re)setting up SDL's video mode, so that can't be it either.
The first thing to check would probably be your OpenGL state setup. When the OpenGL context is lost and recreated, the entire OpenGL state is reset to its default; this includes render states, transform matrices, etc. Once the context has been recreated, you'll need to repeat whatever setup work you did at startup; if you miss anything crucial (such as setting up the projection matrix), you may not get the results you expect after the reset.

You are right that textures need to be re-uploaded to OpenGL after the context is recreated; this also holds for other volatile resources that may reside in video memory or in OpenGL-managed system memory, such as VBOs or display lists. Code such as glBegin()/End() blocks, on the other hand, simply issue commands and will be unaffected (aside from being dependent on the overall OpenGL state, of course).

What OS are you developing on? Windows? Also, if I may ask, are you only losing the context on windowed<->fullscreen switches? Or are you losing the context in other situations as well?
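To make that concrete, here is a rough sketch of redoing the setup work right after the video mode is (re)set. This is not code from this thread: it assumes the SDL 1.2-era API, and reloadTextures() is a hypothetical helper that re-uploads every image with glTexImage2D.

#include <SDL.h>
#include <SDL_opengl.h>

void reloadTextures(void);  /* hypothetical: glGenTextures + glTexImage2D per image */

void setVideoMode(int width, int height, int fullscreen)
{
    Uint32 flags = SDL_OPENGL | (fullscreen ? SDL_FULLSCREEN : 0);

    /* On some platforms SDL 1.2 recreates the window here, and the GL
       context (with every texture, VBO and display list) goes with it. */
    SDL_SetVideoMode(width, height, 32, flags);

    /* The fresh context starts from default state, so repeat the setup: */
    glViewport(0, 0, width, height);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, (double)width, (double)height, 0.0, -1.0, 1.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glEnable(GL_TEXTURE_2D);

    reloadTextures();
}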

Quote:
This is a problem only DX has. OpenGL always internally keeps copies of all objects in system RAM; there's no concept of a "lost context" in GL. But "a new window is created" means that your context stays intact in the previous window. Make SDL (or whatever) not recreate the window; recreating it is pointless. If that can't be fixed, then ditch SDL and spend a few minutes writing your own windowing :)
I think ditching SDL because of this behavior might be a little shortsighted; especially if you're targeting multiple OSs/platforms, writing your own windowing and events code (and making it 'play nice' with each OS) will likely take more than a few minutes, I would think. Also, for what it's worth, there's been some discussion on the SDL forums of trying to address this problem in SDL 1.3.

As for the context never being lost in Windows, I've read that as well, but even with very simple programs (i.e. rendering a single textured triangle) I've had problems with the app crashing sometimes when you alt-tab in and out of fullscreen mode. In other places I've read that you're not supposed to render to the context when the app doesn't have focus, but I haven't been able to confirm this. (I'm not sure if any of this relates to the OP's problem, but it's something I've been curious about.)

I've been coding GL stuff for years without relying on stuff like SDL (I don't need cross-platforming), and gpu-resets never ever crashed my code or made it misbehave. Neither did unfocusing/hiding the window. All GL state was automatically and transparently restored for me. Have I been lucky for a change? (I doubt it)

Quote:
I've been coding GL stuff for years without relying on stuff like SDL (I don't need cross-platforming)
Sure, it's a different situation if you have no need for cross-platform support.
Quote:
and gpu-resets never ever crashed my code or made it misbehave. Neither did unfocusing/hiding the window. All GL state was automatically and transparently restored for me. Have I been lucky for a change? (I doubt it)
Maybe the problem I'm seeing is specific to SDL or to my particular hardware configuration (I have a fairly old Intel video card).

I set up the window myself, and I don't have problems with alt-tabbing. I simply don't render when focus is lost (it's a fullscreen app, so it's pointless to do anything without focus). I call ChangeDisplaySettings, ShowWindow and SetForegroundWindow when handling alt-tab.

I'm sure it has some kind of equivalent in SDL.
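In raw Win32 terms, that might look roughly like the following (g_hWnd, g_devMode and g_hasFocus are illustrative names, not code from this thread):

#include <windows.h>

extern HWND    g_hWnd;
extern DEVMODE g_devMode;   /* the fullscreen display mode set at startup */
extern BOOL    g_hasFocus;  /* the render loop draws nothing while FALSE */

/* Called from the window procedure on WM_ACTIVATE. */
void onActivate(WPARAM wParam)
{
    if (LOWORD(wParam) == WA_INACTIVE) {
        ChangeDisplaySettings(NULL, 0);                  /* restore desktop mode */
        ShowWindow(g_hWnd, SW_MINIMIZE);
        g_hasFocus = FALSE;
    } else {
        ChangeDisplaySettings(&g_devMode, CDS_FULLSCREEN);
        ShowWindow(g_hWnd, SW_RESTORE);
        SetForegroundWindow(g_hWnd);
        g_hasFocus = TRUE;
    }
}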

Thanks for replying, idinev, jyk and szecs! I solved the problem, thanks to you guys. It did indeed lie in not resetting the OpenGL state correctly after switching.

Quote:
original post by jyk:
You are right that textures need to be re-uploaded to OpenGL after the context is recreated; this also holds for other volatile resources that may reside in video memory or in OpenGL-managed system memory, such as VBOs or display lists. Code such as glBegin()/End() blocks, on the other hand, simply issue commands and will be unaffected (aside from being dependent on the overall OpenGL state, of course).

What OS are you developing on? Windows? Also, if I may ask, are you only losing the context on windowed<->fullscreen switches? Or are you losing the context in other situations as well?

Yep, it was indeed the OpenGL state. Now for another important question, though: should I glDeleteTextures() all my textures before switching fullscreen<->windowed? I wouldn't want any allocated and unreferenced memory internal to OpenGL hanging around.

I'm developing for Windows right now. I'm not necessarily after cross-platform; I just happen to like the SDL+OpenGL combination (although if porting to e.g. OS X worked out, why wouldn't I?). As far as I know, I only lose the context when switching from windowed to fullscreen and vice versa. Alt-tabbing works perfectly well, if that's what you're asking.

Again, thanks for helping out,
- Stijn

Quote:
Should I glDeleteTextures() all my textures before switching fullscreen<->windowed? I wouldn't want any allocated and unreferenced memory internal to OpenGL hanging around.
I think textures and other similar resources are released/destroyed with the context, but I don't know this for sure (I always delete such resources explicitly before recreating the context, just to be sure).
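As a sketch, that explicit-cleanup order would be something like this (textureIds, textureCount and the mode parameters are illustrative, not code from this thread):

#include <SDL.h>
#include <SDL_opengl.h>

extern GLuint textureIds[];
extern int    textureCount;

void recreateContext(int w, int h, Uint32 flags)
{
    glDeleteTextures(textureCount, textureIds);  /* release GL-side storage first */
    SDL_SetVideoMode(w, h, 32, flags);           /* window and context are recreated */
    glGenTextures(textureCount, textureIds);     /* fresh names in the new context */
    /* ...then re-upload each image with glTexImage2D, as at startup */
}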

Maybe you should post your initialization code.

I may be wrong, but maybe you are trying to set a fullscreen mode that isn't supported. Maybe just 24-bit instead of 32, or something like that, so maybe that's the reason your context is getting lost.
But that may be totally wrong.

Lot of "maybe"s for a post.
