PixelPants
Member

  • Content Count: 18
  • Joined
  • Last visited

Community Reputation: 100 Neutral

About PixelPants

  • Rank: Member
  1. So, I'm finding that when multiple sound sources play at exactly the same time at roughly the same location as the listener, the combined volume gets really loud. What I'd like to do is have OpenAL simply put a limit on the maximum volume of the combined source volumes, but I can't find a function in OpenAL to do this. Anyone got any suggestions better than just "don't play sound sources at the same time near the listener"?
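    Here's roughly the hack I'm imagining, in case it helps explain what I'm after; a minimal sketch with OpenTK's AL bindings, where the gain-budget parameter and the nearby-source list are made up for illustration:

    [source lang=csharp]
    using System.Collections.Generic;
    using OpenTK.Audio.OpenAL;

    // Hypothetical helper: split a total gain budget among the sources
    // currently playing near the listener, so their sum stays capped.
    static void LimitCombinedGain(List<int> nearbySources, float maxCombinedGain)
    {
        if (nearbySources.Count == 0)
            return;
        // Each source gets an equal share of the budget, capped at 1.0.
        float perSource = System.Math.Min(1.0f, maxCombinedGain / nearbySources.Count);
        foreach (int source in nearbySources)
            AL.Source(source, ALSourcef.Gain, perSource);
    }
    [/source]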
  2. PixelPants

    [.net] Single-cast delegate?

    Agreed; if I can't do this cleanly with a delegate, I wonder if I could solve this with an interface. For instance, instead of a delegate-based design:

    [source lang=csharp]
    SwimmingPool swimpool = new SwimmingPool();
    swimpool.Overflow = new OverflowDelegate(PoolOverflowed);
    swimpool.Overflow += PoolOverflowed; /* attempting to prevent this (user may be tempted to write this in later files, without realizing they don't have to!) */
    [/source]

    I could replace the delegate with just an interface that gets called:

    [source lang=csharp]
    SwimmingPool swimpool = new SwimmingPool();
    swimpool.IOverflow = this; /* no temptation to "+=" later, though in my situation this should never get changed after it has been set -- and yet it still can be! */
    [/source]

    Hmmm, maybe if I made it readonly and passed it through the constructor; SwimmingPool(IOverflow overflow)...

    [source lang=csharp]
    SwimmingPool swimpool = new SwimmingPool(this);
    // swimpool.IOverflow = this; // <-- would not compile =D
    [/source]

    Alright, I think I've got a work-around.
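    For completeness, here's the fuller shape of that constructor-injection idea; the interface and class names come from my example above, while OnWaterTooHigh is just a made-up trigger point:

    [source lang=csharp]
    public interface IOverflow
    {
        void PoolOverflowed();
    }

    public class SwimmingPool
    {
        private readonly IOverflow MyOverflow; // set once; can never be rebound or "+="-ed

        public SwimmingPool(IOverflow overflow)
        {
            MyOverflow = overflow;
        }

        private void OnWaterTooHigh()
        {
            MyOverflow.PoolOverflowed(); // exactly one handler, guaranteed
        }
    }
    [/source]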
  3. PixelPants

    [.net] Single-cast delegate?

    Are you being sincere? Yah, that's what I had in mind; the KeyPressEventArgs from System.Windows.Forms uses a 'Handled' property that has identical functionality to the 'bool' that would have normally been my return value.
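    To spell the pattern out for anyone who finds this later (the OverflowEventArgs name is mine, not from any library): the handler signals "I dealt with it" through a property instead of a return value, mirroring KeyPressEventArgs.Handled.

    [source lang=csharp]
    using System;

    public class OverflowEventArgs : EventArgs
    {
        private bool MyHandled;

        // The callee sets this instead of returning a bool.
        public bool Handled
        {
            get { return MyHandled; }
            set { MyHandled = value; }
        }
    }
    [/source]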
  4. PixelPants

    [.net] Single-cast delegate?

    Are there any alternatives that can be used when .NET 2.0 is the target framework? Func is purely .NET 3.5+, and Action is only in .NET 2.0 with exactly one parameter (the 0- and 2+-parameter overloads are all .NET 3.5+). I might be able to get by with Action<T> (single parameter and no return type), but it won't be pretty...
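    One .NET 2.0-friendly fallback I'm considering is just declaring the delegate types myself; Func and Action aren't magic, they're ordinary declarations that happen to ship with the 3.5 libraries:

    [source lang=csharp]
    // Hand-rolled equivalents of the 3.5 delegates; these compile fine
    // against .NET 2.0, since 'delegate' declarations only need generics,
    // not any library support.
    public delegate TResult Func<TArg, TResult>(TArg arg);
    public delegate void Action<TArg1, TArg2>(TArg1 arg1, TArg2 arg2);
    [/source]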
  5. All this time I thought delegates were single-cast, and that events were essentially a list of delegates (allowing for multi-cast). Well, today I learned that you can use the += operator on a delegate! As can be expected, this knocked me off my chair... I dug into the theory a bit, and found that the 'delegate' keyword essentially uses 'System.MulticastDelegate', and that 'System.Delegate' is a single-cast delegate:

    System.Object -> System.Delegate -> System.MulticastDelegate

    However, according to MSDN, I'm not allowed to use System.Delegate (it's a "special class" to be used only by the compiler). So what's the standard way to get a single-cast delegate?
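    To illustrate what surprised me, here's a self-contained repro in case anyone wants to try it; every delegate the C# compiler generates derives from MulticastDelegate, so += just works on all of them:

    [source lang=csharp]
    using System;

    public delegate void Notify(); // compiles to a class deriving from System.MulticastDelegate

    class Program
    {
        static void Main()
        {
            Notify n = First;
            n += Second;  // legal on any C# delegate -- they are all multicast
            n();          // prints "First" then "Second"
            Console.WriteLine(n.GetType().BaseType); // System.MulticastDelegate
        }

        static void First()  { Console.WriteLine("First"); }
        static void Second() { Console.WriteLine("Second"); }
    }
    [/source]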
  6. PixelPants

    [solved] 2D Transparency Order

    I found it, it's the same issue these guys were having:

    http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=143671
    http://forum.openframeworks.cc/index.php/topic,1643.0.html

    It turns out it has nothing to do with the Z-order sorting / depth testing, contrary to what I thought. The issue is how transparency is retained in an FBO.

    [quote name='TomorrowPlusX']
    What's happening, as far as I can tell, is this: I render the tree trunk with full opacity; the alpha channel of the FBO's target texture is set to 1 wherever the trunk fragments go. This is correct behavior. Then, I render the leaves (a bunch of textured quads). The leaves are rendered fully opaque, but the texture has an alpha channel. Wherever the alpha < 1, the alpha channel of the FBO seems to be multiplied by the textured fragment's alpha. E.g., if the fragment's alpha is 0.5, and the destination value is 1, I'm getting 0.5 as a result. If another fragment at 0.5 is rendered on top, the alpha goes to 0.25, and so on.
    [/quote]

    In my situation, I was able to resolve it by disabling blending when rendering the FBO to the screen.
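    Concretely, the fix amounts to something like this in the composite pass (a sketch of the idea, not my exact code):

    [source]
    // Blending was only needed while rendering INTO the FBO; the composite
    // pass copies the finished image, so the (mangled) destination alpha
    // stored in the texture must be ignored.
    GL.Disable(EnableCap.Blend);
    DrawOffscreenTexture();      // fullscreen textured quad
    GL.Enable(EnableCap.Blend);  // restore for the next frame's scene pass
    [/source]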
  7. PixelPants

    [solved] 2D Transparency Order

    Ugh, I can't figure this out. Soo lost, lol... apatriarca, by coincidence, the order in which I render everything is back to front, and (for the most part) opaque to non-opaque; I never make any attempt to render behind something that has already been rendered. It was my understanding that with the depth buffer disabled, I wouldn't have to specify the Z axis. However, even with a monotonically increasing (or even decreasing) Z-axis value, the blending is still wrong. In the pic I posted, if near = -1 and far = 1, then I render this picture @ [50, 50, 1] and then I render this picture @ [50, 50, 0.9], and yet instead of blending with the opaque quad beneath it, it blended with the background. And that makes no sense. I've tried this with and without depth-testing, with and without reversing the near/far, and several other ideas that I had, none of which worked. The blending function is:

    [source]
    GL.Enable(EnableCap.Blend);
    GL.BlendFunc(BlendingFactorSrc.SrcAlpha, BlendingFactorDest.OneMinusSrcAlpha);
    [/source]

    and it does the blending great, except... the background color of the scene seems to be bleeding in!
  8. PixelPants

    [solved] 2D Transparency Order

    Alright, I've got depth testing disabled, and I think I've sorted all my quads by Z-order. The way I sorted it was calling this function inside each of my primitive draw functions (drawpoints, drawline, drawquad, etc.):

    [source]
    private void SetModelView()
    {
        GL.MatrixMode(MatrixMode.Modelview);
        GL.LoadIdentity();
        double TheTranslate = 0.375; /* for pixel perfect accuracy */
        MyOrderZ += MyStepZ;         /* increment Z for proper sorting */
        GL.Translate(TheTranslate, TheTranslate, MyOrderZ);
    }
    [/source]

    and here is my ortho and step stuff...

    [source]
    const double MyStepZ = 0.0000001;
    const double MyFarZ = -1;
    const double MyNearZ = 1;
    [/source]

    Yet... something is still wrong; it looks like it's still blending with the background instead of the quad beneath it, as is evident in this picture:
  9. So... if I've got some textured quads that I want to render in my (completely) 2D application, and they've either got transparent texels or transparent vertices, do I need to sort the quads manually (via the z-axis), or will OpenGL do that for me via the order in which I render the quads? Doing a search on the forums on this subject found me this: but the OP was using the depth-buffer in his application, so I'm not sure if phantom's answer is my answer as well...
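    If it helps frame the question: my understanding (which may be wrong) is that with depth testing off, blending simply reads whatever is in the framebuffer at draw time, so draw order alone decides the result. Something like the following, where DrawQuad and the sprite names are placeholders:

    [source]
    // Painter's algorithm sketch: with depth testing disabled, the blend
    // equation samples the framebuffer as-is, so drawing strictly
    // back-to-front is what produces correct transparency.
    GL.Disable(EnableCap.DepthTest);
    GL.Enable(EnableCap.Blend);
    GL.BlendFunc(BlendingFactorSrc.SrcAlpha, BlendingFactorDest.OneMinusSrcAlpha);

    DrawQuad(background);   // drawn first, fully opaque
    DrawQuad(opaqueSprite); // drawn second, covers the background
    DrawQuad(glassSprite);  // drawn last, blends with what's already there
    [/source]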
  10. I'm seeing that when I set glPointSize to a value greater than 1, the entire point disappears as soon as the upper-left pixel of the point goes offscreen, even if the point was 100 pixels wide... How do I tell OpenGL to stop doing the thinking, or at the least, to let these points render even if they were initially offscreen?
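    One workaround I'm considering (an unverified sketch; DrawFatPoint and its parameters are made up): draw a small quad instead of a GL_POINTS point, since a quad's four vertices are clipped independently and the onscreen part still renders.

    [source]
    // A point-sized quad survives partial clipping, unlike GL_POINTS,
    // because the primitive is clipped per-edge rather than discarded
    // whole when its single vertex leaves the viewport.
    private void DrawFatPoint(double x, double y, double size)
    {
        double half = size / 2.0;
        GL.Begin(BeginMode.Quads);
        GL.Vertex2(x - half, y - half);
        GL.Vertex2(x - half, y + half);
        GL.Vertex2(x + half, y + half);
        GL.Vertex2(x + half, y - half);
        GL.End();
    }
    [/source]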
  11. So I'm using an FBO to allow me to render the entire scene onto a texture, and then stretching that texture 2x'd over the screen. The problem I'm having is that when the underlying FBO texture is resized and reattached to the FBO, things either stop rendering completely or are partially off the screen.

    CheckIfWeNeedToResizeOffscreenBuffer()
    This code just checks if the size of the window is too large for the underlying FBO buffer, and if it is, it deletes the existing texture and creates a new one that fits (always a power-of-2 size).

    [source]
    private void CheckIfWeNeedToResizeOffscreenBuffer()
    {
        int TheExtent = Math.Max(this.ClientSize.Width, this.ClientSize.Height);
        if (TheExtent > MyOffscreenTextureSize)
        {
            if (MyOffscreenTexture != 0)
                cTexture.DeleteTexture(ref MyOffscreenTexture);
            MyOffscreenTextureSize = 1;
            while (MyOffscreenTextureSize < TheExtent)
                MyOffscreenTextureSize *= 2;
            cTexture.GenerateTexture(MyOffscreenTextureSize, out MyOffscreenTexture);
            sCheckError();
            GL.BindTexture(TextureTarget.Texture2D, 0);
            GL.BindFramebuffer(FramebufferTarget.FramebufferExt, MyFrameBufferObject);
            GL.FramebufferTexture2D(FramebufferTarget.FramebufferExt, FramebufferAttachment.ColorAttachment0Ext, TextureTarget.Texture2D, MyOffscreenTexture, 0);
            FramebufferErrorCode TheRet = GL.CheckFramebufferStatus(FramebufferTarget.FramebufferExt);
            if (TheRet != FramebufferErrorCode.FramebufferComplete)
                throw new Exception("Fatal framebuffer error: " + TheRet.ToString() + ".");
            GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);
        }
    }
    [/source]

    SetProjection()
    This code sets up the ortho for whether the destination is the FBO or the main window.

    [source]
    private void SetProjection(bool TheForFBO)
    {
        if (TheForFBO)
        {
            GL.Viewport(0, 0, MyOffscreenTextureSize, MyOffscreenTextureSize);
            sCheckError();
        }
        else
            GL.Viewport(0, 0, this.ClientSize.Width, this.ClientSize.Height);
        GL.MatrixMode(MatrixMode.Projection);
        GL.LoadIdentity();
        if (TheForFBO)
        {
            GL.Ortho(0, MyOffscreenTextureSize, MyOffscreenTextureSize, 0, -1.0, 1.0);
            sCheckError();
        }
        else
            GL.Ortho(0, this.ClientSize.Width, this.ClientSize.Height, 0, -1.0, 1.0);
        GL.MatrixMode(MatrixMode.Modelview);
        GL.LoadIdentity();
        double TheTranslate = 0.375;
        GL.Translate(TheTranslate, TheTranslate, 0.0);
    }
    [/source]

    Oh, and on the form's resize event, this is what gets called:

    [source]
    CheckIfWeNeedToResizeOffscreenBuffer();
    SetProjection(true);
    [/source]

    And lastly... sDisplay() and DrawOffscreenTexture(). sDisplay() gets called when the rendering to the FBO is finished and it's time to draw it to the screen, which is what DrawOffscreenTexture() does.

    [source]
    public void sDisplay()
    {
        CheckContext();
        /* Unbind the FBO and bind the texture we've been rendering to. */
        GL.BindFramebuffer(FramebufferTarget.FramebufferExt, 0);
        GL.BindTexture(TextureTarget.Texture2D, MyOffscreenTexture);
        SetProjection(false);
        DrawOffscreenTexture();
        MyControl.SwapBuffers();
        sCheckError();
        SetProjection(true);
        GL.BindFramebuffer(FramebufferTarget.FramebufferExt, MyFrameBufferObject);
        GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);
        GL.ClearColor(Color.Black);
    }

    private void DrawOffscreenTexture()
    {
        double TheSourceLeftX = 0;
        double TheSourceRightX = 1;  // MyForm.MyWidth / (double)MyOffscreenTextureSize;
        double TheSourceTopY = 1;
        double TheSourceBottomY = 0; // ClientSize.Height / (double)MyOffscreenTextureSize;
        int[] TheViewport = new int[4];
        GL.GetInteger(GetPName.Viewport, TheViewport);
        double TheDestLeftX = 0;
        double TheDestRightX = ClientSize.Width;
        double TheDestTopY = ClientSize.Height;
        double TheDestBottomY = 0;
        //double TheSourceLeftX = 0;
        //double TheSourceRightX = MyForm.MyWidth / (double)MyOffscreenTextureSize;
        //double TheSourceTopY = ClientSize.Height / (double)MyOffscreenTextureSize;
        //double TheSourceBottomY = MyForm.MyHeight / (double)MyOffscreenTextureSize;

        /* Now draw the offscreen buffer onto the screen. */
        GL.Begin(BeginMode.Quads);
        GL.Color4(Color.White);
        /* Bottom Left. */
        GL.TexCoord2(TheSourceLeftX, TheSourceBottomY);
        GL.Vertex2(TheDestLeftX, TheDestBottomY);
        /* Top Left. */
        GL.TexCoord2(TheSourceLeftX, TheSourceTopY);
        GL.Vertex2(TheDestLeftX, TheDestTopY);
        /* Top Right. */
        GL.TexCoord2(TheSourceRightX, TheSourceTopY);
        GL.Vertex2(TheDestRightX, TheDestTopY);
        /* Bottom Right. */
        GL.TexCoord2(TheSourceRightX, TheSourceBottomY);
        GL.Vertex2(TheDestRightX, TheDestBottomY);
        GL.End();
    }
    [/source]

    DrawOffscreenTexture() is in a bit of a bad state atm because I'm essentially using trial & error to try and see which parameter may be off, but it's not helping; I'm completely stumped.

    Edit: I solved it.
  12. PixelPants

    My alpha-blended textures look odd...

    Nevermind; the issue magically disappeared.
  13. On the left is a PNG as it appears in GIMP; on the right is that image loaded into my 2D OpenGL application. As you can see, transparency is working, but the blending seems a bit off; the edges of the image are quite unpleasant. I'm using:

    [source]
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    [/source]

    with AlphaTest enabled and DepthTest disabled. I don't think this is a "premultiplied alpha" issue, so... could it be that I need DepthTesting enabled and to (very) slightly increment the Z value of each of the 2D draws?
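    For the record, if it does turn out to be premultiplied alpha after all, my understanding is the fix would look something like this (an untested sketch; the raw RGBA 'pixels' array is an assumption about my loader):

    [source lang=csharp]
    // Premultiply RGB by alpha at texture-load time, then blend with ONE
    // instead of SRC_ALPHA. Fringe-colored texels with alpha == 0 then
    // contribute nothing, which removes the halos around edges.
    for (int i = 0; i < pixels.Length; i += 4)
    {
        float a = pixels[i + 3] / 255f;
        pixels[i + 0] = (byte)(pixels[i + 0] * a);
        pixels[i + 1] = (byte)(pixels[i + 1] * a);
        pixels[i + 2] = (byte)(pixels[i + 2] * a);
    }
    GL.BlendFunc(BlendingFactorSrc.One, BlendingFactorDest.OneMinusSrcAlpha);
    [/source]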
  14. PixelPants

    Texture Border

    I replaced glCopyTexSubImage2D() with glCopyTexImage2D(), and the problem went away. My code is less efficient now (as it copies the entire frame-buffer as opposed to just the part that would get scaled up), but I'm too frustrated with the OpenGL API right now to figure out why one function works and the other doesn't.
  15. PixelPants

    Texture Border

    Ugh... I just observed another odd behavior, and it has convinced me this is no longer a border issue. The trigger condition is when the window size increases past the next power-of-2 threshold; as the window crosses these thresholds, the underlying off-screen buffer that I render to needs to get resized. When the destination texture has its GL_TEXTURE_HEIGHT hardcoded to 1024 (1024x1024), everything is fine (even when "(yoffset+height) == h"). However, when I hardcode the GL_TEXTURE_HEIGHT to 2048 (2048x2048), the error occurs. My ATI Radeon HD 4650 is supposed to be able to support 8192x8192 textures though, so this is kinda weird. :|
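    One sanity check I still need to run (a sketch, if I've got the OpenTK names right): ask the driver what it actually supports, both the hard size limit and, via a proxy texture, whether this exact size/format combination would allocate.

    [source lang=csharp]
    // Query the driver's real limits instead of trusting the spec sheet.
    int TheMaxSize;
    GL.GetInteger(GetPName.MaxTextureSize, out TheMaxSize);
    Console.WriteLine("GL_MAX_TEXTURE_SIZE: " + TheMaxSize);

    // Proxy textures report whether a given width/height/format would
    // actually be allocatable, without allocating anything.
    GL.TexImage2D(TextureTarget.ProxyTexture2D, 0, PixelInternalFormat.Rgba,
                  2048, 2048, 0, PixelFormat.Rgba, PixelType.UnsignedByte, IntPtr.Zero);
    int TheProxyWidth;
    GL.GetTexLevelParameter(TextureTarget.ProxyTexture2D, 0,
                            GetTextureParameter.TextureWidth, out TheProxyWidth);
    if (TheProxyWidth == 0)
        Console.WriteLine("2048x2048 RGBA is not supported in this context.");
    [/source]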