OpenGL 4.4 render to SNORM


Hi,

is it possible to render to a SNORM texture in OpenGL 4.4? Apparently they are not a required format for color targets in 4.2.

I want to render to an RG16_SNORM target to store normals in octahedron format. The linked paper contains code that expects and outputs data in the [-1, 1] range, and I was just assuming that it would automatically work with SNORM textures.

The output seems to get clamped to [0, 1] though. I checked with a floating point render target and got the expected results, so I don't think it is an issue with the code.

Should this work? Am I maybe doing something wrong when creating the texture?
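
For context, the setup is roughly along these lines (a simplified sketch, not my exact code; width and height are placeholders):

    // Immutable storage for the normal target (GL 4.2+).
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_RG16_SNORM, width, height);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // Attach it as the color target of an FBO.
    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex, 0);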

EDIT:

D3D11 hardware supports SNORM render targets, so I guess I'm doing something wrong.


Hopefully this is helpful: https://www.opengl.org/registry/specs/EXT/texture_snorm.txt

I don't know. It says:

We are silent about requiring R, RG, RGB and RGBA rendering. This is an implementation choice.

As the hardware seems to be perfectly capable of rendering to SNORM, I expect that it is implemented in all drivers.

Has someone here successfully rendered to SNORM with OpenGL?

EDIT:

This OpenGL 4.4 core spec document also does not mark the SNORM formats as something that must be supported for a color target. Maybe rendering to SNORM really isn't supported. Can anybody confirm this?

Are you checking for framebuffer completeness? The spec says:


Implementations must support framebuffer objects with up to MAX_COLOR_ATTACHMENTS color attachments, a depth attachment, and a stencil attachment. Each color attachment may be in any of the color-renderable formats described in section 9.4 (although implementations are not required to support creation of attachments in all color-renderable formats).

RG16_SNORM is a color-renderable format, but it may not be supported by your graphics card or driver. I didn't actually see where the spec lists the required formats (I might have overlooked it), but there is this: I see UNORM, SINT and UINT, but no SNORM. So my best guess is that your GPU/driver does not support SNORM framebuffers.
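
If you haven't already, it's worth checking the status explicitly after attaching the texture; a sketch of what I mean (assumes the FBO with the RG16_SNORM attachment is currently bound to GL_FRAMEBUFFER, and <cstdio> for printf):

    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE)
    {
        // GL_FRAMEBUFFER_UNSUPPORTED is the status you'd expect if the
        // driver rejects this attachment format.
        printf("FBO incomplete: 0x%04X\n", status);
    }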

Yes, I'm checking for completeness. The behavior I get is completely identical to just using a UNORM target, so it seems to be silently converting to that. I also don't get any debug output from OpenGL.

This is with a GeForce GTX 570, driver 331.38 on Ubuntu.

Do you have experience with rendering to this format?


Does that mean it returned GL_FRAMEBUFFER_COMPLETE? Like I said, SNORM is not a required format, so Nvidia doesn't have to supply it. If they don't, it should return GL_FRAMEBUFFER_UNSUPPORTED. If it doesn't, then it may be a driver bug.

Since SNORM is not a guaranteed thing, maybe you should just render to a floating point target and then convert to SNORM on the CPU.

Edit: Or I suppose you could render to an integer format and do the conversion in your shader. http://msdn.microsoft.com/en-us/library/windows/desktop/dd607323%28v=vs.85%29.aspx
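
Something along those lines could look like this (rough sketch, untested; GL_RG16I as the assumed target, shader source shown as a string, and the 32767 scale mirrors the SNORM conversion rule from that link):

    // Fragment shader for a GL_RG16I color attachment; quantization done by hand.
    const char* fragSrc = R"GLSL(
        #version 440
        in  vec2  octNormal;   // encoded normal from the vertex shader, in [-1, 1]
        out ivec2 outNormal;   // bound to the RG16I attachment

        void main()
        {
            // Same mapping SNORM would apply: clamp, scale by 32767, round.
            outNormal = ivec2(round(clamp(octNormal, -1.0, 1.0) * 32767.0));
        }
    )GLSL";
    // Reading it back later needs an isampler2D and a divide by 32767.0.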

Yes, it returned GL_FRAMEBUFFER_COMPLETE.

Maybe the easiest thing is to render to UNORM and map [-1, 1] to [0, 1] in the shader. Not really a big deal, but I wanted to find out what the problem is.
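
For the record, the remap would look something like this on the write side (sketch; encodeOctahedron stands in for the paper's encode function, and the target is assumed to be plain RG16 UNORM):

    // Fragment shader writing to an RG16 (UNORM) attachment.
    const char* fragSrc = R"GLSL(
        #version 440
        in  vec3 worldNormal;
        out vec2 outNormal;

        vec2 encodeOctahedron(vec3 n);   // the paper's encode, returns values in [-1, 1]

        void main()
        {
            // Remap [-1, 1] -> [0, 1] so the UNORM target doesn't clamp.
            outNormal = encodeOctahedron(normalize(worldNormal)) * 0.5 + 0.5;
        }
    )GLSL";
    // Read side undoes it: vec2 oct = texture(gbufNormals, uv).rg * 2.0 - 1.0;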

