

OpenGL 4.4 render to SNORM



#1 B_old   Members   -  Reputation: 666


Posted 29 May 2014 - 07:27 AM

Hi,

 

Is it possible to render to a SNORM texture in OpenGL 4.4? Apparently they are not a required format for color targets in 4.2.

 

I want to render to a RG16_SNORM target to store normals in octahedron format. The linked paper contains code that expects and outputs data in the [-1, 1] range and I was just assuming that it would automatically work with SNORM textures.
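The setup is essentially this (a simplified sketch; sizes and variable names are placeholders, not the actual code):

    /* Simplified sketch: create an RG16_SNORM texture and attach it as a
       color target. width/height are placeholder values. */
    GLsizei width = 1280, height = 720;
    GLuint tex, fbo;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_RG16_SNORM, width, height);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);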

 

The output seems to get clamped to [0, 1] though. I checked with a floating-point render target and got the expected results, so I don't think it is an issue with the code.

 

Should this work? Am I maybe doing something wrong when creating the texture?

 

EDIT:

 

D3D11 hardware supports SNORM render targets, so I guess I'm doing something wrong.


Edited by B_old, 29 May 2014 - 07:51 AM.



#2 ryutenchi   Members   -  Reputation: 353


Posted 29 May 2014 - 09:27 AM

Hopefully this is helpful: https://www.opengl.org/registry/specs/EXT/texture_snorm.txt



#3 B_old   Members   -  Reputation: 666


Posted 29 May 2014 - 10:15 AM

I don't know. It says:

 

We are silent about requiring R, RG, RGB and RGBA rendering. This is an implementation choice.

 

As the hardware seems to be perfectly capable of rendering to SNORM, I expect it to be implemented by all drivers.

Has anyone here successfully rendered to SNORM with OpenGL?

 

EDIT:

 

This OpenGL 4.4 core spec document also does not mark the SNORM formats as something that must be supported for a color target. Maybe rendering to SNORM really isn't supported. Can anybody confirm this?
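One way to check might be the internalformat query (ARB_internalformat_query2, core since 4.3), something like this untested sketch:

    /* Untested sketch: ask the driver directly whether the format is
       renderable as a framebuffer attachment. The query returns
       GL_FULL_SUPPORT, GL_CAVEAT_SUPPORT, or GL_NONE. */
    GLint supported = GL_NONE;
    glGetInternalformativ(GL_TEXTURE_2D, GL_RG16_SNORM,
                          GL_FRAMEBUFFER_RENDERABLE, 1, &supported);
    if (supported != GL_FULL_SUPPORT) {
        /* rendering to RG16_SNORM is not (fully) supported */
    }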


Edited by B_old, 29 May 2014 - 10:35 AM.


#4 Chris_F   Members   -  Reputation: 2439


Posted 29 May 2014 - 11:18 PM

Are you checking for framebuffer completeness? The spec says:

 


Implementations must support framebuffer objects with up to MAX_COLOR_ATTACHMENTS color attachments, a depth attachment, and a stencil attachment. Each color attachment may be in any of the color-renderable formats described in section 9.4 (although implementations are not required to support creation of attachments in all color-renderable formats).

 

RG16_SNORM is a color-renderable format, but it may not be supported by your graphics card or driver. I didn't actually see where the spec lists the required formats (I might have overlooked it), but there is this. I see UNORM, SINT and UINT, but no SNORM. So my best guess is that your GPU/driver does not support SNORM framebuffers.
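A check along these lines (minimal sketch) should show whether the driver actually accepts the attachment:

    /* Minimal sketch: after attaching the RG16_SNORM texture, query the
       framebuffer status. An implementation that opts out of the format is
       supposed to report GL_FRAMEBUFFER_UNSUPPORTED. */
    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE) {
        fprintf(stderr, "FBO incomplete: 0x%x\n", status);  /* from <stdio.h> */
    }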



#5 B_old   Members   -  Reputation: 666


Posted 30 May 2014 - 05:41 AM

Yes, I'm checking for completeness. The behavior I get is completely identical to just using a UNORM target, so it seems to be silently converting to that. I also don't get any debug output from OpenGL.
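The debug output is set up roughly like this (sketch; my_debug_callback is just a placeholder name for the registered callback):

    /* Sketch of the debug-output setup (requires a debug context or 4.3+);
       my_debug_callback is a placeholder for the actual callback function. */
    glEnable(GL_DEBUG_OUTPUT);
    glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);
    glDebugMessageCallback(my_debug_callback, NULL);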

 

This is with a GeForce GTX 570, driver 331.38 on Ubuntu.

 

Do you have experience with rendering to this format?



#6 Chris_F   Members   -  Reputation: 2439


Posted 30 May 2014 - 04:36 PM


 

Does that mean it returned GL_FRAMEBUFFER_COMPLETE? Like I said, SNORM is not a required format, so Nvidia doesn't have to supply it. If they don't, then it should return GL_FRAMEBUFFER_UNSUPPORTED. If it doesn't, then it may be a driver bug.

 

Since SNORM is not a guaranteed thing, maybe you should just render to a floating point target and then convert to SNORM on the CPU.

 

Edit: Or I suppose you could render to an integer format and do the conversion in your shader. http://msdn.microsoft.com/en-us/library/windows/desktop/dd607323%28v=vs.85%29.aspx
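Something along these lines (a rough sketch; the octahedral encode shown is just one common formulation, not necessarily the one from the paper):

    /* Rough sketch of a fragment shader for a GL_RG16I target, applying the
       D3D-style float->SNORM conversion (round(clamp(v, -1, 1) * 32767))
       manually. */
    static const char *fs_src =
        "#version 440 core\n"
        "in vec3 vNormal;\n"
        "out ivec2 outNormal;\n"
        "vec2 encodeOct(vec3 n) {\n"
        "    // fold the lower hemisphere; sign(0) edge case ignored here\n"
        "    vec2 p = n.xy / (abs(n.x) + abs(n.y) + abs(n.z));\n"
        "    return n.z >= 0.0 ? p : (1.0 - abs(p.yx)) * sign(p);\n"
        "}\n"
        "void main() {\n"
        "    vec2 oct = encodeOct(normalize(vNormal));   // in [-1, 1]\n"
        "    outNormal = ivec2(round(clamp(oct, -1.0, 1.0) * 32767.0));\n"
        "}\n";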


Edited by Chris_F, 30 May 2014 - 04:47 PM.


#7 B_old   Members   -  Reputation: 666


Posted 31 May 2014 - 04:53 AM

Yes, it returned GL_FRAMEBUFFER_COMPLETE.

 

Maybe the easiest thing is to render to UNORM and map [-1, 1] to [0, 1] in the shader. Not really a big deal, but I wanted to find out what the problem is.
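That is, something like this (sketch; encodeOct() stands in for the octahedral encode from the paper):

    /* Sketch: write the [-1, 1] octahedral coords into a UNORM target by
       remapping to [0, 1]; whoever samples the texture undoes it with
       value * 2 - 1. */
    static const char *fs_src =
        "#version 440 core\n"
        "in vec3 vNormal;\n"
        "out vec2 outNormal;\n"
        "void main() {\n"
        "    vec2 oct = encodeOct(normalize(vNormal));  // in [-1, 1]\n"
        "    outNormal = oct * 0.5 + 0.5;               // [0, 1] for UNORM\n"
        "}\n";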





