Integer buffers and transform feedback


I'm having serious trouble reading and updating integer buffers when doing transform feedback. My buffer setup follows exactly the same pattern as that of a regular float buffer, and I'm getting no errors from GL. However, when I access the buffer in the shader, the values are definitely wrong: they bear no resemblance to what I initially wrote into the buffer. The same happens when I update the buffer from the shader. The code itself is pretty sizeable, so I thought I'd post the question first: are there any limitations to using GL_INT with glVertexAttribPointer(), or could this be a driver fluke? There's practically zero information on this on the web, and float buffers work perfectly.
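For reference, one known pitfall with integer attributes (offered here as a likely cause, not a confirmed diagnosis of this particular setup): glVertexAttribPointer() always converts the data to float on its way into the shader, even when the source type is GL_INT. Truly integer attributes have to be declared with glVertexAttribIPointer() (note the extra I, core since OpenGL 3.0) and read as an integer type on the shader side. A minimal sketch, assuming a single-int-per-vertex attribute at location `loc` (the names `buf`, `loc`, `data`, and `count` are placeholders, not from the original post):

```c
/* Fragment only -- assumes a current GL 3.0+ context. */
GLuint buf;
glGenBuffers(1, &buf);
glBindBuffer(GL_ARRAY_BUFFER, buf);
glBufferData(GL_ARRAY_BUFFER, count * sizeof(GLint), data, GL_DYNAMIC_COPY);

/* Converts to float before the shader sees it -- wrong for bit-level work:
 * glVertexAttribPointer(loc, 1, GL_INT, GL_FALSE, 0, 0);
 */

/* Keeps the values as integers; the vertex shader declares `in int`. */
glVertexAttribIPointer(loc, 1, GL_INT, 0, 0);
glEnableVertexAttribArray(loc);
```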

However, since I need to pack 5 parameters into a single int, I need bit-level access to the stream, and I don't see any way to reinterpret a float as an int in GLSL without it being re-cast and transformed.

Has anyone dealt with this, or can anyone confirm whether int buffers should work with transform feedback at all?
