Load generated data into texture.

1 comment, last by kc_0045 14 years, 1 month ago
Hey, so I'm trying to load the var data into an OpenGL texture, but it doesn't seem to be working. Any ideas?


void effect::init(int h2,int w2)
{
    h=h2; w=w2;
    data = (unsigned int*)new unsigned int[ ((h*w)*4) ];
    for(int n=0; n<=(h*w);n++)
    {
        //Red n+0, Green n+1, Blue n+2, A0 n+3
        data[(n*4)+0]=255;
        data[(n*4)+1]=255;
        data[(n*4)+2]=0;
        data[(n*4)+3]=255;
    }
    glEnable(GL_TEXTURE_2D);
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, h, w, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
I'm trying to make a way to generate random effects into a texture and then display it, any ideas? I'm declaring data in the effect class as public: unsigned int* data;
What do you mean by 'not working'? Is the texture being displayed but not as desired, or is it not displaying at all?

At a quick glance it looks like you're using unsigned integers while telling OGL you're passing unsigned bytes, so each 4-byte integer you write for R, G, B or A is being read as an entire RGBA pixel of the texture. Use unsigned char instead of unsigned int, e.g.:

data = new unsigned char[ ((h*w)*4) ];
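For reference, a rough sketch of the whole init function with that change (untested, and assuming data is declared as unsigned char* in the class); it also uses < rather than <= in the loop so you don't write past the end of the array, and passes width before height to glTexImage2D, which is the order it expects:

void effect::init(int h2, int w2)
{
    h = h2; w = w2;
    data = new unsigned char[ (h*w)*4 ];    // one byte per channel
    for(int n = 0; n < (h*w); n++)
    {
        // Red n+0, Green n+1, Blue n+2, Alpha n+3
        data[(n*4)+0] = 255;
        data[(n*4)+1] = 255;
        data[(n*4)+2] = 0;
        data[(n*4)+3] = 255;
    }
    glEnable(GL_TEXTURE_2D);
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}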


Edit: fixed typos
Quote:Original post by JackTheRapper
Use unsigned char instead of unsigned int


That fixed it, thanks.

