Loading JPG as Texture

Started by
8 comments, last by Foobar of Integers 17 years, 12 months ago
Does anyone have code for loading a JPG as a texture in OpenGL? Oh, and I also need transparency (because I want to make rounded edges on the bridge cards I draw). Thanks.
A jpeg does not have an alpha (or as you call it, transparency) channel.

Try using the PNG format. You can use DevIL (OpenIL) to load the texture. DevIL can load JPEGs too, but as the AP said, JPEG has no support for an alpha channel.
Sounds interesting, I'll try the library too :D, exactly what I was looking for :).

Decrius
SignatureShuffle: Random signature images on fora
I get this error when I try to download DevIL from SourceForge:

Could not read file.
Go back. /home/ftp/pub/sourceforge//o/op/openil/Devil-1.6.8-RC1-win32.zip
Apr 22, 2006 17:18

??????
You could also try Corona (http://corona.sourceforge.net/). It's very simple to use. There's a tutorial.txt included with the download, but if you're still having problems I can post my code.
Try the downloads page instead: http://prdownloads.sourceforge.net/openil

You can try GFL SDK too.
Quote:Original post by Wingman
You could also try Corona (http://corona.sourceforge.net/). It's very simple to use. There's a tutorial.txt included with the download, but if you're still having problems I can post my code.


Thanks :)
BTW, I tried this code, but the transparency doesn't work...

bool COpenGLControl::LoadCardImage(TextureImage *texture)
{
	corona::Image* image = corona::OpenImage("Data/cards.gif", corona::PF_R8G8B8A8);
	if(!image) return false;

	corona::FlipImage(image, corona::CA_X);

	texture->width  = image->getWidth();
	texture->height = image->getHeight();
	texture->bpp    = 32;

	int imageSize = texture->width * texture->height * 4;
	texture->imageData = (GLubyte *)malloc(imageSize);

	void* pixels = image->getPixels();

	typedef unsigned char byte;
	byte* p = (byte*)pixels;
	unsigned long j = 0, i = 0;
	for(i = 0; i < texture->width * texture->height; ++i)
	{
		j = i * 4;
		texture->imageData[j]   = *p++;
		texture->imageData[j+1] = *p++;
		texture->imageData[j+2] = *p++;
		texture->imageData[j+3] = *p++;
	}
//	free(pixels);	// the pixel buffer is owned by the image; deleting it frees the pixels
	delete image;

	glGenTextures(1, &texture[0].texID);					// Generate OpenGL texture ID
	glBindTexture(GL_TEXTURE_2D, texture[0].texID);				// Bind our texture
	glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);	// Nearest filtering
	glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);	// Nearest filtering
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture[0].width, texture[0].height, 0, GL_RGBA, GL_UNSIGNED_BYTE, texture[0].imageData);


I've never used Corona, but I'd guess it probably will not automatically set an appropriate alpha value based on the RGB components of each pixel. If not, then you will need to set it yourself while transferring the image data from the Corona image array into your own texture imageData array.
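To make that concrete: a minimal sketch of setting alpha by hand with a colour key, i.e. marking one designated colour as fully transparent while copying pixels. The function name and the magenta key colour are my own assumptions for illustration, not anything Corona provides:

```cpp
#include <cstddef>

typedef unsigned char byte;

// Walk a tightly packed RGBA8 buffer and set alpha to 0 wherever the
// pixel matches the key colour, 255 everywhere else. 'count' is the
// number of pixels (width * height). Magenta is a common key choice,
// but any colour not used in the card art works.
void ApplyColorKey(byte* data, std::size_t count,
                   byte keyR = 255, byte keyG = 0, byte keyB = 255)
{
    for (std::size_t i = 0; i < count; ++i)
    {
        byte* px = data + i * 4;
        bool match = (px[0] == keyR && px[1] == keyG && px[2] == keyB);
        px[3] = match ? 0 : 255;  // transparent where the key colour appears
    }
}
```

You could call this on texture->imageData right after the copy loop in the code above, before glTexImage2D. Remember that the alpha only shows up on screen if blending is enabled when you draw.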

F451
I use SDL_Image to load PNGs for textures, works great.

If you try to use it, here's the code I use to load them: http://rafb.net/paste/results/g0jKm682.html

Ignore the TextureHandle stuff, that's just my texture manager's extra code.
"ok, pac man is an old gameand, there are faces which is eatin up shits" - da madface

This topic is closed to new replies.
