BMP Textures as Embedded Resource (Compiled in EXE)

Started by
6 comments, last by MrPinky 22 years, 10 months ago
NeHe's code for reading in bitmaps and naming / generating the OpenGL textures is great, but I have had no luck writing my own routines or modifying NeHe's so that the bitmaps are embedded in the exe. In VC++ the bitmaps can be added as a resource, which gives each one an ID (defaults to IDB_BITMAP1) and an integer value (why the number? anyway it defaults to 101); these are defined in resource.h (just add it to the includes in main.cpp).

Getting these bitmaps back out of the resources for display or texture conversion isn't as easy as it seems. I haven't found any code examples, and MSDN can't even tell me which function I need to use. Perhaps somebody who has done this, or can see what is going on, could enlighten me. There are examples in the DirectX 7.0 / 8.0a SDKs (most of the samples use embedded bitmaps), but the read routines are DirectX specific and load them onto a DirectX surface. How does one read from the bitmap resource to name / create OpenGL textures? Any thoughts or example code = ) would be appreciated. Thanks!

P.S. Has anyone done anything, or seen any code, for storing textures in compressed file formats? I know of a couple of demos where the textures are stored as JPEGs and decompressed before texture conversion, but I can't get my hands on the source and have NFI how they did it.

Edited by - MrPinky on June 9, 2001 10:35:16 PM
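(For reference, here is a minimal sketch of one way to do this with plain Win32 calls. The function name LoadBitmapResourceTexture is made up for illustration, and the sketch assumes the default IDB_BITMAP1 ID, a 24-bit bitmap, and the EXT_bgra extension for GL_BGR_EXT.)

  // Minimal sketch: pull an embedded 24-bit bitmap out of the exe with LoadImage
  // and hand its pixels to gluBuild2DMipmaps.
  // Needs <windows.h>, <gl/gl.h>, <gl/glu.h> and resource.h.
  #ifndef GL_BGR_EXT
  #define GL_BGR_EXT 0x80E0               // DIB pixel data is stored as BGR
  #endif

  BOOL LoadBitmapResourceTexture(int resourceID, GLuint texNumber)
  {
      // LR_CREATEDIBSECTION gives a DIB whose pixel bits can be read directly
      HBITMAP hBmp = (HBITMAP)LoadImage(GetModuleHandle(NULL),
                                        MAKEINTRESOURCE(resourceID),
                                        IMAGE_BITMAP, 0, 0, LR_CREATEDIBSECTION);
      if (hBmp == NULL)
          return FALSE;

      BITMAP bm;
      GetObject(hBmp, sizeof(bm), &bm);   // bm.bmBits -> bottom-up BGR pixels

      glPixelStorei(GL_UNPACK_ALIGNMENT, 4);  // DIB rows are padded to 4 bytes
      glBindTexture(GL_TEXTURE_2D, texNumber);
      gluBuild2DMipmaps(GL_TEXTURE_2D, 3, bm.bmWidth, bm.bmHeight,
                        GL_BGR_EXT, GL_UNSIGNED_BYTE, bm.bmBits);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);

      DeleteObject(hBmp);                 // GL has copied the pixels by now
      return TRUE;
  }

(DIBs are stored bottom-up, so the image may appear vertically flipped depending on your texture coordinates.)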
I have done the resource thing at work and it is possible. I can't post the source till Monday though.

These days I just use OpenIL (soon to be DevIL) at www.openil.org for all my OpenGL file ops, as it supports just about any common file format and is fairly sane.
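(As a rough sketch of how little code that takes: the call below assumes DevIL's ILUT helper ilutGLLoadImage, and header / init names may differ slightly between OpenIL/DevIL versions.)

  // Rough sketch using OpenIL/DevIL's ILUT layer; returns 0 on failure.
  #include <IL/il.h>
  #include <IL/ilu.h>
  #include <IL/ilut.h>

  GLuint LoadTextureWithDevIL(char* filename)
  {
      ilInit();                   // one-time library init
      iluInit();
      ilutInit();
      ilutRenderer(ILUT_OPENGL);  // tell ILUT we are uploading to OpenGL

      // Loads the file (BMP, JPG, TGA, ...) and returns a bound GL texture name
      return ilutGLLoadImage(filename);
  }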
Here's some code I cooked up to load RAW textures. It wouldn't be hard to modify the code to load just about anything.

  // Loads a RAW resource (custom resource type "RAW") embedded in the exe and
  // turns it into a mipmapped greyscale texture.
  // Needs <windows.h>, <gl/gl.h> and <gl/glu.h>.
  void LoadRawTexture(char* strName, const int TexNumber, const int width, const int height)
  {
      HRSRC   hResInfo = NULL;
      HGLOBAL hResData = NULL;
      VOID*   pvRes    = NULL;
      GLbyte* pbRes    = NULL;

      // Find the resource in this module, load it and get a pointer to its data
      if (NULL == (hResInfo = FindResource(NULL, strName, TEXT("RAW"))))
          return;
      if (NULL == (hResData = LoadResource(NULL, hResInfo)))
          return;
      if (NULL == (pvRes = LockResource(hResData)))
          return;
      pbRes = (GLbyte*)pvRes;

      // Upload the raw luminance data as a mipmapped texture
      glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
      glBindTexture(GL_TEXTURE_2D, TexNumber);
      gluBuild2DMipmaps(GL_TEXTURE_2D, 1, width, height,
                        GL_LUMINANCE, GL_UNSIGNED_BYTE, pbRes);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);
  }
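Note that the gluBuild2DMipmaps call above passes 1 component and GL_LUMINANCE, so it expects greyscale RAW data; for 24-bit RGB data you would pass 3 and GL_RGB instead.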


Good luck!


ThomW
www.LMNOpc.com

Edited by - thomw on June 9, 2001 11:28:32 PM
This must look pretty stupid, probably because I don't half understand pointers. I get that & returns the memory address, and I get that * does the opposite (returns the contents of the memory address on its right).

How do I feed the strName parameter to the function in the last post? I have been calling it like so:

if (!LoadRawTexture(*IDR_RAW1, 1, 256, 256))
{
return FALSE;
}

The compiler returns "invalid indirection" (of course, I can't do an indirection on IDR_RAW1... how do I reference the RAW resource?). It also reckons it can't convert 'const int' to 'char *', no doubt caused by the above. In short, how do I point to the contents of IDR_RAW1?

Thanks!
Thanks ThomW = ) will get started on that tonight...

Also, thanks warpstorm, could you post the source when you get a chance? Bitmaps are nice because they save me passing the dimensions every time = ) Thanks!

Edited by - MrPinky on June 10, 2001 2:23:28 AM
Sorry for not providing a demo of how to call the routine from your program.

  LoadRawTexture("TextureResourceName", 1, 256, 256);  


Good luck!



ThomW
www.LMNOpc.com
No need to apologise! Thanks for your help, ThomW, it's loading them up fine now = )

MrPinky
Compressing a file before loading is fine, but you should look into S3TC first. When you compress a texture and then decompress it at load time, it still takes up the same amount of space in your video card's RAM. With S3TC the compressed data is uploaded to the video card, using less space for images of the same quality. The other kind of compression is only good for people with small hard drives. I think nVIDIA has documentation on S3TC; after all, it's an OpenGL extension.
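(As a rough illustration only: assuming the driver exposes the GL_EXT_texture_compression_s3tc extension, letting it compress an RGB texture on upload is just a change of internal format. The names width, height and pixels below are placeholders.)

  // Sketch: ask the driver to store the texture S3TC (DXT1) compressed in video
  // memory. Check glGetString(GL_EXTENSIONS) for
  // "GL_EXT_texture_compression_s3tc" before relying on this.
  #ifndef GL_COMPRESSED_RGB_S3TC_DXT1_EXT
  #define GL_COMPRESSED_RGB_S3TC_DXT1_EXT 0x83F0
  #endif

  glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
               width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);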

"Those who want it, but can''''t get it, will complain about it.

