Gandalf

Loading a bitmap in OpenGL


I want to load a bitmap image from a resource file in Visual Studio. This is the Win32 function to do it:

HBITMAP LoadBitmap(HINSTANCE hInstance, LPCTSTR lpBitmapName);

But once I have this bitmap handle, how can I use it in OpenGL with this function?

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, bmp[0]->data)

(The point is, I don't want to load the image from a file.)

Edited by - Gandalf on 6/18/00 12:44:27 PM

Here is a little code:

            

HBITMAP bitmap = LoadBitmap(GetModuleHandle(NULL), MAKEINTRESOURCE(IDB_BITMAP3));

unsigned char *data;
GetBitmapBits(bitmap, 256*256*3, &data);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, &data);


I don't get an error, but the bitmap looks very ugly!

Gandalf the White


Edited by - Gandalf on June 19, 2000 4:23:18 AM

How ugly? Is it just that the colors are all screwed up, or can you not recognise your original image at all?

- make sure your image is 256x256, 24 bits per pixel
- if I remember correctly, I once had to reverse the component order (the bitmap is BGR and OpenGL expects RGB)
- and yes, data_size = bmp_width * bmp_height * bytes_per_pixel

Maybe you should use LoadImage & GetDIBits instead.

PS: in your code, I assume you allocate enough space for the data!
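For instance, a rough sketch of what I mean (it assumes the bitmap really is 256x256 at 24 bits per pixel; GetBitmapBits returns device-dependent bits, so that assumption matters):

HBITMAP bitmap = LoadBitmap(GetModuleHandle(NULL), MAKEINTRESOURCE(IDB_BITMAP3));

// Allocate width * height * bytes_per_pixel up front.
unsigned char *data = new unsigned char[256 * 256 * 3];

GetBitmapBits(bitmap, 256 * 256 * 3, data);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0,
             GL_RGB, GL_UNSIGNED_BYTE, data);

delete [] data;
DeleteObject(bitmap);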

Gandalf, are you using 8-bit or 24-bit BMPs?
If it's 8-bit, it looks ugly because 8-bit images don't store colors the way 24-bit images do. In 8-bit there's a palette of colors, which is simply an array of RGB values, and each pixel of the image is stored as one byte that is an index into the palette.
You have to do a conversion. Here's some pseudo-code:

- Create an array of 256 entries of the PALETTEENTRY structure (Pal)
- Allocate Width*Height bytes of memory (In)
- Allocate Width*Height*3 bytes of memory (Out)
- Use LoadBitmap to get an HBITMAP
- Use GetDIBColorTable to fill Pal
- Use GetBitmapBits to fill In

Now, for (i=0; i < width*height; i++) do:
Out[i*3+0] = Pal[In[i]].peRed;
Out[i*3+1] = Pal[In[i]].peGreen;
Out[i*3+2] = Pal[In[i]].peBlue;

and use Out in the glTexImage2D call:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, Out);
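Put together, it might look roughly like this in real code (a sketch only; Width and Height are assumed to be your image size, and the Pal/In filling steps are the ones listed above):

PALETTEENTRY Pal[256];                                      // palette: 256 RGB entries
unsigned char *In  = new unsigned char[Width * Height];     // one palette index per pixel
unsigned char *Out = new unsigned char[Width * Height * 3]; // expanded RGB triplets

// ... fill Pal (GetDIBColorTable) and In (GetBitmapBits) as described above ...

for (int i = 0; i < Width * Height; i++)
{
    Out[i*3 + 0] = Pal[In[i]].peRed;
    Out[i*3 + 1] = Pal[In[i]].peGreen;
    Out[i*3 + 2] = Pal[In[i]].peBlue;
}

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, Width, Height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, Out);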

And that's it! I don't remember well, but I recall there is an extension for using 8-bit textures... I don't know how to use it, though.

For 24-bit images, the only problem is that bitmaps are stored in BGR, not RGB (don't ask me why...), so you have to exchange the bytes, as Jehan said.

Hope that helps,


Nicodemus.

----
"When everything goes well, something will go wrong." - Murphy

I'm using 24-bit bitmaps. The texture is not just looking ugly; something must be wrong with the color bytes. But I can recognise my original image. Switch from RGB to BGR? Something like this, or what?

    

unsigned char temp;
for(int i=0; i<(256*256*3); i++)
    if(i%2==0 && i>=2)
    {
        temp = data[i];
        data[i] = data[i-2];
        data[i-2] = temp;
    }



My texture still looks bad.

Gandalf the White



Edited by - Gandalf on June 20, 2000 3:46:36 AM

Hi Gandalf,
There is an extension called GL_BGR_EXT / GL_BGRA_EXT, which lets you load the channels in reverse order. The extension has been supported by Nvidia ICDs since the RivaTNT, maybe even earlier, but I don't know whether other implementations support it. You can also use RAW or TGA files (convert them with Photoshop, for example); the first of those is very easy to use and always in the right order.
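For example, uploading might then look like this (a sketch; it assumes the extension is really there and that data holds the raw BGR bytes from the bitmap):

// Let the driver reverse the channels: GL_BGR_EXT describes the source
// layout, while the internal format stays GL_RGB.
#ifndef GL_BGR_EXT
#define GL_BGR_EXT 0x80E0 // token from the EXT_bgra spec, in case your gl.h lacks it
#endif

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0,
             GL_BGR_EXT, GL_UNSIGNED_BYTE, data);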

???
Why do you swap data[i] with data[i-2] on every even i? Each pixel is 3 bytes.

Shouldn't it look this way:

int a=3;
for(int i=0; i<(256*256*3); i++)
{
    if(a==3)
    {
        temp = data[i];
        data[i] = data[i+2];
        data[i+2] = temp;
        a=0;
    }
    a++;
}
Hope that helps!

How do you make these source code frames?

Edited by - TheMummy on June 20, 2000 5:18:33 AM


Edited by - TheMummy on June 20, 2000 5:24:32 AM

Had to edit it because the forum always creates "data" out of "data[i]".

Edited by - TheMummy on June 20, 2000 5:26:12 AM

Hey Mummy,

I tried the flags without success. Only half the image shows up. The colors seem to be right, but the pixels are in the wrong order.

(Your loop and mine do the same thing.)

Gandalf the White



Edited by - Gandalf on June 20, 2000 5:22:07 AM

Just write "source" inside "[]" before the source code. Look at my reply.

Gandalf the White


Edited by - Gandalf on June 20, 2000 5:26:27 AM

Sorry Gandalf,
It was the forum: it always/often makes "data" out of "data[i]"; it seems [ i ] gets parsed as an HTML italics command.

I never had to face these problems because I use RAW files. Are you sure that your ICD supports GL_BGR?


Edited by - TheMummy on June 20, 2000 5:37:23 AM

Yes, my computer supports GL_EXT_bgra; I have checked. You seem to have a problem with the "[]" thing!
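By the way, here is roughly how I checked (a sketch; it needs a current GL context, and a plain strstr can also match a prefix of a longer extension name, so it is only a quick test):

#include <string.h>

// Returns nonzero if 'name' appears in the driver's extension string.
int HasExtension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

// usage: if (HasExtension("GL_EXT_bgra")) { /* GL_BGR_EXT is safe to use */ }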

Gandalf the White

quote:
Original post by Gandalf

I'm using 24-bit bitmaps. The texture is not just looking ugly; something must be wrong with the color bytes. But I can recognise my original image. Switch from RGB to BGR? Something like this, or what?

unsigned char temp;
for(int i=0; i<(256*256*3); i++)
    if(i%2==0 && i>=2)
    {
        temp = data[i];
        data[i] = data[i-2];
        data[i-2] = temp;
    }

My texture still looks bad.

Gandalf the White

To switch BGR to RGB:

unsigned char *data; // you allocated & filled that buffer

unsigned char temp;

for(int i=0; i<(256*256*3); i+=3)
{
    temp = data[i];
    data[i] = data[i+2];
    data[i+2] = temp;
}

Note the i+=3: you swap the first and third byte of each 3-byte pixel, not every other byte.


Regards

quote:
Original post by Gandalf

Here is a little code:

HBITMAP bitmap = LoadBitmap(GetModuleHandle(NULL), MAKEINTRESOURCE(IDB_BITMAP3));

unsigned char *data;
GetBitmapBits(bitmap, 256*256*3, &data);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, &data);

I don't get an error, but the bitmap looks very ugly!

Gandalf the White





Seems the formatting did not work as I expected in the previous post.
However...
Gandalf, I checked your first posts and came across that bit of code. I wonder if you really wrote it that way in your program, because the last glTexImage2D argument is not correct: it expects an 'unsigned char*'. Using &data you give it an 'unsigned char**'.
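In other words, something like this (a sketch; it assumes you allocate the 256*256*3 buffer yourself):

unsigned char *data = new unsigned char[256 * 256 * 3]; // the buffer itself

GetBitmapBits(bitmap, 256 * 256 * 3, data); // pass data...

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0,
             GL_RGB, GL_UNSIGNED_BYTE, data); // ...not &data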

That was an ugly brain bug! Now that's changed, and the problem still exists!

Gandalf the White, who codes with the sunglasses on

Edited by - Gandalf on June 21, 2000 6:04:22 AM

Did you change the &data argument to data in the GetBitmapBits function call? 'Cause it's the same problem...

I have found the problem. It was the function GetBitmapBits (thanks Jehan), which probably can only load 8-bit bitmaps, but I'm not sure about that. Anyway, now I'm using GetDIBits() instead and everything works fine.

Thanks everybody,

Gandalf the White

But why are the colors not right? Something seems wrong with the yellow colors. I think the red and blue colors are switched.

Here is how I set up the structures for a 24-bit 256x256 bitmap:

                    

HBITMAP bitmap = LoadBitmap(GetModuleHandle(NULL), MAKEINTRESOURCE(IDB_BITMAP7));
unsigned char data[256*256*3];

BITMAPINFO info;
BITMAPINFOHEADER header;
header.biSize = sizeof(BITMAPINFOHEADER);
header.biWidth = 256;
header.biHeight = 256;
header.biPlanes = 1;
header.biBitCount = 24;
header.biCompression = BI_RGB;
header.biSizeImage = 0;
header.biXPelsPerMeter = 0; // zero the remaining fields too
header.biYPelsPerMeter = 0;
header.biClrUsed = 0;
header.biClrImportant = 0;

info.bmiHeader = header;
info.bmiColors->rgbRed = 0;
info.bmiColors->rgbGreen = 0;
info.bmiColors->rgbBlue = 0;
info.bmiColors->rgbReserved = 0;

HDC hdc = GetDC(g_App.m_hWndRender);

GetDIBits(hdc, bitmap, 0, 256, data, &info, DIB_RGB_COLORS);

ReleaseDC(g_App.m_hWndRender, hdc);
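If red and blue really are switched, that matches what Nicodemus and TheMummy said earlier: the DIB bytes come back in BGR order, not RGB. Two possible fixes, sketched against the data buffer above:

// (1) Swap the first and third byte of every 3-byte pixel before uploading:
for(int i = 0; i < 256*256*3; i += 3)
{
    unsigned char temp = data[i];
    data[i] = data[i+2];
    data[i+2] = temp;
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0,
             GL_RGB, GL_UNSIGNED_BYTE, data);

// (2) Or, since GL_EXT_bgra is supported here, let the driver do it:
// glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0,
//              GL_BGR_EXT, GL_UNSIGNED_BYTE, data);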



Gandalf the White



Edited by - Gandalf on June 22, 2000 4:47:40 AM

Edited by - Gandalf on June 22, 2000 4:48:30 AM

Edited by - Gandalf on June 22, 2000 5:20:05 AM

