OpenGL [solved] Save a Texture with DevIL


dedesite
Hi everybody,

I was a game developer (mostly tools) for two and a half years; I now work as a web developer but still do some game development as an amateur :).

I'm trying to save 32-bit and 16-bit OpenGL textures to a file using DevIL, which seems to be a very simple library.

Since my knowledge of OpenGL and DevIL is a bit limited, I'm kind of stuck... and I can't find any tutorial explaining how to do it.

Here is approximately what I do (I've simplified the code to keep just the OpenGL+DevIL part):

[code]#include <stdlib.h>

#include "IL/il.h"
#include "IL/ilu.h"
#include "IL/ilut.h"

void save_texture(GLuint texture)
{
    int width = 128;
    int height = 128;

    glBindTexture(GL_TEXTURE_2D, texture);
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, width, height, 0);

    ILuint img_size = sizeof(ILubyte) * width * height * 3;
    ILubyte *raw_img = (ILubyte*) malloc(img_size);
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, raw_img);

    if (raw_img != NULL)
    {
        // Then save it using DevIL
        ILuint ImgId = 0;
        ilGenImages(1, &ImgId);
        ilBindImage(ImgId);

        // I don't know if ilCopyPixels is the right function to call
        ilCopyPixels(0, 0, 0, width, height, 1, IL_RGB, IL_BYTE, raw_img);

        ilSaveImage("texture.png"); // example file name; this is the call that fails

        ilDeleteImages(1, &ImgId);
    }
    free(raw_img);
}[/code]


The problem is that DevIL can't save the texture because there is nothing in it (even though raw_img is not null).
I assume ilCopyPixels is not the right function to call, but I've tried ilSetPixels, ilSetData, etc. and got the same result.

Am I missing something?

Another question: how can I save 16-bit textures (GL_RGB5 format), since DevIL doesn't seem to handle 16-bit texture formats? ([url=""]DevIL format list[/url])

As I said, I'm new to this problem, so maybe I've made a terrible mistake or misinterpreted DevIL's behaviour; please be gentle with me :).

Thanks for your time, regards,

RobTheBloke
IIRC, to do this you need to use one of the methods that is not exposed as part of il.h (i.e. it's hiding away in devil_internal_exports.h). I *think* it's something along the lines of a call to ilResizeImage before copying in the pixel data (the copy fails because no memory has yet been allocated for it).
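
Something along these lines, maybe (an untested sketch; it uses the public ilTexImage to allocate the storage instead of the internal ilResizeImage, and assumes raw_img holds the pixels you read back from GL):

[code]ilBindImage(ImgId);
// Allocate storage for a width x height RGB image first; with a NULL
// data pointer, ilTexImage only allocates and leaves the contents undefined.
ilTexImage(width, height, 1, 3, IL_RGB, IL_UNSIGNED_BYTE, NULL);
// Now the copy has somewhere to go.
ilSetPixels(0, 0, 0, width, height, 1, IL_RGB, IL_UNSIGNED_BYTE, raw_img);[/code]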

dedesite
Thank you Rob for your response,

So you mean that DevIL can't *normally* save a texture from an array of pixels? Strange... but it explains why I can't find any information about it on the internet...
I'll look at devil_internal_exports.h, thank you.


dedesite
YEEEES!!! I managed to save my texture :D

In fact, it was quite simple: I just had to call ilTexImage.
[url=""]Here[/url] is where I found the solution.

And the 16-bit issue? OpenGL automatically converts the texture to 8-bit-per-channel RGB for me when I read it back with glGetTexImage, sweet :).
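
For anyone who finds this later, here is roughly the working version (a minimal sketch, assuming a 128x128 RGB texture and that ilInit() was called at startup; "texture.png" is just an example file name):

[code]#include <stdlib.h>

#include "IL/il.h"

void save_texture(GLuint texture)
{
    const int width = 128;
    const int height = 128;
    ILubyte *raw_img = (ILubyte*) malloc(width * height * 3);

    // Read the pixel data back from the texture.
    glBindTexture(GL_TEXTURE_2D, texture);
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, raw_img);

    ILuint ImgId = 0;
    ilGenImages(1, &ImgId);
    ilBindImage(ImgId);

    // ilTexImage allocates the bound image's storage *and* copies the
    // pixel data into it -- this is the call that was missing.
    ilTexImage(width, height, 1, 3, IL_RGB, IL_UNSIGNED_BYTE, raw_img);

    ilEnable(IL_FILE_OVERWRITE);
    ilSaveImage("texture.png");

    ilDeleteImages(1, &ImgId);
    free(raw_img);
}[/code]

Depending on the origin convention of the file format you save to, the image may come out vertically flipped (OpenGL returns rows bottom-to-top); iluFlipImage() from ILU can fix that if needed.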


