# OpenGL Texture problem

## Recommended Posts

Recently I started learning OpenGL, but I got stuck at textures. I don't know if I'm learning the right way, but it's complicated for me. I'm trying to apply a texture to a square, but every time I compile and run, the texture appears in a different color than it should. I don't know what I'm doing wrong; perhaps someone could point me in the right direction. Here is the function that loads the texture:
```c
GLuint LoadTexture( const char * filename, int width, int height )
{
    GLuint texture;
    unsigned char * data;
    FILE * file;

    file = fopen( filename, "rb" );
    if ( file == NULL ) return 0;

    data = (unsigned char *)malloc( width * height * 3 );
    fread( data, width * height * 3, 1, file );
    fclose( file );

    glGenTextures( 1, &texture );
    glBindTexture( GL_TEXTURE_2D, texture );
    glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL );
    glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, data );
    free( data );

    return texture;
}
```
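One detail worth noting about the loader above, separate from the color problem discussed below: it reads the file from byte 0, but a real `.bmp` file starts with a 54-byte header (14-byte file header plus 40-byte BITMAPINFOHEADER), so those header bytes end up in the pixel data. A minimal sketch of a variant that skips the header and checks its reads (the function name `load_bmp_pixels` is hypothetical, and it assumes a 24-bit uncompressed BMP whose row size `width * 3` is already a multiple of 4, as it is for a 64-wide texture, so there is no row padding to handle):

```c
#include <stdio.h>
#include <stdlib.h>

unsigned char *load_bmp_pixels(const char *filename, int width, int height)
{
    FILE *file = fopen(filename, "rb");
    if (file == NULL) return NULL;

    /* Skip the 14-byte file header + 40-byte BITMAPINFOHEADER. */
    if (fseek(file, 54, SEEK_SET) != 0) { fclose(file); return NULL; }

    size_t size = (size_t)width * height * 3;
    unsigned char *data = malloc(size);
    if (data == NULL) { fclose(file); return NULL; }

    /* Check that the whole pixel block was actually read. */
    if (fread(data, 1, size, file) != size) {
        free(data);
        fclose(file);
        return NULL;
    }
    fclose(file);
    return data;  /* caller frees */
}
```

The returned buffer can then be handed to `glTexImage2D` exactly as in the original function.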


And here is the function that applies the texture onto the square:
```c
void tile( void )
{
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );

    /* Note: this reloads the texture on every redraw; loading it once at
       startup and only binding it here would be better. */
    texture = LoadTexture( "texture.bmp", 64, 64 );
    glEnable( GL_TEXTURE_2D );

    glBegin( GL_QUADS );  /* glBegin was missing before the glEnd below */
        glTexCoord2d( 0.0, 0.0 ); glVertex2d( 0.0, 0.0 );
        glTexCoord2d( 1.0, 0.0 ); glVertex2d( 0.2, 0.0 );
        glTexCoord2d( 1.0, 1.0 ); glVertex2d( 0.2, 0.2 );
        glTexCoord2d( 0.0, 1.0 ); glVertex2d( 0.0, 0.2 );
    glEnd();

    glFlush();
    glutSwapBuffers();
}
```


I would appreciate any help. Ne_cro

##### Share on other sites
Set glColor to pure white when you draw the texture, e.g.:

```c
glColor4ub( 255, 255, 255, 255 );
```

##### Share on other sites
Quote:
 Original post by waryams
 set glColor to pure white when you draw the texture. eg - glColor4ub(255,255,255,255)

Actually, I thought of that before. Anyway, I tried what you said, but I still get the same result, unfortunately.
I even tried switching between GL_DECAL and GL_MODULATE, but it's still no use.

##### Share on other sites
Sorry, didn't notice the GL_DECAL =/

##### Share on other sites
Quote:
 Original post by waryams
 Sorry, didn't notice the GL_DECAL =/ See if this thread helps.

Thanks a lot, I think I found what my problem was: the BMP format stores pixels as BGR, not RGB. On that thread they suggested using GL_BGR, which I tried, but it didn't work; I guess that's because my headers only expose an older version of OpenGL that doesn't define it.

The only solution, I guess, would be to use raw images (I haven't tried that yet), but I'd like someone to show me how to load BMP files instead, without using glaux or GLUT.

By the way, thanks for the help, waryams :)

EDIT: Does anyone know a site where I can download raw textures?

Ne_cro
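Loading a BMP without glaux or GLUT mostly comes down to reading a few fields out of the file's own header. A minimal sketch (the function name `read_bmp_size` is hypothetical; byte offsets follow the standard BITMAPFILEHEADER + BITMAPINFOHEADER layout, and it assumes you have already read the first 54 bytes of the file into a buffer):

```c
/* Extract image dimensions from the first 54 bytes of a BMP file.
   Returns 1 on success, 0 if the buffer does not look like a BMP. */
int read_bmp_size(const unsigned char *header, int *width, int *height)
{
    /* Every BMP starts with the two magic bytes "BM". */
    if (header[0] != 'B' || header[1] != 'M') return 0;

    /* Width and height are little-endian 32-bit integers at byte
       offsets 18 and 22 of the combined header. */
    *width  = header[18] | (header[19] << 8) | (header[20] << 16) | (header[21] << 24);
    *height = header[22] | (header[23] << 8) | (header[24] << 16) | (header[25] << 24);
    return 1;
}
```

With this, the hard-coded `64, 64` arguments to LoadTexture become unnecessary: read the header, pull out the size, then read `width * height * 3` pixel bytes (minding the BGR order and the 4-byte row padding for widths where `width * 3` is not a multiple of 4).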

##### Share on other sites
Just to let you know, Targa is pretty much the same as BMP, but also allows transparency if you want it.

##### Share on other sites
Quote:
 Original post by Ne_cro
 The only solution, I guess, would be to use raw images

I just tried to load a raw-format texture, and all I got was black and white lines. I'm starting to think I might be doing all of this wrong :( It's really frustrating.

[Edited by - Ne_cro on May 25, 2007 10:02:30 PM]

##### Share on other sites
Yay, I fixed it :) My problem was that BMP uses BGR and not RGB. For anyone with the same problem, I had to modify my code like this:

```c
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_BGR_EXT, GL_UNSIGNED_BYTE, data );
```

Notice that I used GL_BGR_EXT instead of GL_RGB as the pixel format. This tells OpenGL that the pixel data is stored as blue, then green, then red.

To use GL_BGR_EXT, simply include <gl/glext.h>.

Ne_cro
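If GL_BGR_EXT is not available either, a portable alternative is to swap the blue and red bytes in memory before uploading, and then use plain GL_RGB. A sketch (the function name `bgr_to_rgb` is hypothetical; it assumes tightly packed 3-bytes-per-pixel data, as in the loader above):

```c
/* Swap the B and R channels of a tightly packed BGR buffer in place,
   turning it into RGB data suitable for glTexImage2D with GL_RGB. */
void bgr_to_rgb(unsigned char *data, int width, int height)
{
    long i;
    for (i = 0; i < (long)width * height * 3; i += 3) {
        unsigned char tmp = data[i];  /* blue */
        data[i]     = data[i + 2];    /* red moves into slot 0 */
        data[i + 2] = tmp;            /* blue moves into slot 2 */
    }
}
```

Calling `bgr_to_rgb(data, width, height)` right after the `fread` would let the original `glTexImage2D(..., GL_RGB, GL_UNSIGNED_BYTE, data)` call work unchanged on any OpenGL version.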
