Linux -> Windows Texture Mapping Woes

4 comments, last by James McColl 21 years, 5 months ago
I wrote a program using OpenGL at my University on Linux which simulates the Solar System. It has all the planets, the sun, and a few moons. Everything works 100% fine there (I compile using gcc, I believe). However, when I try to run it at home, compiling with Visual C++ under Windows, I get all sorts of anomalies.

First off, some of the planets, and all of the moons (so both small and large spheres), are not textured properly. The texture only gets applied to the top "pole" of the sphere; the rest is solid black. Some of the planets (one small and two medium-sized planets) are textured 100% correctly. It is important to note that I texture all planets in the same way. In fact, I use the same function to draw all planets, which takes an object specifying the planet's size and a texture object set up for that planet. Again, everything works 100% OK on Linux. The way I draw the planets is this:

PlanetTextures[iPlanetIndex].activate();

// Draw the sphere that represents the planet
GLUquadricObj *q = gluNewQuadric ();
gluQuadricDrawStyle (q, GLU_FILL);
gluQuadricNormals (q, GLU_SMOOTH);
gluQuadricTexture (q, GL_TRUE);
gluSphere (q, lpcPlanets[iPlanetIndex].getSize(), 30, 30);
gluDeleteQuadric (q);

PlanetTextures[iPlanetIndex].deactivate();

Another anomaly is that when I try to draw my star map, which has a size of 1,000,000 units, it is drawn at around 100,000 units (again, it works fine on Linux). I have no idea why this happens either.

Yet another anomaly is that even though I have the depth buffer enabled, the furthest object from the camera is always drawn on top, unless you are pretty close to the closest object. I do not have this problem on Linux either.

Now I am starting to think that the problem is that I am using different OpenGL versions at school and at home, and they are not compatible. However, I have no way of knowing which version I am using at home; the only thing I installed was glut 3.7.6, which, if I remember correctly, had glut32.dll in it. Is it possible to install OpenGL 1.2 or 1.3? Do I even need to install it? If it is possible to install, where can I d/l it?

I would greatly appreciate any help with this.

Thanks,
James
Get the latest drivers for your video card to be sure you've got a newish OpenGL implementation. That shouldn't make any difference for what you describe, though...
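You can also check exactly what you're getting at runtime once a GL context exists (e.g. just after glutCreateWindow); here is a minimal sketch, with printGLInfo being just an illustrative name:

#include <stdio.h>
#include <GL/gl.h>

// Print the vendor, renderer and version strings of the current GL context.
// If GL_RENDERER comes back as something like "GDI Generic", you're running
// on Microsoft's software implementation rather than your card's ICD.
void printGLInfo(void)
{
    printf("GL_VENDOR  : %s\n", (const char *) glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *) glGetString(GL_RENDERER));
    printf("GL_VERSION : %s\n", (const char *) glGetString(GL_VERSION));
}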

The only thing I can think of is that your code used Mesa (an open-source implementation of GL) on Linux, and now needs some small changes to use OpenGL on Windows. I've never used Mesa, though, so while I know it's essentially the same API with a few small differences, I'm not sure whether they'd affect what you're doing.

One thing that makes me think it may be this is that it is fairly common practice to use the Mesa version of the latest GLU on Linux even when using real GL, as nobody else has bothered implementing it (AFAIK) for Linux.

If this is the case, then maybe Mesa has different default values for texture edge wrapping and the depth buffer?
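If that's the concern, one way to rule it out is to set that state explicitly rather than relying on the defaults: set the wrap mode on each texture yourself (glTexParameteri with GL_TEXTURE_WRAP_S/T), make sure depth testing is switched on, and clear the depth buffer every frame. A rough sketch of the depth part (function names are just illustrative):

#include <GL/gl.h>

// Call once at startup, after the GL context exists.
void initExplicitGLState(void)
{
    glEnable(GL_DEPTH_TEST);   // depth testing is off by default in GL
    glDepthFunc(GL_LESS);      // standard "nearer fragment wins" comparison
}

// Call at the start of every frame.
void beginFrame(void)
{
    // Clearing only GL_COLOR_BUFFER_BIT leaves stale depth values behind,
    // which can look a lot like "the furthest object is drawn on top".
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
}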

Also, are you using the same base code between platforms? If you're just slapping your draw() code into unfamiliar base code, maybe something you are unaware of is going on.

Dan

Or, your home video card could have some unusual settings.
If you turn hardware acceleration all the way off, do the problems still occur?
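Another thing worth double-checking on the Windows side: symptoms like that can also show up if the window never actually got a depth buffer, since with GLUT you have to ask for one in the display mode. Something like this in your init code, if it isn't there already (a sketch only; the window title and size are placeholders):

#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);

    // GLUT_DEPTH requests a depth buffer in the pixel format; without it,
    // glEnable(GL_DEPTH_TEST) has nothing to test against.
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(640, 480);
    glutCreateWindow("Solar System");

    /* ...register your display/idle/keyboard callbacks here as usual... */
    glutMainLoop();
    return 0;
}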
OK, here's an update.

First off, I installed the new Detonator drivers (I have a GeForce 2), and still no change.

The working version on Linux definitely uses the Mesa version. Not that I really understand what that means, but I was told that it does.

I'm going to include my texture initialization function and my activate function here as well, so you can see exactly what I'm doing. Actually, I'll just include my entire GLTexture class, which has all the functionality in it, minus the part that draws the sphere, which is shown above.


#ifndef __GLTEXTURE_H
#define __GLTEXTURE_H

#include <GL/gl.h>
#include <GL/glu.h>   // needed for gluBuild2DMipmaps
#include <fstream.h>

class GLTexture
{
private:
    GLuint name;                      // GL texture object name
    char   pixels[1024 * 1024 * 6];   // raw RGB pixel data (room to spare)
    int    xSize, ySize;

public:
    // in is a stream leading to a stream of binary values, in RGB triplets
    void construct(istream& in)
    {
        // Read xSize * ySize RGB triplets straight into the pixel buffer
        for(int i = 0, idx = 0; i < (xSize * ySize); i++, idx += 3)
            in.read((char *)(&(pixels[idx])), 3);

        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

        glGenTextures(1, &name);
        glBindTexture(GL_TEXTURE_2D, name);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                        GL_LINEAR_MIPMAP_LINEAR);
        gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, xSize, ySize, GL_RGB,
                          GL_UNSIGNED_BYTE, &(pixels[0]));
    }

    GLTexture(int _xSize = 1024, int _ySize = 1024)
        : xSize(_xSize), ySize(_ySize), name(0)
    { }

    ~GLTexture()
    {
        if(name != 0)
            glDeleteTextures(1, &name);
    }

    void activate(void)
    {
        glEnable(GL_TEXTURE_2D);
        glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

        // Reset the texture matrix so nothing left over affects the
        // quadric's texture coordinates
        glMatrixMode(GL_TEXTURE);
        glLoadIdentity();
        glMatrixMode(GL_MODELVIEW);

        glBindTexture(GL_TEXTURE_2D, name);
    }

    void deactivate(void)
    {
        glDisable(GL_TEXTURE_2D);
    }
};

#endif // __GLTEXTURE_H



So the previous code is called like this (with the construct function having been called earlier for each texture in the array):


PlanetTextures[iPlanetIndex].activate();

// Draw the sphere that represents the planet
GLUquadricObj *q = gluNewQuadric ();
gluQuadricDrawStyle (q, GLU_FILL);
gluQuadricNormals (q, GLU_SMOOTH);
gluQuadricTexture (q, GL_TRUE);
gluSphere (q, lpcPlanets[iPlanetIndex].getSize(), 30, 30);
gluDeleteQuadric (q);

PlanetTextures[iPlanetIndex].deactivate();



Here PlanetTextures is just an array of all of my GLTexture objects, and iPlanetIndex is the index of the planet being drawn. Can anyone see my mistake? Perhaps I need some more initialization, or need to set some state that the Windows implementation of OpenGL requires.

Thanks,
James
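One quick thing that might narrow it down: check for GL errors right after the texture upload and right after drawing the sphere, since a failed gluBuild2DMipmaps or an invalid state combination will show up there. A small sketch (the helper name is just for illustration):

#include <stdio.h>
#include <GL/gl.h>
#include <GL/glu.h>

// Print any pending GL error along with a label saying where it was checked,
// e.g. checkGLError("gluBuild2DMipmaps") or checkGLError("gluSphere").
void checkGLError(const char *where)
{
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        printf("GL error at %s: %s\n", where,
               (const char *) gluErrorString(err));
}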
Bah!

Bah, I say!

Well, it turns out (after much trouble and debugging) that the source of all my problems was that I was not loading my .ppm files correctly.

I was using .ppm files because it is the only file format I knew how to load myself. For some reason, some files would load fine, but others would just start spitting out zeros halfway (or less) through the file. Weird.

At any rate, I now use auxDIBImageLoad() to load bitmaps instead of silly .ppm files, and all is well. It turns out that the rest of my code (a lot of which was redundant) worked perfectly.

Thanks for the help though.
James
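For what it's worth, files that "start spitting out zeros" partway through are the classic symptom of opening a binary file in text mode on Windows: the runtime translates CR/LF pairs and treats a 0x1A byte as end-of-file, so some images happen to survive while others get truncated. If you ever go back to the .ppm loader, opening the stream in binary mode should take care of it; roughly this (the filename is just an example):

#include <fstream.h>

// ios::binary stops the Windows runtime from translating CR/LF pairs or
// treating a stray 0x1A byte as end-of-file -- either of which corrupts
// binary pixel data partway through the file.
ifstream in("planet.ppm", ios::in | ios::binary);   // example filename
// ...skip the PPM header, then pass the stream to GLTexture::construct()...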
AUX is buggy (memory leaks) and has been discontinued. For loading images, try something else, e.g. DevIL (search for "devil opengl").
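From memory, loading a file straight into a bound GL texture with DevIL looks roughly like this; the calls are written from recollection, so check the signatures against the DevIL docs:

#include <IL/il.h>
#include <IL/ilut.h>

// Load an image file and get back an OpenGL texture name for it.
GLuint loadTexture(const char *filename)
{
    static bool initialised = false;
    if (!initialised)
    {
        ilInit();                    // initialise DevIL's core, once
        ilutRenderer(ILUT_OPENGL);   // tell ILUT we're targeting OpenGL
        initialised = true;
    }

    // ilutGLLoadImage loads the file and hands back a GL texture id
    return ilutGLLoadImage((char *) filename);
}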

http://uk.geocities.com/sloppyturds/kea/kea.html
http://uk.geocities.com/sloppyturds/gotterdammerung.html

