"works on my other computer" type bug: OpenGL, C++, Vista

8 comments, last by gah_ribaldi 16 years ago
Hi, I've got an annoying problem I hope someone can help with. I have a small and unexciting project, well contained within its own directory with its own textures and suchlike. It compiles to give me my nicely textured cube. On my desktop, that is. Copying the directory to the laptop, it compiles, but none of the textures appear. I zipped the project as it exists on the laptop and sent it to a colleague - he compiled it and the program ran with textures. Joy.

So, the question is: why isn't it working on the laptop? The laptop certainly doesn't have the grunt of my desktop, but it's by no means an antique, despite having one of those RAM-borrowing graphics cards. It is, however, running Vista, while my desktop runs XP. My error checking code isn't the best, but I have the following in place, which DOESN'T exit the program with error code 5:

// OpenGL textures
GLuint roomWall = 0, roomFloor = 0, roomCeiling = 0;

// load textures
roomWall    = loadTexture("roomWall.bmp");
roomFloor   = loadTexture("roomFloor.bmp");
roomCeiling = loadTexture("roomCeiling.bmp");

if (roomWall == 0 || roomFloor == 0 || roomCeiling == 0)
{
	return 5;
}

So I'm assuming the textures did get loaded without error on the laptop. Any thoughts? This one's doing my nut.
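One caveat worth noting: a non-zero handle from loadTexture only proves whatever loadTexture itself checks - glTexImage2D reports upload failures through glGetError, not through the texture ID. A minimal sketch of a checker to drop in after each load (assuming loadTexture is your own helper and a GL context is current):

#include <windows.h>   // must precede gl.h on Windows
#include <GL/gl.h>
#include <cstdio>

// Drain and report any queued GL errors; GL keeps them until glGetError is called.
void checkGLError(const char* where)
{
	for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
		std::fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
}

Usage: call checkGLError("loadTexture(roomWall.bmp)") right after each load; a clean result on the laptop would mean the textures really are reaching the driver.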
Update:

Upon receiving some advice that this may be due to a stack overflow, I tried the /F 10000000 compiler option (which sets the stack size) in the project's settings. This had no effect.

Also, I tried a similar alteration under Configuration Properties / Linker / System / Stack Reserve Size - no success there either.
Did you try other OpenGL programs on the laptop? Are they working correctly?
If not, I suggest installing the latest driver for the laptop's graphics card.
By the way, does the laptop have an NVIDIA/ATI card or an Intel one?
I have a basic 2D OpenGL program which compiles and runs fine with textures.

I'm using an ATI Radeon Xpress 1100.

I've also tried using smaller texture files, with no success.

I think I'll update the video drivers anyway - can't do any harm after all.
Quote: Original post by gah_ribaldi
I think I'll update the video drivers anyway - can't do any harm after all.


No joy after updating video drivers.
Are the textures power-of-two in size? (128x128, 256x256, etc.)

My Sony Vaio (GeForce GO 6400) will not display textures that aren't power-of-two.
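If you want to test that in code, a power of two has exactly one bit set, so a quick check (a sketch - width and height would come from wherever you load the BMP) is:

// True if n is a power of two: 1, 2, 4, 8, ... (exactly one bit set).
bool isPowerOfTwo(unsigned int n)
{
	return n != 0 && (n & (n - 1)) == 0;
}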

"The right, man, in the wrong, place, can make all the dif-fer-rence in the world..." - GMan, Half-Life 2

A blog of my SEGA Megadrive development adventures: http://www.bigevilcorporation.co.uk

It seems that mine won't display non-power-of-two sized images either. Problem (kinda) solved.

Thanks.
Quote: Original post by gah_ribaldi
It seems that mine won't display non-power-of-two sized images either. Problem (kinda) solved.

Thanks.


Support for non-power-of-two texture sizes is still relatively uncommon. You can only really count on it on newer desktop cards and certain laptop chips. There are a LOT of integrated/laptop video options that don't support it, however, so that's something to keep in mind. It wasn't too long ago that all textures HAD to be a power of two.

Now, that said, it's easy enough to work around, of course: just load your bitmap into the smallest power-of-two sized texture that will fit it, and only address the pixels that actually contain data. This is usually only an issue for sprites and UI-type stuff - most textures are designed with this constraint in mind.
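A rough sketch of that workaround (hypothetical helper names; assumes tightly packed 24-bit RGB pixels and a current GL context): allocate the next power-of-two size with a null data pointer, upload the real image into the corner with glTexSubImage2D, and scale your texture coordinates so you only sample the occupied region.

#include <windows.h>
#include <GL/gl.h>

// Round up to the next power of two, e.g. 100 -> 128.
unsigned int nextPowerOfTwo(unsigned int n)
{
	unsigned int p = 1;
	while (p < n) p <<= 1;
	return p;
}

// Upload a w x h RGB image into a padded power-of-two texture.
// maxU/maxV receive the texture coordinates of the image's right/top edge.
GLuint loadPaddedTexture(const unsigned char* pixels, unsigned int w, unsigned int h,
                         float* maxU, float* maxV)
{
	unsigned int tw = nextPowerOfTwo(w);
	unsigned int th = nextPowerOfTwo(h);

	GLuint tex = 0;
	glGenTextures(1, &tex);
	glBindTexture(GL_TEXTURE_2D, tex);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

	// Allocate the padded texture without supplying any data...
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, tw, th, 0,
	             GL_RGB, GL_UNSIGNED_BYTE, NULL);
	// ...then copy the real image into the lower-left corner.
	glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
	glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
	                GL_RGB, GL_UNSIGNED_BYTE, pixels);

	// Only this fraction of the texture contains real data.
	*maxU = (float)w / (float)tw;
	*maxV = (float)h / (float)th;
	return tex;
}

At draw time you would then map the quad with texture coordinates from (0, 0) to (maxU, maxV) instead of (0, 0) to (1, 1).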
Quote: Original post by deadstar
Are the textures power-of-two in size? (128x128, 256x256, etc.)

My Sony Vaio (GeForce GO 6400) will not display textures that aren't power-of-two.


It should. If you have GL 2.0 or 2.1, which you must, non-power-of-two texture support is part of the GL core.
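A sketch of how you might verify that at runtime (glGetString is standard GL; note that some chips reportedly advertise 2.0 but drop to a slow software path for NPOT textures, so power-of-two sizes remain the safe default):

#include <windows.h>
#include <GL/gl.h>
#include <cstring>

// True if NPOT textures are available: core since GL 2.0, or via the ARB extension.
bool hasNPOTSupport()
{
	const char* version    = (const char*)glGetString(GL_VERSION);
	const char* extensions = (const char*)glGetString(GL_EXTENSIONS);
	if (version && version[0] >= '2')  // e.g. "2.1.2"; fine for GL 2-9
		return true;
	return extensions != NULL &&
	       std::strstr(extensions, "GL_ARB_texture_non_power_of_two") != NULL;
}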
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Quote: Original post by V-man
If you have GL 2.0 or 2.1, which you must, non-power-of-two texture support is part of the GL core.


Seems like I've got 1.1 on my machine:

Vendor: Microsoft Corporation
Renderer: GDI Generic
Version: 1.1.0

Extensions supported:
GL_WIN_swap_hint
GL_EXT_bgra
GL_EXT_paletted_texture


How does one go about changing the version?
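(For reference, that report is just what the standard glGetString queries return - "GDI Generic" is Microsoft's software fallback renderer, which only implements GL 1.1. A minimal sketch of the query itself, assuming a context is current:)

#include <windows.h>
#include <GL/gl.h>
#include <cstdio>

// Print the implementation strings for whatever GL context is current.
void printGLInfo()
{
	std::printf("Vendor:   %s\n", (const char*)glGetString(GL_VENDOR));
	std::printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));
	std::printf("Version:  %s\n", (const char*)glGetString(GL_VERSION));
}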

