Problems Converting a Fullscreen Game To Windowed

8 comments, last by CandleJack 17 years ago
I want to convert my fullscreen game to windowed mode so that I can use the Visual Studio debugger, but I'm having some problems. I followed the instructions in this article, but things still aren't working quite right: the graphics are being displayed all distorted. The menu screen, for example, looks like this:

Edit: Actually, I just changed the desktop settings to 16 bit color, since the image data in the game is 16 bpp, and it's a little better. It now looks like this:

But I don't want people to have to change their desktop settings just to play the game...
Can you post your rendering code, specifically the setup code that handles creating the window and so on? It looks like there's a small setting broken someplace, but it's hard to know for sure where. (Someone else with more low-level DirectX experience may have some ideas though.) I'm assuming it still works correctly when you run in fullscreen mode?

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

Yep, everything works as expected in fullscreen mode. Here is the code I use for setup when windowed mode is specified.

// Create the Direct Draw object.
if( DirectDrawCreateEx( NULL, (void**)&lpdd, IID_IDirectDraw7, NULL ) != DD_OK )
{
	MessageBox( getAppWindow(), "Failed to create Direct Draw object", "Error", MB_OK );
	return 0;
}

// Set the cooperation level.
if( windowed == true )
{
	if( lpdd->SetCooperativeLevel( getAppWindow(), DDSCL_NORMAL ) != DD_OK )
	{
		MessageBox( getAppWindow(), "Failed to set the cooperative level", "Error", MB_OK );
		return 0;
	}

	// Set up the surface description structure for the primary surface.
	memset( &ddsd, 0, sizeof( ddsd ) );
	ddsd.dwSize  = sizeof( ddsd );
	ddsd.dwFlags = DDSD_CAPS;
	ddsd.ddsCaps.dwCaps = DDSCAPS_PRIMARYSURFACE;

	// Create the primary surface.
	if( lpdd->CreateSurface( &ddsd, &lpddsPrimary, NULL ) != DD_OK )
	{
		MessageBox( getAppWindow(), "Failed to create the primary surface", "Error", MB_OK );
		return 0;
	}

	// Set up the surface description structure for the secondary surface.
	memset( &ddsd, 0, sizeof( ddsd ) );
	ddsd.dwSize  = sizeof( ddsd );
	ddsd.dwFlags = DDSD_CAPS | DDSD_HEIGHT | DDSD_WIDTH;
	ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN;
	ddsd.dwWidth  = 800;
	ddsd.dwHeight = 600;

	// Create the secondary surface.
	if( lpdd->CreateSurface( &ddsd, &lpddsSecondary, NULL ) != DD_OK )
	{
		MessageBox( getAppWindow(), "Failed to create the secondary surface", "Error", MB_OK );
		return 0;
	}
}


And then I have the clipper code, which I just discovered was being attached to the wrong surface; that was what caused the problem earlier. But I still want the game to display correctly under any desktop color settings. So things are now working correctly in 16 bit color mode, just not in 32 bit.

LPDIRECTDRAWCLIPPER lpddClipper;

// Create the clipper object.
if( lpdd->CreateClipper( 0, &lpddClipper, NULL ) != DD_OK )
{
	MessageBox( getAppWindow(), "Failed to create the clipper", "Error", MB_OK );
	return 0;
}

if( windowed == true )
{
	if( lpddClipper->SetHWnd( 0, getAppWindow() ) != DD_OK )
	{
		return 0;
	}

	// Attach the clipper to the surface specified.
	if( lpddsPrimary->SetClipper( lpddClipper ) != DD_OK )
	{
		return 0;
	}
}
In windowed mode, you can get whatever format is currently in use with the following function.

// pd3d = pre-initialized Direct3D (IDirect3D9) object
D3DDISPLAYMODE d3dmode;
if( FAILED( pd3d->GetDisplayMode( D3DADAPTER_DEFAULT, &d3dmode ) ) )
{
	// Error occurred
}
else
{
	// d3dmode now contains the display mode in use on the machine, including
	// window size, refresh rate, and bit-depth.
}
That will do him a tremendous amount of good since he's using DD7... :P

EDIT: Anyway, looking at your problem more closely, I'd imagine that your offscreen surfaces are still being stored as 16 bit even when your primary surface chain is 32 bit (because the user is in 32 bit mode). The way to solve this is to ask DD to create those offscreen surfaces in the same format as the primary surface; that way, when you copy from them to the backbuffer, things don't get screwed to hell.

If you want to post your surface creation / bitmap loading code I'll try and point out what you need to change.
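To make that concrete, here's a rough sketch of the change I mean, reusing the lpdd / lpddsPrimary / ddsd names from the earlier post (untested, and assuming the DX7 headers; lpddsSprite stands in for whichever offscreen surface is being created): query the primary surface's pixel format and pass it explicitly when creating each offscreen surface.

```cpp
// Ask the primary surface what format the desktop is actually in.
DDPIXELFORMAT ddpf;
memset( &ddpf, 0, sizeof( ddpf ) );
ddpf.dwSize = sizeof( ddpf );

if( lpddsPrimary->GetPixelFormat( &ddpf ) != DD_OK )
	return 0;

// Request that same format explicitly when creating the offscreen surface.
memset( &ddsd, 0, sizeof( ddsd ) );
ddsd.dwSize         = sizeof( ddsd );
ddsd.dwFlags        = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT | DDSD_PIXELFORMAT;
ddsd.dwWidth        = width;
ddsd.dwHeight       = height;
ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN;
ddsd.ddPixelFormat  = ddpf;

if( lpdd->CreateSurface( &ddsd, &lpddsSprite, NULL ) != DD_OK )
	return 0;
```

That way the offscreen surfaces always match whatever depth the user's desktop happens to be in.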
Quote:Original post by Illumini
That will do him a tremendous amount of good since he's using DD7... :P

EDIT: Anyway, looking at your problem more closely, I'd imagine that your offscreen surfaces are still being stored as 16 bit even when your primary surface chain is 32 bit (because the user is in 32 bit mode). The way to solve this is to ask DD to create those offscreen surfaces in the same format as the primary surface; that way, when you copy from them to the backbuffer, things don't get screwed to hell.

If you want to post your surface creation / bitmap loading code I'll try and point out what you need to change.


I posted the code I used to create the two page flipping surfaces a few posts up. Here is the code I use when creating offscreen surfaces for sprites:

for( index = 0; index < numFrames; index++ )
{
	memset( &ddsd, 0, sizeof( ddsd ) );
	ddsd.dwSize         = sizeof( ddsd );
	ddsd.dwFlags        = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT;
	ddsd.dwWidth        = width;
	ddsd.dwHeight       = height;
	ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN;

	// Create the surface.
	if( lpdd->CreateSurface( &ddsd, &( images[index] ), NULL ) != DD_OK )
	{
		MessageBox( getAppWindow(), "Failed to create sprite surface", "Error", MB_OK );
		return 0;
	}
}


As far as the bitmap loading code, I got it from a book I bought, so I'm not sure if it's OK to post (if I'm wrong about that, let me know and I'll post it for you), but I can tell you what it does. It loads the bitmap into a bitmap file structure; if it is 8 bit or 16 bit, it loads the image data as-is, and if it is not 8 or 16 bit, it converts it to 16 bit. Could this be causing the problem? I use MS Paint to do all my sprites, and the only formats it offers to save as are 24 bit, 8 bit, or 16 color (4 bit, I guess) bmp's. So I've been saving all my images as 24 bit bmp's, and they get converted to 16 bit. Would I have to convert them to whatever format the user's desktop is in?
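For reference (this is the standard approach, not necessarily exactly what the book's loader does), a 24 bit to 16 bit conversion like the one described typically packs each pixel down to RGB565, throwing away the low bits of each channel:

```cpp
#include <cstdint>

// Pack an 8-bits-per-channel pixel down to 16-bit RGB565.
// The low 3 bits of red and blue and the low 2 bits of green are discarded.
uint16_t packRGB565( uint8_t r, uint8_t g, uint8_t b )
{
	return static_cast<uint16_t>( ( ( r >> 3 ) << 11 ) |
	                              ( ( g >> 2 ) << 5 )  |
	                              (   b >> 3 ) );
}
```

For example, packRGB565( 255, 255, 255 ) gives 0xFFFF and packRGB565( 255, 0, 0 ) gives 0xF800.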
Bumping in hopes someone can offer some more insight into the desktop/game color mode issue.

Quote:Original post by CandleJack
If it is not 8 or 16 bit it converts it to 16 bit. Could this be causing the problem?


Yes.
Quote:Original post by CandleJack
It loads in the bitmap into a bitmap file structure, and if it is 8 bit or 16 bit, it loads the image data as it should. If it is not 8 or 16 bit it converts it to 16 bit. Could this be causing the problem?
Consider:
- You create a 24 bit bitmap.
- You load this bitmap into your program. As it is neither 8 nor 16 bit, it is converted to 16 bit. You now have a 16 bit image in memory that you can work with.
- You now try to display the image as a 24 bit bitmap.

The issue is that you're essentially discarding the additional data when you load the bitmap into memory; you can't get that data back once it's discarded, so your existing bitmap loader is only good for displaying 8 or 16 bit (or lower quality) bitmaps. You'll need to get yourself some bitmap loading code that can handle your 24 bit images (or stick with 16 bit images, but I assume you don't want to do that).
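To see why the discarded bits can't be recovered, here's a small illustration (my own sketch, not from the loader in question): pack a 24 bit pixel down to RGB565 and expand it back, and the channel values come back changed.

```cpp
#include <cstdint>

// Pack 8-bit channels down to RGB565, discarding low bits.
uint16_t pack565( uint8_t r, uint8_t g, uint8_t b )
{
	return static_cast<uint16_t>( ( ( r >> 3 ) << 11 ) |
	                              ( ( g >> 2 ) << 5 )  |
	                              (   b >> 3 ) );
}

// Expand RGB565 back to 8-bit channels by replicating the high bits.
void unpack565( uint16_t p, uint8_t &r, uint8_t &g, uint8_t &b )
{
	uint8_t r5 = ( p >> 11 ) & 0x1F;
	uint8_t g6 = ( p >> 5 )  & 0x3F;
	uint8_t b5 =   p         & 0x1F;
	r = static_cast<uint8_t>( ( r5 << 3 ) | ( r5 >> 2 ) );
	g = static_cast<uint8_t>( ( g6 << 2 ) | ( g6 >> 4 ) );
	b = static_cast<uint8_t>( ( b5 << 3 ) | ( b5 >> 2 ) );
}
```

Round-tripping ( 250, 100, 50 ) through these comes back as ( 255, 101, 49 ): the original values are gone for good.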

- Jason Astle-Adams

I see. If I change it to load 24 bit images, would this display correctly if the desktop is in 32 bit color mode? The only options my desktop has are 16 and 32 bit. Also, is there a way to get the desktop color mode, so that I can convert the bitmaps to whatever format the desktop is in, and thus display correctly on all computers?
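As a sketch of the per-pixel side of that idea (my own illustration, assuming the desktop depth has already been determined, e.g. via GetPixelFormat on the primary surface), the loader could pick a packing routine based on the detected bits per pixel:

```cpp
#include <cstdint>

// Pack an 8-bits-per-channel source pixel into the desktop's format.
// Handles the two depths mentioned above: 16 bit (RGB565) and 32 bit (XRGB8888).
uint32_t packForDesktop( int desktopBpp, uint8_t r, uint8_t g, uint8_t b )
{
	if( desktopBpp == 32 )
	{
		// XRGB8888: no precision is lost.
		return ( static_cast<uint32_t>( r ) << 16 ) |
		       ( static_cast<uint32_t>( g ) << 8 )  |
		         b;
	}

	// Assume 16 bit RGB565 otherwise.
	return ( static_cast<uint32_t>( r >> 3 ) << 11 ) |
	       ( static_cast<uint32_t>( g >> 2 ) << 5 )  |
	         ( b >> 3 );
}
```

One caveat: some 16 bit modes are RGB555 rather than RGB565, so reading the actual channel masks out of the DDPIXELFORMAT structure is more robust than assuming 565.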

This topic is closed to new replies.
