Advice on modifying old game's DirectX calls

There is a small group of us trying to improve a much loved old game called "Swat 3" by Sierra. Unfortunately, the source code is unavailable, so our only choice is to hack the game by hex editing the executable. We would love to get the opinions of the experts here on a problem we are trying to solve.

The game was originally released in 1999, when 16-bit colour and 800x600 resolution were the standard. We have hacked the game's .exe to allow much higher resolution textures and to run the game at any resolution we want, e.g. 1440x900. What we would really love now is to get the game to render in 24-bit or 32-bit colour. Would the smart people in the forums have a clue what we should be looking for in the .exe? Is it even possible?

According to notes by the lead programmer, Swat 3 was originally made for DirectX 6.1 but was updated to 7.0 in a later patch, and allegedly to 8.0 in the final patch. His notes also mention Visual C++ 6.0. Would they have actually written the whole game in Visual C++? Any help would be appreciated.
Visual C++ could mean it's written entirely in C, entirely in C++, or a mixture, unfortunately. I don't know if it matters for what you're doing, though.

Going from D3D6 to D3D8 is interesting, though. Do you know if the latest version still needs to use DirectDraw? You could try grabbing a copy of the application Dependency Walker and seeing which D3D .dll files the game is loading. Then figure out what work is involved depending on whether it's using DirectDraw or not.
My first real experiences in DX were with DX 6.1, but today I can't remember what was there and what wasn't. Too much has changed. But I want to help :) Maybe this can help you; sorry if not.

There are three things you have to look for:

First is the render target format. This is the format of the buffer that gets rendered to and then displayed on the screen. For 16-bit there should be a value such as "R5G6B5". This value is an integer in a struct called "present parameters", which is passed to the device-create function and to the device-reset function. The integer value for R5G6B5 should be 23. There are also other 16-bit formats; the values can be 24-26, 29, 30 or 40. I do not expect other values there, but I'm not sure. For example, 40 stands for an 8-bit palette index plus 8-bit alpha. But for a render target that is displayed directly on the screen, an alpha value doesn't make sense. You have to change this format value to 20 (24-bit), 21 (32-bit with alpha) or 22 (32-bit without alpha).
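If it helps, here is how that value would appear in Direct3D 8 terms (assuming the final patch really did move the game to the D3D8 API; the older DirectDraw/D3D7 path describes the format with a DDPIXELFORMAT bit-mask structure instead). This is only a sketch with made-up names, not Swat 3's actual code:

#include <windows.h>
#include <d3d8.h>

// Sketch only: shows where the back buffer format (the value 23/22/etc.)
// sits when a D3D8 device is created. Patching the 16-bit constant to a
// 32-bit one at the equivalent spot in the exe is the goal.
IDirect3DDevice8* CreateDevice32Bit(IDirect3D8* d3d, HWND hWnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.BackBufferWidth  = 1440;
    pp.BackBufferHeight = 900;
    pp.BackBufferFormat = D3DFMT_X8R8G8B8;   // 22 = 32-bit; D3DFMT_R5G6B5 (23) is the 16-bit case
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow    = hWnd;
    pp.Windowed         = FALSE;
    pp.FullScreen_PresentationInterval = D3DPRESENT_INTERVAL_DEFAULT;

    IDirect3DDevice8* device = NULL;
    HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                   D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                   &pp, &device);
    return SUCCEEDED(hr) ? device : NULL;
}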

The second thing is the format of the textures, which also have to be created and sampled in 24/32-bit. This is the same kind of format value as mentioned before, except that this time it is passed when creating a texture.
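In D3D8 terms again, the texture side would look something like this (sketch only; the function name and memory pool are my assumptions, not taken from the game):

#include <windows.h>
#include <d3d8.h>

// Sketch: the same D3DFORMAT enum shows up at texture creation, so a
// 32-bit conversion would need these call sites patched too.
IDirect3DTexture8* CreateTexture32Bit(IDirect3DDevice8* device, UINT width, UINT height)
{
    IDirect3DTexture8* tex = NULL;
    // D3DFMT_A8R8G8B8 (21) instead of a 16-bit format such as
    // D3DFMT_R5G6B5 (23) or D3DFMT_A4R4G4B4 (26).
    HRESULT hr = device->CreateTexture(width, height, 1, 0,
                                       D3DFMT_A8R8G8B8, D3DPOOL_MANAGED, &tex);
    return SUCCEEDED(hr) ? tex : NULL;
}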

The third thing is alpha. If you want 32-bit textures, then I suppose you want alpha. If so, you have to enable it first, if it is not already enabled in the game. But this can become a little more complicated. I would say, try the 24-bit mode first :)
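If you do go for alpha, the usual D3D8 render states look like this (again just a sketch; whether the game already enables blending somewhere, I don't know):

#include <windows.h>
#include <d3d8.h>

// Sketch: standard alpha-blending setup for textures with an alpha channel.
void EnableAlphaBlending(IDirect3DDevice8* device)
{
    device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    device->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
    device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
}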
Quote:Original post by sirlemonhead
Do you know if the latest version still needs to use DirectDraw?


I am pretty sure the game still uses DirectDraw, judging by the game's error messages when players have DirectX problems. I will look for Dependency Walker and let you know the results. Thanks for the heads-up about it. :)

UPDATE:

It does indeed still use DirectDraw. Here is a screenshot from Dependency Walker. Not sure it tells you anything useful.



Here are links to the Dependency Walker profiles that were made. These captures were taken while the executables were running.

The original untouched Swat executable (800x600 resolution, maximum texture size of 256x256):

Download original Swat 3 profile in .rtf format (501 KB)



Our hacked Swat 3 file, which gives a resolution of 1440x900 (instead of 800x600) and allows maximum .bmp textures of 1024x1024 (instead of 256x256):

Download hacked Swat 3 profile in .rtf format (588 KB)



To be honest, I am in way over my head. I don't have a solid enough programming background. I have used hex editors for several years and program in Visual Basic and several web languages, but I have never touched C++ or DirectX coding. Please treat me like the idiot that I am, lol.

[Edited by - Slippery_Jim on February 14, 2010 3:51:15 PM]
Quote:Original post by Pyrogame
But I want to help :) Maybe this can help you; sorry if not.


Hey, any help is much appreciated. I obviously lack the programming background to do this on my own.


Quote:Original post by Pyrogame
The integer value for R5G6B5 should be 23. There are also other 16-bit formats; the values can be 24-26, 29, 30 or 40.


Hmm, it sounds like finding this in a hex editor would be like finding a needle in a haystack. Any suggestions on how to narrow down the search? Is this more of a job for a disassembler? Pardon my lack of knowledge.
Did I scare people away?
But you did hack it for larger resolution/texture size?
So you have some grasp of what to do. I suggest using some form of debugger like WinDbg or SoftICE to try to see where it sets the bit depth.

I admit it can be hard to track down.

Another (cooler but more difficult) way would be to create wrapper DLLs for the DirectX DLLs used by the program and then implement these in DirectX 9/10/OpenGL or similar.
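Very roughly, the wrapper idea looks like this for the DirectDraw entry point (a stripped-down sketch, assuming the game loads ddraw.dll; all names are illustrative, and you would also need a .def file so the DLL actually exports DirectDrawCreate):

#include <windows.h>
#include <ddraw.h>

typedef HRESULT (WINAPI *DirectDrawCreate_t)(GUID*, LPDIRECTDRAW*, IUnknown*);

// A fake ddraw.dll dropped into the game folder. The loader finds it before
// the system copy, so every DirectDraw call goes through here first.
extern "C" HRESULT WINAPI DirectDrawCreate(GUID* lpGUID, LPDIRECTDRAW* lplpDD, IUnknown* pUnkOuter)
{
    // Load the real ddraw.dll from the system directory (not the game folder,
    // or we would just load ourselves again).
    char path[MAX_PATH];
    GetSystemDirectoryA(path, MAX_PATH);
    lstrcatA(path, "\\ddraw.dll");

    HMODULE realDll = LoadLibraryA(path);
    if (!realDll)
        return DDERR_GENERIC;

    DirectDrawCreate_t realCreate =
        (DirectDrawCreate_t)GetProcAddress(realDll, "DirectDrawCreate");
    if (!realCreate)
        return DDERR_GENERIC;

    // Forward to the real implementation. A full wrapper would hand back its
    // own IDirectDraw object here so it could intercept SetDisplayMode and
    // the surface/texture creation calls, or route them to D3D9/OpenGL.
    return realCreate(lpGUID, lplpDD, pUnkOuter);
}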

Quote:Original post by ozak
Another (cooler but more difficult) way would be to create wrapper DLLs for the DirectX DLLs used by the program and then implement these in DirectX 9/10/OpenGL or similar.


I believe I saw something like this done for Fallout 2. You had to copy a modified DirectDraw .dll file into the game's folder, and this gave higher resolutions and colour depth.

We have several people helping us who have much greater knowledge than I do in terms of disassembling and coding. Their time is limited, so I was hoping to bring them some info from this forum that will aid us in our quest.

Of course, the real question is whether this 3D game's apparent 16-bit colour limit is a product of the lighting engine, and thus nearly impossible to change without the source code, or whether it could be a simple DirectX call that could be hacked.

One thing I should mention is that the game uses .BMP textures. The game accepts 8-bit or 24-bit textures, but the textures appear to be rendered in 16-bit colour, since you get that characteristic banding instead of a clean gradient when you view them.
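(For anyone wondering why 24-bit source art still bands: my understanding is that each 8-bit channel gets truncated to 5 or 6 bits on a 16-bit target, so a smooth gradient collapses into visible steps. A little illustrative snippet, nothing to do with Swat 3's own code:)

#include <cstdint>

// Pack an 8-8-8 colour into R5G6B5, the layout most 16-bit modes use.
// Neighbouring 24-bit values such as 100, 101, 102 and 103 all end up as the
// same 16-bit value, which is exactly the step pattern you see as banding.
uint16_t PackR5G6B5(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}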


Thanks for the reply.
I'm pretty sure it could be changed to 24/32-bit color with a simple exe hack.
I can't see why its lighting engine should depend on this. You never know, though ;)
Quote:Original post by ozak
I'm pretty sure it could be changed to 24/32-bit color with a simple exe hack.
I can't see why its lighting engine should depend on this. You never know, though ;)


Your post certainly gives us encouragement. Thanks.

This topic is closed to new replies.
