Advice on modifying old game's DirectX calls


Slippery_Jim    100
There is a small group of us trying to improve a much-loved old game, "Swat 3" by Sierra. Unfortunately the source code is unavailable, so our only choice is to hack the game by hex editing the executable. We would love to get the opinions of the experts here on a problem we are trying to solve.

The game was originally released in 1999, when 16-bit colour and 800x600 resolution were the standard. We have hacked the game's .exe to allow much higher-resolution textures and to run the game at any resolution we want, e.g. 1440x900. What we would really love now is to get the game to render in 24-bit or 32-bit colour. Would the smart people in the forums have a clue what we should be looking for in the .exe? Is it even possible?

According to notes by the lead programmer, Swat 3 was originally made for DirectX 6.1 but was updated to 7.0 in a later patch, and allegedly to 8.0 in the final patch. His notes also mention Visual C++ 6.0. Would they have actually written the whole game in Visual C++? Any help would be appreciated.

sirlemonhead    227
Visual C++ could mean it's written entirely in C, entirely in C++, or a mixture, unfortunately. I don't know if it matters for what you're doing, though.

Going from D3D6 to D3D8 is interesting, though. Do you know if the latest version still needs to use DirectDraw? You could try grabbing a copy of the application Dependency Walker and seeing which D3D .dll files the game is loading. Then figure out what work is involved, depending on whether it's using DirectDraw or not.

Pyrogame    106
My first real experience with DX was DX 6.1, but today I can't remember what was there and what wasn't; too much has changed. But I want to help :) Maybe this can help you. Sorry if not.

There are three things you have to look for:

First is the render target format. This is the buffer format that will be rendered to and then displayed on the screen. For 16-bit there should be a value, for example R5G6B5. This value is an integer in a struct called "present parameters", which is passed to the device-create function and to the device-reset function. The integer value for R5G6B5 should be 23. There are also other 16-bit formats; the values can be 24-26, 29, 30 or 40. I do not expect other values there, but I'm not sure. For example, 40 stands for an 8-bit palette index plus 8-bit alpha. But for a render target that is displayed directly on the screen, an alpha value makes no sense. You have to change this format value to 20 (24-bit), 21 (32-bit with alpha) or 22 (32-bit without alpha).
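In code it looks roughly like this on the D3D8 path (a sketch only, assuming the final patch really did move the game to Direct3D 8; the older DX6/7 interfaces describe formats with DDPIXELFORMAT bitmasks instead of these enum values):

#include <d3d8.h>

// Hypothetical helper showing where the format value lives. The D3DFORMAT
// values are: D3DFMT_R5G6B5 = 23 (16-bit), D3DFMT_R8G8B8 = 20 (24-bit),
// D3DFMT_A8R8G8B8 = 21 and D3DFMT_X8R8G8B8 = 22 (32-bit).
D3DPRESENT_PARAMETERS Make16BitPresentParams()
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.BackBufferWidth  = 1440;
    pp.BackBufferHeight = 900;
    pp.BackBufferFormat = D3DFMT_R5G6B5;      // 23: the 16-bit constant to hunt for
    // pp.BackBufferFormat = D3DFMT_X8R8G8B8; // 22: what a 32-bit patch would use
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.Windowed         = FALSE;
    // This struct is passed to IDirect3D8::CreateDevice and IDirect3DDevice8::Reset,
    // so in a disassembly the constant 23 should show up near those calls.
    return pp;
}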

The second thing is the format of the textures, which also have to be sampled at 24/32 bit. This should be the same format value as mentioned before, except that this time it is passed when creating a texture.
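Again as a D3D8-style sketch (the device pointer and the sizes here are hypothetical, just to show where the format value appears):

#include <d3d8.h>

// The same format enum, passed at texture creation time.
IDirect3DTexture8* Create32BitTexture(IDirect3DDevice8* device)
{
    IDirect3DTexture8* tex = nullptr;
    device->CreateTexture(
        1024, 1024,        // width, height
        0,                 // 0 = generate a full mip chain
        0,                 // usage flags
        D3DFMT_A8R8G8B8,   // 21: 32-bit with alpha (a 16-bit original would use 23, 25 or 26)
        D3DPOOL_MANAGED,
        &tex);
    return tex;
}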

The third thing is alpha. If you want 32-bit textures, then I suppose you want alpha. If so, you have to enable this first, if it is not yet enabled in the game. But this can become a little more complicated. I would say, try the 24-bit mode first :)
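If it turns out blending has to be switched on, it would look something like this on the D3D8 path (a hedged guess at typical render states, not necessarily what the game does):

#include <d3d8.h>

// Typical render states for alpha-blended textures.
void EnableAlphaBlending(IDirect3DDevice8* device)
{
    device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    device->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
    device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
}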

Slippery_Jim    100
Quote:
Original post by sirlemonhead
Do you know if the latest version still needs to use DirectDraw?


I am pretty sure the game still uses DirectDraw, judging by the error messages players get when they have DirectX problems. I will look for Dependency Walker and let you know the results. Thanks for the heads-up about it. :)

UPDATE:

It does indeed still use DirectDraw. Here is a screenshot from Dependency Walker. Not sure it tells you anything useful.



Here are links to the Dependency Walker profiles that were made. These are captures that were done while the executables were running.

The original untouched Swat executable (800x600 resolution, maximum texture size of 256x256):

Download original Swat 3 profile in .rtf format (501 KB)



Our hacked Swat 3 file, which gives a resolution of 1440x900 (instead of 800x600) and allows maximum .bmp textures of 1024x1024 (instead of 256x256):

Download hacked Swat 3 profile in .rtf format (588 KB)



To be honest, I am in way over my head. I don't have a solid enough programming background. I have used hex editors for several years and program in Visual Basic and several web languages, but I have never touched C++ or DirectX coding. Please treat me like the idiot that I am, lol.

[Edited by - Slippery_Jim on February 14, 2010 3:51:15 PM]

Slippery_Jim    100
Quote:
Original post by Pyrogame
But I want to help :) Maybe this can help you. Sorry if not.


Hey, any help is much appreciated. I obviously lack the programming background to do this on my own.


Quote:
Original post by Pyrogame
The integer value for B5G6R5 should be 23. There are also other 16bit formats, the values can be 24-26, 29, 30 or 40.


Hmm, it sounds like finding this in a hex editor would be like finding a needle in a haystack. Any suggestions on how to narrow down the search? Is this more of a job for a disassembler? Pardon my lack of knowledge.

ozak    155
But you did hack it for a larger resolution/texture size, so you have some grasp of what to do. I suggest some form of debugger, like WinDbg or SoftICE, to try to see where it sets the bit depth.

I admit it can be hard to track down.
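Since you've confirmed the game still uses DirectDraw, one concrete thing to look for in the debugger: on that path the screen depth is the third argument to IDirectDraw7::SetDisplayMode. Something like this (a sketch; 'dd' stands for the IDirectDraw7 pointer the game creates, and the numbers are guesses):

#include <ddraw.h>

// width, height, bits per pixel, refresh rate, flags
void SetMode16(LPDIRECTDRAW7 dd)
{
    dd->SetDisplayMode(800, 600, 16, 0, 0);
}

If you can find that call site, patching the 16 to a 32 is the obvious first experiment, though any surfaces the game creates would have to match the new depth.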

Another (cooler but more difficult) way would be to create wrapper DLLs for the DirectX DLLs used by the program and then implement these in DirectX 9/10/OpenGL or similar.

Slippery_Jim    100
Quote:
Original post by ozak
Another (cooler but more difficult) way would be to create wrapper DLLs for the DirectX DLLs used by the program and then implement these in DirectX 9/10/OpenGL or similar.


I believe I saw something like this done for Fallout 2. You had to copy a modified DirectDraw .dll file into the game's folder, and this gave higher resolutions and colour depth.

We have several people helping us who have much greater knowledge than I do of disassembly and coding. Their time is limited, so I was hoping to bring them some info from this forum that would aid us in our quest.

Of course, the real question is: is the fact that this 3D game seems to be limited to 16-bit colour a product of the lighting engine, and thus nearly impossible to change without the source code, or do people think it could be a simple DirectX call that could be hacked?

One thing I should mention is that the game uses .BMP textures. The game accepts 8-bit or 24-bit textures, but the textures appear to be rendered in 16-bit colour, since you get that characteristic banding instead of a clean gradient when you view the textures.
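(For what it's worth, that banding is exactly what you'd expect if 24-bit source pixels are being quantized down to a 16-bit format like R5G6B5. A small illustration, not code from the game:)

#include <stdint.h>

// Packing a 24-bit colour into 16-bit R5G6B5 throws away the low bits of
// each channel, so a smooth gradient collapses into visible steps.
uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}
// e.g. source red values 248..255 all map to the same 5-bit value, 31.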


Thanks for the reply.

ozak    155
I'm pretty sure it could be changed to 24/32-bit color with a simple exe hack.
I can't see why its lighting engine should depend on this. You never know, though ;)

Slippery_Jim    100
Quote:
Original post by ozak
I'm pretty sure it could be changed to 24/32-bit color with a simple exe hack.
I can't see why its lighting engine should depend on this. You never know, though ;)


Your post certainly gives us encouragement. Thanks.

ozak    155
Np. Did you check https://www.reverse-engineering.net?
There are some great discussions on similar subjects there, including DirectX stuff :)

Evil Steve    2017
If the game is 3D (I know nothing about the game), then it's almost certainly using Direct3D. The reason it links to DirectDraw is that with DX 7 and earlier, you had to create a DirectDraw object, and then query the IDirectDraw7 object for an IDirect3D7 object.
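In code, that bootstrap looks roughly like this (a DX7-era sketch, which is also why a pure 3D game still imports ddraw.dll):

#include <ddraw.h>
#include <d3d.h>

void CreateD3D7(LPDIRECTDRAW7* dd, LPDIRECT3D7* d3d)
{
    // Create the DirectDraw object first...
    DirectDrawCreateEx(nullptr, (void**)dd, IID_IDirectDraw7, nullptr);
    // ...then query it for the Direct3D interface.
    (*dd)->QueryInterface(IID_IDirect3D7, (void**)d3d);
}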


Quote:
Original post by ozak
Another (cooler but more difficult) way would be to create wrapper DLLs for the DirectX DLLs used by the program and then implement these in DirectX 9/10/OpenGL or similar.
Or just write a proxy d3d7.dll (or whatever the DLL is called) which forwards all calls and fudges parameters where needed (e.g. the CreateSurface() calls). It's been a while since I used DX7 and I can't remember the exact details of what you need, except that there's a lot more work required: you need to create the front buffer, back buffer and Z-buffer yourself (which CreateDevice() does for you in DX8+).
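A minimal sketch of that proxy idea for ddraw.dll, since that's what Swat 3 links against (the details here are illustrative, and in practice the export needs a .def file so the stdcall name isn't decorated):

#include <windows.h>
#include <ddraw.h>

typedef HRESULT (WINAPI *PFN_DirectDrawCreate)(GUID*, LPDIRECTDRAW*, IUnknown*);

// Same entry point the game imports: load the real system ddraw.dll,
// forward the call, and (in a real proxy) wrap the returned interface to
// fudge SetDisplayMode / CreateSurface parameters before passing them on.
extern "C" HRESULT WINAPI DirectDrawCreate(GUID* guid, LPDIRECTDRAW* out, IUnknown* outer)
{
    char path[MAX_PATH];
    GetSystemDirectoryA(path, MAX_PATH);
    lstrcatA(path, "\\ddraw.dll");   // the real DLL, not this proxy
    HMODULE real = LoadLibraryA(path);
    PFN_DirectDrawCreate fn =
        (PFN_DirectDrawCreate)GetProcAddress(real, "DirectDrawCreate");
    return fn ? fn(guid, out, outer) : DDERR_GENERIC;
}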

ozak    155
Yeah, that would be cool, but I think if it's just 16- to 24-bit then it might be an easy fix, provided one could find the exact point in the executable to modify :)

Found something on https://www.reverse-engineering.net about changing fullscreen DX games to windowed via a simple change. Maybe similar methods could be used. I suggest posting there, as it seems there are some hardcore RE guys there ;)

Slippery_Jim    100
Quote:
Original post by Evil Steve
If the game is 3D (I know nothing about the game), then it's almost certainly using Direct3D.


I suspect it is using DirectDraw for the 2D menus (which consist mainly of 800x600 .BMP graphics for each screen) and then switches to Direct3D when the actual 3D portion of the game begins. Of course, I know almost nothing about DirectX, so I could be wrong.

Slippery_Jim    100
Quote:
Original post by ozak
https://www.reverse-engineering.net... suggest posting there as it seems there's some hardcore RE guys there ;)


Thanks for the suggestion. Will post there too :D
