mob1930

Disable rendering ?


How do I disable rendering of a scene entirely? Basically, I don't need to see anything; the point is keeping CPU and memory usage to a minimum. I am creating a D3D hook, and I tried Sleep() but it didn't help. So, how do I disable rendering of everything? Thanks in advance.

Don't call any D3D functions?

I don't understand the problem - there's no IgnoreEverythingITellYou() function in D3D or anything; if you don't want anything rendered, just don't call any rendering functions.

I am not making a game; I am making a hook for an existing game. Do you understand what I'm saying? I only want to leave the game's TCP functions running, for example - well, not exactly that; I just want the game to stop rendering scenes.

Quote:
Original post by mob1930
I am not making a game; I am making a hook for an existing game. Do you understand what I'm saying? I only want to leave the game's TCP functions running, for example - well, not exactly that; I just want the game to stop rendering scenes.
Well, if you've hooked D3D you can just ignore the call to Present(), and nothing will be rendered.

Quote:
Original post by mob1930
But will that lower CPU and memory usage?
Somewhat. You can lower CPU usage considerably by ignoring Draw[Indexed]Primitive and any Set*() calls too, but for the best effect you'll have to make every call a no-op. That means implementing your own interfaces so that CreateVertexBuffer(), etc. can succeed and return a dummy pointer with hardly any memory footprint.
Effectively you're implementing every D3D interface that the app uses.

And if the app ever does something like reading back from a texture, you'll have to keep a copy of the data in any case so you can pass it back to the app.
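The dummy-resource idea can be sketched like this - a platform-neutral C++ model with hypothetical names (the real code would wrap IDirect3DVertexBuffer9), showing both the near-zero footprint and the read-back catch mentioned above:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical stand-in for a wrapped vertex buffer. It allocates no
// storage up front, so hundreds of "created" buffers cost almost nothing.
// The catch: the moment the app locks the buffer to read or write, we must
// provide real storage and keep the data around to hand back later.
class DummyVertexBuffer {
public:
    explicit DummyVertexBuffer(std::size_t sizeBytes) : size_(sizeBytes) {}

    void* Lock() {
        if (storage_.empty())
            storage_.resize(size_);  // pay the memory cost only if touched
        return storage_.data();
    }
    void Unlock() {}

    std::size_t AllocatedBytes() const { return storage_.size(); }

private:
    std::size_t size_;
    std::vector<unsigned char> storage_;
};
```

Until the application actually calls Lock(), the wrapper holds only a size field; once locked, the data has to stay resident so later reads don't break.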

By ignoring calls, do you mean something like returning nothing (null)? Isn't it impossible to ignore a call with a D3D hook (maybe I'm wrong, though), since a hook works as a proxy - something like application -> hook -> D3D interface?

Thanks in advance.

Quote:
Original post by mob1930
By ignoring calls, do you mean something like returning nothing (null)? Isn't it impossible to ignore a call with a D3D hook (maybe I'm wrong, though), since a hook works as a proxy - something like application -> hook -> D3D interface?

Thanks in advance.
If you put your own d3d9.dll file in the application's working directory, then it'll probably load that. Then the app will call your DLL's Direct3DCreate9() function. Alternatively, you can start the app up and pause it before it calls Direct3DCreate9() (usually done by calling CreateProcess with the CREATE_SUSPENDED flag).
In either case, your own Direct3DCreate9() function gets called. At that point, you can return your own interface which is derived from IDirect3D9. The target application will then call functions in your own class.

At that point you'll want to load the real d3d9.dll file, and use GetProcAddress() to get the real Direct3DCreate9() function. You can then use that to create a real D3D interface, and you can have your class act as a proxy, simply calling the real D3D functions, or in the case of CreateDevice, you can return another proxy object.

When the application calls DrawPrimitive, your class can simply ignore it and return D3D_OK, so the application thinks it all went well. The same goes for other functions.


So yes, you can just do nothing (return D3D_OK).
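The proxy pattern described above might look roughly like this - a simplified, platform-neutral sketch with hypothetical names (a real hook derives from IDirect3DDevice9 and forwards to the device created by the real d3d9.dll, loaded via LoadLibrary/GetProcAddress):

```cpp
// Simplified model of the proxy-device idea. Names and signatures are
// illustrative, not the real D3D9 COM interfaces.
static const int D3D_OK_SIM = 0;  // stand-in for D3D_OK

struct IDeviceSim {
    virtual ~IDeviceSim() = default;
    virtual int SetTexture(int stage)    = 0;
    virtual int DrawPrimitive(int prims) = 0;
    virtual int Present()                = 0;
};

// "Real" device: just counts calls so we can see what reached it.
struct RealDeviceSim : IDeviceSim {
    int setTextures = 0, draws = 0, presents = 0;
    int SetTexture(int) override    { ++setTextures; return D3D_OK_SIM; }
    int DrawPrimitive(int) override { ++draws;       return D3D_OK_SIM; }
    int Present() override          { ++presents;    return D3D_OK_SIM; }
};

// Proxy handed back to the application: forwards state calls, but turns
// DrawPrimitive and Present into no-ops that still report success.
struct ProxyDeviceSim : IDeviceSim {
    explicit ProxyDeviceSim(IDeviceSim* real) : real_(real) {}
    int SetTexture(int s) override  { return real_->SetTexture(s); }
    int DrawPrimitive(int) override { return D3D_OK_SIM; }  // ignored; app sees "OK"
    int Present() override          { return D3D_OK_SIM; }  // nothing hits the screen
private:
    IDeviceSim* real_;
};
```

The application holds the proxy, happily "draws" every frame, and never notices that no work reaches the real device.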

I've already tried a wrapper, and the application loaded it, but unfortunately it didn't start (only the splash screen appeared). I'll try the methods you described. Thanks a lot for the help, I really appreciate it!

Quote:
Original post by mob1930
I've already tried a wrapper, and the application loaded it, but unfortunately it didn't start (only the splash screen appeared). I'll try the methods you described. Thanks a lot for the help, I really appreciate it!
You can compile the DLL from Visual Studio and have it output the DLL to the target application's working directory, then run the target app through the debugger. That way you can place breakpoints in your DLL to see if there's anything funny going on.

I got the wrapper to work (there was something funny going on :D), but the CPU usage is still the same. The memory usage is lower, though, by about 50 MB. Any ideas? I returned D3D_OK in DrawIndexedPrimitive, DrawPrimitive, and all Set* calls.

Quote:
Original post by mob1930
I got the wrapper to work (there was something funny going on :D), but the CPU usage is still the same. The memory usage is lower, though, by about 50 MB. Any ideas? I returned D3D_OK in DrawIndexedPrimitive, DrawPrimitive, and all Set* calls.
Well, there's not really much you can do. Most of the memory will be used up in textures and vertex and index buffers. You could write your own wrappers for those classes too, so they don't actually allocate any memory - but if the application ever tries to read from a VB, IB or texture you'll be in trouble (Since you don't have the data to give back to the application).

As for CPU usage, it's entirely possible that the app isn't using up much CPU anyway, so disabling rendering calls might not make much (if any) difference. You could log what D3D functions it calls between BeginScene() and Present(); most if not all of them can be "nulled out" (by returning D3D_OK).
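That logging could be sketched as follows - a hypothetical recorder that the proxy would invoke from each wrapped entry point (the names are illustrative, not a real D3D API):

```cpp
#include <string>
#include <vector>

// Hypothetical per-frame call logger. The proxy device calls Record() with
// the name of each intercepted function; everything between BeginScene and
// Present is kept, giving a list of candidates for "nulling out".
struct CallLogger {
    std::vector<std::string> frameCalls;
    bool inFrame = false;

    void Record(const std::string& name) {
        if (name == "BeginScene") { inFrame = true; frameCalls.clear(); }
        if (inFrame) frameCalls.push_back(name);   // only log inside a frame
        if (name == "Present") inFrame = false;    // frame finished
    }
};
```

After one frame, frameCalls holds the exact sequence the app issued, which is usually enough to decide which wrappers to turn into no-ops.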

How are you measuring CPU usage? Remember that an app just doing while(true) {} will use up 100% of the CPU according to task manager.

Well, I am measuring CPU usage via Task Manager, and when I minimize the game the CPU usage goes as low as 0-5%. Obviously, the game stops rendering when it's minimized. It seems I will need to reverse-engineer the function that loads textures so it returns nothing. There is a special DLL called gfxfilemanager.dll inside the game's folder, but I believe you can't help me with the reversing. Thanks a lot though, I really, really appreciate it!

Quote:
Original post by mob1930
Well, I am measuring CPU usage via Task Manager, but when I minimize the game the CPU usage goes as low as 0-5%. Obviously, the game stops rendering when it's minimized.
The game will probably stop processing its game logic too - my engine for instance uses GetMessage() when it's minimised, so it'll use up zero CPU time. When it's in normal usage, it'll use PeekMessage() and then do a game tick and render tick.

Quote:
Original post by mob1930
It seems I will need to reverse-engineer the function that loads textures so it returns nothing. There is a special DLL called gfxfilemanager.dll inside the game's folder, but I believe you can't help me with the reversing. Thanks a lot though, I really, really appreciate it!
Again, this will only work if the game never tries to read from the resource, and many do.

Why is this a concern anyway? Memory is virtualised under Windows, so the memory usage should have practically no impact.

Quote:
Original post by Evil Steve
The game will probably stop processing its game logic too - my engine for instance uses GetMessage() when it's minimised, so it'll use up zero CPU time.


No, the game doesn't stop its game logic when it's minimised. To be more specific, it's an MMO game.

Quote:
Original post by mob1930
No, the game doesn't stop its game logic when it's minimised. To be more specific, it's an MMO game.
The game could be doing minimal processing when it's minimised - just handling socket input/output, I suppose.

The point is that rendering doesn't use up that much CPU time. With D3D, most of the frame time goes into Lock() calls and DrawPrimitive calls.

It might be a better idea to take one of the D3D samples and try making a "null renderer" device for that.

Which reminds me: you could try hooking the CreateDevice() call and changing the device type to D3DDEVTYPE_NULLREF. That's about as empty as you can get. You may also have to hook the resource Create*() calls to create resources in D3DPOOL_SYSTEMMEM or D3DPOOL_SCRATCH, though.

