alisahaf70

Depth Buffer of Injected DirectX (UDK)


Hi (first of all, sorry for my bad English :) )

I injected my DLL into UDK (closed source), and now I have access to BeginScene, EndScene, and also the device. :D

I want to read the stencil buffer data, but I couldn't, because in UDK

Device->GetDepthStencilSurface(&depthbuffer) returns an error (D3DERR_NOTFOUND),

and I need to read the depth of the scene from UDK.

Is there any solution for reading the scene depth in this situation? (I think UDK uses the depth buffer for something else.)

If not, can I read or compute it with another trick, like using HLSL or something else (anything)?

Thank you very much.

 


Here is what the documentation says:

 

"If the method succeeds, the return value is D3D_OK. If the device doesn't have a depth stencil buffer associated with it, the return value will be D3DERR_NOTFOUND. Otherwise, if the method fails, the return value can be D3DERR_INVALIDCALL"

 

Are you absolutely sure there is a depth stencil buffer associated with your device?


Perhaps there is no depth stencil buffer assigned to the device state at the time you are checking? Which functions have you injected and tried?


@aregee

Yes, I think UDK's default settings disable the depth buffer.

Can I turn it on after the device is created without any problems? (I think not :D)

@spazzarama

I injected the DirectX DLL, hooked the EndScene function, and called the GetDepthStencilSurface method inside EndScene.

So, since UDK's default settings disable the depth buffer, can I compute depth manually with HLSL or something else? (Any reference or guide?)

Thanks a lot.

Edited by alisahaf70


So, since UDK's default settings disable the depth buffer, can I compute depth manually with HLSL or something else? (Any reference or guide?)

 

You can calculate depth manually within a shader quite easily and render to a texture, but I imagine this would be quite difficult within a foreign application where you have no control over the render loop. Are you sure it isn't using depth at all? You could perhaps try hooking IDirect3DDevice9::SetDepthStencilSurface to get a reference to the depth buffer on the way in?
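A minimal sketch of that hook-and-capture idea, assuming a detour library has already filled in a pointer to the original function. A stub Surface type stands in for IDirect3DSurface9 so the pattern compiles on its own; the names oSetDepthStencilSurface and g_depthStencil are illustrative, not part of any real API.

```cpp
#include <cassert>

// Stub standing in for IDirect3DSurface9. With the real COM interface you
// would AddRef() the surface when storing it and Release() it when done.
struct Surface { int id; };

// Filled in by the detour library with the address of the original
// IDirect3DDevice9::SetDepthStencilSurface (hypothetical name).
static long (*oSetDepthStencilSurface)(Surface*) = nullptr;

// Last depth-stencil surface the engine bound; read it later, e.g. in EndScene.
static Surface* g_depthStencil = nullptr;

// The detour: record the incoming pointer, then pass the call straight through.
long mSetDepthStencilSurface(Surface* pNewZStencil)
{
    g_depthStencil = pNewZStencil;
    return oSetDepthStencilSurface(pNewZStencil);
}

// Demo stand-in for the original function, used to exercise the hook.
static long realSetDepthStencilSurface(Surface*) { return 0; /* D3D_OK */ }
```

The key point is that the hook only captures the pointer and forwards the call unchanged; any reading of the surface happens later, outside the engine's own call.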

 

I don't know much about the engine, but if it is using a deferred rendering approach perhaps one of the render targets already contains the depth information you are after.

 

 

 

Can I turn it on after the device is created without any problems?

 

That might actually work, worth trying.

Edited by spazzarama


GetDepthStencilSurface will only get you a depth texture that was created with the device at startup. If that's null, then they're creating their own depth textures later on. You'd have to hook CreateTexture and/or SetDepthStencilSurface instead.


Here is an excerpt from the UE3 documentation, not UDK specifically (I think UDK uses the default settings):

 

Rendering state defaults

Since there are so many rendering states, it's not practical to set them all every time we want to draw something. Instead, UE3 has an implicit set of states which are assumed to be set to the defaults (and therefore must be restored to those defaults after they are changed), and a much smaller set of states which have to be set explicitly. The set of states that don't have implicit defaults are:

  • RHISetDepthState
  • RHISetBlendState
  • RHISetRasterizerState
  • RHISetBoundShaderState
  • RHISetStreamSource (if applicable)

All other states are assumed to be at their defaults (as defined by the relevant TStaticState; for example, the default stencil state is set by RHISetStencilState(TStaticStencilState<>::GetRHI())).

 

 

 

 

That might actually work, worth trying.

Can you give me a guide or reference on how to do that? I only know a little about the DirectX API.

 

 

 

GetDepthStencilSurface will only get you a depth texture that was created with the device at startup. If that's null, then they're creating their own depth textures later on. You'd have to hook CreateTexture and/or SetDepthStencilSurface instead.

Thanks, I will try this, but are you sure they are using a depth buffer? (I read somewhere that the depth buffer memory can be used for other purposes; how can I find that out?)

Edited by alisahaf70


@Hodgman

I hooked the SetDepthStencilSurface function successfully, and UDK calls it regularly, but when I try to read the argument's data, a crash occurs. Please help me.

The LockRect call below fails, and after that the crash occurs.

Even D3DXSaveSurfaceToFile("AKS.PNG", D3DXIFF_PNG, pNewZStencil, nullptr, nullptr); crashes too (but it works for the back buffer).

HRESULT WINAPI mSetDepthStencilSurface(IDirect3DSurface9 *pNewZStencil)
{
    RECT rectToLock;
    D3DLOCKED_RECT lockedRect;
    SetRect(&rectToLock, 10, 10, 15, 15);

    // This is the call that fails: depth-stencil surfaces are generally
    // not lockable (the only lockable format is D3DFMT_D16_LOCKABLE).
    if (SUCCEEDED(pNewZStencil->LockRect(&lockedRect, &rectToLock, D3DLOCK_READONLY)))
    {
        unsigned int *pixels = (unsigned int *)lockedRect.pBits;
        // ... read depth values here ...
        pNewZStencil->UnlockRect();
    }
    return oSetDepthStencilSurface(pNewZStencil);
}
Edited by alisahaf70


You should print the error message: is it D3DERR_INVALIDCALL or D3DERR_WASSTILLDRAWING?

 

I am guessing you are getting D3DERR_INVALIDCALL, but you should not assume anything.
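To print the error, a small helper can map the HRESULT to its name; the numeric values below are copied from d3d9.h so the snippet builds without the DirectX SDK (double-check them against your own header).

```cpp
#include <cassert>
#include <cstring>

// D3D9 HRESULT values (facility 0x876), reproduced from d3d9.h as an
// assumption so this compiles standalone; verify against your SDK headers.
const unsigned long kD3D_OK                 = 0x00000000ul;
const unsigned long kD3DERR_NOTFOUND        = 0x88760866ul;
const unsigned long kD3DERR_INVALIDCALL     = 0x8876086Cul;
const unsigned long kD3DERR_WASSTILLDRAWING = 0x8876021Cul;

// Map a D3D9 HRESULT to a printable name instead of guessing which error it is.
const char* D3DErrorName(unsigned long hr)
{
    switch (hr)
    {
        case kD3D_OK:                 return "D3D_OK";
        case kD3DERR_NOTFOUND:        return "D3DERR_NOTFOUND";
        case kD3DERR_INVALIDCALL:     return "D3DERR_INVALIDCALL";
        case kD3DERR_WASSTILLDRAWING: return "D3DERR_WASSTILLDRAWING";
        default:                      return "unknown HRESULT";
    }
}
```

In the hook, logging the failed call's return value through a helper like this immediately tells you which case of the documentation applies.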

 

Note what the documentation says:

 

"The only lockable format for a depth-stencil surface is D3DFMT_D16_LOCKABLE", and

 

"A multisample back buffer cannot be locked", and

 

"This method cannot retrieve data from a surface that is contained by a texture resource created with D3DUSAGE_RENDERTARGET because such a texture must be assigned to D3DPOOL_DEFAULT memory and is therefore not lockable. In this case, use instead IDirect3DDevice9::GetRenderTargetData to copy texture data from device memory to system memory."

 

If your problem is one of the first two, I am not sure what you can do; if it is the last one, Microsoft has a suggestion for you.
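For that last case, the GetRenderTargetData route looks roughly like the sketch below. Minimal stub types stand in for IDirect3DDevice9 and IDirect3DSurface9, so only the call order is shown: create a lockable D3DPOOL_SYSTEMMEM surface with CreateOffscreenPlainSurface, copy into it with GetRenderTargetData, then LockRect the system-memory copy (never the render target itself). Note this applies to render targets, not to the depth-stencil surface directly.

```cpp
#include <cassert>

// Stubs standing in for the D3D9 objects involved; real code would use
// IDirect3DDevice9 / IDirect3DSurface9 and check every HRESULT.
struct Surface {
    bool systemMem = false;
    bool locked    = false;
    long LockRect()   { if (!systemMem) return -1; locked = true; return 0; }
    long UnlockRect() { locked = false; return 0; }
};
struct Device {
    // Stands in for CreateOffscreenPlainSurface(w, h, fmt, D3DPOOL_SYSTEMMEM, ...).
    long CreateOffscreenPlainSurface(Surface** out) {
        static Surface sys; sys.systemMem = true; *out = &sys; return 0;
    }
    // Stands in for GetRenderTargetData: destination must be in SYSTEMMEM.
    long GetRenderTargetData(Surface* /*src*/, Surface* dst) {
        return dst->systemMem ? 0 : -1;
    }
};

// The suggested call order: default-pool render target -> system-memory copy
// -> lock the copy and read its pixels.
bool ReadRenderTarget(Device* dev, Surface* rt)
{
    Surface* sys = nullptr;
    if (dev->CreateOffscreenPlainSurface(&sys) != 0) return false;
    if (dev->GetRenderTargetData(rt, sys) != 0)      return false;
    if (sys->LockRect() != 0)                        return false;
    // ... read pixels from the system-memory copy here ...
    sys->UnlockRect();
    return true;
}
```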
