
Member Since 28 Jun 2013
Last Active Aug 21 2015 10:54 PM

Posts I've Made

In Topic: Graphic anomalies

26 December 2014 - 04:34 PM

Thanks for the prompt replies.




Assuming I saw the right thing, it is called Z-fighting.

Your map is huge (the one I loaded was at least 10,000 units), which means you've set your far clip plane to an extremely far distance, which causes inaccuracies.

A workaround/hack for this is to use a 32-bit depth buffer. However, the more correct way is to scale your maps down to a proper size and use better near/far clip planes.

You're right regarding the clip distance; I originally tried reducing it to 1.0f/10000.0f, to no avail. I do suspect the issue lies in this area, as 10000 still seems awfully large (especially after reading the MSDN D3D9 errata, which mentions problems with large far-to-near ratios), but it was required because anything smaller resulted in what was visually a very short draw distance (as in, right in front of the camera).
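For what it's worth, the precision problem can be sketched numerically: under a D3D-style projection the depth buffer value is hyperbolic in view-space z, so nearly all of the buffer's resolution is spent close to the near plane. A small standalone sketch (not from the posted source; the formula is the standard D3D [0,1] depth mapping):

```cpp
#include <cmath>

// Depth value a standard D3D-style projection writes to the buffer
// (range [0,1]), given near plane n, far plane f, and view-space depth z:
//   d = (f / (f - n)) * (1 - n / z)
double ndcDepth(double n, double f, double z) {
    return (f / (f - n)) * (1.0 - n / z);
}

// With n = 1 and f = 10000, two surfaces one unit apart at z ~ 5000 map
// to depth values only about 4e-8 apart -- below the resolution of even
// a 24-bit depth buffer (1 / 2^24 ~ 6e-8), so they z-fight. Raising the
// near plane to 10 widens that gap by roughly an order of magnitude.
```

This is also why pulling the near plane in (rather than the far plane out) is what hurts: precision scales with the far-to-near ratio, and the near plane dominates it.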


I've included the source here just in case: http://laserblue.org/Game_src.zip


Relevant code:


Game.cpp:579 <render loop>

Game.cpp:389 RenderScene()

Game.cpp:92   InitializeD3D()

Map.cpp:34 RenderStartup()

Map.cpp:54 RenderScene()



EDIT: Never mind, I figured it out -- it was indeed related to the far clip plane. Thanks for the help, guys.

In Topic: Queries on a synchronization model

10 July 2013 - 12:40 AM

You're right; a jitter buffer would be a good solution.


There's one optimization to this model that I'd like to implement, though I suspect it may be a problem: given that a large proportion of the input frames are likely to be empty, there would be a considerable amount of wasted bandwidth (e.g., at a rate of 25 Hz, 1 KB/s of protocol overhead (IP+TCP) alone). Would it be possible to omit this redundant traffic? Assuming there is minimal jitter, I figure the client could set a time threshold for receiving input per frame and, if it is exceeded, assume there was no input during that frame and opportunistically advance the simulation. If the assumption turns out to be wrong, however, this would require reversing the simulation, integrating the missed input, and then somehow fixing the discrepancy between the current (erroneous) state and the correct state. I haven't heard of this being done, so I'd be interested in hearing about any experiences with such a method.
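One common way to skip the empty frames without any timeout guesswork is to tag every packet with the frame it belongs to: since TCP delivers in order, a packet for frame N implicitly confirms that every unsent frame before N was empty, and a periodic heartbeat bounds how long the receiver waits during idle stretches. A minimal sketch of the receiving side (the `InputPacket`/`InputStream` names and the heartbeat scheme are my own, not from the thread):

```cpp
#include <cstdint>
#include <vector>

// Hypothetical wire format: empty input frames are never sent; an empty
// "heartbeat" packet goes out every K frames so the receiver can keep
// advancing through long idle stretches.
struct InputPacket {
    uint32_t frame;              // simulation frame this input applies to
    std::vector<uint8_t> input;  // empty for a heartbeat
};

class InputStream {
public:
    // Returns how many frames the simulation may now safely advance:
    // every frame between the last confirmed frame and pkt.frame had no
    // input (otherwise a packet for it would have arrived first, since
    // TCP preserves ordering).
    uint32_t onPacket(const InputPacket& pkt) {
        uint32_t advance = pkt.frame - lastConfirmed_;
        lastConfirmed_ = pkt.frame;
        return advance;
    }

    uint32_t lastConfirmed() const { return lastConfirmed_; }

private:
    uint32_t lastConfirmed_ = 0;
};
```

The trade-off versus the timeout scheme is latency rather than rollback: the client stalls until the next packet (input or heartbeat) instead of speculating and reversing, so no state repair is ever needed.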


I should point out that the game in question is effectively a carbon copy of Diablo II; the simulation's computational requirements are minimal, so it would be quite feasible to dump the entire game state (~250KB client-side) per frame (which is something I'm considering for implementing the reversing).