Sharky

Member

  • Content count: 215
  • Joined
  • Last visited

Community Reputation: 197 Neutral

About Sharky

  • Rank: Member
  1. Thanks for your reply, Jason - much appreciated. :)
  2. Hi gamedev community,   I have been asked to implement a method by which a single depth value can be sampled from the depth information of a g-buffer (deferred renderer using a 32-bit floating point texture storing viewspace depth) and then this floating point value returned to system memory for storage and use in calculations on the CPU.   For example, I would like to query the centre of the screen (effectively obtaining a depth value from the centre of the depth buffer) and then pass this back to the CPU / system memory. Sadly the depth buffer is the only source of depth information I have available to query for this particular application hence the need to pull values from it rather than elsewhere.   I am limited to using DirectX9 feature set, HLSL and Shader Model 3.0.   I am looking for suggestions as to how I might achieve this. I would suspect that I have to ultimately make use of the GetRenderTargetData function at some point and suffer the inherent GPU stalls that this results in. Any other ideas or things I may have missed?   Thanks in advance for your time.   p.s. I guess I'm basically looking for a view ray->scene intersection test that returns the first hit depth value whereby the only depth information I have available is that in the g-buffer. I hope that makes sense...
  3. As already mentioned, this looks extremely interesting... if you could provide some code/implementation sample material then please do.
  4. Thanks for the reply, Adam_42. Whilst I could not use the code found at the link provided (for various reasons) it made for an interesting read and provided useful information. Fortunately I have found a solution to the problem. The views (into which my MDI application renders) use DirectX and various render targets. Hence I simply grab the contents of the render target and save them to a file or, alternatively, grab the bits and convert into a bitmap format suitable for copying to the clipboard.
  5. Hi all,

     I am currently working on an MFC (MDI) application and need to get a simple screen grab working (i.e. given a view, I need to grab the client area and save the contents to the clipboard and/or a file). This appeared quite simple and, yes, I have a basic version running whereby I grab the device context of the view I wish to grab, blit the contents to a bitmap, and then export to a file (or to the clipboard). This works fine on Win2k, WinXP and Vista (on both 32- and 64-bit platforms).

     However, if Vista is running the Aero (glassy) UI then I end up with just a blank image (whether grabbing to the clipboard or a file). Does anybody know how to programmatically grab the client area of a view/window successfully when running Vista Aero? Any links, ideas or suggestions would be most welcome.

     Kind regards,
     Sharky
  6. Sharky

    2D Tile Based Map Questions

    Take a look into "paging systems" and "streaming content" and go from there. This will give you some ideas and provide you with some useful concepts / practices for handling large data sets (such as tile maps). If you can get your hands on a copy of Game Programming Gems 5 take a look at the "Generic Pager" article by Cruz. Very simple to understand, easy to implement and easily adapted to suit your needs. If nothing else it might give you a starting point. (It's in C++ by the way).
  7. How are you rendering your 2D elements? Have you ensured you've cleared/set the relevant matrices? Or set up the correct projection matrix (such as an orthographic projection)?
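For a 2D pass, the projection setup might look like the following sketch of a left-handed orthographic matrix (row-major, z mapped to [0,1], matching the element layout D3DXMatrixOrthoLH produces); OrthoLH is a hypothetical stand-in for the D3DX call:

```cpp
#include <cassert>

// Hypothetical helper building a row-major, left-handed orthographic
// projection (the layout D3DXMatrixOrthoLH fills in): x/y are scaled to
// clip space by the view width/height, z is remapped from [zn, zf] to [0, 1].
void OrthoLH(float m[16], float w, float h, float zn, float zf)
{
    for (int i = 0; i < 16; ++i) m[i] = 0.0f;
    m[0]  = 2.0f / w;          // _11
    m[5]  = 2.0f / h;          // _22
    m[10] = 1.0f / (zf - zn);  // _33
    m[14] = -zn / (zf - zn);   // _43
    m[15] = 1.0f;              // _44
}
```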
  8. Rather than setting a timestamp for your quads/entities, how about doing something simpler? For example: calculate the time taken for the last frame (using a timer/timer class similar to the one outlined above) and then simply use this value to update your quads, e.g. newXPos = oldXPos + (velocityX * frameTime);
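The update rule above can be sketched as follows (the Quad struct and its fields are illustrative, not from the original posts):

```cpp
#include <cassert>

// Hypothetical quad entity moved by frame time rather than timestamps:
// each frame, advance the position by velocity scaled by the seconds the
// previous frame took, so movement speed is frame-rate independent.
struct Quad
{
    float x, y;   // position in world units
    float vx, vy; // velocity in units per second

    void Update(float frameTime) // frameTime in seconds
    {
        x += vx * frameTime;
        y += vy * frameTime;
    }
};
```

The same quad then covers the same distance per second whether the game runs at 30 fps or 300 fps.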
  9. Sure is a strange one, Dookie! However it is good news that it works flawlessly on other configurations so not all is lost, huh? ;) Have fun with your quad engine and be sure to post the 'solution' should this strange issue ever get fixed!
  10. Sharky

    Fade in/out

    Pixel shaders are great for this sort of thing, too...
  11. Don't be so quick to blame nVidia driver issues... yes, there have been problems with them in the past but - generally - they are pretty solid and stable. Are you sure you have exhausted all other possibilities or reasons for the strange behaviour? Are you running overclocked hardware? Is your power supply sufficient for your PC? These may sound like weird questions but, believe me, they can quite often cause strange rendering artifacts and behaviour in the graphics hardware! :) Is it possible you could post your code somewhere for people to take a look at (if you are happy to do so)? There may be some issue elsewhere that another reader may quickly pick up on. Hopefully you will soon be able to get the results you are after!
  12. IFooBar's method is the ideal way to go... simply render your scene to a texture, have a pixelshader carry out the greyscale conversion, and then render the result to the screen as a fullscreen quad (using the greyscale'd texture). Take a search on the 'Net for post-process pixel shader effects, and similar. You will find a lot of information about various techniques for colour conversions, greyscaling, etc...
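The per-pixel conversion such a shader performs is just a weighted sum of the colour channels; here it is in plain C++ using the common Rec. 601 luma weights (ToGreyscale is a hypothetical helper, and the real effect would of course live in the pixel shader as a dot product):

```cpp
#include <cassert>

// Hypothetical helper mirroring the greyscale pixel shader: weight the
// sampled RGB channels by the Rec. 601 luma coefficients and return a
// single luminance value to write to all three output channels.
float ToGreyscale(float r, float g, float b)
{
    return 0.299f * r + 0.587f * g + 0.114f * b;
}
```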
  13. It may be possible to draw your terrain in one DIP (DrawIndexedPrimitive) call, although it might not be easy, and it may not be the most flexible approach should you want to take your rendering engine further. You may also reach a point where your geometry consists of too many vertices to send to the graphics hardware in a single call. Looking at the rows of triangle strips in the OP, it is possible to insert degenerate triangles into your list such that all the rows can be rendered in a single call. There are, I believe, a few articles on the 'Net covering this - although I cannot think of any off the top of my head.
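A sketch of the stitching idea, assuming a regular width x height vertex grid laid out row by row (BuildGridStripIndices is a hypothetical helper): repeating the last index of one row and the first index of the next produces zero-area triangles the GPU rejects, joining all the row strips into one.

```cpp
#include <cassert>
#include <vector>

// Hypothetical helper: build one triangle strip covering a regular grid of
// width x height vertices (row-major layout). Between rows, two extra
// indices are inserted, creating degenerate (zero-area) triangles so the
// whole terrain can be drawn with a single DrawIndexedPrimitive call.
std::vector<unsigned short> BuildGridStripIndices(int width, int height)
{
    std::vector<unsigned short> indices;
    for (int row = 0; row < height - 1; ++row)
    {
        for (int col = 0; col < width; ++col)
        {
            indices.push_back(static_cast<unsigned short>((row + 1) * width + col));
            indices.push_back(static_cast<unsigned short>(row * width + col));
        }
        if (row < height - 2) // stitch to the next row with two degenerate indices
        {
            indices.push_back(static_cast<unsigned short>(row * width + (width - 1)));
            indices.push_back(static_cast<unsigned short>((row + 2) * width));
        }
    }
    return indices;
}
```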
  14. Sharky

    cannot install directX 9 sdk

    I have had a similar problem before whereby I was unable to uninstall an old version of the DirectX9 SDK and, as a result, I was also unable to install a newer version of the SDK. Unfortunately I found no easy solution to this problem and had to resort to manually working through the registry and removing keys and entries relating to the SDK and the SDK installation. This was a bit of a laborious task and, to be honest, it is probably just as easy to back up your system and start afresh. Of course there may be someone else on here who has a more plausible solution to this problem, but I am yet to come across one! Good luck! Sharky
  15. Sharky

    Performance Problems...

    Quote: Original post by jollyjeffers
    Quote: On a project I am currently working on I found (under certain circumstances) that the process of calculating which 'nodes' were visible per frame actually took longer than to simply send all the geometry to the card
    I found this hit me really hard when I wrote a quadtree renderer for my terrain. Spent a long time with a pen and a piece of paper (!!) and streamlined the data flow. You can, granted very domain specific, move a lot of culling code to a pre-process stage (at a modest memory consumption increase). I found that I could pre-calculate a few aspects of my culler when the level loaded (partitioning the space not on geometry location, but on camera orientation*) and then a whole load more that is only necessary when the camera moves, and then only a trivial amount when each frame is rendered. Took a lot of effort, but I ended up with a system that ran at around 300-400fps when constantly moving, and over 1000fps when the camera was static. I would give some serious consideration to profiling your non-3D code, as in the data structures and algorithms you employ to decide WHAT goes to the GPU. hth Jack

    Yeah - I agree that a well thought out and designed system can result in some pretty fantastic performance gains (if done properly). The project to which I referred is not using octrees per se but a custom method for rendering zones and suchlike. I can't give too much away now as I believe there may be a possibility of writing this up for Gamasutra along with a couple of colleagues... although it won't be anytime soon!

    Regards,
    SharkyUK