About BEM

  1. I am having problems creating a DX9 D3DDEVTYPE_NULLREF device on Vista when not logged in. The same code works on XP. With the DirectX debug DLLs it fails with a message saying hardware processing is not available for that adapter. I'm using the default adapter. (Sorry, I don't have the exact message or my present settings; they are at work.) Any settings that would let me create a device would do! On a related topic: why does creating a D3DDEVTYPE_NULLREF device balk if I try to use multisampling? Why does a null device care?
  2. I'm looking into loading higher-resolution versions of textures in-game, depending on where you are. I am afraid of the renderer chugging, of course. In an ideal world I would just set mipmap levels and the renderer would always give me whatever it could at that instant, without ever stalling. Does anyone have any general advice? Also, how are PreLoad() and SetLOD() meant to be used? E.g. is PreLoad() something you call some seconds before you need a resource, or directly before a frame for all relevant resources? If you give it enough time, can it load in the background?
  3. By the way, I was going to suggest a mipmap trick I thought up, but apparently it has already been done: http://www.gamedev.net/community/forums/topic.asp?topic_id=365001 Suppose your transparent objects are 2D planes, e.g. particles. I figure what you do is expand your billboard but shrink your UVs so the particle looks the same size on screen, then look up a mipmap level to create the correct blur radius. I guess you set your sampler to use a 0,0,0,0 border colour. Haven't tried it, though.
  4. What worries me most about DOF is the hard boundaries I expect when you blend transparent effects in. If your scene is full of volumetric effects, they could be everywhere. One way I think you could solve this:
     (a) Generate your DOF'ed opaques first, without considering transparency.
     (b) Also generate a 'DOF'ed depth map': your opaques' depth values, blurred by the same function you used in (a). You also have to store another value representing the fuzziness of your DOF'ed depth values.
     (c) Then render your non-opaques: use soft-particle techniques, but compare against the DOF'ed depth map and factor in your fuzziness value. (You can't even clip to the depth buffer!)
     (Variance shadow maps might be a clue to how to generate and use this 'fuzzy depth'; maybe there is something simpler, though.) It just seems like too much complexity for a problem no one seems to have made a big deal over. :)
  5. Hi dmonte. Yeah, you can output a depth value to a texture even if you are not writing to a depth buffer, but that does not solve the problem I was describing. Also, in the solution you describe, if you have DOF for your window you will not have DOF for the scene you are watching through the window. This would be more obvious if your window were not darkly tinted. I'm not so worried about that one, though, because I can see possible solutions.
  6. Quote: Original post by dmonte: ".. Particles are a whole new story; you'll have to render them after all objects have been rendered, using already-discussed techniques you might find online. I'd suggest soft particles. I have found no compromises for DOF, refraction and particles. However, I've had to compromise on other techniques, but not the ones you have mentioned. .. You don't need to compromise on any of the mentioned effects."
     Let me clarify: I am talking about compromises when combining these techniques. Take soft particles with DOF. What depth do you compare against for the soft particle? You should really compare against some sort of fuzzy depth value (variance, or PCF?) or you get incorrect hard clipping even with soft particles. This hard edge happens when a soft particle pokes out from behind an opaque object. Without DOF, that is correct behaviour. The problem is that with DOF you still get this hard edge even though it is no longer correct: the blurred edge of an opaque object should fuzzily overlap anything (such as a particle or other transparent object) in the background.
  7. That single-glass-layer case with DOF is harder than I first thought. If the glass is at a blurred depth but does not contain any ripples, then it should not add any blur to the refracted component. The only way I can think to do it correctly would be to render the background, the window, and the window's x-y offsets to three separate textures and apply DOF to them separately. This is HIDEOUS!
     - Render the opaque background and its depth to textures.
     - Render the window colour to a new texture, clipped by the z-values of the opaque background. (Omit this texture if the window has no colour.)
     - Render the window again to another texture, also clipped, this time writing out x-y refraction offsets. Store x^2 and y^2 in z and w so we can do variance-type maths later.
     - Render the window depth to a texture.
     - Apply DOF to the background, the window colour, and the window offsets separately.
     - Compose these three to a new texture, using the DOF'ed offsets and their variances to sample the DOF'ed background, and overlaying the DOF'ed window colour.
     ...I don't think it's worth it. :)
  8. There are lots of cool effects out there, and individually they are pretty straightforward. What usually isn't explained is how you fit two or more of them together. In reality you probably can't, so the question becomes: what can you get away with? For example. Depth of field: pretty straightforward if you are only considering opaque objects, but how do you handle a transparent window in a DOF-blurred building? Refraction and heat effects: pretty straightforward until they overlap, or are mixed with particles. With a single plane of water you could sort particles into in front and behind; multiple heat-haze effects could be accumulated if there are no particles between them. Particles and transparency: pretty straightforward, apart from the above. I have been banging my head against this for a while. Is there any general way of fitting these together, or otherwise, what constraints do you put up with and how do they not become a problem?