

Refractive water


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

3 replies to this topic

#1 jajcek   Members   -  Reputation: 274


Posted 29 May 2014 - 01:52 AM

Hey,

 

I want to optimize my water rendering based on this article: http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter19.html

 

However, I don't understand what the last step should look like when rendering the final scene. Three approaches came to mind, but I don't know which one is the correct way to go. Could you give me some suggestions, please?

 

I

1. Render everything except the water to the S texture

2. Render the water into the same S texture, using (for refractions) the very texture being rendered to (is this even possible?)

3. Render that texture onto a full-screen plane orthogonal to the camera

 

II

1. Render everything except the water to the S texture; the alpha channel marks which parts of the water are visible

2. Render the texture onto a full-screen plane orthogonal to the camera

3. Render the water to the main back buffer (in front of the plane), using the alpha-channel mask to clip what is not visible

 

III

1. Render everything (without the water) to the S texture and the back buffer at once using MRT

2. Render the water directly to the back buffer, using the texture for the refractions

 

Thanks!




#2 BlackBrain   Members   -  Reputation: 359


Posted 29 May 2014 - 02:49 AM

First of all, render all non-refractive meshes to a color buffer. As the article says, you need another buffer to determine whether the position you are sampling from the color buffer is in front of your refractive mesh or not.

So when rendering the non-refractive meshes, you may want to use MRT and write them into the black-and-white buffer at the same time.

Then render your refractive meshes into the black-and-white buffer; the refractive meshes need to be black.

Then you would need to render to the color buffer and read from it at the same time. That's not possible, but you can render the contents of the color buffer into another buffer (let's name it color2).

So now render your refractive meshes onto the color buffer, using color2 and the black-and-white buffer to fulfill your needs.

As for determining whether the position you are sampling from the color buffer is in front of your refractive mesh or not: you can also use a depth buffer if you have one.

I think you can also directly copy the contents of the color buffer to color2 using the copy functionality in DirectX (e.g. CopyResource in D3D11) instead of drawing it.


Edited by BlackBrain, 29 May 2014 - 02:54 AM.


#3 jajcek   Members   -  Reputation: 274


Posted 29 May 2014 - 12:17 PM

I think I will go with the MRT solution, because I will probably need it anyway for the soft-edges feature, so it will be good to study this topic a bit. A possibly silly question before I start, though: does MRT work with the back buffer as well? That is, can you render simultaneously to the back buffer and some other render target, or only to render targets where none of them is the main back buffer?



#4 Burnt_Fyr   Members   -  Reputation: 1247


Posted 31 May 2014 - 10:20 AM

Your hardware dictates the number of simultaneous render targets available. You can bind your swap chain's back buffer as a render target alongside others, or you may wish to bind the back buffer in a separate pass, as you would in deferred rendering.





