

chrisendymion

Member Since 05 Jul 2013
Offline Last Active Sep 19 2014 08:47 AM

Topics I've Started

Understanding a paper's shadow volume algorithm

17 September 2014 - 06:44 AM

Hello ;)
 
I'm having trouble understanding part of an algorithm for shadow volumes.
The paper is by Eric Bruneton (thanks to him) and can be found here:
 
Bruneton uses a texture to store deltaN, deltaZ, Zmin, Zmax.
 
You don't need to understand what deltaN and deltaZ are; my problem is with Zmin.
Zmin is the distance from the camera to the nearest shadow volume front face.
 
He writes:
 
We associate with each pixel 4 values deltaN, deltaZ, Zmin, Zmax initialized to 0, 0, INFINITY, 0. 
In a first step we decrement (resp. increment) deltaN by
1 and deltaZ by the fragment depth z, and update Zmin and Zmax
with z, for each front (resp. back) face of the shadow surface.
 
And in the rendering passes:
 
We draw the shadow volume of the terrain into a deltaN, deltaZ,
Zmin, Zmax texture. For this we use the ADD and MAX blending
functions, disable depth write, and use a geometry
shader that extrudes the silhouette edges (as seen from the
sun).
 
I understand the algorithm; everything is fine except the Zmin computation.
Why is it initialized to infinity?
The blend modes are ADD and MAX, so I suppose ADD for RGB and MAX for alpha.
With that, deltaN (red), deltaZ (green) and Zmax (alpha) are easy to compute...
But ADD for Zmin (blue)? How do I get it?
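 
To check I'm reading that correctly, here is how I currently picture the blend setup in D3D11 terms. This is only my interpretation of the quoted text, not code from the paper, and device/context are just my usual D3D11 objects:
 
    // My reading of the "ADD and MAX" blending from the paper, for a single
    // RGBA render target holding (deltaN, deltaZ, Zmin, Zmax).
    // This is my interpretation only, not code from the paper.
    D3D11_BLEND_DESC desc = {};
    desc.RenderTarget[0].BlendEnable           = TRUE;
    desc.RenderTarget[0].SrcBlend              = D3D11_BLEND_ONE;
    desc.RenderTarget[0].DestBlend             = D3D11_BLEND_ONE;
    desc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;  // deltaN, deltaZ... and Zmin?
    desc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;
    desc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_ONE;
    desc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_MAX;  // Zmax in the alpha channel
    desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;
 
    ID3D11BlendState* blendState = nullptr;
    device->CreateBlendState(&desc, &blendState);
    context->OMSetBlendState(blendState, nullptr, 0xFFFFFFFF);
 
With a single render target set up like this, the blue channel gets ADD like the rest of RGB, which is exactly why I don't see how a Zmin initialized to infinity can come out of it.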
 
I have some workarounds for this problem, but I'd like to understand how Bruneton did it.
 
I hope I was clear, and sorry for my poor English ;)
 
Thank you,
 
Chris

Compiling shaders at build time with common functions

03 September 2014 - 03:24 AM

Hello ;)

 

I'm in an optimization phase for my project.
 
My shaders are currently compiled at build time (VS 2012) into "shadername.cso" files.
 
For some complex computations (atmospheric scattering, ...), there are a lot of "common functions" used by multiple shaders.
They are duplicated in every HLSL file, which is not very convenient when changes need to be made :-(
 
Is there a simple solution for compiling the shaders with a single "common HLSL file" shared by all of them?
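 
Just to illustrate the kind of "shared file" setup I have in mind: with the d3dcompiler API at runtime, #include directives in an HLSL file are resolved by the standard include handler, so common functions can live in a single file. The file and entry-point names below are placeholders, not my real ones, and I don't know yet whether the build-time .cso step in VS 2012 handles #include the same way:
 
    // Runtime-compilation sketch of the "one common HLSL file" idea.
    // Scattering.hlsl would start with:  #include "Common.hlsli"
    // (file names and entry point are placeholders)
    #include <d3dcompiler.h>
    #pragma comment(lib, "d3dcompiler.lib")
 
    ID3DBlob* code   = nullptr;
    ID3DBlob* errors = nullptr;
    HRESULT hr = D3DCompileFromFile(
        L"Scattering.hlsl",
        nullptr,                            // no extra #defines
        D3D_COMPILE_STANDARD_FILE_INCLUDE,  // default handler, resolves #include from disk
        "PSMain",                           // entry point (placeholder)
        "ps_5_0",
        0, 0,
        &code, &errors);
 
If the build-time HLSL compilation can resolve includes the same way, that would already remove the duplication, but maybe I'm missing something.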
 
Thank you in advance, and sorry for my poor English.

 

Chris


About fixed points

21 April 2014 - 07:31 AM

Hello,
 
I'm in an optimization phase for my little procedural engine.
Until now I've been working with floats for full-planet coordinates, so there were artifacts at very high LOD caused by float precision.
After reading a lot of threads and articles on the net, I started thinking about using fixed point (fractional integers) instead of floats.
I have never worked with fixed point, so there are some (a lot of) concepts I don't understand!
 
Where do I start using it, and... when do I stop?
 
1- I want to be able to choose the planet's radius freely, let's say 5629 kilometers, 194 meters and 29 centimeters.
Using a 16-bit whole part and a 16-bit fractional part, the fixed-point radius will be 368914876 (5629.19429 * 65536), right?
But in the code it starts out as a float, 5629.19429 (much more readable), so I have to convert it before sending it to the shader? (See the little sketch after this list.)
 
2- My planet uses quadtree-based LOD (a cube with 6 grids). Do the grids have integer coordinates? How do I scale them to the radius?
I imagine I have to choose a power-of-2 radius for the quadtree divisions?
 
3- If the coordinate point.x is 368914876, that's a big number in the shader (with the near/far frustum, Z buffer, etc.), so what should I do? Convert it back to float?
I have to scale it somewhere...? But if 1 equals 1 centimeter, it will be very, very big?!
 
4- In the shaders, what happens when I use the world, view and projection matrices? Is there a point where fixed point stops working?
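 
To make question 1 more concrete, here is how I currently picture a 16.16 fixed-point value on the CPU side. It's just a sketch of the conversion, not something I'm sure is the right way to use it in the engine:
 
    #include <cstdint>
    #include <cstdio>
 
    // 16.16 fixed point: 16-bit whole part, 16-bit fractional part.
    typedef int32_t fixed32;
    static const int    FRACTION_BITS = 16;
    static const double FIXED_ONE     = (double)(1 << FRACTION_BITS); // 65536
 
    fixed32 toFixed(double v)   { return (fixed32)(v * FIXED_ONE); }
    double  toDouble(fixed32 v) { return (double)v / FIXED_ONE; }
 
    int main()
    {
        // Planet radius from question 1, expressed in kilometers.
        double  radiusKm    = 5629.19429;
        fixed32 radiusFixed = toFixed(radiusKm);  // 368914876, as above
        printf("%d -> %.5f km\n", radiusFixed, toDouble(radiusFixed));
        // The smallest representable step is 1/65536 km, about 1.5 cm,
        // so the "29 centimeters" is only stored to within that precision.
        return 0;
    }
 
Questions 3 and 4 are basically about where in the pipeline I'm supposed to go from radiusFixed back to a float.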
 
There must be something I haven't seen/understood...
I hope this was readable, because it's very confusing to me and my poor English doesn't help ;)
 
Thank you,
 
Chris

Help with gameplay ideas for a little space demo game

06 February 2014 - 02:58 AM

Hello,
 
I'm an indie dev, working on this after work. I've always wanted to make my own graphics engine (DirectX) for procedural content.
My final goal is a full 4X game with a complete storyline, split into episodes (each introducing more content).
My engine is now in alpha, and I'm thinking about creating a little game for demo purposes.
 
Features that will be available:
- Procedural planet generation (not complete, but advanced)
- Volumetric lights, clouds and dynamic weather
- A physics engine for flying
- Except for the precomputed atmospheric scattering, all parameters can be changed in real time
 
For the demo, I hope to make the physics simulation as realistic as possible: a little spaceship comes in from space at great speed and inserts into orbit to slow down.
Then it moves into a low orbit, descends, and touches down at a station.
I don't want an arcade-style game, but something much closer to a simulation, where errors are not forgiven.
Managing flight parameters, coordinates, gravity, system power for shields, engines, etc., with a view from the cockpit...
Touching down without damage should be very difficult. (Yes, I'm a Star Trek fan.)
 
I have the main idea in my head. But... it's not very fun. How can I add some "attractive" elements? Missions to complete, time scores, rising levels...?
What do you think?
 
This will just be a first little indie game for demo purposes (and free!), not a big AAA title ;)
And I'm making it alone...
 
Thank you
 
Chris
 
PS: Sorry for my poor English.
 
Some screenshots from the current dev version (so much work to do... and bugs to fix)
 
dreamgate.jpg

Texture2D from volumetric effects, missing depth

03 January 2014 - 03:18 AM

Hello ;)
 
For my little graphics engine, I'm trying to add some volumetric effects (like clouds).
To do that, I fill the whole viewing frustum with a 3D texture (128 slices at low resolution).
Noise is applied to every slice, and a final pass composites all the slices into one 2D texture.
The final texture is drawn on top of the scene (full-screen quad) with alpha blending.
 
The volumetric clouds are working very well! I'm happy with that ;)
 
But now I have another problem. Since the final 2D texture has no depth information, my clouds overlap the world.
I don't know quite how to put it (and my English is bad): imagine far-away clouds, but nearby mountains appear behind the clouds (it should be the other way around).
Some clouds need to be between the mountains and the camera, and others behind the mountains, depending on their distance.
I tried using a depth map of the world (from the camera) when the noise is applied, but I can't get it to work.
 
Any idea how to achieve this, or is the depth map my only solution?
 
I'm working with DirectX 11 (C++).
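 
For reference, the depth-map plumbing I tried looks roughly like this on the C++ side (simplified; device, context, width and height come from my engine). The depth buffer is created with a typeless format so it can be used as a depth-stencil view during the scene pass and then sampled as a shader resource in the cloud/composite pass:
 
    // Depth buffer that can also be read as a texture in the composite pass.
    D3D11_TEXTURE2D_DESC texDesc = {};
    texDesc.Width            = width;
    texDesc.Height           = height;
    texDesc.MipLevels        = 1;
    texDesc.ArraySize        = 1;
    texDesc.Format           = DXGI_FORMAT_R24G8_TYPELESS;
    texDesc.SampleDesc.Count = 1;
    texDesc.Usage            = D3D11_USAGE_DEFAULT;
    texDesc.BindFlags        = D3D11_BIND_DEPTH_STENCIL | D3D11_BIND_SHADER_RESOURCE;
 
    ID3D11Texture2D* depthTex = nullptr;
    device->CreateTexture2D(&texDesc, nullptr, &depthTex);
 
    D3D11_DEPTH_STENCIL_VIEW_DESC dsvDesc = {};
    dsvDesc.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsvDesc.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2D;
    ID3D11DepthStencilView* dsv = nullptr;
    device->CreateDepthStencilView(depthTex, &dsvDesc, &dsv);
 
    D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format              = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvDesc.ViewDimension       = D3D11_SRV_DIMENSION_TEXTURE2D;
    srvDesc.Texture2D.MipLevels = 1;
    ID3D11ShaderResourceView* depthSRV = nullptr;
    device->CreateShaderResourceView(depthTex, &srvDesc, &depthSRV);
 
    // Bound in the cloud/composite pass so the shader can compare each
    // cloud sample's distance against the scene depth at the same pixel.
    context->PSSetShaderResources(0, 1, &depthSRV);
 
The idea is that the composite shader compares the cloud distance against the scene depth at each pixel, but as I said, I can't get that part to work yet.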
 
Thank you, and happy new year!
 
Chris
 
