

Member Since 05 Jul 2013
Offline Last Active Jul 25 2014 12:13 AM

Topics I've Started

About fixed points

21 April 2014 - 07:31 AM

I'm in an optimization step for my little procedural engine.
Until now, I've been using floats for full-planet coordinates, so there were artifacts at very high LOD caused by float precision.
After reading a lot of threads and articles on the net, I started thinking about using fixed point (fractional integers) instead of floats.
I've never worked with fixed point, so there are some (a lot of) concepts I don't understand!
Where do I start using them, and... where do I stop?
1- I want to choose the planet's radius freely, say 5629 kilometers, 194 meters, 29 centimeters.
Using a 16-bit whole part and a 16-bit fractional part, the fixed-point radius will be 368914876 (5629.19429 * 65536), right?
But in the code it starts out as a float (5629.19429 is much more readable), so I must convert it before sending it to the shader?
2- My planet uses quadtree-based LOD (a cube with 6 grids). Do the grids have integer coordinates? How do I scale them to the radius?
I imagine I must choose a power-of-2 radius for the quadtree subdivisions?
3- If point.x is 368914876, that's a big number in the shader (with the far/near frustum, Z-buffer, etc.), so what should I do? Convert it back to float?
I must scale it somewhere...? But if 1 unit equals 1 centimeter... everything becomes very, very big?!
4- In the shaders, what happens when I apply the world, view, and projection matrices? Is there a point where fixed point stops working?
There must be something I didn't see/understand...
I hope this was readable, because it's very confusing to me, and my poor English doesn't help ;)
Thank you,

Help with gameplay ideas for a little space demo game

06 February 2014 - 02:58 AM

I'm an indie dev, working after work. I've always wanted to make my own graphics engine (DirectX) for procedural content.
My final goal is a full 4X game with a complete storyline, split into episodes (each introducing more content).
Now my engine is in alpha, and I'm thinking about creating a little game for demo purposes.
Features that will be available:
- Procedural planet generation (not complete, but advanced)
- Volumetric lights, clouds, and dynamic weather
- A physics engine for flying
- Except for the precomputed atmospheric scattering, all parameters can be changed in real time
For the demo, I'd like to make the physics simulation as realistic as possible: a little spaceship comes in from space at great speed and inserts into orbit to slow down.
Then it moves to low orbit, descends, and touches down at a station.
For that, I don't want an arcade-style game, but much more of a simulation where errors are not forgiven.
Managing flight parameters, coordinates, gravity, power allocation for shields, engines, etc... View from the cockpit...
Touching down without damage should be very difficult. (Yes, I'm a Star Trek fan.)
I have the main idea in my head. But... it's not very fun... How do I add some "attractive" stuff? Missions to complete, time scores, rising levels...?
What do you think?
It will just be a first little indie game for demo purposes (and free!)... not a big AAA ;)
And I'm making it alone....
Thank you
PS: Sorry for my poor English...
Some screenshots from the current dev version (so much work to do... and bugs to fix)

Texture2D from volumetric effects, missing depth

03 January 2014 - 03:18 AM

Hello ;)
For my little graphics engine, I'm trying to add some volumetric effects (like clouds).
For that, I fill the whole view frustum with a 3D texture (128 slices at low resolution).
Noise is applied to every slice, and a final pass composites all the slices into one 2D texture.
The final texture is applied on top of the scene (full-screen quad) with alpha blending.
Volumetric clouds are working very well! I'm happy about that ;)
But now I have another problem. Since the final 2D texture has no depth information, my clouds overlap the world.
I don't know how to put it (and my English is bad): imagine far-away clouds, but nearby mountains appear behind the clouds (it should be the other way around).
Some clouds need to be between the mountains and the camera, and others behind the mountains, depending on their distance.
I tested with a depth map of the world (rendered from the camera), used when the noise is applied, but I can't get it to work...
Any idea how to achieve this, or is the depth map my only solution?
I'm working with DirectX 11 (C++).
Thank you, and a happy new year!

[SOLVED] Tessellation on IcoSphere

19 September 2013 - 02:18 AM

Hello ;) First, sorry for my poor English...
I'm working on a procedural engine, with some results generating a full planet.
But I have a "crack" problem with the tessellation (LOD based on frustum and distance).
The icosphere is generated in code with X subdivisions. Let's take 0 subdivisions for the theory...
I'm using this algorithm for adaptive tessellation:
Each vertex's distance is computed, and the minimum is used to define the tessellation factor on the edge.
So, in theory, the same edge always gets the same factor, and as a result, no cracks!
But there is a problem... Which vertex for which edge?
I need to assign the tessellation factor to each of the 3 edges in a fixed order.
Edge 1 of triangle A must be the same edge shared by triangle B, in the same order (edge 1).
(In the example here, the vertex variables are the tessellation factors computed from distance to the camera.)
HULL SHADER (patch function)
_output.Edges[0] = min(_vertex2, _vertex3);
_output.Edges[1] = min(_vertex1, _vertex3);
_output.Edges[2] = min(_vertex1, _vertex2);
So, if the edge shared by A and B is Edges[1], A.vertex1 must be the same vertex as B.vertex3...
If not, there will be different tessellation factors, and cracks...
With a flat grid, this works very well (every shared edge is in the same order):
But for an icosphere... it's not possible (exploded view from above):
The order is wrong......
I'm using a 3-control-point patch list; I tried a 6-control-point patch list (with neighbors), but it's the same problem...
I'm not sure I've explained the problem correctly.
My general question is: how do I tessellate an icosphere without cracks?
Is there a better algorithm?
I'm far from an expert in graphics...
Thank you very much for any help ;)
Here are some screenshots of the current rendering results:

DirectX 11, using Tessellation & Geometry shader in a single pass

05 July 2013 - 12:34 AM

Hello ;)

First of all, sorry for my poor English!

With DirectX 11, I'm trying to create a random map entirely on the GPU.

In the hull shader stage, I manage LOD with tessellation.
In the domain shader stage, I generate the map (based on Perlin noise).

Now my goal is to compute normals in the geometry shader (per-vertex normals). For that, I must use vertex adjacency, which the geometry shader supports.

But here is the problem... For tessellation, my primitive topology must be D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST.
But for a geometry shader with 6 vertices (triangle primitive with adjacency), I must use D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST_ADJ.

I think I'm missing something... It must be possible to tessellate and use the results in the geometry shader...
It does work with 3 points, but I cannot use the other 3 (they are 0.0, 0.0, 0.0)....


Thank you in advance for any help ;)