

Member Since 07 Nov 2012
Offline Last Active Dec 03 2014 04:37 PM

Topics I've Started

Use 64-bit precision (GPU)

16 April 2014 - 12:35 PM



I've been working on a large terrain, and I've now reached the point where I'm running out of float precision.

I'm working with DX11 / Shader Model 5, which provides doubles on the GPU.

My question is: how can I use doubles (for the matrix and the input position) instead of floats?

I've read something about rendering relative to the eye, but the simpler solution would seem to be just using doubles. How?
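As a side note on the scale of the problem, here is a quick sketch (in Python/NumPy, standing in for the GPU math) of how coarse float precision gets at planetary distances:

```python
import numpy as np

# Earth radius in metres, stored as a 32-bit float (the default GPU precision)
earth_radius = np.float32(6_371_000.0)

# np.spacing gives the gap to the next representable value: at ~6.37e6 m the
# grid of representable float32 positions is 0.5 m apart, which is what shows
# up visually as vertex "jittering".
print(np.spacing(earth_radius))              # 0.5
# In float64 (double) the spacing is sub-nanometre, so doubles would be fine.
print(np.spacing(np.float64(6_371_000.0)))
```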






Normalize Vertices in VertexShader

12 April 2014 - 04:21 AM




With my old vertex shader I normalize and scale each vertex like this:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float3 worldPosition = mul(input.Position, World).xyz;

    // Project the cube vertex onto the sphere and scale to sea level
    float3 normalized = normalize(worldPosition);
    float4 scaled = float4(normalized * SeaLevel, 1);

    output.Position = mul(mul(scaled, View), Projection);
    output.UV = input.UV;

    return output;
}

Now, to multiply the World, View and Projection matrices on the CPU instead, I have a new vertex shader:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    output.Position = mul(input.Position, worldViewProj);

    return output;
}

My question is: how can I normalize each vertex in the new shader?



This is for my planet: I have a cube, and the center of the cube is at the origin (0,0,0).
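One way to see why the normalize step can't simply be baked into the combined worldViewProj matrix: normalization is nonlinear, so it doesn't commute with matrix multiplication. A quick numeric sketch (Python/NumPy standing in for the HLSL, with a toy transform):

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

# A toy "world" transform: a non-uniform scale
world = np.diag([2.0, 1.0, 1.0])
v = np.array([1.0, 1.0, 0.0])

# Normalizing AFTER the world transform (what the old shader does)...
after = normalize(world @ v)
# ...is NOT the same as transforming an already-normalized vertex,
# so the normalize step can't be folded into one premultiplied matrix.
before = world @ normalize(v)
assert not np.allclose(after, before)
```

This suggests the world transform (or at least the normalize-and-scale step) has to stay separate from the combined view-projection matrix, or be done on the CPU before filling the vertex buffer.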



Rendering Relative to the Eye Using the GPU

09 April 2014 - 04:58 PM



I have some float precision errors with my procedural planet (Earth-sized).



So I bought this book:





The book describes a few techniques to eliminate the "jittering".

I chose "Rendering Relative to Eye Using the GPU" (Improving Precision with DSFUN90).


Here is the C#/OpenGL code from the book (it's open source):



So I tried to do this with my planet, but I get no result.
I have a chunked-LOD quadtree with one vertex buffer (i.e. each patch shares the same vertex/index buffer); only the world matrix is different for each patch.
Here is some code:
private static void DoubleToTwoFloats(double value, out float high, out float low)
{
    high = (float)value;
    low = (float)(value - high);
}

private static void Vector3DToTwoVector3F(Vector3Double value, out Vector3 high, out Vector3 low)
{
    float highX, highY, highZ;
    float lowX, lowY, lowZ;

    DoubleToTwoFloats(value.X, out highX, out lowX);
    DoubleToTwoFloats(value.Y, out highY, out lowY);
    DoubleToTwoFloats(value.Z, out highZ, out lowZ);

    high = new Vector3(highX, highY, highZ);
    low = new Vector3(lowX, lowY, lowZ);
}
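To sanity-check the split itself, here is a Python/NumPy emulation of DoubleToTwoFloats showing that the high/low pair really does preserve the precision a single float32 loses:

```python
import numpy as np

def double_to_two_floats(value):
    # Same idea as the C# DoubleToTwoFloats above: the high part is the
    # nearest float32, the low part is the float32 rounding error.
    high = np.float32(value)
    low = np.float32(value - float(high))
    return high, low

pos = 6_371_000.123456789  # a vertex coordinate at Earth scale
high, low = double_to_two_floats(pos)

# A single float32 loses the fractional part almost entirely...
err_single = abs(float(np.float32(pos)) - pos)
# ...but high + low (summed in double precision) recovers it.
err_pair = abs(float(high) + float(low) - pos)
assert err_pair < err_single
```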

MatrixDouble m = world * cam.View;
MatrixDouble mv = new MatrixDouble(
    m.M11, m.M12, m.M13, 0,
    m.M21, m.M22, m.M23, 0,
    m.M31, m.M32, m.M33, 0,
    m.M41, m.M42, m.M43, m.M44);

// RTC GPU
Vector3 eyeHigh;
Vector3 eyeLow;

Vector3 posHigh;
Vector3 posLow;

Vector3DToTwoVector3F(cam.Position, out eyeHigh, out eyeLow);
Vector3DToTwoVector3F(position, out posHigh, out posLow);

effect.Parameters["u_modelViewPerspectiveMatrixRelativeToEye"].SetValue(Manager.ConvertToFloatMatrix(cam.Projection * mv));

The vertex shader looks like this:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    /*float3 worldPosition = mul(input.Position, World).xyz;
    float3 normalized = normalize(worldPosition);
    float4 scaled = float4(normalized * SeaLevel, 1);*/

    float3 t1 = positionLow - u_cameraEyeLow;
    float3 e = t1 - positionLow;
    float3 t2 = ((-u_cameraEyeLow - e) + (positionLow - (t1 - e))) + positionHigh - u_cameraEyeHigh;
    float3 highDifference = t1 + t2;
    float3 lowDifference = t2 - (highDifference - t1);

    output.Position = mul(u_modelViewPerspectiveMatrixRelativeToEye, float4(highDifference + lowDifference, 1.0));
    //output.Position = mul(mul(scaled, View), Projection);

    return output;
}

The commented-out lines in the shader are the code I used before (which had the jittering).
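To illustrate what the shader's high/low subtraction buys you, here is a simplified scalar sketch in Python (the real shader uses the full compensated two-sum shown above; this keeps only the core idea):

```python
import numpy as np

def split(value):
    # Emulates DoubleToTwoFloats: high is the nearest float32,
    # low is the (float32) rounding error.
    high = np.float32(value)
    low = np.float32(value - float(high))
    return high, low

eye = 6_371_000.0     # camera position (exactly representable in float32)
pos = 6_371_000.125   # vertex 12.5 cm in front of the camera

# Naive path: both values are truncated to float32 BEFORE subtracting,
# so the 0.125 m offset vanishes -> the vertex snaps onto the eye (jitter).
naive = float(np.float32(pos)) - float(np.float32(eye))

# Split path: the high parts cancel exactly and the low parts carry the
# fine detail, so the offset survives in pure float32 arithmetic.
ph, pl = split(pos)
eh, el = split(eye)
precise = (float(ph) - float(eh)) + (float(pl) - float(el))

print(naive)    # 0.0
print(precise)  # 0.125
```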




Is there somebody out there who can help me?



Best regards, Alex

Relative to the Camera

27 March 2014 - 12:31 PM



I'm working on a procedural planet, but (like everybody else) I have floating-point precision problems.



So i read this great Article (http://britonia.wordpress.com/2009/05/23/scale-and-32-bit-imprecision/).


He renders everything relative to the camera (that means the camera position is always at Vector3.Zero).


The first complication is that I generate the sphere with the cube-to-sphere mapping function (http://mathproofs.blogspot.com/2005/07/mapping-cube-to-sphere.html), so the sphere's center must always stay at the world origin (0,0,0).


Britonia says I should create a new world matrix from position - cameraPosition, but then the Mathproofs function no longer works.


So I build a second "world matrix":

worldNew = Matrix.CreateTranslation(position - cam.Position) * Matrix.CreateFromYawPitchRoll(cam.deltaX * 0.0005f, cam.deltaY * 0.0005f, 0);

Then I multiply this new matrix with the view matrix in the shader, like this:

float4x4 xx = mul(View, worldNew);

and now the output position is:

output.Position = mul(mul(scaled, xx), Projection);




This works: the camera is at (0,0,0) and my planet is in front of me.
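For reference, a small Python/NumPy sketch of why the camera-relative translation helps: the subtraction has to happen in double precision on the CPU, before anything is truncated to float32.

```python
import numpy as np

cam = np.array([6_371_000.0, 0.0, 123.456])   # camera near the surface (doubles)
vertex = cam + np.array([1.125, 0.75, 0.125])  # a vertex about 1.4 m away

# Truncate the absolute coordinates to float32 first, then subtract:
# the fine sub-metre offsets get rounded away -> jitter.
absolute = vertex.astype(np.float32) - cam.astype(np.float32)

# Camera-relative: subtract in double FIRST, then truncate to float32.
# The small relative coordinates fit comfortably in float32.
relative = (vertex - cam).astype(np.float32)

print(absolute)   # quantized: the X offset collapses to the 0.5 m grid
print(relative)   # exact: (1.125, 0.75, 0.125)
```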


Now the problem comes in:

Before, I moved the camera to look/move around.

Now, instead of moving the camera, I must move the objects in the scene (in my case, the planet):

private void MoveForwBackw(float amount)
{
    Position += Forward * amount;
}

private void MoveLeftRight(float amount)
{
    Right = Vector3.Cross(Up, Forward);
    Position += Right * amount;
}

Forward, backward, left and right work fine, but when the mouse comes in, the Y axis of my camera is always zero (I mean the camera doesn't follow the perspective of my mouse).



Here, at 0:10:




The View Matrix is:

View = Matrix.CreateLookAt(Vector3.Zero, Vector3.Forward, Vector3.Up);

So what should I do? Does somebody have an idea how to fix this problem? Maybe I need a forward and an up vector for my view matrix?
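One common approach (a sketch, not the original code; the function name and axis conventions here are my assumptions) is to keep the camera at the origin but rebuild its forward vector from the accumulated yaw/pitch, and feed that into CreateLookAt instead of the fixed Vector3.Forward:

```python
import numpy as np

def forward_from_yaw_pitch(yaw, pitch):
    # Rotate the default forward vector (0, 0, -1): first pitch around X,
    # then yaw around Y. The result (together with an up vector) is what
    # you would pass to Matrix.CreateLookAt(Vector3.Zero, forward, up).
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    return np.array([-sy * cp, sp, -cy * cp])

# With zero yaw/pitch this reduces to the plain forward vector, and with a
# non-zero pitch the Y component is no longer stuck at zero, so the view
# follows the mouse vertically.
print(forward_from_yaw_pitch(0.0, 0.0))
print(forward_from_yaw_pitch(0.0, 0.3))
```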


Or does anybody have another option to solve the floating-point imprecision?



Best regards

SharpDX not similar to D3D11/C++?

13 March 2014 - 12:35 PM




I'm reading a DX11 book to understand the API, and I want to translate the examples to C#/SharpDX.


But the book describes the way to create a vertex buffer like this:

First, the description:

desc.ByteWidth = size; 
desc.MiscFlags = 0; 
desc.StructureByteStride = 0; 

But in SharpDX I don't have MiscFlags and ByteWidth...


Next, create the buffer:

HRESULT hr = g_pDevice->CreateBuffer( &desc, pData, &pBuffer ); 

In the SharpDX MiniCube demo, the way to create a buffer is:

var vertices = Buffer.Create(device, BindFlags.VertexBuffer, new[]{......});

So there is a difference between C++ D3D and SharpDX.




The same goes for creating the DeviceContext/SwapChain.




and SharpDX:


I thought SharpDX and C++ D3D were similar, so why the differences?



So my question is: when reading this book, how do I find the right SharpDX calls to recreate the C++ examples?




Sorry for my English; I hope you understand what I mean and can give me an answer.