
## Recommended Posts

Hello

With my old vertex shader I normalize and scale each vertex like this:

```hlsl
VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float3 worldPosition = mul(input.Position, World).xyz;

    // Snap the cube vertex onto a sphere of radius SeaLevel.
    float3 normalized = normalize(worldPosition);
    float4 scaled = float4(normalized * SeaLevel, 1);

    output.Position = mul(mul(scaled, View), Projection);
    output.UV = input.UV;

    return output;
}
```


So, to multiply the World, View, and Projection matrices on the CPU, I have a new vertex shader:

```hlsl
VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    output.Position = mul(input.Position, worldViewProj);

    return output;
}
```


My question is: how can I normalize each vertex in the new shader?

This is for my planet: I have a cube, and the center of the cube is the origin (0,0,0).

---

Hi.

I'm not sure if I understand correctly. The second vertex shader only transforms the model-space vertex into clip space (applying all transformations in between in one step).

If you still want to scale the source vertex, you need to keep some of the old vertex shader.

For example:

(I've assumed SeaLevel is an extern constant)

```hlsl
VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float3 worldPosition = normalize(mul(input.Position, World).xyz);
    float4 scaled = float4(worldPosition * SeaLevel, 1);

    // scaled is already in world space, so only a combined View * Projection
    // matrix (here called viewProj) may be applied afterwards; multiplying
    // by worldViewProj would apply World a second time.
    output.Position = mul(scaled, viewProj);
    output.UV = input.UV;

    return output;
}
```


Not sure though why you have to normalize the world position before scaling.

The result might be that all vertices are now 'quite close together'. Maybe you can save the non-scaled position, determine the scale, then scale the position (and multiply by worldViewProj).

---

If you need multiple copies of your position, just make an extra copy and multiply it accordingly. Unfortunately I can't really follow what you are trying to do; can you describe it in a little more detail? Describing how this geometry appears in a scene would most likely be helpful.

---

Thank you for the reply...

So I have a quadtree (chunked LOD) like here:

http://www.gamedev.net/topic/621554-imprecision-problem-on-planet/

I scale and normalize each vertex in the vertex shader; the advantage is that I can work with a cube on the CPU side.

One vertex buffer for each child/patch.

Every patch has its own world matrix, so I can calculate each position of the patch.

So to create a sphere I use "cube-to-sphere" mapping; that means the center of the cube should be (0,0,0) in world space, so that normalizing each vertex yields a sphere.

But when you open the link you see my problem: float precision.

I read that I should calculate the World, View, and Projection matrices on the CPU side with doubles.

The result is that I have one matrix (worldViewProjection), and here my question starts.

I found the technique of rendering "relative to eye" on the GPU:

http://www.amazon.de/3D-Engine-Design-Virtual-Globes/dp/1568817118/ref=sr_1_1?ie=UTF8&qid=1397314306&sr=8-1&keywords=openglobe

https://github.com/virtualglobebook/OpenGlobe/tree/master/Source/Examples/Chapter05/Jitter/GPURelativeToEyeDSFUN90

It's in OpenGL, and he also uses a worldViewProj matrix instead of 3 separate matrices.

Edited by montify
