Toastmastern

Member

  • Content count: 34
  • Joined
  • Last visited

Community Reputation

  320 Neutral

About Toastmastern

  • Rank: Member

Personal Information

  • Interests: programmer
  1. Toastmastern

    Dynamic Vertex buffer

    Turned out the second part of what you said is what I did wrong. I no longer set any data in the buffer when I initialize it; before rendering I map and unmap the vertex and index buffers to update them. I did it because I thought the bottleneck of my game was that it reinitialized the buffers every time instead of having them dynamic. Turns out that wasn't the issue: the FPS is still really low for 15k vertices. I'm thinking it has to do with the way I subdivide my sphere every time the camera is moved. Anyone have any idea on which way to move forward? Maybe I should implement a way to check whether all of the vertices really need to be updated. No idea how to do that yet, though. I'm also going to use a profiler tonight to try and see where my bottleneck actually lies.

    //Toastmastern
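    For reference, a minimal sketch of the map/unmap update described above, assuming the buffer was created with D3D11_USAGE_DYNAMIC and D3D11_CPU_ACCESS_WRITE. mVertexBuffer and mPlanetMesh are names from this thread; deviceContext and the Vertex type are assumptions:

        // Re-upload the vertex data before rendering. D3D11_MAP_WRITE_DISCARD hands
        // back a fresh region of memory, so the GPU never stalls on last frame's copy.
        D3D11_MAPPED_SUBRESOURCE mappedResource;
        HRESULT result = deviceContext->Map(mVertexBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource);
        if (FAILED(result))
        {
            return false;
        }
        memcpy(mappedResource.pData, mPlanetMesh.vertices.data(),
               mPlanetMesh.vertices.size() * sizeof(Vertex));
        deviceContext->Unmap(mVertexBuffer, 0);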
  2. Hello everyone,

     After a few years of break from coding and my planet-render game, I'm giving it a go again from a different angle. What I'm struggling with now is that I have created a frustum that works fine, for now at least; it does what it's supposed to do, although not perfectly. But with the frustum came very low FPS, since what I'm doing right now, just to see if the frustum works, is to recreate the vertex buffer every frame that the camera detects movement. This is of course very costly and not the way to do it. That's why I'm now trying to learn how to create a dynamic vertex buffer instead, and to map and unmap the vertices. In the end my goal is to update only the part of the vertex buffer that is needed, but one step at a time ^^

     Below is the code I use to create the dynamic buffer. The issue is that I want the vertex buffer to be big enough to handle more vertices than just mPlanetMesh.vertices.size(), due to more vertices being added later when I start to do LOD and such; the first render isn't the biggest one I will need.

     vertexBufferDesc.Usage = D3D11_USAGE_DYNAMIC;
     vertexBufferDesc.ByteWidth = mPlanetMesh.vertices.size();
     vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
     vertexBufferDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
     vertexBufferDesc.MiscFlags = 0;
     vertexBufferDesc.StructureByteStride = 0;

     vertexData.pSysMem = &mPlanetMesh.vertices[0];
     vertexData.SysMemPitch = 0;
     vertexData.SysMemSlicePitch = 0;

     result = device->CreateBuffer(&vertexBufferDesc, &vertexData, &mVertexBuffer);
     if (FAILED(result))
     {
         return false;
     }

     What happens is that

     result = device->CreateBuffer(&vertexBufferDesc, &vertexData, &mVertexBuffer);

     crashes with an access violation. When I use vertices.size() it works without issues, but when I try to set it to something like vertices.size() * 2 it crashes. I googled my eyes dry tonight but can't seem to find people with the same kind of issue; I've read that the vertex buffer can be bigger if needed. What am I doing wrong here?

     Best regards, and thanks in advance
     Toastmastern
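     A sketch of one way the over-allocation described above can work, under two assumptions worth stating loudly: ByteWidth is measured in bytes, so it has to be scaled by the vertex stride (a Vertex type is assumed here), and when pSysMem is provided, CreateBuffer copies ByteWidth bytes from it, so a ByteWidth larger than the source array reads out of bounds. Passing null initial data and filling the buffer with Map/Unmap afterwards avoids that:

         // Reserve room for twice the current mesh; ByteWidth counts bytes, not vertices.
         vertexBufferDesc.Usage = D3D11_USAGE_DYNAMIC;
         vertexBufferDesc.ByteWidth = (UINT)(mPlanetMesh.vertices.size() * sizeof(Vertex) * 2);
         vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
         vertexBufferDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
         vertexBufferDesc.MiscFlags = 0;
         vertexBufferDesc.StructureByteStride = 0;

         // No initial data: the runtime would otherwise read ByteWidth bytes from
         // pSysMem. Upload the real vertices afterwards with Map/Unmap instead.
         result = device->CreateBuffer(&vertexBufferDesc, nullptr, &mVertexBuffer);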
  3. So it's been a while since I took a break from my whole creating-a-planet-in-DX11 project. Last time around I got stuck on getting a nice LOD working. A week or so back I got help finding this: https://github.com/sp4cerat/Planet-LOD

     In general this is what I'm trying to recreate in DX11. The author of that planet LOD uses OpenGL, but that is a minor issue and something I can solve. I do have a question regarding the code, though. He gets the position using this row:

     vec4d pos = b.var.vec4d["position"];

     which is then used further down when he sends the variable "center" into the drawing function:

     if (pos.len() < 1) pos.norm();
     world::draw(vec3d(pos.x, pos.y, pos.z));

     Inside the draw function this happens:

     draw_recursive(p3[0], p3[1], p3[2], center);

     Basically the 3 vertices of the triangle and the center of detail that he sent as a parameter earlier: vec3d(pos.x, pos.y, pos.z).

     Now on to my real question. He does

     vec3d edge_center[3] = { (p1 + p2) / 2, (p2 + p3) / 2, (p3 + p1) / 2 };

     to get the center of each edge, nothing weird there. But this is used later on with:

     vec3d d = center + edge_center;
     edge_test = d.len() > ratio_size;

     edge_test is then used to evaluate whether a triangle should be drawn or split into 3 new triangles instead. Why is it working for him? Shouldn't it be something like center - edge_center? Why add them together? I assume here that the center is the center of detail for the LOD, i.e. the position of the camera if it stood on the ground of the planet and not up in the air like it is now.

     The full code can be seen here: https://github.com/sp4cerat/Planet-LOD/blob/master/src.simple/Main.cpp

     If anyone would like to take a look and try to help me understand this code, I would love that. I'm running out of ideas on how to solve this in my own head; most likely I've twisted it one time too many.

     Thanks in advance
     Toastmastern
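     As a general illustration of the pattern in question (this is not taken from the linked repo, and all names here are made up): a distance-based LOD split test compares the distance between the detail center and an edge midpoint against a size threshold. If the repo stores center already negated, then center + edge_center would compute exactly that difference, which would explain the addition; that negation is an assumption about the linked code, not something verified from it.

         #include <cmath>

         struct Vec3 { double x, y, z; };

         static double Length(const Vec3& v)
         {
             return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
         }

         // Split an edge when the detail center is close to its midpoint; draw the
         // triangle as-is when it is far away.
         bool ShouldSplitEdge(const Vec3& p1, const Vec3& p2, const Vec3& detailCenter,
                              double ratioSize)
         {
             Vec3 mid = { (p1.x + p2.x) / 2, (p1.y + p2.y) / 2, (p1.z + p2.z) / 2 };
             Vec3 d = { mid.x - detailCenter.x, mid.y - detailCenter.y, mid.z - detailCenter.z };
             return Length(d) < ratioSize;
         }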
  4. Toastmastern

    Normal mapping in Shader

    Almost got it working now. The only issue left is that the sun seems to be at the camera, so that when I rotate and move around, the sun moves along with me. I've narrowed it down to the normal moving with the camera for some reason, even though I don't do any translation on it. Here is the updated code in the domain shader:

    float3 vertexPosition;
    float3 sphereNormal;
    float2 heightMapCoord;
    float2 colorMapCoord;
    PixelInputType output;
    float heightMapSample;
    float4 colorMapSample;
    float3 normalMapSample;
    float3 normal;

    heightMapCoord = uvwCoord.x * patch[0].heightMapCoord + uvwCoord.y * patch[1].heightMapCoord + uvwCoord.z * patch[2].heightMapCoord;
    colorMapCoord = uvwCoord.x * patch[0].colorMapCoord + uvwCoord.y * patch[1].colorMapCoord + uvwCoord.z * patch[2].colorMapCoord;
    vertexPosition = uvwCoord.x * patch[0].position + uvwCoord.y * patch[1].position + uvwCoord.z * patch[2].position;
    sphereNormal = uvwCoord.x * patch[0].sphereNormal + uvwCoord.y * patch[1].sphereNormal + uvwCoord.z * patch[2].sphereNormal;

    heightMapSample = heightMapTexture.SampleLevel(sampleType, heightMapCoord, 0);
    colorMapSample = colorMapTexture.SampleLevel(colorSampleType, colorMapCoord, 0);
    normalMapSample = normalMapTexture.SampleLevel(normalSampleType, heightMapCoord, 0).rgb;

    normal = normalize((normalMapSample * 2) - 1);

    vertexPosition.x = vertexPosition.x + (sphereNormal.x * ((heightMapSample * 29429.0f) - 8200.0f));
    vertexPosition.y = vertexPosition.y + (sphereNormal.y * ((heightMapSample * 29429.0f) - 8200.0f));
    vertexPosition.z = vertexPosition.z + (sphereNormal.z * ((heightMapSample * 29429.0f) - 8200.0f));

    output.position = mul(float4(vertexPosition, 1.0f), worldMatrix);
    output.position = mul(output.position, viewMatrix);
    output.position = mul(output.position, projectionMatrix);

    //output.normal = mul(float4(normal, 0.0f), viewMatrix);
    output.normal = float4(normal, 0.0f);
    output.texCoord = heightMapCoord;
    output.color = colorMapSample;

    return output;

    Notice that the normal is just sampled from the texture and then sent to the pixel shader as a direction and not a position. I believe that's the way to do it, though: a 0.0f instead of a 1.0f as the last component. Here comes the pixel shader code. What I've tried here is to just set the color to the normal value, and I can see that it changes with the camera movement, which leads me to believe something is funky about the normal.

    float3 sunLightDir;
    float sunLightIntensity;
    float4 sunLightColor;
    float4 color = input.color;
    float3 normal = input.normal;

    float3 dp1 = ddx_fine(input.position);
    float3 dp2 = ddy_fine(input.position);
    float2 duv1 = ddx_fine(input.texCoord);
    float2 duv2 = ddy_fine(input.texCoord);

    float3 dp2perp = cross(dp2, normal);
    float3 dp1perp = cross(normal, dp1);
    float3 T = dp2perp * duv1.x + dp1perp * duv2.x;
    float3 B = dp2perp * duv1.y + dp1perp * duv2.y;

    float invmax = rsqrt(max(dot(T,T), dot(B,B)));
    float3x3 TBN = float3x3(T * invmax, B * invmax, normal);

    normal = mul(normal, TBN);

    sunLightDir = -lightDirection;
    sunLightIntensity = saturate(dot(normal, sunLightDir));
    sunLightColor = saturate(diffuseColor * sunLightIntensity);

    color = sunLightColor * color;

    return color;
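    On the 0.0f-versus-1.0f point: that is the standard convention, and DirectXMath mirrors it on the CPU side, which makes for a quick sanity check. A small illustration (all names here are made up):

        #include <DirectXMath.h>
        using namespace DirectX;

        void TransformExample()
        {
            // Positions pick up a matrix's translation; directions such as normals
            // must not, which is what w = 0 (TransformNormal) achieves.
            XMMATRIX world = XMMatrixTranslation(10.0f, 0.0f, 0.0f);
            XMVECTOR position = XMVectorSet(1.0f, 2.0f, 3.0f, 1.0f);
            XMVECTOR normal = XMVectorSet(0.0f, 1.0f, 0.0f, 0.0f);

            XMVECTOR movedPosition = XMVector3TransformCoord(position, world);  // translated
            XMVECTOR movedNormal = XMVector3TransformNormal(normal, world);     // unchanged here
        }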
  5. Toastmastern

    Normal mapping in Shader

    Found some errors in my code while sitting at work:

    normal.x = (normalMapSample.r * 2) - 1;
    normal.y = (normalMapSample.g * 2) - 1;
    normal.b = (normalMapSample.b * 2) - 1;

    I need to normalize this vector. Also:

    normal = mul(TBN, normalMappingNormal);

    Here I need to use normal instead of normalMappingNormal. I remember trying a lot of different things last night; I hope these things weren't part of those changes :P

    //Toastmastern
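    For reference, a sketch of that decode-and-normalize step (written in C++ with DirectXMath purely for illustration; a normal map stores components in [0, 1] while lighting expects a unit vector in [-1, 1]):

        #include <DirectXMath.h>
        using namespace DirectX;

        XMVECTOR DecodeNormal(const XMFLOAT3& texel)
        {
            // Remap each channel from [0, 1] to [-1, 1], then renormalize, since
            // texture filtering and quantization can leave the vector non-unit.
            XMFLOAT3 n = { texel.x * 2.0f - 1.0f,
                           texel.y * 2.0f - 1.0f,
                           texel.z * 2.0f - 1.0f };
            return XMVector3Normalize(XMLoadFloat3(&n));
        }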
  6. I've been struggling with this part for a good week now. I think I'm really close but missing something vital.

     What I have is a planet, and for this planet I have a height map and a normal map. What I'm trying to achieve is to map the normal around the sphere that makes up my planet. I've read up on the TBN matrix and how to calculate it. Everything I'm explaining in this post takes place in the domain shader.

     I know that the normal is correct, since I get some shadow from the light on it, but it looks weird, since it's 2D in a 3D world, if you know what I mean.

     I will split up the shader code first and then post it all at the end. First I take 3 points on my height map:

     heightMapCoord1 = uvwCoord.x * patch[0].heightMapCoord + uvwCoord.y * patch[1].heightMapCoord + uvwCoord.z * patch[2].heightMapCoord;
     heightMapCoord2.x = ((heightMapCoord1.x * 46080.0f) + 1.0f) / 46080.0f;
     heightMapCoord2.y = heightMapCoord1.y;
     heightMapCoord3.x = heightMapCoord1.x;
     heightMapCoord3.y = ((heightMapCoord1.y * 22528.0f) + 1.0f) / 22528.0f;

     I then sample the height map to get 3 different Z values for my future vectors:

     heightMapSample1 = heightMapTexture.SampleLevel(sampleType, heightMapCoord1, 0);
     heightMapSample2 = heightMapTexture.SampleLevel(sampleType, heightMapCoord2, 0);
     heightMapSample3 = heightMapTexture.SampleLevel(sampleType, heightMapCoord3, 0);

     The next step is to create the 3 different vectors:

     normalMappingVector1 = float3(heightMapCoord1.x, heightMapCoord1.y, heightMapSample1);
     normalMappingVector2 = float3(heightMapCoord2.x, heightMapCoord2.y, heightMapSample2);
     normalMappingVector3 = float3(heightMapCoord3.x, heightMapCoord3.y, heightMapSample2);

     I then cross the vectors between vectors 2-1 and 3-1:

     normalMappingNormal = normalize(cross((normalMappingVector2 - normalMappingVector1), (normalMappingVector3 - normalMappingVector1)));

     The code that follows calculates the tangent:

     coef = 1 / ((heightMapCoord2.x * heightMapCoord3.y) - (heightMapCoord3.x * heightMapCoord2.y));
     normalMappingTangent.x = coef * ((normalMappingVector2.x * heightMapCoord3.y) + (normalMappingVector3.x * -heightMapCoord2.y));
     normalMappingTangent.y = coef * ((normalMappingVector2.y * heightMapCoord3.y) + (normalMappingVector3.y * -heightMapCoord2.y));
     normalMappingTangent.z = coef * ((normalMappingVector2.z * heightMapCoord3.y) + (normalMappingVector3.z * -heightMapCoord2.y));

     Then I normalize it and calculate the bitangent:

     normalMappingTangent = normalize(normalMappingTangent);
     normalMappingBiTangent = normalize(cross(normalMappingNormal, normalMappingTangent));

     I then combine the 3 vectors I have gotten into a matrix and multiply my normal by it (this normal is taken from the normal map, not calculated):

     float3x3 TBN = float3x3(normalMappingTangent, normalMappingBiTangent, normalMappingNormal);
     normal = mul(TBN, normal);

     Here is the full shader code:

     Texture2D<float> heightMapTexture;
     Texture2D colorMapTexture;
     Texture2D normalMapTexture;

     SamplerState sampleType;
     SamplerState colorSampleType;
     SamplerState normalSampleType;

     cbuffer MatrixBuffer
     {
         matrix worldMatrix;
         matrix viewMatrix;
         matrix projectionMatrix;
     };

     cbuffer LightBuffer
     {
         float4 diffuseColor;
         float3 lightDirection;
         float padding;
     };

     struct ConstantOutputType
     {
         float edges[3] : SV_TessFactor;
         float inside : SV_InsideTessFactor;
     };

     struct HullOutputType
     {
         float3 position : POSITION;
         float4 color : COLOR;
         float3 sphereNormal : NORMAL;
         float2 heightMapCoord : TEXCOORD0;
         float2 colorMapCoord : TEXCOORD1;
     };

     struct PixelInputType
     {
         float4 position : SV_POSITION;
         float4 color : COLOR;
     };

     [domain("tri")]
     PixelInputType PlanetDomainShader(ConstantOutputType input, float3 uvwCoord : SV_DomainLocation, const OutputPatch<HullOutputType, 3> patch)
     {
         float3 vertexPosition;
         float3 sphereNormal;
         float2 heightMapCoord1;
         float2 heightMapCoord2;
         float2 heightMapCoord3;
         float2 colorMapCoord;
         PixelInputType output;
         float heightMapSample1;
         float heightMapSample2;
         float heightMapSample3;
         float4 colorMapSample;
         float3 normalMapSample;
         float3 sunLightDir;
         float sunLightIntensity;
         float4 sunLightColor;
         float3 normalMappingVector1;
         float3 normalMappingVector2;
         float3 normalMappingVector3;
         float3 normalMappingNormal;
         float coef;
         float3 normalMappingTangent;
         float3 normalMappingBiTangent;
         float3 normal;

         sunLightDir = -lightDirection;

         heightMapCoord1 = uvwCoord.x * patch[0].heightMapCoord + uvwCoord.y * patch[1].heightMapCoord + uvwCoord.z * patch[2].heightMapCoord;
         heightMapCoord2.x = ((heightMapCoord1.x * 46080.0f) + 1.0f) / 46080.0f;
         heightMapCoord2.y = heightMapCoord1.y;
         heightMapCoord3.x = heightMapCoord1.x;
         heightMapCoord3.y = ((heightMapCoord1.y * 22528.0f) + 1.0f) / 22528.0f;

         colorMapCoord = uvwCoord.x * patch[0].colorMapCoord + uvwCoord.y * patch[1].colorMapCoord + uvwCoord.z * patch[2].colorMapCoord;
         vertexPosition = uvwCoord.x * patch[0].position + uvwCoord.y * patch[1].position + uvwCoord.z * patch[2].position;
         sphereNormal = uvwCoord.x * patch[0].sphereNormal + uvwCoord.y * patch[1].sphereNormal + uvwCoord.z * patch[2].sphereNormal;

         heightMapSample1 = heightMapTexture.SampleLevel(sampleType, heightMapCoord1, 0);
         heightMapSample2 = heightMapTexture.SampleLevel(sampleType, heightMapCoord2, 0);
         heightMapSample3 = heightMapTexture.SampleLevel(sampleType, heightMapCoord3, 0);
         colorMapSample = colorMapTexture.SampleLevel(colorSampleType, colorMapCoord, 0);
         normalMapSample = normalMapTexture.SampleLevel(normalSampleType, heightMapCoord1, 0).rgb;

         normal.x = (normalMapSample.r * 2) - 1;
         normal.y = (normalMapSample.g * 2) - 1;
         normal.b = (normalMapSample.b * 2) - 1;

         normalMappingVector1 = float3(heightMapCoord1.x, heightMapCoord1.y, heightMapSample1);
         normalMappingVector2 = float3(heightMapCoord2.x, heightMapCoord2.y, heightMapSample2);
         normalMappingVector3 = float3(heightMapCoord3.x, heightMapCoord3.y, heightMapSample2);

         normalMappingNormal = normalize(cross((normalMappingVector2 - normalMappingVector1), (normalMappingVector3 - normalMappingVector1)));

         coef = 1 / ((heightMapCoord2.x * heightMapCoord3.y) - (heightMapCoord3.x * heightMapCoord2.y));
         normalMappingTangent.x = coef * ((normalMappingVector2.x * heightMapCoord3.y) + (normalMappingVector3.x * -heightMapCoord2.y));
         normalMappingTangent.y = coef * ((normalMappingVector2.y * heightMapCoord3.y) + (normalMappingVector3.y * -heightMapCoord2.y));
         normalMappingTangent.z = coef * ((normalMappingVector2.z * heightMapCoord3.y) + (normalMappingVector3.z * -heightMapCoord2.y));

         normalMappingTangent = normalize(normalMappingTangent);
         normalMappingBiTangent = normalize(cross(normalMappingNormal, normalMappingTangent));

         vertexPosition.x = vertexPosition.x + (sphereNormal.x * ((heightMapSample1 * 29429.0f) - 8200.0f));
         vertexPosition.y = vertexPosition.y + (sphereNormal.y * ((heightMapSample1 * 29429.0f) - 8200.0f));
         vertexPosition.z = vertexPosition.z + (sphereNormal.z * ((heightMapSample1 * 29429.0f) - 8200.0f));

         output.position = mul(float4(vertexPosition, 1.0f), worldMatrix);
         output.position = mul(output.position, viewMatrix);
         output.position = mul(output.position, projectionMatrix);

         float3x3 TBN = float3x3(normalMappingTangent, normalMappingBiTangent, normalMappingNormal);
         TBN = transpose(TBN);
         normal = mul(TBN, normalMappingNormal);

         sunLightIntensity = saturate(dot(normal, sunLightDir));
         sunLightColor = saturate(diffuseColor * sunLightIntensity);

         output.color = sunLightColor * colorMapSample;

         return output;
     }

     I hope someone can recognize what I'm trying to do here. I've read on many sites that precalculating the normal, tangent and bitangent is easier on the GPU, but I'm dead set on getting this to work. Worth noting is that I have taken this code from http://www.fabiensanglard.net/bumpMapping/index.php and then tried to combine it with other things I've read:

     generateNormalAndTangent(float3 v1, float3 v2, text2 st1, text2 st2)
     {
         float3 normal = v1.crossProduct(v2);
         float coef = 1 / (st1.u * st2.v - st2.u * st1.v);
         float3 tangent;
         tangent.x = coef * ((v1.x * st2.v) + (v2.x * -st1.v));
         tangent.y = coef * ((v1.y * st2.v) + (v2.y * -st1.v));
         tangent.z = coef * ((v1.z * st2.v) + (v2.z * -st1.v));
         float3 binormal = normal.crossProduct(tangent);
     }

     Thanks in advance
     Toastmastern
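     Since the post mentions precalculating the tangent frame, here is a sketch of that per-triangle CPU-side computation in C++ with DirectXMath, following the same math as the generateNormalAndTangent() pseudocode quoted above (all types and names here are illustrative):

         #include <DirectXMath.h>
         using namespace DirectX;

         // Per-triangle normal/tangent/bitangent from positions and UVs, mirroring
         // the quoted pseudocode: edges play the role of v1/v2, UV deltas of st1/st2.
         void ComputeTangentFrame(const XMFLOAT3& p0, const XMFLOAT3& p1, const XMFLOAT3& p2,
                                  const XMFLOAT2& uv0, const XMFLOAT2& uv1, const XMFLOAT2& uv2,
                                  XMFLOAT3& tangentOut, XMFLOAT3& bitangentOut, XMFLOAT3& normalOut)
         {
             XMVECTOR e1 = XMVectorSubtract(XMLoadFloat3(&p1), XMLoadFloat3(&p0));
             XMVECTOR e2 = XMVectorSubtract(XMLoadFloat3(&p2), XMLoadFloat3(&p0));
             float du1 = uv1.x - uv0.x, dv1 = uv1.y - uv0.y;
             float du2 = uv2.x - uv0.x, dv2 = uv2.y - uv0.y;

             // Face normal from the two edges.
             XMVECTOR normal = XMVector3Normalize(XMVector3Cross(e1, e2));

             // Tangent = coef * (e1 * dv2 - e2 * dv1), same as the quoted snippet.
             float coef = 1.0f / (du1 * dv2 - du2 * dv1);
             XMVECTOR tangent = XMVector3Normalize(XMVectorScale(
                 XMVectorSubtract(XMVectorScale(e1, dv2), XMVectorScale(e2, dv1)), coef));
             XMVECTOR bitangent = XMVector3Normalize(XMVector3Cross(normal, tangent));

             XMStoreFloat3(&tangentOut, tangent);
             XMStoreFloat3(&bitangentOut, bitangent);
             XMStoreFloat3(&normalOut, normal);
         }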
  7. Toastmastern

    Sunlight theory

    So basically the distance of the "sun" makes calculating a directional light per vertex unnecessary? I can just say that the light is aiming at position 0, 0, 0? Or should I choose a position way behind the planet?

    //Toastmastern
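    A sketch of that idea in C++ with DirectXMath (all names and values here are illustrative): because the sun is effectively infinitely far away, a single normalized direction from the sun toward the planet's center can serve every vertex, and aiming at the origin versus a point behind the planet changes that direction only negligibly:

        #include <DirectXMath.h>
        using namespace DirectX;

        XMVECTOR ComputeSunLightDirection()
        {
            // One direction for the whole planet: from the sun's position toward
            // the planet's center at the origin. At solar distances the direction
            // is essentially identical for every vertex on the surface.
            XMVECTOR sunPosition = XMVectorSet(1.0e8f, 0.0f, 0.0f, 1.0f); // arbitrary, far away
            XMVECTOR planetCenter = XMVectorZero();
            return XMVector3Normalize(XMVectorSubtract(planetCenter, sunPosition));
        }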
  8. Toastmastern

    Sunlight theory

    I was thinking today about how to simulate my sun, and in theory, if the sun is positioned at the same height (y-position) as my planet, could I just calculate the light direction as in this splendid paint image I created?

    http://imgur.com/7R25Joo

    Or what is the theory around sunlight hitting a planet?

    //Toastmastern
  9. Toastmastern

    texture still show line between vertices

    Thanks for all the help in this thread. The answer turned out to be linked to the normals. I do indeed have several vertices in the same spot, but their normals were different, so that when I added the height they split apart. The error was that patch[0].sphereNormal.x, .y and .z weren't recalculated after the tessellation. I don't have the code here, since I'm at work, but I will post it tonight.
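    For reference, a sketch of what that recalculation amounts to, assuming the fix is to interpolate the three patch corner normals with the tessellator's barycentric weights and renormalize (written in C++ with DirectXMath for illustration; the shaders elsewhere in this thread do the equivalent in HLSL):

        XMVECTOR InterpolatedSphereNormal(FXMVECTOR n0, FXMVECTOR n1, FXMVECTOR n2,
                                          float u, float v, float w)
        {
            // Blend the corner normals by the barycentric weights, then renormalize
            // so the displacement direction is unit length at every new vertex.
            XMVECTOR n = XMVectorAdd(XMVectorAdd(XMVectorScale(n0, u), XMVectorScale(n1, v)),
                                     XMVectorScale(n2, w));
            return XMVector3Normalize(n);
        }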
  10. Toastmastern

    texture still show line between vertices

    Did some research, together with some trial and error, and it is where I sample my 16bpp texture and add to my height that causes this. Here is the code:

    vertexPosition.x = vertexPosition.x + (patch[0].sphereNormal.x * ((heightMapSample * 29429.0f) - 8200.0f));
    vertexPosition.y = vertexPosition.y + (patch[0].sphereNormal.y * ((heightMapSample * 29429.0f) - 8200.0f));
    vertexPosition.z = vertexPosition.z + (patch[0].sphereNormal.z * ((heightMapSample * 29429.0f) - 8200.0f));

    It seems that when I multiply by 29429.0f (max height) and subtract 8200.0f (lowest point), the points that sit in the same position split up, causing the texture to move apart. It might have something to do with the normal vector; however, that one is just the normal vector of the sphere before adding the height, so it shouldn't mess anything up, or could it? More trial and error ongoing.

    //Toastmastern
  11. Hello,

     If you look at this image:

     http://imgur.com/9exjwkL

     why is it that I still see the lines between the vertices even though I use a texture? Does it have something to do with the fact that I am using a triangle list instead of a triangle strip, or something like that?

     Best regards
     Toastmastern
  12. Toastmastern

    Rotating object

      It is. What I found was that radiansToRotate was always very small, and I had forgotten that it would work this way: I had to add the new radiansToRotate to the old radiansToRotate all the time to increase it. I tried a different approach before, and that one had me only use the difference every frame and not the total amount. Now I only have one bug left that I will sort out tonight (something with my gameTime class getting the difference wrong when the second mark goes from 59 to 0, or something like that).

      //Toastmastern
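      A sketch of that accumulate-per-frame pattern, with the elapsed time taken from a monotonic counter so a 59-to-0 second rollover can never produce a wrong delta (mRadiansToRotate, mLastCounter and mRotationRadiansPerSecond are illustrative names, not from the thread):

          #include <Windows.h>

          // QueryPerformanceCounter increases monotonically for the whole session,
          // so deltas stay correct across second/minute boundaries, unlike
          // subtracting wall-clock second marks.
          LARGE_INTEGER frequency, now;
          QueryPerformanceFrequency(&frequency);
          QueryPerformanceCounter(&now);

          float deltaSeconds = (float)(now.QuadPart - mLastCounter.QuadPart) / (float)frequency.QuadPart;
          mLastCounter = now;

          // Accumulate: the total rotation grows by rate * dt every frame.
          mRadiansToRotate += mRotationRadiansPerSecond * deltaSeconds;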
  13. Toastmastern

    Rotating object

    Hello,

    I have a planet that needs to rotate around its own Y-axis. I read up some, and the way to go seems to be to let the shader do the work. I have a rotation matrix:

    mPlanetRotationMatrix = XMMATRIX(cosf(radiansToRotate), 0.0f, sinf(radiansToRotate), 0.0f,
                                     0.0f, 1.0f, 0.0f, 0.0f,
                                     -sinf(radiansToRotate), 0.0f, cosf(radiansToRotate), 0.0f,
                                     0.0f, 0.0f, 0.0f, 1.0f);

    I then multiply the view matrix I get from the camera with this rotation matrix:

    mCamera->GetViewMatrix(viewMatrix);
    planetViewMatrix = XMMatrixMultiply(mPlanetRotationMatrix, viewMatrix);

    I then send it to the ColorShader:

    result = shaderManager->RenderColorShader(direct3D->GetDeviceContext(), mTerrain->GetIndexCount(), worldMatrix, planetViewMatrix, projectionMatrix, tessellationAmount, textureManager->GetTexture(0));

    The view matrix (in this class it is called viewMatrix again) is updated every frame in the code:

    result = SetShaderParameters(deviceContext, worldMatrix, viewMatrix, projectionMatrix, tessellationAmount, texture);
    if (!result)
    {
        return false;
    }

    RenderShader(deviceContext, indexCount);

    Here is the SetShaderParameters code:

    HRESULT result;
    D3D11_MAPPED_SUBRESOURCE mappedResource;
    MatrixBufferType* dataPtr;
    unsigned int bufferNumber;
    TessellationBufferType* dataPtr2;

    worldMatrix = XMMatrixTranspose(worldMatrix);
    viewMatrix = XMMatrixTranspose(viewMatrix);
    projectionMatrix = XMMatrixTranspose(projectionMatrix);

    result = deviceContext->Map(mMatrixBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource);
    if (FAILED(result))
    {
        return false;
    }

    dataPtr = (MatrixBufferType*)mappedResource.pData;
    dataPtr->world = worldMatrix;
    dataPtr->view = viewMatrix;
    dataPtr->projection = projectionMatrix;
    deviceContext->Unmap(mMatrixBuffer, 0);

    bufferNumber = 0;
    deviceContext->DSSetConstantBuffers(bufferNumber, 1, &mMatrixBuffer);

    result = deviceContext->Map(mTessellationBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource);
    if (FAILED(result))
    {
        return false;
    }

    dataPtr2 = (TessellationBufferType*)mappedResource.pData;
    dataPtr2->tessellationAmount = tessellationAmount;
    dataPtr2->padding = XMFLOAT3(0.0f, 0.0f, 0.0f);
    deviceContext->Unmap(mTessellationBuffer, 0);

    bufferNumber = 0;
    deviceContext->HSSetConstantBuffers(bufferNumber, 1, &mTessellationBuffer);
    deviceContext->DSSetShaderResources(0, 1, &texture);

    return true;

    Is there anything obvious that I am missing?

    Thanks in advance
    Toastmastern
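    As an aside, the hand-written matrix above matches DirectXMath's built-in Y rotation up to the sign of the angle: XMMatrixRotationY(radiansToRotate) produces the transpose of it (the opposite spin direction), so one of these two calls can replace the explicit cos/sin matrix. This is an observation about DirectXMath's convention, not about which way the planet should spin:

        XMMATRIX sameAsAbove = XMMatrixRotationY(-radiansToRotate);  // matches the explicit matrix
        XMMATRIX oppositeSpin = XMMatrixRotationY(radiansToRotate);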
  14. Toastmastern

    Camera movements around a sphere(planet)

      I think I know vectors pretty well, but matrices are a totally different story for me :) I will look through the 4 hours of your videos before continuing; hopefully by the end I will have a clearer grasp of it.

      I thought my upVector got updated when I used the TransformCoord function together with the rotation matrix, but that might not be the case. Learning by doing, I guess :) I base that guess on the fact that when I removed the transform of the upVector, the rotation with the mouse worked flawlessly, but then I couldn't get the rightVector, since that one is the cross product between the upVector and the forwardVector (the rotation vector), so I need to transform the upVector. I just need to figure out why my mouse rotation won't work after I move forward :)

      //Toastmastern
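      A sketch of one way to keep the camera's basis vectors in sync after a rotation, assuming directions are rotated with XMVector3TransformNormal (which, unlike TransformCoord, ignores the matrix's translation) and the basis is rebuilt from cross products each frame; all names are illustrative:

          // Rotate forward by this frame's rotation, then rebuild an orthonormal
          // basis so up and right never drift out of sync with it (left-handed).
          forward = XMVector3Normalize(XMVector3TransformNormal(forward, rotationMatrix));
          right = XMVector3Normalize(XMVector3Cross(up, forward));
          up = XMVector3Cross(forward, right);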
  15. Toastmastern

    Camera movements around a sphere(planet)

    Thanks for the reply,

    I believe that as I learn more and more, I will move away from the LookAtLH function that I'm using right now and build my own view matrix. I will look at your code in more detail tonight when I get back from work :)

    Once again, thanks for taking your time!
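    For reference, a sketch of what building a view matrix by hand amounts to: given a camera position and an orthonormal right/up/forward basis, the matrix below is what XMMatrixLookAtLH effectively produces, with the rotation part being the transposed basis and the translation the negated position expressed in that basis (names are illustrative):

        XMMATRIX ViewFromBasis(FXMVECTOR right, FXMVECTOR up, FXMVECTOR forward, GXMVECTOR position)
        {
            // Translation: camera position projected onto each camera axis, negated.
            float tx = -XMVectorGetX(XMVector3Dot(right, position));
            float ty = -XMVectorGetX(XMVector3Dot(up, position));
            float tz = -XMVectorGetX(XMVector3Dot(forward, position));

            // Columns are the camera axes (row-vector convention, as in DirectXMath).
            return XMMATRIX(
                XMVectorGetX(right), XMVectorGetX(up), XMVectorGetX(forward), 0.0f,
                XMVectorGetY(right), XMVectorGetY(up), XMVectorGetY(forward), 0.0f,
                XMVectorGetZ(right), XMVectorGetZ(up), XMVectorGetZ(forward), 0.0f,
                tx,                  ty,               tz,                    1.0f);
        }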