Toastmastern

Member
  • Content Count

    40
  • Joined

  • Last visited

Community Reputation

320 Neutral

About Toastmastern

  • Rank
    Member

Personal Information

  • Interests
    Programmer
  1. Toastmastern

    DirectInput release key not detected

    Yeah, that isn't the issue. I rebuilt the input system to use Raw Input instead, to try and improve the system overall. What I've noticed now is that it has something to do with the key-repeat function. This is the code I'm testing with right now:

        case WM_INPUT:
        {
            char buffer[sizeof(RAWINPUT)] = {};
            UINT size = sizeof(RAWINPUT);
            GetRawInputData(reinterpret_cast<HRAWINPUT>(lparam), RID_INPUT, buffer, &size, sizeof(RAWINPUTHEADER));

            // Extract the keyboard raw input data
            RAWINPUT* rawInput = reinterpret_cast<RAWINPUT*>(buffer);
            if (rawInput->header.dwType == RIM_TYPEKEYBOARD)
            {
                const RAWKEYBOARD& rawKB = rawInput->data.keyboard;
                UINT virtualKey = rawKB.VKey;

                switch (virtualKey)
                {
                case Keyboard::W:
                    // RI_KEY_MAKE is 0, so test the break bit instead of comparing
                    // the whole Flags field for equality; other bits (e.g. RI_KEY_E0)
                    // can be set at the same time.
                    if ((rawKB.Flags & RI_KEY_BREAK) == 0)
                    {
                        printf("W PRESSED!\r\n");
                    }
                    else
                    {
                        printf("W RELEASED!\r\n");
                    }
                    break;
                }
            }
        }

    This code is so basic that there is nothing in my engine that can #*@! it up ^^ What happens is that when I hold down the W key and then release it, the "W RELEASED!" text sometimes isn't printed until seconds have passed and "W PRESSED!" has printed another 20 times or so. I could handle it logically, if only the W release actually arrived when the key is released, and not after a queue of W PRESSED messages has been printed.

    //Toastmastern
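    For reference, WM_INPUT messages only arrive once the keyboard has been registered as a raw input device. A minimal registration sketch; the function name and hWnd parameter are assumptions, while the usage-page/usage values are the standard HID values for a keyboard:

        #include <windows.h>

        // Register the keyboard for raw input so the window receives WM_INPUT.
        // Usage page 0x01 (generic desktop), usage 0x06 (keyboard).
        bool RegisterKeyboardRawInput(HWND hWnd)
        {
            RAWINPUTDEVICE rid = {};
            rid.usUsagePage = 0x01;  // generic desktop controls
            rid.usUsage     = 0x06;  // keyboard
            rid.dwFlags     = 0;     // default: deliver input while focused
            rid.hwndTarget  = hWnd;
            return RegisterRawInputDevices(&rid, 1, sizeof(rid)) == TRUE;
        }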
  2. So I have been struggling with this issue for 20 hours or so now. I'm using DirectInput; I know it's not the best, but I have tried other approaches as well, which yield the same results. What is happening is that I'm orbiting around a sphere with my up and down cursor keys. Nothing too complicated, really. My issue, however, is that when I press the key and hold it for a while (at least this seems to be when the issue occurs; I have seen it at other times as well, but not as commonly), the key state I get from DirectInput doesn't recognize that I have released the key. I can see from my debugging that I get a new keyboardState, and I can also see that in that keyboardState the key is still down, even though I am not pressing it. This lag, delay, freeze, or whatever one wants to call it can last up to several seconds. Here is a hastebin that includes the code in question: https://hastebin.com/ayucajobaw.cpp Any ideas what my issue could be, or pointers on what to google? I'm really running out of ideas here. Best Regards Toastmastern
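    The linked hastebin is no longer available, but for context, a typical DirectInput keyboard poll looks roughly like the sketch below. A common source of stale key states is failing to reacquire the device after it is lost; the mKeyboard name is an assumption, not taken from the original code:

        #define DIRECTINPUT_VERSION 0x0800
        #include <dinput.h>

        // Poll the current keyboard state each frame, reacquiring the device
        // if it has been lost (e.g. on a focus change). mKeyboard is assumed
        // to be an IDirectInputDevice8* created for GUID_SysKeyboard.
        bool ReadKeyboard(IDirectInputDevice8* mKeyboard, unsigned char (&keys)[256])
        {
            HRESULT hr = mKeyboard->GetDeviceState(sizeof(keys), keys);
            if (FAILED(hr))
            {
                if (hr == DIERR_INPUTLOST || hr == DIERR_NOTACQUIRED)
                {
                    mKeyboard->Acquire(); // try to get the device back
                }
                return false;
            }
            return true;
        }

        // A key is down when its high bit is set:
        // if (keys[DIK_UP] & 0x80) { /* orbit the camera */ }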
  3. Hello everyone, over the past few days I've been trying to fix the sunlight on my sphere, which has some bugs in it. For starters, I'm using this code: https://github.com/Illation/ETEngine/blob/master/source/Engine/Shaders/PlanetPatch.glsl to calculate my normals instead of using a normal map, and I'm using this guide: http://www.thetenthplanet.de/archives/1180 to get my TBN matrix. I have two main issues I'm working to solve while reworking this code. First, I get seams in the normal map along the equator and from pole to pole. Second, the normal also seems to move when I move my camera. I have a video showing what I mean: the color is the normal calculated with the TBN matrix, and as the camera moves, it moves along with it. Nothing is multiplied by the view matrix or anything. Here is my code. Vertex shader:

        output.normal = mul(finalPos, worldMatrix);
        output.viewVector = (mul(cameraPos.xyz, worldMatrix) - mul(finalPos, worldMatrix));

        mapCoords = normalize(finalPos);
        output.mapCoord = float2((0.5f + (atan2(mapCoords.z, mapCoords.x) / (2 * 3.14159265f))),
                                 (0.5f - (asin(mapCoords.y) / 3.14159265f)));

        output.position = mul(float4(finalPos, 1.0f), worldMatrix);
        output.position = mul(output.position, viewMatrix);
        output.position = mul(output.position, projectionMatrix);

        return output;

    and, what might be more important, the pixel shader:

        float3x3 GetTBNMatrix(float3 normalVector, float3 posVector, float2 uv)
        {
            float3 dp1, dp2, dp2perp, dp1perp, T, B;
            float2 duv1, duv2;
            float invMax;

            // screen-space derivatives of the position and the UVs
            dp1 = ddx(posVector);
            dp2 = ddy(posVector);
            duv1 = ddx(uv);
            duv2 = ddy(uv);

            dp2perp = cross(dp2, normalVector);
            dp1perp = cross(normalVector, dp1);

            // * -1 due to being a LH coordinate system
            T = (dp2perp * duv1.x + dp1perp * duv2.x) * -1;
            B = (dp2perp * duv1.y + dp1perp * duv2.y) * -1;

            invMax = rsqrt(max(dot(T, T), dot(B, B)));

            return float3x3(T * invMax, B * invMax, normalVector);
        }

        float GetHeight(float2 uv)
        {
            return shaderTexture.SampleLevel(sampleType, uv, 0).r * (21.229f + 8.2f);
        }

        float3 CalculateNormal(float3 normalVector, float3 viewVector, float2 uv)
        {
            float textureWidth, textureHeight, hL, hR, hD, hU;
            float3 texOffset, N;
            float3x3 TBN;

            shaderTexture.GetDimensions(textureWidth, textureHeight);
            texOffset = float3((1.0f / textureWidth), (1.0f / textureHeight), 0.0f);

            hL = GetHeight(uv - texOffset.xz);
            hR = GetHeight(uv + texOffset.xz);
            hD = GetHeight(uv + texOffset.zy);
            hU = GetHeight(uv - texOffset.zy);

            N = normalize(float3((hL - hR), (hU - hD), 2.0f));

            TBN = GetTBNMatrix(normalVector, -viewVector, uv);

            return mul(TBN, N);
        }

        float4 MarsPixelShader(PixelInputType input) : SV_TARGET
        {
            float3 normal;
            float lightIntensity, color;
            float4 finalColor;

            normal = normalize(CalculateNormal(normalize(input.normal), normalize(input.viewVector), input.mapCoord));

            lightIntensity = saturate(dot(normal, normalize(-lightDirection)));
            color = saturate(diffuseColor * lightIntensity);

            return float4(normal.rgb, 1.0f); // float4(color, color, color, 1.0f);
        }

    Hope anyone can help shine some light on this problem for me. Best Regards and Thanks in advance Toastmastern
  4. Toastmastern

    Weirdness regarding Z-buffer/Pixel shader

    I was able to solve it. The issue was with my blend state. This question got me in the right direction: https://gamedev.stackexchange.com/questions/107866/directx11-alphablending-rendering-problem My blend state code now looks like this:

        blendStateDesc.RenderTarget[0].SrcBlend = D3D11_BLEND_SRC_ALPHA;
        blendStateDesc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA;

    instead of:

        blendStateDesc.RenderTarget[0].SrcBlend = D3D11_BLEND_ONE;
        blendStateDesc.RenderTarget[0].DestBlend = D3D11_BLEND_ONE;

    I believe it mainly comes down to .DestBlend, because of the way my pixel color is blended. But now it works, anyway. Hope this thread can help anyone else in need. //Toastmastern
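    For completeness, a full D3D11 blend-state setup for standard (non-premultiplied) alpha blending might look roughly like this. It's a sketch assuming a single render target; every field other than the two lines quoted above uses the usual defaults for this kind of blending and is not taken from the original post:

        D3D11_BLEND_DESC blendStateDesc = {};
        blendStateDesc.RenderTarget[0].BlendEnable = TRUE;
        blendStateDesc.RenderTarget[0].SrcBlend = D3D11_BLEND_SRC_ALPHA;      // weight source by its alpha
        blendStateDesc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA; // weight destination by 1 - alpha
        blendStateDesc.RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD;
        blendStateDesc.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE;
        blendStateDesc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ZERO;
        blendStateDesc.RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD;
        blendStateDesc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

        ID3D11BlendState* blendState = nullptr;
        HRESULT hr = device->CreateBlendState(&blendStateDesc, &blendState);

        // Bind it before drawing the UI/pointer:
        // context->OMSetBlendState(blendState, nullptr, 0xffffffff);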
  5. Toastmastern

    Weirdness regarding Z-buffer/Pixel shader

    I started off with that, and just to be sure I went back to it now, and it has the same effect. Interestingly enough, if I change the red color to black in the image instead, it doesn't appear on the screen at all, even though I commented out all manipulation in the pixel shader. This is how I get the texture into the application:

        hResult = CreateWICTextureFromFile(device, fileName, &mMousePointerResource, &mMousePointerResourceView);
        if (FAILED(hResult))
        {
            return false;
        }

    //Toastmastern
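    For context, CreateWICTextureFromFile comes from the DirectX Tool Kit's WICTextureLoader, and it requires COM to be initialized on the calling thread. A minimal usage sketch; the function name and the "pointer.png" file name are placeholders, not from the original code:

        #include <windows.h>
        #include <wrl/client.h>
        #include "WICTextureLoader.h" // DirectX Tool Kit

        using Microsoft::WRL::ComPtr;

        bool LoadPointerTexture(ID3D11Device* device)
        {
            // COM must be initialized once per thread before using the WIC loader.
            CoInitializeEx(nullptr, COINIT_MULTITHREADED);

            ComPtr<ID3D11Resource> resource;
            ComPtr<ID3D11ShaderResourceView> srv;
            HRESULT hr = DirectX::CreateWICTextureFromFile(device, L"pointer.png",
                                                           resource.GetAddressOf(),
                                                           srv.GetAddressOf());
            return SUCCEEDED(hr);
        }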
  6. Hello everyone, I'm looking for some advice, since I have some issues with the texture for my mouse pointer and I'm not sure where to start looking. I have checked everything I know of, and now I need advice on what to look for in my code when I try to fix it. I have a planet that is rendered, a UI that is rendered, and a mouse pointer that is rendered. First the planet is rendered, then the UI, and then the mouse pointer last. When the planet is done rendering, I turn off the Z-buffer and enable alpha blending while I render the UI and the mouse pointer. In the mouse pointer's pixel shader I look for black color, and if a pixel is black I blend it away. But what seems to happen is that it also blends parts of the texture that aren't supposed to be blended. I have some screenshots of the effect. In the first image you can see that the mouse pointer changes to a more white color when it is in front of the planet. The correct color is the one displayed when it's not in front of the planet. The second thing I find weird is that the mouse pointer is behind the UI text, even though it is rendered after it. I also tried switching them around, and it makes no difference. The UI doesn't have the same issue when it is above the planet; its color is displayed as it should be. Here is the pixel shader code, if that helps anyone get a better grip on the issue:

        float4 color;

        color = shaderTexture.Sample(sampleType, input.tex);

        if (color.b == 0.0f && color.r == 0.0f && color.g == 0.0f)
        {
            color.a = 0.0f;
        }
        else
        {
            color.a = 1.0f;
        }

        return color;

    The UI uses almost the same code but only checks the r channel of the color; I'm using all three channels for the mouse pointer because its colors might be a bit more off. The idea is that if a pixel is black, it should be blended away. And it does work, but somehow it also does something to the parts that shouldn't be blended. Right now I'm leaning towards something being wrong in the pixel shader, since I can set all pixels to white and it behaves as it should, creating a white box. Any pointers on what kind of issue I'm looking at here, and what to search for to find a solution, would be appreciated a lot. Best Regards and Thanks in Advance Toastmastern
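    Not from the original post, but since the symptoms described depend heavily on the pipeline state bound while the UI and pointer are drawn, here is a rough sketch of the usual D3D11 state changes for such an overlay pass; the state object names are assumptions:

        // Disable depth testing and enable alpha blending for the overlay pass.
        // depthDisabledState and alphaBlendState are assumed to have been created
        // earlier with CreateDepthStencilState / CreateBlendState.
        context->OMSetDepthStencilState(depthDisabledState, 1);
        context->OMSetBlendState(alphaBlendState, nullptr, 0xffffffff);

        // Draw the UI, then the pointer; with depth off, the later draw should
        // land on top, so if it doesn't, the bound state is worth checking.
        // DrawUI(context); DrawPointer(context);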
  7. Toastmastern

    Dynamic Vertex buffer

    Turned out the second part of what you said is what I did wrong. I now don't set any data in the buffer when I initialize it. Before rendering, I then map and unmap the vertex and index buffers to update them. I did it because I thought the bottleneck of my game was that it reinitialized the buffers every time instead of having them dynamic. Turns out that wasn't the issue; the FPS is still really low for 15k vertices. I'm thinking it has to do with the way I subdivide my sphere every time the camera is moved. Anyone have any idea on which way to move forward? Maybe I should implement a way to check whether all of the vertices really need to be updated; no idea how to do that yet, though. I'm also going to use a profiler tonight to try and see where my bottleneck actually lies. //Toastmastern
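    For reference, the per-frame update of a dynamic vertex buffer described above is normally done with Map/Unmap and D3D11_MAP_WRITE_DISCARD. A minimal sketch; the buffer, vertex-vector, and VertexType names are assumed:

        // Update a dynamic vertex buffer; WRITE_DISCARD hands back a fresh
        // memory region so the GPU is not stalled by the previous frame's data.
        D3D11_MAPPED_SUBRESOURCE mapped = {};
        HRESULT hr = context->Map(mVertexBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped);
        if (SUCCEEDED(hr))
        {
            memcpy(mapped.pData, vertices.data(), vertices.size() * sizeof(VertexType));
            context->Unmap(mVertexBuffer, 0);
        }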
  8. Hello everyone, after a few years' break from coding and my planet-render game, I'm giving it a go again from a different angle. What I'm struggling with now: I have created a frustum that works fine for now at least; it does what it's supposed to do, although not perfectly. But with the frustum came very low FPS, since what I'm doing right now, just to see whether the frustum works, is to recreate the vertex buffer every frame in which the camera moves. This is of course very costly and not the way to do it. That's why I'm now trying to learn how to create a dynamic vertex buffer instead, and to map and unmap the vertices. In the end my goal is to update only the part of the vertex buffer that is needed, but one step at a time ^^ So below is the code I use to create the dynamic buffer. The issue is that I want the vertex buffer to be big enough to handle more vertices than just mPlanetMesh.vertices.size(), since more vertices will be added later when I start to do LOD and such; the first render isn't the biggest one I will need.

        vertexBufferDesc.Usage = D3D11_USAGE_DYNAMIC;
        vertexBufferDesc.ByteWidth = mPlanetMesh.vertices.size();
        vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
        vertexBufferDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
        vertexBufferDesc.MiscFlags = 0;
        vertexBufferDesc.StructureByteStride = 0;

        vertexData.pSysMem = &mPlanetMesh.vertices[0];
        vertexData.SysMemPitch = 0;
        vertexData.SysMemSlicePitch = 0;

        result = device->CreateBuffer(&vertexBufferDesc, &vertexData, &mVertexBuffer);
        if (FAILED(result))
        {
            return false;
        }

    What happens is that the line

        result = device->CreateBuffer(&vertexBufferDesc, &vertexData, &mVertexBuffer);

    crashes with an access violation. When I set it to vertices.size() it works without issues, but when I try to set it to something like vertices.size() * 2 it crashes. I googled my eyes dry tonight but can't seem to find people with the same kind of issue; I've read that the vertex buffer can be bigger if needed. What am I doing wrong here? Best Regards and Thanks in advance Toastmastern
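    A hedged note on the snippet above: when pInitialData is supplied, CreateBuffer copies ByteWidth bytes from pSysMem, so a ByteWidth larger than the provided vertex array reads past its end — and ByteWidth is measured in bytes, so it normally also needs a sizeof(vertex) factor. A sketch of creating an oversized dynamic buffer and filling it later with Map/Unmap; maxVertexCount and VertexType are assumed names:

        // Create the buffer with no initial data so ByteWidth can exceed the
        // current vertex count; fill it afterwards with Map/Unmap.
        D3D11_BUFFER_DESC vertexBufferDesc = {};
        vertexBufferDesc.Usage = D3D11_USAGE_DYNAMIC;
        vertexBufferDesc.ByteWidth = maxVertexCount * sizeof(VertexType); // bytes, not elements
        vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
        vertexBufferDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;

        HRESULT result = device->CreateBuffer(&vertexBufferDesc, nullptr, &mVertexBuffer);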
  9. So it's been a while since I took a break from my whole creating-a-planet-in-DX11 project. Last time around I got stuck on getting a nice LOD working. A week or so back I got help finding this: https://github.com/sp4cerat/Planet-LOD In general, this is what I'm trying to recreate in DX11; the author of that planet LOD uses OpenGL, but that is a minor issue and something I can solve. But I have a question regarding the code. He gets the position using this row:

        vec4d pos = b.var.vec4d["position"];

    which is then used further down when he sends the variable "center" into the drawing function:

        if (pos.len() < 1) pos.norm();
        world::draw(vec3d(pos.x, pos.y, pos.z));

    Inside the draw function this happens:

        draw_recursive(p3[0], p3[1], p3[2], center);

    Basically the three vertices of the triangle, plus the center of detail that he sent as a parameter earlier: vec3d(pos.x, pos.y, pos.z). Now for my real question. He does

        vec3d edge_center[3] = { (p1 + p2) / 2, (p2 + p3) / 2, (p3 + p1) / 2 };

    to get the center of each edge; nothing weird there. But this is used later on with:

        vec3d d = center + edge_center;
        edge_test = d.len() > ratio_size;

    edge_test is then used to evaluate whether a triangle should be drawn, or whether it should be split up into three new triangles instead. Why does this work for him? Shouldn't it be something like center - edge_center? Why add them together? I assume that the center here is the center of detail for the LOD: the position of the camera if it stood on the ground of the planet, rather than up in the air like it is now. The full code can be seen here: https://github.com/sp4cerat/Planet-LOD/blob/master/src.simple/Main.cpp If anyone would like to take a look and help me understand this code, I would love that. I'm running out of ideas on how to solve this in my own head; most likely I've twisted it one time too many. Thanks in advance Toastmastern
  10. Toastmastern

    Normal mapping in Shader

    Almost got it working now. The only issue left is that the sun seems to follow the camera, so that when I rotate and move around, the sun moves along with me. I've narrowed it down to the normal moving with the camera for some reason, even though I don't apply any transform to it. Here is the updated code in the domain shader:

        float3 vertexPosition;
        float3 sphereNormal;
        float2 heightMapCoord;
        float2 colorMapCoord;
        PixelInputType output;
        float heightMapSample;
        float4 colorMapSample;
        float3 normalMapSample;
        float3 normal;

        heightMapCoord = uvwCoord.x * patch[0].heightMapCoord + uvwCoord.y * patch[1].heightMapCoord + uvwCoord.z * patch[2].heightMapCoord;
        colorMapCoord = uvwCoord.x * patch[0].colorMapCoord + uvwCoord.y * patch[1].colorMapCoord + uvwCoord.z * patch[2].colorMapCoord;
        vertexPosition = uvwCoord.x * patch[0].position + uvwCoord.y * patch[1].position + uvwCoord.z * patch[2].position;
        sphereNormal = uvwCoord.x * patch[0].sphereNormal + uvwCoord.y * patch[1].sphereNormal + uvwCoord.z * patch[2].sphereNormal;

        heightMapSample = heightMapTexture.SampleLevel(sampleType, heightMapCoord, 0);
        colorMapSample = colorMapTexture.SampleLevel(colorSampleType, colorMapCoord, 0);
        normalMapSample = normalMapTexture.SampleLevel(normalSampleType, heightMapCoord, 0).rgb;

        normal = normalize((normalMapSample * 2) - 1);

        vertexPosition.x = vertexPosition.x + (sphereNormal.x * ((heightMapSample * 29429.0f) - 8200.0f));
        vertexPosition.y = vertexPosition.y + (sphereNormal.y * ((heightMapSample * 29429.0f) - 8200.0f));
        vertexPosition.z = vertexPosition.z + (sphereNormal.z * ((heightMapSample * 29429.0f) - 8200.0f));

        output.position = mul(float4(vertexPosition, 1.0f), worldMatrix);
        output.position = mul(output.position, viewMatrix);
        output.position = mul(output.position, projectionMatrix);

        //output.normal = mul(float4(normal, 0.0f), viewMatrix);
        output.normal = float4(normal, 0.0f);

        output.texCoord = heightMapCoord;
        output.color = colorMapSample;

        return output;

    Notice that the normal is just sampled from the texture and then sent to the pixel shader as a direction, not a position; I believe that's the way to do it, with 0.0f instead of 1.0f as the last component. Here comes the pixel shader code. What I've tried here is to just set the color to the normal value, and I can see that it changes with the camera movement, which leads me to believe something is funky about the normal.

        float3 sunLightDir;
        float sunLightIntensity;
        float4 sunLightColor;
        float4 color = input.color;
        float3 normal = input.normal;

        float3 dp1 = ddx_fine(input.position);
        float3 dp2 = ddy_fine(input.position);
        float2 duv1 = ddx_fine(input.texCoord);
        float2 duv2 = ddy_fine(input.texCoord);

        float3 dp2perp = cross(dp2, normal);
        float3 dp1perp = cross(normal, dp1);

        float3 T = dp2perp * duv1.x + dp1perp * duv2.x;
        float3 B = dp2perp * duv1.y + dp1perp * duv2.y;

        float invmax = rsqrt(max(dot(T, T), dot(B, B)));
        float3x3 TBN = float3x3(T * invmax, B * invmax, normal);

        normal = mul(normal, TBN);

        sunLightDir = -lightDirection;
        sunLightIntensity = saturate(dot(normal, sunLightDir));
        sunLightColor = saturate(diffuseColor * sunLightIntensity);

        color = sunLightColor * color;

        return color;
  11. Toastmastern

    Normal mapping in Shader

    Found some errors in my code while sitting at work:

        normal.x = (normalMapSample.r * 2) - 1;
        normal.y = (normalMapSample.g * 2) - 1;
        normal.b = (normalMapSample.b * 2) - 1;

    I need to normalize this vector. And here:

        normal = mul(TBN, normalMappingNormal);

    I need to use normal instead of normalMappingNormal. I remember trying a lot of different things last night; I hope these things weren't part of those changes :P //Toastmastern
  12. I've been struggling with this part for a good week now. I think I'm really close but missing something vital. What I have is a planet, and for this planet I have a height map and a normal map. What I'm trying to achieve is to map the normal around the sphere that makes up my planet. I've read up on the TBN matrix and how to calculate it. Everything I'm explaining in this post takes place in the domain shader. I know that the normal is correct, since I get some shadow from the light on it, but it looks weird, like 2D in a 3D world, if you know what I mean. I will split up the shader code first and then post it all at the end. First I take three points on my height map:

        heightMapCoord1 = uvwCoord.x * patch[0].heightMapCoord + uvwCoord.y * patch[1].heightMapCoord + uvwCoord.z * patch[2].heightMapCoord;
        heightMapCoord2.x = ((heightMapCoord1.x * 46080.0f) + 1.0f) / 46080.0f;
        heightMapCoord2.y = heightMapCoord1.y;
        heightMapCoord3.x = heightMapCoord1.x;
        heightMapCoord3.y = ((heightMapCoord1.y * 22528.0f) + 1.0f) / 22528.0f;

    I then sample the height map to get three different Z values for my future vectors:

        heightMapSample1 = heightMapTexture.SampleLevel(sampleType, heightMapCoord1, 0);
        heightMapSample2 = heightMapTexture.SampleLevel(sampleType, heightMapCoord2, 0);
        heightMapSample3 = heightMapTexture.SampleLevel(sampleType, heightMapCoord3, 0);

    The next step is to create the three vectors:

        normalMappingVector1 = float3(heightMapCoord1.x, heightMapCoord1.y, heightMapSample1);
        normalMappingVector2 = float3(heightMapCoord2.x, heightMapCoord2.y, heightMapSample2);
        normalMappingVector3 = float3(heightMapCoord3.x, heightMapCoord3.y, heightMapSample3);

    I then cross the vectors 2-1 and 3-1:

        normalMappingNormal = normalize(cross((normalMappingVector2 - normalMappingVector1), (normalMappingVector3 - normalMappingVector1)));

    The code that follows calculates the tangent:

        coef = 1 / ((heightMapCoord2.x * heightMapCoord3.y) - (heightMapCoord3.x * heightMapCoord2.y));
        normalMappingTangent.x = coef * ((normalMappingVector2.x * heightMapCoord3.y) + (normalMappingVector3.x * -heightMapCoord2.y));
        normalMappingTangent.y = coef * ((normalMappingVector2.y * heightMapCoord3.y) + (normalMappingVector3.y * -heightMapCoord2.y));
        normalMappingTangent.z = coef * ((normalMappingVector2.z * heightMapCoord3.y) + (normalMappingVector3.z * -heightMapCoord2.y));

    Then I normalize it and calculate the bitangent:

        normalMappingTangent = normalize(normalMappingTangent);
        normalMappingBiTangent = normalize(cross(normalMappingNormal, normalMappingTangent));

    I then combine the three vectors I have into a matrix and multiply my normal by it (this normal is taken from the normal map, not calculated):

        float3x3 TBN = float3x3(normalMappingTangent, normalMappingBiTangent, normalMappingNormal);
        normal = mul(TBN, normal);

    Here is the full shader code:

        Texture2D<float> heightMapTexture;
        Texture2D colorMapTexture;
        Texture2D normalMapTexture;

        SamplerState sampleType;
        SamplerState colorSampleType;
        SamplerState normalSampleType;

        cbuffer MatrixBuffer
        {
            matrix worldMatrix;
            matrix viewMatrix;
            matrix projectionMatrix;
        };

        cbuffer LightBuffer
        {
            float4 diffuseColor;
            float3 lightDirection;
            float padding;
        };

        struct ConstantOutputType
        {
            float edges[3] : SV_TessFactor;
            float inside : SV_InsideTessFactor;
        };

        struct HullOutputType
        {
            float3 position : POSITION;
            float4 color : COLOR;
            float3 sphereNormal : NORMAL;
            float2 heightMapCoord : TEXCOORD0;
            float2 colorMapCoord : TEXCOORD1;
        };

        struct PixelInputType
        {
            float4 position : SV_POSITION;
            float4 color : COLOR;
        };

        [domain("tri")]
        PixelInputType PlanetDomainShader(ConstantOutputType input, float3 uvwCoord : SV_DomainLocation, const OutputPatch<HullOutputType, 3> patch)
        {
            float3 vertexPosition;
            float3 sphereNormal;
            float2 heightMapCoord1;
            float2 heightMapCoord2;
            float2 heightMapCoord3;
            float2 colorMapCoord;
            PixelInputType output;
            float heightMapSample1;
            float heightMapSample2;
            float heightMapSample3;
            float4 colorMapSample;
            float3 normalMapSample;
            float3 sunLightDir;
            float sunLightIntensity;
            float4 sunLightColor;
            float3 normalMappingVector1;
            float3 normalMappingVector2;
            float3 normalMappingVector3;
            float3 normalMappingNormal;
            float coef;
            float3 normalMappingTangent;
            float3 normalMappingBiTangent;
            float3 normal;

            sunLightDir = -lightDirection;

            heightMapCoord1 = uvwCoord.x * patch[0].heightMapCoord + uvwCoord.y * patch[1].heightMapCoord + uvwCoord.z * patch[2].heightMapCoord;
            heightMapCoord2.x = ((heightMapCoord1.x * 46080.0f) + 1.0f) / 46080.0f;
            heightMapCoord2.y = heightMapCoord1.y;
            heightMapCoord3.x = heightMapCoord1.x;
            heightMapCoord3.y = ((heightMapCoord1.y * 22528.0f) + 1.0f) / 22528.0f;

            colorMapCoord = uvwCoord.x * patch[0].colorMapCoord + uvwCoord.y * patch[1].colorMapCoord + uvwCoord.z * patch[2].colorMapCoord;

            vertexPosition = uvwCoord.x * patch[0].position + uvwCoord.y * patch[1].position + uvwCoord.z * patch[2].position;
            sphereNormal = uvwCoord.x * patch[0].sphereNormal + uvwCoord.y * patch[1].sphereNormal + uvwCoord.z * patch[2].sphereNormal;

            heightMapSample1 = heightMapTexture.SampleLevel(sampleType, heightMapCoord1, 0);
            heightMapSample2 = heightMapTexture.SampleLevel(sampleType, heightMapCoord2, 0);
            heightMapSample3 = heightMapTexture.SampleLevel(sampleType, heightMapCoord3, 0);

            colorMapSample = colorMapTexture.SampleLevel(colorSampleType, colorMapCoord, 0);
            normalMapSample = normalMapTexture.SampleLevel(normalSampleType, heightMapCoord1, 0).rgb;

            normal.x = (normalMapSample.r * 2) - 1;
            normal.y = (normalMapSample.g * 2) - 1;
            normal.b = (normalMapSample.b * 2) - 1;

            normalMappingVector1 = float3(heightMapCoord1.x, heightMapCoord1.y, heightMapSample1);
            normalMappingVector2 = float3(heightMapCoord2.x, heightMapCoord2.y, heightMapSample2);
            normalMappingVector3 = float3(heightMapCoord3.x, heightMapCoord3.y, heightMapSample3);

            normalMappingNormal = normalize(cross((normalMappingVector2 - normalMappingVector1), (normalMappingVector3 - normalMappingVector1)));

            coef = 1 / ((heightMapCoord2.x * heightMapCoord3.y) - (heightMapCoord3.x * heightMapCoord2.y));
            normalMappingTangent.x = coef * ((normalMappingVector2.x * heightMapCoord3.y) + (normalMappingVector3.x * -heightMapCoord2.y));
            normalMappingTangent.y = coef * ((normalMappingVector2.y * heightMapCoord3.y) + (normalMappingVector3.y * -heightMapCoord2.y));
            normalMappingTangent.z = coef * ((normalMappingVector2.z * heightMapCoord3.y) + (normalMappingVector3.z * -heightMapCoord2.y));

            normalMappingTangent = normalize(normalMappingTangent);
            normalMappingBiTangent = normalize(cross(normalMappingNormal, normalMappingTangent));

            vertexPosition.x = vertexPosition.x + (sphereNormal.x * ((heightMapSample1 * 29429.0f) - 8200.0f));
            vertexPosition.y = vertexPosition.y + (sphereNormal.y * ((heightMapSample1 * 29429.0f) - 8200.0f));
            vertexPosition.z = vertexPosition.z + (sphereNormal.z * ((heightMapSample1 * 29429.0f) - 8200.0f));

            output.position = mul(float4(vertexPosition, 1.0f), worldMatrix);
            output.position = mul(output.position, viewMatrix);
            output.position = mul(output.position, projectionMatrix);

            float3x3 TBN = float3x3(normalMappingTangent, normalMappingBiTangent, normalMappingNormal);
            TBN = transpose(TBN);
            normal = mul(TBN, normalMappingNormal);

            sunLightIntensity = saturate(dot(normal, sunLightDir));
            sunLightColor = saturate(diffuseColor * sunLightIntensity);

            output.color = sunLightColor * colorMapSample;

            return output;
        }

    I hope anyone can recognize what I'm trying to do here. I've read on many sites that precalculating the normal, tangent, and bitangent is easier on the GPU, but I'm dead set on getting this to work. Worth noting is that I have taken this code from http://www.fabiensanglard.net/bumpMapping/index.php and am trying to combine it with other things I've read:

        generateNormalAndTangent(float3 v1, float3 v2, text2 st1, text2 st2)
        {
            float3 normal = v1.crossProduct(v2);

            float coef = 1 / (st1.u * st2.v - st2.u * st1.v);
            float3 tangent;

            tangent.x = coef * ((v1.x * st2.v) + (v2.x * -st1.v));
            tangent.y = coef * ((v1.y * st2.v) + (v2.y * -st1.v));
            tangent.z = coef * ((v1.z * st2.v) + (v2.z * -st1.v));

            float3 binormal = normal.crossProduct(tangent);
        }

    Thanks in advance Toastmastern
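    Since the post mentions that precalculating the normal, tangent, and bitangent is supposed to be easier on the GPU, here is a rough CPU-side sketch of that per-triangle calculation in C++, using the standard edge/delta-UV form of the same math as the snippet above. The float3/float2 structs and helper functions are assumptions, not from the original code:

        #include <cmath>

        struct float3 { float x, y, z; };
        struct float2 { float u, v; };

        static float3 sub(float3 a, float3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
        static float3 cross(float3 a, float3 b) { return { a.y * b.z - a.z * b.y,
                                                           a.z * b.x - a.x * b.z,
                                                           a.x * b.y - a.y * b.x }; }
        static float3 normalize(float3 a)
        {
            float len = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
            return { a.x / len, a.y / len, a.z / len };
        }

        // Per-triangle tangent basis from positions p1..p3 and UVs uv1..uv3.
        void ComputeTangentBasis(float3 p1, float3 p2, float3 p3,
                                 float2 uv1, float2 uv2, float2 uv3,
                                 float3& normal, float3& tangent, float3& bitangent)
        {
            float3 e1 = sub(p2, p1); // first triangle edge
            float3 e2 = sub(p3, p1); // second triangle edge
            float du1 = uv2.u - uv1.u, dv1 = uv2.v - uv1.v;
            float du2 = uv3.u - uv1.u, dv2 = uv3.v - uv1.v;

            normal = normalize(cross(e1, e2));

            float coef = 1.0f / (du1 * dv2 - du2 * dv1);
            tangent = normalize(float3{ coef * (e1.x * dv2 - e2.x * dv1),
                                        coef * (e1.y * dv2 - e2.y * dv1),
                                        coef * (e1.z * dv2 - e2.z * dv1) });
            bitangent = normalize(cross(normal, tangent));
        }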
  13. Toastmastern

    Sunlight theory

    So basically, the distance to the "sun" makes calculating a per-vertex light direction unnecessary? I can just say that the light is aimed at position (0, 0, 0)? Or should I choose a position way behind the planet? //Toastmastern
  14. Toastmastern

    Sunlight theory

    I was thinking today about how to simulate my sun, and in theory, if the sun is positioned at the same height (y-position) as my planet, could I just calculate the light direction as in this splendid paint image I created: http://imgur.com/7R25Joo Or what is the theory for sunlight hitting a planet? //Toastmastern
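    For context on the question above: because the sun is so far away, its rays are effectively parallel across the whole planet, so the usual model is a single directional-light vector shared by every vertex. A minimal sketch using DirectXMath; the function and parameter names are hypothetical:

        #include <DirectXMath.h>
        using namespace DirectX;

        // Compute one light direction for the whole planet, assuming the sun
        // is distant enough that its rays are parallel: point from the sun
        // toward the planet's center.
        XMVECTOR ComputeSunDirection(FXMVECTOR sunPos, FXMVECTOR planetCenter)
        {
            return XMVector3Normalize(XMVectorSubtract(planetCenter, sunPos));
        }

        // Usage: XMVECTOR lightDir = ComputeSunDirection(sunPos, XMVectorZero());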
  15. Toastmastern

    texture still show line between vertices

    Thanks for all the help in this thread. The answer turned out to be linked to the normals. I do indeed have several vertices in the same spot, but their normals were different, so when I added the height they split apart. The error was that the x, y, and z of patch[0].sphereNormal weren't recalculated after the tessellation. I don't have the code here, since I'm at work, but I will post it tonight.