About Decibit

  1. Though I haven't tried tessellation in OpenGL, I'm quite sure that a geometry shader that expects adjacency information is invalid while the tessellator is active in Direct3D. ([url=""]MSDN[/url]) Both APIs use the same hardware and generally tend to converge, so I would expect that the trick you want to achieve is not possible. If anyone has a good workaround, I would be interested to know about it myself.
  2. The foundation of many [i]serious[/i] CAD systems is BREP (boundary representation). This is a combination of data structures and techniques used to design complex 3D objects, which may combine many basic forms: cubes, spheres, cylinders, Bézier and NURBS patches, sweep surfaces, etc. A good book that covers the topic in great detail is [url=""]Boundary Representation Modelling Techniques[/url] by Ian Stroud. The set of operations possible with BREP is rich and can be quite difficult to grasp. You can use the [url=""]OpenCASCADE[/url] library as a playground for learning BREP: construct basic objects, apply the operations, and see what they do.
  3. After some manual checking of the boundary pixels, I've discovered that my incoming texture coordinates were shifted by 0.5. As a result, the larger heightmap (used to calculate the normals in the boundary regions) was sampled incorrectly. So the shader from [b]ComputeNormals.fx[/b] was performing correctly but was unfortunately processing the wrong input data. Problem solved! Syranide, thanks for your help with the GPU clipmaps and for the general terrain rendering tips!
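A minimal sketch of how a stray 0.5 shift breaks texel addressing (the texture size and the texel-center convention here are assumptions for illustration, not the poster's actual code):

```python
def texel_uv(i, size, shift=0.0):
    """Texture coordinate of texel i under the texel-center convention;
    a stray 0.5 shift moves every lookup off the texel's center and onto
    the boundary with its neighbor."""
    return (i + 0.5 + shift) / size

size = 128
correct = texel_uv(0, size)             # samples the center of texel 0
shifted = texel_uv(0, size, shift=0.5)  # lands on the texel boundary
```

With linear filtering, the shifted coordinate blends two texels, which is exactly the kind of subtle misalignment that corrupts boundary samples.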
  4. [quote name='Syranide' timestamp='1308824101' post='4826703'] I would assume the discontinuity is because you are computing the normals, but you don't actually sample outside of the boundaries specified by the target texture (as is required)... although, the pattern on the edges doesn't really seem ordinary... perhaps the edge normals get computed from some irrelevant data outside the texture? I'm not sure if you've noticed it, but the edge at the top and the bottom are equal or near equal... it would seem like the edges wrap around when sampling the normals? That would also explain why the first image looks correct, because it tiles. [/quote] Well, it is quite likely that the sampling outside the boundaries gets messed up. I'm not sure I understand the paper correctly at this point. It is necessary to sample the neighboring points in order to compute the normal at a given clipmap point, but where should the data for the points along the boundaries come from? I have no idea right now. No, I hadn't noticed that the texture tiles and that the edges at the top and bottom are nearly equal. Thanks for the tip! I have rewritten my code a little bit and can now use an (almost) arbitrary clipmap resolution. I'm going to make the textures 8 x 8 instead of 128 x 128 and manually check the calculation of the edge texel values.
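For reference, here is a CPU sketch of central-difference normals from a heightmap. Clamping indices at the edges stands in for the missing out-of-texture neighbor data; wherever that stand-in is wrong (as in the shader discussed above), a seam appears along the border:

```python
import math

def heightmap_normals(height, spacing):
    """Central-difference normals for a heightmap given as a list of rows.
    Index clamping at the borders is only a stand-in for real neighbor
    data from outside the texture; a real clipmap needs actual samples."""
    rows, cols = len(height), len(height[0])
    clamp = lambda v, hi: max(0, min(v, hi))
    normals = []
    for y in range(rows):
        row = []
        for x in range(cols):
            dhdx = (height[y][clamp(x + 1, cols - 1)]
                    - height[y][clamp(x - 1, cols - 1)]) / (2.0 * spacing)
            dhdy = (height[clamp(y + 1, rows - 1)][x]
                    - height[clamp(y - 1, rows - 1)][x]) / (2.0 * spacing)
            inv = 1.0 / math.sqrt(dhdx * dhdx + dhdy * dhdy + 1.0)
            row.append((-dhdx * inv, -dhdy * inv, inv))
        normals.append(row)
    return normals
```

On a flat heightmap every normal is (0, 0, 1); on a uniform ramp the interior normals tilt against the slope, while the clamped border texels get a halved gradient, which is one way such border artifacts arise.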
  5. Additional info: the normal maps calculated for all LODs. The X and Y components are encoded in the red and green color channels; the Z component is supposed to always be 1. [attachment=3421:normals_level3.png] [attachment=3420:normals_level2.png] [attachment=3419:normals_level1.png] [attachment=3418:normals_level0.png] It is obvious that all the textures except the first one have a strange discontinuity along the border. The reason for this is still not clear to me.
  6. Hi Syranide! Thanks for the reply! [quote name='Syranide' timestamp='1308258148' post='4824226'] Anyway, as for your actual problem, it seems like you forgot to consider the X/Y-scaling of the different heightmap LODs as you compute your normals... remember, one LOD further away, means each quad is twice as wide (but not twice as high!), which also means you have to consider this when computing your normals. [/quote] You've spotted the problem precisely! The XY-scaling doubles as the LODs get farther away, and my [b]ScaleFac[/b] variable doesn't take that into account. Unfortunately, the error was only in the problem description I posted here; the code snippet was wrong. Sorry! In fact I calculate [b]ScaleFac[/b] as [code] uniform float2 ScaleFac = float2(-0.5/exp2(L),-0.5/exp2(L)); // L is the clipmap index, increasing for the LODs farther away (exp2 is used because ^ is bitwise XOR in HLSL, not a power) [/code] For the posted image I'm drawing 4 LODs, with the squares of the closest LOD being 1x1 units big (2x2 units for the next LOD, then 4x4 and finally 8x8). [quote name='Syranide' timestamp='1308258148' post='4824226'] First off, I would recommend that you implement your normals using textures instead, this will allow you to vastly improve the graphical quality of your terrain, without increasing the vertex count. You can easily use normal maps with 2-3 times higher resolution (than the heightmap), and it's really really cheap. [/quote] I'm not sure how to combine this LOD method with normal maps, since the LOD heightmaps are calculated at run time on the GPU, with only incremental updates provided by the CPU as the camera moves. Normal-mapping the LODs at double or triple resolution would require some efficient update scheme. Right now I'm just trying to get the basic method to work. [quote name='Syranide' timestamp='1308258148' post='4824226'] EDIT: I'm not sure what's going on in your wireframe-screenshot... it look likes the LODs get coarser and coarser, and suddenly, some of them get finer... and then coarser again?... those look like the LODs that are incorrectly colored as well. [/quote] The LODs don't get finer, only coarser. I'm drawing 4 LOD rings, and for each successive ring the size of the squares is doubled. They may seem to get finer, but that's just the screenshot. The incorrectly lit area is exactly at the border between two rings. Anyway, after your hint I've double-checked my normal scaling factors. They seem to be correct, but the problem is not solved. I'm still looking for help.
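The scaling argument above can be sketched numerically (the LOD count and the 1x1-unit closest quad are taken from the post; the function name is mine):

```python
def scale_fac(lod):
    """One LOD farther away doubles the quad width, so the
    finite-difference step doubles and the gradient scale halves."""
    return -0.5 / 2 ** lod

quad_sizes = [2 ** lod for lod in range(4)]    # quad width per LOD ring
scales = [scale_fac(lod) for lod in range(4)]  # per-LOD gradient scale
```

If this per-LOD halving is missing, the normals of every ring but the first come out too steep, which would show up exactly at the ring borders.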
  7. Hi Gamedevers! My question is about GPU clipmaps, a terrain rendering method presented in the [url=""]GPU Clipmaps (GPU Gems 2)[/url] paper. I've managed to load and update the heightmap and to handle terrain transitions (blending between a coarse and a fine level). The normal map generation seems to work fine everywhere except at the clipmap level borders, where you can see a terrible texture/lighting seam. [attachment=3215:02.PNG] I think that GPU clipmaps are a popular rendering method, and at least several people here seem to have successfully implemented it, so I hope someone can give me a clue how to get rid of this normal map error. I'm using the original implementation [b]ComputeNormals.fx[/b] presented [url=""]here[/url]. The uniform variables are initialized as follows: [code] uniform float Size = 128; // clipmap resolution uniform float OneOverSize = 1 / 128; uniform float2 Viewport = float2(128,128); // I update the entire texture uniform float2 ToroidalOrigin = float2(0,0); // no shift has been made yet uniform float2 TextureOffset = float2(0.25,0.25); // fine level position relative to the coarse ring origin // EDIT: //uniform float2 ScaleFac = float2(-0.5/127,-0.5/127); // -0.5 divided by the step length uniform float2 ScaleFac = float2(-0.5/exp2(L),-0.5/exp2(L)); // L is the clipmap index, increasing for the LODs that are farther away (exp2, since ^ is bitwise XOR in HLSL) // 4 LODs are rendered, L=0 for the closest LOD, L=3 for the farthest LOD[/code] Here are some screenshots showing the wireframe mode and the transition regions. I'm willing to provide further information if necessary. [attachment=3216:03.PNG] [attachment=3214:01.PNG] Please help!
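For reference, the uniform setup above can be tabulated like this (a plain sketch; the dictionary layout is mine, the values are from the post):

```python
SIZE = 128  # clipmap resolution; 4 LODs, L = 0 closest, L = 3 farthest

uniforms = {
    "Size": SIZE,
    "OneOverSize": 1.0 / SIZE,
    "Viewport": (SIZE, SIZE),        # the entire texture is updated
    "ToroidalOrigin": (0.0, 0.0),    # no toroidal shift has been made yet
    "TextureOffset": (0.25, 0.25),   # fine level vs. coarse ring origin
    # -0.5 divided by the step length, which doubles with each LOD:
    "ScaleFac": [(-0.5 / 2 ** L, -0.5 / 2 ** L) for L in range(4)],
}
```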
  8. Shader for sphere glow?

    RenderMonkey contains a nice glow effect sample in both HLSL and GLSL. I'm not sure if that's exactly what you're trying to achieve, but it looks cool and seems simple enough. The sample is called "Evil.rfx".
  9. So, right now 3DS Max provides you with the tangent frames. I suppose it calculates them with a method similar to the one described in the book that I used as a reference in my first reply. This method uses the vertex positions, normals, and texture coordinates as input; the vertex information comes from the teapot model itself. The problem is that you are using your own texture projector to look up the normal map. That means you are calculating a new set of texture coordinates, which are in general not the same as those used by 3DS Max to calculate the tangent frames. The old tangent frame vectors are therefore not valid for your (spherical) texture mapping. You can make a little test: try your normal map with a basic 3DS Max sphere mesh and you will see no seam; after that, try some custom textured low-poly model and you will probably see a mess instead of proper bump mapping. Possible solution: apply the spherical UV mapping to the teapot before rendering or exporting. Hope that helps.
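A minimal sketch of the kind of spherical projection such a texture projector might use (the exact convention is an assumption, not the poster's projector). The point is that tangent frames derived from the model's original UVs cannot match coordinates produced this way:

```python
import math

def spherical_uv(x, y, z):
    """One common spherical mapping: longitude -> u, latitude -> v.
    Any tangent frame built from the mesh's original UV set is unrelated
    to the UV gradients of this projection."""
    r = math.sqrt(x * x + y * y + z * z)
    u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)
    v = math.acos(y / r) / math.pi
    return u, v
```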
  10. Quote: Original post by Bosduif: "If I understand what your link talks about correctly, I have to either get my normal map in world space or my lighting vectors in tangent space, correct?" Yes, that's correct. But the problem is independent of that: how do you get your tangent frame vectors? Are you loading them with your 3D model or calculating everything from the vertex/triangle coordinates?
  11. I think that your tangent frame vectors have a discontinuity (see Diagram 3.7 here for more details). This is typical for vertices that are located at UV seams. To get rid of this nasty effect, you have to split the affected vertices.
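The splitting can be sketched like this (the data layout and function name are hypothetical, just to illustrate the idea): every distinct (vertex, UV) pair becomes its own output vertex, so the corners on either side of the seam stop sharing one tangent frame.

```python
def split_by_uv(corners):
    """corners: list of (vertex_index, uv) tuples, one per triangle corner.
    A vertex referenced with two different UVs sits on a UV seam and is
    duplicated so each side can carry its own tangent frame."""
    remap = {}
    vertices = []   # unique (vertex_index, uv) pairs
    indices = []    # new index for each corner
    for vi, uv in corners:
        key = (vi, uv)
        if key not in remap:
            remap[key] = len(vertices)
            vertices.append(key)
        indices.append(remap[key])
    return vertices, indices
```

Positions and normals would then be copied per duplicated vertex; only the tangents (and UVs) differ across the seam.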
  12. Quote: Original post by Hyunkel: "As I understand it, it should read from the stencil buffer, and check if the current value equals StencilReadMask, which is 1 in this case." It doesn't compare the value with the StencilReadMask. In fact, the value is compared to the stencil reference value, which is 0 by default and is set using the second parameter of the OMSetDepthStencilState method. The read mask is ANDed with both the stencil value and the reference before the comparison, so the operation looks like this: (stencil_value & read_mask) == (stencil_reference & read_mask). Set the mask to 0xffffffff if you want to utilize all possible stencil bits.
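A sketch of that comparison, assuming the EQUAL comparison function (the function name here is mine):

```python
def stencil_pass(stencil_value, read_mask, stencil_reference):
    """Direct3D 10-style stencil test with the EQUAL comparison function:
    the read mask is ANDed with both sides before comparing."""
    return (stencil_value & read_mask) == (stencil_reference & read_mask)
```

With the default reference of 0 and a mask of 1, only fragments whose lowest stencil bit is 0 would pass, which is why setting the reference (and widening the mask) matters.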
  13. Matrices Help.

    I suppose you mean D3DFVF_XYZRHW. Quote: DirectX Graphics Documentation: "D3DFVF_XYZRHW - Vertex format includes the position of a transformed vertex." This means that the vertices are considered already transformed by the application; Direct3D ignores the transformation matrices in this mode. Use D3DFVF_XYZ or D3DFVF_XYZ | D3DFVF_DIFFUSE instead. You may find more information on the reference page.
  14. Matrices Help.

    What vertex format have you set with IDirect3DDevice9::SetFVF? Do you provide the fourth vector element w for the triangle vertices?
  15. 3D Model importing

    Check out the FBX SDK. The downloadable package contains tutorials and samples. A lot of 3D modelling packages (3DS Max, Maya, Blender, etc.) can export to FBX, and the format was chosen as the primary model import format by the XNA developers. The SDK lets you open an FBX file and provides functions to query the model vertices, triangles, materials, textures, etc.