DX11 Is this texture seaming caused by sample type?

Hi, so I imported some new models into my engine, and some of them show up with ugly seams or dark patches, while others look perfect (see pictures). I'm using the same shader for all of them, and each model has a custom UV-mapped texture created for it that should wrap fully around the model, instead of using tiled textures. I have no idea why the custom UV-mapped textures map correctly on some models but not others.

Possible causes:

1. Am I using the wrong SamplerState to sample the textures? (I'm using SampleTypeClamp; see the sampler sketch below.)

2. The original models had quads and were UV mapped by an artist in that state; I then reimported them into 3ds Max and re-exported them as all triangles (my engine's object loader only accepts triangles).

3. Could the original model UVs just be wrong?

Please let me know if somebody can help identify this problem; I'm completely baffled. Thanks.
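
For context, here is a minimal sketch of the C++-side sampler setup the shader expects, with a clamp sampler on s0 and a wrap sampler on s1 to match the register() annotations (names and structure are illustrative, not my exact engine code):

#include <d3d11.h>

// Minimal sketch: create a clamp sampler and a wrap sampler and bind them to
// the slots the shader declares (s0 = SampleTypeClamp, s1 = SampleTypeWrap).
// The device/context are assumed to exist; error handling is omitted.
void CreateAndBindSamplers(ID3D11Device* device, ID3D11DeviceContext* context)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR;
    desc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP;
    desc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP;
    desc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP;
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MaxLOD = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* clampSampler = nullptr;
    device->CreateSamplerState(&desc, &clampSampler);

    // Same filtering, but wrap addressing for tiled/animated UVs.
    desc.AddressU = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW = D3D11_TEXTURE_ADDRESS_WRAP;

    ID3D11SamplerState* wrapSampler = nullptr;
    device->CreateSamplerState(&desc, &wrapSampler);

    // Bind both in one call, starting at slot 0.
    ID3D11SamplerState* samplers[2] = { clampSampler, wrapSampler };
    context->PSSetSamplers(0, 2, samplers);
}

With a fully unwrapped custom UV layout, clamp vs. wrap should only change behavior at the 0/1 edges of the texture (and wherever UVs are pushed past 1, as the texture translation in the shader does), so dark patches in the middle of a surface probably come from something else, like normals or lighting.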

For reference, here's a link to the shader used to draw the problematic models; the shader code is also pasted below.

https://github.com/mister51213/DirectX11Engine/blob/master/DirectX11Engine/Light_SoftShadows_ps.hlsl

/////////////
// DEFINES //
/////////////
#define NUM_LIGHTS 3

/////////////
// GLOBALS //
/////////////
// texture resource that will be used for rendering the texture on the model
Texture2D shaderTextures[7]; // NOTE - we only use one render target for drawing all the shadows here!
// Samplers control how texels are fetched: filtering, and how out-of-range UVs are addressed (wrap, clamp, etc.).
SamplerState SampleType;

///////////////////
// SAMPLE STATES //
///////////////////
SamplerState SampleTypeClamp : register(s0);
SamplerState SampleTypeWrap  : register(s1);

//////////////
// TYPEDEFS //
//////////////

// This structure is used to describe the lights properties
struct LightTemplate_PS
{
    int type;
    float3 padding;
    float4 diffuseColor;
    float3 lightDirection; //(lookat?) //@TODO pass from VS BUFFER?
    float specularPower;
    float4 specularColor;
};

//////////////////////
// CONSTANT BUFFERS //
//////////////////////
cbuffer SceneLightBuffer:register(b0)
{
    float4 cb_ambientColor;
    LightTemplate_PS cb_lights[NUM_LIGHTS];
}

// value set here will be between 0 and 1.
cbuffer TranslationBuffer:register(b1)
{
    float textureTranslation; //@NOTE = hlsl automatically pads floats for you
};

// for alpha blending textures
cbuffer TransparentBuffer:register(b2)
{
    float blendAmount;
};

struct PixelInputType
{
    float4 vertex_ModelSpace : SV_POSITION;
    float2 tex : TEXCOORD0;
    float3 normal : NORMAL;
    float3 tangent : TANGENT;
    float3 binormal : BINORMAL;
    float3 viewDirection : TEXCOORD1;
    float3 lightPos_LS[NUM_LIGHTS] : TEXCOORD2;
    float4 vertex_ScrnSpace : TEXCOORD5;
};

float4 main(PixelInputType input) : SV_TARGET
{
    bool bInsideSpotlight = true;
    float2 projectTexCoord;
    float depthValue;
    float lightDepthValue;
    float4 textureColor;
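    // NOTE: despite the name, this is used below as a plain brightness multiplier on the final color, not gamma correction.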
    float gamma = 7.f;

    /////////////////// NORMAL MAPPING //////////////////
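    // NOTE: SampleType has no explicit register, so the compiler auto-assigns it a free slot (likely s2 here).
    // If the app only binds samplers to s0/s1, this sample falls back to a default sampler state, which may not be the addressing mode you expect.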
    float4 bumpMap = shaderTextures[4].Sample(SampleType, input.tex);

    // Sample the shadow value from the shadow texture using the sampler at the projected texture coordinate location.
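    // (The perspective divide maps clip space to NDC [-1, +1]; the 0.5 scale/offset then maps that to [0, 1] texture space, with Y negated because texture V runs top-down.)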
    projectTexCoord.x =  input.vertex_ScrnSpace.x / input.vertex_ScrnSpace.w / 2.0f + 0.5f;
    projectTexCoord.y = -input.vertex_ScrnSpace.y / input.vertex_ScrnSpace.w / 2.0f + 0.5f;
    float shadowValue = shaderTextures[6].Sample(SampleTypeClamp, projectTexCoord).r;

    // Expand the range of the normal value from (0, +1) to (-1, +1).
    bumpMap = (bumpMap * 2.0f) - 1.0f;

    // Change the COORDINATE BASIS of the normal into the space represented by basis vectors tangent, binormal, and normal!
    float3 bumpNormal = normalize((bumpMap.x * input.tangent) + (bumpMap.y * input.binormal) + (bumpMap.z * input.normal));
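    // (Equivalent to mul(bumpMap.xyz, float3x3(input.tangent, input.binormal, input.normal)); this rotates the sampled normal out of tangent space, assuming T, B, and N form an orthonormal basis.)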

    //////////////// AMBIENT BASE COLOR ////////////////
    // Set the default output color to the ambient light value for all pixels.
    float4 lightColor = cb_ambientColor * saturate(dot(bumpNormal, input.normal) + .2);

    // Calculate the amount of light on this pixel.
    for(int i = 0; i < NUM_LIGHTS; ++i)
    {
        float lightIntensity = saturate(dot(bumpNormal, normalize(input.lightPos_LS[i])));
        if(lightIntensity > 0.0f)
        {
            lightColor += (cb_lights[i].diffuseColor * lightIntensity) * 0.3;
        }
    }

    // Saturate the final light color.
    lightColor = saturate(lightColor);

    // TEXTURE ANIMATION -  Sample pixel color from texture at this texture coordinate location.
    input.tex.x += textureTranslation;

    // BLENDING
    float4 color1 = shaderTextures[0].Sample(SampleTypeWrap, input.tex);
    float4 color2 = shaderTextures[1].Sample(SampleTypeWrap, input.tex);
    float4 alphaValue = shaderTextures[3].Sample(SampleTypeWrap, input.tex);
    //textureColor = saturate((alphaValue * color1) + ((1.0f - alphaValue) * color2));
    textureColor = color1;
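    // NOTE: color2 and alphaValue are sampled but currently unused while the blend line above is commented out.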

    // Combine the light and texture color.
    float4 finalColor = lightColor * textureColor * shadowValue * gamma;

    //if(lightColor.x == 0)
    //{
    //    finalColor =  cb_ambientColor * saturate(dot(bumpNormal, input.normal) + .2) * textureColor;
    //}

    return finalColor;
}

 

[Attached images: badTexture3.PNG, badTexture2.PNG, badTexture1.PNG]

Those look like flipped normals. It could also be a normal map that doesn't support mirroring well, like an object-space normal map.

It could be that you mirrored your mesh to save time but did not correct the normals before export. Try the texture on a sphere and see if the same thing happens.

Edit:

After reading your post, I'd say it's the mesh. Recalculate your normals in your 3D software.
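
If you want to sanity-check that in the engine rather than in 3DS Max, one option is to compare each triangle's geometric normal (computed from its winding order) against its stored vertex normals at load time. A rough sketch, using a hypothetical minimal vertex format (the real loader's types will differ):

// Hypothetical minimal vector type; your engine's math types will differ.
struct Vec3 { float x, y, z; };

static Vec3 Sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 Cross(Vec3 a, Vec3 b)
{
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Count triangles whose geometric normal (from winding order) points away
// from the average of their stored vertex normals. A large count suggests
// flipped normals or inconsistent winding introduced during export.
int CountSuspectTriangles(const Vec3* positions, const Vec3* normals,
                          const unsigned* indices, int triangleCount)
{
    int suspect = 0;
    for (int t = 0; t < triangleCount; ++t)
    {
        const unsigned i0 = indices[3 * t + 0];
        const unsigned i1 = indices[3 * t + 1];
        const unsigned i2 = indices[3 * t + 2];

        const Vec3 faceNormal = Cross(Sub(positions[i1], positions[i0]),
                                      Sub(positions[i2], positions[i0]));

        // Average the stored normals; no need to normalize for a sign test.
        const Vec3 avg = { normals[i0].x + normals[i1].x + normals[i2].x,
                           normals[i0].y + normals[i1].y + normals[i2].y,
                           normals[i0].z + normals[i1].z + normals[i2].z };

        if (Dot(faceNormal, avg) < 0.0f)
            ++suspect;
    }
    return suspect;
}

If that count comes back high for the broken models and near zero for the good ones, the mesh data is the culprit rather than the shader.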

Edited by Scouting Ninja
