
Framebuffer Artifacts



#1 laztrezort   Members   -  Reputation: 972

Posted 30 August 2012 - 08:27 PM

I am getting odd artifacts when displaying scaled-up textured quads (see the attached screenshot). The vertical bars are correct; the small horizontal "ticks" are the artifacts.

What you are looking at is a single 8x8 textured quad drawn multiple times on a non-overlapping grid; the original texture source is a 2-pixel-wide vertical line. The artifacts only appear when scaling up (resizing the render target), and each time I resize, the ticks appear and disappear more or less at random. Perhaps significantly, the artifacts always seem to appear where the quads adjoin each other.

Multisampling is off, and I am not using depth or alpha blending. The vertices in the Pre-VS mesh (in PIX) look correct.

The strangeness starts when I debug one of the culprit pixels with PIX: it reports the pixel shader output and framebuffer as 1,1,1,1, which is what I am seeing on the screen. But when I debug the pixel and step through the actual shader, the output shows 0,0,0,1 (which is what I would want to see on the screen). I do not understand what could happen between the pixel shader output and the framebuffer that would change the color of the pixel.

To add to the strangeness, I cannot reproduce the issue on a different computer. I've updated the drivers on my development machine to the latest version, with no change. My best guess at this point is that I am doing something slightly wrong that is compounded by a particular quirk of my GPU (Nvidia, by the way).

Here is some code (using SharpDX, Direct3D11):

Swapchain and Rasterizer options:
swapDesc = new SwapChainDescription()
			{
				ModeDescription = new ModeDescription(ParentForm.ClientSize.Width,
					ParentForm.ClientSize.Height,
					new Rational(60, 1), Format.R8G8B8A8_UNorm),
				SampleDescription = MultiSampling,
				Usage = Usage.RenderTargetOutput,
				BufferCount = 1,
				OutputHandle = ParentForm.Handle,
				IsWindowed = true,
				SwapEffect = SwapEffect.Discard,
				Flags = SwapChainFlags.None
			};

var rasterDesc = new RasterizerStateDescription()
			{
				IsAntialiasedLineEnabled = false,
				CullMode = CullMode.None,
				DepthBias = 0,
				DepthBiasClamp = 0.0f,
				IsDepthClipEnabled = false,
				FillMode = FillMode.Solid,
				IsFrontCounterClockwise = false,
				IsMultisampleEnabled = enableMultiSample,
				IsScissorEnabled = false,
				SlopeScaledDepthBias = 0.0f
			};

This gets called when the window is resized; width and height come directly from Form.ClientSize:

renderTargetView.Dispose();
swapChain.ResizeBuffers(1, width,
				height,
				Format.R8G8B8A8_UNorm, SwapChainFlags.None);
using (Texture2D backBuffer = swapChain.GetBackBuffer<Texture2D>(0))
{
				renderTargetView = ToDispose(new RenderTargetView(device, backBuffer));
}
device.ImmediateContext.OutputMerger.SetTargets(renderTargetView);
device.ImmediateContext.Rasterizer.SetViewport(0, 0, width, height, 0, 1);

The sampler state used by the pixel shader (whose settings, by the way, I have tried randomly tweaking out of desperation):
SamplerStateDescription samplerDesc = new SamplerStateDescription()
			{
				AddressU = TextureAddressMode.Border,
				AddressV = TextureAddressMode.Border,
				AddressW = TextureAddressMode.Border,
				MipLodBias = 0,
				MaximumAnisotropy = 1,
				Filter = Filter.MinMagMipPoint,
				ComparisonFunction = Comparison.NotEqual,
				BorderColor = new Color4(0,0,0,0),
				MinimumLod = 0,
				MaximumLod = float.MaxValue
			};

Finally, here is the pixel shader. It is maybe a bit unusual, but basically it just uses the alpha from a texture to lerp between two colors:

Texture2D shaderTexture;
SamplerState sampleType;
struct PixelIn
{
	float4 PosH : SV_POSITION;
	float3 BColor : COLOR0;
	float3 FColor : COLOR1;
	float2 Tex : TEXCOORD;
};
float4 PS(PixelIn pin) : SV_TARGET
{
	float4 textureColor;
	float3 finalColor;
	textureColor = shaderTexture.Sample(sampleType, pin.Tex);
  
	finalColor = lerp(pin.BColor, pin.FColor, textureColor.a);
	return float4(finalColor, 1.0f);
}

Does anyone have any ideas, even a long shot, or has anyone seen anything like this before?

Attached Thumbnails: Capture.PNG



#2 ankhd   Members   -  Reputation: 1359

Posted 30 August 2012 - 08:39 PM

What are the two images? One may have some odd pixel colour on the edge, like stray gray pixels at the bottom. Long shot.

#3 laztrezort   Members   -  Reputation: 972

Posted 30 August 2012 - 09:04 PM

ankhd, on 30 August 2012, said:
What are the two images? One may have some odd pixel colour on the edge, like stray gray pixels at the bottom. Long shot.


Thanks, that triggered something in my brain about floating-point inaccuracies... The texture is part of an atlas, and it certainly could be that a piece of the neighboring texture is bleeding through.
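
If that is the cause, one common mitigation is to inset each sub-texture's UV rectangle by half a texel when building the quad vertices, so that point sampling at a quad edge never lands exactly on the border shared with the neighboring tile. A rough sketch of the idea (the helper and parameter names are just placeholders, not my actual code):

// Hypothetical helper: compute the UV rectangle for one atlas tile,
// inset by half a texel on each side so point sampling at quad edges
// cannot pick up texels from the neighboring tile.
static void GetInsetTileUVs(int tileX, int tileY, int tileSize,
			int atlasWidth, int atlasHeight,
			out float u0, out float v0, out float u1, out float v1)
{
	float halfTexelU = 0.5f / atlasWidth;
	float halfTexelV = 0.5f / atlasHeight;

	u0 = (tileX * tileSize) / (float)atlasWidth + halfTexelU;
	v0 = (tileY * tileSize) / (float)atlasHeight + halfTexelV;
	u1 = ((tileX + 1) * tileSize) / (float)atlasWidth - halfTexelU;
	v1 = ((tileY + 1) * tileSize) / (float)atlasHeight - halfTexelV;
}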

I suppose I can theorize why different GPU hardware would produce different results in this case (floating-point weirdness, etc.), but I still don't understand how the pixel shader/sampler can output the correct color while the framebuffer ends up wrong.

I suppose the obvious solution (if this is the case) would be to post-process the atlas so each sub-texture has a 1-pixel border around it? Are there other viable options, like somehow forcing the PS to clamp to the sub-texture boundary?
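
If clamping in the shader turns out to be the way to go, something along these lines might work. This is only a rough sketch, and the TexMin/TexMax inputs are hypothetical additions to the vertex format, not part of my current code:

// Rough sketch: clamp the interpolated UVs to this quad's sub-texture
// rectangle before sampling, so the sampler can never read from the
// neighboring atlas tile. TexMin/TexMax are hypothetical per-vertex
// values, ideally already inset by half a texel from the tile edges.
Texture2D shaderTexture;
SamplerState sampleType;

struct PixelIn
{
	float4 PosH   : SV_POSITION;
	float3 BColor : COLOR0;
	float3 FColor : COLOR1;
	float2 Tex    : TEXCOORD0;
	float2 TexMin : TEXCOORD1; // top-left UV of this quad's sub-texture
	float2 TexMax : TEXCOORD2; // bottom-right UV of this quad's sub-texture
};

float4 PS(PixelIn pin) : SV_TARGET
{
	float2 uv = clamp(pin.Tex, pin.TexMin, pin.TexMax);
	float4 textureColor = shaderTexture.Sample(sampleType, uv);
	float3 finalColor = lerp(pin.BColor, pin.FColor, textureColor.a);
	return float4(finalColor, 1.0f);
}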



