Pixel shaders not correctly sampling subresources

I am trying to use Direct2D effects with custom shaders to create a depth-based pixel-shifting shader.


So the pixel shader output is something like this: return InputTexture1.Sample(InputSampler, newcoord);


As always, DirectX slices the texture into power-of-two-sized subresources for GPU processing. At my current window size, it sliced the 1920x1080 input texture into four 1024x512 textures.
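For illustration only, the slicing described above can be modeled like this (the 1024x512 tile size comes from the numbers quoted; the row-major layout and this helper are my assumptions, not anything Direct2D documents):

```cpp
#include <cassert>

// Hypothetical model of the power-of-two slicing described above:
// a large input split into fixed-size tiles. Tile size and layout
// are assumptions for illustration only.
struct TileCoord {
    int tileX, tileY;      // which tile the pixel falls in
    float localU, localV;  // UV within that tile, in [0,1)
};

TileCoord mapToTile(int px, int py, int tileW, int tileH) {
    TileCoord t;
    t.tileX = px / tileW;
    t.tileY = py / tileH;
    t.localU = float(px % tileW) / float(tileW);
    t.localV = float(py % tileH) / float(tileH);
    return t;
}
```

The point this makes concrete: a pixel at (1500, 600) lives in tile (1, 1), so a sample offset that crosses back over x = 1024 needs data from tile (0, 1) — a neighbouring texture the shader was never given.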


Here is a sample output of the shader. The shader tries to shift the center part of the texture to the right; to do that, it samples the texture with an offset to the left.


But when texture slice 2 passes through the pixel shader, it does not sample the neighbouring subresource.


Why is this so? How do I provide the correct texture for it to sample?










Full code:


Texture2D InputTexture1 : register(t0);
Texture2D InputTexture2 : register(t1);

SamplerState InputSampler : register(s0);

cbuffer constants : register(b0)
{
	float zeroplane   : packoffset(c0.x);
	float depthfactor : packoffset(c0.y);
	float dispersion  : packoffset(c0.z);
	float currview    : packoffset(c0.w);
	float maxview     : packoffset(c1.x);
	float height      : packoffset(c1.y);
	float width       : packoffset(c1.z);
	float dpi         : packoffset(c1.w);
};

float4 main(
	float4 pos      : SV_POSITION,
	float4 posScene : SCENE_POSITION,
	float4 uv0      : TEXCOORD0,
	float4 uv1      : TEXCOORD1
	) : SV_Target
{
	float db_depthfactor = 100.0f;
	float db_zeroplane   = 128.0f;
	float db_dispersion  = dispersion;

	float ww, hh, levels;
	// Mip level 0 gives the full-size dimensions of this input.
	InputTexture1.GetDimensions(0, ww, hh, levels);

	// Read depth, scale to [0,256), and shift left or right of the zero plane.
	float4 depth = InputTexture2.Sample(InputSampler, uv0.xy);
	float shift = depth.r * 256.0;
	shift = ((shift - db_zeroplane) / 256.0) * -1.0 * db_dispersion;
	float2 newcoord = uv0.xy + float2(shift, 0);

	float4 text = InputTexture1.Sample(InputSampler, newcoord);
	return text;
}
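One direction that may help, sketched under assumptions since the effect-authoring code isn't shown: a custom D2D effect's draw transform tells Direct2D how much input it needs via ID2D1DrawTransform::MapOutputRectToInputRects. If the transform pads the requested input rect by the maximum horizontal shift, the intermediate texture handed to the pixel shader should already contain the neighbouring pixels. The helper below is a portable stand-in for that padding step; RectL and expandInputRect are illustrative names, not D2D API.

```cpp
#include <cassert>

// Stand-in for D2D1_RECT_L, used here so the sketch compiles anywhere.
struct RectL { long left, top, right, bottom; };

// Pad the output rect by the largest horizontal shift the shader can
// apply, so the input texture covers every pixel it might sample.
// This is the kind of expansion MapOutputRectToInputRects would report.
RectL expandInputRect(const RectL& outputRect, long maxShift) {
    RectL in = outputRect;
    in.left  -= maxShift;  // shader may sample up to maxShift pixels left
    in.right += maxShift;  // ...or up to maxShift pixels right
    return in;
}
```

With the input rect expanded this way, a sample at an offset that would have crossed a tile boundary stays inside the texture the shader was actually given.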


You may also consider not using minification filtering; you will then be able to create arbitrarily sized textures without mipmaps. You can keep magnification filtering of any type; only for minification use no filter. (This will create a noise artifact, visible when you move while a minified texture is being sampled.) But if you do not minify your textures in animated ways, you practically do not need minification filtering.
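In plain D3D11 terms, that suggestion would look roughly like this (a sketch assuming you create the sampler yourself; in a D2D custom effect the filter is configured through the effect's input description instead):

```cpp
// Point (unfiltered) minification, linear magnification, no mip filtering.
// Address modes are an assumption here; clamp avoids wrap-around bleeding.
D3D11_SAMPLER_DESC desc = {};
desc.Filter   = D3D11_FILTER_MIN_POINT_MAG_LINEAR_MIP_POINT;
desc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP;
desc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP;
desc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP;
```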


Is it possible to get these border coordinates? Say, from the semantics? I don't see any of the semantics mentioning borders, but how are the slices mapped correctly onto a surface if the renderer does not know the borders?

