Migi0027

DX11 - 1D Textures - Driver Crash


Recommended Posts

Hi guys!

 

Just a small (I hope) question: why is this crashing my driver? I'm basically sending a 1D texture to my shader; here's how I do it:

 

Creating the texture.

ilInit();
 
// Load image from DevIL
ILuint idImage;
ilGenImages( 1, &idImage );
ilBindImage( idImage );
ilLoadImage( filePath.c_str() );
_ASSERT ( IL_NO_ERROR == ilGetError() );
 
// Fetch dimensions of image
int width = ilGetInteger( IL_IMAGE_WIDTH );
int height = ilGetInteger( IL_IMAGE_HEIGHT );
 
// Load the data
ilConvertImage( IL_RGBA, IL_UNSIGNED_BYTE );
unsigned char * pData = ilGetData();
 
// Build the texture header descriptor
D3D11_TEXTURE1D_DESC descTex;
descTex.Width = width;
descTex.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
descTex.Usage = D3D11_USAGE_DEFAULT;
descTex.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
descTex.CPUAccessFlags = 0;
descTex.MipLevels = 1;
descTex.ArraySize = 1;
descTex.MiscFlags = D3D11_RESOURCE_MISC_GENERATE_MIPS;
 
// Resource data descriptor
D3D11_SUBRESOURCE_DATA data;
memset( &data, 0, sizeof(D3D11_SUBRESOURCE_DATA));
data.pSysMem = pData;
data.SysMemPitch = 4 * width;
 
// Create the 1D texture from data
ID3D11Texture1D * pTexture = NULL;
HV( pDevice->CreateTexture1D( &descTex, &data, &pTexture ));
 
// Create resource view descriptor
D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc;
srvDesc.Format = descTex.Format;
srvDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE1D;

srvDesc.Texture1D.MostDetailedMip = 0;
srvDesc.Texture1D.MipLevels = D3D11_RESOURCE_MISC_GENERATE_MIPS;

// Create the shader resource view
HV( pDevice->CreateShaderResourceView( pTexture, &srvDesc, &pSRV ));
 
// Delete the IL image now that the texture owns a copy of the data
ilDeleteImages( 1, &idImage );
idImage = 0;

Sending it:

devcon->PSSetShaderResources(1, 1, &LineColors.pTexture);

Receiving it:

Texture1D t_lensColors : register(t1);

Now what on earth have I done wrong? Again...

-MIGI0027


Does it crash when you create it, or when you try to use it? Either way, I don't have psychic debugging powers, so I can't tell you exactly why the driver is crashing, but you should try changing or simplifying things until you figure out what's triggering the crash. For starters, you might want to try just creating it as a 2D texture with a height of 1.
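For reference, a minimal sketch of that fallback, reusing pDevice, pData, width, and the HV macro from the code above (everything else is standard D3D11):

// Same 1-texel-high strip, created as a 2D texture while debugging
D3D11_TEXTURE2D_DESC desc2d;
memset( &desc2d, 0, sizeof(desc2d) );
desc2d.Width            = width;
desc2d.Height           = 1;
desc2d.MipLevels        = 1;   // single mip keeps the repro simple
desc2d.ArraySize        = 1;
desc2d.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
desc2d.SampleDesc.Count = 1;
desc2d.Usage            = D3D11_USAGE_DEFAULT;
desc2d.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

D3D11_SUBRESOURCE_DATA init;
memset( &init, 0, sizeof(init) );
init.pSysMem     = pData;
init.SysMemPitch = 4 * width;   // one row of RGBA8 texels

ID3D11Texture2D * pTexture2D = NULL;
HV( pDevice->CreateTexture2D( &desc2d, &init, &pTexture2D ));

The shader side would then declare a Texture2D instead of a Texture1D and sample with a fixed y coordinate (e.g. 0.5).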

 

EDIT: actually, one thing that looks busted in your code is that you're passing D3D11_RESOURCE_MISC_GENERATE_MIPS as the "MipLevels" field of srvDesc. That should either be set to the actual number of mips, or to -1 to indicate that the view should use all mip levels. You can also just pass NULL to CreateShaderResourceView instead of filling out a D3D11_SHADER_RESOURCE_VIEW_DESC structure if you want the defaults: same format, same dimensions, and all mip levels.
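A sketch of those two fixes, reusing pDevice, pTexture, srvDesc, pSRV, and the HV macro from the original post:

// Option 1: give the view a real mip count; UINT(-1) means
// "all levels starting at MostDetailedMip"
srvDesc.Texture1D.MostDetailedMip = 0;
srvDesc.Texture1D.MipLevels = UINT(-1);
HV( pDevice->CreateShaderResourceView( pTexture, &srvDesc, &pSRV ));

// Option 2: pass NULL and let D3D derive the format, the dimension
// and the mip range from the texture itself
HV( pDevice->CreateShaderResourceView( pTexture, NULL, &pSRV ));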

Edited by MJP


If you request the full mip chain (MipLevels = 0 in the texture description) and pass initial data, you are also expected to give an array of D3D11_SUBRESOURCE_DATA structures, one for each mip level. Otherwise, you may (and often will) run into an access violation.
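For illustration, a sketch of that case, reusing descTex, pDevice, pTexture, width, and HV from the original post (assumes <vector>; the per-mip mipPixels pointers are hypothetical placeholders, only the structure matters):

// Requesting the full chain (MipLevels = 0) with initial data means
// supplying one D3D11_SUBRESOURCE_DATA per mip level
UINT numMips = 0;
for (UINT w = (UINT)width; w > 0; w >>= 1)
    ++numMips;

std::vector<D3D11_SUBRESOURCE_DATA> initData( numMips );
for (UINT mip = 0; mip < numMips; ++mip)
{
    UINT mipWidth = (UINT)width >> mip;
    if (mipWidth == 0) mipWidth = 1;
    initData[mip].pSysMem = mipPixels[mip];   // hypothetical per-mip data
    initData[mip].SysMemPitch = 4 * mipWidth;
    initData[mip].SysMemSlicePitch = 0;
}

descTex.MipLevels = 0;  // 0 = full mip chain
HV( pDevice->CreateTexture1D( &descTex, initData.data(), &pTexture ));

Alternatively, pass NULL as the initial data, copy in just the top level with UpdateSubresource, and let GenerateMips fill in the rest; that is what the D3D11_RESOURCE_MISC_GENERATE_MIPS flag is for.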


Alright, I boiled the issue down, and it turned out to be a separate issue (sorry about that).

 

It seems to happen when sampling a texture in a loop (I've done it before, so I don't know why it should cause problems):

float4 textureDistorted(
	float2 texcoord, 
	float2 direction,
	float3 distortion 
) {
	return float4(
		t_dff.Sample(ss, texcoord + direction * distortion.r).r,
		t_dff.Sample(ss, texcoord + direction * distortion.g).g,
		t_dff.Sample(ss, texcoord + direction * distortion.b).b,
		1.0
	);
}
.......
for (int i = 0; i < uSamples; ++i) { // P.S.: uSamples is a constant in the shader, so the loop should unroll
	float2 offset = frac(texcoord + ghostVec * float(i));
		
	float weight = length(float2(0.5, 0.5) - offset) / length(float2(0.5, 0.5));
	weight = pow(1.0 - weight, 10.0);
	
	result += textureDistorted(
		offset,
		normalize(ghostVec),
		distortion
	) * weight; // <- if this call (textureDistorted) is removed, my driver doesn't crash, though I can use it fine outside a loop...
}

-MIGI0027

Heh, the topic kind of changed here; just tell me if you see something terribly wrong.

Edited by Migi0027
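For what it's worth, a sketch of two workarounds that often help when Sample is used inside a loop (not a confirmed fix for this particular crash; textureDistortedLod is a hypothetical name, and the t_dff/ss declarations from the post above are assumed):

// 1) Force the unroll explicitly, so no gradient-based Sample ends up
//    inside real flow control:
[unroll]
for (int i = 0; i < uSamples; ++i)
{
	// ...same loop body as above...
}

// 2) Or sample an explicit mip with SampleLevel, which is legal inside
//    any flow control because it needs no derivatives:
float4 textureDistortedLod(float2 texcoord, float2 direction, float3 distortion)
{
	return float4(
		t_dff.SampleLevel(ss, texcoord + direction * distortion.r, 0).r,
		t_dff.SampleLevel(ss, texcoord + direction * distortion.g, 0).g,
		t_dff.SampleLevel(ss, texcoord + direction * distortion.b, 0).b,
		1.0
	);
}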
