EnlightenedOne

Graphical Artifact when on emulated software. [SOLVED]


Recommended Posts

I am using Pixel Shader 2 to draw a graphical effect. On my PC with a Radeon 5000 series it works perfectly, but on my laptop I get an unusual artifact, and I am not sure what to do to isolate it!

It's using the Intel(R) Graphics Media Accelerator:
Accelerator in Use: Mobile Intel(R) 965 Express Chipset Family
Video BIOS: 1471
Current Graphics Mode: 1280 by 800 True Color (60 Hz)

My DX debug output shows no real errors as such, just this:

 
D3D9 Helper: Enhanced D3DDebugging disabled; Application was not compiled with D3D_DEBUG_INFO
Direct3D9: (INFO) :======================= Hal SWVP device selected

Direct3D9: (INFO) :HalDevice Driver Style b

Direct3D9: :BackBufferCount not specified, considered default 1
Direct3D9: :DoneExclusiveMode
Direct3D9: (INFO) :Using FF to PS converter

Direct3D9: (INFO) :Using FF to VS converter in software vertex processing

'HLSLWater.exe': Loaded 'C:\Windows\System32\hid.dll', Cannot find or open the PDB file
'HLSLWater.exe': Loaded 'C:\Windows\System32\setupapi.dll', Cannot find or open the PDB file
'HLSLWater.exe': Loaded 'C:\Windows\System32\wintrust.dll', Cannot find or open the PDB file
'HLSLWater.exe': Loaded 'C:\Windows\System32\crypt32.dll', Cannot find or open the PDB file
'HLSLWater.exe': Loaded 'C:\Windows\System32\msasn1.dll', Cannot find or open the PDB file
'HLSLWater.exe': Loaded 'C:\Windows\System32\imagehlp.dll', Cannot find or open the PDB file
'HLSLWater.exe': Loaded 'C:\Windows\System32\D3DCompiler_43.dll', Cannot find or open the PDB file
'HLSLWater.exe': Unloaded 'C:\Windows\System32\D3DCompiler_43.dll'
'HLSLWater.exe': Loaded 'C:\Windows\System32\clbcatq.dll', Cannot find or open the PDB file




Anyway, that aside, here is an image of what happens when the program is acting normally (when you're at a distance from any vertices). The second image shows the group of artifacts I cannot figure out, which appears when you move closer to the objects. You get parts of the surface being drawn over in miniature, and infinite projection of the texture against a black surface. The closer you are to a vertex the more unusual the artifacts; at a distance everything behaves as expected.

Here is normal behaviour at a distance.
Picture 1

Here is behaviour you won't see in any commercial product :(
Picture 2

If you want any more details please specify what you want to know; thanks for any helpful advice :) To clarify, this machine can run DX9 and Pixel Shader 3 code, but the error remains. I am going to try adding the D3D_DEBUG_INFO define it wants and enabling shader debugging in the outputs from the DirectX Control Panel, although I cannot see that being the problem. Age of Empires 3 and many other games work without a hitch, so I'm worried I have a cross-platform instability.

[Edited by - EnlightenedOne on August 24, 2010 10:16:09 AM]

How do I compile with D3D_DEBUG_INFO?
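For reference, a minimal sketch of the usual approach: the define has to appear before d3d9.h is included (or go in the project's preprocessor settings for debug builds), and it only has an effect with the debug runtime selected in the DirectX Control Panel.

#if defined(_DEBUG)
#define D3D_DEBUG_INFO   // must be defined before d3d9.h is included
#endif
#include <d3d9.h>
#include <d3dx9.h>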

I am trying to force the program into software mode to see if the error is down to the onboard hardware, but it seems the error stems deeper than that.

Trying to run in software mode causes any validation of GetDeviceCaps or CreateDevice to fail; using D3DDEVTYPE_SW instead of D3DDEVTYPE_HAL causes the issue. This worries me: anything that can run a program should be able to run the program in SW mode.

What can I do to test further? The hr value returned is always -858993460 (0xCCCCCCCC) and the "DX Error Lookup" tool doesn't reveal any information. All I know is that D3DERR_NOTAVAILABLE == hr after I do:

if( FAILED( hr = D3D->CheckDeviceFormat( D3DADAPTER_DEFAULT, D3DDEVTYPE_SW,
d3ddm.Format, D3DUSAGE_DEPTHSTENCIL,
D3DRTYPE_SURFACE, D3DFMT_D16 ) ) )

Could this be down to the automatic response after the HAL device is created, shown in the debug output as "Direct3D9: :BackBufferCount not specified, considered default 1"? Maybe in software mode that doesn't get done automatically, although it seems improbable.
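One way to rule that out would be to fill in the present parameters explicitly rather than relying on the runtime defaults; a minimal sketch, with the field values assumed from the formats mentioned in this thread (d3ddm is the current display mode used elsewhere in the code):

D3DPRESENT_PARAMETERS pp;
ZeroMemory( &pp, sizeof( pp ) );
pp.BackBufferCount        = 1;                      // the default the debug output mentions
pp.BackBufferFormat       = d3ddm.Format;           // match the current display mode
pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;
pp.Windowed               = TRUE;
pp.EnableAutoDepthStencil = TRUE;
pp.AutoDepthStencilFormat = D3DFMT_D16;             // the format the CheckDeviceFormat call tests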

Any comments or advice on what to test are welcome.

It seems that while the device is set to hardware mode, the vertex processing is software based anyway; I'm not sure if that would have a massive impact. Can software vertex processing not handle shaders at all?

// Use hardware vertex processing if supported, otherwise default to software
if( d3dCaps.VertexProcessingCaps != 0 )
dwBehaviorFlags |= D3DCREATE_HARDWARE_VERTEXPROCESSING;
else
dwBehaviorFlags |= D3DCREATE_SOFTWARE_VERTEXPROCESSING;

Still not sure why I can't have a software based device.

From the SDK:
Quote:
D3DDEVTYPE_SW
A pluggable software device that has been registered with IDirect3D9::RegisterSoftwareDevice.
I believe you want D3DDEVTYPE_REF instead.
The Intel 965 does support shaders, even with D3DCREATE_SOFTWARE_VERTEXPROCESSING, and you might want to check (d3dCaps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) instead of d3dCaps.VertexProcessingCaps in order to set this.
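A sketch of that suggested check, reusing the dwBehaviorFlags variable from the earlier post:

// D3DDEVCAPS_HWTRANSFORMANDLIGHT is the capability bit that indicates
// hardware vertex processing is available
if( d3dCaps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT )
    dwBehaviorFlags |= D3DCREATE_HARDWARE_VERTEXPROCESSING;
else
    dwBehaviorFlags |= D3DCREATE_SOFTWARE_VERTEXPROCESSING;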

To my eyes what's happening to you looks uncannily like you've got an unsigned int * for your index buffer but D3DFMT_INDEX16, or something similar.
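A hypothetical illustration of that kind of mismatch (ib is an assumed IDirect3DIndexBuffer9 pointer, not code from this thread):

void* pData = NULL;
if( SUCCEEDED( ib->Lock( 0, 0, &pData, 0 ) ) )
{
    // Wrong if the buffer was created as D3DFMT_INDEX16: each 32-bit write
    // spans two 16-bit indices, scrambling nearby triangles.
    unsigned int* pIndices = (unsigned int*)pData;   // should be unsigned short*
    ib->Unlock();
}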

You appear to have hit the nail on the head. I have tried various D3DFMT types but not found one that works yet; they either fail completely or work fine on my PC.

I am confused by an unsigned int being used for the index buffer.

In the code to load unanimated meshes I do define an index buffer, but I am not sure how to compare the mesh's index buffer format with the one used by the D3D format type.


//Zero the mesh pointer and the buffer that will receive adjacency info
Mesh = 0;
ID3DXBuffer* MeshBuffer = 0;

//Load the mesh, then optimize it in place using the adjacency data
D3DXLoadMeshFromX( MeshFilename, D3DXMESH_MANAGED, D3DDevice, &MeshBuffer, 0, 0, 0, &Mesh);
Mesh->OptimizeInplace( D3DXMESHOPT_ATTRSORT | D3DXMESHOPT_COMPACT | D3DXMESHOPT_VERTEXCACHE, (DWORD*)MeshBuffer->GetBufferPointer(), 0, 0, 0);

//Release and zero the buffer
MeshBuffer->Release(); MeshBuffer = NULL;



The index buffer is in the index buffer format, and I'm not defining my own vertex element type to draw to the screen, just drawing meshes, so I'm not sure what to be looking at really.


if( FAILED( hr = D3D->CheckDeviceFormat( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
d3ddm.Format, D3DUSAGE_DEPTHSTENCIL,
D3DRTYPE_SURFACE, D3DFMT_D16 ) ) )



If I change the D3DFMT around, the degree of distortion changes when it works, so it must be contributing to the problem if it is not the problem itself. I would like an explanation of how to determine what format I should be using here. Thank you for narrowing it down to that line.

I am drawing to the screen via a render surface.


D3DDevice->CreateTexture(512, 512, 1, D3DUSAGE_RENDERTARGET, D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &ProjectedTexture, NULL);
ProjectedTexture->GetSurfaceLevel(0, &RenderSurface);



I tried using the same format as the render target surface, but that didn't help. Windows would trigger a breakpoint when I tried D3DFMT_A8R8G8B8, which indicates a bug in the program or one of the DLLs it has loaded. Can you clarify what I need to be looking for?

Your render target format should be the same as your back buffer format, which is probably D3DFMT_X8R8G8B8.

You can use ID3DXBaseMesh::GetIndexBuffer, then IDirect3DIndexBuffer9::GetDesc, to determine the format of the index buffer in your mesh. Reading the SDK documentation again, the conditions seem to be that (1) if you're using D3DCREATE_SOFTWARE_VERTEXPROCESSING (or presumably if your index buffer was created with D3DUSAGE_SOFTWAREPROCESSING and you have a mixed device) you can always use 32-bit indices, whereas (2) if you're using D3DCREATE_HARDWARE_VERTEXPROCESSING you can create a 32-bit index buffer but you can't render from it unless D3DCAPS9::MaxVertexIndex is greater than 0x0000ffff. Therefore, if my index buffer format theory were correct, you would crash rather than get artefacts.

It may be worthwhile nonetheless to dump the contents of your index buffer to disk on both your laptop and your PC and compare the two to see if there are differences. At least you'll be able to rule it out as a possible cause.
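A minimal sketch of such a dump, assuming pIB and desc come from the GetIndexBuffer and GetDesc calls described above:

#include <cstdio>

void* pData = NULL;
if( SUCCEEDED( pIB->Lock( 0, 0, &pData, D3DLOCK_READONLY ) ) )
{
    // Write the raw index data out so the laptop and PC dumps can be
    // binary-compared.
    FILE* f = fopen( "indexdump.bin", "wb" );
    if( f )
    {
        fwrite( pData, 1, desc.Size, f );
        fclose( f );
    }
    pIB->Unlock();
}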

With the back buffer format set to D3DFMT_X8R8G8B8 I die on the CreateDevice call on both my PC and my laptop.

The back buffer format was set to D3DFMT_16 originally on both; with everything set to that format I get the artifact. Both outputs for the index buffer description object are D3DFMT_INDEX16, and I can't get any other formats to run. Which ones should I be trying out to remove the artifact?

Here is the code I used to test the description of the index buffer.


//Zero Mesh and create buffer
Mesh = 0;
ID3DXBuffer* MeshBuffer = 0;

IDirect3DIndexBuffer9* MeshIndexBuffer;
D3DINDEXBUFFER_DESC MeshIndexBufferDesc;

//Load and optimize the mesh
D3DXLoadMeshFromX( MeshFilename, D3DXMESH_MANAGED, D3DDevice, &MeshBuffer, 0, 0, 0, &Mesh);
Mesh->OptimizeInplace( D3DXMESHOPT_ATTRSORT | D3DXMESHOPT_COMPACT | D3DXMESHOPT_VERTEXCACHE, (DWORD*)MeshBuffer->GetBufferPointer(), 0, 0, 0);

Mesh->GetIndexBuffer(&MeshIndexBuffer);

MeshIndexBuffer->GetDesc(&MeshIndexBufferDesc);

//Release the index buffer reference once we have its description
MeshIndexBuffer->Release(); MeshIndexBuffer = NULL;

//Release and zero the buffer
MeshBuffer->Release(); MeshBuffer = NULL;



There's an error I hadn't got before, because I hadn't modified everything to be the same format correctly. If I don't use D3DFMT_16 and use D3DFMT_X8R8G8B8 instead I get:

Direct3D9: (ERROR) :Format is not approved list for Z buffer formats. CheckDeviceFormat fails.

D3DFMT_R5G6B5 also fails. I need to google how to find acceptable back buffers; I'm a bit depressed it's not a standardised thing!

How do I get an approved list of formats? This must be a step closer to isolating the problem.
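The usual approach is to probe each candidate format rather than assume a fixed list; a minimal sketch, reusing the D3D and d3ddm variables from the earlier snippets:

// Probe common depth/stencil formats against the current display mode and
// keep the first one the device approves.
const D3DFORMAT candidates[] = { D3DFMT_D24S8, D3DFMT_D24X8, D3DFMT_D16 };
D3DFORMAT depthFormat = D3DFMT_UNKNOWN;
for( int i = 0; i < 3; ++i )
{
    if( SUCCEEDED( D3D->CheckDeviceFormat( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                           d3ddm.Format, D3DUSAGE_DEPTHSTENCIL,
                                           D3DRTYPE_SURFACE, candidates[i] ) ) &&
        SUCCEEDED( D3D->CheckDepthStencilMatch( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                d3ddm.Format, d3ddm.Format,
                                                candidates[i] ) ) )
    {
        depthFormat = candidates[i];
        break;
    }
}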

Via D3DCAPS

On the PC and the laptop the max vertex index is identical.

Max VertexIndex = 16777215
That's well over 0x0000ffff.

I can't see anything in there on the back buffer formats available. MSDN suggests that there are standardised formats that any machine should be able to process, including ones neither my PC nor my laptop will run, so I am perplexed. Anything I should be testing with?

I think you're now mixing up your back buffer format with your depth buffer format. Maybe back up a little and re-read the SDK docs? And maybe try some of the SDK samples on both machines, compare the output, and check the code.
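For the back buffer side specifically, a minimal sketch of asking the runtime which formats it will accept rather than guessing (again assuming the D3D and d3ddm variables from the earlier snippets):

// Probe windowed back buffer formats against the current display mode.
const D3DFORMAT bbCandidates[] = { D3DFMT_A8R8G8B8, D3DFMT_X8R8G8B8, D3DFMT_R5G6B5 };
for( int i = 0; i < 3; ++i )
{
    if( SUCCEEDED( D3D->CheckDeviceType( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                         d3ddm.Format, bbCandidates[i], TRUE ) ) )
    {
        // bbCandidates[i] is an acceptable windowed back buffer format here
    }
}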
