

Function returns S_OK but debug spew says different


neneboricua19
I'm using DX9.0b and I have a small .x file that I'm trying to load. It's just a red sphere that I want to place at the position of each light source in my scene so I can tell whether the lighting looks right. I load it with D3DXLoadMeshFromX. The function returns S_OK, but the debug spew (with the debug output level at maximum) says this:

D3DXOF:426596: syntax error near "0.000000"
D3DXOF:XStreamRead::GetNextObject: Error while parsing stream
D3DXOF:GetNextObject: Parse error.

The mesh loads and displays just fine in MView. I believe my mesh-loading code is okay, because it works fine when I load and render another mesh that has over 75k vertices and uses many different textures and materials.

After loading the model, I locked the buffers to see what data was actually loaded. The data is all wrong, especially the index buffer. Some of the values in the index buffer are numbers like 4294967295, which is the maximum value of a DWORD (0xFFFFFFFF). Given that the mesh has only 89 vertices, I think it's safe to say that values like this are incorrect. I went ahead and tried to render the small red sphere anyway, but naturally DrawIndexedPrimitive fails because some of the indices are garbage.

I've tried reinstalling the SDK, and that's no help. But I have noticed that the mesh renders just fine if I use the retail runtime, only not the debug runtime. The values in the buffers are still wrong, but somehow it works under the retail runtime.

Does anyone have any ideas on what could be causing this weirdness? Thanks a lot,
neneboricua
