Bastich-666

GMA 3150 rendering problems :-(


Hi everyone.

I need to get my engine to work on low-spec hardware.
As such, I have got a Toshiba NB500 netbook from work to use as a target system.
It uses an Intel GMA 3150, so I was expecting problems, but nothing as strange as what I am getting.
After first downgrading my shaders to use only VS 2.0 and PS 2.0, I fired up my test rig.
All seemed fine, which was a surprise.
My GUI console and everything else 2D appear as they should.
However, all 3D models etc. are invisible.
It's not that it's a black poly on a black background or anything like that.
DrawIndexedPrimitive reports no error codes; it just doesn't render anything.
What makes it stranger is that if I change it to use DrawPrimitive, it does render.
That's not a proper fix, mind, as I would need to expand my faces/vertices when loading the models in (see the sketch below).
Is that the only solution I have, or can anyone think of a reason for what I am getting?

Please help.
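For illustration, here is a minimal sketch (not the poster's code; the Vertex layout and function names are assumptions) of the DrawPrimitive fallback described above, i.e. expanding an indexed triangle list into a flat, non-indexed vertex array at load time:

#include <cstdint>
#include <vector>

struct Vertex { float x, y, z, nx, ny, nz, u, v; };   // illustrative layout only

// Duplicate shared vertices so the mesh can be drawn without an index buffer.
std::vector<Vertex> ExpandIndexedMesh(const std::vector<Vertex>&   vertices,
                                      const std::vector<uint16_t>& indices)
{
    std::vector<Vertex> expanded;
    expanded.reserve(indices.size());              // three vertices per triangle
    for (uint16_t index : indices)
        expanded.push_back(vertices[index]);       // copy the referenced vertex
    return expanded;
}

// The expanded buffer would then be drawn with something like:
// device->DrawPrimitive(D3DPT_TRIANGLELIST, 0, expanded.size() / 3);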

Is your index data 16 or 32 bit? If 32 bit, have you tried 16 bit?

Also, in the DrawIndexedPrimitive call, are MinVertexIndex and NumVertices set correctly? MinVertexIndex should be the lowest vertex index referenced by the index data, and NumVertices should be (highest vertex index - lowest vertex index + 1). Alternatively, these can be set to 0 and the total number of vertices in the buffer; the tighter values are just an optimisation.
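For illustration, a minimal sketch (names are illustrative, not from the thread) of deriving MinVertexIndex and NumVertices from 16-bit index data as described above:

#include <algorithm>
#include <cstdint>
#include <vector>

// Computes the tight vertex range referenced by a 16-bit index list.
void ComputeVertexRange(const std::vector<uint16_t>& indices,
                        unsigned& minVertexIndex, unsigned& numVertices)
{
    if (indices.empty()) { minVertexIndex = 0; numVertices = 0; return; }
    uint16_t lo = 0xFFFF, hi = 0;
    for (uint16_t i : indices)
    {
        lo = std::min(lo, i);
        hi = std::max(hi, i);
    }
    minVertexIndex = lo;           // lowest vertex index referenced
    numVertices    = hi - lo + 1;  // highest - lowest + 1
}

// device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0,
//                              minVertexIndex, numVertices,
//                              startIndex, primitiveCount);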

Hello, thanks for the response.
My indices are 16-bit (they were originally 32-bit, but the GMA only supports 16-bit, so I had to swap them over).
Here is the line from my engine where I call DrawIndexedPrimitive:
device->DrawIndexedPrimitive( D3DPT_TRIANGLELIST, 0, 0, num_vertices, 0, num_faces ); // Type, BaseVertexIndex, MinVertexIndex, NumVertices, StartIndex, PrimitiveCount
On every other system I have tried this on (ATI, NVIDIA), it works without any modification.
It's all very frustrating :-(
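As an aside, a hedged sketch of checking the device caps up front to decide whether 32-bit indices are usable at all; a MaxVertexIndex at or below 0xFFFF means the part effectively only handles 16-bit index ranges. 'device' is assumed to be a valid IDirect3DDevice9*:

#include <d3d9.h>

// Pick an index format the device can actually address.
D3DFORMAT ChooseIndexFormat(IDirect3DDevice9* device)
{
    D3DCAPS9 caps = {};
    device->GetDeviceCaps(&caps);
    return (caps.MaxVertexIndex > 0xFFFF) ? D3DFMT_INDEX32
                                          : D3DFMT_INDEX16;
}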

Most probably you're using software vertex processing with this part - this is somewhat more fussy about the params to DIP than hardware (which you would be using with NV and ATI, and which largely ignores the range params). I'd recommend that you test with the debug runtimes, which I suspect will tell you that you've got a DIP param out of range. The next step would be to find out which param (my money's on num_vertices) and how it got out of range (double-check how you're calculating this).
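To illustrate the kind of range check being suggested, a rough sketch (the container and variable names are assumptions, not the poster's code) that asserts every index falls inside the range declared to DrawIndexedPrimitive:

#include <cassert>
#include <cstdint>
#include <vector>

// Debug-build check: every index must lie in [minVertexIndex, minVertexIndex + numVertices).
void ValidateDrawCall(const std::vector<uint16_t>& indices,
                      unsigned minVertexIndex, unsigned numVertices)
{
    for (uint16_t i : indices)
        assert(i >= minVertexIndex &&
               i <  minVertexIndex + numVertices &&
               "index references a vertex outside the declared range");
}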

Woohooo!!!
Found and fixed it.
It turned out to be my fault: I had incorrectly converted my index buffers from 32-bit to 16-bit.
Now that's sorted, it all works, well, apart from shadow/environment/normal mapping, but that's a different issue.
Thanks again for all the help and suggestions.
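For reference, a minimal sketch (not the poster's actual code) of a 32-bit to 16-bit index-buffer conversion with a range check, the step that had gone wrong here:

#include <cstdint>
#include <stdexcept>
#include <vector>

// Narrow 32-bit indices to 16-bit, refusing any index a 16-bit buffer cannot represent.
std::vector<uint16_t> ConvertIndicesTo16Bit(const std::vector<uint32_t>& indices32)
{
    std::vector<uint16_t> indices16;
    indices16.reserve(indices32.size());
    for (uint32_t index : indices32)
    {
        if (index > 0xFFFF)
            throw std::runtime_error("index exceeds 65535; the mesh needs splitting");
        indices16.push_back(static_cast<uint16_t>(index));
    }
    return indices16;
}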
