Lio

[SOLVED] C# - problem with DrawIndexedPrimitives()



I've just started with C# and DirectX, and I've been stuck on this for 6 hours now, so this is a last resort. I've tried searching this forum and Google; I've even tried to read a tutorial in German. I've looked at the NeHe tutorials that have been converted to Managed DirectX, but in lessons 4, 5, 6 and 7 he didn't use any IndexBuffers or even VertexBuffers. I've taken bits and pieces from various tutorials, put them together, and made changes here and there.

I've been able to draw sprites and triangles using DrawPrimitives() together with CustomVertex.PositionColored, and I've also been able to draw textured quads and triangles using CustomVertex.PositionTextured. My trouble begins when I try out DrawIndexedPrimitives() combined with an IndexBuffer and a VertexBuffer. I think I've read everything there is to read about them, and I'm fully aware of what the DrawIndexedPrimitives() parameters mean, so I can't for the life of me figure out what I'm doing wrong. I'll also be quick to add that the problem is not with any of my matrices, since I have other geometry showing up just fine; at least it's not a case of my look-at vector pointing the wrong way.

Now for some code. I have the following set, where pps is the PresentParameters for the device. I've tried with lighting set to both false and true, and the same with all the CullMode settings. I do of course have lots of other settings in pps, but I thought these might be of use.
______________________________________________________________________
pps.EnableAutoDepthStencil = true;
pps.AutoDepthStencilFormat = DepthFormat.D16;

DX_Device.RenderState.Lighting = false;
DX_Device.RenderState.CullMode = Cull.None;
______________________________________________________________________
Here's how I set up the input for the VertexBuffer and the IndexBuffer.
______________________________________________________________________
 
CustomVertex.PositionTextured[] verts = new CustomVertex.PositionTextured[4];
verts[0] = new CustomVertex.PositionTextured(new Vector3(-1, 1, 1), 0, 0);
verts[1] = new CustomVertex.PositionTextured(new Vector3(1, 1, 1), 1, 0);
verts[2] = new CustomVertex.PositionTextured(new Vector3(-1, -1, 1), 0, 1);
verts[3] = new CustomVertex.PositionTextured(new Vector3(1, -1, 1), 1, 1);

int[] inds = { 0, 1, 2, 1, 3, 2 };
______________________________________________________________________
This code snippet is where I set up my VertexBuffer and IndexBuffer. Is it right to use the SetData() call? I've seen lots of tutorials where they instead use the Lock() and Unlock() calls on the VertexBuffer and IndexBuffer and write the data through a GraphicsStream (I've sketched that approach in comments after the SetData() calls below, for comparison).
______________________________________________________________________
        texture = TextureLoader.FromFile(device, path);

        vb = new VertexBuffer(
            typeof(CustomVertex.PositionTextured),
            verts.Length,
            device,
            Usage.WriteOnly,
            CustomVertex.PositionTextured.Format,
            Pool.Default);

        vb.SetData(verts, 0, LockFlags.None);

        ib = new IndexBuffer(
            typeof(int),
            inds.Length,
            device,
            Usage.WriteOnly,
            Pool.Default);

        ib.SetData(inds, 0, LockFlags.None);
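
        // For comparison, a rough sketch of the Lock/Unlock pattern the
        // tutorials use (not tested here; SetData above should do the same job):
        //
        //     GraphicsStream stream = vb.Lock(0, 0, LockFlags.None);
        //     stream.Write(verts);
        //     vb.Unlock();
        //
        // with the same Lock/Write/Unlock sequence on the IndexBuffer.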
______________________________________________________________________
This code piece is the actual render method that gets called in my animation loop. As you can see, I've left in the code that I have commented out. That commented-out code works for some reason, which has me puzzled even more; it's only when I use the DrawIndexedPrimitives() call that nothing shows up.
______________________________________________________________________
    public override void Render(Device device)
    {
        device.VertexFormat = CustomVertex.PositionTextured.Format;
        device.SetTexture(0, texture);
        device.SetStreamSource(0, vb, 0);
        device.Indices = ib; 
/*
        device.DrawPrimitives(
            PrimitiveType.TriangleStrip,
            0,
            2); // number of primitives
*/
        device.DrawIndexedPrimitives(
            PrimitiveType.TriangleList,     //list
            0,
            0,
            4,        //number of vertices
            0,
            2);       //number of primitives - triangles
     
    }
______________________________________________________________________
Lastly, I just want to say thanks for any help. It's very much appreciated. [Edited by - Lio on September 29, 2007 2:37:25 AM]

If DrawPrimitives is working for you, this should work as well:


int[] inds = { 0, 1, 2, 3 };
device.DrawIndexedPrimitives(
PrimitiveType.TriangleStrip, // strip
0,
0,
4, //number of vertices
0,
2); //number of primitives - triangles



If that works, then there must be something wrong with the way you are arranging the index buffer.

I've tried what you described, setting the indices to {0, 1, 2, 3} and the primitive type to TriangleStrip. It didn't work; still nothing is showing.

I've tried copy-pasting in some other code from a tutorial, and it works. The difference is that they use shorts instead of ints.

Here's a piece of the code; at the end, buf is filled with a SetData(arr, 0, 0) call.
______________________________________________________________

IndexBuffer buf = new IndexBuffer(
typeof(short), // What will the indices be?
36, // How many of them will there be?
device, // The device
0, // Advanced setting
Pool.Default // Advanced setting
);

short[] arr = new short[36];

int i = 0;
// Front face
arr[i++] = 0; arr[i++] = 3; arr[i++] = 1;
arr[i++] = 0; arr[i++] = 2; arr[i++] = 3;
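
// ... remaining faces omitted ...

buf.SetData(arr, 0, 0); // as mentioned above, the buffer is filled with SetData at the end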

______________________________________________________________
Does anyone have a clue why I can't use ints?

Edit:
Yep, that sorted it. Everything works fine after I changed the type from int to short, both in the IndexBuffer constructor and in the array of indices I load into it.

If anyone has an explanation for this, I'll be happy to hear it :)

Edit 2:
I read on another forum that it has to do with your hardware: if the graphics card doesn't support 32-bit indices, you can't use ints, of course. I'm currently on my puny IBM X40 laptop, so that might explain it.
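
For completeness, here is roughly what the fixed index buffer setup looks like with 16-bit indices (a minimal sketch based on the code from my first post):
______________________________________________________________
short[] inds = { 0, 1, 2, 1, 3, 2 };

ib = new IndexBuffer(
    typeof(short),      // 16-bit indices instead of int
    inds.Length,
    device,
    Usage.WriteOnly,
    Pool.Default);

ib.SetData(inds, 0, LockFlags.None);
______________________________________________________________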

Quote:
Original post by Lio
Edit 2:
I read on another forum that it has to do with your hardware: if the graphics card doesn't support 32-bit indices, you can't use ints, of course. I'm currently on my puny IBM X40 laptop, so that might explain it.


That's right. You can check this property in the DirectX Caps Viewer under "DirectX Graphics Adapters" -> <yourcard> -> "D3D Device Types" -> "HAL" -> "Caps"

Click on that Caps folder and look for "MaxVertexIndex". I'd guess it's below 65536 (2^16) if your card only supports 16-bit indices.
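
You can also query it from code; something along these lines should work with the managed Caps structure (a quick sketch, untested):

Caps caps = Manager.GetDeviceCaps(0, DeviceType.Hardware);
if (caps.MaxVertexIndex <= 0xFFFF)
{
    // only 16-bit indices supported: create the IndexBuffer with typeof(short)
}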
