Beco

glEdgeFlagPointer vs GL_QUADS

This topic is 4811 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

If you intended to correct an error in the post then please contact us.

Recommended Posts

Hello! I've got a problem with glEdgeFlagPointer(). In my application the model contains both triangles and quads. I use vertex arrays, and in wireframe mode I want to use edge flags to hide invisible lines. It works well when drawing triangles, but when I call glDrawArrays(GL_QUADS, ...) with the edge flag array enabled, my card produces an error. I couldn't find any information about this problem in any documentation. Does anybody have an idea? Thanks

You will probably have to post some of the code; it's quite vague, and the only thing anyone can offer is pure speculation. Perhaps post the way you build the array, the way you use glEdgeFlagPointer, the things that are most relevant to the problem.

Quote:
Original post by Xero-X2
You will probably have to post some of the code; it's quite vague, and the only thing anyone can offer is pure speculation. Perhaps post the way you build the array, the way you use glEdgeFlagPointer, the things that are most relevant to the problem.


Also, what kind of error do you get? Is it an error obtained with glGetError() (if so, which one), or a crash (if so, did you try a debugger)?

Tom

So here are the critical lines of my code:

This first version works well when I use edgeflag pointer only for triangles.

glEnableClientState( GL_VERTEX_ARRAY );
glEnableClientState( GL_EDGE_FLAG_ARRAY );

// Drawing triangles with edge flags
glVertexPointer( 3, GL_FLOAT, 0, triVertex );
glEdgeFlagPointer( 0, triEdge );

for(int i = 0; i < maxTriLength - 2; i += 3){
    glDrawArrays(GL_TRIANGLES, i, 3);
}

glDisableClientState( GL_EDGE_FLAG_ARRAY );

// Drawing quads without edge flags
glVertexPointer( 3, GL_FLOAT, 0, quadVertex );

for(int j = 0; j < maxQuadLength - 3; j += 4){
    glDrawArrays(GL_QUADS, j, 4);
}

glDisableClientState( GL_VERTEX_ARRAY );



When I modify the code to use edge flags for the quads as well, a problem occurs: glGetError() returns 0, but my card crashes and switches off the monitor (only a reset helps).


glEnableClientState( GL_VERTEX_ARRAY );
glEnableClientState( GL_EDGE_FLAG_ARRAY );

// Drawing triangles with edge flags
glVertexPointer( 3, GL_FLOAT, 0, triVertex );
glEdgeFlagPointer( 0, triEdge );

for(int i = 0; i < maxTriLength - 2; i += 3){
    glDrawArrays(GL_TRIANGLES, i, 3);
}

// Drawing quads with edge flags
glVertexPointer( 3, GL_FLOAT, 0, quadVertex );
glEdgeFlagPointer( 0, quadEdge );

for(int j = 0; j < maxQuadLength - 3; j += 4){
    glDrawArrays(GL_QUADS, j, 4);
}

glDisableClientState( GL_EDGE_FLAG_ARRAY );
glDisableClientState( GL_VERTEX_ARRAY );



All arrays are stored in Standard Template Library vectors. I got the pointers by calling the vector::begin() member function. I think that's correct, because it works in the first version (and everywhere else in my program). I've tried many variants, leaving out different lines, and found that it crashes whenever I use glEdgeFlagPointer and glDrawArrays with GL_QUADS together. Maybe my card's (Radeon 9250) GL implementation doesn't support it.

Thx

Quote:
Original post by Beco
All arrays are stored in a Standard Template Library vector structure. I got the pointers by calling vector::begin() member function. I think it's correct because it works in the first version (and also everywhere else in my program).

I definitely would not do that. I think that trick only works with some compilers. If you're convinced that this is not the problem, I can't help much other than asking whether you have installed the latest drivers...

Tom

Quote:
Original post by dimebolt
Quote:
Original post by Beco
All arrays are stored in a Standard Template Library vector structure. I got the pointers by calling vector::begin() member function. I think it's correct because it works in the first version (and also everywhere else in my program).

I definitely would not do that. I think that trick only works with some compilers. If you're convinced that this is not the problem, I can't help much other than asking whether you have installed the latest drivers...

Tom

No, don't use vector.begin(), since that will only work if your implementation uses raw pointers for its iterators. Instead use &vector[0], which will always work.
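A minimal sketch of the safe extraction (vertex_pointer is an illustrative name, not from the original code):

```cpp
#include <cassert>
#include <vector>

// std::vector storage is contiguous, so &v[0] is a raw pointer to the
// first element on every conforming implementation. v.begin() is an
// iterator, which only happens to be a raw pointer on some compilers.
float* vertex_pointer(std::vector<float>& v) {
    assert(!v.empty());   // &v[0] on an empty vector is undefined
    return &v[0];         // e.g. glVertexPointer(3, GL_FLOAT, 0, vertex_pointer(verts));
}
```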

As to your problem, the only thing I can think to suggest is to double check that your quadEdge array is valid and large enough. I don't know much about edge flags, so I can't really help there.
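One way to double-check the size: edge flags are specified per vertex, so a quad array needs one flag for every vertex the draw call reads. A hypothetical helper (make_quad_edge_flags is not from the original post; Flag stands in for GLboolean, which is typically unsigned char):

```cpp
#include <cstddef>
#include <vector>

typedef unsigned char Flag;   // stand-in for GLboolean

// Edge flags are per vertex, so a quad array with quadCount faces needs
// exactly 4 * quadCount flags: the same element count as the vertex
// array, not one flag per face or per edge.
std::vector<Flag> make_quad_edge_flags(std::size_t quadCount) {
    return std::vector<Flag>(quadCount * 4, 1);   // 1 == GL_TRUE: draw every edge
}
```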

By the way, why are you using multiple contiguous glDrawArray calls instead of just a pair of calls (one with GL_TRIANGLES and one with GL_QUADS) to draw all your objects?

Enigma

Quote:
Original post by Enigma

No, don't use vector.begin(), since that will only work if your implementation uses raw pointers for its iterators. Instead use &vector[0], which will always work.

As to your problem, the only thing I can think to suggest is to double check that your quadEdge array is valid and large enough. I don't know much about edge flags, so I can't really help there.

OK, I'll change the begin calls.

The edge flag array is exactly the same size as the vertex array. I also use vertex, normal, color and texcoord arrays with this method, and all of them work except the edge flags with quads. The strangest thing is that I do exactly the same as with the triangles, where it works fine, but with quads it fails.

Quote:

By the way, why are you using multiple contiguous glDrawArray calls instead of just a pair of calls (one with GL_TRIANGLES and one with GL_QUADS) to draw all your objects?

Enigma

The reason is that sometimes I don't want to draw all faces: each face has a flag indicating whether it is enabled, and these flags are tested inside the for loop. I posted only the problematic lines.
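The per-face enabled flags could still be combined with a single draw call by compacting the enabled faces into a contiguous buffer once per frame. A sketch, with illustrative names (compact_enabled, faceEnabled) that are not from the original code:

```cpp
#include <cstddef>
#include <vector>

// Copy only the enabled faces into a compact buffer, then the whole
// buffer can be drawn with one glDrawArrays call instead of one call
// per face. vertsPerFace is 3 for triangles, 4 for quads; each vertex
// is 3 floats (x, y, z).
std::vector<float> compact_enabled(const std::vector<float>& src,
                                   const std::vector<bool>& faceEnabled,
                                   std::size_t vertsPerFace) {
    std::vector<float> dst;
    for (std::size_t f = 0; f < faceEnabled.size(); ++f) {
        if (!faceEnabled[f]) continue;
        std::size_t base = f * vertsPerFace * 3;   // 3 floats per vertex
        dst.insert(dst.end(), src.begin() + base,
                   src.begin() + base + vertsPerFace * 3);
    }
    return dst;   // then: glVertexPointer(3, GL_FLOAT, 0, &dst[0]);
                  //       glDrawArrays(GL_QUADS, 0, dst.size() / 3);
}
```

The same compaction would be applied to the edge flag, normal and texcoord arrays so all client arrays stay in step.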

Beco

Yesterday I found the cause of the problem.

I looked up the Windows system error code that was generated when my card crashed. It said that this is usually a hardware or driver error. The solution was to turn hardware acceleration down from 100% in the display settings dialog, and now it works.

I spent more than a day debugging my code. I think it was a good lesson for me.
