

Render cost of degenerate triangles


10 replies to this topic

#1 Tom   GDNet+   -  Reputation: 352


Posted 23 April 2004 - 08:12 AM

Are the latest nVidia drivers designed to understand the concept of degenerate triangles, in that they will effectively skip rendering them? This is pure speculation; I've read nothing to suggest it, and I have no evidence to support it. The reason I ask is because the overdraw for degenerate triangles approaches 200 percent as your triangle count rises (it's never less than 100 percent); however, if the video card knows to skip these "invisible" triangles, then it really doesn't matter how many you use, right? It's only a matter of vertex buffer memory. I wonder if driver developers have considered this.


#2 neneboricua19   Members   -  Reputation: 634


Posted 23 April 2004 - 09:19 AM

Short answer: it's totally safe to use degenerate triangles. Even older cards handle them just fine.

Long answer:
The idea of using degenerate triangles has been around for a long time. It's really not even possible for a degenerate triangle to be rasterized because, by definition, a degenerate triangle has at least two vertices that are the same. This means that any attempt to rasterize the triangle will result in just a line being rendered.

Depending on what kind of triangle rasterization algorithm is actually implemented by the hardware, this can be difficult to handle. Most triangle rasterization techniques require that all the vertices of the triangle be distinct.

For this reason, degenerate triangles are simply ignored by the hardware.
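A quick way to see why nothing gets filled (an illustrative sketch; `signed_area` is just a hypothetical helper, not anything from a graphics API):

```python
# A triangle is degenerate when at least two of its vertices coincide
# (or all three are collinear): its signed area is zero, so a rasterizer
# has no interior to fill.

def signed_area(a, b, c):
    """Twice the signed area of 2D triangle abc (cross product of two edges)."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

normal_tri = [(0, 0), (4, 0), (0, 3)]
degenerate_tri = [(0, 0), (4, 0), (4, 0)]   # two identical vertices

print(signed_area(*normal_tri))      # 12 -> has an interior to rasterize
print(signed_area(*degenerate_tri))  # 0  -> no pixels produced
```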

neneboricua

#3 DrunkenHyena   Members   -  Reputation: 805


Posted 23 April 2004 - 09:24 AM

Degenerate tris do take up memory, they may slow the upload of the buffers (depending on configuration), and they will be transformed, which includes running through a vertex shader if you have one.

They aren't free, but they are pretty cheap. Though if you have more than a few, it is often a win to switch to indexed lists rather than strips.


Stay Casual,

Ken
Drunken Hyena

#4 Metron   GDNet+   -  Reputation: 266


Posted 23 April 2004 - 09:41 AM

The usage of degenerate triangles is virtually free because (using index lists) every vertex only gets transformed one time. Thus DrunkenHyena's statement is wrong insofar as it implies an unnecessary transformation overhead for the vertices of the degenerate triangle.

Remember that degenerate triangles are made of the last vertex of the previous 'has to be drawn' triangle and the first vertex of the next 'has to be drawn' triangle.
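The stitching pattern described here can be sketched like this (an illustrative helper, not actual API code; it is shown with indices, but the same vertex repetition works on raw vertex strips):

```python
def stitch_strips(strip_a, strip_b):
    """Join two triangle strips by repeating the last index of the first
    strip and the first index of the second.  The repeated indices produce
    only degenerate (zero-area) triangles, which the hardware discards."""
    return strip_a + [strip_a[-1], strip_b[0]] + strip_b

a = [0, 1, 2, 3]        # two triangles
b = [4, 5, 6, 7]        # two more, not contiguous with the first
print(stitch_strips(a, b))  # [0, 1, 2, 3, 3, 4, 4, 5, 6, 7]
```

Repeating an even number of vertices (two per join) also preserves the strip's winding order, so the real triangles after the join still face the right way.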

Sidenote: Degenerate triangles are always (!) used with index lists... never without index lists.

Best regards,
Metron

#5 DrunkenHyena   Members   -  Reputation: 805


Posted 23 April 2004 - 09:52 AM

quote:

Sidenote: Degenerate triangles are always (!) used with index lists... never without index lists.


Degenerate triangles are used in STRIPS because it's necessary to stitch non-contiguous strips together. I cannot think of a reason why you would ever want to use them with lists.



Stay Casual,

Ken
Drunken Hyena

#6 Raloth   Members   -  Reputation: 379


Posted 23 April 2004 - 09:55 AM

quote:
Original post by DrunkenHyena
quote:

Sidenote: Degenerate triangles are always (!) used with index lists... never without index lists.


Degenerate triangles are used in STRIPS because it's necessary to stitch non-contiguous strips together. I cannot think of a reason why you would ever want to use them with lists.
Rendering heightmaps.



#7 neneboricua19   Members   -  Reputation: 634


Posted 23 April 2004 - 10:12 AM

quote:
Original post by Metron
The usage of degenerate triangles is virtually free because (using index lists) every vertex only gets transformed one time. Thus DrunkenHyena's statement is wrong insofar as it implies an unnecessary transformation overhead for the vertices of the degenerate triangle.


Actually, this isn't quite correct. Degenerate triangles are used with triangle strips, not triangle lists. They can also be used with or without index buffers.

How many times a vertex is transformed depends on the index buffer and has nothing to do with whether a triangle is degenerate or not. The vertices of a degenerate triangle may indeed be transformed more than once because an index buffer can specify the vertices of degenerate triangles more than once if it needs to.
quote:

Sidenote : Degenerated triangles are always (!) used with index lists... never without index lists.


Again, degenerates cannot be used when rendering using triangle lists (whether it be with or without an index buffer). They are only used with triangle strips. Degenerates can be used in a triangle strip regardless of whether or not an index buffer is used.

neneboricua

#8 JonStelly   Members   -  Reputation: 127


Posted 23 April 2004 - 10:55 AM

quote:
Again, degenerates cannot be used when rendering using triangle lists


They CAN be used; it's just that there's no need to use them.

#9 Coincoin   Members   -  Reputation: 122


Posted 23 April 2004 - 03:22 PM

Ok, let's clear things up a little.

Degenerates and lists/strips.
Degenerates SHOULD only be used with strips. They CAN be used with lists, but that's totally useless. If you see a reason to use degenerates with lists, feel free to post it here.

Degenerates and index buffers
Degenerates are used to get better batching with strips. Strips are used to reduce upload and transformation overhead. Index buffers accomplish almost exactly the same thing as strips: they let the hardware reuse transformed (cached) vertices and save a lot of upload overhead, since you only need to upload 16-bit (or 32-bit) indices instead of full repeated vertices. Therefore, there is (almost) no point in combining strips and index buffers, except in very specific cases, such as heightmaps, since you can reuse the same index buffer for each patch (and still, that's very close to using strips only).
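The heightmap case can be sketched as follows (a hypothetical index-buffer generator; the exact vertex layout is an assumption, but it shows why one index buffer can serve every patch of the same size):

```python
def grid_strip_indices(w, h):
    """Indices for one triangle strip covering a (w x h)-vertex grid patch,
    with rows stitched together by repeated indices (degenerate triangles).
    Because the indices depend only on the grid layout, the same buffer can
    be reused for every heightmap patch of identical dimensions."""
    idx = []
    for row in range(h - 1):
        if row > 0:
            idx.append(row * w)                # repeat first index of the row
        for col in range(w):
            idx.append(row * w + col)          # vertex in this row
            idx.append((row + 1) * w + col)    # vertex in the next row
        if row < h - 2:
            idx.append((row + 1) * w + w - 1)  # repeat last index of the row
    return idx

print(grid_strip_indices(3, 3))
# [0, 3, 1, 4, 2, 5, 5, 3, 3, 6, 4, 7, 5, 8]
```

For a 3x3 grid this yields 14 indices, i.e. 12 strip triangles, of which 8 are real (two per quad) and 4 are the degenerates stitching row one to row two.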


Conclusion
You should use EITHER:
- Degenerates on non-indexed strips.
OR
- Indexed lists (with no degenerates, of course)

OR MAYBE
- Indexed strips, which should only be used on several batches that use the same indices.

[edited by - Coincoin on April 23, 2004 10:28:06 PM]

#10 Anonymous Poster_Anonymous Poster_*   Guests   -  Reputation:


Posted 23 April 2004 - 06:01 PM

Just to add, degenerate triangles are used in tri-lists for some special algorithms. For example, you can implement a silhouette edge rendering algorithm using degenerate triangles. So they aren't just used in the tri-strip case.
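One way such a silhouette setup can work (an illustrative sketch only, not the poster's specific method; the edge-quad layout, `silhouette_edge_quads`, and the `dup_offset` convention are all assumptions):

```python
def silhouette_edge_quads(edges, dup_offset):
    """For each mesh edge (i0, i1), emit two tri-list triangles that are
    degenerate at rest: they reference duplicate vertices (index + dup_offset)
    stored at the same positions as the originals.  A vertex shader can later
    displace the duplicates on silhouette edges, opening the zero-area quads
    into visible fins, while non-silhouette quads stay degenerate and vanish."""
    indices = []
    for i0, i1 in edges:
        d0, d1 = i0 + dup_offset, i1 + dup_offset
        indices += [i0, i1, d0,  d0, i1, d1]   # zero area until displaced
    return indices

print(silhouette_edge_quads([(0, 1)], dup_offset=10))
# [0, 1, 10, 10, 1, 11]
```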

#11 Tom   GDNet+   -  Reputation: 352


Posted 24 April 2004 - 07:10 AM

Thanks for the info. Sounds like the only thing I've got to worry about is assembling my vertex buffers fast enough.

My original overdraw assessment was incomplete and therefore incorrect. I was assuming a batch of sprites, where each sprite is two triangles, and no two sprites are connected. In this case alone, overdraw approaches (but never reaches) 200 percent.
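That arithmetic can be checked with a quick model (assuming each sprite is a 4-vertex quad and each join between sprites repeats two vertices, which adds four degenerate triangles per join):

```python
def degenerate_ratio(n_sprites):
    """Stitch n disconnected quads into one strip.  Each quad contributes
    2 real triangles; each of the n-1 joins repeats 2 vertices, adding
    4 degenerate triangles.  Returns degenerates as a fraction of real
    triangles -- this approaches 2.0 (200 percent) as n grows."""
    real = 2 * n_sprites
    degenerate = 4 * (n_sprites - 1)
    return degenerate / real

print(degenerate_ratio(2))     # 1.0   (100 percent)
print(degenerate_ratio(1000))  # 1.998 (approaching 200 percent)
```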

Of course, in a more reasonable application (like a terrain renderer), you would have far less overdraw, thereby making degenerate triangles extremely efficient. I'll remember this for future reference.

[edited by - Tom on April 24, 2004 2:18:53 PM]






