Hardware accelerated 2D graphics

Started by
13 comments, last by OrangyTang 15 years, 9 months ago
Using a 3D accelerated library (like OpenGL) for 2D graphics seems to be the way to go today. Is it possible, though, to have hardware accelerated antialiased thick lines, antialiased font rendering, antialiased ellipses, fluent thick curves, blur/edge detection and other such filters, a separate alpha channel for R, G and B, an "XOR" overlay mode, etc.? If so, how?
Quote:Original post by Lode
Using a 3D accelerated library (like OpenGL) for 2D graphics seems to be the way to go today.

Hell yes.

Quote:Is it possible, though, to have hardware accelerated antialiased thick lines, antialiased font rendering, antialiased ellipses, fluent thick curves, blur/edge detection and other such filters, a separate alpha channel for R, G and B, an "XOR" overlay mode, etc.? If so, how?

Sure. Most of it can be done by simply generating the right geometry and appropriate textures - thick lines are best done with three quads (a start and end cap for shaped ends, and a quad in the middle for the body of the line). Similarly for ellipses and curves.

Filters and blurs are reasonably easy to do with render-to-texture and shaders. XOR can be done with an appropriate blending mode (IIRC you want GL_ZERO, GL_ONE_MINUS_DST_COLOR). Fonts can be done with either bitmap fonts (fast, easy, high quality, but they don't scale too well) or more advanced tricks like Valve's vector text method.

I love this stuff - you can get much more interesting graphics as soon as you realise that 2d doesn't just have to be sprites and instead start working with polys and sneaky use of textures. [grin] Check out my journal for more examples of what you can do with these techniques.
I've done the lines with caps and stuff :)

But for example, while I've made an attempt that somewhat works (by remembering and connecting points to previous lines, and not drawing the cap until a line isn't connected), drawing a bezier curve that is translucent and does NOT show overlapping line segments through the translucency is hard and sloppy this way.

But what about antialiased ones?
Either use edge antialiasing (cheap and easy in OpenGL - glEnable(GL_POLYGON_SMOOTH)), which works in screen space, or make the edges of the texture alpha out and get antialiasing that way (which gives you antialiasing in world space, but you get more control over the shape of the edge).

Connected lines and curves are best done as one single object, I find, so you can easily generate the geometry on the fly and stitch the edges together. Translucent curves with non-overlapping geometry weren't too hard when I tried it - what were you having trouble with?

If you're really having trouble (maybe some weird line shapes or brush strokes or whatever) you can compose the lines/strokes into a single texture at full opacity (which will eliminate overlapping bits of geometry) then render this back to the screen with the appropriate transparency level.

I get the feeling your questions are all a little vague; perhaps if you described what you were having trouble with, we'd be able to provide a more useful answer.
Quote:
hardware accelerated antialiased thick lines


OpenGL supports this natively (glLineWidth etc.)

Quote:
antialiased font rendering


Depends on what you want, exactly. Usually, you'd render the glyphs into a texture, then blit the texture. However, I remember seeing a paper where they implemented glyph rendering using shaders (don't have the link anymore, sorry).

Quote:
antialiased ellipse


Hmm... You could render a quad that contains the ellipse, and use a fragment program to fill the appropriate pixels by solving the ellipse equation per fragment.

Quote:
fluent thick curves


Please elaborate.

Quote:
blur/edge detection and other such filters, a separate alpha channel for R, G and B


All these things could be done via fragment programs.

Quote:
"XOR" overlay mode


Can be done in OpenGL via glLogicOp.

cu,
Prefect
Widelands - laid back, free software strategy
Quote:Original post by Prefect

Quote:
fluent thick curves


Please elaborate.



The translucent bezier curve mentioned in an earlier post. It works, but it just doesn't feel "right" - it doesn't seem like 3D libraries were ever meant to do that.

Do all video cards support the line thickness of OpenGL, and can Direct3D do something similar?
Quote:Original post by Prefect
Quote:
hardware accelarated antialiased thick lines


OpenGL supports this natively (LineWidth etc.)

It's not great, though - depending on the hardware it may have a low upper limit (16, IIRC, on some of the Intel chips), and you always get ugly square end caps. And if you draw a sequence of lines they won't be properly joined up and you'll get ugly cracks. Doing it yourself with geometry is a much better idea.
I'm surprised how many people think that the _SMOOTH flags (or even MSAA) in OpenGL will do proper AA. They'll do something, but it won't be anywhere near as nice-looking as a proper AA vector-renderer (like the Flash renderer).

This page offers some comparisons and a pretty good work-around (as other people have mentioned, adding caps/fins so that the texturing units do the smoothing): http://homepage.mac.com/arekkusu/bugs/invariance/
Sounds to me like you just want to recreate Anti-Grain Geometry using 3D hardware. It's possible to some degree, but certainly isn't easy (depending on how far you want to go; if it's just thick lines, that's quite easy to do with quads).
I've used Anti-Grain and think it is really nice - it uses C++ templates to construct a vector and raster transform pipeline. It might be possible to adapt the upper stages for creating geometry such as stroked lines etc. and avoid the lower, non-accelerated scanline rasterizing stages.


Could I ask a question that is on topic - I am very ignorant about accelerated hardware, but would it be possible to speculate whether a late-model accelerated card (NVIDIA 8800) would be able to render a screen full of text (such as a displayed Word document or Firefox tab) using just the glyph polys and poly-holes constructed from bezier geometry (this is easily extracted using FreeType)? If so, how would this be done - can the card (in 2D) work with raw poly geometry, or would the beziers need to be interpolated on the CPU? Also, would triangulation of the polys need to be done on the CPU, or can OpenGL or DirectX handle this directly? Do shaders have any application here?

I work on an app that combines text, symbols (not raster) and other 2D vector information, and I would be very open to treating everything in a uniform manner if the hardware was capable of supporting it. Unfortunately I don't know enough of the GL or DX APIs (and lack the time) to do my own experimentation.

