Advantages of Software Rendering

Hello. This morning I was reading an article about when to use OpenGL and when to use D3D, but the article also brought up something else: a software renderer. It didn't really explain what it was good for; it just said it was a sort of third option. So now I'm curious about why one might want to use a software renderer.

If I understand correctly, some advantages of a pure software rendering environment are:
- Performs consistently on any machine
- 100% cross-platform (given it's written in a cross-platform language)
- Full control over everything

The disadvantages are:
- Slow(er)
- Difficult to create / takes more time

So am I correct in assuming that building my own software renderer for my game would be like writing my own version of OpenGL/D3D? How big is the speed difference between a good software renderer and OpenGL/D3D? Thanks
I ask for help and you give me a book? I hate book. Book is stupid. Also known as Yellow at the Dark Basic forums.
Quote:Original post by Basic
How big is the speed difference between a good software renderer and OpenGL/D3D?


That would really depend. However, nearly all recent hardware has some degree of acceleration, and you don't get to use that with a software renderer. As an example, in a little app I was writing a while back, MesaGL (a software implementation of the OpenGL spec) on an AMD Athlon 1.7 was still way slower (4 fps) than a GeForce2 MX using OpenGL 1.1 (50 fps).
Quote:Original post by lonesock
Quote:Original post by Basic
How big is the speed difference between a good software renderer and OpenGL/D3D?


That would really depend. However, nearly all recent hardware has some degree of acceleration, and you don't get to use that with a software renderer. As an example, in a little app I was writing a while back, MesaGL (a software implementation of the OpenGL spec) on an AMD Athlon 1.7 was still way slower (4 fps) than a GeForce2 MX using OpenGL 1.1 (50 fps).

Uhhh, you realize you compared two different things, right? An Athlon 1.7 compared to a GeForce2 MX... what kind of computer was the GeForce2 MX in?

Beginner in Game Development?  Read here. And read here.

 

Actually, software rendering doesn't necessarily perform consistently across different machines at all. It still has to interface with the hardware at some point, so the hardware will still make some speed difference (aside from the obvious processor-speed factor). The advantage of separating from the hardware lies more in what you can do with the engine (and in not worrying about driver- or card-specific issues) than in speed.

However, as GPU pipelines become more programmable, this matters less. On a DX9 card with PS/VS 3.x, for instance, you have so much control over the render pipeline that software rendering is virtually unneeded. As programmable shader support continues to advance, software rendering will no longer hold any meaningful advantage in terms of control over the render pipeline. Considering advancements like photon mapping with programmable shaders, and more recent developments, the arguments for software rendering are all but moot. The age of software rendering (for realtime applications) is over.

The platform-independence benefits are questionable as well. You still have to write the rendered frames to the video hardware at some point, and unless you want to individually support a huge number of video drivers (or be limited to extremely low resolutions or ancient VESA modes), you'll probably end up presenting your frames via an API like DirectX or OpenGL. It used to be that hardware vendor support for software rendering was fairly decent, but this is no longer really the case. The only real win for software rendering is that you don't need different shader code for different hardware - which, with DirectX, is basically a non-issue anyway.
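To make that last step concrete, here's a minimal sketch of the idea: the software renderer fills a plain block of CPU memory, and an existing API is only used at the very end to get those pixels onto the screen. It uses legacy OpenGL with GLUT purely as the transport, and the gradient "renderer" and all the names are stand-ins invented for this example, not anyone's real code:

```cpp
// Sketch: software renderer writes into a CPU buffer; OpenGL is only the "blit".
#include <GL/glut.h>
#include <vector>

const int W = 640, H = 480;
std::vector<unsigned char> frame(W * H * 4);        // RGBA bytes, the software renderer's output

void softwareRender() {
    // Stand-in for the real software renderer: just paint a gradient.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            unsigned char* p = &frame[(y * W + x) * 4];
            p[0] = (unsigned char)(x * 255 / W);    // R
            p[1] = (unsigned char)(y * 255 / H);    // G
            p[2] = 0;                               // B
            p[3] = 255;                             // A
        }
}

void display() {
    softwareRender();
    glRasterPos2f(-1.0f, -1.0f);                    // bottom-left corner of the window
    glDrawPixels(W, H, GL_RGBA, GL_UNSIGNED_BYTE, frame.data());
    glutSwapBuffers();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(W, H);
    glutCreateWindow("software framebuffer blit");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

The same buffer could just as easily be presented with GDI, SDL, or a textured quad in D3D; the point is that the accelerated API does none of the actual rendering.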

With the dedicated, special-purpose processing power of modern GPUs, it is extremely unlikely that even a master graphics wizard could write a software renderer that holds its own against hardware-accelerated rendering in the general case - and it would take a wizard just to begin to compete with the visual quality of modern hardware-accelerated engines.


Writing a (simple) software renderer is still an excellent exercise for 3D graphics programmers, however. You will get a very intimate (if slightly dated) understanding of the theory and basic elements involved, and a lot of the more arcane behaviors of hardware accelerators will make much more sense to you. That said, it is a massive undertaking, and I wouldn't recommend starting such a project lightly.

It's not really comparable to writing OGL/DX; low-level graphics APIs are really little more than a bridge that sends geometry and texture data (and, more recently, shader information) to the video drivers, which then send it on to the hardware. A software renderer is responsible for everything from managing the scene data to actually performing z-buffering, scanline conversion, texture interpolation and possibly blending, and so on. DirectX and OpenGL do very little with your graphics in the big scheme of things; they're primarily preparation and preprocessing interfaces for getting the data into a format the hardware recognizes.
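To give a feel for that per-pixel work, here is a rough, simplified sketch of the kind of inner loop a software renderer has to provide itself: edge tests, barycentric depth interpolation, and a hand-rolled z-buffer test. Every name in it (Vertex, rasterizeTriangle, the buffer sizes) is invented for illustration; a real renderer also needs clipping, perspective correction, texturing, and so on.

```cpp
// Sketch: z-buffered triangle rasterization done entirely on the CPU.
#include <algorithm>
#include <cstdint>
#include <vector>

const int WIDTH = 320, HEIGHT = 240;

struct Vertex { float x, y, z; };                        // already projected to screen space

// Twice the signed area of triangle (a, b, c); the sign says which side of edge ab point c is on.
static float edge(const Vertex& a, const Vertex& b, const Vertex& c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

void rasterizeTriangle(const Vertex& v0, const Vertex& v1, const Vertex& v2,
                       uint32_t color,
                       std::vector<uint32_t>& colorBuf,  // WIDTH*HEIGHT pixels
                       std::vector<float>& depthBuf)     // WIDTH*HEIGHT depth values
{
    float area = edge(v0, v1, v2);
    if (area == 0.0f) return;                            // degenerate triangle

    // Screen-space bounding box, clamped to the framebuffer.
    int minX = std::max(0,          (int)std::min({v0.x, v1.x, v2.x}));
    int maxX = std::min(WIDTH - 1,  (int)std::max({v0.x, v1.x, v2.x}));
    int minY = std::max(0,          (int)std::min({v0.y, v1.y, v2.y}));
    int maxY = std::min(HEIGHT - 1, (int)std::max({v0.y, v1.y, v2.y}));

    for (int y = minY; y <= maxY; ++y) {
        for (int x = minX; x <= maxX; ++x) {
            Vertex p = { x + 0.5f, y + 0.5f, 0.0f };
            // Barycentric weights from the three edge functions.
            float w0 = edge(v1, v2, p);
            float w1 = edge(v2, v0, p);
            float w2 = edge(v0, v1, p);
            bool inside = (w0 >= 0 && w1 >= 0 && w2 >= 0) ||
                          (w0 <= 0 && w1 <= 0 && w2 <= 0);
            if (!inside) continue;

            // Interpolate depth and perform the z-buffer test by hand.
            float z = (w0 * v0.z + w1 * v1.z + w2 * v2.z) / area;
            int idx = y * WIDTH + x;
            if (z < depthBuf[idx]) {
                depthBuf[idx] = z;
                colorBuf[idx] = color;                   // a real renderer shades/textures here
            }
        }
    }
}

int main() {
    std::vector<uint32_t> colorBuf(WIDTH * HEIGHT, 0xFF000000);
    std::vector<float>    depthBuf(WIDTH * HEIGHT, 1.0e30f);
    Vertex a = { 20.0f, 20.0f, 0.5f }, b = { 300.0f, 60.0f, 0.5f }, c = { 160.0f, 220.0f, 0.5f };
    rasterizeTriangle(a, b, c, 0xFFFF0000, colorBuf, depthBuf);  // one red triangle
    return 0;
}
```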

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

Quote:Original post by Alpha_ProgDes
Uhhh, you realize you compared two different things, right? An Athlon 1.7 compared to a GeForce2 MX... what kind of computer was the GeForce2 MX in?


Sorry if I was unclear; that was all the same machine. I just dropped the Mesa OpenGL32.dll into the project directory so Windows used it instead of the nVidia driver.
The advantage of a software renderer is that you have a software renderer. [wink]

It's true that they are no match for modern programmable graphics cards. But sometimes you're in a situation where you need the flexibility of such a card, but it's simply not available. One example is when you're writing games for kids. Often, they get a hand-me-down PC with little or no 3D acceleration. So if performance demands are quite low, a software renderer can be a safe choice. Another situation is laptops with integrated graphics. Mine has an Intel Extreme Graphics 2, which isn't capable of pixel shaders at all. If you do need them, software rendering is your only choice. The same is often true for 'office' desktops. Nowadays they even sell 3 GHz computers with integrated graphics.
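For what it's worth, a game can make that fallback decision at startup by querying the device capabilities. Here's a hedged sketch using the Direct3D 9 caps query (the surrounding fallback logic and function names are invented for the example; only the D3D9 calls themselves are from the real API):

```cpp
// Sketch: pick hardware or software rendering based on pixel shader support.
#include <d3d9.h>
#include <cstdio>

bool hardwarePixelShadersAvailable()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return false;                                    // no D3D9 runtime at all

    D3DCAPS9 caps;
    bool ok = SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                           D3DDEVTYPE_HAL, &caps))
              && caps.PixelShaderVersion >= D3DPS_VERSION(1, 1);
    d3d->Release();
    return ok;
}

int main() {
    // Hypothetical engine decision: fall back to a software renderer
    // (e.g. swShader or Pixomatic) when the card can't do pixel shaders.
    std::printf(hardwarePixelShadersAvailable()
                ? "hardware path\n"
                : "software fallback\n");
    return 0;
}
```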

This is why I created swShader. It's a high-performance, high-quality software renderer with a DirectX interface. It currently supports many features that are not available on my integrated graphics. It's like a match where the opponent doesn't show up...

Pixomatic is another modern software renderer. Have a look at its "Why Pixomatic?" page for more reasons why software rendering is here to stay. It has fewer features than swShader but better performance, and it's of commercial quality. It's currently used in Unreal Tournament 2004, Medal of Honor, and other titles!
Software renderers are good for one thing. Running 3d on systems that don't have any kind of 3d acceleration.
...I believe understanding the fundamentals of software rendering gives you a more solid understanding of the hardware behind 3D acceleration, although this is not a necessity of course. A better understanding of hardware can lead to better programming techniques IMHO.

GCoder
Software rendering is extremely useful. It lets you do things easily that are difficult with hardware rendering:

- Software rendering can be done easily into system memory (a quick sketch of this follows after the list)
- Software rendering can use any colour depth / resolution, even those not supported by the graphics card
- Software rendering is guaranteed to produce the same result on any machine - you're not at the mercy of driver developers.
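On the first point, here's a minimal sketch (all names here are made up for illustration): the "framebuffer" is just an array the program owns, at whatever size and pixel format it likes, completely independent of the video card.

```cpp
// Sketch: a software-rendering target is just memory the program owns.
#include <cstdint>
#include <vector>

struct Framebuffer {
    int width, height;
    std::vector<uint32_t> pixels;                        // 0xAARRGGBB here, but could be anything

    Framebuffer(int w, int h) : width(w), height(h), pixels(w * h, 0) {}

    void putPixel(int x, int y, uint32_t c) {
        if (x >= 0 && x < width && y >= 0 && y < height)
            pixels[y * width + x] = c;
    }
};

int main() {
    // An odd, card-independent resolution is perfectly fine in software.
    Framebuffer fb(1000, 700);
    for (int y = 0; y < fb.height; ++y)
        for (int x = 0; x < fb.width; ++x)
            fb.putPixel(x, y, 0xFF000000u | ((uint32_t)(x & 0xFF) << 16) | (uint32_t)(y & 0xFF));
    // fb.pixels can now be read back, saved to disk, or blitted to the screen
    // through whatever display API is available.
    return 0;
}
```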

Of course, its extremely low relative performance for most things makes it less useful for realtime graphics. But using software rendering for bits of non-realtime work is still a definite option:

- Backgrounds
- Prerendered bitmap / voxel graphics which are subsequently blitted to the screen (possibly using hardware rendering) - for example, decorating sprites with different colour schemes, weapons, clothes etc.
- Intermission screens etc.

Say you have a space combat game and the player hyperspaces to another part of the galaxy - you could use software rendering to re-render the galaxy as a skybox / backdrop. This would involve rendering a large number of stars (perhaps 10k or 100k) according to some IFS fractal or other pseudo-random system (because obviously you can't store the positions of all 100k stars). You *COULD* do this with a GPU vertex program or something (probably), but it would probably be a lot easier, and perhaps not much slower, to do it on the CPU (seeing as it's not that time-critical anyway).

Once the player reaches the hyperspace location, the stars won't have to be redrawn again until she makes another leap (assuming that the craft is not capable of massively faster-than-light travel).
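A rough sketch of that backdrop idea (the seed handling, names, and counts are invented for the example; a real game would use an IFS or whatever distribution it wants): star positions are never stored, they're regenerated from the current sector's seed and plotted once into a CPU-side bitmap that later becomes the skybox texture.

```cpp
// Sketch: regenerate a star backdrop from a sector seed in software.
#include <cmath>
#include <cstdint>
#include <random>
#include <vector>

const int SIZE = 1024;                                   // one square skybox face

void renderStarBackdrop(uint32_t sectorSeed, std::vector<uint32_t>& face)
{
    face.assign(SIZE * SIZE, 0xFF000000);                // opaque black sky
    std::mt19937 rng(sectorSeed);                        // same seed -> same stars every visit
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);

    const int STAR_COUNT = 100000;
    for (int i = 0; i < STAR_COUNT; ++i) {
        int x = (int)(uni(rng) * (SIZE - 1));
        int y = (int)(uni(rng) * (SIZE - 1));
        // Brightness falls off so most stars are dim and a few are bright.
        uint8_t b = (uint8_t)(255.0f * std::pow(uni(rng), 3.0f));
        face[y * SIZE + x] = 0xFF000000u | ((uint32_t)b << 16) | ((uint32_t)b << 8) | b;
    }
}

int main() {
    std::vector<uint32_t> face;
    renderStarBackdrop(0xBEEF1234, face);                // only redone when the player jumps
    // face would then be handed to the renderer as a backdrop / skybox texture.
    return 0;
}
```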

Mark
Quote:Original post by Spoonbender
Software renderers are good for one thing. Running 3d on systems that don't have any kind of 3d acceleration.

That's a little too general. There are several cases where you do have 3D acceleration but software rendering is still valuable:

- When there is 3D acceleration but it does not support a required feature (e.g. shaders).
- When instability of drivers and differences in quality between cards are unacceptable - for example, in industrial and medical applications.
- For research. You can use algorithms that don't work with current hardware rendering (e.g. true parametric surfaces).
- For high-quality rendering, often non-real-time. Pixar uses only software rendering for final images.
- When hardware rendering is too slow. This is exceptional but it still happens. For example line rendering is sometimes faster in software.

Even in the context of games, some of these points apply. We'll also come to a point where buying a graphics card is no longer a necessity for older games, because of increasing CPU performance (dual-core now, quad-core next year)...

