State of the art in real-time software rendering?

Started by
18 comments, last by MJP 4 years, 10 months ago

Now that my real-time software renderer is almost complete after many years, I'm finding it difficult to research the current state of the art in the area to compare performance against. Most other software renderers I've found are either unreleased hobby projects, slow emulations of GPUs with all their limitations, or targeted at very old CPUs from the 1990s. The best so far was a near-real-time CPU ray-tracing experiment by Intel from around 2004.

Feel free to share any progress you've made on the research subject or interesting real-time software renderers you've found. By real-time, I mean at least 60 FPS at a reasonable resolution for the intended purpose. By software, I mean no dependencies on Direct3D, OpenGL, OpenCL, Vulkan, or Metal.
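For readers who haven't written one, the heart of such a renderer can be sketched in a few lines. This is an illustrative half-space (edge-function) triangle rasterizer, not code from any project mentioned in the thread; all names are made up for the example:

```cpp
#include <array>
#include <cassert>
#include <vector>

// Signed area of the parallelogram spanned by (b-a) and (p-a).
// Positive when p lies to the left of the directed edge a->b.
static float edgeFunction(float ax, float ay, float bx, float by,
                          float px, float py) {
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

// Rasterize one counter-clockwise triangle (v = {x0,y0,x1,y1,x2,y2})
// into a width*height coverage mask by testing every pixel center
// against the three edges. A real renderer would add clipping,
// bounding-box reduction, and per-pixel shading.
std::vector<bool> rasterizeTriangle(int width, int height,
                                    std::array<float, 6> v) {
    std::vector<bool> mask(width * height, false);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            float px = x + 0.5f, py = y + 0.5f; // sample at pixel center
            bool inside =
                edgeFunction(v[0], v[1], v[2], v[3], px, py) >= 0 &&
                edgeFunction(v[2], v[3], v[4], v[5], px, py) >= 0 &&
                edgeFunction(v[4], v[5], v[0], v[1], px, py) >= 0;
            mask[y * width + x] = inside;
        }
    }
    return mask;
}
```

The edge-function formulation is what most modern software (and hardware) rasterizers build on, because each pixel test is independent and therefore trivially vectorizable.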


Well, the WARP adapter in D3D11 is pure software as far as I know. Of course, that falls into the area of GPU emulation, I guess. I don't think anybody has put serious effort into software rasterizers. Or are we talking about any kind of renderer here? Most (all?) software ray tracers are geared towards the best possible image quality and pay multi-hour rendering times per frame as the price; I'm thinking of Arnold, RenderMan, and so on. Other than that, the preview renderers are pretty fast most of the time. However, rasterizers haven't been done in software for a long time for a good reason. Even the cheapest Intel chip has an IGPU.

44 minutes ago, Batzer said:

However, rasterizers haven't been done in software for a long time for a good reason. Even the cheapest Intel chip has an IGPU.

You have a valid point for AAA games on Windows and Mac, but not all systems have well-behaved graphics drivers. Some indie developers just want their graphics to look the same on every platform, instead of wondering why their OpenGL shaders won't compile on 12 of 30 targeted graphics drivers, leak memory (Nvidia drivers), produce random noise from integer addition (Nexus), or show black seams in 2D overlays (reference OpenGL). Working around OpenGL driver bugs on Android used to be my never-ending full-time job as a firmware developer, because not all systems have Vulkan drivers. Even if you let someone else do it for you, bugs will creep in as the software ages, because there is no safe subset of the graphics API when bugs are everywhere and hardware vendors argue about how to interpret the OpenGL specification. So if software rendering can be fast enough, with Direct3D's quality of deterministic graphics, it would mean a lot for cross-platform portability and the long-term quality of indie games that don't need the insane overkill performance of a modern GPU.

Just the fact that nobody else has been there for ages, while 32-core CPUs are available, sounds like a treasure hunt to me.
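The many-core remark points at the obvious starting place: splitting scanlines across threads. A minimal sketch of that idea, with a trivial gradient standing in for real per-pixel shading (all names here are invented for the example):

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <thread>
#include <vector>

// Fill a framebuffer by dividing scanlines across hardware threads.
// Each thread writes a disjoint range of rows, so no synchronization
// is needed beyond joining at the end. The "shader" here is a trivial
// gradient; a real renderer would rasterize and shade per pixel.
void renderRows(std::vector<uint8_t>& fb, int width, int height) {
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    int rowsPerWorker = static_cast<int>((height + workers - 1) / workers);
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        int y0 = static_cast<int>(w) * rowsPerWorker;
        int y1 = std::min(height, y0 + rowsPerWorker);
        pool.emplace_back([&fb, width, y0, y1] {
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < width; ++x)
                    fb[y * width + x] = uint8_t((x + y) & 0xFF);
        });
    }
    for (auto& t : pool) t.join();
}
```

Row partitioning keeps each thread's writes contiguous and cache-friendly, which is why it is a common first cut before moving to tile-based work stealing.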

In addition to WARP, which is a fast software rasterizer, there were also old versions of D3D REF, the slow software rasterizer with readable code, and Mesa has a reasonably fast software OpenGL implementation in its LLVMpipe backend.
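For anyone wanting to try LLVMpipe without uninstalling their hardware drivers, Mesa can be switched to it per process through environment variables. A sketch, assuming a Mesa-based stack and that `glxinfo` (from mesa-utils) is installed:

```shell
# Force Mesa to use a software rasterizer for this process,
# without touching the installed hardware drivers.
export LIBGL_ALWAYS_SOFTWARE=1    # fall back to software rendering
export GALLIUM_DRIVER=llvmpipe    # specifically request LLVMpipe

# Verify which renderer is active (requires mesa-utils):
glxinfo | grep "OpenGL renderer"
```

If it worked, the renderer string should mention llvmpipe rather than the GPU.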

5 hours ago, Dawoodoz said:

With software, I mean no dependencies on Direct3D, OpenGL, OpenCL, Vulkan nor Metal.

IMO this is not a useful or meaningful distinction. While it's true that D3D/GL/Vulkan/Metal expose a "rendering pipeline", they also expose compute shaders, which completely bypass the normal rasterization pipe. So I would have no problem calling something a "software renderer" if it implemented the rendering entirely in compute shaders (and people have done exactly that, although more often in CUDA). More generally, I would consider these to be "GPU APIs" rather than "3D rendering APIs", since they can be used purely as a means of running arbitrary vectorized programs on the GPU. It's fine if you have some particular reason to only target CPUs, but if you don't, you may want to consider what kind of software renderer could be implemented in CUDA or compute shaders. There's also a really interesting continuum of techniques blurring the line between software and hardware rendering now that hardware-assisted ray tracing has hit mainstream GPUs, and even without that there are lots of opportunities to get away from simply rasterizing triangles.

42 minutes ago, MJP said:

IMO this is not a useful or meaningful distinction. While it's true that D3D/GL/Vulkan/Metal expose a "rendering pipeline", they also expose compute shaders, which completely bypass the normal rasterization pipe. So I would have no problem calling something a "software renderer" if it implemented the rendering entirely in compute shaders (and people have done exactly that, although more often in CUDA). More generally, I would consider these to be "GPU APIs" rather than "3D rendering APIs", since they can be used purely as a means of running arbitrary vectorized programs on the GPU. It's fine if you have some particular reason to only target CPUs, but if you don't, you may want to consider what kind of software renderer could be implemented in CUDA or compute shaders. There's also a really interesting continuum of techniques blurring the line between software and hardware rendering now that hardware-assisted ray tracing has hit mainstream GPUs, and even without that there are lots of opportunities to get away from simply rasterizing triangles.

There are many odd systems out there that cannot access the GPU properly, so that's the reason for my distinction.

* Google tried to block OpenCL on Android in order to promote their own RenderScript, which is not nearly as good.

* Ubuntu often comes with software-emulated GPU drivers that are many times slower for 2D overlays. Installing drivers can be a month's worth of rebooting the X server and manually modifying and recompiling the Linux kernel together with a level-4 Nvidia support technician. Even our company's senior Linux admin called it a real headache. Now imagine a 12-year-old gamer given the same task, with one hour of patience before rebooting into Windows.

* In safety critical systems, the customer will usually specify "CPU only" because of safety concerns.

* I also developed firmware for a platform where the CPU outperformed its low-end GPU because essential OpenGL extensions were missing.

3 hours ago, Dawoodoz said:

Ubuntu often comes with software-emulated GPU drivers that are many times slower for 2D overlays. Installing drivers can be a month's worth of rebooting the X server and manually modifying and recompiling the Linux kernel together with a level-4 Nvidia support technician. Even our company's senior Linux admin called it a real headache.

OT, but I've used Linux with nVidia cards for more than 15 years. I don't use Ubuntu, but its mother, Debian. And there's nothing simpler than installing nVidia drivers on Linux (compared, for example, to installing AMD drivers maybe 6 or 7 years ago). It's as simple as launching a script as root. Of course, if you use the latest kernels with the latest GCC (not yet supported by nVidia), you might run into some trouble. But if you're patient enough (and I believe on Ubuntu you don't have to be as patient as on Debian), you should not face any issues.

Rather recent drivers are in the repositories, if one allows non-free ...

6 hours ago, _Silence_ said:

OT, but I've used Linux with nVidia cards for more than 15 years. I don't use Ubuntu, but its mother, Debian. And there's nothing simpler than installing nVidia drivers on Linux (compared, for example, to installing AMD drivers maybe 6 or 7 years ago). It's as simple as launching a script as root. Of course, if you use the latest kernels with the latest GCC (not yet supported by nVidia), you might run into some trouble. But if you're patient enough (and I believe on Ubuntu you don't have to be as patient as on Debian), you should not face any issues.

Nice to know that things are improving, but even the Titanic needed lifeboats, so I will keep working on mine. I like software that just works, no matter what happens.

Well, software that just works will be found on Linux, but rather on Debian or Gentoo than on Ubuntu or Mint. There's too much bloatware and undocumented "free" software in the latter.

And I just wouldn't run a graphical interface at all on a safety-critical system. What for? Also (to stay with the metaphor of the sinking ship :-)), on such a system one might want to throw systemd overboard and replace it with something safer and sounder, which pretty much excludes Ubuntu since, from what I have read, the integration is too tight.

I admire your work; I wouldn't be able to do that, or would die of old age over such a task. And though I haven't experienced the bad old days myself, I really think things (especially the main vendors' graphics drivers for Linux) have vastly improved. I don't have AMD (yet), but an old Intel HD 4400 in my notebook and a GTX 660 and a 970 in my PCs. The notebook can "only" run OpenGL 4.5, but that's fine for me. Back in the day I would, as @_Silence_ said, install nVidia via the downloaded script, but now the drivers are in the repositories (418.74 for Debian) and thus seamlessly updated with the usual update cycle, without manual interaction. It is, of course, a pity that the free nouveau driver is so bad, but if I am not mistaken, that's mostly Nvidia's responsibility, because they don't work together with the open-source community, so those guys have to reverse engineer ...

I am reading (and trying to understand) "Real-Time Rendering" by Akenine-Möller et al. I think it really is a very rewarding and aspirational goal to implement even a subset of these techniques completely from scratch, without building on the foundation of available APIs and drivers. But one can just accomplish so much more using those foundations. "Standing on the shoulders of giants" and all that.

This topic is closed to new replies.
