
What can software rasterizers be used for today?


19 replies to this topic

#1 Xcrypt   Members   -  Reputation: 154


Posted 28 November 2012 - 07:38 AM

I decided to code my own software renderer, because I think it would make for a great learning experience.
And although it will probably stop there for me, I'd like to know whether there is still a use for software rasterizers today, and what they can be used for.
Can they be used for anything besides systems that lack a GPU? And what are the most important target devices today that lack one?

Edited by Xcrypt, 28 November 2012 - 07:41 AM.



#2 Olof Hedman   Crossbones+   -  Reputation: 2768


Posted 28 November 2012 - 07:51 AM

It indeed is a great learning experience.
And a simple software renderer running on Symbian OS landed me my first real job :)

Though I think the use for them is really limited today, when most mobile phones (even low-end ones) have a GPU. And the ones that don't are probably hard to run third-party code on anyway.

Possibly in some embedded system, but then again, if you want advanced graphics, you might choose a system with a cheap GPU instead.

In some hypothetical system where your GPU is overloaded but you have lots of CPU cycles to spare, I guess it could be useful for some multipass effect. (Though to be efficient it would probably need to be a unified-memory system, preferably with texture streaming enabled to avoid copies.)

Edited by Olof Hedman, 28 November 2012 - 07:55 AM.


#3 Tordin   Members   -  Reputation: 604


Posted 28 November 2012 - 07:55 AM

I believe they are still useful today. I mean, why would they be "deprecated" just because GPUs are more efficient at it?

Hobbyists might want to use them for 2D or 3D games, or for some weird application that does a cool 3D render when you type text, or whatever :P
I can probably think of some.

"And what are the most important target devices today that lack a GPU?"
- Those darn LED displays on buses and trains. I hate that they can't do better effects with their text. (There is probably some sort of rendering hardware in them, but probably not a GPU :P)
"There will be major features. none to be thought of yet"

#4 ZBethel   Members   -  Reputation: 570


Posted 28 November 2012 - 07:55 AM

I wrote one. It's great fun! :)

There isn't much use for them nowadays, to be honest. However, I think that as CPUs become more vectorized and parallel, there's potential for a comeback, though as a supplement to the GPU rather than a replacement. For instance, DICE utilizes software rasterization to aid in occlusion culling, because reading back data from the GPU takes a while.

#5 Olof Hedman   Crossbones+   -  Reputation: 2768


Posted 28 November 2012 - 08:02 AM

For instance, DICE utilizes software rasterization to aid in occlusion culling, because reading back data from the GPU takes a while.


Interesting!
That could be useful when you need the rendered scene for some CPU processing but don't want to add synchronization points with the GPU...
I assume it's a rather simple rasterizer though, maybe just rendering the depth buffer?

#6 ZBethel   Members   -  Reputation: 570


Posted 28 November 2012 - 08:09 AM

They have a paper somewhere that outlines some of the details, but I believe they do very simplistic rasterization to a 320x280 depth buffer (or something like that). It's heavily vectorized. If you own a Core i7 with AVX, those can do 8-wide vector operations, which would speed something like that up considerably. I've considered writing an occlusion library that utilizes AVX and SSE instructions. I don't think anyone's really used AVX much in production yet (from my very limited viewpoint).
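
To sketch the 8-wide idea (just an illustration, not DICE's actual code; the function name and buffer layout are my own assumptions), an AVX depth-test inner loop might look roughly like this:

```cpp
#include <immintrin.h>  // AVX intrinsics

// Hypothetical sketch: test 8 interpolated fragment depths against one row
// of the depth buffer and keep the nearer value, 8 pixels per iteration.
// Assumes width is a multiple of 8.
void depth_test_row(float* depthRow, const float* fragDepth, int width)
{
    for (int x = 0; x < width; x += 8)
    {
        __m256 buf  = _mm256_loadu_ps(depthRow + x);        // current depths
        __m256 frag = _mm256_loadu_ps(fragDepth + x);       // incoming depths
        __m256 mask = _mm256_cmp_ps(frag, buf, _CMP_LT_OQ); // fragment closer?
        // Where the mask is set, take the fragment depth; else keep the buffer.
        _mm256_storeu_ps(depthRow + x, _mm256_blendv_ps(buf, frag, mask));
    }
}
```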

#7 Xcrypt   Members   -  Reputation: 154


Posted 28 November 2012 - 08:38 AM

Another thing I'm wondering: I've been thinking about experimenting with some rendering techniques like rasterization on the GPGPU, using something like CUDA. However, I have no CUDA experience and I'm wondering if it would be possible to do so. Could there be certain advantages over just using DX / GL?

Edited by Xcrypt, 28 November 2012 - 08:41 AM.


#8 Radikalizm   Crossbones+   -  Reputation: 2837


Posted 28 November 2012 - 09:02 AM

Another thing I'm wondering: I've been thinking about experimenting with some rendering techniques like rasterization on the GPGPU, using something like CUDA. However, I have no CUDA experience and I'm wondering if it would be possible to do so. Could there be certain advantages over just using DX / GL?


I've seen real-time raytracers done with GPGPU solutions, but no rasterizers as far as I can remember. There's a good reason for the lack of GPGPU rasterizers though: DX and OGL will always outperform GPGPU solutions because they can use the actual rasterizer hardware, while a GPGPU solution has to do rasterization completely in software.
Maybe there are some obscure use cases where a GPGPU rasterizer would actually be useful, but in general it'd be better to stick with APIs like DX and OGL.



#9 Xcrypt   Members   -  Reputation: 154


Posted 28 November 2012 - 09:08 AM

I've seen real-time raytracers done with GPGPU solutions, but no rasterizers as far as I can remember. There's a good reason for the lack of GPGPU rasterizers though: DX and OGL will always outperform GPGPU solutions because they can use the actual rasterizer hardware, while a GPGPU solution has to do rasterization completely in software.
Maybe there are some obscure use cases where a GPGPU rasterizer would actually be useful, but in general it'd be better to stick with APIs like DX and OGL.


Well, gonna have to try GPGPU raytracing then :D

#10 Memories are Better   Prime Members   -  Reputation: 769


Posted 28 November 2012 - 09:48 AM

I believe Blender still renders on the CPU, or at least it did; they had 'reasons', but I forget what they were.

#11 Ravyne   Crossbones+   -  Reputation: 7143


Posted 28 November 2012 - 09:58 AM

As mentioned, software rasterizers are still of some practical use in helping the GPU, as DICE does -- in general, any time you can get away with a lower-resolution stand-in with limited or no "pixel shading" -- so things like occlusion culling, and perhaps shadow-map generation or other lighting effects, could be done.

The very best software rasterizers today are incredible pieces of technology that scale across CPU cores and vector instruction sets, and even just-in-time-compile their own pixel shaders to SSE/AVX. Still, once the pixel shading and texture sampling get cranked up, even several CPUs can barely keep pace with an entry-level GPU; the CPU just doesn't have the compute throughput necessary to drive resolution much beyond 1024x768 or so, to say nothing of the meager memory and cache bandwidth a typical PC has compared to a GPU.

Still a very interesting exercise though -- I'm debating pulling out my old single-threaded, mostly-non-vectorized rasterizer (which was still decently fast) and seeing how far I can push it with 4 cores and AVX.

#12 landagen   Members   -  Reputation: 376


Posted 28 November 2012 - 10:15 AM

I believe high-end renderers such as RenderMan are implemented in software and scale across multiple computers, though I do believe they can utilize the GPU. I wouldn't think they use OpenGL or DirectX directly; I'd expect they treat the GPU more like a GPGPU device. But for a real-time renderer, you will definitely need to use the GPU.



#13 Ravyne   Crossbones+   -  Reputation: 7143


Posted 28 November 2012 - 11:48 AM

Yes, that's a distinction to be made for sure, but RenderMan is fundamentally a ray-tracer, and the final renderings take minutes or hours per frame on large clusters of computers. More and more of that can move onto the GPU as technology advances, but the very wide vector machines that GPUs are aren't a great match for ray-casters, because it's very, very hard to keep all the rays moving in the same direction and in the same state. There are quick-turnaround previewing tools that can run scenes on a GPU in real time, but they don't have anywhere near the subtle lighting that they get with the final rendering.

I think the OP is mostly talking about a real-time rasterizer.

#14 mdias   Members   -  Reputation: 786


Posted 28 November 2012 - 11:50 AM

I believe Blender still renders on the CPU, or at least it did; they had 'reasons', but I forget what they were.


Blender has a new raytracer called Cycles, which can take advantage of CUDA and OpenCL, supports custom shaders, and so on.

Edited by Kamikaze15, 28 November 2012 - 11:50 AM.


#15 ZBethel   Members   -  Reputation: 570


Posted 28 November 2012 - 02:48 PM

Another thing I'm wondering: I've been thinking about experimenting with some rendering techniques like rasterization on the GPGPU, using something like CUDA. However, I have no CUDA experience and I'm wondering if it would be possible to do so. Could there be certain advantages over just using DX / GL?


These guys did.

Edited by ZBethel, 28 November 2012 - 02:50 PM.


#16 DarkRonin   Members   -  Reputation: 610


Posted 06 December 2012 - 04:11 AM

How does one even start to code a software renderer?

#17 ginkgo   Members   -  Reputation: 294


Posted 06 December 2012 - 07:29 AM

We use a very simplified software rasterizer for voxelizing triangle-meshes.

We set up an orthogonal projection of the model and rasterize it into a simple A-buffer data structure. (A-buffers are framebuffers with a per-pixel list of fragment depths.)

After that, we can just sort those lists, move through them from front to back, and mark the ranges between fragments as either inside or outside in the voxel volume.
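
To sketch that last fill step (this is my own illustration, not ginkgo's actual code; the names and the even-odd parity rule are assumptions), one pixel column could be processed like this:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical sketch: given one pixel's list of fragment depths from the
// A-buffer, mark voxels along that column as inside/outside. Each time we
// pass a surface fragment, we toggle between outside and inside.
void fill_column(std::vector<float>& depths, bool* voxels,
                 int voxelCount, float voxelSize)
{
    std::sort(depths.begin(), depths.end());  // front to back

    std::size_t next = 0;
    bool inside = false;
    for (int z = 0; z < voxelCount; ++z)
    {
        float voxelDepth = (z + 0.5f) * voxelSize;  // center of voxel z
        // Toggle state each time we cross a recorded fragment.
        while (next < depths.size() && depths[next] <= voxelDepth)
        {
            inside = !inside;
            ++next;
        }
        voxels[z] = inside;
    }
}
```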

#18 DarkRonin   Members   -  Reputation: 610


Posted 06 December 2012 - 02:35 PM

Haha, like I said, 'How does one even start to code a software renderer?'

Way over my head.

#19 Ravyne   Crossbones+   -  Reputation: 7143


Posted 06 December 2012 - 04:33 PM

How does one even start to code a software renderer?


All you need is a chunk of memory to call your framebuffer, and a way to get it onto your screen. In GDI/Win32, SetDIBitsToDevice or StretchDIBits can get your framebuffer onto a window. Unfortunately Win32 doesn't provide hooks for syncing to the vertical retrace, so you'll see tearing in the results (which is mostly not an issue).

Then you just need to write pixels into your framebuffer in a compatible format -- the usual suspects apply: RGBA8888, RGB565, etc.
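
A minimal sketch of that blit, assuming a 32-bit top-down framebuffer (the `blit` function and variable names are mine; SetDIBitsToDevice/StretchDIBits are the real GDI calls):

```cpp
#include <windows.h>

// Hypothetical sketch: copy a 32-bit framebuffer to a window DC with GDI.
// `pixels` points to width*height 32-bit values; a negative biHeight means
// the rows are stored top-down.
void blit(HDC hdc, const void* pixels, int width, int height)
{
    BITMAPINFO bmi = {};
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = width;
    bmi.bmiHeader.biHeight      = -height;   // top-down scanline order
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 32;        // 8:8:8:8
    bmi.bmiHeader.biCompression = BI_RGB;

    StretchDIBits(hdc,
                  0, 0, width, height,       // destination rectangle
                  0, 0, width, height,       // source rectangle
                  pixels, &bmi, DIB_RGB_COLORS, SRCCOPY);
}
```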

#20 ZBethel   Members   -  Reputation: 570


Posted 07 December 2012 - 01:39 PM

Haha, like I said, 'How does one even start to code a software renderer?'

Way over my head.


A good way to start is to try and understand how the GPU processes things. That means learning how index/vertex buffers work, how to write shaders, how world/view/projection space works and the math behind it, how the z-buffer works, how alpha blending works, how mip-mapping works, clipping, etc. Once you understand those concepts, you can start to write a rasterizer (alternatively, learning while you write one is a great way to learn it!). Tackling the entire thing all at once is completely overwhelming, but breaking it down into small parts takes care of that. For instance, you could start by writing a simple wireframe rasterizer. That was what I started with. If you're not looking to write a super-fast, parallelized one, then this is actually quite easy.
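
For that wireframe starting point, a line rasterizer in the spirit of Bresenham's algorithm is about as small as it gets. A minimal sketch (the framebuffer layout and function name are assumptions, and the endpoints are assumed already clipped to the buffer):

```cpp
#include <cstdint>
#include <cstdlib>  // std::abs

// Hypothetical sketch: draw a line into a 32-bit framebuffer using integer
// Bresenham. `pitch` is the number of pixels per row.
void draw_line(uint32_t* fb, int pitch, int x0, int y0, int x1, int y1,
               uint32_t color)
{
    int dx =  std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;  // error term tracks distance from the true line

    for (;;)
    {
        fb[y0 * pitch + x0] = color;  // plot current pixel
        if (x0 == x1 && y0 == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }  // step in x
        if (e2 <= dx) { err += dx; y0 += sy; }  // step in y
    }
}
```

Draw the three edges of each projected triangle with this and you have a wireframe renderer; filled triangles, z-buffering, and shading can then be layered on one piece at a time.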



