What is software rendering really?


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

5 replies to this topic

#1 Gerrard88   Members   -  Reputation: 101

Posted 15 July 2014 - 09:23 AM

I know that you do all the graphics-related computations on the CPU, but I don't know what happens after that. How do you tell the monitor what to show, i.e. how do you set the colors of the pixels? Do you still need a graphics card to show something on the screen? The Wikipedia article on software rendering says nothing about this.


Edited by Gerrard88, 15 July 2014 - 09:27 AM.



#2 rnlf   Members   -  Reputation: 1185

Posted 15 July 2014 - 09:29 AM

Depends. If your output device is a local monitor, you usually need some kind of video adapter (you could also write your renderings to a file, a network stream, a printer, ...).

 

But if you want it to be visible on the screen, you copy (or let the operating system copy) your prerendered image to the video memory. In that case, the only thing the video adapter does is to output the image data to the screen.

 

In hardware rendering you do the same; the difference is that you don't use the CPU to do the rendering work, but some hardware-accelerated mechanism. The stuff that happens after rendering is exactly the same, with the same hardware parts involved. That's why it's usually (or was, at least at some point) called hardware-accelerated rendering.
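The flow described above can be sketched in C. This is a hypothetical, minimal example (the buffer name and the gradient pattern are invented for illustration): the CPU computes every pixel into an ordinary memory buffer, and handing that buffer to the OS or video memory is the only step where the video adapter gets involved.

```c
#include <stdint.h>

/* Hypothetical sketch: render on the CPU into an in-memory framebuffer.
   In a real program you would then hand this buffer to the OS or the
   window system (a blit call) to copy it into video memory. */
#define W 320
#define H 200

static uint32_t framebuffer[W * H];   /* 0xAARRGGBB pixels */

void render_gradient(void) {
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            /* toy "rendering": a red/green gradient computed per pixel */
            uint8_t r = (uint8_t)(x * 255 / (W - 1));
            uint8_t g = (uint8_t)(y * 255 / (H - 1));
            framebuffer[y * W + x] = 0xFF000000u | (r << 16) | (g << 8);
        }
    }
}
```

Everything up to the final copy is plain CPU code; no graphics hardware is touched.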


Edited by rnlf, 15 July 2014 - 09:30 AM.

my blog (German)


#3 Samith   Members   -  Reputation: 2323

Posted 15 July 2014 - 09:31 AM

Well, your monitor cable is most likely plugged into your graphics card, so yeah I guess technically you still need it to send data to the monitor. ;)

The spirit of software rendering, though, is that you do all the rendering computations (vertex transform, primitive assembly, rasterization, shading, alpha blending) in software, without any special, purpose-built hardware. You do everything that needs to be done on a fully programmable die, with no special silicon.

The GPU isn't some magic piece of hardware necessary for communicating with the monitor. And while the monitor plugs into your graphics card, that's mostly because the graphics card is probably where all the video memory the OS uses for the screen is stored. But it (in theory) doesn't need to be that way.
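As a concrete example of one of those software stages, here is a hedged sketch of the core test behind rasterization, the edge function. The function names and the counter-clockwise winding convention are my assumptions, not anything from the thread:

```c
typedef struct { float x, y; } Vec2;

/* Signed area test: positive when p lies to the left of edge a->b.
   Rasterizers evaluate this per pixel, in plain CPU code. */
static float edge(Vec2 a, Vec2 b, Vec2 p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

/* p is inside the triangle if it is on the same side of all three
   edges (counter-clockwise winding assumed). */
int inside_triangle(Vec2 a, Vec2 b, Vec2 c, Vec2 p) {
    return edge(a, b, p) >= 0 && edge(b, c, p) >= 0 && edge(c, a, p) >= 0;
}
```

A software rasterizer loops this test over the triangle's bounding box and shades each pixel that passes; a GPU has dedicated silicon for exactly the same test.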

#4 megadan   GDNet+   -  Reputation: 662

Posted 15 July 2014 - 09:43 AM

In the old days you would switch the display to a mode like 13h with 320x200 resolution and 256 colors by using a little assembly:

 

mov ax, 13h
int 10h

 

Then you just grab a byte pointer to the screen address at 0xA0000000L and start filling it in with paletted colors.  Search for Mode 13h for more info.
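For reference, the address math behind "grab a byte pointer and start filling it in" is simple, because mode 13h is a linear framebuffer: one byte per pixel, 320 bytes per row, 64000 bytes total. A small hypothetical helper (you can't actually poke the VGA segment from a modern OS, so only the offset computation is shown):

```c
/* Byte offset of pixel (x, y) inside the mode 13h framebuffer.
   On a real DOS machine you would add this to a pointer into the
   VGA memory segment and write a palette index there. */
unsigned mode13h_offset(unsigned x, unsigned y) {
    return y * 320u + x;   /* 320 one-byte pixels per scanline */
}
```

So plotting pixel (10, 5) means writing one byte at offset 5 * 320 + 10 = 1610.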


Edited by megadan, 15 July 2014 - 12:25 PM.


#5 Buster2000   Members   -  Reputation: 1778

Posted 16 July 2014 - 01:05 AM

If you really want to know what a software renderer is then have a go at writing one.  Just google software renderer tutorials.  There are dozens out there and it is always a fun little side project.



#6 JohnnyCode   Members   -  Reputation: 307

Posted 16 July 2014 - 03:57 PM

Rendering is a very parallelizable problem. A CPU (central processing unit) targets demanding serial workloads, where step 2 vitally needs the result of step 1. A GPU, by contrast, is a pile of processor cores well suited to work that can be solved in parallel (think of a web server serving many web clients at once; that's where NVIDIA with its Tesla units is dominating). Even though the individual GPU cores run at a low frequency, there are many of them, so together they gulp down tremendous computational workloads that benefit from parallelism.

 

Software rendering performs those same computations without that parallel assistance and parallel distribution (or with some of it, if you're careful).

 

Even so, parallel computation setups of that scale suffer from memory access; that's why GPUs compete on memory frequency rather than on core frequency and core count.
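The parallelism point can still apply to a software renderer: per-pixel work is independent, so it can be split across CPU threads. A hedged sketch (the scanline-slicing scheme and the toy "shader" are invented for illustration; POSIX threads assumed):

```c
#include <pthread.h>
#include <stdint.h>

/* Hypothetical sketch: split scanlines of a software-rendered image
   across CPU threads, since each pixel is computed independently. */
#define WIDTH    64
#define HEIGHT   64
#define NTHREADS 4

static uint8_t pixels[WIDTH * HEIGHT];

typedef struct { int y0, y1; } Slice;   /* half-open row range */

static void *shade_rows(void *arg) {
    Slice *s = (Slice *)arg;
    for (int y = s->y0; y < s->y1; y++)
        for (int x = 0; x < WIDTH; x++)
            pixels[y * WIDTH + x] = (uint8_t)((x + y) & 0xFF); /* toy shader */
    return NULL;
}

void render_parallel(void) {
    pthread_t tid[NTHREADS];
    Slice slices[NTHREADS];
    int rows = HEIGHT / NTHREADS;
    for (int i = 0; i < NTHREADS; i++) {
        slices[i].y0 = i * rows;
        slices[i].y1 = (i == NTHREADS - 1) ? HEIGHT : (i + 1) * rows;
        pthread_create(&tid[i], NULL, shade_rows, &slices[i]);
    }
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(tid[i], NULL);
}
```

This is the same work a GPU distributes across thousands of cores, done here with a handful of CPU threads.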

 





