Software Rendering is not dead (or is it...)

Feminine gamers... hmmm... You got the RPG bit right, but don't forget adventure games. Anyway, Ultima 9 requires a 3D accelerator. It will become standard (if it isn't already), so whatever PC you buy next will do 3D. Also, if there were such a large market, they would have bought more of the 2D stuff (Baldur's Gate, anyone?) that is just as good. However, they haven't, and 3D has boomed because of (and only because of) FPS games.

How do you program without 3D, asks CJ? Look at Baldur's Gate: no 3D whatsoever, and it sold very well. Starcraft, ditto. 2D is not dead. Besides, I prefer the isometric to the first-person view, so I'm biased. :P

Oh wait, you mean software rendering of 3D worlds... uhmmm, it pretty much is, I'm afraid, or at least it's a last resort. Why? Sloooooow... Unless of course you're making a game for the mid-90s... If it's an RPG, umm, it might work, but nowadays I don't think it can. It needs to be fantastically good on every other level to compensate, in an increasingly eye-candy-hungry world.

[This message has been edited by Jeranon (edited November 10, 1999).]

The funny thing is that 3D accelerators are built to solve the problems that yesterday's software 3D engines had.

Take "outcast" as an example: This is a 3D game that takes a different approach to 3D rendering. Because of this, the renderer is largely (entirely?) software.

I don't think SW rendering is dead per se; new features will appear first in SW, only to be included in next-generation HW. I DO believe, however, that 3D cards are a lot more mainstream than you suggest - even the lamest gfx card today has some degree of 3D support, and even the most hardcore 3D accelerators (TNT2, Voodoo3, GeForce) cost way less than a new processor.

/Niels

Personally I don't think that software rendering is dead, or even dying. Sure, many people are using libraries such as DirectX and OpenGL in their games. And that's perfectly reasonable and understandable, because they may have a deadline for their game, so they want to create the graphics in the fastest possible way.

But for others it doesn't have to be so. Other people, especially hobby programmers who are programming for the thrill of the "journey" as much as for the end product (like me), would much rather take it slower and actually code software rendering into their programs. When they do it well the effects can be quite worthwhile, especially since they know that almost every Joe can download their game from the internet and play it without having to buy an expensive piece of hardware.

Another reason for coding software rendering into a game is that the programmer learns a lot about what they are actually programming, so when they finally get a "real" programming job they will know how the concepts work and will be better for it. I know that 3D programming isn't easy, and matrices aren't that easy either, but once you get the 3D theory into your head, remember it, and use it in your programming, you will retain that knowledge and it will help you out later when you are stuck on something about which no tutorials exist.
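
To put the matrix part in concrete terms, here is a rough C sketch (the names and constants are only illustrative, not from any particular engine) of the two operations every software 3D pipeline ends up doing: rotating a vertex and projecting it onto the screen.

    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    /* Rotate a point about the Y axis by 'angle' radians.
       This is the standard 3x3 rotation matrix collapsed into four multiplies. */
    Vec3 rotate_y(Vec3 p, float angle)
    {
        float c = cosf(angle), s = sinf(angle);
        Vec3 r;
        r.x =  c * p.x + s * p.z;
        r.y =  p.y;
        r.z = -s * p.x + c * p.z;
        return r;
    }

    /* Perspective projection: divide by depth and centre on the screen.
       'dist' acts as the focal length; the camera looks down +z. */
    void project(Vec3 p, float dist, float cx, float cy, float *sx, float *sy)
    {
        *sx = cx + dist * p.x / p.z;
        *sy = cy - dist * p.y / p.z;
    }

Once those two steps feel natural, most of the remaining theory (camera matrices, clipping and so on) is bookkeeping layered on top of them.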

But still, do we really need all these fancy 3D graphics? I mean, from what I have seen, 3D graphics have not gone far since Quake or Quake 2. They have improved a bit in Quake 3, but there have still been no overwhelming technologies or new ideas in the sea of 3D programming for quite some time...

We will need some innovation soon!

------------------
-Dom:)
Visit - http://www.eisa.net.au/~sdgrab/index.html

Yup! There is a reason why id is still setting the standards - John Carmack knows the details of software rendering better than most, which is probably why he is capable of utilizing the HW to the extent he does.

/Niels

A quick point: I agree that we need innovation, but I disagree as to the expense of the hardware.
You can pick up a Voodoo 1 for under $50, if that, and all new computers come with better than that as a minimum.
There's still a lot of craft in software renderers and in the many unaccelerated effects, but companies like 3dfx seem to take criticism and comments seriously and try to help developers (John Carmack being one such developer) by designing their next-gen boards with them in mind.

Since the original post brought up the 'market', I'd like to talk about that. Back when I didn't have the money to upgrade my out-of-date 386, I sure as hell didn't have the money to spend $50 on a game. I was more than happy to go to the bargain bin and buy 2-3 year old games for $15-20.

All those hunting games that are targeted at people who aren't computer enthusiasts cost $20. They'd never sell to their target audience at $50.

If someone is going to buy five or more fifty-dollar games a year, it's hard to believe they wouldn't put less than $100 into a decent video card.

That being said, you take a pretty big performance hit just developing a Windows-native game (as opposed to DOS), with or without hardware acceleration. For most people it's worth it to avoid dealing with the hardware in general. It's conceivable someone could write all kinds of 3D hardware drivers in DOS, port OpenGL, and get a game that'll run faster on less memory, but it's not worth it to most people. It's up to you to decide where you want to take performance hits, whether that means demanding a higher-benchmark machine or sticking to a 256-color display.

Whoa! hot topic...
Ok I'll start from the top:
1) No, I am not using DIBs - at least not at this point in time (I am programming for both Linux and Win32). The sketch after this list shows what the usual DIB route looks like.

2) It is true that a lot of new computers come with standard 3D acceleration, but a lot of people out there can't afford a new computer.

3) Sure they will take John Carmack's opinion seriously - but we're not all as 'important' to the game industry as he is, are we? ;-)
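
For anyone wondering what the DIB question was about: the usual Win32 route for a software renderer is to draw into a plain memory buffer and hand it to GDI once per frame. A minimal sketch (error handling omitted, the function name is made up) looks something like this:

    #include <windows.h>

    /* Blit a 32-bit software framebuffer (width x height pixels) to a window DC.
       The height is negated so the DIB is top-down, i.e. row 0 is the top row. */
    void present_framebuffer(HDC hdc, const void *pixels, int width, int height)
    {
        BITMAPINFO bmi;
        ZeroMemory(&bmi, sizeof(bmi));
        bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
        bmi.bmiHeader.biWidth       = width;
        bmi.bmiHeader.biHeight      = -height;   /* top-down DIB */
        bmi.bmiHeader.biPlanes      = 1;
        bmi.bmiHeader.biBitCount    = 32;
        bmi.bmiHeader.biCompression = BI_RGB;

        StretchDIBits(hdc,
                      0, 0, width, height,       /* destination rectangle */
                      0, 0, width, height,       /* source rectangle      */
                      pixels, &bmi,
                      DIB_RGB_COLORS, SRCCOPY);
    }

The renderer itself never touches any of this, which is what keeps a Linux/Win32 split manageable: only the final blit is platform-specific.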

[This message has been edited by Eternity (edited November 10, 1999).]

I hope one or two people remember when maths co-processors first came in. What did that coincide with? The second half of the 386 range. More expensive, only available on the newer motherboards, and yet, stop me if I'm wrong, they caught on pretty sharpish.
I admit I'm lucky enough to be able to afford to upgrade my system now and again, and a lot of people aren't, but hardware accelerator cards first started getting serious around two years ago, and that's a pretty long time in the world of computer hardware.
I'm not saying software rendering is in any way dead - I remember writing my first 3D system from some vague lecture notes and a shaky knowledge of matrix multiplication - but unless you're aiming for the markets of developing countries, hardware is here to stay.

I just saw some screenshots of Outcast. I stand corrected. Software rendering of 3D is still quite viable; it just requires a bit more work (and when I say work, I mean more code rather than cleverer code). And, as Outcast shows, it is apparently feasible. Where the mentality of "let the hardware do it" is king, it's good to see developers do it the hard way and bring good-looking 3D to those with "low end" machines.

[This message has been edited by Jeranon (edited November 10, 1999).]

I think I worded my first post a little too ambiguously...

I am not totally disregarding hardware! Far from it! But I won't be adding any features to the engine that I can't reasonably support in software as well - my test machine (the only one I own!) is the PII 266...

I can say that there are a lot of people out there who can't afford, or can't be bothered, to upgrade, because I know a lot of them! (I personally can't afford it...)

I have a great deal of respect for John Carmack- but I regret his move to hardware only engines. (Although they are QUITE impressive!)

A quick point about software rendering.

If you have time, I advise you to go for it. Take Unreal Tournament as an example. On my Voodoo 3 / PIII 450 I use the Glide API. Total brilliance - the best game I have ever played (even if it is still only a demo).

My other machine is a PII something (I still can't remember) with no 3D hardware. If you use the software renderer supplied with UT on it, it runs brilliantly, whereas the big API versions run like a dog. (Quake 3, meanwhile, doesn't run at all on the PII, and I have such a poor internet connection at home that it isn't worth using.)

My point is that less graphics doesn't mean less fun, but in this day and age you have to be quite a large concern to be able to invest the time required.

John Carmack is a fairly good programmer - I'm not saying he isn't - but IMHO Michael Abrash is a lot better than Carmack... He knows how to do a lot more things, and I bet he knows 3D programming better than Carmack does. The only reason we don't talk about him that much is that he is not in the spotlight currently.

Just thought that I'd mention that

------------------
-Dom:)
Visit - http://www.eisa.net.au/~sdgrab/index.html

Abrash is cool - love his books. But I wouldn't underestimate Carmack - he is WAY more than a 3D guru. Trust me (or check it out for yourself), this guy knows stuff...

/Niels

Guest Anonymous Poster
I have made my own 3D program that is extremely fast, and it is purely software rendering, because I have written all of it myself. I wrote it for Windows 95/98 in assembly code, but I also use the same code (slightly modified) for my 3D programming in DOS.
I have made a DOS program that lets you rotate a 4000+ polygon 3D object at about 25 frames per second on a Pentium 166. Not bad for a 3KB program that can only access a maximum of 64KB of memory! There is no way you could do that with hardware acceleration.
On my computer (Pentium 166MHz, 16MB RAM), my 3D programs are a LOT faster than DirectX or OpenGL or anything else (especially the Windows API), but this may be because I don't have a 3D card, which obviously slows DirectX down.
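
The classic trick for getting that kind of speed out of a Pentium-class CPU is precomputed sine/cosine tables plus fixed-point maths, so the per-vertex work is all integer multiplies and shifts. A rough sketch of the general idea (the table size and formats here are examples only, not taken from the program above):

    #include <math.h>

    #define FP_SHIFT 16              /* trig values stored as 16.16 fixed point */
    #define ANGLES   256             /* one full turn = 256 angle steps         */

    static long sin_tab[ANGLES], cos_tab[ANGLES];

    /* Build the lookup tables once at start-up. */
    void init_tables(void)
    {
        int a;
        for (a = 0; a < ANGLES; a++) {
            double rad = a * 2.0 * 3.14159265358979 / ANGLES;
            sin_tab[a] = (long)(sin(rad) * (1L << FP_SHIFT));
            cos_tab[a] = (long)(cos(rad) * (1L << FP_SHIFT));
        }
    }

    /* Rotate every vertex about the Y axis using only integer multiplies.
       Coordinates are small plain integers (say -1024..1024) so the 32-bit
       intermediate products cannot overflow. */
    void rotate_object_y(const long *x, const long *z, long *rx, long *rz,
                         int count, int angle)
    {
        long c = cos_tab[angle & (ANGLES - 1)];
        long s = sin_tab[angle & (ANGLES - 1)];
        int i;
        for (i = 0; i < count; i++) {
            rx[i] = ( x[i] * c + z[i] * s) >> FP_SHIFT;
            rz[i] = (-x[i] * s + z[i] * c) >> FP_SHIFT;
        }
    }

In hand-written assembly a loop like this collapses to a handful of instructions per vertex, which is roughly how a tiny DOS program can push thousands of points per frame on a P166.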

If you want proof of any of these things (or even source-code), go to
http://www.geocities.com/emami_s/Draw3D.html

by Shervin Emami, savanna@tpg.com.au

Obviously, a highly optimized and specialized piece of code is going to run faster than more generalized APIs. This almost goes without saying. As for there being no way to do that with hardware acceleration, your argument is self-defeating. As long as you can get the polygons out to the rasterizer fast enough, hardware-accelerated rendering is going to be faster than the software alternative, not to mention better looking. That is, unless some radical new method of rendering is invented, which as far as I'm concerned is entirely possible.

Any piece of software is taking advantage of general-purpose hardware (i.e. the CPU, support circuits, etc.), so the same functionality can obviously be implemented in HW (it IS implemented in HW). If you remove "general purpose" from the equation, the HW implementation (unless it's really f*cked) is bound to be faster.

For practical purposes, no one is going to implement HW for every odd idea someone comes up with, but the bulk of the time in any 3D engine is spent doing basic stuff such as converting numbers into visible entities - if anyone out there knows of a way to do that in SW faster than, say, a GeForce, I'm all ears...
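
To be concrete about what 'converting numbers to visible entities' means: the heart of any software renderer is a per-pixel span loop something like the sketch below (the names and the 256x256 texture assumption are mine), and it has to run for every covered pixel of every triangle, every frame - which is exactly the work a card does in dedicated silicon.

    /* Fill one horizontal, textured span of a triangle.
       u and v step per pixel in 16.16 fixed point; the texture is assumed to be
       256x256, so the coordinates wrap with a simple mask. Every pixel costs a
       texture fetch plus a couple of adds - multiplied by every covered pixel
       of every triangle drawn, every frame. */
    void textured_span(unsigned int *dst, const unsigned int *texture,
                       int count, long u, long v, long du, long dv)
    {
        int i;
        for (i = 0; i < count; i++) {
            int tx = (u >> 16) & 255;
            int ty = (v >> 16) & 255;
            *dst++ = texture[ty * 256 + tx];
            u += du;
            v += dv;
        }
    }

Add perspective correction, filtering and blending to that loop and the CPU falls hopelessly behind even a cheap rasterizer chip.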

/Niels

No offense here Niels, but the GeForce is relatively new; there aren't too many T&L 3D cards on the market yet. Also, depending on how much T&L you are doing, you have to consider bus bandwidth to the card as well. So in some cases (besides not having a T&L card), it WILL be faster to do T&L on the CPU...
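
Just so we're talking about the same thing: by T&L I mean the per-vertex work sketched below (a simplified version with my own names - one directional light, row-major 4x4 matrix), which either the CPU does before pushing triangles over the bus, or a GeForce-class card does on the other side of it.

    typedef struct { float x, y, z; } Vec3;

    /* Transform a vertex by a 4x4 row-major matrix (w assumed to be 1). */
    Vec3 transform(const float m[16], Vec3 v)
    {
        Vec3 r;
        r.x = m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3];
        r.y = m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7];
        r.z = m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11];
        return r;
    }

    /* Simple diffuse lighting: intensity = max(0, N . L). */
    float light(Vec3 normal, Vec3 light_dir)
    {
        float d = normal.x*light_dir.x + normal.y*light_dir.y + normal.z*light_dir.z;
        return d > 0.0f ? d : 0.0f;
    }

Whether the CPU or the card wins depends on how many vertices you push and on how much bus traffic the card's version saves you.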

Not taken !

The question is not whether the HW for any given job is mainstream or even available - what made me raise an eyebrow was the rather bold statement that a given software renderer could not be beaten by HW (which is, in itself, something of a contradictory statement)... And you're right, it IS still a bit early to rely on people having T&L HW... Give 'em 6 months!!

/Niels

For polygonal data, HW will whip SW all over town - however, some special effects can currently only be achieved in SW (albeit I have just seen an _impressive_ voxel routine in OGL!).
However, I don't want to get into a HW/SW bash... My initial comment was about the fact that too many developers rely on HW acceleration, and that there are a lot of people out there without it...
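
The voxel case is a good example of the kind of effect I mean: the classic heightmap renderer draws the scene one screen column at a time, which maps naturally onto a CPU and a framebuffer but not onto a triangle rasterizer. A rough sketch of the idea (the map size, scale factors and names are just placeholders):

    #include <math.h>

    /* Classic heightmap "voxel" landscape, rendered one screen column at a time.
       heightmap and colormap are 1024x1024; the framebuffer is width x height, 32-bit. */
    void render_voxel_column(unsigned int *framebuffer, int width, int height,
                             const unsigned char *heightmap,
                             const unsigned int *colormap,
                             float cam_x, float cam_y, float cam_height,
                             float angle, int column)
    {
        /* Ray direction for this column: camera forward plus a sideways offset. */
        float fov    = 1.0f;
        float offset = (column - width * 0.5f) * fov / width;
        float dx = cosf(angle) - offset * sinf(angle);
        float dy = sinf(angle) + offset * cosf(angle);

        int lowest = height;                      /* per-column occlusion ("y-buffer") */
        float z;
        for (z = 1.0f; z < 600.0f; z += 1.0f) {   /* fixed step, for simplicity */
            int mx = ((int)(cam_x + dx * z)) & 1023;
            int my = ((int)(cam_y + dy * z)) & 1023;

            /* Project the terrain height at this map cell into screen space. */
            int screen_y = (int)((cam_height - heightmap[my * 1024 + mx]) / z
                                 * 240.0f + height * 0.5f);
            int y;
            if (screen_y < 0) screen_y = 0;

            /* Draw the newly visible slice of the column, front to back. */
            for (y = screen_y; y < lowest; y++)
                framebuffer[y * width + column] = colormap[my * 1024 + mx];
            if (screen_y < lowest)
                lowest = screen_y;
        }
    }

Outcast's renderer is built around this family of techniques, which is presumably part of why it stays in software instead of leaning on a triangle accelerator.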

However, the end of 2D-only cards is basically here. The number of people without ANY 3D acceleration is shrinking rapidly. The number of people with adequate 3D is rising rather quickly as well. You can't possibly expect developers to sacrifice visual effects, detail, and/or quality because that 5% of low-end computer users are still out there.

- Splat
