Quote:Original post by jyk
It's not that innovation never occurs, since it obviously does; it's just kind of unlikely that some random person who's not already at the forefront of ongoing research is going to be the one to do it. (Or so I would guess.)
That's a common delusion.
People believe there are geniuses who can invent stuff nobody has thought of. WRONG. Those times ended around 1930 or so.
There's nothing left to invent in the garage any more. All of that stuff has already been invented. Every new garage invention turns out to be a fake, a lie, or just a reinvention of something that exists. Inventions today require far too many resources, too much knowledge, and too much time for a lone-wolf genius.
I'm talking about scientific inventions. You can still invent toys.
And as others have already stated: matrices are not magic. A matrix is just a representation of a system of linear equations. Sure, the calculations on the GPU are generalized, so a special-case perspective projection involves fewer operations than a general matrix multiply. But the GPU path is hardware accelerated, so that doesn't buy you anything there.
So if you write a software renderer, yes, that division trick will be faster than your own fully general matrix multiplication.
The question is: is that actually your bottleneck, or are you just trying to be clever?
EDIT: before someone (namely jyk) points out that all of this has already been discussed: do you think I'd pass up the opportunity to show how freaking smart I am just because I read the thread? [grin]