datasurge

OpenGL 3d programming without directx/opengl


I recently read an interesting article by Tim Sweeney. In it he argues that as chip manufacturers move toward multi-core designs, systems will eventually no longer need an add-on graphics card for 3D: the CPU of the future will be some kind of many-core part that handles both the general-purpose and GPU-style workloads we are used to today. As a consequence, he believes DirectX and OpenGL will no longer be necessary, since those APIs exist mainly as a generic way of accessing specialized hardware. I was wondering if anyone knows of, or could point me to, some good reading material on doing graphics programming in C++ without DirectX/OpenGL. He also argued that the fundamentals of DirectX/OpenGL programming are based on ideas from 25 years ago, and that in the future we may move away from triangle-based scenes altogether; as an example, he suggested sphere primitives that could be anti-aliased into smooth images.
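Sweeney's sphere example hints at why the idea is attractive: intersecting a ray with a sphere is just a quadratic equation, arguably simpler than triangle rasterization. Here is a minimal sketch (my own illustration, not from the article):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
Vec3 sub(const Vec3& a, const Vec3& b)   { return {a.x-b.x, a.y-b.y, a.z-b.z}; }

// Returns the distance t along the ray to the nearest hit,
// or a negative value if the ray misses the sphere.
double intersectSphere(const Vec3& origin, const Vec3& dir,
                       const Vec3& center, double radius) {
    Vec3 oc = sub(origin, center);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b*b - 4*a*c;            // discriminant of the quadratic
    if (disc < 0) return -1.0;            // no real root: ray misses
    return (-b - std::sqrt(disc)) / (2*a); // nearest of the two roots
}
```

A renderer built on this would shoot one (or several, for anti-aliasing) of these rays per pixel, which is exactly how a ray tracer works.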

You can also buy a book on ray tracing; there are a lot of good ones on www.amazon.com.

There are also lots of ray-tracing vs. rasterization threads here. Don't forget voxels and point rendering. There is also a bunch of interesting NPR (non-photorealistic rendering) techniques, like charcoal or cartoon shading, that might be implemented with different rendering schemes.
Finally, don't forget that the C dialects used to program CUDA or CAL are also considered "valid" languages, and that you can implement any rendering algorithm on those cards without being restricted to rasterization. Lots of fun.
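To make the "no API required" point concrete, here is a minimal sketch of pure software rendering in plain C++: fill an RGB framebuffer yourself and dump it as a PPM image file. Any algorithm (ray tracing, splatting, voxels) could write into the same buffer; the names `renderGradient` and `writePPM` are just illustrative:

```cpp
#include <cassert>
#include <cstdio>
#include <vector>

// Render into a plain RGB byte buffer -- no graphics API involved.
std::vector<unsigned char> renderGradient(int w, int h) {
    std::vector<unsigned char> pixels(w * h * 3);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            unsigned char* p = &pixels[(y * w + x) * 3];
            p[0] = static_cast<unsigned char>(x * 255 / (w - 1)); // red ramps with x
            p[1] = static_cast<unsigned char>(y * 255 / (h - 1)); // green ramps with y
            p[2] = 64;                                            // constant blue
        }
    return pixels;
}

// Write the buffer out as binary PPM, a trivially simple image format.
void writePPM(const char* path, const std::vector<unsigned char>& px, int w, int h) {
    FILE* f = std::fopen(path, "wb");
    std::fprintf(f, "P6\n%d %d\n255\n", w, h);
    std::fwrite(px.data(), 1, px.size(), f);
    std::fclose(f);
}
```

Swap the gradient loop for a per-pixel ray trace and you have a complete renderer with no DirectX or OpenGL anywhere.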

... and don't forget that 40-year-old chip technology can be used for rendering as well. Check out any Intel or AMD chip :-)
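For the curious: the core of a CPU-only rasterizer really does fit in a few lines, using the classic edge-function (half-space) test. This is only an illustrative sketch, not production code:

```cpp
#include <cassert>
#include <vector>

// Edge function: sign tells which side of edge (ax,ay)->(bx,by) point (px,py) is on.
int edge(int ax, int ay, int bx, int by, int px, int py) {
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

// Fill a triangle into a w*h byte framebuffer (1 = covered).
// Vertices must be ordered so all three edge functions are >= 0 inside.
void rasterizeTriangle(std::vector<unsigned char>& fb, int w, int h,
                       int x0, int y0, int x1, int y1, int x2, int y2) {
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            // A pixel is inside when it lies on the same side of all three edges.
            if (edge(x0, y0, x1, y1, x, y) >= 0 &&
                edge(x1, y1, x2, y2, x, y) >= 0 &&
                edge(x2, y2, x0, y0, x, y) >= 0)
                fb[y * w + x] = 1;
        }
}
```

Real rasterizers add sub-pixel precision, fill-rule tie-breaking, and perspective-correct interpolation on top of this same test, but the principle is unchanged.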

If you would like to be ready for when we stop using OGL or DX, you had better study ray tracing. I don't think we will ever go back to software rasterization, even when Intel's Larrabee comes out. Either we keep using OpenGL and DirectX, or we start using ray tracing. Or both together.
