Archived

This topic is now archived and is closed to further replies.

duke

OpenGL opengl -> sql


Recommended Posts

Ok, if you don't like to contemplate absolutely ridiculous ideas, skip this post. Have any of "you", being the knowledgeable readers of this forum, ever contemplated using OpenGL for something totally non-graphics related?

Consider this: what does OpenGL actually do? For starters, it reduces 4-dimensional data into 2 dimensions. My thinking is that it might be possible to use OpenGL as an SQL database. By representing the data in the database as OpenGL vertex data, you could position the camera in certain ways to "select" the data: simply render to a texture, and then read the texture back.

Firstly, assuming it would even be possible, why would you want to do this? Reason #1: it would just be cool as hell. Reason #2: a machine running as a dedicated server may well have a good video card in it, and you could effectively use that video card as an additional processor. Dual-CPU machines are very cost effective; beyond 2 CPUs you start to pay out the butt. So if you bought a dual-CPU box and put a $100 GeForce or ATI card in it, you would have something like 2.5 CPUs on your SQL server.

Well, like I said, it is an "out there" idea. I just wonder if anyone else has A) thought along these lines and B) actually written some code to that effect. If you wanted to do such a thing, you would obviously need to come up with creative ways to place your data into OpenGL vertex buffer objects, and employ even more creative methods for positioning the camera to select data. Anyway, just a thought I had while bored as hell.
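Just to make the idea concrete, here is a minimal sketch of what I mean, in plain old immediate-mode OpenGL with GLUT. It is not a real implementation: the little table, the 30..60 filter range, and the trick of packing a row id into the red channel are all made up for illustration. Each row becomes a point, the glOrtho volume acts as the WHERE clause by clipping away everything outside the range, and glReadPixels pulls the surviving rows back off the framebuffer.

/* Hypothetical example: SELECT row_id WHERE value BETWEEN 30 AND 60,
 * done by letting the hardware clip points against an ortho volume. */
#include <GL/glut.h>
#include <stdio.h>

static const float rows[][2] = {   /* { value, row_id } -- made-up data */
    { 10.0f, 1.0f }, { 42.0f, 2.0f }, { 55.0f, 3.0f }, { 90.0f, 4.0f }
};

static void display(void)
{
    int i;

    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    /* The "WHERE value BETWEEN 30 AND 60" part: anything outside the
     * ortho volume never makes it into the framebuffer. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(30.0, 60.0, 0.0, 1.0, -1.0, 1.0);

    glPointSize(1.0f);
    glBegin(GL_POINTS);
    for (i = 0; i < 4; ++i) {
        /* Encode the row id in the red channel -- only 8 bits of precision,
         * which is exactly the limitation mentioned in the reply below. */
        glColor3f(rows[i][1] / 255.0f, 0.0f, 0.0f);
        glVertex3f(rows[i][0], 0.5f, 0.0f);
    }
    glEnd();
    glFinish();

    /* Read the "result set" back: any pixel with a non-zero red value
     * is a row that survived the clip. */
    {
        unsigned char pixels[64 * 64 * 3];
        glReadPixels(0, 0, 64, 64, GL_RGB, GL_UNSIGNED_BYTE, pixels);
        for (i = 0; i < 64 * 64; ++i)
            if (pixels[i * 3])
                printf("selected row id: %d\n", pixels[i * 3]);
    }
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize(64, 64);
    glutCreateWindow("opengl as sql - sketch");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}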

Anything you want; it could be 'the smell of a flower' or something.

Thing is, the only benefit would be the use of graphics hardware, which is pretty well tuned to streaming data.

You'd be pretty limited by bit precision on current hardware; perhaps when 96-bit or greater bit depth is the norm this might be feasible.

It would be interesting to see research in this direction, but then again, why wouldn't you just stick a second general CPU into the system? (You could, for instance, buy a cheap PII system for less than a mid-level GPU system.)
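For what it's worth, here is a tiny sketch (GLUT is used only to get a context; the rest is standard GL) that asks the driver how many bits of precision one colour channel actually has. On current hardware it will usually report 8, which is the limit I mean:

/* Query the per-channel precision of the current framebuffer. */
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    GLint red_bits = 0;

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutCreateWindow("precision check");   /* a context must exist before glGet */

    glGetIntegerv(GL_RED_BITS, &red_bits);
    printf("red channel precision: %d bits\n", (int)red_bits);
    return 0;
}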

The fourth dimension is the 'w' component. When you submit a vertex in OpenGL with a call such as glVertex3f(1,1,1), there is also a fourth value in that vertex: the w component. If you do not specify it, I believe it defaults to 1, but it is possible to specify it explicitly.
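For example (just a fragment, not a complete program; it assumes a current GL context and would sit inside a draw routine like the one in my first post):

glBegin(GL_POINTS);
    glVertex3f(1.0f, 1.0f, 1.0f);          /* w is implicitly 1.0 */
    glVertex4f(1.0f, 1.0f, 1.0f, 1.0f);    /* the same vertex, with w given explicitly */
glEnd();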

As for just adding a second CPU, my point was that you add both: a second CPU and a good video card. Anyway, I think it would make an interesting research project; too bad I am not a student with tons of time on my hands anymore.
