Archived

This topic is now archived and is closed to further replies.

duke

OpenGL opengl -> sql

Recommended Posts

Ok, if you don't like to contemplate absolutely ridiculous ideas, skip this post.

Have any of you, the knowledgeable readers of this forum, ever contemplated using OpenGL for something totally non-graphics related? Consider this: what does OpenGL actually do? For starters, it reduces four-dimensional data to two dimensions. My thinking is that it might be possible to use OpenGL as an SQL database. By representing the data in the database as OpenGL vertex data, you could position the camera in certain ways to "select" the data: simply render to a texture, then read the texture back.

Assuming it would even be possible, why would you want to do this? Reason #1: it would just be cool as hell. Reason #2: a machine running as a dedicated server may already have a good video card in it, and you could effectively use that video card as an additional processor. Dual-CPU machines are very cost effective; beyond two CPUs you start to pay out the butt. So if you bought a dual-CPU box and put a $100 GeForce or ATI card in it, you would have something like 2.5 CPUs on your SQL server.

Like I said, it is an "out there" idea. I just wonder if anyone else has (a) thought along these lines and (b) actually written some code to that effect. If you wanted to do such a thing, you would obviously need to come up with creative ways to place your data into OpenGL vertex buffer objects, and employ even more creative methods of positioning the camera to select data.

Anyway, just a thought I had while bored as hell.

Anything you want; it could be "the smell of a flower" or something.

Thing is, the only benefit would be the use of graphics hardware, which is pretty well tuned to streaming data.

You'd also be pretty limited by the bit precision of current hardware; perhaps when 96-bit or greater depth is the norm this might be feasible.

It would be interesting to see research in this direction. But then again, why wouldn't you just stick a second general-purpose CPU into the system? (You could, for instance, buy a cheap PII system for less than a mid-level GPU.)

The fourth dimension is the 'w' component. When you submit a vertex in OpenGL with a call such as glVertex3f(1, 1, 1), there is also a fourth value in that vertex: the w component. If you do not specify it, it defaults to 1, but it is possible to specify it explicitly (e.g. with glVertex4f).

As for just adding a second CPU: my point was that you would add both, a second CPU and a good video card. Anyway, I think it would make an interesting research project. Too bad I am no longer a student with tons of time on my hands.

