About madgallagher

  1. madgallagher

    Strange Video Memory Increase

    Thanks, I'll check out GL_EXT_compiled_vertex_array. I get the same problem when I have varied colours at each vertex and just change the values, say RGB -> RBG. Basically, any change to the colour array causes the problem.

    I appreciate all your comments, gentlemen.
  2. madgallagher

    Strange Video Memory Increase

    It's all in C, baby. I am not using anything like GLEW or SDL; it's pretty basic stuff.

    Do you think it is a driver issue that is out of my control?
  3. madgallagher

    Strange Video Memory Increase

    Yes, sorry, I forgot to add the glDisableClientState calls; the client states are all disabled after the tris are drawn each time.

    I added VBOs and got the same effect, even when deleting the VBO. Even simply using glBegin/glEnd for the rendering (shock horror!) causes the same effect.

    It is kind of driving me crazy...
  4. madgallagher

    Strange Video Memory Increase

    OK, the render code is basically as follows:

        glShadeModel(GL_SMOOTH);
        glEnable(GL_COLOR_MATERIAL);
        glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT);
        glColorMaterial(GL_FRONT_AND_BACK, GL_SPECULAR);
        glColorMaterial(GL_FRONT_AND_BACK, GL_DIFFUSE);

        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(....);
        glEnableClientState(GL_NORMAL_ARRAY);
        glNormalPointer(...);
        glEnableClientState(GL_COLOR_ARRAY);
        glColorPointer(....);
        glDrawElements(........);

    So I am just using arrays, not VBOs. All I do is update the values in the color array that glColorPointer points to before I come into this routine.

    The problem is, on a 1 GB graphics card, if I am already using 600 MB and then change the colors, it swamps the card and the system starts lagging badly because it is trying to double the memory usage.
  5. madgallagher

    Strange Video Memory Increase

    Hi

    I am using the latest Linux driver from NVIDIA. The card is pretty old: it's an FX3800, but we see the same issue on an FX4000. The same routines are used for both types of shading; I'm simply changing the values in the color array, nothing else.

    We get the same "doubling" of memory usage on smaller models too. It's as if, when the color array is updated, the memory is not re-allocated on the card properly.
  6. Hi guys

    I have a weird issue in OpenGL. I am rendering a lot of tris (millions) using vertex arrays. Here is the issue.

    If I load up the tris and display them shaded with a constant color, the video memory usage shows around 600 MB.

    If I then change the color array values so they vary per vertex, the memory usage increases to around 1 GB. If I then spin the model around, the memory usage slowly decreases back to around 600 MB (after a couple of minutes).

    But if I load up the model with varied shading from the start, the memory usage is only 600 MB.

    This is on Linux, so is it a driver issue? Or is there something more obvious I am neglecting?

    I am using nvidia-smi to check the graphics memory usage.

    Cheers in advance for any help!
