[solved] GLSL - Memory keeps growing


Hi, I'm having a bit of trouble here in GLSL. I've been trying to learn using the SwiftLess tutorials (which are great, I think). But I forgot about my open application for a while, got something to drink, played with my cat and so on. When I came back, the program had crashed. After checking a bit, I found out that:

A) The program crashed because it ran out of memory.
B) The program keeps getting larger in RAM (quick check in the Task Manager).
C) It's coming from the shader initialization/reading/disabling functions that I've written.

When I say it keeps getting larger in RAM, I mean it's growing quickly, by about 2-4 MB per second, until there's no more RAM. And I'm pretty certain it's not my shaders that are too demanding: a vertex shader that calls "ftransform()" and a fragment shader that sets "gl_FragColor = gl_Color". So, since I can't find anything wrong myself, can you see anything wrong here?
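Shaders that trivial would look roughly like this (a sketch; the actual shader files aren't shown in the post, only described):

```glsl
// test.vert - pass-through vertex shader
void main()
{
	gl_Position = ftransform();
}
```

```glsl
// test.frag - pass-through fragment shader
void main()
{
	gl_FragColor = gl_Color;
}
```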
#include <windows.h>
#include "glee.h"
#include <gl\gl.h>
#include <gl\glut.h>
#include <stdio.h>
#include <stdlib.h>

#include "shaderProcessor.h"

char *vs_src;
char *fs_src;
int vs;
int fs;
int sp;

// A function to read the shader files
char *readShader(char *filename){
	FILE *fp;
	char *DATA = NULL;

	int flen = 0;
	fp = fopen(filename, "rt");

	fseek(fp, 0, SEEK_END);
	flen = ftell(fp);

	DATA = (char *)malloc(sizeof(char) * (flen + 1));
	flen = fread(DATA, sizeof(char), flen, fp);
	DATA[flen] = '\0';

	return DATA;
}

// A function to initialize the shaders
void initShader(void){

	vs = glCreateShader(GL_VERTEX_SHADER_ARB);
	fs = glCreateShader(GL_FRAGMENT_SHADER_ARB);

	vs_src = readShader("shaders/test.vert");
	fs_src = readShader("shaders/test.frag");

	const char *VS = vs_src;
	const char *FS = fs_src;

	glShaderSource(vs, 1, &VS, NULL);
	glShaderSource(fs, 1, &FS, NULL);

	glCompileShader(vs);
	glCompileShader(fs);

	sp = glCreateProgram();

	glAttachShader(sp, vs);
	glAttachShader(sp, fs);

	glLinkProgram(sp);
}

// A function to deinitialize the shaders
void deinitShader(void){
	glDetachShader(sp, vs);
	glDetachShader(sp, fs);
}
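As an aside, a sketch of a readShader variant that checks fopen, rewinds before reading, and closes the file (same signature idea as above; the caller still owns the returned buffer and must free() it):

```c
#include <stdio.h>
#include <stdlib.h>

/* Read an entire file into a NUL-terminated, malloc'd buffer.
 * Returns NULL on failure. Caller must free() the result. */
char *readShader(const char *filename)
{
	FILE *fp = fopen(filename, "rb");
	if (!fp)
		return NULL;

	fseek(fp, 0, SEEK_END);
	long flen = ftell(fp);
	fseek(fp, 0, SEEK_SET);          /* rewind before reading */

	char *data = (char *)malloc((size_t)flen + 1);
	if (!data) {
		fclose(fp);
		return NULL;
	}

	size_t got = fread(data, 1, (size_t)flen, fp);
	data[got] = '\0';

	fclose(fp);                      /* release the file handle */
	return data;
}
```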

This is its own .cpp with a .h hanging on to it. The .h only contains declarations of the functions used. I'll post it, but I don't think it's got anything to do with my errors (but what do I know, I'm a newbie :) )

char *readShader(char *filename);
void initShader(void);
void deinitShader(void);


I include this .h in my "main.cpp" and then call the "initShader();" function when/where I want to activate it. I hope you can help me. I'm, as I said, a newbie, but learning fast. Thanks! Marcus Axelsson [Edited by - tre on September 14, 2009 2:28:35 AM]

Hi shultays,
I am absolutely sure it's in my C++ code.
I just can't figure it out.
From what I can tell, there's nothing that can leak in the shader init, deinit or reader. It should be solid.
But nevertheless, when I remove the calls to these functions the program doesn't "overload" on memory.

My draw function:

void draw(void){

	glTranslatef(0.0f, 0.0f, -13.0f);

	glTranslatef(-4.0f, 0.0f, 0.0f);
	glRotatef(angle, 1.0f, 1.0f, 0.0f);

	glTranslatef(4.0f, 0.0f, 0.0f);
	glRotatef(angle, 1.0f, 1.0f, 0.0f);
}

The only thing I see is that you're calling initShader every frame, which means you're opening the shader files every frame. Assuming you're getting 30 frames per second, that's 30 file operations a second, which means buffering 30 files per second. If each file is about 1KB, that's 30KB per second x 2 (the vertex and fragment shader files), which means 60KB per second of freshly allocated buffers that are never freed.

Move the initShader call into an initialization area (such as in main, before you call glutMainLoop()) and you should be good. Also move deinitShader to your main, right after glutMainLoop().
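A minimal sketch of that structure, assuming a GLUT setup along the lines of the tutorial (window title and display-mode flags are placeholders):

```c
int main(int argc, char **argv)
{
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
	glutCreateWindow("shader test");

	initShader();            /* once, after the GL context exists */

	glutDisplayFunc(draw);   /* draw() uses the shaders but never recreates them */
	glutMainLoop();

	deinitShader();          /* note: classic glutMainLoop() never returns */
	return 0;
}
```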

You recreate your shaders each time you render a frame. I don't think it's what causes your memory leak, but it's clearly a bad thing to do, performance-wise.
Try to call initShader only once, during initialization, and have another function that activates your program (just a call to glUseProgram) that you call inside your rendering loop.

You're great, guys! :)
The memory sits quietly at "26 332 kB".

I created two new, very small functions called "useShaders" and "noShaders". One with a call to "glUseProgram(sp)" and one to "glUseProgram(0)". It works like a charm to switch on and off shaders.
I guess it might even serve to switch different shaders on and off later, when I'm getting more advanced. I'd just have to make them receive variables, so that I can choose which shader to switch on.
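A parameterized version could be as small as this (a sketch; the useShaders/noShaders names come from this post, and sp is the program handle from the code above):

```c
void useShaders(GLuint program)  /* pass sp, or any other linked program */
{
	glUseProgram(program);
}

void noShaders(void)
{
	glUseProgram(0);             /* back to the fixed-function pipeline */
}
```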

I don't know what I'd do without this forum :)

Thanks again, shultays, stevo86 and Arkhyl.
