stl and memory leaks - c++

12 comments, last by Conner McCloud 16 years, 7 months ago
I have been having trouble with cleaning up memory with the STL ever since I started using it. The STL is designed far better than my code, and I rarely see people posting issues on this, so let me give some examples of what I'm facing and maybe someone will know what's wrong.

Networking: I use a std::vector to hold an array of pointers to client objects. When a new client connects, we push a "new" client object. When a user disconnects, we call delete on the pointer, set it to NULL, and proceed. When the destructor of the container class holding the std::vector runs, we iterate through the array calling delete on each pointer. Scripting: a std::map of objects arranged similarly and cleaned up the same way.

The problems:
- Stack corruption: I can't prove it via debugging, but I suspect the stack or heap is getting corrupted at various points when using std::map, std::list, or std::vector of pointers the way I am doing it.
- Crashes: reads and writes to invalid locations.
- Debugging hell: it's much harder for me to debug STL containers because I have to drill down through a tree of elements just to reach one, and the debugger doesn't recognize STL operators such as [].

One thing I see people doing now is using smart pointers. The STL was not the norm when I learned C++, so these classes are new to me and I may be misusing them, but I have done a fair amount of research on STL usage to learn it. I'm tired of ignoring these errors and living with the memory leaks. I just want to delete the objects I create when they are no longer needed, and take advantage of not having to reinvent the wheel every time I need another linked list.

Here is some code from a rewrite of my patcher, for those wondering about my angle of attack (taken from various sections of the code):

	// Declare neccessary variables
	std::vector<CPatchFile*> FileList;

	// In constructor
	: FileList(10) {}

	// Add a file to list of files for this directory
	// Called multiple times
	if(FileList.capacity() - FileList.size() < 1)
		FileList.resize(FileList.capacity() + 10);
	FileList[numFiles] = new CPatchFile(/*params passed here to construct obj*/);
	numFiles++;

	// Clean up memory
	// Called once
	for(unsigned int i=0; i < numFiles; i++)
	{
		if(FileList[i] != NULL)
		{
			delete FileList[i];
			FileList[i] = NULL;
		}
	}


I just want to make sure I don't get flamed here by a bunch of guys who think I'm blaming this on the STL. It has to be something I'm not doing, I know that; the STL is written well.
std::vector<CPatchFile *> FileList;

// add a new patch file. push_back() will grow the vector's
// storage if it needs to
FileList.push_back(new CPatchFile(...));

...

// empty the file list
for(std::vector<CPatchFile *>::iterator i = FileList.begin();
    i != FileList.end();
    /* no increment here: erase() already advances us */)
{
  delete *i;              // i is an iterator to a CPatchFile*
                          // release the memory we allocated in push_back()
  i = FileList.erase(i);  // remove the element from the vector
}


Smart pointers would allow you to eliminate the delete call by letting the smart pointer handle it. Once you no longer have to worry about a deallocation strategy, you can simply let the vector (now a vector of smart pointers to CPatchFile) as a whole go out of scope and get cleaned up by its destructor and the destructors of the smart pointer instances.
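Something like this, roughly (just a sketch; I'm writing std::shared_ptr here, substitute boost::shared_ptr if your compiler doesn't ship it, and the CPatchFile stub is only a placeholder):

#include <memory>
#include <vector>

// Stub standing in for the poster's CPatchFile class.
struct CPatchFile
{
    explicit CPatchFile(int id) : id(id) {}
    int id;
};

int main()
{
    // Vector of smart pointers: no manual delete loop needed.
    std::vector<std::shared_ptr<CPatchFile> > FileList;

    FileList.push_back(std::shared_ptr<CPatchFile>(new CPatchFile(0)));
    FileList.push_back(std::shared_ptr<CPatchFile>(new CPatchFile(1)));

    // When FileList goes out of scope, the vector's destructor destroys
    // each shared_ptr, and each shared_ptr deletes the CPatchFile it owns.
    return 0;
}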
just a tip, I usually define a functor to do deletion (and other simple tasks), that way I can use std::for_each and std::remove_if quickly/easily.
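Roughly what I mean - just a sketch, with CPatchFile stubbed out as a placeholder:

#include <algorithm>
#include <vector>

struct CPatchFile { };  // placeholder for the real class

// Functor that deletes whatever pointer it is handed.
struct DeletePointer
{
    template <typename T>
    void operator()(T* ptr) const { delete ptr; }
};

int main()
{
    std::vector<CPatchFile*> FileList;
    FileList.push_back(new CPatchFile());
    FileList.push_back(new CPatchFile());

    // Delete every element, then drop the (now dangling) pointers.
    std::for_each(FileList.begin(), FileList.end(), DeletePointer());
    FileList.clear();
    return 0;
}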
"I am a donut! Ask not how many tris/batch, but rather how many batches/frame!" -- Matthias Wloka & Richard Huddy, (GDC, DirectX 9 Performance)

http://www.silvermace.com/ -- My personal website
beautiful tips guys thank you
Quote:Original post by silvermace
just a tip, I usually define a functor to do deletion (and other simple tasks), that way I can use std::for_each and std::remove_if quickly/easily.


When I first learned about functors I had to turn my computer off and sleep on it, because my head was spinning with awesome ideas.

You would do this instead of smart pointers? Hmm, I hate to open a can of worms in this great thread about the performance of smart pointers, but that is the reason I have stayed away from them: an extra operation per pointer access is kind of heavy.
Quote:Original post by Oluseyi
*** Source Snippet Removed ***

Smart pointers would allow you to eliminate the delete call by letting the smart pointer handle it. Once you no longer have to worry about a deallocation strategy, you can simply let the vector (now a vector of smart pointers to CPatchFile) as a whole go out of scope and get cleaned up by its destructor and the destructors of the smart pointer instances.


One thing I need to be able to do, though, is access elements in the array by their id, i.e. file 10 needs to map to the 10th file pushed into the array. That would seemingly work, but it isn't 100% reliable if the count gets out of sync somewhere - or will I be able to tell by calling FileList.size()?
No, smart pointers should perform well; they are essentially a simple implementation. I use these mechanisms because my architectures usually allow for clearly defined creation and deletion points. Because of that, simplification is the better policy, and IMHO using primitive pointers and a few very basic std templates is simpler. :)

RE: fileID-to-pointer mapping, I think you are using the wrong container.
If you can guarantee that the file ids are unique, each identifying exactly one file, and you need random lookup plus both sporadic and regular deletion of files, then I think you fit the criteria for a std::map or stdext::hash_map (VC8). Be warned, though: std::map is typically a tree-based container (O(log n) lookup), and hash_map is supposed to be O(1), but both have more lookup overhead than indexing into a simple vector.
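For example, something along these lines (sketch only; the ids and the placeholder CPatchFile are made up):

#include <map>

struct CPatchFile { };  // placeholder for the real class

int main()
{
    // Look files up by id instead of by position in a vector.
    std::map<int, CPatchFile*> files;

    files[10] = new CPatchFile();                       // insert under id 10

    std::map<int, CPatchFile*>::iterator it = files.find(10);
    if (it != files.end())
    {
        delete it->second;                              // free the object...
        files.erase(it);                                // ...and remove the entry
    }
    return 0;
}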
"I am a donut! Ask not how many tris/batch, but rather how many batches/frame!" -- Matthias Wloka & Richard Huddy, (GDC, DirectX 9 Performance)

http://www.silvermace.com/ -- My personal website
What did work, although maybe not the way vector is intended to be used, and what is pretty optimal for a patch server (longer init, faster seeks), was not pre-sizing a buffer in the vector at all: start at size 0, push each file, and use size()-1 as the id.
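Something like this is what I mean (just a sketch, the names are made up):

#include <vector>

struct CPatchFile { };  // placeholder for the real class

std::vector<CPatchFile*> FileList;  // starts empty, no pre-sizing

// Returns the id of the file just added: its position in the vector.
unsigned int AddFile()
{
    FileList.push_back(new CPatchFile());
    return FileList.size() - 1;         // "size()-1 as the id"
}

int main()
{
    unsigned int id = AddFile();        // id 0 for the first file, 1 for the next...
    CPatchFile* file = FileList[id];    // later: look it up by id
    (void)file;
    return 0;
}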
Quote:Original post by yoshscout
	if(FileList.capacity() - FileList.size() < 1)
		FileList.resize(FileList.capacity() + 10);
	FileList[numFiles] = new CPatchFile(/*params passed here to construct obj*/);
	numFiles++;
There are a number of problems with that:
You don't have to track the capacity yourself in order to grow the vector when you think you need to. If you need to resize the vector to a bigger size, then you would be much better off just doing this:
FileList.resize(numFiles+1);
This will only cause reallocation if the vector class itself thinks it is necessary (and believe me, it is smarter than you think!).

Also, you don't need to track the number of items in the vector yourself (using numFiles). The vector already tracks the number of items, which is accessible through the size() member. Tracking this yourself as well simply leaves open the possibility that numFiles might get out of sync with the vector's size().
The better alternative to the entire quoted code snippet above is:
	FileList.push_back(new CPatchFile(/*params passed here to construct obj*/));
That's it! A new CPatchFile object is created, the vector is grown if necessary, size() is incremented, and the new item is placed at the end. Then you iterate over the vector using:
	for(unsigned int i=0; i < FileList.size(); i++)
or better still:
	typedef std::vector<CPatchFile*> PatchFileVector;

	for(PatchFileVector::iterator it = FileList.begin(); it != FileList.end(); ++it)
which will also work on other container types.
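For example (a rough sketch), swapping to a std::list only means changing the typedef; the loop itself stays the same:

#include <list>

struct CPatchFile { };  // placeholder for the real class

int main()
{
    typedef std::list<CPatchFile*> PatchFileList;   // only this line changes
    PatchFileList FileList;
    FileList.push_back(new CPatchFile());

    for (PatchFileList::iterator it = FileList.begin(); it != FileList.end(); ++it)
        delete *it;
    FileList.clear();
    return 0;
}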
Of course, none of this should really be using raw pointers in the vector anyway, as mentioned. Boost's ptr_container library is what you want.
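For example, a quick sketch with boost::ptr_vector (assuming you have Boost available; CPatchFile is stubbed out):

#include <boost/ptr_container/ptr_vector.hpp>

struct CPatchFile { };  // placeholder for the real class

int main()
{
    // ptr_vector owns its elements: no delete loop, no per-access smart-pointer overhead.
    boost::ptr_vector<CPatchFile> FileList;

    FileList.push_back(new CPatchFile());
    CPatchFile& first = FileList[0];    // elements are accessed by reference, not pointer
    (void)first;

    return 0;   // FileList's destructor deletes every CPatchFile it holds
}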
"In order to understand recursion, you must first understand recursion."
My website dedicated to sorting algorithms

This topic is closed to new replies.
