Archived

This topic is now archived and is closed to further replies.

AndreTheGiant

Is this a mem leak?


int main() {
 
 int eger = new int;

}
 
Is this a memory leak? I know it probably would have been just plain good practice to simply do this instead:
int main() {
 
 int eger = new int;
 delete eger;

}
 
But suppose you don't; doesn't the OS get the memory back when your program terminates anyway? In other words, even though this program technically has a memory leak, couldn't you run it infinitely many times without a problem, because the OS will get the memory back at the end of each run? Or does it depend on which OS you are using? If it depends on the OS, then how does Windows behave? How does Linux behave? Thanks.

It is a leak.

It would be a problem if your program did this over and over again, though. Just because the OS cleans up the process memory on closing doesn't mean it's okay to leak memory.

quote:
Original post by AndreTheGiant

int main() {

int eger = new int;

}


Is this a memory leak? I know it probably would have been just plain good practice to simply do this instead:


int main() {

int eger = new int;
delete eger;

}


But suppose you don't; doesn't the OS get the memory back when your program terminates anyway? In other words, even though this program technically has a memory leak, couldn't you run it infinitely many times without a problem, because the OS will get the memory back at the end of each run?

Or does it depend on which OS you are using? If it depends on the OS, then how does Windows behave? How does Linux behave?

Thanks.


Err.... int* eger = new int?

.lick

That is a leak. Even if it is freed after the program exits, it won't be freed for you to use later on in your program. I'd also note that, while Win2000, XP, and Linux are pretty robust and will most likely free up any memory when your program exits, Win 95/98 sucks for stuff like that. Run Windows Explorer 20 times in a row under Win98 and you'll run out of free memory and have to reboot. In short, don't depend on the operating system to do your dirty work.

Someone once referred to this type of leak:

int main() {
int* eger = new int;
}

as a "memory dribble," since it's so small and the program ends almost immediately.

Regardless of whether the OS frees this memory at the end or not, the issue is getting into the habit of NEVER DOING THIS. Always delete the memory you allocate once you're done with it.

Regards,
Jeff

Guest Anonymous Poster
quote:

It is absolutely NOT a leak, you still have a pointer to the allocated memory, it is a (very small) memory pool.



So you're saying that even after the function has finished, and the activation frame is removed from the stack, it's not a leak because the address is still somewhere on the stack? Come on.

sorry magmai, this time you're wrong, because when the function exits, the only remaining pointer to the memory is lost ... so it's a classic leak ...

another reason to think of this as a memory leak is that all software which helps detect memory leaks will call this a leak ... which means leaving junk like this in the program will prevent you from finding the real / important leaks deeper in your code (I assume you are asking whether it is a leak to intentionally not call delete on an object since you know the OS will reclaim the space) ...

When an operating system runs a program, it partitions a section of RAM and gives that exclusively to the application. The app gets an error if it tries writing to another sector of memory. Anyway when the program terminates the OS will wipe clean this area of memory and flag it as unused.

The only way there could be a memory leak after a program exits would be if something prevented the OS from correctly flagging that section of memory. I don't think any currently used OS has a bug like that.

A memory leak occurs, as has been said, when a program allocates and deletes memory but accidentally deletes less than it allocated, so it gradually uses more and more memory. Well, that's one type of leak.

I just finished my college course in C++ ... and I remember being told that memory gets freed when it goes out of scope.

Say I have an integer declared in a function.

int getajob(blahblah)
{
int crap = jommastrash;
return crap;
}

My Main calls this function once... I get crap returned... and then crap is deleted, and its memory is freed. Right?

This is what I was told in programming class anyway.
It was never stressed to clean up your own mess. In this case my mess being 'junk'.

I'm going to google & MSDN for a few mins, but if anyone knows of a resource or tutorial that explains this... feel free to post.

-DD

EDIT*****************************
#include <iostream>
using std::cout;
using std::endl;

int main()
{
int* eger[40000];
for (int count = 0; count < 50000; count++)
{
eger[count] = new int;
cout << "int " << count << endl;
}
delete *eger;

return 0;

}


After running this source code, I was basically looking for it to crash. I'm playing around a little, seeing what can be done and what can't.

I ran this code expecting a crash when it tried to assign eger[40000] or 40001. Well... no error. It actually kept assigning values up to 40031.

Why in the world would it go to 40031 and then stop? Why didn't it stop earlier? Can anyone explain this anomaly to me?

If ya don't believe me, try it. I'm using Visual 6.0 Standard.

EDIT*****************************************

[edited by - agentidd on May 2, 2003 8:46:25 PM]

quote:
Original post by Agentidd
I just finished my college course in C++ ... and I remember being told that memory gets freed when it goes out of scope.

Say I have an integer declared in a function.

int getajob(blahblah)
{
int crap = jommastrash;
return crap;
}

My Main calls this function once... I get crap returned... and then crap is deleted, and its memory is freed. Right?

This is what I was told in programming class anyway.
It was never stressed to clean up your own mess. In this case my mess being 'junk'.

I'm going to google & MSDN for a few mins, but if anyone knows of a resource or tutorial that explains this... feel free to post.

-DD


Yes, but that's a single local variable. We're talking about memory allocated with new. A local like that has automatic storage: it lives on the stack and is reclaimed automatically when the function returns, with its value copied out. Memory from new, on the other hand, stays allocated even after every pointer to it goes out of scope. When calling new, ALWAYS call delete when you are done. If you don't, you will get into the nasty habit of not doing it when you need to and create all sorts of problems with larger software. It's much easier to avoid bugs than to track them down, especially memory leaks, because sometimes they don't surface until much later on down the line.

The memory does not get freed when it goes out of scope. Actually, the whole purpose of new and malloc is to have memory allocation that survives exactly that. You must release the memory when you are not using it any more, or you have a leak. Even if the OS eventually cleans the unreferenced memory up, it is not immediate. You can read more about this at Stroustrup's homepage: http://www.research.att.com/~bs/bs_faq2.html#delete-scope

I would also advise against assuming that any OS will clean up after you. It is analogous to buying food at McDonalds and not cleaning the table when you leave, because you assume the waiter will clean up after you. Eventually, yes, but in the meantime some other poor person who can't find a table has to sit in your mess and eat.

question: Why in the world would it go to 40031 and then stop?
Because an allocation is only approximately the size you ask for. When you create x[100] you may well be able to write to x[130], and because you are just accessing memory you won't notice anything wrong until you hit a boundary (a page boundary) or another allocated block, which causes the error. Going out of bounds is widely used by viruses to run code from, say, a text field; the Nimda virus, for instance, uses this technique.
This is something that .NET has strategies against.

____________________________________________________________
Try RealityRift at www.planetrift.com
Feel free to comment, object, laugh at or agree to this. I won't engage in flaming because of what I have said.
I could be wrong or right but the ideas are mine.



[edited by - MichaelT on May 2, 2003 8:58:30 PM]

Guest Anonymous Poster
main() leaks the allocation when the pointer goes out of scope.

However, if you allocate memory that you will use until program termination, you do not need to free all that memory on termination. That's a job explicitly performed by the operating system. You should let it do that -- an application should quit in 1 second or less. I hate applications that take 30 seconds just to shut down; that's entirely unnecessary.

Guest Anonymous Poster
That's what sucks about VC++. In g++ (at least in Linux), the instant your program touches memory it shouldn't, you get a seg-fault. Very nice for making the application work correctly, but a pain in the butt to debug.

As I said, not releasing resources is bad. Very bad. Don't assume your mess will be cleaned up. A debugger will scream if you behave like that.

Not even in Linux are you protected against out of bounds programming.



[edited by - MichaelT on May 2, 2003 9:08:12 PM]

quote:
Original post by Agentidd

EDIT*****************************
#include <iostream>
using std::cout;
using std::endl;

int main()
{
int* eger[40000];
for (int count = 0; count < 50000; count++)
{
eger[count] = new int;
cout << "int " << count << endl;
}
delete *eger;

return 0;

}


After running this source code, I was basically looking for it to crash. I'm playing around a little, seeing what can be done and what can't.

I ran this code expecting a crash when it tried to assign eger[40000] or 40001. Well... no error. It actually kept assigning values up to 40031.

Why in the world would it go to 40031 and then stop? Why didn't it stop earlier? Can anyone explain this anomaly to me?

If ya don't believe me, try it. I'm using Visual 6.0 Standard.

EDIT*****************************************



Go back to college... j/k

The reason is that you are writing past the array's last element (one reason for using iterators and a vector in this case). The compiler doesn't check for this (though I think VS.NET has a compiler option to do so at runtime). This is a classic mistake frequently made with arrays.

Depending on your compiler and whether you compiled in debug or release this bug could actually be really hard to find.

Lastly, your call to delete is wrong:

delete *eger;

only deletes the int that eger[0] points to. Each pointer you got from new needs its own delete:

for (int i = 0; i < 40000; ++i)
    delete eger[i];

(delete [] eger would only be correct if eger itself had come from new []; here it is an automatic array, so it must not be deleted at all.)



Hope this helps.

HEY !

int *pint=new int;

is NOT A LEAK

but if you forget it, then it's A LEAK

This question is extremely silly. Please learn thy basics and post again.

Thanks

Signed BY
THY M. C.
