ultramailman

Close file sooner or later?


Recommended Posts

When I need to use a file, I would open it, copy the contents into a string, and close the file. Then I feel like I can do whatever I want with the string.
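The pattern described above (read everything into a string, close the handle immediately) might be sketched in C like this; `slurp` is just an illustrative name, not a standard function:

```c
#include <stdio.h>
#include <stdlib.h>

/* Read an entire file into a NUL-terminated heap buffer, then close
   the handle right away. The caller owns (and must free) the buffer.
   On failure, returns NULL. */
char *slurp(const char *path, long *out_len)
{
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;

    fseek(f, 0, SEEK_END);
    long len = ftell(f);
    rewind(f);

    char *buf = malloc((size_t)len + 1);
    if (buf && fread(buf, 1, (size_t)len, f) == (size_t)len) {
        buf[len] = '\0';
    } else {
        free(buf);
        buf = NULL;
    }
    fclose(f);          /* handle released here; only the string lives on */
    if (out_len) *out_len = len;
    return buf;
}
```

After `slurp` returns, the file handle is gone and you can do whatever you want with the string.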

What if the file is not closed until the very end? What advantage would that give? Will there be performance penalties?

Say I open this file in write mode and don't close it for some time. Would that lock the file so that the user can't change it with a text editor?

If you'd like, share your file handling habits too, I'd like to know how other people do it.


I'd recommend sooner. Damian Conway's Perl Best Practices also recommends this approach, and I'd guess it's valid for other languages too. The only reason not to release a resource once it's no longer required is when the release process itself is expensive. For instance, a garbage collector delays releasing memory because collection can be CPU- and time-consuming.

 

 

What if the file is not closed until the very end?

 

For example, any unhandled exception will probably halt your program, and any buffered data won't be flushed to the file.

Edited by santa01
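The warning above can be demonstrated with stdio buffering, assuming a typical fully buffered implementation (the file name and function name are just demo scaffolding):

```c
#include <stdio.h>

/* With full buffering, written bytes sit in the stdio buffer until
   fflush()/fclose(); a crash before that point would lose them.
   Returns 1 if the data only became visible in the file after
   fclose(), i.e. it really was held in the user-space buffer. */
int data_only_visible_after_close(const char *path)
{
    FILE *out = fopen(path, "w");
    if (!out) return -1;
    setvbuf(out, NULL, _IOFBF, 1 << 16);  /* force full buffering */
    fputs("important data", out);         /* still only in memory */

    char seen[64];
    FILE *in = fopen(path, "r");
    size_t before = fread(seen, 1, sizeof seen, in);  /* typically 0 */
    fclose(in);

    fclose(out);                          /* the flush happens here */

    in = fopen(path, "r");
    size_t after = fread(seen, 1, sizeof seen, in);
    fclose(in);
    return before == 0 && after > 0;
}
```

If the program died between the `fputs` and the `fclose(out)`, those bytes would simply never reach the disk.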


You asked "would this lock the file"? You're referring to mandatory locking, which is enforced on some OSes in some cases (mainly DOS / Windows). It is not a good idea to use this as a "feature", in my opinion.

 

The only benefit of leaving a file open is that you don't have to open it again if you need it later. If this doesn't fit your use-case, then you probably want to close it as soon as possible.

 

It's normally easier to close files as soon as possible, because you don't end up with bad "side effects" - what if someone instantiates several of your objects at once? What happens if a million get created at once?

 

The OS can run out of file handles, or you could "forget" to close them and create a leak. You probably normally want to close the file when you're finished.

 

Closing and opening files is normally a fairly efficient operation, particularly if the file has been accessed recently and its metadata is in an OS cache.


What if the file is not closed until the very end?

If you open enough files, you run out of file handles.

 

Certain systems set this fairly low (the default config for Mac OS X tops out at ~10,000 file handles per process), though older OSes were typically far more restrictive (sometimes as low as 256).


You asked "would this lock the file"? You're referring to mandatory locking, which is enforced on some OSes in some cases (mainly DOS / Windows). It is not a good idea to use this as a "feature", in my opinion.

 

Oh, so it's not a portable "feature", then. I thought it would be a nice "feature" for some important files, for example the settings file, the map file, etc.

 

How about using abort()? If I recall correctly, abort will not cause buffered data in open files to be saved. Can that also be used as another "feature"?
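For what it's worth, whether abort() flushes open streams is implementation-defined per the C standard, so it isn't a portable "feature" either. A POSIX-only sketch that measures what happens locally (the fork/waitpid scaffolding and the file name are just for the demo):

```c
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

/* POSIX-only: the child writes through a fully buffered stream and
   calls abort() without flushing. How many bytes survive in the file
   is implementation-defined (on modern glibc, usually none). */
long bytes_surviving_abort(const char *path)
{
    pid_t pid = fork();
    if (pid == 0) {                      /* child */
        FILE *f = fopen(path, "w");      /* creates/truncates the file */
        if (!f) _exit(1);
        setvbuf(f, NULL, _IOFBF, 1 << 16);
        fputs("unsaved settings", f);
        abort();                         /* no fflush, no fclose */
    }
    int status;
    waitpid(pid, &status, 0);

    FILE *f = fopen(path, "r");
    if (!f) return -1;
    char buf[64];
    long n = (long)fread(buf, 1, sizeof buf, f);
    fclose(f);
    return n;
}
```

Because the result varies by C library and version, relying on abort() to discard writes would be even less portable than relying on mandatory locking.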

 

Seems like everyone is leaning towards closing it as soon as possible. Then I have to ask, is it common to parse a file while reading it? I've always copied files to strings first, but that just seems like one wasted pass.


Seems like everyone is leaning towards closing it as soon as possible. Then I have to ask, is it common to parse a file while reading it? I've always copied files to strings first, but that just seems like one wasted pass.

As soon as possible means "as soon as you are done reading/writing". No one is suggesting that you modify your reading routines to speed up how soon you close a file...


... suggesting that you modify your...

 

Oh no, I didn't mean to imply that. I am honestly just wondering how others parse their files, either parse while reading or parse after copying to string and closing file.


Oh no, I didn't mean to imply that. I am honestly just wondering how others parse their files, either parse while reading or parse after copying to string and closing file.

It really depends on the size of the file.
 
If I am reading a 1 GB+ file, I read it very differently than I would a 100 KB file.
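For the large-file case, one option is to process fixed-size chunks instead of slurping the whole thing, so memory stays bounded by the chunk buffer regardless of file size. A toy byte-sum sketch (the function name and chunk size are arbitrary):

```c
#include <stdio.h>

/* Sum every byte in a file, reading it in 64 KB chunks rather than
   loading it all into memory. Returns -1 if the file can't be opened. */
long checksum_file(const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f) return -1;
    unsigned char buf[64 * 1024];   /* bounded buffer, not the whole file */
    size_t n;
    long sum = 0;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0)
        for (size_t i = 0; i < n; i++)
            sum += buf[i];
    fclose(f);
    return sum;
}
```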


Besides the size issue, it depends on how you can process the file's data. If you can process the data in an event-driven fashion (much like a SAX parser parsing XML data), I'd tell you to go with a stream-based approach instead of loading all the data into memory.

 

I think this approach leads to lower memory usage and less heap fragmentation (or fewer stack overflows, if you're allocating your buffers on the stack).
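A minimal stream-based sketch, processing each record as it is read rather than after slurping (here a line counter; for simplicity it assumes lines fit in the buffer, since a longer line would span multiple fgets calls):

```c
#include <stdio.h>

/* Count lines by streaming the file through a small fixed buffer.
   Memory use is bounded by the buffer, not the file size.
   Returns -1 if the file can't be opened. */
long count_lines(const char *path)
{
    FILE *f = fopen(path, "r");
    if (!f) return -1;
    char line[256];
    long n = 0;
    while (fgets(line, sizeof line, f))
        n++;   /* "handle the event": process the line, then forget it */
    fclose(f);
    return n;
}
```

The same shape works for any parse that can consume one record at a time.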

