Close file sooner or later?

Started by
12 comments, last by wintertime 11 years, 1 month ago
When I need to use a file, I would open it, copy the contents into a string, and close the file. Then I feel like I can do whatever I want with the string.

What if the file is not closed until the very end? What advantage would that give? Will there be performance penalties?

Say I open this file in write mode, and not close it for some time. Would that lock the file so that the user cannot change the file with a text editor?

If you'd like, share your file handling habits too; I'd like to know how other people do it.
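The open/copy/close approach described above can be sketched like this in C++ (the function name is my own, not from the thread; the stream closes itself when it goes out of scope):

```cpp
#include <fstream>
#include <sstream>
#include <string>

// Read the whole file into a string, then let the stream close.
// The std::ifstream destructor closes the handle as soon as the
// function returns, so the string can be used freely afterwards.
std::string read_file_to_string(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    std::ostringstream buf;
    buf << in.rdbuf();          // copy the entire file contents
    return buf.str();           // 'in' is closed here; the string lives on
}
```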

I'd recommend sooner. Damian Conway's Perl Best Practices also recommends this approach, and I'd guess it's valid for other languages too. The only reason for not releasing a resource (once it's no longer required) is when the release process itself is expensive. For instance, a garbage collector delays releasing memory because doing so could be CPU- and time-consuming.

What if the file is not closed until the very end?

For example, any unhandled exception will probably halt your program, and any buffered data won't be flushed to the file.

You asked "would this lock the file"? You're referring to mandatory locking which is enforced on some OSs in some cases (mainly DOS / Windows). It is not a good idea to use this as a "feature", in my opinion.

The only benefit of leaving a file open is that you don't have to open it again if you need it later. If this doesn't fit your use-case, then you probably want to close it as soon as possible.

It's normally easier to close files as soon as possible, because you don't end up with bad "side effects" - what if someone instantiates several of your objects at once? What happens if 1 million get created at once?

The OS can run out of file handles, or you could "forget" to close them and create a leak. You probably normally want to close the file when you're finished.

Closing and opening files is normally a fairly efficient operation, particularly if the file has been accessed recently, and its metadata are in an OS cache.

What if the file is not closed until the very end?

If you open enough files, you run out of file handles.

Certain systems set this fairly low (the default config for Mac OS X tops out at ~10,000 file handles per process), though older OSes were typically far more restrictive (sometimes as low as 256).

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

@swiftcoder Fairly high, you mean ;)

You should close as soon as possible. I think it's best to use the RAII pattern to manage the file handle's lifecycle.
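In C++, the standard stream classes already provide this RAII behavior. A minimal sketch (the function is my own illustration, not code from the thread):

```cpp
#include <fstream>
#include <string>

// RAII: the std::ifstream destructor closes the handle when 'in'
// leaves scope, even if an exception is thrown mid-read, so there
// is no way to "forget" to close the file.
int count_lines(const std::string& path) {
    std::ifstream in(path);
    int count = 0;
    std::string line;
    while (std::getline(in, line)) ++count;
    return count;               // file closed here automatically
}
```

The same idea applies to C-style `FILE*` handles via a wrapper such as `std::unique_ptr` with a custom deleter.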

You asked "would this lock the file"? You're referring to mandatory locking which is enforced on some OSs in some cases (mainly DOS / Windows). It is not a good idea to use this as a "feature", in my opinion.

Oh, so it's not a portable "feature" then. I thought it would be a nice "feature" for some important files, for example the settings file, the map file, etc.

How about using abort()? If I recall correctly, abort() terminates the program without flushing buffered writes to open files. Could that also be used as another "feature"?

Seems like everyone is leaning towards closing it as soon as possible. Then I have to ask, is it common to parse a file while reading it? I've always copied files to strings first, but that just seems like one wasted pass.

Seems like everyone is leaning towards closing it as soon as possible. Then I have to ask, is it common to parse a file while reading it? I've always copied files to strings first, but that just seems like one wasted pass.

As soon as possible means "as soon as you are done reading/writing". No one is suggesting that you modify your reading routines to speed up how soon you close a file...

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

... suggesting that you modify your...

Oh no, I didn't mean to imply that. I am honestly just wondering how others parse their files, either parse while reading or parse after copying to string and closing file.

Oh no, I didn't mean to imply that. I am honestly just wondering how others parse their files, either parse while reading or parse after copying to string and closing file.

It really depends on the size of the file.

If I am reading a 1GB+ file, I read it very differently than I would a 100kb file.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]
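For a large file, "reading it differently" typically means processing it in fixed-size chunks rather than pulling everything into memory at once. A hedged sketch of that idea (my own illustration, not swiftcoder's code):

```cpp
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

// Process a large file in fixed-size chunks instead of one big read.
// Returns the total number of bytes seen; a real parser would
// consume each chunk as it arrives.
std::size_t process_in_chunks(const std::string& path,
                              std::size_t chunk_size = 4096) {
    std::ifstream in(path, std::ios::binary);
    std::vector<char> buf(chunk_size);
    std::size_t total = 0;
    while (in.read(buf.data(), static_cast<std::streamsize>(buf.size())) ||
           in.gcount() > 0) {
        // in.gcount() handles the final, possibly partial, chunk
        total += static_cast<std::size_t>(in.gcount());
    }
    return total;
}
```

Memory usage stays bounded by `chunk_size` no matter how big the file is, which is the point of treating a 1GB+ file differently from a 100kb one.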

Besides the size issue, it depends on how you can process the file's data. If you can process the data in an event fashion (much like a SAX parser parsing XML data), I'd tell you to go with a stream based approach instead of loading all data into memory.

I think this approach will lead to less memory usage and less heap fragmentation (or fewer stack overflows, if you're allocating your buffers on the stack).
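An event-style (SAX-like) parse can be sketched as a callback invoked per record while the file is still being read - here for simple key=value lines (the format and function names are my own assumptions for illustration):

```cpp
#include <fstream>
#include <functional>
#include <string>

// Event-style parsing: invoke a callback for each key=value line
// while streaming the file, instead of loading it all into a
// string first. Only one line is in memory at a time.
void parse_key_values(
    const std::string& path,
    const std::function<void(const std::string&, const std::string&)>& on_pair) {
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        auto eq = line.find('=');
        if (eq != std::string::npos)
            on_pair(line.substr(0, eq), line.substr(eq + 1));
    }
}
```

The caller supplies the "event handler", so the same streaming loop works whether the consumer builds a map, updates settings, or just counts entries.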

This topic is closed to new replies.
