# File I/O overwriting


## Recommended Posts

What I'm trying to do is search through a .dat file, and when (or if) I find a certain line, I want the next 15 lines in the file to be overwritten by 15 strings the user has input. I know that if I use only std::ios::out, then by default it erases the current file and then outputs the new text, and with std::ios::app it goes to the end of the file and appends the text. Is there a std::ios::ovr or something like that? Thanks. [Edited by - sarbruis on November 29, 2005 5:37:07 PM]

##### Share on other sites
That might be difficult to do, since lines are packed together. For example, a 100-byte line will overwrite two 50-byte lines. In answer to your question, though: you use ios::in|ios::out.

According to MSDN:

> ios::app — The function performs a seek to the end of file. When new bytes are written to the file, they are always appended to the end, even if the position is moved with the ostream::seekp function.

##### Share on other sites
I modified my initial post for clarity. If I use plain ios::out, it'll erase the file and then output the 15 lines, instead of just overwriting the 15 lines that I want to overwrite.

##### Share on other sites
Just rewrite the file, or at least the part at and after the insertion point.

##### Share on other sites
Write out a new file, and then arrange for it to be copied over the original if necessary. Trust us. Far, far fewer headaches.

Alternatively, look up some form of database API. In Python you can do this sort of thing quite neatly using the 'pickle' module.

##### Share on other sites
I suppose I have a problem in that my entire database is in that one file. So I guess I'll just have to read in the entire file (except the 15 lines that I don't want), write a new file, and then append the 15 new lines at the end (because order doesn't matter). Is that the best way to go about it (without switching to Python)?

##### Share on other sites
Use a real database. They are offered for other languages besides Python of course, but the documentation for Python's implementation may lead you in useful directions.

##### Share on other sites
I'll hit up Google when I've got time, but can you give me a few good links about embedding Python into C++ and using its database functionality (if you know any offhand--don't go out of your way)?

Thanks.

##### Share on other sites
Since lines are packed together... if the "data structure" you write to the file has a fixed size, won't this be okay?

I mean, if each line is written with a defined length, like

`char name[30]; int points;`

then the length of every line is fixed, so overwriting any line won't be a problem... but adding lines anywhere except after the last line would be.

Adding data somewhere in the middle of the file means keeping a copy of the data that comes after the new entry (in memory or in another file) and appending it back afterwards. If you instead append new lines to the end of the file, you simplify everything.

If this is a big file, reading and rewriting it may take a while, so you could index the entries with a key (somewhat like some database programs do). I don't know what your entries look like, but if you have a lot of them it can help. In the index you store your search key and the offset in the file where the entry is, so you don't have to rewrite the whole db file each time you want to add an entry. That way you can append all your data to the big db file and still find it quickly, in the order set by the index.

The index file would contain a key used for searching and an offset into the data file. You can also have a field in the data that marks an entry as deleted, so you don't have to remove the data from the file every time (which can mean a lot of reading and writing). Provide a function to remove those entries when you decide to (like a "compact database" operation).

This will all start to look like a database, which is why you might search Google for libraries that already implement all of this. But if you only have small files, I suggest that you just rewrite the files. Also, an index file (or a header in the db file) only helps speed if you have a key you can search on. If you only write strings to the db file, the index file will be bigger, and if you index too many fields you will slow things down.

So in the end, the approach you choose depends on the problem you have and how much work you are ready to do. Reading from one file, overwriting one field, and appending the rest of the file might be simpler and easier to implement, but on the other hand, if you have big files it will just take too long to update.

JFF

##### Share on other sites
What do you mean by big files? I'm thinking that the max size will be something like two to four million bits or so (just text). Surely it won't take too long, even if I'm copying the file for every modification (and appending is easy--I just open the file and write).

The database is the inventory for a store (each item takes up 15 lines). And there really shouldn't be more than a few thousand items (I don't even think we have a thousand currently).

##### Share on other sites
If you have fixed-length lines and a large file (> 20 MB), I would consider a memory-mapped file.

You get a pointer to a piece of memory that is actually a file, but you can treat it as normal memory :) (thanks to the paging MMUs in hardware nowadays).

It's a lot faster than normal I/O (reading, reordering, writing).

Look up these functions in MSDN to get an idea of how they work:

CreateFileMapping
OpenFileMapping
MapViewOfFile
MapViewOfFileEx
UnmapViewOfFile
FlushViewOfFile
CloseHandle

Hope that helps.