Are there any concrete benefits of C++ streams over C file streams, beyond ease of use? Parsing files and reading text is certainly easier with C++ streams, since you can peek at the next character without advancing the file pointer, among other conveniences. However, I am hesitant to declare C++ streams more beneficial than C streams, because the standard C++ streams seem to require input to pass through a buffer that the stream interface then reads from; a certain number of elements must be read into that buffer before any read operation can complete. The side effect is that the offset of the stream's back end no longer reflects the logical file position reported by the stream interface. This matters if you need to interact with the stream through its interface and then interact with the back end alone, which breaks because the back end is left in an unexpected state: if the stream must fill a 32-byte buffer, it reads 32 bytes from the back end, and you read from the buffer. After you take 5 elements from the stream, the logical position has advanced only 5, but the back end has advanced 32, the size of the buffer.
Now, this may seem like an unimportant concern, but I have a use case that runs into several problems with this approach. I maintain a custom archive format, with a library that lets you open the files contained within as streams, just like ordinary files. The files may be compressed, and the whole archive may be encrypted, so the archive library manages an encryption stream on top of the stream containing the archive itself. If a file is compressed, a compression stream sits on top of the encryption stream. On top of everything else sits an archive stream, which simulates a normal file stream while reading from the stream containing the archive. This may sound overly complex, but it lets you use a file from an archive transparently, whether compressed or encrypted, and the modules that read and write data through streams never know the difference. The same function can read an image from an encrypted archive or from a file on disk.
My point isn't to debate how this stream hierarchy works; it is a successfully implemented, extensible framework built on a stream model like C's. The problem is that an interface like C++ streams, which requires a stream buffer, would take away any control over where the back end's file pointer actually is, and would rule out unbuffered streams.
So, put another way: how is the C++ stream model not problematic? Does the issue of unbuffered streams ever come up for anyone? How about non-rewindable streams? At one point I had a stream type that paired a font with a frame buffer: writing characters to the stream printed them to the frame buffer. Because the stream was backed by a static image and a raster position, there was no way to measure how far back to move or to overwrite characters already written, so the stream could not be rewound. There are also non-rewindable input streams. If such a stream is wrapped in a buffered stream as described above, the back end ends up positioned ahead of the logical offset in order to fill the buffer, and when all is said and done, the unused characters between the logical offset and the physical offset where the buffer ended are lost, because you cannot seek backward.