The fallacy is in believing that every possible world state has a corresponding 32-bit seed. Those seeds only produce an infinitesimal subset of all possible world states, so "almost all" world states do not correspond to any small seed. If you study one-way functions, you will see why you can go "one way" from a small seed to a huge, interesting world: the procedural algorithm has inherent structure that makes those particular worlds easy to encode in a small seed (because that's how you defined it, e.g. place a planet here, scatter some rocks there). Going the opposite direction is not something we know how to do efficiently, because once you change the world so that it no longer falls in that set of "simple" worlds, the structure is broken and you would need a new procedural algorithm. And designing such an algorithm for an arbitrary world is hopeless, given the number of variables involved in describing one.
And either way, you would not save anything, by a simple information-theoretic argument: you might save a little for some worlds and lose a little for others, but on average the seed would have to contain as much information as the entire world state, and would therefore be roughly the same size as a typical save file (plus you now pay encoding/decoding time on top). It's a fun idea, but unfortunately it doesn't work.
As an example, suppose my "world" is a single integer. Suppose my seed ranges from 0 to 31, my world states range from "1" to "around 4 billion", and I define my procedural algorithm as P : x -> 2^x. Then I can reach a bunch of interesting worlds, like world "1", "4", "1024", "32768", etc., but ultimately I can only reach 32 of them through my procedural algorithm. Now suppose I pick world "1024", but then my character moves a rock, and the world state changes to "1025". Oops! There is no seed that generates this world.
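The toy world above can be sketched in a few lines (the function names are illustrative, not from any real engine):

```python
# Toy "procedural generation": a 5-bit seed (0..31) maps to a world state
# via P(x) = 2^x. Only 32 of the ~4 billion possible integer world states
# are reachable, so a slightly mutated world has no seed at all.

def generate_world(seed):
    """P : x -> 2^x, defined for seeds 0..31."""
    return 2 ** seed

reachable = {generate_world(s) for s in range(32)}

print(1024 in reachable)  # True: seed 10 generates world 1024
print(1025 in reachable)  # False: moving one rock leaves the reachable set
```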
So I can try a couple of things. I could say that you subtract 1 before saving the seed and add it back when loading the world; the seed for this world is then the pair (10, 1), since 1025 - 1 = 2^10. Looks good, until you realise that on average the offset won't be 1 but some large number with no special properties, so your seed has now been bumped up to around 37 bits (instead of the 5 bits it was before, for 0 to 31), which is actually larger than necessary, since 32 bits suffice to encode every possible world directly.
Or I could just say that, hey, 1025 is not a power of two, but there's still an x such that 2^x = 1025, namely x ≈ 10.0014. So let's store that instead; it's just a number. Great! But then you realise that you need space to store those decimal digits, and since the states range from 1 to approximately 4 billion, you'll need approximately 11 digits of precision to tell every state apart (the exact amount varying somewhat with the state). That comes out to at least a 32-bit seed, and you're back to square one.
At this point it's probably obvious: there is no encoding that will work the way you want it to. In practice, the best approach is to store the initial seed to regenerate most of the world, and then only save player actions in the traditional way (occasionally wiping player actions when too much time has passed, or the player has gone far enough away that coming back would take long enough for plants to regrow, rock formations to re-form, etc.).
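That practical approach can be sketched roughly as follows; every name here is illustrative, not taken from any real engine:

```python
import random

# Minimal sketch of "seed + action log": regenerate the bulk of the world
# deterministically from the original seed, then replay a small log of
# player edits on top. The save file is just the seed and the log.

def generate_chunk(seed, chunk_id):
    """Deterministic base terrain for one chunk, rebuilt from the seed."""
    rng = random.Random(f"{seed}:{chunk_id}")  # per-chunk deterministic RNG
    return [rng.randint(0, 9) for _ in range(16)]

def load_chunk(seed, chunk_id, edits):
    """Base terrain plus the replayed player edits for that chunk."""
    chunk = generate_chunk(seed, chunk_id)
    for index, new_value in edits.get(chunk_id, []):
        chunk[index] = new_value
    return chunk

# The whole save file: tiny compared with the full world state,
# as long as the edit log is pruned now and then.
save = {"seed": 42, "edits": {7: [(3, 0)]}}  # player flattened one tile
chunk = load_chunk(save["seed"], 7, save["edits"])
print(chunk[3])  # 0: the player's edit survives regeneration
```

Wiping stale entries from the edit log is exactly the "plants regrow" trick: dropping an edit simply lets the deterministic generator's original value reappear.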