does bring up the idle thought of whether there could be C and C++ standards that better reflected what typical compilers and architectures already do.
for example:
defining the sizes of various numeric types (for example: "char" and "unsigned char" are 8 bits, "short" and "unsigned short" are 16 bits, ...);
defining the use of things like two's complement or IEEE floats (or at least the implementation behaving as-if these were used);
better defining how structs and their members are laid out in memory (for example, defining struct packing behavior);
defining that "char" is signed (this is how most common compilers define it), vs leaving it implementation-defined;
...
and probably also making optional some features people probably haven't really used in decades, such as trigraphs, which some common compilers (such as GCC) complain about if/when they are ever seen;
likewise, probably formally dropping old-style (K&R) function definitions ("int foo(a, b) int a, b; { ... }");
...
secondarily (more drastic / extensions):
placing some "sane" restrictions on the ordering of type qualifiers and specifiers (to ease more efficient parsing);
maybe defining some mechanism to indicate the endianness of struct members (*1);
...
*1: unlike a lot of the other factors, both big and little endian are in common use, mostly as relevant to file formats.
unlike most of the other changes listed, this would actually result in a visible effect.
all this doesn't even need to be "the standard", but maybe could be a "standardized profile".