How could the automatic storage not be a stack?
On the venerable TMS9900, subroutine calls are performed with the BLWP instruction (branch and load workspace pointer), which implements call/return semantics using what is effectively a linked list. No stack. None, nada, not a sausage. Variables of automatic storage duration are created in scratch RAM indexed by one of the 16 general-purpose registers. I'm not sure if a C compiler was ever available for that platform, but if there was, it would have made more sense to use that native mechanism than to try to hack one of the general-purpose registers into a stack pointer (call/return would still use BLWP, since there was no alternative).
Most calls on a SPARC were done by switching register sets (register windows), with arguments passed in registers and, where possible, automatic variables kept in registers as well. SPARCs had a lot of registers. That meant that, unless the parameters or locals were huge, subroutine calls went through a chained, linked-list-like structure, not a stack. There was definitely a C compiler for the Unixes that ran on SPARCs.
If the size of the locals and arguments on a 68k or a PPC is small enough, most optimizing compilers use only registers and avoid the stack entirely. The OP's code is an example of such small code.
To assume you have to use some sort of stack structure to implement automatic variables in C or C++ is simply an invalid assumption. That's why it's a misnomer. The word means "improperly named", and it's an accurate description. Sure, you can go ahead and use the phrase "on the stack" to describe variables of automatic storage duration, and most folks will know what you're talking about, but if you take the name literally and make assumptions about the underlying implementation, you're going to run into trouble. That's exactly what happened here. I do not think it is pedantic or pretentious to explain why it's a misnomer and how its being a misnomer caused the misunderstanding.
As for the phrase "on the heap" when referring to the free store: there was never a heap in the technical computer-science sense of the word. The term originated when Unix ran on the DEC PDP-11 (as did "on the stack"). This was before the era of virtual memory, and each process had its own fixed memory region. The executable code was in one area, another area was designated for the stack, which grew "downwards", and the rest of the memory was reserved for allocations (using malloc(), short for "memory allocation"). The allocations and the stack grew towards each other. Because the free store was always drawn at the bottom of diagrams, depictions of memory allocations looked like a big pile of boxes thrown one on top of another. Deallocations could occur in the middle, so eventually the diagrams looked like a big pile of stuff at the bottom. As Hodgman puts it, "a whole heap of bytes".
With the advent of virtual memory, this description no longer makes sense, so it too is a misnomer. It doesn't hurt to use the slang, but don't expect to find any kind of heap if you look under the hood, even though it might be a reasonable design on some systems. And don't expect to find the term in the standard, since that would imply a particular implementation where one might be inappropriate.