"I am at the level of how does a command turn into any sort of electronic signal."
It's not easy to answer this question, because a "command" is an abstract concept. There is no such thing as a "command" at the hardware level, nor is there a "stack" or "memory". The deeper down you go, the more concrete everything gets, and concepts created through abstraction lose their meaning.
It may be worth looking into automata theory. Computers in their "lowest software form" consist of combinational logic, which is stateless and time-independent (it's just boolean functions). In the early days, combinational logic was wired together directly in hardware, by hand, to create a machine capable of "executing instructions". For modern CPUs this became infeasible because it is so inflexible, so hardware description languages (Verilog, VHDL, SpinalHDL or whatever) were devised, making it possible to describe boolean circuits, finite state machines and the like in software. These descriptions are synthesized and then fabricated (or, in the case of FPGAs, programmed) into a hardware device, creating something that behaves like a modern CPU. Part of that control logic is what we know as "microcode", which is a stupid name in my opinion, because it creates the false idea that the CPU is "running a program", when in fact microcode conceptually operates well below the level of a Turing machine.
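To make the distinction concrete, here is a minimal Python sketch (my own illustration, not anything from a real CPU design): combinational logic is just a pure boolean function, and sequential logic is that same kind of function plus a register holding state. The full adder and the 2-bit counter below are hypothetical examples.

```python
# Combinational logic: a stateless, time-independent boolean function.
# A 1-bit full adder, expressed purely with boolean operations:
def full_adder(a, b, cin):
    s = a ^ b ^ cin                    # sum bit
    cout = (a & b) | (cin & (a ^ b))   # carry-out bit
    return s, cout

# Sequential logic = combinational logic + a register that holds state.
# Hypothetical 2-bit counter: the next state is a pure function of the
# current state; the "register" is just whatever variable holds `state`.
def next_state(state):
    return (state + 1) & 0b11  # wraps 3 -> 0
```

A finite state machine is nothing more than `next_state` evaluated once per clock tick, with the result latched back into the register.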
If you go deeper, you will see that combinational logic can be implemented in many different ways in hardware. It is not restricted to just two voltages, nor even to voltages at all. One could, for example, build an electrical circuit that computes boolean functions using analog current levels, and then quantize those levels to form the concept of a "digital value". The number of quantization levels can be arbitrary. One might even go further and implement an optical or mechanical solution for computing boolean expressions.
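Quantization is the step that turns a continuous analog level into a discrete "digital value". A tiny sketch of the idea (the 3.3 V supply and the level counts are assumptions for illustration):

```python
def quantize(voltage, levels=2, vmax=3.3):
    """Map an analog voltage to the nearest of `levels` evenly spaced codes."""
    step = vmax / (levels - 1)          # spacing between quantization levels
    code = round(voltage / step)        # snap to the nearest level
    return max(0, min(levels - 1, code))  # clamp to the valid range
```

With `levels=2` this is ordinary binary logic (anything near 0 V reads as 0, anything near 3.3 V reads as 1); with `levels=4` you get a multi-level encoding, as used in e.g. multi-level flash cells.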
But sticking to electrical circuits: these days, combinational logic is largely implemented using CMOS technology, i.e. complementary pairs of transistors that switch "on" and "off" (an over-simplified way of thinking about it, but good enough here).
So, to answer your question to some degree: a specific CPU "command" or "instruction" is encoded in memory as a series of transistors that are either turned on or off, most likely located in one of the chips on one of your RAM sticks. The fetch of these bits into the CPU's instruction cache happens over a bus (a parallel routing of copper wires) using some standard protocol. Inside the CPU, the bits land in a piece of memory that was never laid out by hand: it was described in a hardware description language, synthesized into a network of gates, and implemented in CMOS technology, organising a number of transistors in such a way that they behave like memory. Depending on what the instruction is, it triggers a series of state transitions in the CPU (the transition functions also being described in a hardware description language and baked into hardware), eventually producing some result data that is computed and transferred to some other place in your computer.
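The fetch/decode/execute cycle described above can be sketched in a few lines. Note that the instruction set here is entirely made up for illustration (a hypothetical 8-bit word with a 4-bit opcode and a 4-bit immediate); it does not correspond to any real CPU:

```python
# Hypothetical toy ISA: opcode 0b0001 = "add immediate to register 0".
# An "instruction" is nothing but a bit pattern sitting in memory.
MEM = [0b0001_0110]  # encodes: ADD 6

def step(regs, pc, mem):
    word = mem[pc]            # fetch: read the bit pattern at the program counter
    opcode = word >> 4        # decode: upper 4 bits select the operation
    operand = word & 0x0F     # decode: lower 4 bits are the immediate value
    if opcode == 0b0001:      # execute: one state transition of the machine
        regs[0] = (regs[0] + operand) & 0xFF
    return regs, pc + 1       # the program counter advancing is also state
```

Everything a real CPU does is a vastly elaborated version of this loop, with the "if opcode == ..." decoding implemented as combinational logic rather than Python.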
Going deeper, it is important to realise that the analog domain is a superset of the digital domain, meaning that any digital circuit is by definition an analog circuit. A circuit that evaluates a boolean expression exhibits complicated analog phenomena (which can be ignored if you aren't directly involved with the fabrication of the circuit). If you plot the voltage and current while a complementary MOSFET pair switches from "off" to "on", you will see that the transition is anything but a clean, instantaneous step. While the circuit will perform as expected most of the time, random events can occasionally happen, causing a bit to be flipped every now and again. This is why almost every transport of data goes through some form of error check (ranging from a simple parity bit to a CRC, depending on how important the data is).
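The parity check mentioned above is the simplest of these error checks, and it fits in a few lines: XOR one extra bit onto the data so that the XOR of everything comes out to zero, then verify that property on the receiving end.

```python
def parity_bit(bits):
    """Even parity: the XOR of all data bits."""
    p = 0
    for b in bits:
        p ^= b
    return p

def parity_ok(bits_with_parity):
    """True if the XOR of data + parity bit is 0, i.e. no single-bit flip."""
    total = 0
    for b in bits_with_parity:
        total ^= b
    return total == 0
```

A single flipped bit makes the check fail; two flips cancel out and slip through, which is why more important data gets a CRC instead, as the post says.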
You can go even deeper and start looking at how the lattice of silicon atoms transports electrons; how impurities ("doping") of the lattice create an excess or shortage of free charge carriers; and how electric fields influence the way electrons are transported across differently doped regions of silicon, eventually constructing a physical chunk of atoms that behaves like a transistor.
Indeed, you can go deeper still and touch on quantum mechanics, as previously mentioned in this thread, which introduces even more weird stuff that can interfere with the normal digital operation of the circuit. This is especially relevant for very tiny transistors. But at this point we've reached a level where going deeper just gets less and less relevant.
"I would try to find halo source code by bungie best fps engine ever created, u see why call of duty loses speed due to its detail." -- GettingNifty