software and hardware interaction


The CPU is a logic controller that reads machine instructions from a program loaded into RAM. Those instructions dictate how program memory in RAM is manipulated, how input signals are interpreted, and how output signals are generated. On modern OSes, many input and output signals travel over a USB connection. Device drivers handle the actual interpretation of the hardware signals on the bus and expose them (and ways of responding to them) through an abstraction called an API, or 'Application Programming Interface': a collection of functions that generalizes everything that could happen with whatever hardware is communicating over USB.
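To make the driver/API idea concrete, here's a minimal sketch in C. It assumes a Linux system where a USB-to-serial adapter is exposed as a device file (the path /dev/ttyUSB0 is a common default, but yours may differ); the program only calls open/read/write, and the kernel driver turns those calls into actual USB transfers on the wire.

```c
/* Minimal sketch: on Linux, a USB-to-serial adapter shows up as a device
 * file (path varies; /dev/ttyUSB0 is a common default). The application
 * just calls open/read/write; the kernel driver translates those calls
 * into actual USB packets on the bus. */
#include <stdio.h>
#include <unistd.h>
#include <fcntl.h>

int main(void)
{
    int fd = open("/dev/ttyUSB0", O_RDWR);   /* ask the driver for the device */
    if (fd < 0) { perror("open"); return 1; }

    const char msg[] = "hello\n";
    write(fd, msg, sizeof msg - 1);           /* driver frames this as USB transfers */

    char buf[64];
    ssize_t n = read(fd, buf, sizeof buf);    /* driver hands back decoded bytes */
    if (n > 0) fwrite(buf, 1, (size_t)n, stdout);

    close(fd);
    return 0;
}
```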

The basic idea has been the same since the original serial and parallel ports that preceded the 'universal serial bus' of modern times.

For graphics and audio, you *could* control such things over USB, serial, or parallel, but they have become core parts of the computer in their own right because they've been essential since the dawn of the personal computer. So they have their own means of communication. But you could just as well develop your own protocol for communicating with, say, a robotic body via the audio or video output: it's all about the output signals generated and what they are intended to do on the recipient hardware.

It's actually not that complicated once you understand how a CPU itself works.


I am at the level of: how does a command turn into any sort of electronic signal?



It's not easy to answer this question, because a "command" is an abstract concept; there's no such thing as a "command" at the hardware level, nor any "stack" or "memory", etc. The deeper down you go, the more concrete everything gets, and concepts created through abstraction no longer carry any meaning.

It may be worth looking into automata theory. Computers in their "lowest software form" consist of combinational logic, which is stateless and time independent (it's just boolean functions). In the early days, combinational logic was organised directly in hardware to create an abstract Turing machine capable of "executing instructions". In modern CPUs this became less feasible because it was inflexible, so a method was devised for dynamically organising boolean circuits using a hardware description language (such as Verilog, VHDL, SpinalHDL or whatever), making it possible to describe things such as finite state machines or Turing machines in software. These descriptions are synthesized and programmed into a hardware device, thus creating something that behaves like a modern CPU. This is what we know as "micro code", which is a stupid name in my opinion, because it creates the false idea that the CPU is "running a program", when in fact the micro code is conceptually operating three classes below a Turing machine.
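To illustrate the "combinational logic is just boolean functions" point, here's a small C sketch of a one-bit full adder: the same inputs always produce the same outputs, with no state and no clock involved.

```c
/* Combinational logic really is just boolean functions: same inputs in,
 * same outputs out, no state, no clock. A one-bit full adder as an example. */
#include <stdio.h>

typedef struct { int sum, carry; } adder_out;

static adder_out full_adder(int a, int b, int carry_in)
{
    adder_out r;
    r.sum   = a ^ b ^ carry_in;                /* XOR chain */
    r.carry = (a & b) | (carry_in & (a ^ b));  /* majority function */
    return r;
}

int main(void)
{
    /* Exhaustively print the truth table -- the function *is* the table. */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            for (int c = 0; c <= 1; c++) {
                adder_out r = full_adder(a, b, c);
                printf("%d + %d + %d = sum %d carry %d\n", a, b, c, r.sum, r.carry);
            }
    return 0;
}
```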

If you go deeper, you will see that combinational logic can be implemented in many different ways in hardware. It is not restricted to just two voltages, nor is it restricted to being a voltage at all (as previously mentioned). One could, for example, build an electrical circuit that computes boolean functions using analog current levels, then quantize those levels to form the concept of a "digital value". The number of quantization levels can be arbitrary. One might even go further and implement an optical or mechanical solution to computing boolean expressions.
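Here's a rough sketch of the quantization idea in C. The threshold voltages are made up for illustration (real logic families such as TTL or CMOS define their own), and the point is that nothing forces exactly two levels.

```c
/* Sketch of quantization: a "digital value" is just an agreement about
 * which analog ranges count as which symbol. Thresholds here are invented;
 * real logic families (TTL, CMOS) specify their own. */
#include <stdio.h>

/* Two-level (binary) quantizer with a forbidden middle band. */
static int to_binary(double volts)
{
    if (volts < 0.8) return 0;    /* "low"  */
    if (volts > 2.0) return 1;    /* "high" */
    return -1;                    /* undefined: real circuits must avoid this band */
}

/* Nothing forces two levels: a four-level quantizer stores two bits per symbol. */
static int to_quaternary(double volts)
{
    if (volts < 0.9) return 0;
    if (volts < 1.8) return 1;
    if (volts < 2.7) return 2;
    return 3;
}

int main(void)
{
    double samples[] = { 0.2, 1.1, 2.5, 3.3 };
    for (int i = 0; i < 4; i++)
        printf("%.1f V -> binary %d, quaternary %d\n",
               samples[i], to_binary(samples[i]), to_quaternary(samples[i]));
    return 0;
}
```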

But sticking to electrical circuits: these days, combinational logic is largely implemented using CMOS technology, i.e. transistors that turn "on" and "off" (an over-simplified way of thinking about it).
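As a toy model of that "on/off switches" picture, here's a NAND gate expressed as complementary pull-up and pull-down networks, which is how a CMOS gate is actually structured (heavily simplified, of course).

```c
/* Toy model of a CMOS gate as complementary switches, matching the
 * "transistors turn on and off" picture above. In a real CMOS NAND, the
 * PMOS pull-up network and NMOS pull-down network are never both
 * conducting in steady state. */
#include <stdio.h>

static int cmos_nand(int a, int b)
{
    int pull_down = a && b;        /* two NMOS in series conduct -> output tied to ground */
    int pull_up   = !a || !b;      /* either PMOS conducts -> output tied to Vdd */
    return pull_up && !pull_down;  /* exactly one network drives the output */
}

int main(void)
{
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("NAND(%d,%d) = %d\n", a, b, cmos_nand(a, b));
    return 0;
}
```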

So to answer your question to some degree: a specific CPU "command" or "instruction" is encoded in memory by a series of transistors that are either turned on or off, most likely located in one of the chips on one of your RAM sticks. The fetch of these "bits" into the CPU's instruction cache happens over a bus (a parallel routing of copper wires) using some standard protocol. The bits are stored inside the CPU in an abstract piece of memory, which doesn't physically exist as such, but was instantiated via a hardware description language describing the behaviour of a collection of lookup tables inside the CPU, which in turn were implemented using CMOS technology, thus organising a number of transistors in such a way as to create something that behaves like memory. Depending on what the instruction is, it will cause a series of state transitions to happen on the CPU (the transition functions being described in a hardware description language and also programmed into hardware lookup tables), which will eventually lead to some result data being computed and transferred to some other place in your computer.
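To make that "series of state transitions" concrete, here's a toy fetch-decode-execute loop in C. The instruction encoding is invented purely for illustration; it is not any real CPU's instruction set.

```c
/* A toy fetch-decode-execute loop. Memory is a byte array, and the CPU's
 * entire state is a program counter and one accumulator register. */
#include <stdio.h>
#include <stdint.h>

enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2, OP_STORE = 3 };

int main(void)
{
    uint8_t mem[16] = {
        OP_LOAD,  10,   /* acc = mem[10]  */
        OP_ADD,   11,   /* acc += mem[11] */
        OP_STORE, 12,   /* mem[12] = acc  */
        OP_HALT,
        0, 0, 0,
        5, 7, 0         /* data at addresses 10..12 */
    };
    uint8_t pc = 0, acc = 0;

    for (;;) {
        uint8_t op = mem[pc++];               /* fetch */
        switch (op) {                         /* decode + execute */
        case OP_LOAD:  acc  = mem[mem[pc++]]; break;
        case OP_ADD:   acc += mem[mem[pc++]]; break;
        case OP_STORE: mem[mem[pc++]] = acc;  break;
        case OP_HALT:  printf("result at mem[12] = %d\n", mem[12]); return 0;
        }
    }
}
```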

Going deeper, it is important to realise that analog circuits are a superset of everything digital, meaning that any digital circuit is by definition analog. A circuit that implements the evaluation of a boolean expression exhibits some complicated analog phenomena (which can be ignored if you aren't directly involved with the fabrication of the circuit). For example, look at a plot of what the voltage and current do when a complementary MOSFET transistor pair switches from "off" to "on". While the circuit will perform as expected most of the time, random events can occasionally happen, causing a bit to be flipped every now and again. This is why almost every transport of data goes through some form of error check (ranging from a simple parity check to a CRC, depending on how important the data is).
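As a sketch of the simplest error check mentioned above, here's an even-parity computation in C: sender and receiver both XOR together all the data bits, so a single flipped bit changes the parity and is detected (though two flips would cancel out).

```c
/* Even-parity check: the parity bit is the XOR of all data bits.
 * A single bit flip in transit changes the parity and is detected. */
#include <stdio.h>
#include <stdint.h>

static int parity(uint8_t byte)
{
    int p = 0;
    for (int i = 0; i < 8; i++)
        p ^= (byte >> i) & 1;     /* XOR of all bits */
    return p;
}

int main(void)
{
    uint8_t sent = 0x5A;
    int sent_parity = parity(sent);

    uint8_t received = sent ^ (1 << 3);   /* simulate a single bit flip in transit */

    printf("sent 0x%02X (parity %d), received 0x%02X (parity %d) -> %s\n",
           sent, sent_parity, received, parity(received),
           parity(received) == sent_parity ? "looks ok" : "error detected");
    return 0;
}
```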

You can go even deeper and start looking at how the lattice of silicon atoms transport electrons, and how impurities ("doping") of the lattice cause electrons to be under- or oversaturated, and how electrical fields influence how electrons are transported across silicon lattices with different dopings to eventually construct a physical chunk of atoms that behave like a transistor.

Indeed, you can go even deeper and touch on quantum mechanics, as has also been previously mentioned in this thread, which introduces even more weird stuff that can interfere with the normal digital operation of the circuit. This is especially relevant when dealing with very tiny transistors. But we've reached a level where going deeper just gets less and less relevant.

"I would try to find halo source code by bungie best fps engine ever created, u see why call of duty loses speed due to its detail." -- GettingNifty
@TheComet, that was an awesome description of electronics, computing and even electrical engineering that I wish I could +1 multiple times :)

It's almost like the "In the beginning, the engineer said 'let there be electrons'..." of descriptions :)


There is no spoon.

Actually, if you want to pursue this to the extreme, what might have more value knowledge-wise is to find a handy YouTube video on building analog computers out of Tinker Toys or Lincoln Logs. There are probably also videos on how to create simple logic gates using light bulbs and switches, which I played with as a kid. Google Babbage's engine. Quantum computing is becoming a reality; there's a cool topic.

