Difference between clock tick and clock cycle?

Started by
0 comments, last by Promit 9 years, 3 months ago

I am working on timing in a game loop, which led me to researching how a CPU works. Along the way I got confused about what two different terms mean. From what I understand, a clock tick is one full pass through the instruction cycle: fetch -> decode -> execute. My impression was that a clock tick and a clock cycle are the same thing, but according to this post on Stack Overflow they are not.

So is the sole purpose of a clock tick to measure system time, while a clock cycle measures the speed of the CPU in hertz, i.e. the number of cycles the CPU completes per second? Is that correct? Some clarification would be nice. Thanks.


There are many, many different clocks in any given computing device. You always need to be clear about which clock you're talking about, or the terminology is pointless. Clock cycles typically refer to the clock signal driving a processor, usually the CPU or occasionally the GPU, but those are not the only clocks around.

SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.

This topic is closed to new replies.
