# Somewhat confused about computer speed

## Recommended Posts

ChJees    141
I think I have grown paranoid about computers. Nearly all the time when I am coding something I try to make it as small as I can, and I don't know why. I seem to think that doing one repeated action 100 times per second will slow down the computer, even though it runs smoothly in practice. Is something wrong with my mindset?

##### Share on other sites
SeraphLance    2603
You're underestimating the power of a computer. I did it a lot too when I was starting out. To show an example, I wrote a quick python script:

```python
for x in xrange(1000):
    x * 2
```

All this does is multiply every number from 0 to 999 by 2. It finished before I could blink. Raising the count to 10000 adds only a slight delay (about a tenth of a second).
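If you want to verify this kind of thing yourself, Python's standard `timeit` module measures it directly (a sketch; the exact numbers depend entirely on your machine):

```python
import timeit

def work(n):
    # The same toy loop: multiply every number from 0 to n-1 by 2.
    for x in range(n):
        x * 2

# Average time for one full pass over 1000 numbers, in seconds.
seconds = timeit.timeit(lambda: work(1000), number=100) / 100
print("one pass of 1000 multiplications: %.6f seconds" % seconds)
```

On any modern machine this prints a tiny fraction of a second, which is the whole point.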

Everyone has different opinions on optimization. In general, my best advice is not to worry about it at all until your code actually does run slow. There are some powerful profilers out there that can show you exactly where the problem is. Programmers have a habit of not being able to find their own bottlenecks, much like writers overlooking their own awkward language.

##### Share on other sites
ChJees    141
I'm not alone then :).

But is accessing a file 100 times per second just to check its size slowing down the computer? (I have seen the CPU activity in my program go from 30% to 70%, but it runs smoothly.)

##### Share on other sites
Antheus    2409
The only way you can slow down the computer is by lowering the CPU's clock frequency. You can also stall the CPU by lowering the bus frequencies, making the CPU wait on input/output.

But other than that, the CPU is ticking away at a constant rate, whether it's doing anything useful or running idle.

The question of "speed" is how well you can utilize these ticks you're given.

Quote:
 But is accessing a file 100 times per second just to check its size slowing down the computer?

Do you need to check it that often? If yes, then that's how long it's going to take. If no, then it's time to refactor the code.
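If you do need to check that often, the cost is easy to measure directly: checking a file's size is a single stat call. Here is a sketch that times 100 of them, the rate the question asks about (using a throwaway temporary file; absolute numbers vary by OS and filesystem):

```python
import os
import tempfile
import timeit

# Create a throwaway file to stat.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"some bytes")
    path = f.name

# Time 100 size checks, i.e. one second's worth at the rate in question.
seconds = timeit.timeit(lambda: os.path.getsize(path), number=100)
print("100 size checks took %.6f seconds" % seconds)

os.remove(path)
```

The result is typically well under a millisecond in total, so 100 checks per second is a trivial load by itself; high CPU usage usually comes from whatever else the polling loop is doing.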

There are also profilers that can tell you what is taking the longest.
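For example, Python ships a profiler in its standard library. A minimal sketch with `cProfile`, using two made-up functions to show how the hot one stands out in the report:

```python
import cProfile
import pstats
import io

def slow_part():
    # Deliberately heavy loop: this is the bottleneck.
    total = 0
    for i in range(200000):
        total += i * i
    return total

def fast_part():
    return sum(range(100))

def program():
    slow_part()
    fast_part()

# Run the program under the profiler.
profiler = cProfile.Profile()
profiler.enable()
program()
profiler.disable()

# Print the top functions by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
print(report)
```

The report lists `slow_part` at the top, which is exactly the "what is taking the longest" answer you want before optimizing anything.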

Quote:
 i try to make it as small as i can

Size and performance are not closely related. Short code doesn't even remotely mean fast code.

##### Share on other sites
Spoonbender    1258
Quote:
 I seem to think that doing one repeated action 100 times per second will slow down the computer, even though it runs smoothly in practice. Is something wrong with my mindset?

Yes, you're assuming that computers are slow.
How about this? In ideal circumstances, a single CPU core can execute something like 10 billion instructions per second (assuming 2.5 GHz and 4 instructions per cycle, which a Core 2 is capable of).

Your repeated action, how long does it take? 1000 instructions? At 100 calls per second, that's 100,000 instructions per second, so you could run it 100,000 times more often before spending all your CPU time on it.
Obviously, that's not worth worrying about.
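Spelled out as arithmetic, using the same assumed figures from the estimate above:

```python
# Assumed figures from the estimate above.
instructions_per_second = 2_500_000_000 * 4   # 2.5 GHz * 4 instructions per cycle
cost_per_call = 1000                          # instructions per repeated action
calls_per_second = 100

used = cost_per_call * calls_per_second       # instructions spent per second
headroom = instructions_per_second // used    # how many times more you could do
print(used, headroom)                         # 100000 100000
```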

There are plenty of things that do hurt performance on a modern CPU though, so if you want your applications to go fast, the best you can do is 1) learn how processors work, and 2) profile your code to find out which parts of it are slowing everything down.

The thing is that if some of your code takes up 0.01% of the CPU time, it just isn't worth trying to optimize it. No matter how much effort you put in, you can never achieve an overall speedup of more than 0.01%. In other words, a second spent optimizing that code is a second wasted.
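That ceiling is Amdahl's law. A sketch that plugs in the 0.01% figure from above, plus a contrasting hotspot for comparison:

```python
def max_overall_speedup(fraction, local_speedup):
    """Amdahl's law: overall speedup when `fraction` of the runtime
    is accelerated by a factor of `local_speedup`."""
    return 1.0 / ((1.0 - fraction) + fraction / local_speedup)

# Even an infinite-ish speedup of code that is 0.01% of the runtime
# barely moves the needle.
tiny = max_overall_speedup(0.0001, 1e9)
print(tiny)    # ~1.0001

# Whereas a 10x speedup of a 90% hotspot gives a real payoff.
hotspot = max_overall_speedup(0.9, 10)
print(hotspot) # ~5.26
```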

##### Share on other sites
Write a program that works correctly and meets all your requirements. If the performance of that program is not good enough you can then use a profiler to find out which sections of code are good candidates for optimisation and go about finding faster alternatives.

##### Share on other sites
wodinoneeye    1689

If you want to see it slow down, in place of your minimal operation call through 20 layers of functions and do a memory allocation and deallocation at several of the layers.

For most logic that runs only a few times per second, all the extra overhead (which is endemic in certain styles of programming) won't cause a game to slow down. But if it's something called thousands of times per render frame (pathfinding, for example), the inefficient code will start showing a significant slowdown.
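The effect described above is easy to see with a toy comparison. A sketch where the "layers" are just nested Python calls, each with a throwaway allocation (made-up functions; the ratio matters more than the absolute times):

```python
import timeit

def direct(x):
    # The minimal operation by itself.
    return x * 2

def layered(x, depth=20):
    # The same operation buried under 20 call layers,
    # each doing a throwaway allocation on the way down.
    if depth == 0:
        return x * 2
    buf = [0] * 64  # allocated and immediately discarded
    return layered(x, depth - 1)

t_direct = timeit.timeit(lambda: direct(21), number=10000)
t_layered = timeit.timeit(lambda: layered(21), number=10000)
print("direct: %.4fs  layered: %.4fs" % (t_direct, t_layered))
```

Called a handful of times per second, neither version matters; called thousands of times per frame, the layered version's overhead dominates.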

As they frequently say: "10% of the code uses 90% of the CPU resources, so concentrate performance improvements in that 10%."