How many lines of code does an average prog.....

quote:Original post by Arild Fines
Measuring programming progress by lines of code is like measuring aircraft building progress by weight -- Bill Gates
Interesting... upon googling for that, I find no occurrences of it being punctuated properly.
quote:Original post by Big Sassy
My point wasn't that the code was faster. My point was Programmer A, B, and C all follow the same design. There was no thought put into how the code works, just how to reduce the lines of code. Programmer D is a better example because it's a "better" design. Granted, it's not the greatest example. But once again, at that small a level you don't really have to worry about the design of the code too much other than whether it works or doesn't work (unless it's causing a bottleneck). Wouldn't you agree?


When I wrote programmer B's response, I was merely showing how different coding practices lead to fewer lines of code, not implying that either was a better design. Any program can be implemented on one line; I merely showed how this disparity could occur with sanely written code.

You say that programmer D's design is "better". I fail to see how that's true. There are architectures out there where bitshifts are slower than multiplication and addition; there are (rare) architectures out there where addition and multiplication are the same speed.

If I were looking at programmer D's code in a code review, these are the things I would conclude:

1. Programmer D realizes that a+a is the same as a*2 in functionality.
2. Programmer D thinks that a+a is faster than a*2.
3. Programmer D thinks that the compiler is not smart enough to change a*2 into a+a if necessary.

2 may or may not be correct. 3 most certainly is not, with any semi-modern compiler. Furthermore, a+a certainly obscures the purpose of the statement in comparison to a*2.
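
It would take all of two minutes to check, too. Here's a minimal sketch (the function names are mine, and I'm assuming gcc, but any compiler that can dump assembly will do):

// Compile with "g++ -O2 -S" and diff the assembly for these two
// functions; a halfway decent optimizer typically emits the exact
// same instruction for both.
int times_two(int a)
{
    return a * 2;
}

int plus_self(int a)
{
    return a + a;
}

If the two listings come out identical, the "optimization" bought exactly nothing.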

Now, here's how the conversation would go.
Me: So you used a+a instead of a*2?
D: Yeah, because it'd be faster.
Me: Did you determine whether it's actually faster?
D: No.
Me: Did you determine whether the increase in efficiency, if there was one, would have any impact on the program's performance?
D: No.
Me: Does the micro-optimization decrease readability?
D: Possibly.
Me: Is the micro-optimization potentially worthwhile?
D: Yes.
Me: Has the micro-optimization been shown to be worthwhile?
D: No.

Programmer D has the best of intentions, and he understands ALUs. But he needs to learn that optimization is counterproductive without verification.
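
And verification doesn't have to be fancy. Even a crude harness like this sketch (plain old clock(), an arbitrary iteration count, and a volatile sink so the optimizer can't delete the loops outright) would have settled the question:

#include <cstdio>
#include <ctime>

int main()
{
    // volatile forces every store to happen, so the loops survive
    // optimization and we actually time something.
    volatile int sink = 0;
    const long N = 100000000L;

    clock_t t0 = clock();
    for (long i = 0; i < N; ++i)
        sink = (int)i * 2;
    clock_t t1 = clock();
    for (long i = 0; i < N; ++i)
        sink = (int)i + (int)i;
    clock_t t2 = clock();

    printf("a*2: %ld ticks\na+a: %ld ticks\n",
           (long)(t1 - t0), (long)(t2 - t1));
    return 0;
}

On anything remotely modern you'd expect the two numbers to come out the same, which is rather the point.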

How appropriate. You fight like a cow.

[edited by - sneftel on June 5, 2003 7:01:12 PM]
quote:Original post by Sneftel
stuff...
Yeah, I see what you're saying. You're right. I put "design" in quotes because it wasn't the greatest example of good design. But now that I think about it again, it's just a plain bad example. I completely agree with everything you said.

EDIT - reworded it a little.

EDIT 2 - And now that I look back through the posts, I didn't notice that you started the Programmer A, B thing. I didn't post in reference to what you were saying. I was referring to a programmer taking longer to write lines of code because of time spent designing the best solution rather than just coding the first thing that comes to mind. Sorry about that.

[edited by - Big Sassy on June 5, 2003 7:20:16 PM]

[edited by - Big Sassy on June 5, 2003 7:28:20 PM]
quote: There are architectures out there where bitshifts are slower than multiplication and addition


Seriously?
My main interest is the Intel/Nvidia/AMD chipsets (I think the CPU is responsible for the bit-shifts, but maybe the GPU does them as well sometimes).

In the book, "Game Programming All in One", the author used bit-shifts as opposed to division (he did >>2 as opposed to /4). I took that to heart and decided to use bit shifts instead of division whenever I could.

In his example, the line of code that did ">>2" ran about 480,000 times every frame. In places like that, it is good to know which method is fastest.
quote:
In the book, "Game Programming All in One", the author used bit-shifts as opposed to division (he did >>2 as opposed to /4). I took that to heart and decided to use bit shifts instead of division whenever I could.

In his example, the line of code that did ">>2" ran about 480,000 times every frame. In places like that, it is good to know which method is fastest.


As has already been mentioned, any reasonably modern compiler will recognise divisions by a constant like 4 and compile them as bit shifts anyway. You are just proving Sneftel's point: you have good intentions, but you are not checking whether these micro-optimisations are actually doing anything.
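
And it cuts the other way, too: blindly swapping in the shift yourself can change the answer. A quick sketch (function names mine):

// Unsigned division by a power of two really is just a shift:
unsigned int udiv4(unsigned int x)
{
    return x / 4;   // the compiler emits x >> 2
}

// Signed division is NOT a plain shift. C/C++ division truncates
// toward zero, while an arithmetic right shift rounds toward
// negative infinity: -7 / 4 is -1, but -7 >> 2 is -2. So the
// compiler emits a shift plus a small sign fixup here, and a
// hand-written >>2 on signed data would be a bug, not a speedup.
int sdiv4(int x)
{
    return x / 4;
}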

Alan
"There will come a time when you believe everything is finished. That will be the beginning." -Louis L'Amour
I just noticed that this thread (I don't know whether intentionally or not) is the ultimate lesson in micro-optimisation. Everyone started jumping in with "oh, let's bitshift", "let's add instead of multiply", but no one looked at optimising the algorithm.

Programmer B writes:

for(i = 0; i < MAX_FOO; i++)
{
    if(foos[i] == bar)
        return dblfoo ? baz[i]*2 : baz[i];
}


But read carefully. The if block ALWAYS returns on the first for loop iteration, whether it matches or not. The best optimisation here is to remove the for loop entirely, and change the array indexes to zeros.

if(foos[0] == bar)
    return dblfoo ? baz[0]*2 : baz[0];


You may be able to shave a few cycles by optimising the instructions, but if I can find a better algorithm, I will always beat you for speed.
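
For instance, when a scan like that really does have to walk the whole array, something like this sketch would do it (the map and the not-found return value are my own guesses; the thread never shows what happens when the loop falls through):

#include <map>

// Build the foo -> baz mapping once during initialization instead
// of scanning the arrays on every call.
std::map<int, int> fooToBaz;

int lookup(int bar, bool dblfoo)
{
    std::map<int, int>::const_iterator it = fooToBaz.find(bar);
    if (it == fooToBaz.end())
        return 0;   // assumed "not found" value
    return dblfoo ? it->second * 2 : it->second;
}

That's an O(log n) lookup instead of an O(n) scan, which no amount of bit-twiddling inside the loop will match.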

Alan
"There will come a time when you believe everything is finished. That will be the beginning." -Louis L'Amour
quote:Original post by AlanKemp
The if block ALWAYS returns on the first for loop iteration, if it matches or not.


No, it doesn't.
I guess this isn't really important, but a day or two ago I wrote 3k lines of code in a few hours... And it all worked :D
AlanKemp, what world do you live in? Your algorithm might be faster; however, who cares when it doesn't get the damn job done.

Anyway, assuming that the compiler will optimise a chunk of code is a bad idea unless you actually know it definitely will.
/smacks head against wall.

What was I just saying about actually reading the code...

Sorry about that :-)

Alan
"There will come a time when you believe everything is finished. That will be the beginning." -Louis L'Amour

