

#1 JTippetts

Posted 20 March 2013 - 10:17 PM

I spent many years hand-tuning assembly for EGA, CGA and later VGA/Mode X. Back then, bare-metal programming was not only "all the rage", it was necessary if you wanted to achieve acceptable levels of performance. You had to know your assembly. Now, though... while I won't argue that knowing things at the assembly level isn't useful (all knowledge is useful, in one way or another) and fun, it is inarguably far less vital. It is possible to become a very competent developer without ever touching it. In fact, I'd even go so far as to say that for developers up to a certain (undefined) level of expertise, it might be dangerous to dabble too deeply in assembly. Writing assembly now is pretty far removed from what it once was. Optimization is just so different from what it used to be that you are almost always better off letting the compiler do it rather than trying to do it yourself. And I'm not convinced that it really can help you become a better programmer anymore, unless you're a kernel or driver hacker. Modern programming takes place on top of so many layers of abstraction that the actual details of the underlying hardware are almost irrelevant. And you certainly won't find very many job listings that list it as a requirement. It arguably might make you marginally better at debugging, but if you are planning your education arc around something that will only come in handy in a small number of cases, I worry you might be wasting your time.

Now, if you're doing it for the joy/learning/bragging rights, or you want to be a kernel developer, a driver developer, or a compiler developer, then knock yourself out. But beyond a small handful of niche specialties, it's just not all that useful anymore.

