[Future-Proof] Should we try to follow the new technologies? Or go back to the basics?

5 comments, last by Bacterius 10 years, 11 months ago

Hi, VERY occasional poster here :)

Thanks to the articles and advice from Gamedev, I spent quite a number of summers practicing the C++ concepts I learned in school and applying them to my game engine experiment. And now I have a job :)

Besides expressing my gratitude towards the community and sharing my joy at being able to create a rudimentary engine, I'd like to bring this topic up for discussion.

Is it better to try to follow new technology? Or is it better to sit back, wait, and prepare for the future by strengthening the basics?

Why I prefer the basics

There are new concepts and new technologies everywhere, every day: new phones, new frameworks, new IDEs almost every month (okay, I'm exaggerating a bit, but you get the idea). I'm finding it hard to keep up. And in reality, I'm a little resistant to these changes because, deep down, I think they'll be obsolete by the time I need them anyway.

Although I know the general consensus (more of an accepted fact, really) is that we need to stay current with new technologies, I prefer to stick with the basics.

As an example, I didn't even know what design patterns were when I started applying for jobs, so I just brushed up on them before the interviews and forgot all about them when I started working. I didn't even know what interfaces were at the time.

But once I learned what interfaces are, it was very easy for me to implement these patterns, even though I didn't know I was implementing one. It simply made sense: I started writing my engine as spaghetti code, it became a nightmare to debug, so I naturally started using inheritance (although I still didn't know about interfaces) and proper namespaces to organize my project.
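(Just to illustrate the kind of cleanup I mean, here's a minimal sketch. I'm using Java syntax purely for brevity; my engine was C++, and every name here is made up for the example, not taken from my actual code.)

import java.util.ArrayList;
import java.util.List;

// Anything the engine can draw implements this...
interface Renderable {
    void render();
}

// ...and anything with per-frame logic implements this.
interface Updatable {
    void update(double dt);
}

// A concrete game object just fills in the two methods.
class Player implements Renderable, Updatable {
    private double x;

    public void update(double dt) { x += 10.0 * dt; }               // toy movement
    public void render() { System.out.println("player at x=" + x); } // toy drawing
}

// The main loop no longer needs to know what the objects actually are.
public class Engine {
    private final List<Updatable> logic = new ArrayList<>();
    private final List<Renderable> scene = new ArrayList<>();

    void add(Player p) { logic.add(p); scene.add(p); }

    void tick(double dt) {
        for (Updatable u : logic) u.update(dt);
        for (Renderable r : scene) r.render();
    }

    public static void main(String[] args) {
        Engine e = new Engine();
        e.add(new Player());
        e.tick(1.0 / 60.0); // one simulated frame
    }
}

The point is just that once the loop talks to interfaces instead of concrete classes, the spaghetti untangles itself, whether or not you know the pattern's name.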

It just seemed natural to me; sometimes I even think learning design patterns up front might limit how you end up using them. Again, using my engine as an example: shortly after I started working, I realized that my engine, although it used Lua to act as the command layer (so it was kind of cheating), was almost an implementation of the MVVM/MVC pattern (sorry, I still can't quite grasp the difference).

Then I learned WCF, which I thought was a pretty awesome technology, but the configuration is a giant pain in the ass. We work with another company and our services talk to each other through WCF; sometimes it took almost a week just to identify a typo in the interfaces, and then there are all these other settings like KnownTypes, messages, sessions, etc...

What triggered this discussion: today I found the book I always wanted to read, MUD Game Programming by Ron Penton, and finally started reading it. I'm really enjoying it. But then again, this is OLD OLD technology. On the other hand, new technologies all rely on these basics; they will change and evolve, but the basics won't.

Then I looked at my bookshelf: "Assembly Language for Intel-Based Computers", "Understanding Telephone Electronics", a book on how to fix cell phones using OLD OLD GSM chips. These are books I got a long time ago that are still on my to-read list. I felt like I was killing my career, lol.

Now my beef with the new technologies

There are just too many abstractions. So many things are hidden from us that it feels more like using an application (like MS Word) than writing code. I love the libraries microcontroller vendors make: they provide the bare minimum to do simple things, so we don't have to go all the way down to the protocol level, but we have enough control to know exactly what's going on.

But the new frameworks? I've worked with Android for a bit and thought it was the most annoying framework ever. So often I had to fight just to make it compile because of the rules it imposes. I probably didn't learn how to use the UI thread properly, and I ended up having to create another thread inside the UI thread just to show a simple text change...

Sure, I could've spent more time reading about how to update the UI properly, but that's exactly the beef I have with this type of new technology: too many abstractions.
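(For what it's worth, I've since gathered that the "proper" way is pretty short once you know it. Here's a rough sketch using Activity.runOnUiThread to hop back onto the UI thread after doing the work on a background thread; the TextView, the fake slow work, and all the names are hypothetical, not what I actually wrote at the time.)

import android.app.Activity;
import android.os.Bundle;
import android.widget.TextView;

public class MainActivity extends Activity {
    private TextView statusText;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        statusText = new TextView(this);
        setContentView(statusText);
        statusText.setText("working...");

        // Do the slow work OFF the UI thread...
        new Thread(new Runnable() {
            @Override
            public void run() {
                final String result = doSomethingSlow();
                // ...then hop back ONTO the UI thread just for the view update.
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        statusText.setText(result);
                    }
                });
            }
        }).start();
    }

    // Stand-in for whatever long-running work would otherwise block the UI.
    private String doSomethingSlow() {
        try { Thread.sleep(1000); } catch (InterruptedException ignored) {}
        return "done";
    }
}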

Yet, according to what some manufacturers are saying, programmers WANTED them to ditch WM (Windows Mobile) and move to Android; it wasn't the users, it was the devs who wanted the change.

I suspect the real reason, though, is something like this (to quote the graphics guy at work): "I hate working with magentas, why can't they use alpha channels like a 21st-century technology?"

So, what do you guys think? Is it worth spending the time trying to follow as many new technologies as possible? Or is it better to occasionally take a peek at some of them while really focusing on the basics?

Technologies and languages come and go; it is only algorithms and data structures that you can rely on.

When it comes to languages, it doesn't matter if you are programming in C, C++, Java, Perl, Python, Ruby, Eiffel, Erlang, shell script, or something else entirely. They come and go. Use whatever gets the job done.

The same with technologies. If you can get an app written in MFC, or WPF, or Mono, or Qt, or something else, then use whatever gets the job done at present.

Let's imagine it in other disciplines:

There will be better musical instruments than mine, so I'm not going to play.
There will be better paints, brushes, and canvases than mine, so I'm not going to paint.
There will be better hammers, nails, and building materials than mine, so I'm not going to build.
There will be better landscaping materials than mine, so I'm not going to landscape.


You say you have the basics down. That is enough. Go from there. If you find a tool or technology is useful then employ it, but otherwise just keep going and do your best with what you have got.

If you spend your days just trying to "sit back and wait for the future", you will find the present will have passed you by while you were playing. You may suddenly discover you are left with nothing but "could have" and "should have" to keep you company.

I'm a little confused by your points, though. On one hand, "use whatever gets the job done" seems to lean towards the strengthening-the-basics side.

I expect everybody agrees in theory about using whatever technology gets things done. But when it comes to employment, that simply doesn't seem to be the case.

Note that I didn't say just sit back and do nothing. I'm saying sit back and let it unfold while strengthening the basics and occasionally peeking into technologies that sound interesting.

Also, I don't agree with the other-disciplines analogy; as far as I know, very few disciplines change as fast as the electronics industry.

And people don't judge/hire/endorse you because of the instruments you use, as long as you can get the job done. If you are a painter, they just care whether you know how to paint; if you are a musician, people just want you to create great music. The tools are irrelevant.

But from what I've been reading about the IT industry, that simply does not look like the case. Interviewers will ask you about design patterns; they'll expect you to be an expert in the latest and coolest technology, even though by the time they decide to upgrade, another cooler and better technology might have made it totally obsolete.

One example is NHibernate. We use NHibernate in the product at work, and I figured it was probably important to learn since it's gained a lot of popularity. But then we had a meeting where basically everybody said NHibernate is not very good for large-scale enterprise solutions and many DBAs hate it, and we decided to use code generation instead.

So what's the point of trying to learn these technologies when you're probably not going to need them? By the time you become an expert at one, you'll probably need to learn something completely new again. In that regard, isn't it more important to get the foundations down? They'll help you understand the new technology and make the learning curve much more manageable when the time comes.

I totally agree.

I learned assembly; totally useless now.

I learned Delphi; it got replaced by C#.

I learned C/C++; C++0x is the new thing now.

I learned Win32; it will probably be replaced soon.

... DirectX 9 -> useless now because of DX 11.

Same for OpenGL 2.0.

It's getting annoying... By the time you master something and are really good at it, it becomes useless/deprecated/falls deep into the abyss of another dimension...

Perhaps you learned the wrong parts.

I also learned assembly for several processors. The knowledge of how things work at a lower level is useful.
I learned several languages that are now dead. In the process I learned alternative patterns that I can apply to new problems.
I have re-learned C++ many times. I learned it the first time in the late 1980s. I learned it again in the mid-90s. I learned it again in the late 90s when it was standardized. I learned the tiny differences in 2003, then bigger differences in TR1, and now I've learned the many latest updates. Every time, I have become a better programmer because of it.
I have learned many versions of DirectX and have forgotten more OGL ARB extensions than most people ever learn. In the process I have learned hundreds of interesting techniques and ways to leverage processing power.


If you are only learning the technology for technology's sake, that is a sad reason to learn it.

If you are learning so you can grow and apply the knowledge broadly, then all learning is useful, because you can transfer it to new problems.

Learning languages and technology just for the sake of learning languages and technology can be fun, but it doesn't get you anywhere... Learning the concepts and patterns used in different languages, and learning how to apply them in other languages, is what makes you a better developer. Both OO languages and functional/procedural languages have been around for a long time. Most of the new languages I see still use the same concepts that were available in the first languages. So once you really learn to understand and use OO, you can apply it in any language that supports it. You only need to learn the syntax.

Pretty much this. Adaptation to a new technology is easy (unless it's a revolutionary breakthrough, which doesn't happen often) and is not the issue. The real problem is parting from the previous technology, which quite often has become so entrenched in an individual's or a company's workflow that it is difficult to upgrade without severely disrupting activities. That is the real trap, in my opinion.
