Technologies and languages come and go; algorithms and data structures are the only things you can rely on.
When it comes to languages, it doesn't matter if you are programming in C, C++, Java, Perl, Python, Ruby, Eiffel, Erlang, shell script, or something else entirely. They come and go. Use whatever gets the job done.
The same goes for technologies. Whether the app gets written in MFC, WPF, Mono, Qt, or something else, use whatever gets the job done at present.
Let's imagine it in other disciplines:
There will be better musical instruments than mine, so I'm not going to play.
There will be better paints, brushes, and canvases than mine, so I'm not going to paint.
There will be better hammers, nails, and building materials than mine, so I'm not going to build.
There will be better landscaping materials than mine, so I'm not going to landscape.
You say you have the basics down. That is enough. Go from there. If you find a tool or technology useful, then employ it; otherwise, just keep going and do your best with what you have.
If you spend your days just trying to "sit back and wait for the future", you will find the present has passed you by while you were waiting. You may suddenly discover you are left with nothing but "could have" and "should have" to keep you company.
I'm a little confused by your points, though, because on one hand, "use whatever gets the job done" seems to drift towards the strengthening-the-basics side.
I expect everybody agrees in theory about using whatever technology gets things done. But when it comes to employment, that simply doesn't seem to be the case.
Note I didn't say to just sit back and do nothing. I'm saying sit back and let it unfold while strengthening the basics and occasionally peeking into technologies that sound interesting.
Also, I don't agree with the other-disciplines analogy; as far as I know, very few disciplines change as fast as the electronics industry.
And people don't judge, hire, or endorse you because of the instruments you use, as long as you can get the job done. If you are a painter, they just care whether you know how to paint; if you are a musician, people just want you to create great music; the tools are irrelevant.
But from what I've been reading about the IT industry, that simply does not look like the case. Interviewers will ask you about design patterns, and they'll expect you to be an expert in the latest and coolest technology, even though by the time they decide to upgrade, another cooler and better technology might have made it completely obsolete.
One example is NHibernate: we use NHibernate in the product at work, and I figured it was probably important to learn since it has gained a lot of popularity. But then we had a meeting where basically everybody said NHibernate is not very good for large-scale enterprise solutions and many DBAs hate it, and we decided to use code generation instead.
So what's the point of trying to learn these technologies when you probably aren't going to need them? By the time you become an expert in one, you'll probably need to learn something completely new again. In that regard, isn't it more important to get the foundations down? They'll help you understand the new technology and make the learning curve much more manageable when the time comes.