Posted 09 September 2011 - 02:28 PM
It's been said before, but really, it boils down to 1) who your target end-user is and 2) what your personal preference is. If you're familiar with Windows and will be writing software for people running Windows, then by all means use Windows. Ultimately, in terms of getting stuff done it shouldn't matter, because you can set up an efficient development tool chain in either. Under Windows, the de facto standard is Visual Studio, which bundles everything you'll care to use for day-to-day work: editor, compiler, linker, debugger.
Under Linux, you're more likely to use (at first) a more decentralized tool chain. You'll pick an editor (vim, emacs, nano, joe, whatever) and do all your coding in that. More likely than not you'll use GCC for compiling and linking, on the command line. It's definitely a different approach than you'll typically see with Visual Studio, but after fumbling a bit with the options, it's not really that hard to use.

Eventually, though, compiling by hand gets bloody tedious, and you'll start looking at some sort of build system. The foundation for a lot of build systems under Linux is GNU Make and the Makefile. You may write a couple of these for some small projects, but then realize that writing these by hand tends to suck too. That's when you start learning one of the many open source build systems out there. For the longest time, and even still today, the Autotools package has been the de facto standard for building projects from source under Linux. I damn near shot myself trying to figure out how to use it effectively, and switched to CMake instead. If I were to recommend an open source build system for someone to use, it'd definitely be CMake.

Once things are built and you start running your programs, you'll be reaching for GDB (possibly with one of its many front-ends) for most of your debugging.
Now I may be making Linux sound like a PITA for day-to-day work, but honestly, once you get the initial project files set up, it's really not that bad. There are also IDEs under Linux that (I would guess) automate a lot of that setup -- tools like Eclipse CDT, for instance (note: I've never used it, so I don't know what all it provides) -- and bundle your tool chain into one place, if you like that approach. Personally, I find it useful to at least be aware of what these environments are doing for you behind the scenes, because at the end of the day they're still using a lot of the tools I mentioned above, and if anything screwy happens from within the IDE, it helps to understand exactly which tool failed when you're trying to diagnose the problem.
As far as third party library support goes, it's pretty much a toss-up. You can always find examples going both ways, where a library is easy to configure and install under one OS and not the other. But that's the "fun" with cross-platform, open source software. Linux package managers may make this easier because they'll (for the most part) know how to also install whatever additional third party dependencies are needed, but for well known and popular libraries, typically I've seen developers either 1) provide a link to the exact dependencies you need to build their software, 2) provide binary distributions of the dependencies you need, or 3) bundle the source code for the dependencies with their software. Regardless of which OS you choose, you're going to find some library or tool that requires some finagling to get working right -- it just comes with the software development gig.
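For example, on a Debian/Ubuntu-style system the package manager pulls in a library's headers and its transitive dependencies in one shot (the SDL2 package here is just one illustration):

```shell
# install the SDL2 development package; apt resolves and installs
# whatever other -dev packages it depends on automatically
sudo apt-get install libsdl2-dev
```

Contrast that with hunting down, building, and installing each dependency yourself, which is roughly what options 1) through 3) above are trying to save you from.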