However, adding DirectX very much rubs me the wrong way. The problem with OpenGL is that it is not used enough, so switching away from it makes the situation worse for OpenGL...
My initialisation code is just what is in my opening post, so I don't consciously set any weird pixel formats. What would be wrong there that would cause this situation? Also, since it works as-is on 99% of videocards, is there a way with SDL to run my current initialisation, check whether it actually produced an accelerated OpenGL context, and if not, try again with different pixel settings?
It is very possible to select a pixel format that will give you a non-accelerated OpenGL 1.1 context. PFD_SUPPORT_GDI is mutually exclusive with accelerated OpenGL, and with anything higher than 1.1.
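If you do land in that fallback, it identifies itself clearly, so you can detect it after context creation and retry with more conservative settings. A minimal sketch, not from the thread: in a real program the two strings would come from glGetString(GL_VENDOR) and glGetString(GL_RENDERER); Microsoft's software implementation reports vendor "Microsoft Corporation" and renderer "GDI Generic".

```c
#include <string.h>

/* Returns 1 if the vendor/renderer strings look like Windows' built-in
 * unaccelerated OpenGL 1.1 implementation, 0 otherwise. */
static int is_software_gl(const char *vendor, const char *renderer)
{
    if (vendor && strcmp(vendor, "Microsoft Corporation") == 0)
        return 1;
    if (renderer && strstr(renderer, "GDI Generic") != NULL)
        return 1;
    return 0;
}
```

When this returns 1, you can tear the context down and loop over progressively less demanding SDL_GL_SetAttribute combinations (e.g. drop multisampling, then drop to a 16-bit depth buffer) before giving up and showing an "update your drivers" message.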
I have seen this happen even on good Nvidia cards, so just ignoring them as "not our target audience" is definitely not an option...
All in all, if someone doesn't have an IHV driver installed, it's questionable whether you really want them as customers. I'm not talking of the latest bleeding-edge drivers, but, you know, anything at all.
They must either be quite poor (not able to afford a $10 card!?) or quite stupid. In either case, you probably don't want them as customers. The poor won't pay you, and the stupid will cost you more in support than the revenue they bring in.
I am indeed seeing that computers with multiple videocards sometimes select the wrong one. That is only a few cases, though: most affected users just have one videocard that needs a driver update. I am also seeing that the game doesn't work with Nvidia Surround (their multi-monitor-SLI-thingie).
One likely candidate for the Win7/8 problems could be that the machines have multiple GPUs: a lot of newer laptops ship with both an integrated Intel GPU and an Nvidia GPU. It is not impossible that the integrated GPU lacks OpenGL drivers while the Nvidia one has them, and that by default your application tries to use the Intel GPU. (Installing drivers manually from the IHV rather than from the system builder might override this behaviour.)
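Besides asking users to reinstall drivers, you can also hint at the discrete GPU from the application side. Both NVIDIA (Optimus) and AMD (PowerXpress) document magic global symbols that, when exported from the game's .exe, ask the driver to run it on the discrete GPU instead of the integrated one. A sketch, with a non-Windows branch added only so it compiles everywhere; these exports are harmless on machines without switchable graphics:

```c
/* Exported globals that switchable-graphics drivers look for in the .exe.
 * NvOptimusEnablement = 1 requests the NVIDIA GPU; the AMD symbol does the
 * same for PowerXpress/Enduro systems. */
#if defined(_WIN32)
__declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
__declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
#else
/* Non-Windows build: plain globals so the sketch still compiles. */
unsigned long NvOptimusEnablement = 0x00000001;
int AmdPowerXpressRequestHighPerformance = 1;
#endif
```

Note that in a C++ build these must be declared extern "C" so the names aren't mangled, and they only help when the Nvidia driver is actually installed; they do nothing about the "Intel GPU with no OpenGL drivers" case by themselves.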