We may not be able to empirically measure the goodness of the code we write and the systems we design, but we have a reasonably strong understanding of what's good nonetheless. If we could make perfect statements about what is now, and will remain forever, single, then Singleton at least wouldn't introduce that very dangerous assumption. The trouble is: we can't. Not many years ago, the assumption that the GPU was singular in any system might have seemed reasonable. Now, multi-GPU setups are common, NVIDIA Optimus (switchable integrated+discrete GPUs) is common, and Hydra (distributing graphics calls among multiple GPUs, even from different vendors) is a thing. Assumptions very often turn out wrong.
So now the point may be argued, correctly, that 'forever' is an awfully long time to be concerned about. And it is. Most software (games, anyway) will be developed over a period of 12-36 months, and have an active support cycle of about twice that if the game continues to generate revenue. So already we are predicting 3-6 years into the future. If we then base our next game on the same codebase, we extend that by another 12-36 months per iteration. We're now predicting things 4-9 years out. Not forever, but a realm very few people are consistently right at guessing about.
So, if we are weighing the productivity cost of choosing Singleton or not, we also have to weigh the unknown cost of guessing wrongly about the thing's uniqueness, multiplied by the likelihood of our being wrong. On the other hand, if we do not make the simplifying assumption of its singleness, we do pay a cost now, but we probably have a decent idea of what that cost is, and being more general rather than less is unlikely to cause us any unexpected grief later on.