We are approaching a world where there's no such thing as "fast enough."
Sure, unless you have a truly optimal solution that simply cannot be improved upon, there's always some benefit to further optimization.
That's great in theory, but in practice there are other considerations. It takes time and effort to optimize software, and at some point we have to accept that what we have is "good enough" to ship. It would be nice to optimize further so that start-up times are a bit quicker, more resources are left free for other programs, and we're more environmentally friendly. But if the start-up time isn't overly long and customers consider the performance acceptable, it's often hard to justify the continued work, especially when someone else is paying the bills.
Other than that, I agree with the overwhelming majority of the excellent comments above, as well as those in the topic "optimization philosophy and what to do when performance doesn't cut it" (also linked above by Eck).
This is computer science, not computer voodoo: you should always use your tools to make proper measurements so that optimization is an intelligent, properly informed process. Optimizations are a good thing and are often necessary, but we also shouldn't use the existence of these tools, or some misguided philosophy, as an excuse to write bad code or to skip obvious, well-known improvements in the first place.
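To make "measure first" concrete, here's a minimal sketch using Python and its standard-library cProfile module (the functions naive_sum and builtin_sum are hypothetical stand-ins for whatever two candidate implementations you're comparing):

    # Profile before optimizing: let measurements, not intuition, decide.
    import cProfile

    def naive_sum(n):
        # The "obvious" version we suspect is slow: an explicit Python loop.
        total = 0
        for i in range(n):
            total += i
        return total

    def builtin_sum(n):
        # Candidate improvement: the built-in sum() runs its loop in C.
        return sum(range(n))

    if __name__ == "__main__":
        # cProfile reports call counts and cumulative time per function,
        # so we can see exactly where time goes before rewriting anything.
        cProfile.run("naive_sum(10_000_000)")
        cProfile.run("builtin_sum(10_000_000)")

If the profile shows the "slow" version already accounts for a negligible fraction of the run time, that's your cue that further optimization there isn't worth the effort.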