I'm confused, however, because it typically ends up being over 1000 fps. If it drops below 400, I start to notice lag. If it were running at 60 fps it would be cripplingly slow. But 60 fps is what most games run at, isn't it? That's my understanding, anyway. Why don't the numbers I'm seeing match up with my understanding? Is this method incorrect?
It's your own code; you should be able to describe what is happening more precisely than "lag", and you should be able to see in the code where it happens.
I didn't see an answer to this question (please excuse me if I missed it):
1. I assume all of your objects are moving based on the deltaTime? If not, you will need to make that change, because otherwise things will move at different speeds on different computers.
To be clear, where in your code do you determine how far something should move in one frame? Does the time factor into it? You're almost certainly doing it wrong if frame rate breaks your game in any way other than how it looks.
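To illustrate the delta-time idea, here is a minimal sketch (Java is assumed, since the thread doesn't show the actual code; the names Mover, speed, and deltaSeconds are placeholders, not from the original project):

```java
public class Mover {
    // Speed is expressed in world units per SECOND, not per frame.
    private double speed = 200.0;
    private double x = 0.0;

    private long lastTime = System.nanoTime();

    // Call once per frame from the game loop.
    public void update() {
        long now = System.nanoTime();
        // Time elapsed since the previous frame, in seconds.
        double deltaSeconds = (now - lastTime) / 1_000_000_000.0;
        lastTime = now;

        // Distance covered this frame scales with elapsed time, so the
        // object moves the same distance per second whether the game
        // renders at 60 fps or 1000 fps.
        x += speed * deltaSeconds;
    }
}
```

By contrast, per-frame movement like `x += 5;` covers roughly 17 times more distance per second at 1000 fps than at 60 fps, which is why speed ends up depending on the machine and the frame rate.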