Different Timing

Started by
9 comments, last by laztrezort 12 years, 6 months ago
Hey guys, my game is using a fixed time step... so I was kind of under the impression that my game would run at the same speed no matter what platform it runs on. However, I have a cutscene that is based off of timers; when I watch the cutscene on my 360 it ends exactly when it's supposed to, but on my PC it's a second or two off. Is this a difference between Xbox and PC, or could it be that it would produce different results on different PCs? I'm coding in XNA, and like I said, I'm using a fixed time step, so I'm kind of confused about why this is happening.

Thanks!
How exactly does your code handle the fixed timestep? There are a few possible causes for this but it depends a lot on how your code works.

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

The only thing I have toggled is this.IsFixedTimeStep = true, and then XNA automatically handles the frame rate from there (or so I thought!). I don't have any other code in my game relating to the time step besides the initial toggle.
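For anyone following along, the toggle described above is normally a single line in the Game subclass's constructor. A minimal sketch, using the default class name from the standard XNA 4.0 template (Game1 is the template's placeholder name); note that IsFixedTimeStep is actually true by default in XNA, so setting it explicitly just makes the intent visible:

```csharp
public class Game1 : Microsoft.Xna.Framework.Game
{
    GraphicsDeviceManager graphics;

    public Game1()
    {
        graphics = new GraphicsDeviceManager(this);

        // Ask XNA to call Update() on a fixed schedule
        // (by default, 60 times per second).
        this.IsFixedTimeStep = true;
    }
}
```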

What happens in the cutscene is that it finishes too fast. At first I was just wondering if maybe some timer calculations are different on the PC platform... but before I go and fix it I want to be sure I am fixing it correctly. Let me know if you have any more questions I can answer to help; sorry I don't have any more code to show!

sorry I don't have any more code to show!


Is there a particular reason for that? It's hard (impossible) to point out problems in code we can't see.
Yes, XNA is automatically handling the fixed time step... so when it comes to code relating to that, all I have is that initial toggle. I have no other code.

XNA will automatically adjust how fast or slow the game runs to keep that frame rate; there are certain flags I can use to see if the game is running slowly, but the problem is that the game is running too fast on the PC. My question isn't really about code, but rather a general one: is it normal to see a difference between two platforms like the 360 and the PC? Essentially, I am asking whether it is finishing early on the PC because the code is being calculated differently on the PC's hardware. My gut instinct tells me that's what's going on, but I wanted to be sure.

Yes, XNA is automatically handling the fixed time step... so when it comes to code relating to that, all I have is that initial toggle. I have no other code.


I think what they meant is your code relating to the timing of your cutscenes that you mentioned in the OP. How are you timing them? By counting the times Update is called, by counting frames (Draw), by keeping track of GameTime, or by using a dedicated timer (e.g. Stopwatch)? What happens when a cutscene ends? GC is supposedly a different animal between PC and Xbox, so perhaps on the Xbox you are seeing a delay during a collection?

Also, I believe the particulars of (fixed step) timing changed between XNA 3 & 4, so which version you are using may be important.
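Of the options listed above, a dedicated Stopwatch measures real wall-clock time and is independent of how often Update or Draw is called. A rough sketch (the field name and the 2-second threshold are made up for illustration):

```csharp
using System.Diagnostics;

// Started when the cutscene begins:
Stopwatch cutsceneClock = Stopwatch.StartNew();

// Then, checked each Update():
if (cutsceneClock.Elapsed.TotalSeconds >= 2.0)
{
    // Two seconds of real time have passed, regardless of
    // how many Update/Draw calls occurred in between.
}
```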
Oh ok, sorry if there was a misunderstanding.

In this particular cutscene I am using a dedicated timer... each frame I add to the timer, and when the timer reaches certain amounts, key events happen in the scene. I am using XNA 4.0, and I am running a steady 60 FPS on the 360 as well as on the PC.

Thanks for the reply!

The only thing I have toggled is this.IsFixedTimeStep = true, and then XNA automatically handles the frame rate from there (or so I thought!).


I'd have to look into it more, but my understanding of that flag was that it only affects the rendering updates, and won't affect your logic updates. Basically, if you tell XNA to render something but it hasn't reached the time to render, it won't render yet, kind of like how VSync works. But your game logic is still updating.
[ dev journal ]
[ current projects' videos ]
[ Zolo Project ]
I'm not mean, I just like to get to the point.
With fixed timestep, the value of TargetElapsedTime (or whatever it is), which defaults to 1/60 of a second (i.e. 60 updates per second), determines how often the framework will (try to) call Update(). Draw() is called "whenever they feel like it" and likely depends on several things, such as VSync and the display refresh rate.
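To put a concrete value on the default mentioned above: TargetElapsedTime is a TimeSpan, and with IsFixedTimeStep enabled XNA tries to call Update() once per TargetElapsedTime. Roughly:

```csharp
// The default target is 1/60 of a second, i.e. 60 Update() calls per second.
this.TargetElapsedTime = TimeSpan.FromSeconds(1.0 / 60.0);

// For example, a 30 Hz update rate instead:
this.TargetElapsedTime = TimeSpan.FromSeconds(1.0 / 30.0);
```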


In this particular cutscene I am using a dedicated timer... each frame I add to the timer and when the timer reaches certain amounts then key events happen in the scene. I am using XNA 4.0, and I am running a steady 60FPS on the 360, as well on the PC.

So is your timer a simple counter that you ++ every time Draw is called, or are you using an actual timer? It sounds like you aren't using GameTime passed to the Update method, which was made for this scenario. Really, you should post some relevant sections of your code - you may not think this is code related, but I am almost positive it is.

If you don't, for some reason, want to post code here, have a look at some of the entries at Shawn Hargreaves' blog, and also download and look at how Nick Gravelyn made his timing classes tick here.
No, I have no problem posting code, but you're right, I never really thought this to be a code issue; I'm starting to see now that I probably am not really using a true timer like you said.


sceneTimer += 1;

if (sceneTimer == 25)
{
    // Cue text or something here.
}

This is really all there is to it: I ++ each time Update is called, and when the timer reaches a certain amount, I trigger the events of my scene.


My timer (or rather "counter", I suppose) is an int that is incremented each time Update is called; when sceneTimer equals a certain amount, I call for certain text events to happen, etc. From what you are saying, I need to be using GameTime instead of basic int adding? That's really all I have: an int that is incremented on each update, and I wait for that int to reach a certain amount and then call events in my scene.
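Assuming the counter lives in Update(), the GameTime-based version suggested above would look roughly like this: sceneTimer becomes a double holding elapsed seconds, and the 25-update threshold becomes its wall-clock equivalent (25/60 of a second at the default update rate). Note the >= comparison; floating-point time will rarely land exactly on a value, so == would silently miss the trigger. This is a sketch, not the poster's actual code:

```csharp
double sceneTimer = 0.0;   // seconds elapsed, not updates counted

protected override void Update(GameTime gameTime)
{
    // Accumulate the elapsed time XNA reports for this tick.
    sceneTimer += gameTime.ElapsedGameTime.TotalSeconds;

    if (sceneTimer >= 25.0 / 60.0)
    {
        // Cue text or other scene events here.
    }

    base.Update(gameTime);
}
```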

This topic is closed to new replies.
