Archived

This topic is now archived and is closed to further replies.

Agemaniac

Debugging results differ from running

Recommended Posts

I'm using Visual C++ 6.0, and I've run into a strange situation in my project. When I run it with Ctrl-F5 it shows me (graphically) one thing, and when I run it with F5 or Ctrl-F10 it shows something totally different. Ctrl-F5 actually gives the result I want, but this makes it very hard to debug. I'm using the Tokamak physics library and OpenGL in my project, but I didn't think that would matter. Any suggestions?

Upgrade to VS.Net. In my experience, this will become much less likely.

It's also possible that you're doing something illegal that isn't flagged by the compiler, like using a variable that was never assigned.

Try raising your compiler warning level. Maybe it will spot your error.

Cédric
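As an illustration of the uninitialized-variable case (a minimal sketch; the function names are made up, and actually invoking the broken path is undefined behavior):

```cpp
#include <cassert>

// brokenSum() reads a local that is not assigned on every path. The
// result depends on whatever was in that stack slot or register, so a
// debug build (which may fill the stack with a pattern) and a release
// build can legitimately disagree.
int brokenSum(bool addExtra) {
    int extra;             // not initialized on the false path
    if (addExtra)
        extra = 5;
    return 10 + extra;     // undefined behavior when addExtra is false
}

// fixedSum() initializes the local, so every build agrees.
int fixedSum(bool addExtra) {
    int extra = 0;
    if (addExtra)
        extra = 5;
    return 10 + extra;
}
```

Raising the warning level (e.g. /W4 in Visual C++) makes the compiler much more likely to flag reads like the one in brokenSum().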

I don't use the debugger much, but the one time I did, I had a similar problem.

It turned out to be because I was making a graphics demo (a particle engine, to be exact). When I ran it under the debugger I would get huge bursts of particles, then nothing, then huge bursts again. But when I executed the program normally, it was a smooth flow of particles (which was what I wanted).

The problem was the time-based movement of the particles. I had a function called getTime(), and my program used the difference in time between the current frame and the last frame. When running normally this difference would be about 20 milliseconds, but in the debugger it was between 200 and 20000 milliseconds (up to 20 seconds when I was stepping through the code slowly). Needless to say, this was the cause of the strange behavior.

HTH
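One common mitigation (a sketch, not the poster's actual code; the 100 ms cap is an arbitrary choice) is to clamp the per-frame delta so a pause in the debugger cannot feed the simulation an enormous time step:

```cpp
#include <algorithm>

// Clamp a raw frame delta (in seconds) to a maximum step so stalls
// (debugger breaks, window drags, disk hitches) don't explode the
// simulation. Anything above the cap is treated as one long-but-sane frame.
float clampDelta(float rawDeltaSeconds) {
    const float maxStep = 0.1f;   // assumed cap: 100 ms per step
    return std::min(rawDeltaSeconds, maxStep);
}
```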

When you run in debug mode, I believe Visual C++ sets some (possibly all?) of the uninitialized variables to some default values. This isn't the case for the non-debug build. In my experience this is quite often the difference between the two builds.

Hello Agemaniac,

As CrazyMike says, I can verify that a debug build will fill uninitialized pointers with a value like 0xCDCDCDCD or something similar (it's been a while since I did any MS compiling). So if you check whether a pointer is valid with if (pointer) instead of comparing it against NULL, an uninitialized pointer will look valid and you will get funny problems.

Why they do this I'm not sure; they must have a reason.

Lord Bart
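For what it's worth, the two pointer tests being debated check the same condition; the sketch below (made-up function names) shows why neither protects you from an uninitialized pointer. MSVC's debug C runtime fills fresh heap allocations with 0xCD bytes, so an uninitialized pointer read from such memory is non-null garbage, and both spellings will call it "valid":

```cpp
#include <cstddef>

// These two predicates are equivalent: a pointer is "valid-looking"
// exactly when it is not NULL. Neither can detect a garbage value such
// as 0xCDCDCDCD, which is non-null but points nowhere usable.
bool looksValidShort(const int* p) { return p != NULL; }  // same as: if (p)
bool looksValidLong(const int* p)  { return !(p == NULL); }
```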

quote:
Original post by Lord Bart
if(pointer) instead of if(pointer==NULL) you will get funny problems.
Uh, could you explain the difference between the two? I always thought that they were equivalent.

quote:
Original post by Lord Bart
Hello Agemaniac,

As CrazyMike says, I can verify that a debug build will fill uninitialized pointers with a value like 0xCDCDCDCD or something similar (it's been a while since I did any MS compiling). So if you check whether a pointer is valid with if (pointer) instead of comparing it against NULL, an uninitialized pointer will look valid and you will get funny problems.

Why they do this I not sure, they must have a reason.

Lord Bart


By setting all uninitialized data to 0xCD you'll have a much easier time finding out what's uninitialized when debugging.

And I believe that running in debug mode might use different PATHs.

quote:
Original post by AndreTheGiant
I don't use the debugger much, but the one time I did, I had a similar problem.

It turned out to be because I was making a graphics demo (a particle engine, to be exact). When I ran it under the debugger I would get huge bursts of particles, then nothing, then huge bursts again. But when I executed the program normally, it was a smooth flow of particles (which was what I wanted).

The problem was the time-based movement of the particles. I had a function called getTime(), and my program used the difference in time between the current frame and the last frame. When running normally this difference would be about 20 milliseconds, but in the debugger it was between 200 and 20000 milliseconds (up to 20 seconds when I was stepping through the code slowly). Needless to say, this was the cause of the strange behavior.

HTH



If you use "virtual time", this problem goes away.
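"Virtual time" isn't defined in the thread; one common reading, sketched here under assumptions (the class name and the 1/60 s step are invented), is to advance a simulation clock by a fixed amount per rendered frame instead of reading the wall clock. Pausing in the debugger then pauses simulation time too, since no frames run while you are stopped:

```cpp
// A fixed-step simulation clock: time advances only when tick() is
// called (once per frame), never by itself. Wall-clock stalls therefore
// cannot produce a huge delta.
class VirtualClock {
public:
    explicit VirtualClock(float stepSeconds = 1.0f / 60.0f)
        : step_(stepSeconds), now_(0.0f) {}

    // Advance the clock by one frame and return the (constant) delta.
    float tick() { now_ += step_; return step_; }

    // Current simulation time in seconds.
    float now() const { return now_; }

private:
    float step_;
    float now_;
};
```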

Just a comment: if debug mode initializes all the uninitialized variables and release mode does not, how can this affect the simulation? If a variable were uninitialized in release mode I might get wrong results, but I get the right results there, so I assumed all the variables were OK... I don't know. Is there a way in MSVC++ 6.0 to disable this kind of initialization in debug mode?

Guest Anonymous Poster
quote:
Original post by Agemaniac
Just a comment: if debug mode initializes all the uninitialized variables and release mode does not, how can this affect the simulation? If a variable were uninitialized in release mode I might get wrong results, but I get the right results there, so I assumed all the variables were OK... I don't know. Is there a way in MSVC++ 6.0 to disable this kind of initialization in debug mode?


It depends on your code (and the assumptions, or lack of them, inherent in it). Just because your code appears to work right in one case doesn't necessarily mean it is correct (although I'm not saying your code is incorrect either; sometimes compilers don't generate correct code). It may not be uninitialized variables; this just happens to be a very common reason for the debug and retail builds to differ.

Another, less common reason things could differ between the two builds is that you are trashing memory somehow (a bad pointer, or writing past the end of an array). The run-time results can differ between the two builds, although not necessarily.

Or it could be something else entirely.

Try binary debugging, if possible (i.e. split the code in half, verify one half works, split the other half into working and non-working parts, and so on). It's a simple technique, but many people don't think of using it.
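The binary-debugging idea can be sketched like this (the stage functions are placeholders for whatever your frame actually does):

```cpp
// Disable half of the per-frame work and re-test. If the debug/release
// difference disappears, the culprit is in the disabled half; otherwise
// it is in the half still running. Repeat on the guilty half.
static bool physicsRan   = false;
static bool particlesRan = false;

void updatePhysics()   { physicsRan = true; }    // placeholder subsystem
void updateParticles() { particlesRan = true; }  // placeholder subsystem

void runFrame() {
    updatePhysics();
#if 0   // temporarily disabled half of the frame
    updateParticles();
#endif
}
```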

Yeah, the binary search is simple but it works. After losing two hours looking for the error I figured out there was a variable I was using erroneously: it was assigned 1 in debug and 0 when running normally. 0 was correct, and I had assumed that was the default in release mode. Why it gets assigned 1 in debug mode I really don't know. Thanks, everything is working now. Really, thanks.

quote:
Original post by AndreTheGiant
I don't use the debugger much, but the one time I did, I had a similar problem.

It turned out to be because I was making a graphics demo (a particle engine, to be exact). When I ran it under the debugger I would get huge bursts of particles, then nothing, then huge bursts again. But when I executed the program normally, it was a smooth flow of particles (which was what I wanted).

The problem was the time-based movement of the particles. I had a function called getTime(), and my program used the difference in time between the current frame and the last frame. When running normally this difference would be about 20 milliseconds, but in the debugger it was between 200 and 20000 milliseconds (up to 20 seconds when I was stepping through the code slowly). Needless to say, this was the cause of the strange behavior.

HTH

If you use "virtual time", this problem goes away.

Please explain what virtual time is.
