
Felipe Lira

Member Since 27 Nov 2006
Offline Last Active Jul 13 2012 05:29 AM

Topics I've Started

Game runs choppy even with high frame rate

06 July 2012 - 12:47 PM

[SOLVED: I posted my insight on this below, plus some possible causes of stuttering in case someone has a similar problem]


I'm coding a rhythm game, and it runs smoothly with uncapped fps. But when I try to cap it around 60, the game updates in little chunks, like hiccups, as if it were skipping frames or running at a very low frame rate. The reason I need to cap the frame rate is that on some computers I tested, the fps varies a lot (from ~80 to ~250 fps), and those drops are noticeable and degrade response time. Since this is a rhythm game, that matters a lot.
This issue is driving me crazy. I've already spent a whole week on it and still can't figure out the problem. I hope someone more experienced than me can shed some light on it. I'll try to put here everything I've tried along with the code for my game loop, so I apologize if this post gets too lengthy.

1st Game Loop:

	const uint UPDATE_SKIP = 1000 / 60;
	uint nextGameTick = SDL_GetTicks();
	while (isNotDone) {
		// processEvents returns false only when a QUIT event is generated!
		if (processEvents()) {
			if (SDL_GetTicks() > nextGameTick) {
				// update and render here (elided in the original post)
				nextGameTick += UPDATE_SKIP;
			}
		}
	}

2nd Game Loop:

	const uint UPDATE_SKIP = 1000 / 60;
	// frequency is filled once at startup via QueryPerformanceFrequency
	while (isNotDone) {
		LARGE_INTEGER startTime;
		QueryPerformanceCounter(&startTime);
		// processEvents returns false in case a QUIT event is processed
		if (processEvents()) {
			// update and render here (elided in the original post)
		}
		LARGE_INTEGER endTime;
		uint frameTime = 0;
		do { // busy-wait until the ~16 ms frame budget elapses
			QueryPerformanceCounter(&endTime);
			frameTime = static_cast<uint>((endTime.QuadPart - startTime.QuadPart) * 1000.0 / frequency.QuadPart);
		} while (frameTime < UPDATE_SKIP);
	}

[1] At first I thought it was a problem with timer resolution. I was using SDL_GetTicks, but even when I switched to QueryPerformanceCounter I saw no difference.

[2] Then I thought it could be a rounding error in my position computation, and since game updates are smaller at high FPS, it would be less noticeable there. Indeed there is a small error, but from my tests I realized it is not enough to produce the position jumps I'm getting. Another intriguing factor: if I enable vsync, I get smooth updates at 60fps regardless of the frame-cap code. So why not rely on vsync? Because on some computers it can be force-disabled in the graphics card settings.

[3] I started printing the maximum and minimum frame times measured over a 1-second span, hoping that every few frames one would take much longer without being enough to show up in my fps computation. With the frame cap I always get min = 16ms and max = 18ms, and still the game "does not move like Jagger".

[4] My process priority is set to HIGH (Windows doesn't allow me to set REALTIME for some reason). As far as I know there is only one other thread running along with the game (the sound callback, which I don't really have access to). I'm using Audiere. I then disabled Audiere by removing it from the project and still got the issue. Maybe there are some other threads running and one of them takes too long to come back right between my frame-time measurements, I don't know. Is there a way to know which threads are attached to my process?

[5] There is some dynamic data being created during the game run. It is a bit hard to remove it to test, though; maybe I'll have to try harder on this one.
Well, as I said, I really don't know what to try next. What bugs me most is why I get smooth results at 60fps with vsync enabled, but not at 60fps without vsync. Is there a way to implement software vsync?

Thanks in advance. I appreciate those who got this far, and again I apologize for the long post.

Game gets higher fps when executing through gDebugger

31 May 2012 - 06:36 AM


I'm working on improving my game's performance and I'm using gDebugger to profile it. I've run into a problem: when executing the game through gDebugger I get ~150fps, while when executing the game normally I get only ~60. I print the fps on screen and I can assure you it's not an error in my fps computation, since when I run through gDebugger the printed fps matches the value gDebugger reports.

I'm using the MSVC compiler. The binary has no code optimization enabled. My game had a 60fps cap but I've disabled it. Any ideas?


Pitch Estimation Algorithm

26 March 2012 - 08:34 AM


I'm trying to implement an algorithm that, given human singing samples captured from a mic, tells which note is being sung.
I'm no math expert, but I've read quite a lot of material on this topic over the past weeks. I have a working algorithm, but it has a few problems that I don't know how to solve.

Basically what I do is:

1) Apply a Hamming window to the raw input data (1024 samples from mono mic input at a 44100Hz sample rate).
2) Apply the FFT.
3) Get the magnitude and true frequency (the bin frequency corrected by the phase offset) for each bin.
4) Return the frequency of the bin with the greatest magnitude.

I'm using this tool to test my results: http://www.seventhst...tuningfork.html
I place the mic next to the speaker to capture input. I've noticed that the estimation depends on how far the mic is from my speakers. If I put it right in front of them I get correct results for tones above D2 (~293Hz), even with the speaker volume at the minimum; below that frequency it gives me completely wrong values. If I move the mic 5 inches away from the speakers I start getting wrong values below G2 (~392Hz), even with the speaker volume at the maximum.

It seems something is wrong with either my algorithm or my mic, or both.

The algorithm follows; perhaps you could shed some light on it:

// Using PortAudio to capture mic input. This callback is called each frame
// with inputBuffer filled with raw data.
int paCallback(const void* inputBuffer, void* outputBuffer, unsigned long framesPerBuffer,
			   const PaStreamCallbackTimeInfo* timeInfo, PaStreamCallbackFlags statusFlags,
			   void* userData)
{
	const float* input = (const float*) inputBuffer;
	double* data = (double*) userData;
	if (input != NULL) {
		for (unsigned int i = 0; i < FFT_N; i++) {
			// apply Hamming window; imaginary part starts at zero
			data[2 * i] = input[i] * fc.getWindow(i);
			data[2 * i + 1] = 0.0;
		}
		double freq = 0.0;
		double ampl = 0.0;
		// fc is a global instance of the Analyzer class
		fc.analyze(data, FFT_N, SAMPLE_RATE, 1, &freq, &ampl);
		double db = log10(ampl);
		printf("%-10s | %8.3lfHz | %5.2lfdB | %lf\n", Note::noteName(freq), freq, db, ampl);
	}
	return 0;
}

void FrequencyCounter::analyze(double* data, unsigned long nn, double sampleRate, int overlapFactor, double* outFreq, double* outMagnitude)
{
	// in-place FFT (Danielson-Lanczos algorithm, reverse-binary reindexing dark magic)
	fft(data, FFT_N);
	// precalculated constants
	const double freqPerBin = sampleRate / FFT_N;
	const double stepSize = FFT_N / overlapFactor;
	const double expectPhaseDiff = 2.0 * M_PI * stepSize / FFT_N;
	double real = 0.0;
	double imag = 0.0;
	double phase = 0.0;
	double delta = 0.0;
	long qpd = 0;
	const size_t iMax = std::min(size_t(FFT_N / 2), size_t(FFT_MAXFREQ / freqPerBin));
	for (size_t i = 0; i < iMax; ++i) {
		real = data[2 * i];
		imag = data[2 * i + 1];

		phase = atan2(imag, real);
		// phase difference from the previous frame
		delta = phase - mFFTLastPhase[i];
		mFFTLastPhase[i] = phase;
		// subtract the expected phase difference
		delta -= i * expectPhaseDiff;
		/* map delta phase into the +/- Pi interval */
		qpd = delta / M_PI;
		if (qpd >= 0)
			qpd += qpd & 1;
		else
			qpd -= qpd & 1;
		delta -= M_PI * static_cast<double>(qpd);
		/* get the deviation from the bin frequency out of the +/- Pi interval */
		delta = overlapFactor * delta / (2.0 * M_PI);
		// true frequency
		data[2 * i] = (i + delta) * freqPerBin;
		// magnitude
		data[2 * i + 1] = 2.0 * sqrt(real * real + imag * imag);
	}

	// pick the bin with the greatest magnitude
	unsigned int maxI = 0;
	double maxMag = data[1];
	for (unsigned int i = 0; i < iMax; i++) {
		if (data[2 * i + 1] > maxMag) {
			maxI = i;
			maxMag = data[2 * i + 1];
		}
	}
	if (outFreq != NULL)
		*outFreq = data[2 * maxI];
	if (outMagnitude != NULL)
		*outMagnitude = data[2 * maxI + 1];
}

Thanks in advance.

Help with software validation

10 November 2011 - 05:45 AM


I'm about to finish my first commercial game and I'm concerned about piracy. First of all, I want to say that I've looked for similar topics in the forum, and I agree that providing service and rewarding legit customers with extra content is the best approach, but it is not sufficient in my case. I'm releasing the game in the Brazilian market, which is well known for piracy. The general advice of avoiding DRM will not work here. I'm not naive enough to think the game will be piracy free; I just want to make it troublesome to copy. For instance, if I just lay the files on the CD so that a customer can copy and paste them to a flash drive and install on any computer, that obviously won't work for me. On the other hand, some sort of copy protection and validation will not harm the legit user, since he is not trying to copy the CD, and entering a code will not hurt much either.

I want to hear from those of you with much more experience than me in releasing software/games: what approaches can I take for copy protection and validation? I've seen a lot of software come with a serial number and then perform validation either online or by phone. There are quite a few validation products that claim to do this at a reasonable cost, using their own servers for the validation. Is that a good approach?

Concerning copy protection, I really don't know what to do. Is there a way to write something in a special track of the CD that the most common ripping software won't copy, perhaps a bad bit or something like that?

Thanks in advance.

FTGL linking problem in Release configuration

20 July 2011 - 07:50 AM


I'm using FTGL to render TrueType fonts with OpenGL. I've compiled the FTGL project to generate both ftgl_static_D.lib (debug) and ftgl_static.lib (release) libraries to use in my project.

The debug configuration of my project links fine (I get some "locally defined symbol imported" linker warnings, though), while the release configuration does not. I even tried linking the release build against ftgl_static_D.lib, only to get the same linker errors (__declspec(dllimport)).

After a while I realized that changing the project setting from "Use link time optimization" to "No whole program optimization" does the trick, though with the same linker warnings as the debug configuration.

Does anyone know why enabling link-time optimization would cause the link to fail?