About felipedrl
  1. I'm about to get my hands on a Guitar Hero and check that. I'd appreciate it if you could share your time. Thanks.
  2. Well, after a lot more tests I realized that the problem was either in the way I was measuring time, which I find really unlikely given that I've tested many timers, or something in the graphics card preventing it from rendering to screen, as if it were skipping frames. I found the exact term for this, which made it a lot easier to search for a solution afterwards: (macro/micro) stuttering. It can be caused by a whole variety of things not related to your game (yay! the art of PC game programming): an anti-virus running, instant-messaging programs interfering, multi-GPU synchronization (micro stuttering), laptop clock variation due to power-consumption settings, Intel "SpeedStep" changing the CPU clock based on usage (which would probably mess with my QPC timer), or a driver problem (NVIDIA, as of today 07/12/12, has a driver bug that causes micro-stuttering on single GPUs of the GTX 600 card family). In my case I noticed another particularity, as stated before: enabling vsync fixes the issue. What is probably happening is a tearing effect that gives the illusion of a frame skip; since the camera is fixed I don't see the "tear line", only notes scrolling weirdly. I can't really tell whether tearing is happening because it scrolls so fast, but I'm accepting it for now since vsync fixes the issue. I hope this helps someone out there struggling with the same problem.
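If it helps anyone diagnosing the same thing: before blaming the driver, it's worth logging per-frame durations and checking whether any frame actually exceeds its budget. A minimal sketch using only the C++ standard library (the 1.5x-median threshold is my own arbitrary illustrative choice):

```cpp
#include <algorithm>
#include <vector>

// Returns indices of frames whose duration exceeds 1.5x the median
// frame time, i.e. candidate "hiccup" frames worth inspecting.
std::vector<size_t> findSpikes(const std::vector<double>& frameMs) {
    std::vector<double> sorted(frameMs);
    std::sort(sorted.begin(), sorted.end());
    double median = sorted[sorted.size() / 2];
    std::vector<size_t> spikes;
    for (size_t i = 0; i < frameMs.size(); ++i)
        if (frameMs[i] > 1.5 * median) spikes.push_back(i);
    return spikes;
}
```

If no frame ever exceeds the threshold but the motion still looks jerky, the stutter is likely presentation-side (tearing, compositor, driver) rather than in the game loop itself.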
  3. Thanks for replying to that. I agree with you that it's the user's responsibility to force that off. I'm also concerned that some Intel HD chipsets have a bug that fails to turn vsync on; my brother-in-law has one of those. I think I should put some more effort into this: I get the hiccups by capping the frame rate in software whether vsync is on or off. It could indeed be interference between vsync and my code, but it seems to be another problem too. I did a few more tests over the weekend and realized those hiccups are evenly spaced in time. With a 30fps cap I get hiccups every ~2s, while at 60 I get them every ~0.5s, which is really annoying. It just occurred to me: could this be a thread-yielding problem? I mean, maybe between my time measurement and the update another thread runs, making my measured time wrong for that update; next frame, when it tries to compensate, it would produce a jump. If that is the case, how should I proceed? Execute the time query and the update in a critical section, or block interruptions, if that is even allowed? I'll try this a little harder; if I make progress I'll let you know. Again, thanks for the insight, I really appreciate it. Best Regards,
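On the capping side: a busy-wait cap (a loop polling the timer) fights the scheduler, while a plain `Sleep` overshoots by up to a scheduler quantum, and either can produce evenly spaced hiccups like those described. A common compromise, sketched here with `std::chrono` (a hybrid approach for illustration, not the poster's code), sleeps coarsely and spins only for the final stretch:

```cpp
#include <chrono>
#include <thread>

// Block until `target`: sleep coarsely, then spin the last ~2ms.
// Sleeping yields the CPU; the final spin absorbs scheduler jitter.
void waitUntil(std::chrono::steady_clock::time_point target) {
    using namespace std::chrono;
    for (;;) {
        auto remaining = target - steady_clock::now();
        if (remaining <= remaining.zero()) return;
        if (remaining > milliseconds(2))
            std::this_thread::sleep_for(remaining - milliseconds(2));
        // else: keep looping (busy spin) for the final stretch
    }
}
```

On Windows, calling `timeBeginPeriod(1)` at startup additionally shrinks the sleep granularity to roughly 1ms.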
  4. [SOLVED ~ I posted my insight on this below, plus some possible causes for stuttering in case someone has a similar problem]

  Hi,

  I'm coding a rhythm game, and the game runs smoothly with uncapped fps. But when I try to cap it around 60 the game updates in little chunks, like hiccups, as if it were skipping frames or running at a very low frame rate. The reason I need to cap the frame rate is that on some computers I tested, the fps varies a lot (from ~80 to ~250 fps) and those drops are noticeable and degrade response time. Since this is a rhythm game, that is very important.

  This issue is driving me crazy. I've already spent a whole week on it and still can't figure out the problem. I hope someone more experienced than me can shed some light on it. I'll put here all the hints I've tried along with the code for my game loop, so I apologize if this post gets too lengthy.

  1st game loop:

    const uint UPDATE_SKIP = 1000 / 60;
    uint nextGameTick = SDL_GetTicks();
    while (isNotDone) { // only false when a QUIT event is generated!
        if (processEvents()) {
            if (SDL_GetTicks() > nextGameTick) {
                update(UPDATE_SKIP);
                render();
                nextGameTick += UPDATE_SKIP;
            }
        }
    }

  2nd game loop:

    const uint UPDATE_SKIP = 1000 / 60;
    while (isNotDone) {
        LARGE_INTEGER startTime;
        QueryPerformanceCounter(&startTime);
        // processEvents will return false in case a QUIT event is processed
        if (processEvents()) {
            update(frameTime);
            render();
        }
        LARGE_INTEGER endTime;
        do {
            QueryPerformanceCounter(&endTime);
            frameTime = static_cast<uint>((endTime.QuadPart - startTime.QuadPart) * 1000.0 / frequency.QuadPart);
        } while (frameTime < UPDATE_SKIP);
    }

  [1] At first I thought it was a problem with timer resolution. I was using SDL_GetTicks, but even when I switched to QueryPerformanceCounter I saw no difference.

  [2] Then I thought it could be a rounding error in my position computation, and since game updates are smaller at high fps, that would be less noticeable. Indeed there is a small error, but from my tests I realized it is not enough to produce the position jumps I'm getting. Another intriguing factor is that if I enable vsync I get smooth updates @60fps regardless of the frame-cap code. So why not rely on vsync? Because some computers can force it off in the graphics card settings.

  [3] I started printing the maximum and minimum frame times measured over a 1s span, in the hope that every few frames one would take a long time, yet not enough to drop my fps computation. With the frame cap I always get min = 16ms and max = 18ms, and still, the game "does not move like Jagger".

  [4] My process priority is set to HIGH (Windows doesn't allow me to set REALTIME for some reason). As far as I know there is only one other thread running along with the game (the sound callback, which I don't really have access to); I'm using Audiere. I then disabled Audiere by removing it from the project and still got the issue. Maybe there are some other threads running and one of them takes too long to come back right between my frame-time measurements, I don't know. Is there a way to know which threads are attached to my process?

  [5] There is some dynamic data being created during the game run, but it is a little hard to remove it to test. Maybe I'll have to try harder on this one.

  Well, as I told you, I really don't know what to try next. What bugs me most is why at 60fps with vsync enabled I get smooth results, while at 60fps without vsync I don't. Is there a way to implement a software vsync?

  Thanks in advance. I appreciate those who got this far, and yet again I apologize for the long post.
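For anyone landing here: both loops above couple rendering to at most one update per frame. The usual fix for hiccups under a capped rate is a fixed-timestep accumulator ("fix your timestep"), where leftover time carries over between frames. A sketch of just the accumulator logic, separated from any real timer so it can be followed and tested in isolation:

```cpp
#include <vector>

// Fixed-timestep accumulator: given measured frame durations (ms),
// returns how many fixed `stepMs` updates each frame should run.
// Leftover time carries over, so updates average out to real time.
std::vector<int> updatesPerFrame(const std::vector<double>& frameMs,
                                 double stepMs = 16.0) {
    std::vector<int> counts;
    double accumulator = 0.0;
    for (double dt : frameMs) {
        accumulator += dt;
        int n = 0;
        while (accumulator >= stepMs) { accumulator -= stepMs; ++n; }
        counts.push_back(n);
    }
    return counts;
}
```

In a real loop, each returned count drives that many `update(stepMs)` calls followed by one `render()`, optionally interpolating the render state by `accumulator / stepMs` to hide the quantization.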
  5. Figured out the magic: it turns out gDebugger has vsync off, while my game setting turns it on.
  6. It would be awesome to ship with gDebugger, hehehe. But seriously, I'm hitting 60fps on my hardware, which is a fairly good computer; less powerful hardware will not achieve that. I guess it would be a better approach to debug on a worse machine and find the bottleneck there instead of on mine. The fps problem still bugs me, though. I wonder if gDebugger has some sort of dark magic...
  7. Hi, I'm working on improving my game's performance and I'm using gDebugger to profile it. I've run into a problem: when executing the game through gDebugger I get ~150fps, while when executing it normally I get only ~60. I print the fps on screen, and I can assure you it's not an error in my fps computation, since when I run through gDebugger the printed fps matches the value in gDebugger. I'm using the MSVC compiler; the binary has no code optimizations enabled. My game had a 60fps cap, but I've disabled it. Any ideas? Thanks.
  8. Hi, I'm trying to implement an algorithm that, given human singing captured from a mic, tells which note is being sung. I'm no math expert, but I've read quite a lot of material on this topic over the past weeks. I have an algorithm working, but with a few problems that I don't know how to solve. Basically what I do is:

  1) Apply a Hamming window to the raw input data (1024 samples from mono mic input at a 44100Hz sample rate).
  2) Apply the FFT.
  3) Get the magnitude and true frequency (computed from the bin frequency plus the phase offset) for each bin.
  4) Return the frequency of the bin with the greatest magnitude.

  I'm using this tool to test my results: http://www.seventhst...tuningfork.html

  I place the mic next to the speaker to capture input. I've noticed that the estimation depends on how far the mic is from my speakers. If I put it right in front of them I get correct results for tones above D2 (~293Hz) even with the speaker volume at minimum; below that frequency it gives me completely wrong values. If I move the mic 5 inches away from the speakers I start getting wrong values below G2 (~392Hz) even with the volume at maximum. It seems something is wrong with my algorithm, my mic, or both. The algorithm follows; perhaps you could shed some light on it:

    // Using portaudio to capture mic input. This callback is called each
    // frame with inputBuffer filled with raw data.
    int paCallback(const void* inputBuffer, void* outputBuffer,
                   unsigned long framesPerBuffer,
                   const PaStreamCallbackTimeInfo* timeInfo,
                   PaStreamCallbackFlags statusFlags, void* userData) {
        float* input = (float*)inputBuffer;
        double* data = (double*)userData;
        if (input != NULL) {
            for (unsigned int i = 0; i < FFT_N; i++) {
                // apply hamming window
                data[2 * i] = input[i] * fc.getWindow(i);
                data[2 * i + 1] = 0.0;
            }
            double freq = 0.0;
            double ampl = 0.0;
            // fc is a global instance of the Analyzer class
            fc.analyze(data, FFT_N, SAMPLE_RATE, 1, &freq, &ampl);
            double db = log10(ampl);
            printf("%-10s | %8.3lfHz | %5.2lfdB | %lf\n", Note::noteName(freq), freq, db, ampl);
        }
        return 0;
    }

    void FrequencyCounter::analyze(double* data, unsigned long nn, double sampleRate,
                                   int overlapFactor, double* outFreq, double* outMagnitude) {
        // Danielson-Lanczos algorithm (reverse binary reindexing dark magic)
        fft(data, FFT_N);

        // Precalculated constants
        const double freqPerBin = sampleRate / FFT_N;
        const double stepSize = FFT_N / overlapFactor;
        const double expectPhaseDiff = 2.0 * M_PI * stepSize / FFT_N;

        double real = 0.0;
        double imag = 0.0;
        double phase = 0.0;
        double delta = 0.0;
        long qpd = 0;

        const size_t iMax = std::min(size_t(FFT_N / 2), size_t(FFT_MAXFREQ / freqPerBin));
        for (size_t i = 0; i < iMax; ++i) {
            real = data[2 * i];
            imag = data[2 * i + 1];
            phase = atan2(imag, real);

            // process phase difference
            delta = phase - mFFTLastPhase;
            mFFTLastPhase = phase;

            // subtract expected phase difference
            delta -= i * expectPhaseDiff;

            // map delta phase into the +/- Pi interval
            qpd = delta / M_PI;
            if (qpd >= 0) { qpd += qpd & 1; } else { qpd -= qpd & 1; }
            delta -= M_PI * static_cast<double>(qpd);

            // get deviation from bin frequency in the +/- Pi interval
            delta = overlapFactor * delta / (2.0 * M_PI);

            // true frequency
            data[2 * i] = (i + delta) * freqPerBin;
            // magnitude
            data[2 * i + 1] = 2.0 * sqrt(real * real + imag * imag);
        }

        unsigned int maxI = 0;
        double maxMag = data[1];
        for (unsigned int i = 0; i < iMax; i++) {
            if (data[2 * i + 1] > maxMag) {
                maxI = i;
                maxMag = data[2 * i + 1];
            }
        }
        if (outFreq != NULL) { *outFreq = data[2 * maxI]; }
        if (outMagnitude != NULL) { *outMagnitude = data[2 * maxI + 1]; }
    }

  Thanks in advance.
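One thing worth checking: with 1024 samples at 44100 Hz, the raw bin spacing is 44100/1024 ≈ 43 Hz, while adjacent notes around D2 are only a few Hz apart, so low notes sit at the edge of what this window can resolve even with the phase correction. A common cheap complement is parabolic interpolation on the log-magnitudes around the peak bin; a sketch (illustrative, not the poster's algorithm):

```cpp
#include <cmath>

// Refine a spectral peak at bin k using its two neighbours: fit a
// parabola through the log-magnitudes and return the fractional bin
// offset (in [-0.5, 0.5]) of the true peak.
double parabolicOffset(double magLeft, double magPeak, double magRight) {
    double a = std::log(magLeft);
    double b = std::log(magPeak);
    double c = std::log(magRight);
    return 0.5 * (a - c) / (a - 2.0 * b + c);
}

// Refined frequency estimate: (k + offset) * sampleRate / fftSize.
double refineFrequency(int k, double magL, double magP, double magR,
                       double sampleRate, int fftSize) {
    return (k + parabolicOffset(magL, magP, magR)) * sampleRate / fftSize;
}
```

Longer analysis windows, or time-domain methods such as autocorrelation/YIN, generally handle low voices better at the cost of latency.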
  9. Hi, I'm about to finish my first commercial game and I'm concerned about piracy. First of all, I want to say that I've looked for similar topics on the forum, and I agree that providing service and rewarding the legit customer with extra content is the best approach, but it is not sufficient in my case. I'm releasing the game in the Brazilian market, which is well known for piracy; the general advice to avoid DRM will not work here. I'm not naive enough to think it will be piracy-free; I just want to make it troublesome to copy. For instance, if I just lay the files on the CD in a way that a customer can copy and paste them to a flash drive and install on any computer, that is obviously not going to work for me. On the other hand, if I have some sort of copy protection and validation, it will not harm the legit user, since he is not trying to copy the CD, and entering a code won't harm much either. I want to hear from you, who have much more experience than me in releasing software/games, what approaches I can take to hinder copying and perform validation. I've seen much software come with a serial number and then perform validation either online or by phone; quite a few validation products claim to do this at a reasonable cost using their own servers. Is that a good approach? Concerning copy protection I really don't know what to do. Is there a way to write something to a special track of the CD that the most common ripping software won't copy, perhaps a bad bit or something like that? Thanks in advance.
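On the serial-number route, the simplest offline scheme makes the key self-checking: the last group is a checksum of the rest, so the installer can reject typos and casually invented keys without any server. A toy sketch (my own illustration, not a recommendation for a real product, where cryptographically signed keys or server validation are stronger):

```cpp
#include <string>

// Toy serial scheme: key is "XXXXX-CCCC", where the last 4 letters
// are a checksum of the first 5 characters. This only rejects typos
// and naive guessing; real schemes should use signed keys.
std::string checksum(const std::string& body) {
    unsigned sum = 0;
    for (char c : body) sum = sum * 31 + (unsigned char)c;
    std::string out(4, 'A');
    for (int i = 3; i >= 0; --i) { out[i] = 'A' + sum % 26; sum /= 26; }
    return out;
}

std::string makeKey(const std::string& body) { // body: 5 chars
    return body + "-" + checksum(body);
}

bool isValidKey(const std::string& key) {
    if (key.size() != 10 || key[5] != '-') return false;
    return key.substr(6) == checksum(key.substr(0, 5));
}
```

Whatever scheme you choose, remember the check lives in the binary, so a cracker can patch it out; it mainly raises the bar for casual copying, which matches the stated goal.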
  10. I've found the problem but still don't know the reason. It was due to a project configuration. At some point in my development I had a warning complaining that "msvcrt.lib" conflicted with another library, so I added it to "Ignore Specific Default Library" and ta-da! It worked. The weird thing is that Debug has the same ignore setting and doesn't crash. Another odd fact is that linking fails when the "Use Link Time Code Generation" option is activated; I can only link with "No Whole Program Optimization". The link fails on FTGL calls. Maybe it has something to do with name mangling, since one of FTGL's dependencies is FreeType, which was compiled as C. I don't know; my programming skills fall short of this problem. Maybe you could give me some tips or recommend some reading. Thanks for all the support. You rock.
  11. Hi, I'm using FTGL to render TrueType fonts with OpenGL. I've compiled the FTGL project to generate both ftgl_static_D (debug) and ftgl_static.lib (release) libraries for use in my project. The Debug configuration of my project links (I get some "locally defined symbol" linker warnings, though), while Release does not. I even tried to link Release against ftgl_static_D.lib, only to get the same linker errors (__declspec(dllimport)). After a while I realized that changing the project setting from "Use Link Time Code Generation" to "No Whole Program Optimization" does the trick, though with the same linker warnings as the Debug configuration. Does anyone know why enabling link-time code generation would cause linking to fail?
  12. Thanks for all the help! A new project works. I'm checking right now all the things you pointed out. I'll post if I discover anything important or fix the problem.
  13. Thanks for replying so fast. The code snippet is from file.c in the standard library; I just pasted it to show that it fails the "if" in the Release configuration. The last call in my application code is to the read method of std::ifstream. It also crashes upon calling the "<<" operator. I think this error is not due to memory trashing, since I tried to access a file right after the application entry point and it still fails. The following code fails in Release:

    int main(int argc, char *argv[]) {
        // Added this here to test for memory trashing.
        std::ifstream file("res/chart_01_easy.tws");
        if (file.is_open()) {
            unsigned int test = 0;
            file.read(reinterpret_cast<char*>(&test), sizeof(unsigned int));
        }
        // Game code goes below

  The question is why pf is not part of the _iob buffer, considering that all I've done is create a file object and try to read a value from it, as you can see in the snippet above.
  14. Hi, I'm struggling with a weird error which I can't figure out, so once more I ask for the aid of gamedev's wizards. I'm using Visual Studio 2010 Express Edition, and the game runs fine in the Debug configuration. When I run the Release configuration, the game crashes upon the first access to a file (std::ifstream) with an "access violation writing location" error message. The error happens inside this function:

    void __cdecl _lock_file(FILE *pf)
    {
        /*
         * The way the FILE (pointed to by pf) is locked depends on
         * whether it is part of _iob[] or not
         */
        if ((pf >= _iob) && (pf <= (&_iob[_IOB_ENTRIES-1])))
        {
            /* FILE lies in _iob[] so the lock lies in _locktable[]. */
            _lock(_STREAM_LOCKS + (int)(pf - _iob));
            /* We set _IOLOCKED to indicate we locked the stream */
            pf->_flag |= _IOLOCKED;
        }
        else
            /*
             * Not part of _iob[]. Therefore, *pf is a _FILEX and the
             * lock field of the struct is an initialized critical
             * section.
             */
            EnterCriticalSection(&(((_FILEX *)pf)->lock));
    }

  I've found that in Debug the execution takes the if branch, while in Release it takes the else. My application does not create any additional threads. My project configuration is the default "Empty Project" with the following changes:

  Whole Program Optimization: No Whole Program Optimization (by default it was using Link Time Code Generation, but that was giving me many linker errors, so I turned it off)
  Ignore Specific Library: msvcrt
  SubSystem: Windows

  Does anyone have a clue why this is happening? Thanks in advance.
  15. Hi, I'm trying to get some nice water effects for my game. The problem is that the hardware I'm writing for supports neither shaders nor cube maps, so I've come up with a bump and gloss map approach. The bump is working just fine, but I'm facing some problems with the gloss map. To tell you the truth, I'm not sure I'm doing it right; all the samples I found on the internet use shaders. Nevertheless, I managed to find one piece of code which uses DOT3_RGBA and worked from it, but it seems my code has some bugs: the gloss doesn't quite appear... Well, here's what I've done so far. I draw the water in two passes. The first pass draws the geometry with a bump texture bound to unit 0 and a base (water) texture bound to unit 1; I use DOT3_RGB, and the blend mode is set to (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) because I want the water to be transparent. In the second pass I draw the water with the bump bound to unit 0 and the gloss bound to unit 1, using DOT3_RGBA and blend mode (GL_SRC_ALPHA, GL_ONE). This is the piece of code where I set up the texenv (2nd pass):

    // Second pass
    if (m_bEnableGloss && m_iGlossMapTextureID) {
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE);

        glActiveTexture(GL_TEXTURE0);
        glEnable(GL_TEXTURE_2D);
        glClientActiveTexture(GL_TEXTURE0);
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindTexture(GL_TEXTURE_2D, m_iBumpMapTextureID);

        // dot3(bumpmap, tangent_space_light_vector)
        glTexEnvx(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
        glTexEnvx(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_DOT3_RGBA);
        glTexEnvx(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);
        glTexEnvx(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
        glTexEnvx(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_PRIMARY_COLOR);
        glTexEnvx(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);

        // Apply gloss texture
        glActiveTexture(GL_TEXTURE1);
        glEnable(GL_TEXTURE_2D);
        glClientActiveTexture(GL_TEXTURE1);
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindTexture(GL_TEXTURE_2D, m_iBumpMapTextureID); // note: this binds the bump map, not m_iGlossMapTextureID
        glTexCoordPointer(2, GL_FIXED, 0, m_pMultiTexCoords);
        glTexEnvx(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
        glTexEnvx(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_ADD_SIGNED);
        glTexEnvx(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_ADD_SIGNED);
        glTexEnvx(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_PREVIOUS);
        glTexEnvx(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);
        glTexEnvx(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_PREVIOUS);
        glTexEnvx(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
        glTexEnvx(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_TEXTURE);
        glTexEnvx(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA);
        glTexEnvx(GL_TEXTURE_ENV, GL_SRC1_ALPHA, GL_PREVIOUS);
        glTexEnvx(GL_TEXTURE_ENV, GL_OPERAND1_ALPHA, GL_SRC_ALPHA);

        if (m_pGlossVecColor) {
            glEnableClientState(GL_COLOR_ARRAY);
            glColorPointer(4, GL_FIXED, 0, m_pGlossVecColor);
        }

        glDrawElements(m_apSubMeshList[iSubMesh]->m_iMode, indexCount,
                       m_apSubMeshList[iSubMesh]->m_iType, indices);

        if (m_pGlossVecColor) {
            glDisableClientState(GL_COLOR_ARRAY);
        }

        // Restore default state
        glDisableClientState(GL_TEXTURE_COORD_ARRAY);
        glDisable(GL_TEXTURE_2D);
        glClientActiveTexture(GL_TEXTURE0);
        glActiveTexture(GL_TEXTURE0);
    }

  The m_pGlossVecColor is computed as follows:

    if (m_bEnableGloss) {
        Vector3 halfVec, halfVecInTanSpace;
        lightVector = m_lightPosition - m_pVerticeCoords[iElement];
        halfVec.x.value = lightVector.x.value + cameraPos.x.value;
        halfVec.y.value = lightVector.y.value + cameraPos.y.value;
        halfVec.z.value = lightVector.z.value + cameraPos.z.value;
        halfVec.Normalize();
        halfVecInTanSpace.x = m_pTangentArray[iElement].Dot(halfVec);
        halfVecInTanSpace.y = m_pBinormalArray[iElement].Dot(halfVec);
        halfVecInTanSpace.z = m_pNormalArray[iElement].Dot(halfVec);
        m_pGlossVecColor[iElement].r.value = (halfVecInTanSpace.x.value >> 1) + FIXED32_HALF;
        m_pGlossVecColor[iElement].g.value = (halfVecInTanSpace.y.value >> 1) + FIXED32_HALF;
        m_pGlossVecColor[iElement].b.value = (halfVecInTanSpace.z.value >> 1) + FIXED32_HALF;
        m_pGlossVecColor[iElement].a.value = FIXED32_1;
    }

  Does anyone have any idea why the gloss is not being shown? Thanks in advance.
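One detail that stands out in the snippet above: the half-vector is built from `lightVector + cameraPos`, but Blinn's half-vector is the sum of the normalized light direction and the normalized view direction, where the view direction is the camera position minus the vertex position. For reference, here is the conventional computation in plain floats (treat it as pseudocode for the fixed-point `Vector3` in the engine):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Blinn half-vector at a vertex: normalize(L + V), where L and V are
// the normalized light and view DIRECTIONS (camera minus vertex),
// not the camera position itself.
Vec3 halfVector(Vec3 lightPos, Vec3 cameraPos, Vec3 vertex) {
    Vec3 L = normalize({lightPos.x - vertex.x, lightPos.y - vertex.y,
                        lightPos.z - vertex.z});
    Vec3 V = normalize({cameraPos.x - vertex.x, cameraPos.y - vertex.y,
                        cameraPos.z - vertex.z});
    return normalize({L.x + V.x, L.y + V.y, L.z + V.z});
}

// Remap a component from [-1, 1] to [0, 1] for storage in a color
// channel, as the DOT3 combiner expects.
float packToColor(float component) { return component * 0.5f + 0.5f; }
```

Each tangent-space component is remapped from [-1, 1] to [0, 1] before being written to the color array, which is what the `>> 1` plus `FIXED32_HALF` lines in the fixed-point code are doing.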