Archived

This topic is now archived and is closed to further replies.

zennehoy

Quick and Dirty fps display

Recommended Posts

Does anyone have a quick and dirty way to write the fps onto a DX8 device? I just need it for debugging, and am looking for something simple that I can just comment out when I don't need it anymore. I have code to get the fps into a word, but I need to write that to the display. Thanks.

When you say you can get the fps into a word, does that mean into a char array? If so, good; if not, use itoa or some other such function. Once you have your lovely formatted string, you can use the function below for a quick and easy display. Warning! It will probably drop your FPS anyway... try writing it to a file or something once every 100 frames - maybe faster...
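For the conversion step itself, here is a minimal sketch; it uses snprintf rather than itoa, since itoa is non-standard:

```cpp
#include <cassert>
#include <cstdio>
#include <cstring>

// Format an fps value into a char buffer, ready to hand to a
// TextOut-style drawing call. snprintf is standard C/C++; itoa is not.
void format_fps(int fps, char *buffer, std::size_t size)
{
    std::snprintf(buffer, size, "FPS: %d", fps);
}
```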


  
int Draw_Text_GDI(char *text, int x, int y, COLORREF color, int size,
                  char *font, LPDIRECTDRAWSURFACE7 lpdds, int underline)
{
// this function draws the sent text on the sent surface via GDI

HFONT my_font;  // the font to draw with
HFONT old_font; // the font previously selected into the DC
HDC xdc;        // the working dc

// initialise the font
my_font = CreateFont(size,0,0,0,0,0,underline,0,0,0,0,0,0, font);

// get the dc from the surface
if (lpdds->GetDC(&xdc)!=DD_OK)
   {
   DeleteObject(my_font); // don't leak the font on failure
   return(0);
   }

// set up the text colour
SetTextColor(xdc, color);

// select our custom font
old_font = (HFONT)SelectObject(xdc, my_font);

// set background mode to transparent so the background isn't filled in
SetBkMode(xdc, TRANSPARENT);

// draw the text
TextOut(xdc,x,y,text,strlen(text));

// restore the old font and delete ours
SelectObject(xdc, old_font);
DeleteObject(my_font);

// release the dc
lpdds->ReleaseDC(xdc);

// return success
return(1);
} // end Draw_Text_GDI



That should suffice until you get your bitmap-based text engine up and running... you _are_ making a bitmap font engine... aren't you???

cheers


PS: this code works in DX7. I have no idea about DX8... but the process should be very similar.

I would recommend not doing it the way described above (no offense). The reason is that drawing with GDI is very slow compared to other methods, and although it's just for debugging, there's no point in looking at artificially low rates. I believe the above method was indeed the quickest and dirtiest in DX7, but:

A texture-based font is already implemented in DX8 - look at CD3DFont. If for some reason it doesn't suit you, look at the source code for CD3DFont and adapt it to fit your needs. Basically, it uses GDI (once) to draw a font to a texture and then uses that texture to do some fast rendering. Should be much faster than GDI.

Here's a line of my code:
m_pPerformanceFont->DrawText(2, 0, D3DCOLOR_ARGB(255,255,255,255), m_strFPS);

You could dump the FPS to a file (or your debug output), or use D3DXSprite and a bitmap containing the digits 0 to 9.
My D3DXSprite wrapper and app has the stuff to do this; download it at:

  Game Download  ZeroOne Realm
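For what it's worth, the digit-bitmap approach mostly comes down to bookkeeping: split the frame rate into its decimal digits and pick the matching source rectangle out of a strip bitmap containing "0123456789". A rough sketch of that math (the DigitRect struct and the 32x48 cell size are assumptions for illustration, not part of the wrapper above):

```cpp
#include <cassert>
#include <vector>

// Assumed layout: a strip bitmap "0123456789", each digit occupying
// a DIGIT_W x DIGIT_H cell.
const int DIGIT_W = 32;
const int DIGIT_H = 48;

struct DigitRect { int left, top, right, bottom; };

// Source rect for digit d (0-9) in the strip.
DigitRect digit_rect(int d)
{
    DigitRect r;
    r.left   = d * DIGIT_W;
    r.top    = 0;
    r.right  = r.left + DIGIT_W;
    r.bottom = DIGIT_H;
    return r;
}

// Split a non-negative fps value into decimal digits, most significant
// first, ready to be drawn left to right with the sprite interface.
std::vector<int> fps_digits(int fps)
{
    std::vector<int> digits;
    if (fps == 0) { digits.push_back(0); return digits; }
    while (fps > 0) {
        digits.insert(digits.begin(), fps % 10);
        fps /= 10;
    }
    return digits;
}
```

Each frame you would call fps_digits(fps) and draw digit i's rect at x + i * DIGIT_W.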

Yup, I agree with CrazedGenius here - NEVER use GDI calls (or any code which does - some of the D3DX calls do) to display your framerate or anything else where you want good performance.

The problem is that when you GetDC for a surface, internally it calls Lock() - which flushes all pending rendering on the chip and also spins the CPU waiting for the chip to finish the flush.

After that it writes to the locked memory using the CPU - the locked memory is video memory, so you're going to be limited by the AGP bus speed, and to an extent by the cache (although VRAM does get the benefit of WC burst writes, IIRC).

Even worse, often video cards store pixels in memory in a different order than you think (known as "swizzling" - re-ordering pixels to allow faster hardware access to memory) - when you lock the surface (or GetDC does) the data must be converted from the internal device format into one you can use (ie. into traditional RGB data). It must also be re-swizzled when Unlock()ed.

Worse still, the GDI text drawing routines are set up for scaling outline fonts, doing colour masking operations on them, handling output to other forms of DC (such as a printer), antialiasing, etc. - which isn't good if you want to display a bitmap font fast.

As you can guess, all of this can kill performance - I've seen cases where a simple GetDC/TextOut/ReleaseDC took approximately 20fps off an app which was really running at 60fps (ie. the onscreen framerate counter said 40fps; I removed the GDI calls, used IPEAK to get the true frame rate, and got 60fps!).

Best way by far is to do it yourself - draw a small font to a single texture (say 256x256) either offline or when the application starts up. Then when you need to draw the text draw a list of indexed screenspace quads (D3DTLVERTEX), one for each character you want to display - this will work very nicely with hardware acceleration, and give you the flexibility to use coloured fonts etc.
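The per-character work described above is mostly texture-coordinate math: with, say, a 256x256 texture holding a 16x16 grid of glyphs in ASCII order, each character maps to a 1/16-square region of UV space. A rough sketch of building one screenspace quad per character (the CharQuad struct here stands in for a pair of D3DTLVERTEX triangles, and the 16x16 layout is an assumption):

```cpp
#include <cassert>
#include <cstring>
#include <vector>

// Assumed glyph layout: 16x16 grid in ASCII order, so each glyph
// covers a 1/16 x 1/16 region of UV space.
const float CELL = 1.0f / 16.0f;

struct CharQuad {
    float x0, y0, x1, y1; // screenspace rectangle
    float u0, v0, u1, v1; // texture rectangle
};

// UV rect for an ASCII character in the glyph grid.
void glyph_uv(char c, float &u0, float &v0, float &u1, float &v1)
{
    int cell = (unsigned char)c;
    u0 = (cell % 16) * CELL;
    v0 = (cell / 16) * CELL;
    u1 = u0 + CELL;
    v1 = v0 + CELL;
}

// One quad per character, laid out left to right; the caller would turn
// each quad into two triangles of pretransformed (screenspace) vertices.
std::vector<CharQuad> build_text_quads(const char *text, float x, float y,
                                       float char_w, float char_h)
{
    std::vector<CharQuad> quads;
    for (std::size_t i = 0; i < strlen(text); ++i) {
        CharQuad q;
        q.x0 = x + i * char_w; q.y0 = y;
        q.x1 = q.x0 + char_w;  q.y1 = y + char_h;
        glyph_uv(text[i], q.u0, q.v0, q.u1, q.v1);
        quads.push_back(q);
    }
    return quads;
}
```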

--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com

Hmm, I feel I must reply. First of all, zennehoy asked for a quick and dirty way to show the fps. I think I qualified there:

quick: just copy and paste the code =)
dirty: it uses GDI

Anyway, I am at a loss to work out HOW GDI slows down my game, because, well, it doesn't! I run my fullscreen DX7 game with quite a few GDI functions - arcs, ellipses, mathematically complex stuff I would assume, and even that text function - but my frame rate remains over 100. I turn it off and get the same FPS. Weird, hey? The thing that REALLY slows the game down is D3D7, when I render about 4000 triangles per frame... but I wouldn't think that would be too much, especially when they are small...

Thanks for the info.
I have been using GDI functions to display text before, but I have since moved on to D3D8, where it is not nearly as simple to get the DC as in DD7. SO! Thanks Freshya for the code, but I fear I won't be able to use it. Unless someone comes up with a better idea, I'll just dump the framerate into a file. Not as nice as having it on screen, but it will do. I really want to get the 3D graphics part on its way before I start work on a bitmap-based text routine.

To Freshya - Yes, for DX7, your method is perfect as a "quick and dirty" way. My only explanation for your performance is that perhaps (???) you are doing things in such a way that the incremental cost of rendering the GDI stuff is negligible - if you are locking for the text drawing anyway, then your shape drawing might not cost much more (???). For your D3D issues, try using MadOnion's benchmark for high-polygon scenes. Their numbers are very close to what I get in my code, so you can use them to see if you are doing something suboptimally. 4000 triangles per frame at 100 fps = 400,000 tri/s, which seems really low depending on your hardware.

To zennehoy: Before writing files, look at the CD3DFont class. It's in the SDK, and basically takes 3-4 lines of code to use:

When you initialize:
//Create the font

Per Frame:
//figure out your frame rate
//display the rate - very similar to TextOut!

Cleaning up:
//destroy the font

Super easy!

CrazedGenius: I doubt it's my hardware that's at fault (32MB GF2, 128MB RAM, 900MHz Athlon), but maybe the fact that they have vertex alpha, colour-keyed textures, offset texture grabs, and each quad is part of a different triangle strip... that could be the problem... although they are TL vertices, so that should speed things up somewhat...

Aha.. semi-solved the problem:

Converted each particle to a fan, which cuts down from 16 vertices to 6 (don't ask me WHAT I was thinking when I wrote the routine...)

As you might guess.. there was a slight speed improvement...

Still, it shouldn't have been running that slow, however inefficient a coder I may be =)

hehe - I had a bizarre case once where GDI actually increased my frame rate....

For some reason, whenever I turned my quick and dirty fps counter off, my framerate plummeted to around 1 fps (ie. it was so slow you didn't need an fps counter to measure it).

I never did figure out why.... I think it was something to do with my network code.

Alright, for now I'm just dumping the framerate to a file. I'll try CD3DFont tonight. Here's what's bothering me though:
I am writing a single triangle to the display, doing practically nothing else (other than checking Windows messages etc.), and only getting ~60-70fps. Is there something in my initialization code that could slow everything down? I have a Radeon VE 32MB, 128MB RAM and a 450MHz processor, so I think it should be able to go faster than that?
I am wondering if there are any flags in initialization that have to be set/cleared to speed up rendering. Some people boast that a complete scene with 100s of triangles renders at >200fps. What gives?

Hmmm...

Could be how you're computing the fps.

For example, the actual time it takes to get to the screen will be a function of the vertical frequency of the monitor. An LCD runs at 60Hz, so the actual framerate never exceeds 60. If you time just the processing, then you will be timing the processing time of the frame, regardless of screen frame rate.

Take a look at how you are computing the rate. Also, are you using the high-performance timers? That will make a difference as well. I think there was a recent GameDev article called "Frame Rate Independent Animation" that talked about the timers.
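As an illustration of the counting itself, a frame counter over a high-resolution timer might look like the sketch below. The time source is passed in as plain seconds, so you can plug in QueryPerformanceCounter divided by QueryPerformanceFrequency, or whatever timer you use; the class itself is an assumption, not SDK code:

```cpp
#include <cassert>

// Counts frames and computes fps once per reporting interval. The
// caller supplies the current time in seconds from a high-resolution
// timer (e.g. QueryPerformanceCounter / QueryPerformanceFrequency).
class FpsCounter {
public:
    explicit FpsCounter(double start_seconds)
        : frames_(0), fps_(0.0), last_(start_seconds) {}

    // Call once per frame; returns true when the fps value was updated.
    bool tick(double now_seconds)
    {
        ++frames_;
        double elapsed = now_seconds - last_;
        if (elapsed >= 1.0) {           // report roughly once per second
            fps_ = frames_ / elapsed;   // frames per elapsed second
            frames_ = 0;
            last_ = now_seconds;
            return true;
        }
        return false;
    }

    double fps() const { return fps_; }

private:
    int    frames_;
    double fps_;
    double last_;
};
```

Updating the stored value only once per second also avoids re-formatting (and re-drawing) the text every single frame.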

Another thing to watch out for: don't do any initialisation during your main loop. I know some people have tried loading their graphics etc. every frame, which is, undeniably, slow.

Yeah, I was running into the 70Hz refresh rate of the monitor, but only in fullscreen mode. I am now rendering ~33000 triangles/sec windowed. Is that acceptable?
I don't know how to find an fps rate above 70 (actually 76) in fullscreen 'cause my code syncs with the monitor. Anyone have a workaround for this? It would be nice not having to wait for the refresh but instead to have time for unit movement etc. Thanks.

Most drivers will allow you to disable VSync, at least for OGL. Seeing as we're using DX, I'm not quite sure - all I know is that my NVidia Detonator3 drivers don't have VSync enabled, and for that I am thankful!

But... you could see how much time it actually took for each frame, find the average, and divide 1 second by it. This way you could see how fast your game *could* have run if it were not for VSync.
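That estimate is just arithmetic over per-frame processing times, measured before the buffer flip so the VSync wait is excluded. A tiny sketch (the function name and sample times are made up for illustration):

```cpp
#include <cassert>
#include <vector>

// Estimate the unthrottled frame rate from per-frame processing times
// (in seconds), i.e. the time each frame took *excluding* the VSync
// wait: average the times, then divide one second by the average.
double potential_fps(const std::vector<double> &frame_times)
{
    if (frame_times.empty())
        return 0.0;
    double total = 0.0;
    for (std::size_t i = 0; i < frame_times.size(); ++i)
        total += frame_times[i];
    double average = total / frame_times.size();
    return 1.0 / average;
}
```

With a 76Hz monitor the displayed rate caps at 76, but if each frame only takes 5ms of actual work, this reports roughly 200fps of headroom.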
