16-bit averages 60 fps, 32-bit averages 0.15 fps!
I don't know what's wrong with my code! When I run it at 800x600x16 I get approximately 60 fps, but at 800x600x32 it averages out at 0.15 fps =( Has anyone come across this problem before with their code?
I don't know exactly what you are doing, but I had a problem when switching from 32-bit to 16-bit because my color calculations were too expensive.
It could also be that your graphics hardware does not support this mode although that is not very likely to happen.
In the beginning was the deed...
Faust
Since your card runs in that mode (0.15 fps), it obviously supports it in a technical sense. However, it sounds almost like you're trying to run it on a PCI video card with 2 or 4 MB of VRAM. What happens is that there isn't enough space to store the back buffer in video RAM, so it gets stored in system RAM. On a PCI card, copies from system to video RAM have to be done by the CPU, which is *very* slow.
Actually, I think I know what the problem might be. I created a runtime log for my engine, and when some part of my code errors out it logs it to a text file. When I switch to 32-bit, the log says that no HAL device can be found, so it falls back to REF, which would explain the fps, I think. What I don't understand is why it doesn't support 32-bit; I use 32-bit for my system colors. Perhaps I should check my card's caps. I will, and then I'll post and report my crappy card's limitations lol