Untitled

posted in DruinkJournal
Published June 02, 2009
Another mini-update.

I've been playing a lot of Guild Wars again recently with my girlfriend. I'm also away from Friday for a week, house-sitting with her for her parents in Bournemouth, and we were planning on playing Guild Wars while we're away.

However, I just installed Guild Wars on my laptop (a Toshiba Equium L300), and it runs at 6 FPS with all the graphics settings on minimum. It only drops to 4 FPS with them all on "Normal" though, so I'm currently thinking the game is doing one particular thing that the card doesn't handle well, rather than just being overwhelmed by the settings.

I'm going to try running it through PIX and a proxy d3d9.dll that I'll write, so I can see exactly what it's doing, and perhaps change the device caps that my card exposes to make it playable.
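The proxy itself shouldn't be much work - roughly, it just needs to export Direct3DCreate9, forward it to the real DLL, and wrap whatever comes back. Something like this sketch (all names here are placeholders, nothing is final code):

[code]
// Bare-bones proxy d3d9.dll sketch: export Direct3DCreate9, forward it to the
// real DLL in the system directory, and (eventually) hand back a wrapped
// IDirect3D9 so calls like CreateDevice and GetDeviceCaps can be logged or
// fiddled with. A .def file is needed to export the undecorated name.
#include <windows.h>
#include <d3d9.h>

typedef IDirect3D9* (WINAPI *Direct3DCreate9_t)(UINT);

extern "C" IDirect3D9* WINAPI Direct3DCreate9(UINT SDKVersion)
{
    // Load the real d3d9.dll from the system directory, not the game folder,
    // so we don't end up loading this proxy recursively.
    char path[MAX_PATH];
    GetSystemDirectoryA(path, MAX_PATH);
    lstrcatA(path, "\\d3d9.dll");

    HMODULE hReal = LoadLibraryA(path);
    if (!hReal) return NULL;

    Direct3DCreate9_t pfnCreate =
        (Direct3DCreate9_t)GetProcAddress(hReal, "Direct3DCreate9");
    if (!pfnCreate) return NULL;

    IDirect3D9* pReal = pfnCreate(SDKVersion);

    // The finished proxy would return a class implementing IDirect3D9 that
    // wraps pReal; returning it unwrapped is just the pass-through skeleton.
    return pReal;
}
[/code]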

The annoying thing is that my girlfriend's laptop runs it fine (30+ FPS at default settings), and it's very similar to mine (I can't remember the exact model, but it was the same price or maybe £50 more, and is 6 months newer).

The laptop has a Mobile Intel 965 Express chipset in it, which I gather is pretty crappy. If the chipset is just too crap to play Guild Wars then fair enough, but I would have thought it'd do better than 6 FPS - it even plays Red Alert 3 fine.

I've got the latest drivers and so on, and it rates a 2.4 on Vista's user experience score (The lowest value).
If anyone has any suggestions for things for me to try, please feel free to make them [sad]

Comments

Evil Steve
Hrm. PIX says there are 3 vertex buffers being created and destroyed each frame, with the same size and usage flags. I wouldn't be surprised if that's what's thrashing the GPU. PIX also shows around 30 VB locks per frame, around 40 SetPixelShader calls, and around 160 SetTexture calls.

At device creation, the game tries to create a pure device + software vertex processing, which fails ("Pure device cannot perform software vertex processing"), so it tries without the pure device flag and it works.
While looking into that, I noticed that the graphics card reports it can do VS 3.0 and PS 3.0, which seems a bit ambitious - although the game only appears to use VS 1.1 and PS 1.1, going by the few shaders I looked at.
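For reference, the creation pattern looks roughly like this (my reconstruction from what PIX shows, not the game's actual code; the function name is made up):

[code]
// Reconstruction of the device creation PIX shows. D3DCREATE_PUREDEVICE is
// only valid together with hardware vertex processing, so the first call
// fails and the retry without the pure device flag is the one that succeeds.
#include <windows.h>
#include <d3d9.h>

HRESULT CreateDeviceWithFallback(IDirect3D9* pD3D, HWND hWnd,
                                 D3DPRESENT_PARAMETERS* pPP,
                                 IDirect3DDevice9** ppDevice)
{
    HRESULT hr = pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
        D3DCREATE_PUREDEVICE | D3DCREATE_SOFTWARE_VERTEXPROCESSING,
        pPP, ppDevice);

    if (FAILED(hr))
    {
        // "Pure device cannot perform software vertex processing" - retry
        // without D3DCREATE_PUREDEVICE.
        hr = pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
            D3DCREATE_SOFTWARE_VERTEXPROCESSING, pPP, ppDevice);
    }
    return hr;
}
[/code]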

So, things I need to do with my proxy DLL:
1. Try forcing the device creation to hardware vertex processing and see if that makes a difference.
2. Try caching vertex buffers in the proxy DLL to avoid constantly re-creating them.
3. Try reporting lower VS and PS versions in the device caps.

The first and last ones will get tested first, since they should be fairly quick to try...
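Numbers 1 and 3 only need a couple of lines in the wrapper anyway - something along these lines (helper names are my own placeholders, not real code yet):

[code]
// Hypothetical helpers for tests 1 and 3, called from the wrapper's
// CreateDevice and GetDeviceCaps overrides (names are placeholders).
#include <windows.h>
#include <d3d9.h>

// Test 1: rewrite the behaviour flags the game passes to CreateDevice so it
// ends up with hardware vertex processing instead of software.
DWORD ForceHardwareVP(DWORD BehaviorFlags)
{
    if (BehaviorFlags & D3DCREATE_SOFTWARE_VERTEXPROCESSING)
    {
        BehaviorFlags &= ~D3DCREATE_SOFTWARE_VERTEXPROCESSING;
        BehaviorFlags |=  D3DCREATE_HARDWARE_VERTEXPROCESSING;
    }
    return BehaviorFlags;
}

// Test 3: after forwarding GetDeviceCaps to the real interface, clamp the
// reported shader versions to 1.1 and see if the game picks a cheaper path.
void ClampShaderCaps(D3DCAPS9* pCaps)
{
    if (!pCaps) return;
    pCaps->VertexShaderVersion = D3DVS_VERSION(1, 1);
    pCaps->PixelShaderVersion  = D3DPS_VERSION(1, 1);
}
[/code]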
June 02, 2009 01:32 PM
benryves
Quote:Original post by Evil Steve
I've got the latest drivers and so on, and it rates a 2.4 on Vista's user experience score (The lowest value).
I wouldn't take those values to heart. I used to get 4.2 with a 256MB Radeon X1300 Pro. I recently installed a 512MB Radeon HD 3450, which knocked my score down to 3.9.

June 02, 2009 04:13 PM
Evil Steve
Quote:Original post by benryves
Quote:Original post by Evil Steve
I've got the latest drivers and so on, and it rates a 2.4 on Vista's user experience score (The lowest value).
I wouldn't take those values to heart. I used to get 4.2 with a 256MB Radeon X1300 Pro. I recently installed a 512MB Radeon HD 3450, which knocked my score down to 3.9.
True, but it gives a very vague idea that it's "Not good" [smile]
June 02, 2009 05:02 PM