MrFlibble - GameDev.net forum posts
  1. Hmm, perhaps it's not actually the call to SetStreamSource that is causing the bottleneck. Try changing it to this and re-run your test:

         LPDIRECT3DVERTEXBUFFER9 pVertBuffer = data->grids[it->x][it->y].vBuffer;
         g->d3d->SetStreamSource(0, pVertBuffer, 0, sizeof(_3DVERTEX));

     My thought is that perhaps you're incurring cache issues with the data lookup - though calling SetStreamSource as little as possible is probably a good idea, so swapping to a dynamically filled single buffer might be advantageous. By my quick calculations you'd need a vertex buffer capable of holding 3,840-13,440 vertices, which is fairly small anyway.
  2. The slowdown you're probably seeing is related to the texture cache and the size of the polygon you are drawing. A textured polygon of that size will hammer the texture cache, causing lots of data stalls loading cache lines. Reducing the size of the polygon, or splitting it up into smaller areas to be rendered, will improve your performance. Unfortunately the size of the texture cache varies depending on the graphics chipset. Sticking with polygons that are roughly up to 64x64 in size should give you good results, though - but try out any variation that is a power of 2 until you're happy.
  3. MrFlibble

    Well duh!

    I spent some time this weekend working on my little app project at home, which has progressed well - I'm now at the 7th milestone out of the 9 I set myself. Currently the application is still quite small but is starting to develop the features I wanted from it - basically I've written in the OpenGL shader support and then built a material system on top of it, including reusing a file parser utility from my Doom 3 model viewer (which I really should finish!!).

    Over the weekend though I managed to hit one of those moments which had me seriously considering whether everything I wrote was correct. Having started the project and just displayed a flat 2D poly to test the shader & material support, it was time to move to displaying proper 3D objects ... set up the projection matrix, move the camera ... err ... where is the polygon?! Cue a few hours of head scratching, putting in a moving camera and generally lots of code reading, only to discover the mistake by accident ... See - when the app starts up you get a WM_SIZE message, which I handled to set up the viewport & projection matrix for the application ... what I hadn't done was to ensure that the OpenGL context had actually been created beforehand. I only noticed this after accidentally resizing the application and suddenly everything appeared correctly on screen. Cue much head bashing on desk, 30 seconds of coding and recompilation ... and suddenly it worked. Ever just had one of those moments?

    Edit: Also - as a side note, be careful when trying to use Vertex Buffer Objects or Vertex Arrays with OpenGL. I had a problem today, not long after posting this, where I couldn't figure out why the app was crashing. Turns out that you need to be careful when specifying which arrays are active - I'd left one enabled when I shouldn't have ... see - I still have bits and pieces to learn!
  4. ----------------------------------------------- Captains log - 30/08/3006 For the past 4 days now we have been stuck in deep space, unable to travel anywhere and our power supplies running low. The backups are now running low after the main engines stopped working. Our best technicians are at work trying to correct the problem - but it seems like a mistake in the main operating systems of our ships computer have caused us a problem. Due to overuse of the holodecks we have run out of memory to run vital systems - I would order a purge of the main computer to correct the problem but it seems this would shut down the ship long enough for our air supplies to run out. Damn these people who write our leisure software - they seems to cause us no end of problems with their ever expanding desire to write realistic software. Apparently we would have been fine if we'd managed to get our planned upgrade while in space dock last month though ... but for how long? ----------------------------------------------- One part of my job is to ensure that a game runs on a given platform within the constraints set by the console or our memory specifications. Consoles are easy since you have a limited amount of memory to play with anyway, so you know how much overhead you have to play with. PC's are a little more flexible due to various reasons - but its still nice not to push peoples machines too much - for various performance reasons, but mostly for not having to force people to upgrade their machines just to play a game. Fighting to ensure that the game data, textures and model data all fit into a limited space is a very delicate job ... but then you have another thing to think about - fragmentation. All together its actually an amasing thing that games can be fit into the memory on some consoles and still have such a high visual quality. From the start of a project you need to ensure that you have clearly laid out limits for all your data - and stick to it. 
Memory profiles of layouts and strict checking of exported data is needed - especially when your artists are prone to author textures in massive sizes and then scale them down at export - hopefully, I have seen exported data creep into the game where textures are still at their 2048x2048 original resolution. Over time we have developed quite an extensive set of memory libraries and tracking tools - both in debug builds and PC side tools that help us to track all manner of allocations, deallocations and fragmentation holes. But it is still alot of work to keep a check on games and ensure they are running smoothly in all areas. Being able to track where and when an item was allocated - or even what deallocated it - is a major asset to identifying and correcting problems. Its not that much of a problem with homebrew applications/demos, but its still good practice to watch out for the pitfalls and try to code yourselves a good set of libraries and tools to just see what its like to deal with these sorts of problems. Understanding these problems, their causes and solutions makes you much better programmer! Another thing is that there are libraries/programs out there that can help you to profile your application to see wether you have memory leaks. Direct3D's debugging levels are also useful in that they will report if you have any objects still referenced/allocated when your application is shutdown. Make use of them :)
  5. MrFlibble

    2D rendering performance

    What you want to do is keep the number of DrawPrimitive calls to an absolute minimum by batching up as many primitives as possible. It might be better for you to transform all of your sprites CPU-side, then just throw the post-transformed data at the graphics card via a vertex buffer and a single DrawPrimitive call. What you could do is be very cheeky and actually encode the matrix into the vertex declaration - declare additional D3DDECLTYPE_FLOAT4's as TEXCOORDs and then reassemble them into a matrix to transform with, perhaps? Obviously this would need a custom vertex & pixel shader though :)
  6. MrFlibble

    Determining Vertex Processing type

    These flags are used to provide hints to Direct3D as to how to deal with vertex information. There isn't really a way to tell which one to use; the best way to check whether you can apply the D3DCREATE_HARDWARE_VERTEXPROCESSING flag is to check what shader version the card supports, via D3DCAPS9.VertexShaderVersion (or D3DCAPS9.PixelShaderVersion for pixel shaders). This will tell you whether the hardware is capable of doing vertex/pixel shader processing - then use either D3DCREATE_HARDWARE_VERTEXPROCESSING, D3DCREATE_SOFTWARE_VERTEXPROCESSING or D3DCREATE_MIXED_VERTEXPROCESSING as appropriate.
  7. Quote:Original post by baldurk No avatar yet? Mr Flibble is very cross indeed. You're right - I knew I forgot something :)
  8. MrFlibble

    DrawSubset() + Triangle Strip possible?

    Having looked at the function description in the DirectX documentation, I don't think you can actually tell the mesh to change its primitive type. Looks like you will have to change your data to be a triangle list.
  9. Isn't it always the way - you start something, learn a lot in the process, but end up forgetting something important. Having been in the games industry for around 8 years now, it's kind of hard to remember the happy thought I had the day I got my first job offer in the industry. Having finished university and worked for 9 months doing a job I hated, while practicing my coding and working on small demos in my spare time, that work had paid off. Then the actual reality of the job hits - the long hours, the stress of meeting deadlines and the outpouring of your energy into something that earns you little reward, or that you have little creative control over.

     I have to say though, the last 8 years have been fantastic both in terms of what I have learnt and what I have done. I still enjoy working in the games industry, but I don't think I'm quite as enthusiastic about the games I work on. Which is probably why I've come full circle back to working on my own little projects in my spare time - not that I have much!!

     I used to work with OpenGL a lot before I entered the games industry - recently I've started to write a little application for Windows with OpenGL. Now, when I used to write programs there were no shaders or multi-texturing (yes - scary), so I've essentially started again from scratch with my knowledge and tried to make sure I don't fall into any pitfalls I remember. My old projects used to take me a long time to complete because I didn't formulate a plan to follow, or objectives to complete. This is one of the things I'm glad I've picked up from my work. I sat down before I started and laid out a simple plan to follow that would be both measurable and have visible results at each step. This has helped me to actually get something up and running quickly - no screenshots, it's not at that good a stage yet - and I have an actual goal to aim for with my next step.

     So what am I doing here now? Well, gamedev has been a massive resource that I have used a lot to find information and keep myself reminded of what I used to do. Now I'm back to pass on hopefully helpful information to others - and try to rediscover what I've forgotten!
  10. MrFlibble


    Well, if you are getting that error then you are trying to use this pixel shader on a Shader 1.1 card, or have compiled it for Shader 1.1. Pixel Shader 1.1 had a restriction that a single set of texture co-ordinates could only be used for addressing one texture. This was removed in Pixel Shader 2.0 onwards. The only way around this is to feed in the texture co-ordinate 3 times via separate variables, then use these 3 variables to sample the textures.
  11. MrFlibble


    Err - not quite sure what you mean. As far as I remember if you are multi-texturing, on anything above a shader 1.1 card, you will be able to reuse a single set of texture co-ordinates (U,V) to pull information from any number of textures - up to the maximum supported by the card.
  12. MrFlibble

    Question about Vertexbuffers

    I don't think there is a way that you can change the way that DirectX handles colors. Internally, DirectX manages colors in the ARGB format - if you wanted to change this then you'd have to write your own pixel shader that extracted the color components as RGBA.
  13. MrFlibble

    [Lua] Is this possible

    I didn't actually want the second file to have to 'require' the first one. After looking into this a bit more I found that if I use the lua_pcall function it does what I need. It sets up the variables & functions from the first file, enabling me to use the second file to call global functions from the first. Cheers
  14. MrFlibble

    [Lua] Is this possible

    I've just started looking into using Lua for a small project that I'm doing, but I've run into a problem - either that or I'm not understanding how to use Lua correctly. I want to load a Lua script into memory - without running it. i.e.

        a = 10;
        function AddA( )
            a = a + 10;
            print( a );
        end

    I then want to be able to run code from another Lua file/string which calls this function. Unfortunately when I try this, I get the error:

        [string "AddA();"]:1: attempt to call global `AddA' (a nil value)

    Ideas? or am I being silly?
  15. MrFlibble

    MD5Mesh Loading/Displaying

    Quote:Original post by joanusdmentia Technically the sqrt() of a number can be positive AND negative ( (-5)^2=25, 5^2=25), Doom3 for some reason goes with the negative solution. I think I need to brush up a bit on my Quaternion maths again ... ... time to break out the headache pills!!