
About benp444

  1. My solution was to uninstall the update, and now everything works fine. To uninstall update KB 2670838, follow these steps: click Start, click Control Panel, click Programs, and then click Programs and Features. Click View installed updates. Click Update for Microsoft Windows (KB 2670838), and then click Uninstall.
  2. Yup - that's it. KB 2670838 has stopped debug mode from working. The link you pasted above leads on to a more detailed debate, but my solution was to uninstall this update, and now everything works fine. I am going to switch off automatic updates from now on. Thanks, Ben.
  3. For a few weeks I have been developing only in release mode. I recently switched back to debug mode and found that my previously working code failed. I have narrowed down the issue and found that it is present not only in the code I was working on, but in all code (test code, old projects, sample code, etc.) on my machine. The problem is that when D3D10CreateDeviceAndSwapChain is called in debug mode I get an E_FAIL error code (see the error box attached), i.e. the code below (taken from a Frank Luna sample) fails:
[CODE]
HR( D3D10CreateDeviceAndSwapChain(
    0,                          // default adapter
    md3dDriverType,
    0,                          // no software device
    D3D10_CREATE_DEVICE_DEBUG,
    D3D10_SDK_VERSION,
    &sd,
    &mSwapChain,
    &md3dDevice) );
[/CODE]
However, if I change the 4th argument to 0, everything works. I can only think that either my graphics card is broken or the DirectX SDK or DirectX runtime is somehow corrupt. Has anyone got any suggestions as to what my next steps should be? Ben.
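A defensive workaround, until the broken debug layer is fixed, is to attempt device creation with the debug flag and silently fall back to a non-debug device on failure. Below is a minimal sketch of that pattern; `createDevice` is a stand-in for the real `D3D10CreateDeviceAndSwapChain` call so the sketch can be shown self-contained, and is not part of the original code.

```cpp
#include <functional>

// Hypothetical sketch: try to create the device with the debug flag first,
// and fall back to creating it without the flag if that fails (as can
// happen when the debug layer is broken by an OS update such as KB 2670838).
// `createDevice` stands in for D3D10CreateDeviceAndSwapChain; it returns
// true on success for whatever flags it is given.
bool createDeviceWithFallback(const std::function<bool(unsigned)>& createDevice,
                              unsigned debugFlag)
{
    if (createDevice(debugFlag))   // try e.g. D3D10_CREATE_DEVICE_DEBUG
        return true;
    return createDevice(0);        // fall back to no debug layer
}
```

This keeps debug builds runnable on machines where the debug layer is unavailable, at the cost of losing debug-layer diagnostics there.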
  4. Sorry, solved it. I have created a shader variable (g_createExplosion). On each frame, if this is set to true, it will kick off another cycle of particles without affecting the existing ones.
  5. I have gone through the GSParticles demo that comes with the DirectX SDK and studied Frank Luna's chapter on particles in his DirectX 10 book. I can get particles working and have no problem understanding the basics; however, I am struggling to come up with a good solution or class design such that I can create (e.g.) explosions whenever I want. Here are my basic requirements:
1) Create a particle explosion on the GPU using a geometry shader - similar to the GSParticle demo, but the explosion will start and then complete. Let's say the explosion lasts for 3 seconds.
2) Be able to create an arbitrary number of explosions, within reason (e.g. 5), without disrupting the existing ones.
3) Be able to create these explosions during any game frame I choose.
4) An example of the above would be explosion a) at t=0 secs, explosion b) at t=0.5, c) at t=2.5, d) at t=360, etc.
The pseudo code looks a bit like this:
[source lang="cpp"]
class Particle
{
    void init();
    void render(); // this actually takes the elapsed and total time as well
};

void Particle::init()
{
    m_fXPosition = m_pEffect10->GetVariableByName( "g_position" )->AsVector();
}

void Particle::render()
{
    // update pass
    m_fXPosition->SetFloatVector((float*)&m_position);
    if (firstTime)
        Draw();
    else
        DrawAuto();
    // render pass, passes in elapsed time
    // etc...
}
[/source]
This class currently works in my game as it stands, e.g.:
[source lang="cpp"]
onDeviceInit()
{
    m_explosion = new Particle();
    m_explosion->init();
}

onFrameUpdate()
{
    // an explosion event occurred, so
    m_explosion->reset(); // sets firstTime to true
}

onFrameRender()
{
    // this is continually called whether there are particles alive or not
    m_explosion->render();
}
[/source]
The above, however, fails requirement 2. If a second explosion event occurs before the existing one is complete, then the existing particles will be destroyed, because firstTime will be true and Draw() will be called. My shader is very similar to GSParticles.fx, and for the sake of this topic consider it the same.
I appreciate that seeding particles might be the way to go, and of course GSParticle has a single seed (PT_LAUNCHER), but I can't see how I can inject seeds into an already running shader. Bearing in mind that I don't want to create a new Particle class and call init() every time I want an explosion, the only way I can see of doing this would be to create (e.g.) 5 Particle instances and store them in an array. Then, when I want an explosion, I get the next instance and set its position. This would allow me a maximum of 5 explosions at any given time (a 6th explosion would kill all particles in the 1st explosion). Can anyone give me some hints as to how to create a simple particle framework using geometry shaders in DirectX 10?
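One way the pool-of-instances idea above can be made to satisfy requirements 2 and 3 is to give each slot its own start time and recycle whichever slot has expired (or, failing that, the oldest one) when a new explosion is requested, so a 6th explosion only ever evicts the oldest. A minimal sketch of the bookkeeping; the names (`ExplosionPool`, `kLifetime`) are illustrative and not from the original code:

```cpp
#include <array>

// Sketch of requirements 2/3: a fixed pool of explosion "slots", each with
// its own start time, so starting a new explosion never resets a running one.
struct ExplosionPool {
    static constexpr float kLifetime      = 3.0f;  // explosion lasts 3 s
    static constexpr int   kMaxExplosions = 5;
    std::array<float, kMaxExplosions> startTime{}; // < 0 means slot free

    ExplosionPool() { startTime.fill(-1.0f); }

    // Claim a free/expired slot for a new explosion at time `now`;
    // if all slots are busy, evict the oldest.
    int spawn(float now) {
        int oldest = 0;
        for (int i = 0; i < kMaxExplosions; ++i) {
            if (startTime[i] < 0.0f || now - startTime[i] >= kLifetime) {
                startTime[i] = now;            // reuse a free/expired slot
                return i;
            }
            if (startTime[i] < startTime[oldest]) oldest = i;
        }
        startTime[oldest] = now;               // all busy: evict the oldest
        return oldest;
    }

    bool alive(int slot, float now) const {
        return startTime[slot] >= 0.0f && now - startTime[slot] < kLifetime;
    }
};
```

Each slot would own one Particle instance (and its vertex buffers); `spawn` replaces the single `reset()` call, so concurrent explosions never disturb each other.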
  6. Thanks for your help. It has taken me a day of playing with a toy aeroplane to work it out, but I have got it working, and the resulting code is posted below. The major problem was that I needed to maintain the state of my complete model transform and the 3 axes. My game actor now has a stateful transformation matrix (m_transform) and 3 stateful axes (m_Xaxis, m_Yaxis and m_Zaxis).
[source lang="cpp"]
void GameActor::VUpdate(float elapsedTime)
{
    D3DXQUATERNION qX;
    D3DXQUATERNION qY;
    D3DXQUATERNION qZ;
    D3DXMATRIX matRotX;
    D3DXMATRIX matRotY;
    D3DXMATRIX matRotZ;
    D3DXMATRIX rotation;

    // calculate the incremental change for the 3 axes
    D3DXQuaternionRotationAxis(&qX, &m_Xaxis, m_rotation.x);
    D3DXQuaternionRotationAxis(&qY, &m_Yaxis, m_rotation.y);
    D3DXQuaternionRotationAxis(&qZ, &m_Zaxis, m_rotation.z);

    // get matrix rotations for each axis
    D3DXMatrixRotationQuaternion(&matRotX, &qX);
    D3DXMatrixRotationQuaternion(&matRotY, &qY);
    D3DXMatrixRotationQuaternion(&matRotZ, &qZ);

    // calculate the final rotation. This will still have slight errors, as the axes
    // were not recalculated between the calls to D3DXMatrixRotationQuaternion above,
    // but they are insignificant.
    rotation = matRotX * matRotY * matRotZ;

    // modify the stateful transform
    m_transform = m_transform * rotation;
    // will add scale and move later; these will both need to be stateless

    // realign the X axis
    D3DXVECTOR4 vec4X;
    D3DXVec3Transform(&vec4X, &m_Xaxis, &rotation);
    m_Xaxis = D3DXVECTOR3(vec4X.x, vec4X.y, vec4X.z);

    // realign the Y axis
    D3DXVECTOR4 vec4Y;
    D3DXVec3Transform(&vec4Y, &m_Yaxis, &rotation);
    m_Yaxis = D3DXVECTOR3(vec4Y.x, vec4Y.y, vec4Y.z);

    // realign the Z axis
    D3DXVECTOR4 vec4Z;
    D3DXVec3Transform(&vec4Z, &m_Zaxis, &rotation);
    m_Zaxis = D3DXVECTOR3(vec4Z.x, vec4Z.y, vec4Z.z);
}
[/source]
I may still be able to make this faster and with fewer lines of code - perhaps use D3DXQuaternionRotationYawPitchRoll. If anyone can see how to do this in a better way then please say. Ben.
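An alternative to hand-maintaining three axis vectors is to keep the orientation as a single quaternion and right-multiply an incremental body-axis rotation into it each frame, renormalizing to fight float drift. A minimal self-contained sketch of that idea in plain C++ (no D3DX; all names here are illustrative, not from the original code):

```cpp
#include <cmath>

// Minimal quaternion: orientation is one stateful Quat; each frame an
// incremental body-axis rotation is composed into it.
struct Quat {
    float w, x, y, z;

    // Quaternion for a rotation of `angle` radians about unit axis (ax,ay,az).
    static Quat axisAngle(float ax, float ay, float az, float angle) {
        float s = std::sin(angle * 0.5f);
        return { std::cos(angle * 0.5f), ax * s, ay * s, az * s };
    }

    // Hamilton product.
    Quat operator*(const Quat& r) const {
        return { w*r.w - x*r.x - y*r.y - z*r.z,
                 w*r.x + x*r.w + y*r.z - z*r.y,
                 w*r.y - x*r.z + y*r.w + z*r.x,
                 w*r.z + x*r.y - y*r.x + z*r.w };
    }

    void normalize() {
        float n = std::sqrt(w*w + x*x + y*y + z*z);
        w /= n; x /= n; y /= n; z /= n;
    }
};

// Per-frame update: right-multiplying applies the increment in the model's
// own (body) frame - e.g. a roll about the local Z axis.
void update(Quat& orientation, float rollRadians) {
    orientation = orientation * Quat::axisAngle(0, 0, 1, rollRadians);
    orientation.normalize();  // fight drift from accumulated float error
}
```

The design win is that the three axis vectors never need realigning by hand: they can be read off the rotation matrix built from the quaternion when needed.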
  7. In answer to 'what I am expecting': I have a model of an aircraft. I am looking down on it sitting on the runway; the Z axis points out of the screen, the Y axis points up, and the X axis off to the right. The aircraft is pointing up at the top of the screen. If I press the Y key I expect it to rotate along the plane's fuselage (roll); if I press the X key I expect pitch, and Z yaw. I get all these actions if I only press a single key - this is because model space and world space are initially aligned. The GameActor class is mine and is very transient at the moment. I can/will add any members and methods which enable me to provide motion. Note that the GameActor code only provides accessors, so I see no point in posting it. The meat of what I am trying to do is posted in my code above. I sort of understand where you are coming from, and I have read elsewhere that you cannot simply convert Euler angles to quats like this. OK, so if I can't rotate around X, then Y, then Z, what should my approach be? +1 for that. Do I need to do something with my axis vectors (axisX/Y/Z in the code above)? Is it wrong to specify them the way I have? Should I store a single quat as a member of my GameActor class and apply rotations to it? I have read this somewhere, but I cannot see how a single quat would work.
  8. Thanks, but I already tried that:
[CODE]
D3DXQuaternionRotationYawPitchRoll(&qOrient, ga->getRotation().y, ga->getRotation().x, ga->getRotation().z);
D3DXMatrixRotationQuaternion(&rotation, &qOrient);
world = scale * rotation * move;
[/CODE]
The results are similar, in that one axis rotates in model coords and the rest in world coords. Note that in this case it is Z that always rotates in model space, and X and Y that rotate in world space.
  9. Firstly, sorry for yet another newbie post regarding quaternions, but I have scoured the forums for hours without any luck. Secondly, I have posted this in the DirectX section as I am hoping to get answers in the form of D3DXQuaternion... method calls rather than a load of maths, which may only hinder my progression. Anyhow, my problem is fairly straightforward. I have a simple model viewer which shows my model in the middle of the screen. When I press the keys X, Y or Z the model should rotate in MODEL coordinates. I am trying to use quaternions to do this. The result is that I can only get a single axis to rotate in model space; the other two will subsequently rotate in world space. My code is:
[CODE]
D3DXQUATERNION qZ;
D3DXQUATERNION qY;
D3DXQUATERNION qX;
D3DXQUATERNION qOrient;
D3DXQUATERNION qTotal;
D3DXQuaternionIdentity(&qZ);
D3DXQuaternionIdentity(&qY);
D3DXQuaternionIdentity(&qX);
D3DXQuaternionIdentity(&qOrient);

D3DXVECTOR3 axisZ(0,0,1);
D3DXVECTOR3 axisY(0,1,0);
D3DXVECTOR3 axisX(1,0,0);

D3DXQuaternionRotationAxis(&qZ, &axisZ, ga->getRotation().z);
D3DXQuaternionRotationAxis(&qY, &axisY, ga->getRotation().y);
D3DXQuaternionRotationAxis(&qX, &axisX, ga->getRotation().x);

// I am not sure if this is necessary.
D3DXQuaternionNormalize(&qZ, &qZ);
D3DXQuaternionNormalize(&qY, &qY);
D3DXQuaternionNormalize(&qX, &qX);

// The first quat will rotate correctly in model space; the latter 2 will rotate in world space.
qTotal = qY * qX * qZ;

D3DXMATRIX rotation;
D3DXMatrixRotationQuaternion(&rotation, &qTotal);
world = scale * rotation * move;
[/CODE]
Note that ga is an instance of GameActor, and ga->getRotation() returns a D3DXVECTOR3 containing the x, y and z rotations in radians. I would be grateful if anyone could give me a high-level approach to solve my problem. Ben.
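The underlying issue is that 3-D rotations do not commute, so one orientation cannot safely be rebuilt each frame from three accumulated per-axis angles: "X then Y" and "Y then X" land a vector in different places. A tiny self-contained demonstration in plain C++ (no D3DX; names are illustrative):

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

// Rotate v by `a` radians about the fixed world X axis.
Vec3 rotX(const Vec3& v, float a) {
    return { v[0],
             std::cos(a)*v[1] - std::sin(a)*v[2],
             std::sin(a)*v[1] + std::cos(a)*v[2] };
}

// Rotate v by `a` radians about the fixed world Y axis.
Vec3 rotY(const Vec3& v, float a) {
    return { std::cos(a)*v[0] + std::sin(a)*v[2],
             v[1],
            -std::sin(a)*v[0] + std::cos(a)*v[2] };
}
```

Starting from v = (0, 0, 1) with 90-degree turns, X-then-Y gives (0, -1, 0) while Y-then-X gives (1, 0, 0): different results, which is why applying increments to a single stored orientation (rather than re-composing Euler angles) is the usual fix.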
  10. Thanks, I was actually re-normalizing (in C++, not the shader); I just didn't paste the code. Anyhow, you have answered my question, and I will now define the full faces for all my low-polygon shapes.
  11. I have created a tetrahedron using 4 vertices and 12 indices. I have then calculated the normals for each vertex and passed them to the shader to do ambient and diffuse lighting. The results are shown in the picture below, and as you can see I get dark sections where I expected lit sections: [attachment=9981:tetra bad shade.png] I am assuming that this is because of the interpolation of normals. As I say, my approach to creating this tetrahedron was to use only 4 vertices and let the indices do the rest. Should I actually define 4 faces (12 vertices) and define the normals for each face? I know this latter approach would work, but my question is: should the former approach also work, or should I reserve the low-vertex-count solution only for high-polygon-count objects? My code for defining the vertices, indices and normals is below:
[CODE]
PrimitiveVertexStruct vertices[] =
{
    { D3DXVECTOR3( 0.0f,  1.0f, 0.0f), WHITE_VECTOR, ZERO_VECTOR3 }, // 1
    { D3DXVECTOR3( 1.0f, -1.0f, 0.0f), RED_VECTOR,   ZERO_VECTOR3 }, // 2
    { D3DXVECTOR3(-1.0f, -1.0f, 0.0f), BLUE_VECTOR,  ZERO_VECTOR3 }, // 3
    { D3DXVECTOR3( 0.0f,  0.0f, 2.0f), BLACK_VECTOR, ZERO_VECTOR3 }, // 4
};
m_numTetraVertices = sizeof(vertices) / sizeof(PrimitiveVertexStruct);
m_numTetraIndices = 12;

DWORD indices[] =
{
    0,1,3, // 1,2,4
    1,2,3, // 2,3,4
    2,0,3, // 3,1,4
    0,2,1  // 1,3,2
};

// compute average normals, one triangle per loop iteration
for(int i = 0; i < m_numTetraIndices / 3; i++)
{
    // indices of the triangle
    int i0 = indices[i*3+0];
    int i1 = indices[i*3+1];
    int i2 = indices[i*3+2];

    // vertices of the triangle
    D3DXVECTOR3 v0 = vertices[i0].Pos;
    D3DXVECTOR3 v1 = vertices[i1].Pos;
    D3DXVECTOR3 v2 = vertices[i2].Pos;

    // compute the face normal
    D3DXVECTOR3 e0 = v1 - v0;
    D3DXVECTOR3 e1 = v2 - v0;
    D3DXVECTOR3 normal;
    D3DXVec3Cross(&normal, &e0, &e1);

    // add to any existing normals for that vertex
    // (the summed normals are re-normalized afterwards - code not shown)
    vertices[i0].Normal += normal;
    vertices[i1].Normal += normal;
    vertices[i2].Normal += normal;
}
[/CODE]
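For reference, the flat-shaded alternative discussed above - duplicating vertices per face and giving each one the raw face normal so no averaging occurs - only needs the normalized cross product of two edges. A minimal self-contained sketch in plain C++ (in place of D3DXVec3Cross; names are illustrative):

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

// Unit normal of the triangle (v0, v1, v2), counter-clockwise winding.
// For flat shading, all three duplicated vertices of a face get this normal,
// so the rasterizer interpolates a constant normal across the triangle.
Vec3 faceNormal(const Vec3& v0, const Vec3& v1, const Vec3& v2) {
    Vec3 e0{ v1[0]-v0[0], v1[1]-v0[1], v1[2]-v0[2] };
    Vec3 e1{ v2[0]-v0[0], v2[1]-v0[1], v2[2]-v0[2] };
    Vec3 n{ e0[1]*e1[2] - e0[2]*e1[1],      // cross(e0, e1)
            e0[2]*e1[0] - e0[0]*e1[2],
            e0[0]*e1[1] - e0[1]*e1[0] };
    float len = std::sqrt(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
    return { n[0]/len, n[1]/len, n[2]/len };
}
```

With shared vertices, the averaged normal at each corner points "between" the faces, which is exactly what produces the dark bands on sharp-edged low-poly shapes like this tetrahedron.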
  12. [quote name='Dancin_Fool' timestamp='1341424542' post='4955677'] With depth testing enabled, the drawing order for alpha objects matters. The problem you're seeing is caused by the first object filling the depth buffer with a quad, so when the second object, which is behind, draws, it's tested against the depth buffer, which contains a quad, so that area is clipped out rather than alpha blended with the object in front of it. This is a common problem with alpha blending. What you want to do is sort your objects from back to front before you render them. Even if you are submitting these objects out of order, if you queue them up in a buffer you can sort that buffer before you actually submit them to be rendered. [/quote] OK. So for sprites the following line sorts the buffer:
m_pSpriteObject->Begin(D3DX10_SPRITE_SORT_DEPTH_BACK_TO_FRONT);
I will, however, have to ensure that no two sprites have the same depth; in image 1) both sprites were at 0.6. That's fine - I can write a trivial bit of code that takes the user-defined depth and adds 0.0001 to it, so that it is highly unlikely two sprites will have the same depth. I was going to say that I still had a problem with the primitives (which are drawn at a depth of 0.5). However, I found that if I draw the primitives before the sprites (which may be shallower or deeper than the sprites) then there is no clipping and all the depth is correct. I can't quite understand why this works, as the ordering is not as discussed, but it does. Successful depth-enabled sprites and primitives below: [attachment=9847:DepthEnabledWorks.png] Thanks both for your help. Ben.
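The queue-then-sort idea from the quote above can be sketched in a few lines: buffer the alpha draws, sort them back to front (largest depth first), then submit in that order. Using a stable sort also preserves submission order for equal depths, which sidesteps the same-depth worry without nudging depths by 0.0001. The `Sprite` struct here is illustrative, not the D3DX10 sprite type:

```cpp
#include <algorithm>
#include <vector>

// Illustrative stand-in for a queued alpha draw: depth in [0,1], plus an id
// representing whatever payload the real draw call needs.
struct Sprite { float depth; int id; };

// Sort queued alpha objects back to front (largest depth first).
// stable_sort keeps the original submission order for ties.
void sortBackToFront(std::vector<Sprite>& sprites) {
    std::stable_sort(sprites.begin(), sprites.end(),
        [](const Sprite& a, const Sprite& b) { return a.depth > b.depth; });
}
```

After sorting, each blended draw only ever lands on top of everything behind it, so the depth test never clips out a translucent object that should still show through.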
  13. Thanks for your response. [quote name='NightCreature83' timestamp='1341419387' post='4955660'] If you turn depth testing off, rendering is done in the order you submit the draw calls in. So say you have two objects that overlap, and one is at 0.1 and the other at 0.2 depth. With depth testing on and rendering the 0.1 object first, that would be the object you see; when it is off, however, the one at 0.2 is rendered last and will overlap the other one. [/quote] My calls to draw may not come in an order based on depth, so I really want to keep depth testing on. However, this approach results in image 1) shown in my original post. [quote name='NightCreature83' timestamp='1341419387' post='4955660'] Are you sure you aren't changing the alpha blend equations as well? If you render something that's changing the alpha blend state before or after this draw call, when you next come to this draw call the alpha state will not have reset itself. [/quote] I don't know much about alpha blending. I have posted below my code that handles the alpha channel. Again, looking at image 1): are you suggesting that this problem may have nothing to do with depth testing, and that I should be looking at my alpha code? Apart from the initialisation code below and the rendering code that follows, I don't touch the alpha setup. Perhaps my alpha setup is wrong?
More initialisation code wrt alpha:
[source lang="cpp"]
D3D10_BLEND_DESC StateDesc;
ZeroMemory(&StateDesc, sizeof(D3D10_BLEND_DESC));
StateDesc.AlphaToCoverageEnable = TRUE; // changed from FALSE 040712
StateDesc.BlendEnable[0] = TRUE;
StateDesc.SrcBlend = D3D10_BLEND_SRC_ALPHA;
StateDesc.DestBlend = D3D10_BLEND_INV_SRC_ALPHA;
StateDesc.BlendOp = D3D10_BLEND_OP_ADD;
StateDesc.SrcBlendAlpha = D3D10_BLEND_ZERO;
StateDesc.DestBlendAlpha = D3D10_BLEND_ZERO;
StateDesc.BlendOpAlpha = D3D10_BLEND_OP_ADD;
StateDesc.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;
hr = pD3DDevice->CreateBlendState(&StateDesc, &m_pBlendState10);
[/source]
And the code that renders the sprites:
[source lang="cpp"]
hr = m_pSpriteObject->SetProjectionTransform(&(g_pGameManager->m_pMatProjection));

// start drawing the sprites
hr = m_pSpriteObject->Begin(D3DX10_SPRITE_SORT_DEPTH_BACK_TO_FRONT);
//D3DX10_SPRITE_SORT_DEPTH_FRONT_TO_BACK, D3DX10_SPRITE_SORT_DEPTH_BACK_TO_FRONT

// Save the current blend state
ID3D10BlendState* pOrigBlendState;
DXUTGetD3D10Device()->OMGetBlendState(&pOrigBlendState, OriginalBlendFactor, &OriginalSampleMask);

// Draw all the sprites in the pool
hr = m_pSpriteObject->DrawSpritesBuffered(&(m_spritePool[0]), m_spritePoolIdx);

// Set the blend state for alpha drawing
if(m_pBlendState10)
{
    FLOAT NewBlendFactor[4] = {1,1,1,1};
    DXUTGetD3D10Device()->OMSetBlendState(m_pBlendState10, NewBlendFactor, 0xffffffff);
}

// Finish up and send the sprites to the hardware
hr = m_pSpriteObject->Flush();
hr = m_pSpriteObject->End();

DXUTGetD3D10Device()->OMSetBlendState(m_pBlendState10, OriginalBlendFactor, OriginalSampleMask);
[/source]
Note: I am checking all my DirectX calls; I have just removed the checks for clarity.
  14. Hi, I am trying to draw both sprites and 2D primitives on the screen at the same time, and am having problems with the depth. The primitives are created using D3D10_PRIMITIVE_TOPOLOGY_LINESTRIP. There are two situations that I can have. Note that the rectangles have a depth of 0.5, the left-hand ball 0.2, and the two balls on the right 0.6:
1) Depth enabled: the sprites render correctly based on the assigned depth. However, the alpha channels of the sprites show up. [attachment=9834:DepthEnabled.png]
2) Depth disabled: the primitives and sprites do not render correctly on depth. If the sprites are the last group to be drawn then they will be on top, and vice versa. [attachment=9833:DepthDisabled.png]
During initialisation I have created two depth stencil states:
[source lang="cpp"]
// Create a disabled depth state
SAFE_RELEASE(m_pDepthDisabledState);
D3D10_DEPTH_STENCIL_DESC depthStencilDesc;
ZeroMemory(&depthStencilDesc, sizeof(depthStencilDesc));
depthStencilDesc.DepthEnable = false;
depthStencilDesc.DepthWriteMask = D3D10_DEPTH_WRITE_MASK_ALL;
depthStencilDesc.DepthFunc = D3D10_COMPARISON_LESS;
depthStencilDesc.StencilEnable = false;
depthStencilDesc.StencilReadMask = 0xFF;
depthStencilDesc.StencilWriteMask = 0xFF;
depthStencilDesc.FrontFace.StencilFailOp = D3D10_STENCIL_OP_KEEP;
depthStencilDesc.FrontFace.StencilDepthFailOp = D3D10_STENCIL_OP_INCR;
depthStencilDesc.FrontFace.StencilPassOp = D3D10_STENCIL_OP_KEEP;
depthStencilDesc.FrontFace.StencilFunc = D3D10_COMPARISON_ALWAYS;
depthStencilDesc.BackFace.StencilFailOp = D3D10_STENCIL_OP_KEEP;
depthStencilDesc.BackFace.StencilDepthFailOp = D3D10_STENCIL_OP_DECR;
depthStencilDesc.BackFace.StencilPassOp = D3D10_STENCIL_OP_KEEP;
depthStencilDesc.BackFace.StencilFunc = D3D10_COMPARISON_ALWAYS;
hr = DXUTGetD3D10Device()->CreateDepthStencilState(&depthStencilDesc, &m_pDepthDisabledState);

// Create an enabled depth state
SAFE_RELEASE(m_pDepthEnabledState);
depthStencilDesc.DepthEnable = true; // the only difference
hr = DXUTGetD3D10Device()->CreateDepthStencilState(&depthStencilDesc, &m_pDepthEnabledState);

// set to disabled by default
DXUTGetD3D10Device()->OMSetDepthStencilState(m_pDepthDisabledState, 0);
//DXUTGetD3D10Device()->OMSetDepthStencilState(m_pDepthEnabledState, 0);
[/source]
I have tried switching between depth enabled/disabled during the rendering of either the primitives or the sprites, but it seems I cannot mix and match the two. Has anyone any ideas what approach I should take to solve this? Thanks, Ben.
  15. Great! And the solution to the washed-out colour is to create a new SRGB texture in memory (not from file), copy the non-SRGB texture (from file) into it, and use that to create the texture resource. See the method below. Now my balls are perfect!
[CODE]
ID3D10Texture2D* GameManager::convertTextureToSRGB(ID3D10Texture2D* srcTexture)
{
    if(srcTexture == NULL)
    {
        throw GameException(E_POINTER, "GameManager::convertTextureToSRGB: srcTexture was null");
    }

    D3D10_TEXTURE2D_DESC origDesc;
    srcTexture->GetDesc(&origDesc);

    D3D10_TEXTURE2D_DESC srgbDesc;
    ZeroMemory(&srgbDesc, sizeof(srgbDesc));
    srgbDesc.Width = origDesc.Width;
    srgbDesc.Height = origDesc.Height;
    srgbDesc.MipLevels = 1;
    srgbDesc.ArraySize = 1;
    srgbDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
    srgbDesc.SampleDesc.Count = 1;
    srgbDesc.Usage = D3D10_USAGE_DYNAMIC;
    srgbDesc.BindFlags = D3D10_BIND_SHADER_RESOURCE;
    srgbDesc.CPUAccessFlags = D3D10_CPU_ACCESS_WRITE;

    ID3D10Texture2D* pSRGBTexture = NULL;
    HRESULT hr = DXUTGetD3D10Device()->CreateTexture2D(&srgbDesc, NULL, &pSRGBTexture);
    testHr("GameManager::convertTextureToSRGB:CreateTexture2D", hr);

    // copy the original texture into the SRGB texture
    D3D10_BOX sourceRegion;
    sourceRegion.left = 0;
    sourceRegion.right = origDesc.Width;
    sourceRegion.top = 0;
    sourceRegion.bottom = origDesc.Height;
    sourceRegion.front = 0;
    sourceRegion.back = 1;
    DXUTGetD3D10Device()->CopySubresourceRegion(pSRGBTexture, 0, 0, 0, 0, srcTexture, 0, &sourceRegion);

    return pSRGBTexture;
}
[/CODE]
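For background on the washed-out colours: tagging the texture with an `_SRGB` format makes the hardware decode each channel from sRGB to (approximately) linear before filtering and blending, so the shading maths happens in linear light. The standard per-channel decode can be sketched self-contained as:

```cpp
#include <cmath>

// Standard sRGB-to-linear decode for one channel value in [0, 1].
// This is (approximately) what the sampler applies for DXGI *_SRGB formats.
float srgbToLinear(float c) {
    return (c <= 0.04045f) ? c / 12.92f
                           : std::pow((c + 0.055f) / 1.055f, 2.4f);
}
```

Mid-grey 0.5 in sRGB decodes to roughly 0.214 linear, which is why sampling an sRGB image as if it were linear (or vice versa) shifts everything brighter or darker.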