kbaird

Member · Content Count: 10 · Community Reputation: 230 (Neutral)
  1. Do you have any sort of file system? I am still using the effect framework, and it has a nice Effect(Device, byte[]) constructor that works if you can load the bytecode from storage somehow. I'm not familiar with Windows Store restrictions.

     In my own stuff on PC I compile from file, take the compilation result, and save the bytecode out to a file myself. Later I load that file back up and pass the bytes into the Effect constructor, roughly like the sketch below.
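     A minimal sketch of that flow, assuming SharpDX's D3DCompiler wrapper; the file names and the "fx_5_0" profile are placeholders:

     // Compile once (offline or on first run) and save the raw bytecode.
     using (CompilationResult result =
         ShaderBytecode.CompileFromFile("Shaders/Basic.fx", "fx_5_0"))
     {
         byte[] bytes = result.Bytecode;  // implicit byte[] conversion
         File.WriteAllBytes("Shaders/Basic.fxo", bytes);
     }

     // Later, on the platform without compiler access: just load the bytes.
     byte[] byteCode = File.ReadAllBytes("Shaders/Basic.fxo");
     var effect = new Effect(device, byteCode);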
  2. It's been a few years since I've dealt with C++ linker goblins. I switched to C# a while back and only occasionally get back into C/C++ land, so someone else will probably be better on those specific linker errors.

     The usual cause is a missing lib; timeGetTime(), if I remember right, needs winmm.lib. There are also problems when calling conventions differ (declared extern "C" versus C++ name-mangled), and mixing static and dynamic linking can cause all sorts of standard-library trouble. I don't remember any of this fondly.

     In broad terms, a lib is the compiled form of a bunch of code from a project. It could be a project you have the source to, or sometimes you'll just get the lib itself, as with an SDK. Sometimes it will be a small import library that links to a larger DLL. Libs are resolved at link time, after your project compiles.

     Shaders and other assets (textures, meshes, etc.) are loaded or compiled at runtime, after your game starts. So usually they live in a folder structure alongside your built executable, and you load a texture with a path like "Textures/goblin.png" or something (see the sketch below).

     Actually I'm not really sure how the C++ side deals with shaders these days. For myself, I compile only when the shader source files have changed and save out a compiled shader, as mine take a long time to compile.
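     In C#, a minimal sketch of resolving that kind of path relative to the executable rather than the current working directory (the file names are hypothetical):

     // Resolve "Textures/goblin.png" against the built executable's folder
     // so the load works no matter where the game was launched from.
     string exeDir = AppDomain.CurrentDomain.BaseDirectory;
     string texturePath = Path.Combine(exeDir, "Textures", "goblin.png");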
  3. I've been coding games/DirectX for ages and just recently started web development, and I feel exactly the same (except in reverse). :)

     I think starting simple is wise, though. You might even consider using a middleware package to get you going, though that's advice I don't follow myself.

     You might want to take some running samples and pull them apart to see how they work with regard to libs, solutions, project files, etc.

     If you are coming from the web world, you'll want to get used to using a debugger. Debuggers actually work on C++, and you'll have a real call stack to work with. On the GPU side, RenderDoc is a fantastic tool for solving problems and debugging down into shaders.

     Good luck!
  4. Yes, I think the default is to draw a pixel only if its depth is less than the depth already in the buffer. For 2D stuff you'd usually just turn depth testing off (see the sketch below). You can also manually reject a pixel with the clip() instruction in the pixel shader.
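     A minimal SharpDX sketch of turning it off, assuming you already have a Direct3D 11 device and context:

     // Disable the depth test (and depth writes) for 2D drawing.
     var desc = DepthStencilStateDescription.Default();
     desc.IsDepthEnabled = false;
     desc.DepthWriteMask = DepthWriteMask.Zero;

     var noDepth = new DepthStencilState(device, desc);
     context.OutputMerger.SetDepthStencilState(noDepth);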
  5. If you put a breakpoint on the DrawIndexed and see it being called on the second pass, my guess is your render states are to blame.

     I use two passes for drawing point and directional shadows onto my world geometry, so I set my depth function to EQUAL and the BlendOp to REV_SUBTRACT to get the shadow to draw onto what is already there.

     In your case, for spritey stuff, you might want DepthEnable false and a traditional alpha blend (the states vary depending on whether you are premultiplying or not). A rough sketch of both setups follows.
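     A rough SharpDX sketch of the states I mean; the exact values here are assumptions for illustration:

     // Shadow pass: only touch pixels at exactly the depth already laid
     // down, and darken them (REV_SUBTRACT gives dest - src).
     var dsDesc = DepthStencilStateDescription.Default();
     dsDesc.DepthComparison = Comparison.Equal;
     dsDesc.DepthWriteMask = DepthWriteMask.Zero;

     var shadowBlend = BlendStateDescription.Default();
     shadowBlend.RenderTarget[0].IsBlendEnabled = true;
     shadowBlend.RenderTarget[0].SourceBlend = BlendOption.One;
     shadowBlend.RenderTarget[0].DestinationBlend = BlendOption.One;
     shadowBlend.RenderTarget[0].BlendOperation = BlendOperation.ReverseSubtract;

     // Sprite pass: depth off, straight (non-premultiplied) alpha blend.
     var spriteBlend = BlendStateDescription.Default();
     spriteBlend.RenderTarget[0].IsBlendEnabled = true;
     spriteBlend.RenderTarget[0].SourceBlend = BlendOption.SourceAlpha;
     spriteBlend.RenderTarget[0].DestinationBlend = BlendOption.InverseSourceAlpha;
     spriteBlend.RenderTarget[0].BlendOperation = BlendOperation.Add;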
  6. Input layouts are probably not something you want to allocate on the fly per draw call like that; create them once and reuse them (a caching sketch follows).

     Other than that, you might check localpass.IsValid. I've had cases where everything compiled but IsValid was false, and I can't remember what caused it.

     Also, don't you need a vertex shader for your second pass?
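     A minimal SharpDX sketch of caching layouts instead; the string-key scheme is just an illustration:

     // One InputLayout per vertex-shader signature / element combination,
     // created on first use and reused for every subsequent draw.
     var layoutCache = new Dictionary<string, InputLayout>();

     InputLayout GetLayout(Device device, ShaderBytecode vsCode,
                           InputElement[] elements, string key)
     {
         if (!layoutCache.TryGetValue(key, out InputLayout layout))
         {
             layout = new InputLayout(device,
                 ShaderSignature.GetInputSignature(vsCode), elements);
             layoutCache[key] = layout;
         }
         return layout;
     }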
  7. [SharpDX] Draw UI elements to screen as quads

     I wrote a really quick and dirty UI module that does sort of #2 above. I keep a dictionary of "gumps", which are just a quad with a texture, scale, position, and color. I check a dirty flag every update and rebuild a dynamic vertex buffer from the gump dictionary.

     I draw each gump one at a time with an orthographic projection matrix and identity for the view. The shader takes a 2D position and a 2D UV and just multiplies the texture by the color. A trimmed sketch of the idea is below.

     It works for really basic stuff, but there's no concept of draw order, so complicated stuff and alpha probably won't work. I made it just to display some static images really quickly.

     I can paste some of the code if you'd like, but there's a bit of cruft in there from my material library.
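     A trimmed, illustrative sketch; the type and member names are my cleaned-up assumptions, not the module as it actually exists:

     // One "gump": a textured quad with position, scale, and tint.
     class Gump
     {
         public Texture2D Texture;
         public Vector2 Position;  // screen-space pixels
         public Vector2 Scale;     // size of the unit quad
         public Color4 Color;      // multiplied with the texture in the shader
     }

     Dictionary<string, Gump> mGumps = new Dictionary<string, Gump>();
     bool mbDirty;

     void Update()
     {
         if (!mbDirty) return;
         mbDirty = false;
         // ...rebuild the dynamic vertex buffer here from mGumps:
         // map with MapMode.WriteDiscard, write (pos.xy, uv.xy) per vert...
     }

     // Ortho projection with a top-left pixel origin; view stays identity.
     Matrix proj = Matrix.OrthoOffCenterLH(0f, screenWidth, screenHeight, 0f, 0f, 1f);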
  8. I had trouble resizing for a while as well, but I'm not entirely sure it was the same problem. Mine turned out to be some SRVs I had forgotten about.

     I tracked it down by using the DebugName property on SharpDX's SRVs; I think C++ has something similar (the D3D debug object name). I can't remember if I had to call something to get the debug spew to report it or if it just happens (a sketch is below). This link has some info: http://blogs.msdn.com/b/chuckw/archive/2012/11/30/direct3d-sdk-debug-layer-tricks.aspx
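     A minimal SharpDX sketch of wiring that up; whether you need the explicit report call depends on your setup, so treat that part as an assumption:

     // Create the device with the debug layer on, and name the suspects.
     var device = new Device(DriverType.Hardware, DeviceCreationFlags.Debug);
     srv.DebugName = "Backbuffer-sized SRV";

     // Force a live-object report; leaked objects show up with their names.
     using (var debug = device.QueryInterface<DeviceDebug>())
     {
         debug.ReportLiveDeviceObjects(ReportingLevel.Detail);
     }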
  9. I've done Collada skinned stuff for XNA and DirectX, and both times I had a lot of coordinate-system problems. You appear to have the inside-out base mesh stuff figured out, and looking at my code there are no tricks with rotation keys that I can see. Oddly, I don't do any swapping or negating on translation keys for DirectX.

     I do a lot of strange stuff to the inverse bind pose matrices. Loading them from the Collada float array, I fill M11, M21, M31, M41, M12, M22, etc., which amounts to a transpose as you load. Then I invert, apply the right-hand-to-left-hand conversion below (which I think I got from this very forum), multiply by a rotation of pi/2 in X, then invert back. Sounds odd, but it works from Max-land to DirectX-land.

     public static void RightHandToLeft(ref Matrix mat)
     {
         // Negate the third row and third column (M33 is negated twice,
         // so it ends up unchanged).
         mat.M31 = -mat.M31;
         mat.M32 = -mat.M32;
         mat.M33 = -mat.M33;
         mat.M34 = -mat.M34;
         mat.M13 = -mat.M13;
         mat.M23 = -mat.M23;
         mat.M33 = -mat.M33;
         mat.M43 = -mat.M43;
     }

     I do two things that make my implementation not really Collada compliant: I assert that the up axis of the file is Z and that the bind shape matrix is identity. I do that so I can use a single set of inverse bind poses per character, including all of the outfits and extra parts. This means I have one set of bones to feed to the shader, and I can compute bone bounds, animate, and so on once.

     Looking at your code, it looks like you do the inverseBind multiply over all bones, then do a parent multiply below, but I can't really follow what that is going to do to the bones in my noggin. I have a recursive routine that multiplies by parents on the way out, then multiplies the inverse bind pose by the retrieved matrix, like matrix = inverseBindPose * matrix. Order matters. (A sketch of that routine follows this post.)

     I think I do this because I sort of do animation in the coordinate system of whatever exported it, in my case Max. That inverse bind pose multiply sort of gets things back into a DirectX-friendly coordinate system because of the aforementioned right-to-left conversion. The resulting set of bones is what I pass into the shader.

     Shader side, I compute a skin transform and do the position multiply once, but I'm not sure it matters for correctness. You'll need it for the normals anyway.

     //ints for > sm2
     float4x4 GetSkinXForm(int4 bnIdxs, half4 bnWeights, float4x4 bones[MAX_BONES])
     {
         float4x4 skinTransform = bones[bnIdxs.x] * bnWeights.x;
         skinTransform += bones[bnIdxs.y] * bnWeights.y;
         skinTransform += bones[bnIdxs.z] * bnWeights.z;
         skinTransform += bones[bnIdxs.w] * bnWeights.w;
         return skinTransform;
     }

     //skin pos and normal
     VVPosNorm ComputeSkin(VPosNormBone input, float4x4 bones[MAX_BONES])
     {
         VVPosNorm output;
         float4 vertPos = float4(input.Position, 1);

         //generate the world-view-proj matrix
         float4x4 wvp = mul(mul(mWorld, mView), mProjection);

         //do the bone influences
         float4x4 skinTransform = GetSkinXForm(input.Blend0, input.Weight0, bones);

         //xform the vert to the character's boney pos
         vertPos = mul(vertPos, skinTransform);

         //transform the input position to the output
         output.Position = mul(vertPos, wvp);

         //skin transform the normal
         float3 worldNormal = mul(input.Normal.xyz, skinTransform);

         //world transform the normal
         output.Normal = mul(worldNormal, mWorld);

         return output;
     }
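     A minimal sketch of that recursive routine, with names assumed for illustration (the real bone storage will differ):

     // Walk up the hierarchy, multiplying by parents on the way out.
     Matrix GetBoneWorld(int boneIndex)
     {
         Matrix mat = mLocalKeys[boneIndex];  // animated local transform
         int parent = mParents[boneIndex];    // -1 at the root

         if (parent >= 0)
         {
             mat *= GetBoneWorld(parent);
         }
         return mat;
     }

     // Per bone, order matters: inverse bind pose on the left.
     Matrix boneForShader = mInverseBindPoses[idx] * GetBoneWorld(idx);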
  10. It kind of feels like you might be having a problem with the intersection of two adjoining planes when each is pushed out by a radius. The sphere can actually be a little off from a radius-pushed-out plane near those edges (a sketch of the inaccuracy is below). See the potential-inaccuracy and beveling sections of this old article: http://www.gamasutra.com/view/feature/3099/bsp_collision_detection_as_used_in_.php?print=1
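      A minimal sketch of the pushed-out-plane test and where the slop comes from; the names here are assumptions:

      // Sphere vs a plane n·p = d that has been pushed out by the radius.
      bool OutsidePushedPlane(Vector3 center, float radius, Vector3 n, float d)
      {
          return Vector3.Dot(n, center) > d + radius;
      }

      // A convex solid expanded this way keeps sharp edges, while the true
      // swept volume has rounded (Minkowski-sum) edges. Near an edge where
      // two planes meet, a sphere can be behind both pushed-out planes yet
      // outside the real rounded surface, reporting a hit that isn't there.
      // Bevel planes across those edges trim that extra sliver off.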