
Bitem2k

Everything posted by Bitem2k

  1. Does anyone know how I would go about combining multiple meshes, each with its own vertex buffer, into a single BSP/octree? My level consists of many different meshes, for instance walls, floors, tables, etc. All the examples I have found on the net want a single list of polygons to build the tree from. Thanks in advance.
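The "single list of polygons" step can be sketched in plain C++. The types below (`Vec3`, `MeshData`, `Triangle`) are hypothetical stand-ins for the actual D3D mesh/vertex classes, and the sketch assumes each mesh's vertices have already been transformed into world space:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical minimal types standing in for the D3D vertex/mesh data.
struct Vec3 { float x, y, z; };

struct Triangle {
    Vec3 v[3];
    int  objectId;   // which source mesh this face came from
};

// One loaded mesh: its (already world-transformed) vertices plus indices.
struct MeshData {
    std::vector<Vec3>     vertices;
    std::vector<unsigned> indices;  // 3 per face
};

// Flatten every mesh into one "polygon soup" list, tagging each
// triangle with the index of the mesh it came from.
std::vector<Triangle> BuildPolygonSoup(const std::vector<MeshData>& meshes) {
    std::vector<Triangle> soup;
    for (std::size_t m = 0; m < meshes.size(); ++m) {
        const MeshData& mesh = meshes[m];
        for (std::size_t i = 0; i + 2 < mesh.indices.size(); i += 3) {
            Triangle t;
            t.v[0] = mesh.vertices[mesh.indices[i]];
            t.v[1] = mesh.vertices[mesh.indices[i + 1]];
            t.v[2] = mesh.vertices[mesh.indices[i + 2]];
            t.objectId = static_cast<int>(m);
            soup.push_back(t);
        }
    }
    return soup;
}
```

The tree builder then only ever sees one flat triangle list, and the `objectId` tag lets you map a face back to the wall/floor/table mesh it came from.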
  2. Bitem2k

    Building Bsp/Octrees

    Cool, I will check that out. Thanks.
  3. Bitem2k

    Building Bsp/Octrees

    I'm so very stuck that I'm going to be forced to purchase many books! Thanks for your help, though.
  4. Bitem2k

    Building Bsp/Octrees

  4. OK, sorry to sound horrendously stupid here (I have searched the internet long and hard for information on how to implement an octree). Say I concatenate the objects' vertex data into one list: what would the method that generates the octree look like? The way I understood it, the octree is generated from an arranged list of polygons. If I just add the vertices onto the end of the list, will this work? How does the algorithm know where to place the polygons in relation to the octree? Thank you very much for helping me out, as I'm close to pulling my hair out!
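To make "how does the algorithm know where to place the polygons" concrete: the tree does not need the list pre-arranged; each face is routed by its own geometry. A generic centroid-based insertion (a sketch with plain structs, not anyone's actual engine code) descends toward the child cube containing the triangle's centroid:

```cpp
#include <memory>
#include <vector>

struct Vec3 { float x, y, z; };
struct Triangle { Vec3 v[3]; };

static Vec3 Centroid(const Triangle& t) {
    return { (t.v[0].x + t.v[1].x + t.v[2].x) / 3.0f,
             (t.v[0].y + t.v[1].y + t.v[2].y) / 3.0f,
             (t.v[0].z + t.v[1].z + t.v[2].z) / 3.0f };
}

// An octree node is a cube (center + half-size) with up to 8 children.
struct OctreeNode {
    Vec3 center{};
    float halfSize = 0.0f;
    std::vector<Triangle> triangles;          // faces stored at this node
    std::unique_ptr<OctreeNode> children[8];  // null until subdivided
};

// Descend toward the octant containing the triangle's centroid,
// creating children on demand, until maxDepth; store the face there.
void Insert(OctreeNode& node, const Triangle& tri, int depth, int maxDepth) {
    if (depth >= maxDepth) { node.triangles.push_back(tri); return; }
    Vec3 c = Centroid(tri);
    int octant = (c.x > node.center.x ? 1 : 0)
               | (c.y > node.center.y ? 2 : 0)
               | (c.z > node.center.z ? 4 : 0);
    if (!node.children[octant]) {
        float h = node.halfSize * 0.5f;
        auto child = std::make_unique<OctreeNode>();
        child->center = { node.center.x + ((octant & 1) ? h : -h),
                          node.center.y + ((octant & 2) ? h : -h),
                          node.center.z + ((octant & 4) ? h : -h) };
        child->halfSize = h;
        node.children[octant] = std::move(child);
    }
    Insert(*node.children[octant], tri, depth + 1, maxDepth);
}
```

So appending all the vertices from every mesh onto one list does work as input; the spatial sorting happens entirely inside `Insert`. (Real builders also handle triangles straddling an octant boundary, by splitting them or storing them at the parent.)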
  5. How would I go about turning multiple meshes loaded from .x files into polygon soup (i.e. combining their vertex buffers)? I wish to create an octree from the resulting polygon list. Thanks in advance.
  6. Bitem2k

    Building Bsp/Octrees

    Quote: Original post by head_hunter — "What you could do is merge all vertex lists into one and keep an extra list with object IDs. That way you can assign vertices to the proper nodes (assuming you are building an octree) and still know to which object they belong. You should store the object IDs in the nodes, as you will probably need them later on."

    That makes sense, but how would I go about creating the combined list? Thanks.
  7. Quote: Original post by Amma — "Trust me, the mentioned suggestions don't work ... I think I've tried all vector/matrix conversions out there. The problem is that the matrices hold scale information which cannot be demangled once it's in the matrix. Instead (and this is the answer I looked all around for!) you should use the IGame API calls. There's plenty of information on it in the 3ds Max SDK. You simply specify which API (OpenGL/DX/User) you want the coordinates to be in, and it's automatically computed for you!"

     I'm not too sure about his exact problem, and I'm far, far from being an expert; however, the solution I posted was an answer I received from someone for my particular problem. My problem was that I needed to export camera locations from 3ds Max into my game engine (DX9), and swapping the Y and Z values did indeed work, at least for that problem.
  8. Bitem2k

    Building Bsp/Octrees

    Quote: Original post by 5MinuteGaming — "How are you storing the polygon data or face data from the vertex buffers?"

    Hi, thanks for replying. At the moment I'm stuck and cannot proceed until I know what I'm doing (to save time later if I choose the wrong solution). As I stated before, I intend to have multiple .x files making up my world, but as of yet I've no idea how to implement it! Any ideas would be appreciated. Thanks.
  9. All you have to do is swap the Y and the Z position, i.e. (X, Y, Z) becomes (X, Z, Y). Hope this helps.
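As a one-liner this is easy to get backwards, so here is the swap spelled out (plain C++, `Vec3` being a hypothetical stand-in for whatever vector type the engine uses). It converts 3ds Max's Z-up convention to Direct3D's Y-up convention:

```cpp
struct Vec3 { float x, y, z; };

// Convert a point from 3ds Max's Z-up convention to Direct3D's
// Y-up convention by swapping the Y and Z components.
Vec3 MaxToD3D(const Vec3& p) {
    return { p.x, p.z, p.y };
}
```

Note this handles positions only; full transforms (rotations, handedness of winding order) need more care, which is what the IGame suggestion above is about.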
  10. Bitem2k

    BSP trees

    Thanks for replying. Yeah, I have already looked at that, but it doesn't help with my problem.
  11. Bitem2k

    BSP trees

    Here's my problem: I'm writing a game engine and I wish to implement a BSP tree for collision detection and culling. My levels will consist of a number of .x file meshes that have been positioned in my level editor. Once I have all the meshes loaded, how do I add them to a BSP tree? All the examples I have seen seem to require a list of polygons. If I retrieve the polygons from the vertex buffer of each mesh, how do I know where to put them in the tree? Thanks in advance.
  12. Bitem2k

    Scene Management/Collision Detection

    Thanks for the reply! OK, what you say about the material makes sense! However, if I were using the tree not just for collision detection but also for frustum culling, and all my data were in a vertex buffer, would that mean I would have to 'hand' draw the visible bits of mesh using DirectX's DrawPrimitives method?
  13. There are a few things that I still don't understand with regard to the management of my scene:

     1) When dividing my scene into some sort of tree (BSP, octree, etc.), where do I get the data from? Do I retrieve it from a vertex buffer? Or am I meant to have a collection of object meshes that I put into a tree?

     2) Once I have that data in a tree, when doing collisions against the level terrain, how do I know which part of the terrain I've hit (i.e. wall, floor, hole)?

     If someone could explain these to me I would appreciate it greatly, as I am stumped! Thanks
  14. Sorry, I'm not sure about the picking; however, there are many articles on this subject. Do a search on this forum and I'm sure you will find something useful. U and V are simply texture coordinates, not actual world-space coordinates; that is why they range from 0 to 1. You may think of U as being X and V as being Y. The texture is stretched across the surface of polygons/triangles using these coordinates. Hope this helps.
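To make the "0 to 1" range concrete: because U and V are normalized, finding which texel a coordinate pair addresses is just a scale by the texture's dimensions. A small sketch (one common convention; names are illustrative, and real samplers add filtering and wrap modes on top of this):

```cpp
struct Texel { int x, y; };

// U and V are normalized texture coordinates in [0, 1]; to find the
// texel they address, scale by the texture's dimensions.
// Uses the "last texel at u == 1.0" convention; real samplers also
// apply filtering and wrap/clamp modes.
Texel UvToTexel(float u, float v, int width, int height) {
    return { static_cast<int>(u * (width - 1)),
             static_cast<int>(v * (height - 1)) };
}
```

So (0, 0) is one corner of the image and (1, 1) the opposite corner, regardless of whether the texture is 64x64 or 2048x2048.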
  15. Bitem2k

    About moving a 'mesh'

    When you draw in Direct3D, a mesh object has no position of its own. You must set the world transform to the place where you want the mesh to appear and then draw it, and this has to be repeated every frame. You can think of the world transform as a kind of marker within 3D space that tells DirectX where to draw the next mesh. It starts off equal to the identity matrix, which corresponds to the origin (0, 0, 0).

    You may multiply different matrices together to combine transforms; for instance, you may multiply a rotation matrix by a translation matrix to first rotate and then move this 'marker'. Once you have moved the draw position, you may draw your mesh.

    Say you then want to draw the same mesh in a different position/rotation. You would set the world transform back to the identity matrix to move the draw location back to the origin, use another matrix (or several combined) to move it again, then call the mesh's DrawSubset method (or the equivalent in DX8). Important: the same mesh object can be used over and over again to draw copies of the mesh in world space. You DON'T need multiple instances of the same mesh class to draw several copies of the same mesh; you may re-use the original.

    I have done my best to explain this as well as I can; if you're still stuck, don't hesitate to instant message me. I know how frustrating learning the basics is! Hope this helps. [Edited by - Bitem2k on August 2, 2005 7:50:21 AM]
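The rotate-then-translate composition described above can be sketched with a tiny row-major matrix type (plain C++, not the D3DX types). Direct3D uses row vectors, so in `Multiply(R, T)` the rotation applies first, then the translation:

```cpp
#include <cmath>

// Row-major 4x4 matrix, as Direct3D uses (row vectors: v' = v * M).
struct Mat4 { float m[4][4]; };

Mat4 Identity() {
    Mat4 r{};
    for (int i = 0; i < 4; ++i) r.m[i][i] = 1.0f;
    return r;
}

Mat4 Multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                r.m[i][j] += a.m[i][k] * b.m[k][j];
    return r;
}

// Rotation about the Y axis (same layout as D3DXMatrixRotationY).
Mat4 RotationY(float angle) {
    Mat4 r = Identity();
    r.m[0][0] =  std::cos(angle); r.m[0][2] = -std::sin(angle);
    r.m[2][0] =  std::sin(angle); r.m[2][2] =  std::cos(angle);
    return r;
}

// Translation lives in the fourth row for row-vector conventions.
Mat4 Translation(float x, float y, float z) {
    Mat4 r = Identity();
    r.m[3][0] = x; r.m[3][1] = y; r.m[3][2] = z;
    return r;
}
```

Per instance you would compute something like `Mat4 world = Multiply(RotationY(angle), Translation(10, 0, 0));`, set it as the device's world transform, draw the mesh, then build a fresh matrix (starting again from identity) for the next copy of the same mesh.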
  16. Hi, this question is similar to one that I asked, and Mr Robert Dunlop replied. Here's what was said.

     By me — Quote: "Does anyone know how I can make a mesh look in the direction it's 'walking'? I want it to face a given vector. At the moment I can rotate my model using a transform matrix and then use the m31, m32, m33 components of this matrix to determine which direction it faces; however, this doesn't help me with the above problem."

     By Robert Dunlop — Quote: "Actually, it is the same problem in reverse; you can construct a matrix by the same means, though you also need to figure an 'up' and 'right' direction for the mesh. Then you would put the forward vector in m31, m32, m33, the up vector in m21, m22, m23, the right vector in m11, m12, m13, and the location in m41, m42, m43 (and of course set the last column, m14, m24, m34, m44, to 0, 0, 0, 1). You can calculate the right and up vectors in a manner similar to that used by Matrix.LookAtLH. Assuming your object will stay within a pitch range of +/- 90 degrees and does not need to roll (otherwise you would need more info than a forward vector), you can calculate the matrix like this (pseudocode):

     forward (m31,m32,m33) = normalize(forward)
     right (m11,m12,m13) = normalize(cross(vector(0,1,0), forward))
     up (m21,m22,m23) = cross(forward, right)
     (m41,m42,m43) = location

     -- Robert Dunlop, The X-Zone, http://www.directxzone.com/, Microsoft DirectX MVP"
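The pseudocode above translates almost line for line into code. A self-contained C++ sketch (plain structs instead of the MDX `Matrix`/`Vector3` types; the function name `LookAlong` is made up here), valid under the same assumptions, no roll and pitch within +/- 90 degrees:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Mat4 { float m[4][4]; };

static float Dot(const Vec3& a, const Vec3& b) {
    return a.x*b.x + a.y*b.y + a.z*b.z;
}
static Vec3 Cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static Vec3 Normalize(const Vec3& v) {
    float len = std::sqrt(Dot(v, v));
    return { v.x/len, v.y/len, v.z/len };
}

// Build a row-major world matrix whose rows are right/up/forward/position,
// from a forward direction alone (no roll, pitch within +/- 90 degrees),
// following the m11..m43 layout described in the quoted post.
Mat4 LookAlong(Vec3 forward, const Vec3& position) {
    forward = Normalize(forward);
    Vec3 right = Normalize(Cross(Vec3{0, 1, 0}, forward));
    Vec3 up = Cross(forward, right);
    Mat4 r{};
    r.m[0][0] = right.x;    r.m[0][1] = right.y;    r.m[0][2] = right.z;
    r.m[1][0] = up.x;       r.m[1][1] = up.y;       r.m[1][2] = up.z;
    r.m[2][0] = forward.x;  r.m[2][1] = forward.y;  r.m[2][2] = forward.z;
    r.m[3][0] = position.x; r.m[3][1] = position.y; r.m[3][2] = position.z;
    r.m[3][3] = 1.0f;
    return r;
}
```

Note the normalize on `right` is what keeps the basis orthonormal when the forward vector is not horizontal; with a forward vector pointing straight up or down the cross product degenerates, which is exactly the +/- 90 degree pitch caveat.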
  17. Hi, I'm using a vertex buffer to draw normal lines efficiently. I'm writing chunks of data to a vertex buffer and then drawing it to screen. This is working quite nicely; however, some lines appear to jump around like lasers. All the lines are in the correct places, it's just that they seem to be drawn before they're in position! This must be something to do with my code, but I can't see anything wrong with it. Can anyone see a problem here?

```csharp
//Variables
private VertexBuffer vb;
private static int maxVerts = 1024;
private static int flushSize = maxVerts / 4;
private static int flushLockSize = flushSize * CustomVertex.PositionColored.StrideSize;
private VertexFormats format = CustomVertex.PositionColored.Format;
private int formatStride = CustomVertex.PositionColored.StrideSize;

public void Draw()
{
    int vertsToRender = 0;

    if (vb == null)
        this.device_DeviceReset(this, null);

    device.SetStreamSource(0, vb, 0, formatStride);
    device.VertexFormat = CustomVertex.PositionColored.Format;

    if (nextVert > 0)
        nextVert += flushSize;
    if (nextVert > maxVerts)
    {
        //Reset
        nextVert = 0;
    }

    GraphicsStream vertexData = vb.Lock(nextVert * formatStride, flushLockSize,
        (nextVert != 0 ? LockFlags.NoOverwrite : LockFlags.Discard));

    foreach (Ray r in rays)
    {
        //Write position and colour
        vertexData.Write(r.Position);
        vertexData.Write(Color.Yellow.ToArgb());
        vertexData.Write(r.EndPosition);
        vertexData.Write(Color.Yellow.ToArgb());
        vertsToRender += 2;

        if (vertsToRender == flushSize)
        {
            //Unlock and draw
            vb.Unlock();
            device.DrawPrimitives(PrimitiveType.LineList, nextVert, (vertsToRender / 2));
            nextVert += flushSize;
            if (nextVert > maxVerts)
            {
                nextVert = 0;
            }
            vertexData = vb.Lock(nextVert * formatStride, flushLockSize,
                nextVert != 0 ? LockFlags.NoOverwrite : LockFlags.Discard);
            vertsToRender = 0;
        }
    }

    vb.Unlock();

    //Render remaining
    if (vertsToRender > 0)
    {
        device.DrawPrimitives(PrimitiveType.LineList, nextVert, vertsToRender / 2);
    }
}
```

Thanks
  18. Does anyone know how to get a list of normals and positions of faces from a loaded mesh? I'm looking to render the normals on a model, like the DirectX SDK MeshViewer does. At the moment I'm looping through the vertex buffer, and I think I'm getting the normals, but I cannot retrieve the correct position to start drawing them on the model. Thanks
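For reference, once the three vertices of a face are in hand, the face normal and the position to draw the line from reduce to a cross product and an average. A plain C++ sketch (stand-in `Vec3`, not the MDX types; normal direction follows the winding order of the vertices):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Sub(const Vec3& a, const Vec3& b) {
    return { a.x - b.x, a.y - b.y, a.z - b.z };
}
static Vec3 Cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static Vec3 Normalize(const Vec3& v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}

// Face normal (from winding order) and face center: the center is
// where the normal line starts, center + normal * scale is its end.
void FaceNormalAndCenter(const Vec3& a, const Vec3& b, const Vec3& c,
                         Vec3& normal, Vec3& center) {
    normal = Normalize(Cross(Sub(b, a), Sub(c, a)));
    center = { (a.x + b.x + c.x) / 3.0f,
               (a.y + b.y + c.y) / 3.0f,
               (a.z + b.z + c.z) / 3.0f };
}
```

Computing the normal from the positions this way sidesteps the stored per-vertex normals entirely, which can help tell apart "wrong normals" from "wrong positions" when debugging.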
  19. Bitem2k

    Rendering Normals

    At last I have cracked it! It seems that my problem was due to the 3ds Max exporter that comes with the DirectX SDK. When I use that to export, the vertex positions don't seem to load in my program correctly; when I use the Panda exporter, all the normals' positions are correct. This is very strange, as the MeshViewer app will render the exported versions from both exporters with no problems. Armed with the knowledge that the vertex positions must be correct (because the models render on screen correctly), I figured it must be something to do with the fact that the MS DX SDK exporter uses a different vertex format.

```csharp
Mesh mesh = loadedMesh.Clone(MeshFlags.Dynamic, CustomVertex.PositionNormalTextured.Format, xdata.Mesh.Device);
```

    Adding this line fixed all my problems! Now when I read from the vertex buffer, it's using the same vertex format, so it works! Thanks to all that replied.
  20. Bitem2k

    Rendering Normals

    Yeah, that was just the code I posted! The project does contain the modification of dist to dir (sorry). The end colour has been changed to the start colour intentionally (to match the single yellow colour used in the MeshViewer). Thanks
  21. Bitem2k

    Rendering Normals

    OK, I've taken your advice and removed the face i % 3 thing. I don't think it's due to matrices, as I have cleaned my code of Transform.World calls, plus some of the normals are in the right place. One thing I have noticed is that when I change the vertex format of the vertex buffer, I get slightly different incorrect results, i.e. `device.VertexFormat = CustomVertex.PositionNormal.Format;` gives a different result than `device.VertexFormat = CustomVertex.PositionNormalTextured.Format;`, although for both of them the normals that are correctly positioned stay the same (in the correct place); it's only the normals in the middle that move. Here is the code used to draw the normal:

```csharp
//Work out verts
int startCol = Color.Yellow.ToArgb(), endCol = Color.Red.ToArgb();
float scale = 5;
Vector3 start = position;
Vector3 dir = direction;
dir *= scale;
Vector3 end = start + dist;
int i = 0;
verts[i++] = new Microsoft.DirectX.Direct3D.CustomVertex.PositionColored(start.X, start.Y, start.Z, startCol);
verts[i++] = new Microsoft.DirectX.Direct3D.CustomVertex.PositionColored(end.X, end.Y, end.Z, startCol);
```

    Here are the screenshots of what I mean when I say some are positioned correctly (see the ones round the boxes and the tops of the walls). Here is my C# project. Thanks for your help.
  22. Bitem2k

    Rendering Normals

    I'm not using any transforms at all, just Matrix.Identity. I don't think the problem lies there, because some of the normal lines (the ones on the back wall) are positioned correctly. I just can't find why some work and some don't!
  23. Bitem2k

    Rendering Normals

    Hi, thanks for your reply. I'm using Managed DirectX 9 in C#. I turned off the rendering of the model, and now I can see that only some of my normals are rendering correctly! Does anyone know what is wrong with the following code?

```csharp
public Vector3[] normals = null;
public Vector3[] positions = null;

public void CreateNormals()
{
    Mesh mesh = xdata.Mesh;
    normals = new Vector3[mesh.NumberFaces];
    positions = new Vector3[mesh.NumberFaces];

    using (IndexBuffer ib = mesh.IndexBuffer)
    {
        short[] faces = ib.Lock(0, typeof(short), LockFlags.None, mesh.NumberFaces * 3) as short[];
        using (VertexBuffer vb = mesh.VertexBuffer)
        {
            CustomVertex.PositionNormalTextured[] verts = vb.Lock(0,
                typeof(CustomVertex.PositionNormalTextured),
                LockFlags.None, levelMesh.NumberVertices) as CustomVertex.PositionNormalTextured[];

            //Now find each face vertex position
            for (int i = 0; i < verts.Length; i++)
            {
                if (i % 3 == 0)
                {
                    CustomVertex.PositionNormalTextured d;
                    d = verts[faces[i]];
                    normals[i / 3] = d.Normal;
                    positions[i / 3] = d.Position;
                    System.Diagnostics.Debug.WriteLine(d.ToString());
                }
            }
            vb.Unlock();
        }
        ib.Unlock();
    }
}
```

    Here is a picture of what I mean. Some of the normals are deposited in the center of the level, which is all wrong; this is not the case if you view it in MeshViewer. The ones shown in the background are in the correct place, though. Why would this be? Any ideas?
    Quote: Original post by sirob — "BTW, you can use Image Shack to host images for free. They only keep them for like 60 days, but mostly you don't need more than that. As for downloadable files, check out Save File. They host files up to 60 MB for free and delete them after 30 days of inactivity. There is no bandwidth limit."

    Sounds pretty good to me :). Thanks, I'll certainly look into them next time I post images.