pristondev

3D (DX9) Octree Problem with a Big Static Mesh - Urgent Help


Hey, I'm trying to use this Octree sample from Davig Ang: https://www.programmingmind.com/demo/basic-octree

What happens is that it works perfectly with small and medium meshes, but with a really big mesh it doesn't.

 

The mesh I want to use:

[screenshot: 01a31144cb.png]

What happens on Octree App:

[screenshot: 32b881d7e1.png]

 

As I said, this happens ONLY with big meshes; small meshes like the one in the sample all work fine. I already tried changing the #defines, since it's a big mesh, but nothing works.

Thanks for any help.


I deleted most of the map objects to turn it into a small mesh, but I got the same problem.

[screenshot: 4798505ad9.png]

 

With this mesh (example), everything works fine.

[screenshot: 6217eb6061.png]

 

Loading the mesh outside of this octree system (e.g., in the DirectX mesh viewer), it displays normally.



Probably the problem is here:

// Copy over all the faces and groups from the mesh
unsigned int  *indices    = NULL;
unsigned long *attributes = NULL;

newMesh->LockIndexBuffer( D3DLOCK_READONLY, (void**)&indices );
newMesh->LockAttributeBuffer( D3DLOCK_READONLY, &attributes );

for( DWORD i = 0; i < totalFaces; i++ )
{
    mFaces[i].v1 = *indices++;
    mFaces[i].v2 = *indices++;
    mFaces[i].v3 = *indices++;
    mFaces[i].group = attributes[i];   // Store the group (attribute) index - this is where the grouping happens

    mGroups[attributes[i]].numFaces++; // Keep count of the number of faces
}

newMesh->UnlockIndexBuffer();
newMesh->UnlockAttributeBuffer();

I'm getting a problem with mGroups on my meshes, so CreateIndexBuffer fails and the program crashes. To get the program to run before, I was simply skipping any group whose mGroups.indexBuffer was NULL, but I think that's the real problem.

 

Someone?

