Community Reputation

122 Neutral

About razialx

  1. The first thing I thought of is, as it turns out, how Oblivion does it: develop a time-based schedule for each entity, so that when an entity enters a certain range of the player (a sphere or an AABB) it can be placed _roughly_ where it should be, doing what it should be doing. From there, micromanagement can take over.

     An easy (overly simplified) view: Entity TIM has a schedule like

         T:0       Get up
         T:100     Go downstairs
         T:300     Leave house
         T:400     Get in car
         ...
         T:168300  Arrive home
         T:168400  Go to bed

     When that entity goes out of range of the player, we store the time at which it stopped being active (these are relative times; interactions will take entities off their course, so you can't expect them to follow the schedule while under micromanagement). When Entity TIM becomes active again, we take the difference in time to find where the entity should be.

     This is extremely simple, of course, and you could expand on it by doing a cursory evaluation of a tree of interactions to determine where things would end up. That is a computation hit when entities become active, but it allows entities that are not active to have no effect on your CPU load. Such a tree could involve decisions evaluated with heuristics or just randomness, so things would not be as predictable.

     Of course, I don't know the nature of your entities. If this were an RTS-type game involving flight, for example, you could have decision trees for what the entities would have built in your absence. If I am way off here, please let me know; I just ramble off the top of my head at times. Good luck! Tim
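     A minimal C++ sketch of that schedule lookup, just to make the idea concrete (ScheduleEntry, Entity, and CatchUp are hypothetical names, not from Oblivion or any particular engine):

     ```cpp
     #include <string>
     #include <vector>

     // One timestamped step in an entity's day (relative time, e.g. game seconds).
     struct ScheduleEntry {
         int         time;    // when this activity starts
         std::string action;  // e.g. "Get up", "Leave house"
     };

     struct Entity {
         std::vector<ScheduleEntry> schedule;   // sorted by time
         int  deactivatedAt = 0;                // schedule time when the player last left range
         bool active        = false;
     };

     // When the entity re-enters the player's range, advance its stored time by the
     // elapsed interval and find the schedule step it should roughly be on, then let
     // normal (micromanaged) AI take over from there.
     const ScheduleEntry* CatchUp(Entity& e, int elapsed) {
         const int now = e.deactivatedAt + elapsed;
         const ScheduleEntry* current = nullptr;
         for (const ScheduleEntry& step : e.schedule) {
             if (step.time <= now)
                 current = &step;   // last step whose start time has already passed
             else
                 break;
         }
         e.active = true;
         return current;            // place the entity roughly here
     }
     ```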
  2. It really isn't an issue of reinventing the wheel here. For a system like this, you have to plan for it and design around it. Yes, it is very possible to write a library to handle this; however, it is much less reasonable to make that library pluggable into the myriad of 3D engines out there.

     The way that Ogre3D, Unreal technology, Quake, etc. handle animations varies, as does how they use memory and how they use pixel/vertex shaders. With most well-designed engines, you aren't going in and directly altering vertex data except through the API they provide. So you could write this to work with Ogre or Crystal Space, but it would not just 'plug in' to another engine. Video memory is a tricky thing, and how people use it is even trickier. Maybe I am wrong.
  3. Ok. Slow down, hehe, it's hard to read through the typos. What you want is not going to happen, because it isn't something you can just 'plug in' to your code like a library. What he is talking about is the dynamic clothing/shape of models in game. You have to design around this, both in art and in the engine. If any of you have played an MMORPG lately (WoW, EQ2), he wants to know how you create a system where you can change your clothing/equipment/body shape dynamically. So you aren't going to find something ready-made, and I highly doubt someone is just going to start a SourceForge project for you. You should instead learn how it is done and create your own. That is how open source works: you have to be interested, not ask someone else to be interested for you.

     To give you a basic explanation (very basic, I don't have the time for something long and drawn out), you would use a system like this. Imagine you have a triangle:

         |\
         | \
         |__\

     Ok. That is a neutral triangle; think of it as the middle-sized one. You have 3 vertices on it, A B C, which are connected with lines 1 2 3. (This is all an example.) Well, you model that triangle large and small as well, with the same vertices and the same lines connecting them. So in 3D data you could have

         [Vertex]
         0 0 0
         1 1 0
         0 1 0
         [Index]
         1 2
         1 3
         2 3

     which tells us that we have vertices at 0,0,0, at 1,1,0, and at 0,1,0 respectively, and connections between vertex 1 and 2, 1 and 3, and 2 and 3. (It should index from zero, but this is an example!) Then we create a new set of 3D data:

         [Vertex]
         0 0 0
         2 2 0
         0 2 0
         [Index is the same]

     and another:

         [Vertex]
         0 0 0
         .1 .1 0
         0 .1 0
         [Index is the same]

     So we have 3 setups. In this example the scaling is uniform, but in a real model that will usually not be the case (the torso might get larger around the chest, but not just larger overall; that doesn't look good art-wise).

     In your code you can say that a player has a property called ChestScale. If ChestScale is 0.0, you use the first set of 3D data. If it is 1.0 you use the second, and if it is -1.0 you use the third. But what about when it is 0.5, halfway between two of them? You need to find the halfway point between those data sets and use that. In this way you can smoothly scale between different models. I used just one triangle, but this would work for a complex model. Is it a good solution? Doubt it, but it should explain the concepts here. I look forward to your SourceForge project ;) Tim razialx
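     A small C++ sketch of that "find the halfway point between data sets" step (Vec3, BlendChest, and the target names are made up for illustration; a real engine would do this on its own mesh structures):

     ```cpp
     #include <vector>

     struct Vec3 { float x, y, z; };

     // Linear interpolation between two corresponding vertex positions.
     Vec3 Lerp(const Vec3& a, const Vec3& b, float t) {
         return { a.x + (b.x - a.x) * t,
                  a.y + (b.y - a.y) * t,
                  a.z + (b.z - a.z) * t };
     }

     // Blend between three morph targets that share the same index data:
     //   chestScale = -1.0 -> small target, 0.0 -> neutral, +1.0 -> large target.
     std::vector<Vec3> BlendChest(const std::vector<Vec3>& neutral,
                                  const std::vector<Vec3>& largeTarget,
                                  const std::vector<Vec3>& smallTarget,
                                  float chestScale) {
         const std::vector<Vec3>& target = (chestScale >= 0.0f) ? largeTarget : smallTarget;
         const float t = (chestScale >= 0.0f) ? chestScale : -chestScale;

         std::vector<Vec3> blended(neutral.size());
         for (size_t i = 0; i < neutral.size(); ++i)
             blended[i] = Lerp(neutral[i], target[i], t);   // t = 0.5 gives the halfway shape
         return blended;
     }
     ```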
  4. Help on a shader please

    By that you mean: send the data up to the vertex shader in the texture-coordinate slot, but instead of the actual coordinates, send the values by which each vertex should be displaced? Yeah, that sounds pretty solid. I will have to give it a shot... I will let you know, thanks.
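    A rough CPU-side sketch of that packing idea, assuming a hypothetical vertex layout with a spare texture-coordinate channel (none of these names come from the Doom 3 SDK):

    ```cpp
    #include <vector>

    struct Vertex {
        float position[3];
        float normal[3];
        float uv[2];        // regular texture coordinates
        float displace[2];  // spare "texcoord" channel: x = displacement amount, y unused
    };

    // Instead of sampling the blood-stain map inside the vertex shader, bake the
    // per-vertex displacement amount into a spare vertex attribute. The vertex
    // shader then just reads displace.x and moves the vertex along its negated normal.
    void BakeDisplacement(std::vector<Vertex>& verts,
                          const std::vector<float>& perVertexIntensity) {
        for (size_t i = 0; i < verts.size(); ++i) {
            verts[i].displace[0] = perVertexIntensity[i];  // 0 = untouched skin, 1 = fully blown away
            verts[i].displace[1] = 0.0f;
        }
    }
    ```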
  5. Help on a shader please

    Wow, that is amazingly awful... Hmm... Can you guys think of any kind of alternative I could use?

    The effect I am going for is this: take your standard zombie and shoot him in the face. There is quite a big blood splat, which is very rewarding :) I wanted that blood splat (or a similarly shaped, downsized texture) to cause the vertices it affects to scale backwards along each vertex's normal. This would expose the skull static mesh that I have linked to the skeleton. End effect: it looks like you blew the skin off the face to expose the underlying bone. I think the effect would be great, and I figured it would be easy to implement. Geeze, I just figured I was doing something wrong trying to do a texture lookup in the vertex shader section.

    What about some kind of preprocessing to pack the data into the vertex information? Perhaps hijack the binormal or some other facet...

    Also, I considered just using alpha blending for the effect on lower-end cards, but I couldn't get the shader to actually alpha out the pixels. I would set the alpha channel of each pixel based on the map's intensity, yet they would still be drawn fully opaque. Ugh. I need a good reference, and I have not found one yet... Google has not been as helpful as usual on this.
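    One common reason shader-written alpha appears to do nothing is that blending is simply disabled in the render state, so the hardware ignores the alpha being output. In raw OpenGL terms the fix looks like the sketch below (how that maps onto Doom 3's material/blend stages is a separate question; this is only an illustration, not the game's actual code path):

    ```cpp
    #include <GL/gl.h>

    // Alpha written by the pixel shader only matters if the pipeline blends with it.
    void EnableAlphaBlending() {
        glEnable(GL_BLEND);
        // Final color = src.rgb * src.a + dst.rgb * (1 - src.a)
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    }
    ```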
  6. Hello, I was hoping someone could help me with a shader I am trying to write... I have just started using vertex/pixel shaders, so I am not too familiar with them yet.

     What I am trying to do is create a shader that will displace the vertices of a model in the negative direction of each vertex's normal, based on a texture map. This is for a Doom 3 mini-modification I want to write. Essentially, I wish to displace the vertices based on the intensity of red in a blood-stain map. I have created the effect, but it is not based on the texture map right now; it is just based on specular lighting, since I wanted to get the other parts working first.

     I cannot figure out how to reference a texture map in the vertex shader part. If that is not where I should be looking, I also cannot figure out how to change the position of a pixel in the pixel shader. Any help would be appreciated. Tim
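     For reference, the displacement being described is just a move along the negated normal, scaled by the sampled intensity. A CPU-side C++ sketch of that math (the vertex-shader version is the same formula per vertex; the names here are made up):

     ```cpp
     struct Vec3 { float x, y, z; };

     // Push a vertex inward along its normal, scaled by the red intensity
     // sampled from the blood-stain map at that vertex (intensity in [0, 1]).
     Vec3 DisplaceVertex(const Vec3& position, const Vec3& normal,
                         float bloodIntensity, float maxDepth) {
         const float d = bloodIntensity * maxDepth;
         return { position.x - normal.x * d,
                  position.y - normal.y * d,
                  position.z - normal.z * d };
     }
     ```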