Items and AI

trill41

Items

I was thinking about weapons and equipment, which are all items, so sooner or later some item system must be implemented. All items are stored in a database table (let's name it game_items) with a name, a value, the type of the item (weapon, armor, crap...), a 3D model file (which is loaded by the client) and an optional Lua script file, because e.g. weapons are scriptable.

In OOP terms the game_items table holds the list of item classes. Now we just need another table with the objects, because when an item drops for a player and the player picks it up, the player doesn't want this item to disappear, so it must be stored in the database. Let's name this table concrete_items.

This table contains which item it is (i.e. the primary key of the item in the game_items table), where it is stored, which player it belongs to, item stats (like damage for weapons), and some other information like creation time, value etc.
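
Roughly sketched, and with made-up names (this is not the actual schema, just an illustration of the split described above), rows of these two tables could map to something like this on the server side:

#include <cstdint>
#include <string>

// One row of game_items: an item *class*.
struct GameItem
{
    uint32_t id;            // primary key
    std::string name;
    uint32_t value;
    uint32_t type;          // weapon, armor, crap...
    std::string modelFile;  // 3D model, loaded by the client
    std::string scriptFile; // optional Lua script, e.g. for weapons
};

// One row of concrete_items: an item *instance* owned by somebody.
struct ConcreteItem
{
    uint32_t id;            // primary key
    uint32_t itemId;        // references game_items.id
    uint32_t storagePlace;  // e.g. Scene, Inventory
    uint32_t playerId;      // the player it belongs to, if any
    std::string stats;      // item stats, e.g. damage for weapons
    int64_t creationTime;
    uint32_t value;
};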

Drops

When NPCs die, they may or may not drop a random item. Each item has a certain chance to drop on a certain map. This requires another database table which associates an item with a map and a chance to drop on this map.
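
A row of that drop table is tiny; again, the names here are just placeholders for illustration:

struct ItemDropChance
{
    uint32_t itemId;    // references game_items.id
    uint32_t mapId;     // the map this chance applies to
    float chance;       // chance that this item drops on this map
};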

Selecting a random item from a pool of items that may drop on the current map isn't really hard; there are algorithms that do that, like the Walker method, which is O(1), or my adaptation of it.
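
For illustration, a much simpler O(n) alternative to the Walker method just walks the cumulative chances. A minimal sketch, assuming dropChances holds the per-item chances for the current map (this is not my Walker adaptation, only the naive version):

#include <numeric>
#include <random>
#include <vector>

size_t PickDropIndex(const std::vector<float>& dropChances, std::mt19937& rng)
{
    // The sum of all chances defines the range we roll in
    const float total = std::accumulate(dropChances.begin(), dropChances.end(), 0.0f);
    std::uniform_real_distribution<float> dist(0.0f, total);
    float roll = dist(rng);
    for (size_t i = 0; i < dropChances.size(); ++i)
    {
        if (roll < dropChances[i])
            return i;
        roll -= dropChances[i];
    }
    return dropChances.size() - 1;  // guard against floating point rounding
}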

Items drop for certain players; other players cannot pick up an item dropped for a different player. When an NPC dies and drops an item, an item is picked from the item pool and a new concrete item with random stats is created and inserted into the database. A random player is selected from the party that killed the NPC, and the new item drops for this player.
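
Put together, the drop flow could look roughly like this. Only a sketch building on the snippets above; ConcreteItem, PickDropIndex, StoragePlace and the RollRandomStats/InsertConcreteItem helpers are hypothetical names, not the actual server code:

enum class StoragePlace : uint32_t { Scene, Inventory };

void OnNpcDied(const std::vector<uint32_t>& itemPool,
               const std::vector<float>& dropChances,
               const std::vector<uint32_t>& party,
               std::mt19937& rng)
{
    const size_t index = PickDropIndex(dropChances, rng);

    ConcreteItem item{};
    item.itemId = itemPool[index];
    item.storagePlace = static_cast<uint32_t>(StoragePlace::Scene);  // it lies in the scene for now
    item.playerId = party[rng() % party.size()];                     // drops for a random party member
    item.stats = RollRandomStats(item.itemId, rng);                  // hypothetical random stats helper
    InsertConcreteItem(item);                                        // hypothetical database insert
}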

If a player picks up an item, it is added to the player's inventory. Technically, just the storage place of the concrete item is changed from Scene to Inventory.
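
In code that is little more than flipping a field on the concrete item and persisting it; a sketch reusing the hypothetical ConcreteItem and StoragePlace from above:

void PickUpItem(ConcreteItem& item, uint32_t playerId)
{
    item.storagePlace = static_cast<uint32_t>(StoragePlace::Inventory);
    item.playerId = playerId;
    // update the row in concrete_items (not shown)
}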

(Screenshot: an item dropped in the scene)

AI

First I thought I'd make a PvP-only game, because then I wouldn't need to mess around with game AI, but I realized even PvP games need some NPCs that have some kind of AI. Then I came across a behavior tree library (SimpleAI) that seemed to be a nice fit. It was really easy to integrate this library into the game server.

ai::Zone

A Game instance has a Map object with the terrain, navigation mesh, the octree and now an ai::Zone object.
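
Wiring the zone in is not much work: every NPC's AI is registered with the zone, and the zone is ticked from the game update loop. A minimal sketch; the ai::Zone::addAI() and ai::Zone::update() calls reflect my understanding of the SimpleAI API, and the Game/Npc members here are assumptions:

class Game
{
public:
    Game() : aiZone_("game") {}

    void AddNpc(const std::shared_ptr<Npc>& npc)
    {
        npcs_.push_back(npc);
        aiZone_.addAI(npc->GetAi());    // register the NPC's ai::AI with the zone
    }

    void Update(int64_t deltaMillis)
    {
        // ... game logic, octree updates etc. ...
        aiZone_.update(deltaMillis);    // ticks all behavior trees in this zone
    }

private:
    ai::Zone aiZone_;
    std::vector<std::shared_ptr<Npc>> npcs_;
};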

ai::ICharacter and ai::AI

The class hierarchy of game objects looks a bit like this.

(Class diagram of the game object hierarchy)

SimpleAI controls the NPC via the ai::ICharacter class. The NPC class could also just inherit from the ai::ICharacter class, but I try to avoid multiple inheritance where I can, so the NPC class has an ai::ICharacter member.

Then the NPC has an ai::AI object which has the behavior and does all the "intelligent" stuff.
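
Roughly, the composition looks like this. Only a sketch: the AiCharacter and GetNpc() names come from the condition code further down, while the constructor signatures and the InitAi() wiring are my assumptions about how the pieces fit together:

class AiCharacter : public ai::ICharacter
{
public:
    AiCharacter(Npc& npc, ai::CharacterId id) :
        ai::ICharacter(id),
        npc_(npc)
    { }
    Npc& GetNpc() const { return npc_; }
private:
    Npc& npc_;
};

class Npc : public Actor
{
public:
    // behaviorRoot would come from the Lua-defined behavior tree (see below)
    void InitAi(const ai::TreeNodePtr& behaviorRoot)
    {
        character_ = std::make_shared<AiCharacter>(*this, static_cast<ai::CharacterId>(id_));
        ai_ = std::make_shared<ai::AI>(behaviorRoot);
        ai_->setCharacter(character_);   // SimpleAI drives the NPC through this character
    }
    const ai::AIPtr& GetAi() const { return ai_; }
private:
    std::shared_ptr<AiCharacter> character_;
    ai::AIPtr ai_;
};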

Behaviors

Behaviors are defined with simple Lua scripts, e.g.:

-- Try to stay alive
function stayAlive(parentnode)
  -- Executes all the connected children in the order they were added (no matter what
  -- the TreeNodeStatus of the previous child was).
  local parallel = parentnode:addNode("Parallel", "stayalive")
  parallel:setCondition("IsSelfHealthLow")
    parallel:addNode("HealSelf", "healself")
  -- TODO: Flee
end

-- Heal an ally
function healAlly(parentnode)
  local parallel = parentnode:addNode("Parallel", "healally")
  parallel:setCondition("And(IsAllyHealthLow,Filter(SelectLowHealth))")
    parallel:addNode("HealOther", "healother")
end

-- Do nothing
function idle(parentnode)
  -- This node tries to execute all the attached children until one succeeds. This composite only
  -- fails if all children failed, too.
  local prio = parentnode:addNode("PrioritySelector", "idle")
    prio:addNode("Idle{1000}", "idle1000")
end

function initPriest()
  local name = "PRIEST"
  local rootNode = AI.createTree(name):createRoot("PrioritySelector", name)
  stayAlive(rootNode)
  healAlly(rootNode)
  -- ...
  idle(rootNode)
end

So now we have a behavior with the name "PRIEST" and all NPCs with this behavior try to stay alive, heal an ally or do nothing.

Conditions, Filters and Actions

Of course, IsSelfHealthLow is not part of SimpleAI (although it already comes with a set of Conditions, Filters and Actions). The IsSelfHealthLow condition just checks whether the NPC's health points are below some threshold:

class IsSelfHealthLow : public ai::ICondition
{
public:
    CONDITION_CLASS(IsSelfHealthLow)
    CONDITION_FACTORY(IsSelfHealthLow)

    bool evaluate(const ai::AIPtr& entity) override
    {
        const ai::Zone* zone = entity->getZone();
        if (zone == nullptr)
            return false;

        const AiCharacter& chr = entity->getCharacterCast<AiCharacter>();
        const auto& npc = chr.GetNpc();
        if (npc.IsDead())
            // Too late
            return false;
        return npc.resourceComp_.GetHealthRatio() < LOW_HP_THRESHOLD;
    }
};

If IsSelfHealthLow::evaluate() now returns true, the HealSelf action is executed, which is again a C++ class. It tries to find a skill that does some self healing and uses it.
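
A hedged sketch of what that action might look like. The ai::ITask base with a doAction() override mirrors the condition above; the TASK_CLASS macro and the skill helpers (GetHealSelfSkill, UseSkill) are assumptions, not SimpleAI's or the server's actual names:

class HealSelf : public ai::ITask
{
public:
    TASK_CLASS(HealSelf)    // assumed analogue of CONDITION_CLASS above

    ai::TreeNodeStatus doAction(const ai::AIPtr& entity, int64_t) override
    {
        auto& npc = entity->getCharacterCast<AiCharacter>().GetNpc();
        if (npc.IsDead())
            return ai::FAILED;

        auto skill = npc.GetHealSelfSkill();     // hypothetical: find a skill that heals self
        if (!skill)
            return ai::FAILED;
        npc.UseSkill(skill, npc.id_);            // hypothetical: use it on ourselves
        return ai::FINISHED;
    }
};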

For just staying alive no Filters are needed, but for a Priest that may also heal others, Filters are needed to select those allies with low health points. Such a Filter class could look like this:

void SelectLowHealth::filter(const ai::AIPtr& entity)
{
    ai::FilteredEntities& entities = getFilteredEntities(entity);
    Game::Npc& chr = getNpc(entity);
    std::map<uint32_t, std::pair<float, float>> sorting;

    chr.VisitAlliesInRange(Game::Ranges::Aggro, [&](const Game::Actor* o)
    {
        if (o->resourceComp_.GetHealthRatio() < LOW_HP_THRESHOLD)
        {
            entities.push_back(o->id_);
            sorting[o->id_] = std::make_pair(o->resourceComp_.GetHealthRatio(), o->GetDistance(&chr));
        }
    });
    std::sort(entities.begin(), entities.end(), [&sorting](uint32_t i, uint32_t j)
    {
        const std::pair<float, float>& p1 = sorting[i];
        const std::pair<float, float>& p2 = sorting[j];
        if (fabs(p1.first - p2.first) < 0.05)
            // If same HP (max 5% difference) use shorter distance
            return p1.second < p2.second;
        return p1.first < p2.first;
    });
}

Now the HealOther class can get the filtered entities (just a std::vector<uint32_t> containing the IDs of the allies with low health points, sorted by priority) and use some heal skill on that target.
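
The core of that action could be as small as this; another sketch in the same style as HealSelf above, where the getFilteredEntities() call on the AI object and the skill helpers are assumptions:

ai::TreeNodeStatus HealOther::doAction(const ai::AIPtr& entity, int64_t)
{
    const ai::FilteredEntities& selection = entity->getFilteredEntities();
    if (selection.empty())
        return ai::FAILED;

    auto& npc = entity->getCharacterCast<AiCharacter>().GetNpc();
    auto skill = npc.GetHealOtherSkill();        // hypothetical: find a skill that heals others
    if (!skill)
        return ai::FAILED;
    npc.UseSkill(skill, selection.front());      // heal the ally with the highest priority
    return ai::FINISHED;
}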

Conclusion

SimpleAI is a great library: it is easy to integrate, easy to use and easy to configure, and as far as I can see now, it just works.

 
