

Member Since 25 Oct 2008
Offline Last Active Aug 07 2014 10:08 PM

#5137261 Quantifying insanity.

Posted by on 07 March 2014 - 10:18 PM

I think we're sort of talking past each other at this point. In any case, I love survival games and psychology so I look forward to playing whatever you come up with. Best of luck!

#5137259 Quantifying insanity.

Posted by on 07 March 2014 - 09:30 PM

The issue here is that I can't account for how deep in character the player is.


So, this is not strictly true. In fact, there are entire genres built around accounting for exactly that. To be sure, game design based on player immersion is much more artistic and subjective than game design based on systems, but it is definitely not out of reach.


It relies on player immersion to be any more than that.


Player immersion is not something the player chooses to engage in. The player can set the conditions for it, but you as the developer have to actually create it, if that's the route you want to take. And keep in mind, it may not be the best option: immersion and entertainment are orthogonal attributes of game design. Particularly in a sandbox or procedural game, it could be incredibly difficult to construct an immersive experience, and even if you did, it might not be as entertaining as a more rule-driven experience.

#4974160 Monster thinking in an action rpg

Posted by on 28 August 2012 - 10:28 AM

You're smarter already than the best NNs and Genetics out there

not waste time in irrelevant methods like neural networks and genetic programming.

I don't think I agree with these sentiments, but the advice is good. Neural networks and genetic algorithms are among the most promising subjects in AI research. They are the only techniques with a good likelihood of producing real emergent behavior or machine learning. However, most implementations *are* unpredictable, training intensive, and unlikely to provide a whole lot of benefit in a typical rpg.
I would say if you want to look at something more advanced than finite state machines or scripts, the most fertile ground would be production systems. These are kind of (not really) like a large collection of interrelated if-then-else statements. The idea is to build up a collection of productions (sometimes called rules) that represent all of the 'mental' considerations of the AI.
In my opinion, this type of representation is much more authentic than a real-time plastic method like a GA or NN. Consider, during the course of a battle, it is pretty unlikely that a monster will be learning and incorporating a whole lot. It seems a bit more realistic to say that a monster or enemy has a lot of knowledge and experience (in the form of productions) that it brings into the battle, but that set of knowledge doesn't necessarily change a whole lot.
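To make the production-system idea above concrete, here is a minimal sketch. All of the names and monster attributes (hp, enemy_distance, the rule labels) are illustrative assumptions, not from any particular engine; real production systems (e.g. Rete-based rule engines) do far more matching work, but this captures the shape of the idea: a fixed body of 'knowledge' the monster brings into battle.

```python
class Production:
    """One 'rule': a condition paired with an action, plus a priority
    so that more specific knowledge can override general instincts."""
    def __init__(self, name, condition, action, priority=0):
        self.name = name
        self.condition = condition  # callable: state dict -> bool
        self.action = action        # label of the behavior to run
        self.priority = priority

def select_action(state, productions):
    """Fire the highest-priority production whose condition matches."""
    matched = [p for p in productions if p.condition(state)]
    if not matched:
        return "idle"
    return max(matched, key=lambda p: p.priority).action

# The monster's accumulated 'experience', encoded as productions.
rules = [
    Production("flee when hurt", lambda s: s["hp"] < 20, "flee", priority=10),
    Production("attack in range", lambda s: s["enemy_distance"] < 2, "attack", priority=5),
    Production("close distance", lambda s: s["enemy_distance"] >= 2, "approach", priority=1),
]

print(select_action({"hp": 50, "enemy_distance": 5}, rules))  # approach
print(select_action({"hp": 10, "enemy_distance": 1}, rules))  # flee
```

Note that the rule set itself stays fixed during the battle, which is exactly the point: the monster's knowledge doesn't change much mid-fight, only which rules fire.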

#4973023 Self-storing class-instances on creation

Posted by on 24 August 2012 - 10:07 AM

One thing to keep in mind is that everything doesn't have to be an object. If some collection of functions and data is related, then it's often useful to group them together, but you should really consider whether it makes sense to create one. If clients of the gui management code will likely only use one or two methods (register, unregister?) then you don't have to wrap that functionality into an object to be instantiated, especially if doing so would require you to provide static functions anyway.
On the other hand, if you really do think there are enough operations and data to justify defining an object, and it does in fact make sense to instantiate one, then by avoiding singletons you actually gain a lot of flexibility (one lightweight gui manager handles the hud, another for the pause screen menus).
I think if you embrace the fact that a single program can incorporate multiple paradigms, you should see the need for singletons decline.
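A quick sketch of the non-singleton approach described above: two independent, lightweight GUI managers instead of one global instance. The class and method names (GuiManager, register, unregister) are made up for illustration, not from any particular framework.

```python
class GuiManager:
    """A plain, instantiable manager; no global state, no singleton."""
    def __init__(self, name):
        self.name = name
        self.widgets = []

    def register(self, widget):
        self.widgets.append(widget)

    def unregister(self, widget):
        self.widgets.remove(widget)

# One lightweight manager for the HUD, another for the pause menu.
hud = GuiManager("hud")
pause_menu = GuiManager("pause")
hud.register("health_bar")
pause_menu.register("resume_button")
```

Because nothing is global, each screen can own its manager and throw it away when done, which is the flexibility the singleton version gives up.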

#4972886 Monster thinking in an action rpg

Posted by on 23 August 2012 - 10:53 PM

I think all the advice you've received so far is great, but just to give an alternative idea. What if you approached the problem not as a perfectly rational actor weighing the costs and benefits of perfectly measured variables, but instead as a monster trying to decide what the hell to do with this crazy hero. Putting myself in the place of said monster, I can think of a couple of likely cognitive strategies:
a) I am an instinctual sort of creature, like an alligator or maybe an ogre, and in the clutch I tend to rely upon particular strategies. These strategies will most likely be widely applicable and crudely effective (having gotten me this far in life) but due to their generality may be exploitable. So as an alligator if I smell a hero crossing my swamp I likely try and take a bite out of him. As an Ogre, I put my club to use. There is probably not going to be much decision-making involved for an instinctual sort of creature.
b) I am a sneaky, crafty, or cunning sort of monster like a goblin shaman or a street urchin. I am not very intelligent, but I may come up with some unexpected way to approach a confrontation. In my view, as the cunning monster I'm not so much comparing a bunch of options and deciding on the cleverest one, rather I'm just more likely to try something unorthodox rather than always rely on a standard approach. So, again I don't need complex decision making, what I really need is some mechanism for simulating creativity (a simple solution might be a lot of prescripted behaviors, that are randomly selected from so as to appear emergent). The key to my cleverness as a goblin shaman isn't that I pick the best possible move, rather, that I tend to try unexpected things, potentially gaining an advantage.
c) I am a normal, competent, human-equivalent intelligent being. This one is pretty complex, because there is such a range of cognitive behavior in humans, let alone fantasy races. However, I would say some good approximations are certainly possible. For one thing, depending on how your system works and whether it can handle this, you might consider the fact that the most important (and often the only) decision that a typical soldier makes in a brief conflict is whether to fight, and a lot of the time only one side makes even that decision. So, an orc hunter might put some effort into sizing up his opponent and deciding whether he feels lucky, but once he's charged in, he probably isn't spending a lot of time deciding who to swing his axe at. I would say, there are a few decisions (whether to fight, whether to run, maybe others?) that could benefit from a weighted statistical model *or* a fuzzy approach, but honestly for these kinds of monsters the choice of which action to take should probably be really simple.
d) I am a highly intelligent being, such as a wizard, a battlefield commander, an elder dragon, or whatever. For this category, I am somewhat divided. Traditionally, games tend to assign the least flexible, and least intelligent cognitive simulations (almost always a simple, scripted pattern) to ostensibly the most intelligent type of enemy. I understand why, as games have to maintain a certain level of fun, and often have to follow certain conventions to do so, but I still dislike it. If I am a seasoned, veteran troop commander, I am not entering a battle without a plan that stands a high chance of success (unless I'm in a desert badum-bum-tsh!). So, for these kinds of monsters, I could see employing a fairly elaborate cognitive model, perhaps even a perfectly rational algorithmic model. But, if you plan on keeping with RPG tradition, then actually you don't even need that for these guys, just some state machines and scripts will do it.

I hope this helps. I'm not disagreeing with anything else said, just offering my take on how certain monsters could think in battle. I probably over-simplified the human-types, because there is really a whole lot that you could do there. Good luck.
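Points a) and b) above can be sketched with one mechanism: every monster draws from a pool of prescripted behaviors, and only the weighting differs. An instinctual monster's weights are concentrated on one reliable move, while a 'cunning' monster spreads weight over unorthodox options so it merely *tends* toward surprise. All behavior names and weights here are hypothetical.

```python
import random

def choose_behavior(weighted_behaviors, rng=random):
    """Pick one behavior at random, biased by its weight."""
    behaviors, weights = zip(*weighted_behaviors.items())
    return rng.choices(behaviors, weights=weights, k=1)[0]

# Instinctual: almost always the same crudely effective move.
alligator = {"bite": 9, "tail_swipe": 1}

# Cunning: no dominant option, so the 'clever' picks look emergent.
goblin_shaman = {"fireball": 3, "smoke_cloud": 3,
                 "collapse_ceiling": 2, "feign_retreat": 2}

action = choose_behavior(goblin_shaman)
```

The alligator is predictable (and therefore exploitable, as noted above), while the shaman's flat distribution simulates creativity without any real decision-making.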

#4958991 Can you make a big game by starting small?

Posted by on 13 July 2012 - 09:30 PM

It seems like the consensus is that, yes, it certainly can be done, though of course it'll have its challenges.
As a counterpoint though, you could just as easily pick out, say one or two core features, build up a prototype with those features, and then slowly add content that way. So, you don't have to chunk development spatially (in terms of levels), you can break it down in whatever way will be most convenient given your development plan (e.g. if you expect it to take only a few weeks to build the game world, but many months to build the graphics engine, then you might split up rendering components instead).

#4958989 Ways of forcing players to play together without risk of getting griefed?

Posted by on 13 July 2012 - 09:10 PM

Cause everyone wants to beat the game, right?

Nope. In fact, most of the time if a person is causing trouble for their teammates, doing stupid things, or intentionally not contributing it's exactly because they simply don't care very much what the outcome is.

I think it might not be possible?
And only thing they can do to make it less destructive when you get griefed is to make the game as casual as possible... no penalties for losing.. just small almost meaningless rewards for winning.. just some fun gameplay you can get into and be done with in a few mins.

I think this is exactly the wrong idea. Players will mess around more the less consequence and interest the game maintains. A fairly slow paced game like WoW is not going to keep a player's attention all the time, so you'll inevitably have people playing games inside the game. The highly repetitive nature of WoW doesn't help. I would guess that it applies less so to Left 4 Dead, but it's much the same thing. L4D was awesome, scary, and intense the first (and perhaps second) play through each level, but by the time you've played each level 5 or 6 times, they get a bit mundane (personally, I think this is because of the linearity). Cue the griefers.
My suggestion would be, if you're really worried about people not taking your game seriously, eliminate unnecessary repetition, and produce more exciting content. When a player is having a great time, they're much less likely to try and ruin someone else's.

#4958093 Weekly Discussion on RPG Genre's flaws [Week 3 : Attrition]

Posted by on 11 July 2012 - 10:48 AM

n00b0dy, your approaches are not a dichotomy. In fact, probably all elements of game design (as all elements of life) are a continuum. For the sake of argument, let's say approaches A and B are both bad as you claim; well, what about all of the possibilities in between? That is, after all, why we call it game balancing, because it's a matter of finding the most entertaining balance between unpleasant extremes.

Also, WoW is not particularly relevant to balancing a turn based retro rpg. I know we've already introduced plenty of outside examples to support the discussion, but there are limits to how far you can take an example. It might be useful to look at why the designers made the pacing decisions they did, but it would probably be fruitless to try and port them into this model.

However, I actually think your example of a petrify (or potentially death spell, or other, really negative things) effect which would increment over time, forcing the player to avoid engaging the Medusa until he was ready to fully commit, that might be fun. It's maybe not a good basis for the basic attrition system in a game, but in a dungeon or two I think it has a lot of promise.

#4957435 Fathoming the Unfathomable

Posted by on 09 July 2012 - 04:26 PM

I'm going to focus on Limbo, because you brought it up, and because it's such a fantastic game. I think what the designers really did perfectly was to get out of the way of their game. The intentionally minimal graphics let your imagination run wild (seriously, tell me you didn't get a grossed out chill-down-the-spine the first time the little head worm got you) and your imagination is going to be a better renderer than a computer for the foreseeable future. This is why books have not been supplanted by television, nor comics by cartoons.

Now, I'm not suggesting that you do away with all graphics, but I think if you look for places where you know technology won't match your vision (or your player's imagination) then just don't try. Use suggestion of detail, make things indistinct (within reason), rely on fewer, but more significant, colors, etc.

Also, you keep referring to ambiance, which is good, but keep in mind that in real life we don't notice every tiny thing (and you wouldn't be able to capture it anyway), really it's about the big, important things being good, and a few key details being right. So, make the grass that perfect, grassy green. Or, instead of a musical score, give me a realistic crunch of gravel. Basically, use the technology to aid the player's imagination rather than trying to override it.
To me, that is really what made Limbo successful. I didn't get immersed in the game because I was emotionally involved, I was able to get emotionally involved because I was immersed.

There was some robotics theorist who talked about the "unreal valley" (or something) where humans would reject robots as they got more human-like, because we would sense subtle differences between them and ourselves.

The 'uncanny valley' is actually a dip in the perceived realism of a construct as it approaches human likeness. Perceived realism decreases as we notice and magnify tiny imperfections and flaws in the construct, but, as modern CG shows, it increases again on the other side. A good artist with modern tools can make it to the other side of the valley. So it wouldn't be impossible to make a realistic and immersive game, just impractical with current technology and less than millions of dollars.

#4957405 Programming Techniques

Posted by on 09 July 2012 - 02:36 PM

That's a little extreme. You shouldn't break a function up into many smaller functions unless you need to. You seem to be advocating doing this for the sake of doing it.

Sorry if it wasn't clear, my advice was given in the spirit of "if you aren't sure how to acquire the taste for function decomposition, then..." and not as a general practice. I believe there are few, if any, situations where a long, complex process with multiple levels of detail should be all thrown into a single function, but it takes some forced practice to determine when a function will benefit from a split.
So, to be clear, I am not saying that all functions must be a particular length, or that they should be split arbitrarily if no obvious distinction arises. What I am saying is that obvious distinctions are often present, and there are advantages to capitalizing on them.

Edit: Also, I don't believe "you need to" do anything with regards to styling in code; rather, some styles promote code reuse, some promote readability, some promote maintainability, etc. In my experience, function decomposition is a very beneficial style, and to employ it only when I am compelled would be to forego its benefits altogether.
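As an illustration of the kind of "obvious distinction" I mean, here is a made-up 'load level' task, sketched both ways. The task and all names are hypothetical; the point isn't length, but that the process naturally has levels of detail worth naming.

```python
# Before: one function mixing parsing, validation, and construction.
def load_level_monolithic(text):
    rows = []
    for line in text.strip().splitlines():
        cells = line.split(",")
        if any(c not in ("0", "1") for c in cells):
            raise ValueError("bad tile")
        rows.append([int(c) for c in cells])
    return {"grid": rows, "width": len(rows[0]), "height": len(rows)}

# After: each obvious distinction gets its own named step.
def parse_rows(text):
    return [line.split(",") for line in text.strip().splitlines()]

def validate_rows(rows):
    for row in rows:
        if any(c not in ("0", "1") for c in row):
            raise ValueError("bad tile")

def build_level(rows):
    grid = [[int(c) for c in row] for row in rows]
    return {"grid": grid, "width": len(grid[0]), "height": len(grid)}

def load_level(text):
    rows = parse_rows(text)
    validate_rows(rows)
    return build_level(rows)
```

The decomposed version reads as an outline of the process, and each step (parsing, validating, building) can now be reused or tested on its own.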

#4955630 My Dream Game

Posted by on 04 July 2012 - 08:32 AM

I'm going to suggest that you develop the text-based game to completion. Most likely, nobody outside of your circle will play it, but if you really put effort into it, it will be an incredibly cool in-group project. Not only that, you will have at least a chance of success. The kinds of complexity you are imagining are not easily realized (not impossible either) in a graphical environment. Combine that with the facts that you have effectively zero game development experience (I'm not trying to be negative, but there is no other way to put it) and that you have made an assumption about the game development industry that probably won't hold up (that some company out there will pick up your prototype and make a game out of it), and you basically have a recipe for failure.
Now, if you are very interested in game development and just want to learn, there's no reason you can't start teaching yourself about graphics (using say, C# and XNA, or C++ and OGRE), and game physics, and animation, and content pipelines, and for that matter, artificial intelligence (if you can easily write and solve an equation to determine some behavior, it's unlikely the AI will be very interesting or emergent). And while you are learning, you could try to make some progress on the game idea.
Honestly though, if what you know is basic java and your goal is to play this game, not to make it, or make money from it, stick with what you know (you might use a basic GUI to organize the info on a civ, instead of pure CLI). It won't be pretty, but you are already imagining the epic battles and sprawling civs in your head anyway, what's wrong with that? Anyway, good luck whichever route you choose. :D

#4955387 Weekly Discussion on RPG Genre's flaws [The "Fight" Command]

Posted by on 03 July 2012 - 12:28 PM

Like I said however, we're terribly off topic.
On that note, next week's topic will definitely be related to grinding as I believe we've reached a natural bridge here.

Ok, so, what I was trying to get at is that, I don't believe there is any simple catch-all method of reducing the reliance on the fight button without completely devaluing it, all other things being equal. My proposed solution would be to revise the number and quality of encounters, so that players never feel the need to settle into a rut; however, you make a good point about grinding, and I can definitely imagine many core rpg gamers being upset if this feature were completely removed. I think your ideas for training and/or optional areas would mitigate that, but possibly at the cost of throwing off the balance we've just worked to achieve.

Can you provide an example of a situation a player would chose not the maximum dps way ???
1) A player would only replace the 1 button max dps macro only if a situational effect grants him more damage.

Thus bots are always better than human players, because they can always follow the max damage rotation, automatically following the damage buff "ifs".

Note: a monster forces you to cancel the "max dps strategy" only if it kills you. If it doesn't kill you, it is cheaper (timewise) to finish it as fast as you can and heal to full in 1sec with no resource costs after the battle finishes.

I think here you are making a lot of assumptions about the nature of the game and the nature of the player. Not everyone is going to approach even a traditional rpg in the same way. Nowhere is it implied (and it's often not the case) that mashing fight will provide the maximum damage per second, and in a turn based rpg I find it unlikely that most players even care about dps in terms of strategy. Rather, when playing strategically a player will consider damage per turn and enemy damage per turn; if the player starts thinking in terms of dps, they have probably already abandoned all strategy. On the other hand, I agree that once a player settles into a one-button rut, they will probably not break out of it unless the game forces them to. But I don't think that's because they want to play in such a single-minded manner; I think it's just because the game allows them to.

I dont agree, just because a game is bugged doesn't mean that everyone that uses this system is bugged.
Oblivion was bugged because :
1) unlimited hp monsters : regenerating trolls were unkillable and took hours to kill, and it was only a crappy trash mob, that you met every 10 steps.
2) unlimited damage monsters, 1 shot : Also there were some fiora humanoid monsters that just 1shotted you with their elemental spells.

In my game a lvl 1 monster is as hard as a lv 100000 monster when you fight em at same level. Why is that ? because i don't switch their monster type, a goblin remains the same, just higher level.

I wasn't implying (nor do I believe) that Oblivion was 'bugged', just that the incentive to improve was reduced (not removed entirely) compared to the other Elder Scrolls games. Also, the goblin thing is basically what I was getting at. If level in your game is just an arbitrary or abstract concept that will be manipulated at will to balance difficulty, fine, but if level is a number intended loosely to measure the combat prowess of a creature (as in D&D) then it just seems silly to say that the goblins gain levels to match my own. Certainly, some goblins might have higher and others lower level, and the average level might change as the game progresses, but not implicitly as my characters develop, otherwise there is no benefit to developing my characters (and no, I don't agree that people want bigger numbers just for the sake of bigger numbers, those numbers have to mean something to be interesting at all).

#4954003 Weekly Discussion on RPG Genre's flaws [The "Fight" Command]

Posted by on 29 June 2012 - 11:05 AM

A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away.

To me, bashing the “Fight” command is precisely that. It can be removed without reducing the game depth in it. The only reason it is there is like what others here have said to give players a way out if they run out of potions to use the other options. There are many ways to prevent users running out of potions and therefore get into a point of no return.

I think the issue is bigger than just the problem of always bashing the “Fight” command. The issue with those Retro/Console (J) RPG i played is that All Attack commands(Fight, Magic, skills, summon) or whatever you called it is strategical irrelevant.

In a roleplaying game, there are 3 components:
Resource Management (getting enough hp and mana pots)
Character Power(Levels, Skill power of attacks, Stats)
Tactical decision in fight( Who to choose to fight who, with which attack command)

The problem is that if Character Power is high enough, you can kill anything just by bashing the “Fight” or any attack command for that matter.

The solution is then to
a) keep the balance between Character Power and Tactical decision in fight or
b) Make them incomparables.

I think you are on the right track with your suggestion that battles have to be more balanced to involve players (I think n00b0dy is trying to get at the same thing), but I think the most important lesson from TES: Oblivion was that game difficulty should not scale to the player. This is especially important for retro-rpgs, where a core component of the game is leveling up. In any game, though, it's going to reduce the incentive for a player to improve.
A better way, I think, would be to put more effort into designing meaningful, well balanced encounters (balanced for the intended player level) and then maybe restricting the player from over-leveling. After all, a goblin is only a goblin, it doesn't make sense for him to become super powered just because my character is super powered. But, at the same time, there is no good reason to allow me to run around slaughtering goblins indefinitely.
I see two ways to accomplish this restriction:
1) Offer only a finite number of encounters. This would be my preferred method. It better reflects reality (how many goblins live in this freaking cave?). Having fewer encounters alone has some considerable benefits, mentioned above, but also having a completely limited number of encounters will encourage players to make every one count (you could even scale experience based on how 'well' the player performed), and prevent them from breaking the carefully balanced difficulty. Although, I still believe there should be considerable latitude for players to improve within any area, possibly by making many of the encounters optional (yay it all comes together!).
2) Limit player grinding in some other, artificial way. Time limits, scaling experience sharply downward with increasing player level, etc. This is what rpgs that offer random battles have done historically, to lesser or greater extent. I think the benefit here is that the player's level can be loosely contained, while offering unlimited "play" for those that want it. Really, though, I've spent a lot of time grinding in retro-rpgs in order to max my characters out, but never because it was inherently enjoyable. The enjoyable part was having maxed out characters and getting to fight the few vicious end-game battles with them. So, unlimited "play" for me is kind of a non-argument.
So, obviously I think 1) is the way to go. If retro-rpgs are going to maintain any appeal compared to other games, the entertainment value per minute (or hour) has got to be more consistent.
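Option 2)'s "scaling experience sharply downward" can be sketched in a few lines. The curve shape and all the numbers here are illustrative assumptions, not a recommendation for specific values.

```python
def scaled_experience(base_xp, encounter_level, player_level, falloff=0.5):
    """Full XP at or below the encounter's intended level; beyond it,
    each extra player level multiplies the award by `falloff`
    (halving it, with the default of 0.5)."""
    levels_over = max(0, player_level - encounter_level)
    return int(base_xp * (falloff ** levels_over))

print(scaled_experience(100, 10, 10))  # 100
print(scaled_experience(100, 10, 13))  # 12
```

With a falloff this steep, grinding three levels past the intended difficulty already pays almost nothing, which loosely contains the player's level while still permitting unlimited "play" for those who want it.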

#4953708 Weekly Discussion on RPG Genre's flaws [The "Fight" Command]

Posted by on 28 June 2012 - 11:17 AM

I have a very soft spot for retro-rpgs, but I haven't played one since high school essentially because of this issue. As my gaming time decreased, so did my willingness to sit through "encounters" that were so very important to the game designer that he couldn't even be bothered to design them. At this point, I'm with KylHu. If I can have just the story, that's ok. If I can have the story and an intellectually challenging combat mechanic, that's great, but I will probably not bother with a physically challenging (i.e. one which requires more manual effort than mental effort) combat mechanic just to play a game.
That being said, I have long believed that the first step should be reducing the number of encounters. The problem with long encounters is that (using the default fight mechanism) they get boring or frustrating, and when there's a whole lot of boring, frustrating encounters the game pretty much sucks. So, don't have so many encounters, and now you are free to make them more challenging and more strategically involved. This is exactly why tactical rpgs have always featured fewer encounters than static rpgs: the increased complexity requires more time, and more time requires fewer repetitions.

Some of the old era rpgs had the concept of attrition built into them. Where getting through the dungeon and to boss with enough hp and mana to beat them was part of the challenge. Rather having the player recover after each fight force them to carry the damage, fatigue, and injuries along until they can set up camp or return to town. I remember playing FF1 back in the day and limping back to town after a tough dungeon with 3 characters dead and 1 barely hanging on and then not having enough gil to resurrect my entire party.
I suppose the real question for me is what problem are you trying to solve? Do you want to make each random battle more meaningful and challenging. Or should the challenge be in the journey. I'd prefer to see the challenge be about getting from town through the deadly swamp down into the depths of dark cave to retrieve the crown from its guardians and then making home again.

I have such fond memories of FF1 (3 dead, 1 barely hanging on, 4 dead once you leave the dungeon and accidentally step back in the poison swamp, or even better, 4 level 20s, awesome gear, promoted to Knights and Wizards and whatnot, saviors of the land, 'Oh hey, what's this Greenish dragon do?').