Archived

This topic is now archived and is closed to further replies.

Srekel

How to put "weight" on factors.

This topic is 5649 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.


Recommended Posts

Short explanation: to weight terms when adding, you can multiply each term by a weighting factor (so a term multiplied by 0.5 is worth less than one multiplied by 2). Now I need to do this with percentage values and multiplication.

I have a number of values that I want to multiply with each other. I want them to be "weighable", and I want the result to stay in the same range (somewhere between 1 and 200 or so). If I divide them by 100 first, I get roughly the right range (50*50 = 2500, which is wrong, but 0.5*0.5 = 0.25 = 25%, which is what I want). But if I want to put weights on them, I need to use "to the power of the weight"... I hope you understand what I mean.

Is the correct solution to first raise each factor to the power of its weight (or importance), and THEN divide it by 100? So it would be

(50^0.9) / 100 * (50^1.1) / 100 = 0.34 * 0.74 = 0.25

Is this the "standard procedure" for this kind of operation? Is there a more optimal way to write it? Thanks for the help.

"Kaka e gott" - Me
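The proposed formula can be sketched as follows; the function name is made up for illustration.

```python
# A sketch of the approach proposed above: raise each percentage to its
# weight, then divide by 100, then multiply the results together.
def weighted_factor(percent, weight):
    return (percent ** weight) / 100.0

result = weighted_factor(50, 0.9) * weighted_factor(50, 1.1)
print(round(result, 2))  # 0.25, matching the calculation in the post
```

Note that this works out here because the two weights sum to 2 and there are two factors; with other weights the result drifts out of the expected range.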

If the values are percentages, you'd divide by 100 first and then raise the result to a power. If you want the range to be the same as the domain, the sum of the powers should be 1. But I've never seen weighting of this sort used in multiplication... what use are you putting this to? There may be a better way.
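This suggestion can be sketched as follows (illustrative names); normalising the weights forces them to sum to 1, which keeps the result in the same range as the inputs.

```python
# Normalise each percentage to [0, 1] first, then raise it to its
# weight. With the weights normalised to sum to 1, the result stays
# in [0, 1].
def combine(percentages, weights):
    total = sum(weights)
    result = 1.0
    for p, w in zip(percentages, weights):
        result *= (p / 100.0) ** (w / total)  # normalised weight
    return result

print(combine([50, 50], [0.9, 1.1]))  # about 0.5, same range as inputs
```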


How appropriate. You fight like a cow.

[edited by - sneftel on March 30, 2003 6:01:39 PM]

Seems like your percentages should be the exponent, and what you are raising to that exponent should be what the percentages represent. That makes it a geometric mean. So for a roll of a d6 die, the exponent would be 1/6 applied to 1, 2, 3, 4, 5 and 6, which comes out to about 2.994. My statistics suck, but I believe that says that the average product of m rolls, taken n times, should approach 2.994^m as n goes to infinity, as opposed to the sum of m rolls taken n times, which goes to m*3.5 as n goes to infinity.
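The d6 figure above can be checked directly:

```python
import math

# Geometric mean of a d6: (1*2*3*4*5*6) ** (1/6).
faces = [1, 2, 3, 4, 5, 6]
gmean = math.prod(faces) ** (1 / len(faces))  # math.prod needs Python 3.8+
print(round(gmean, 3))  # 2.994
```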

It's for my RPG skill system.

Basically, every problem you try to solve (let's say picking a lock) has a certain success chance, based on a number of variables.

I want every variable to have an importance value. In the lock-picking case, the variables, with some importance values I've come up with, would be:
Chance to pick lock: 1
Tool knowledge: 0.8
Lockpick quality: 1
Intelligence bonus: 0.7
Creativity bonus: 0.6
Lock Difficulty: 1

Now, without the importance values, the chance of success could simply be calculated as each variable divided by a hundred, then all of them multiplied together.

But with importance values, I'm not sure how to do it. My approach seemed to work pretty nicely.


I guess I could also just calculate the average value (add all values and divide by the number of values), but I think it makes more sense the other way, because a Lock Difficulty of 0.1 should actually mean it's ten times more difficult than a lock with Lock Difficulty 1.


Srekel, you did say you wanted to multiply the values, didn't you? Why's that? Couldn't you use a simple weighted (arithmetic) mean, like a centre of mass calculation? I.e., if w_i is the "weight" of the ith property and s_i the current score for that property,

P(success) = sum(w_i * s_i)/sum(w_i)

If each of the s_i lies in the same range, say s_i in [0,100], then P() is also in [0,100]; alternatively you could normalise the s_i to force this. E.g. if s_i lies in [0, s_i_max], then

P(success) = sum(w_i * s_i/s_i_max)/sum(w_i)

gives you a conventional probability, i.e. one lying in [0,1]. Obviously you can multiply this result by 100 to make it a percentage if you prefer.
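The weighted mean above can be sketched like this; the scores and weights are only illustrative.

```python
# Weighted arithmetic mean of normalised scores: lies in [0, 1].
def p_success(scores, weights, s_max=100.0):
    return sum(w * s / s_max for w, s in zip(weights, scores)) / sum(weights)

print(round(p_success([80, 50, 100], [1.0, 1.0, 1.0]), 2))  # 0.77
```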

Probably what you are looking for is something like this:

Given values a1 ... an with 'weights' w1 ... wn, evaluate

a1^w1 * ... * an^wn

This is equivalent to doing a logarithmic transform, taking the weighted average (with the weights normalised to sum to 1), then transforming back with the exponential.
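The equivalence can be verified numerically; the values and weights here are illustrative.

```python
import math

# a1^w1 * ... * an^wn equals exp(w1*ln(a1) + ... + wn*ln(an)),
# which is a weighted average in log space when the weights sum to 1.
a = [0.8, 0.5, 0.9]
w = [0.5, 0.3, 0.2]  # already sums to 1

direct = math.prod(ai ** wi for ai, wi in zip(a, w))
via_log = math.exp(sum(wi * math.log(ai) for ai, wi in zip(a, w)))
print(abs(direct - via_log) < 1e-12)  # True
```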

Yep.

Just raise each value to its weight, add them up, and divide by the number of values. As long as your initial values are between 0 and 1, your final result will be too.
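A sketch of that procedure, with illustrative values and weights:

```python
# Raise each value to its weight, then take the plain average.
# Inputs in [0, 1] raised to positive powers stay in [0, 1], so the
# average does too.
values = [0.8, 0.5, 1.0]
weights = [1.0, 0.8, 0.7]
powered = [v ** w for v, w in zip(values, weights)]
result = sum(powered) / len(powered)
print(0.0 <= result <= 1.0)  # True
```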

The reason I don't want to use

P(success) = the sum of all values multiplied by their respective weights, divided by the number of values

is this:

Say that I have three values, all having the same weight, 1.
Chance to pick lock = 0.8
Lockpick quality = 0.5
Lock Difficulty = 1

The average (in the way described above) would be
(0.8 + 0.5 + 1) / 3 ≈ 0.77
I agree that this would be a good value, and it would certainly work to do it like this.
However, say that I want to make a lock with a Difficulty of 0.1 (this should be ten times harder, basically); the average would then be about 0.47.


Whereas if I had multiplied the values, I would have gotten 0.4 in the first case and 0.04 in the second.

In other words, it becomes easier to say "I want this lock to be twice as difficult to pick as this one". Plus, if there are more values, say 10 for example, then a Lock Difficulty of 1.0 versus one of 0.1 wouldn't affect the chance much at all.
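The comparison above can be checked directly:

```python
import math

# The arithmetic mean barely reacts when the lock becomes ten times
# harder, while the product scales down by the same factor of ten.
easy = [0.8, 0.5, 1.0]  # Lock Difficulty 1.0
hard = [0.8, 0.5, 0.1]  # Lock Difficulty 0.1

print(round(sum(easy) / 3, 2), round(sum(hard) / 3, 2))      # 0.77 0.47
print(round(math.prod(easy), 2), round(math.prod(hard), 2))  # 0.4 0.04
```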

"Kaka e gott" - Me

OK, let's say you want the lock difficulty to count twice as much as the other two combined. So you have (0.8 * 0.5 * 1.0^4)^(1/6) ≈ 0.858; if they roll less than an 86 on a d100, they pick the lock. Now say the difficulty is 0.1: (0.8 * 0.5 * 0.1^4)^(1/6) ≈ 0.185. The important quality of a mean is that it lies between the max and the min. No matter the weights, the lock can be no more difficult than the lowest factor nor easier than the highest. With just a product that isn't true: 0.1 * 0.1 = 0.01, so if you only had difficulty and skill level, weighted equally, then a 10% chance on each would be a 1% chance overall. That will do a good job of pissing your players off.
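The weighted geometric mean described here can be sketched as:

```python
import math

# Weighted geometric mean, with difficulty given weight 4 so it counts
# twice as much as the other two factors combined.
def wgm(values, weights):
    product = math.prod(v ** w for v, w in zip(values, weights))
    return product ** (1 / sum(weights))

easy = wgm([0.8, 0.5, 1.0], [1, 1, 4])
hard = wgm([0.8, 0.5, 0.1], [1, 1, 4])
print(round(easy, 3), round(hard, 3))  # about 0.858 and 0.185
```

Both results stay between the smallest and largest factor, which a plain product does not guarantee.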

Well, I'm not trying to make it a "mean" value.

If you only have a 10% pick lock skill and you try to pick a lock with a difficulty of 0.1, then it SHOULD be very hard to pick. But a player won't have 10% (it'll probably range from 30% to maybe 200%), and most locks probably won't have a difficulty of 0.1. Anyway, that's a balance/game design issue, and this isn't the game design forum.




"Kaka e gott" - Me

