If my end goal is learning about trust, what is the best way?

Started by John Adams; 7 comments, last by Steve132 14 years, 1 month ago
I am thinking of designing a game for my community, my end goal being to learn how much each member trusts every other member, and perhaps to obtain a relative trust ordering. I am trying to keep it entertaining but at the same time useful for deducing that information. I was thinking of the following:

- Directly give two random names and ask who the user trusts more. For instance: "Who do you trust more, A or B?"
- Give them a scenario and ask them to specify who would be more appropriate. For instance: "If you were being chased by a lion, who do you think would save you, A or B?"

The first one seems dumb to me and I'm not sure about the second one. Does anyone have suggestions?

I'm trying to look at individual trust measures. The reason I am doing this is that if, tomorrow, a user is faced with a question, he would turn to the most trusted person within his reach and see what that person thinks about it. For this, community trust might not be a very good measure.

[Edited by - John Adams on February 19, 2010 1:10:36 AM]
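For what it's worth, here is a minimal sketch of how pairwise "A or B" answers could be turned into a per-respondent trust ordering by simply counting wins; the data format and names are illustrative assumptions, not anything from the post above.

Code:
# Minimal sketch: count, per respondent, how often each candidate won a
# pairwise "who do you trust more?" comparison, then sort by win count.
# The answer tuples below are made-up example data.
from collections import defaultdict

# (respondent, option_a, option_b, winner) -- winner is one of the two options
answers = [
    ("smith", "bob", "frank", "frank"),
    ("smith", "frank", "sarah", "frank"),
    ("smith", "bob", "sarah", "sarah"),
]

def trust_ordering(answers):
    """Return, for each respondent, the candidates sorted by how often they won."""
    wins = defaultdict(lambda: defaultdict(int))
    seen = defaultdict(set)
    for respondent, a, b, winner in answers:
        seen[respondent].update((a, b))
        wins[respondent][winner] += 1
    return {
        r: sorted(seen[r], key=lambda name: wins[r][name], reverse=True)
        for r in seen
    }

print(trust_ordering(answers))  # {'smith': ['frank', 'sarah', 'bob']}

A plain win count ignores which opponents someone beat; a Bradley-Terry or Elo-style model would account for that, but the counting version is enough to get a first relative ordering.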
Activity: Secret Keeper

Step 1) Ask each participant to write down three embarrassing incidents or secrets that they have never told anyone, something they would not want everyone to know.

Step 2) Ask them to arrange the three incidents by level of secrecy and put each in an envelope labeled 1, 2, or 3.

Step 3) Ask each participant to assign each of their envelopes to a community member (i.e. "Who do you trust to keep this secret?"). A participant cannot assign more than one envelope to the same member.

Step 4) Tally the envelopes received by each member. The member with the highest tally received the most secrets and is the most trusted member in the community.
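A quick sketch of how step 4 might be tallied, assuming each assignment is recorded as (giver, receiver, envelope level); weighting by secrecy level is an extra idea, not part of the activity as described.

Code:
# Tally how many envelopes each member received. With weighted=True, an
# envelope counts for its secrecy level (3 = most secret), which is an
# assumption layered on top of the activity above.
from collections import Counter

assignments = [
    ("alice", "bob", 3),
    ("carol", "bob", 1),
    ("dave", "carol", 2),
]

def tally(assignments, weighted=False):
    scores = Counter()
    for giver, receiver, level in assignments:
        scores[receiver] += level if weighted else 1
    return scores.most_common()

print(tally(assignments))                 # [('bob', 2), ('carol', 1)]
print(tally(assignments, weighted=True))  # [('bob', 4), ('carol', 2)]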


Off Topic:
How come you posted on 2/18/2010 when you have been a member since 2/19/2010?
Ah... I should've been more clear, sorry. I'm trying to look at individual trust measures. The reason I am doing this is that if, tomorrow, a user is faced with a question, he would turn to the most trusted person within his reach and see what that person thinks about it. For this, community trust might not be a very good measure.

I'm creating this as an online game, but I'm sure the design you specified can be implemented online as well. Thanks a lot for your time. If your answer changes in light of the comment above, please let me know your suggestions...

About the off-topic question: I guess it must be the time zones... :) I'll have to ask the mods here the same question.
Quote:
- Directly give two random names and ask who the user trusts more. For instance: "Who do you trust more, A or B?"

This isn't going to work so well. Smith might trust Bob over Frank when money is involved, but trust Frank more with his personal life. Smith's answer to "Who do you trust more?" will be primed entirely by whatever he happens to be thinking about at the moment, and will therefore be essentially arbitrary.

Quote:
- Give them a scenario and ask them to specify who would be more appropriate. For instance: "If you were being chased by a lion, who do you think would save you, A or B?"

This is the one that is going to get you interesting information. But you will want to ask the same question several times in different ways to get the whole story, and you will want to ask questions that you can relate to one another. Don't ask all the questions back to back (i.e. space out related questions so the subjects won't spot the pattern as quickly). This way YOU can prime their minds with scenarios that give YOU a good idea of what value they are weighing when they decide to trust someone.

Consider:
"loan my car"
"give a ride to"
"ask to loan me their car"
"ask for a ride"

Add in more questions that cover items of value, like:
"buy lunch"
"ask to cover my lunch"
And patterns will likely emerge: people who take the risky side of one exchange, like "loan my car", tend to also trust the reciprocal risk in other areas, like "ask to cover my lunch".

You should be able to make up questions that test other values, like "fun", "fear", "strength", "lust", and "morals". Some people will have obvious preferences like "I implicitly trust Bob"; others will have obvious skews like "Bob is good with money, but Frank is better with secrets". Overall, you should be able to rank people in the group as a whole, since some will fall in the same skew for several friends.
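One rough way to space out related questions, as suggested above, is to interleave the dimensions round-robin so that two questions from the same topic never run back to back; the dimension names and questions below are placeholders.

Code:
# Round-robin through the dimensions so consecutive questions come from
# different topics. Dimensions and wording are placeholder examples.
from itertools import zip_longest

dimensions = {
    "money":   ["Who would you loan your car to?", "Whose car would you ask to borrow?"],
    "safety":  ["Who would save you from a lion?", "Who would you ask for a ride?"],
    "secrets": ["Who would you trust with a secret?", "Who keeps gossip to themselves?"],
}

def interleave(dimensions):
    """Take one question from each dimension per round, skipping exhausted ones."""
    rounds = zip_longest(*dimensions.values())
    return [q for rnd in rounds for q in rnd if q is not None]

for question in interleave(dimensions):
    print(question)

Shuffling the order of dimensions between rounds would hide the pattern even better, but the plain interleave already keeps same-topic questions apart when the pools are similar in size.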
I see your point. That's a really great suggestion. So what you are suggesting is that I split the questions into various dimensions and then ask the user a question related to each dimension. Perhaps from there I should worry about establishing these trust metrics. I am very new to the area of gaming, but are there any datasets out there that I can start building upon? I'm sure this must be a problem often encountered in RPG-style games, unless I've misunderstood the whole thing...
Take a look at the Dante's Inferno Purity Test, for instance.
It follows the questioning approach I was trying to present.
There are subsets of the questions for things like "sex/lust", "morality", "belief in god/demons/scripture", "money", "food", etc.
Each subset is spaced out throughout the questioning.
Each subset contains "the same question" posed in several ways, including positive and negative framing.
In the end, it tells you what level of hell you go to. But as you can see, each level just corresponds to some subset of the question topics (including a ranking of how consistent your answers were within any particular subset of correlated questions).

By doing something similar (where "yes or no" is replaced with "Frank or Sarah"), you could pair off people enough to categorize, for each person, what "Bob" thinks they are most trustworthy with. Then, with a yes/no survey like the purity test I linked, you could try to determine the relative importance of each topic to "Bob". From that you'd rank the people he trusts based on his ranking of what he trusts them with. But you also get all the other cross-correlated information, like "Sally is good with money" because 3 people who knew "Sally" chose her over everyone else on the money questions, independent of those people's rankings of the importance of money.
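A sketch of that aggregation under a few assumptions: each answer is recorded as (respondent, topic, winner), and each respondent also supplies importance weights per topic (for example from the yes/no survey). The data layout and names are illustrative, not anything specified in the thread.

Code:
# Two views of the same answers: a personal ranking weighted by how much
# the respondent cares about each topic, and a community-level view of
# who gets picked most per topic. All data below is made up.
from collections import defaultdict

answers = [
    ("bob", "money", "sally"),
    ("bob", "money", "sally"),
    ("bob", "secrets", "frank"),
    ("ann", "money", "sally"),
]

importance = {"bob": {"money": 0.7, "secrets": 0.3}}

def personal_ranking(respondent, answers, importance):
    """Rank the people this respondent chose, weighted by topic importance."""
    score = defaultdict(float)
    for r, topic, winner in answers:
        if r == respondent:
            score[winner] += importance[respondent].get(topic, 0.0)
    return sorted(score, key=score.get, reverse=True)

def topic_specialists(answers):
    """Who was chosen most often for each topic, across all respondents."""
    counts = defaultdict(lambda: defaultdict(int))
    for _, topic, winner in answers:
        counts[topic][winner] += 1
    return {topic: max(c, key=c.get) for topic, c in counts.items()}

print(personal_ranking("bob", answers, importance))  # ['sally', 'frank']
print(topic_specialists(answers))                    # {'money': 'sally', 'secrets': 'frank'}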

Make note that sometimes directly asking a question isn't the right path; it skews a person's thoughts toward a specific answer. Asking broadly related questions can get you more info, especially when the connotation of the answer is more in the grey:

"Are you fat?"
"are you gluttonous?"
"do you lust after women?
vs
"are you in shape?"
"do you indulge on good food?"
"have you been to a strip club?"

The second set is more likely to get a good mix of yeses and noes. The first set is likely to get a lot of noes, since the connotation of saying "yes" is bad. But you can infer with some certainty that the answers to the second set represent the same answers you would have gotten from the first set if people had answered it truthfully. And don't get elaborate, like "a huge 3-course turkey dinner, complete with a sugary glazed ham, sits before you. Do you eat?". Any extraneous details you add will also skew your results.
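A tiny sketch of that inference step, assuming each neutrally framed question maps to one bluntly framed question plus a flag saying whether a "yes" should be flipped; the mapping itself is made up for illustration.

Code:
# Translate answers to neutral questions into inferred answers to the
# blunt ones. The flip flag marks questions where "yes" on the neutral
# wording implies "no" on the blunt wording.
INDIRECT = {
    "are you in shape?":              ("are you fat?", True),    # flip the answer
    "do you indulge on good food?":   ("are you gluttonous?", False),
    "have you been to a strip club?": ("do you lust after women?", False),
}

def infer(indirect_answers):
    inferred = {}
    for question, answer in indirect_answers.items():
        direct, flip = INDIRECT[question]
        inferred[direct] = (not answer) if flip else answer
    return inferred

print(infer({"are you in shape?": True, "do you indulge on good food?": True}))
# {'are you fat?': False, 'are you gluttonous?': True}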

Quote:
I am very new to the area of gaming, but are there any datasets out there that I can start building upon?

Sorry can't help you there.
Quote:
I'm sure this must be a problem often encountered in RPG-style games, unless I've misunderstood the whole thing...

Unless I'm getting the wrong idea about what you are talking about, I've never seen this in any RPG I've played.

Quote:
The reason I am doing this is that if, tomorrow, a user is faced with a question, he would turn to the most trusted person within his reach and see what that person thinks about it.

And again, I'd remind you: given the choice, a normal person will go to the person they trust most to have a respectable answer. I wouldn't ask "Bob" about girlfriend troubles if he is bad with women; but he's really good at finances, so I would ask him for investment advice.

[Edited by - KulSeran on February 19, 2010 3:02:34 AM]
I think there is a general part of trust that need not be divided. It comes into play when you associate trust with a person independently of their knowledge or competence. It is a quality of character, the warm and fuzzy part of trust.

You feel that someone is trustworthy when your well-being is a priority in their considerations. To build trust is to commit to the well-being of others, and to articulate those considerations.

For example, when you say you trust your auto dealer, it means that you believe that he has your best interest in mind.

You could trust a person who does not have the knowledge you need. The reason is that when that person has your best interest in mind, he will not pretend to know something he doesn't. You can trust that he will not risk your well-being to feed his ego.

"Trust no one" means "do not believe that anyone would act in your best interest."
I have just one suggestion. No matter what, make sure your application doesn't state or imply that it is measuring trust. You must create an application with situations from which it can *infer* how trustworthy a person is. If people understand what you're trying to measure, all you'll do is reward the biggest liars.
http://en.wikipedia.org/wiki/Stable_marriage_problem

This seems like it may be relevant to you...
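For anyone curious, here is a compact Gale-Shapley sketch of the stable marriage algorithm linked above; mapping it onto trust (treating each side's preference list as a trust ordering) is only one possible reading, not something the link prescribes.

Code:
# Gale-Shapley stable matching. Preference lists go most-preferred first.
def gale_shapley(proposer_prefs, acceptor_prefs):
    rank = {a: {p: i for i, p in enumerate(prefs)}
            for a, prefs in acceptor_prefs.items()}
    free = list(proposer_prefs)          # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}
    engaged = {}                         # acceptor -> proposer

    while free:
        p = free.pop()
        a = proposer_prefs[p][next_choice[p]]   # best acceptor not yet tried
        next_choice[p] += 1
        if a not in engaged:
            engaged[a] = p
        elif rank[a][p] < rank[a][engaged[a]]:  # acceptor prefers the newcomer
            free.append(engaged[a])
            engaged[a] = p
        else:
            free.append(p)
    return {p: a for a, p in engaged.items()}

proposers = {"ann": ["bob", "carl"], "eve": ["bob", "carl"]}
acceptors = {"bob": ["eve", "ann"], "carl": ["ann", "eve"]}
print(gale_shapley(proposers, acceptors))  # {'eve': 'bob', 'ann': 'carl'}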

