Lode

Poker hand evaluation


Hi, I've found some very interesting code for evaluating poker hands fast (http://www.codingthewheel.com/archives/poker-hand-evaluator-roundup). I tried it (the Two Plus Two Evaluator) and it can do millions of hands per second. That makes it possible to run simulations of poker games and statistically estimate the chances of your poker hand against other players' unknown hands.

BUT... that's not what this post is about. I was wondering: is it possible to calculate the chance of winning with your hand without using simulations, purely symbolically, and faster because you skip the millions of evaluations? This is for Texas Hold'em. Given:

Your hand cards: 2 cards
The cards on the table: 3, 4 or 5 cards
One other player has two unknown cards in his hand, and there are a few unknown cards left on the table (2, 1 or 0).

What is the chance that you win?

To simulate this perfectly, you need to evaluate every possible combination of the other player's two cards and of the not-yet-flipped cards on the table. With a fast evaluator doing millions of hands per second this might finish in reasonable time for some cases, but it is still pretty slow. So could this be done without the simulations, purely based on statistics? Or, alternatively, does an algorithm exist that can give a rough estimate? I didn't find any when searching with Google; I only found those fast evaluators. Thanks!
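
For reference, the kind of simulation I mean looks roughly like this. It's only a minimal sketch: evaluate7() is a placeholder for a real 7-card evaluator such as the Two Plus Two lookup table, and the 0..51 card encoding is just an assumption.

    // Minimal Monte Carlo equity sketch (heads-up, known board of 3..5 cards).
    // evaluate7() is a placeholder for a real 7-card evaluator such as the
    // Two Plus Two table; higher return value = stronger hand.
    #include <algorithm>
    #include <iostream>
    #include <random>
    #include <vector>

    int evaluate7(const std::vector<int>& sevenCards) {
        // Placeholder only - replace with a real evaluator.
        return sevenCards[0];
    }

    double estimateEquity(std::vector<int> hero,   // your 2 hole cards (0..51)
                          std::vector<int> board,  // the 3..5 known board cards
                          int trials = 100000) {
        // Collect the cards we cannot see.
        std::vector<int> deck;
        for (int c = 0; c < 52; ++c)
            if (std::find(hero.begin(), hero.end(), c) == hero.end() &&
                std::find(board.begin(), board.end(), c) == board.end())
                deck.push_back(c);

        std::mt19937 rng{std::random_device{}()};
        double wins = 0.0;
        for (int t = 0; t < trials; ++t) {
            std::shuffle(deck.begin(), deck.end(), rng);
            std::size_t next = 0;
            std::vector<int> villain{deck[next++], deck[next++]};   // opponent's cards
            std::vector<int> fullBoard = board;
            while (fullBoard.size() < 5) fullBoard.push_back(deck[next++]);

            std::vector<int> heroSeven = fullBoard, villSeven = fullBoard;
            heroSeven.insert(heroSeven.end(), hero.begin(), hero.end());
            villSeven.insert(villSeven.end(), villain.begin(), villain.end());

            int h = evaluate7(heroSeven), v = evaluate7(villSeven);
            if (h > v) wins += 1.0;
            else if (h == v) wins += 0.5;   // split pot counts as half a win
        }
        return wins / trials;
    }

    int main() {
        // Hypothetical encoding: 0..51, e.g. rank * 4 + suit.
        std::cout << estimateEquity({0, 1}, {10, 20, 30}) << "\n";
    }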

Sure. First hit on Google.

See other links under "texas hold'em odds".

Of course, poker being an interesting game, the odds probably stay very close until the final card. And most of the time, odds are just that - "given 1 million hands, who wins" - while the actual winners are determined by playing 10 hands optimally.

The chances of dying in a car crash are well known, yet they can't help you prevent an accident.

Hmm, this article seems to be written for a human, not for computer calculation :)

A human sees quite quickly, based on his cards and what's on the table, which combination to calculate the outs for.

But a computer, I think, needs to try this for every possible combination of cards - not just for a pair, but for all possible pairs with all possible kickers, and so on. Then it again requires millions of evaluations...

Unless I'm missing something :)

EDIT: Most poker odds calculators found online also mention that they use millions of simulations. So I think this problem is difficult enough that simulation is indeed the only viable solution.

Quote:
Original post by Lode

EDIT: Most poker odds calculators found online also mention that they use millions of simulations. So I think this problem is difficult enough that simulation is indeed the only viable solution.


Odds don't change.

So the problem becomes a simple lookup table, where the index is the known state. For 1 known card there are 52 distinct probabilities (fewer due to symmetry).

Quote:
Your hand cards: 2 cards
The cards on the table: 3, 4 or 5 cards


This makes 5..7 known cards. There are 52^7 different odds. This makes about 1e12 different values, but due to lots of symmetry this number is probably much smaller. This table can now be calculated in advance, and then used to look up the probability of each individual hand.
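
As a rough illustration of those numbers (just arithmetic, nothing poker-specific): the naive ordered index versus the unordered 7-card combinations that the symmetry collapses it to.

    // Rough arithmetic for the table-size argument above: ordered 7-card
    // sequences (52^7) versus unordered 7-card combinations (52 choose 7).
    #include <cstdint>
    #include <iostream>

    std::uint64_t choose(std::uint64_t n, std::uint64_t k) {
        std::uint64_t result = 1;
        for (std::uint64_t i = 1; i <= k; ++i)
            result = result * (n - k + i) / i;   // stays an exact integer at each step
        return result;
    }

    int main() {
        std::uint64_t ordered = 1;
        for (int i = 0; i < 7; ++i) ordered *= 52;
        std::cout << "52^7        = " << ordered << "\n";        // ~1.03e12
        std::cout << "52 choose 7 = " << choose(52, 7) << "\n";  // ~133.8 million
    }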

Quote:
One other player has two unknown cards in his hand, and there are a few unknown cards left on the table (2, 1 or 0).

The unknowns do not matter for calculation, since until they are revealed they are, well, unknown, and after they are revealed they cannot be changed.

"This makes 5..7 known cards. There are 52^7 different odds. "

This doesn't make sense. Poker hand strengths are based on conditional probabilities, and even the simple case of particular hands is counted in terms of arrangements where order doesn't matter - combinations, not 52^7. The reason Monte Carlo or simulation is used is that the conditional probabilities get too hard to work out flexibly and generally. In terms of utility, nothing else really makes sense.

Quote:
Original post by Antheus
Quote:
One other player has two unknown cards in his hand, and there are a few unknown cards left on the table (2, 1 or 0).

The unknowns do not matter for calculation, since until they are revealed they are, well, unknown, and after they are revealed they cannot be changed.


Maybe I didn't explain everything well enough, but the unknowns do matter here. Because they're really unknown, you have to iterate through all the possible combinations they could be.

The chance you win can be calculated like this at the river:

Iterate over all possible hand combinations of the opponent (these are 990 combinations: the two cards come from the 45 cards that are not among the 7 in your hand and on the table). For each combination, check whether you or the opponent wins. Add 1 if you win. At the end, divide by 990.
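
As a sketch, that river loop looks like this (evaluate7() again just stands in for a real 7-card evaluator such as the 2+2 table, and the 0..51 card encoding is an assumption):

    // Exhaustive river equity against one opponent, exactly as described above:
    // enumerate all C(45,2) = 990 opponent holdings and count wins/ties.
    #include <algorithm>
    #include <iostream>
    #include <vector>

    int evaluate7(const std::vector<int>& sevenCards) {
        return sevenCards[0];   // placeholder only - replace with a real evaluator
    }

    double riverEquity(const std::vector<int>& hero,    // your 2 hole cards (0..51)
                       const std::vector<int>& board) { // the 5 board cards
        std::vector<int> unseen;
        for (int c = 0; c < 52; ++c)
            if (std::find(hero.begin(), hero.end(), c) == hero.end() &&
                std::find(board.begin(), board.end(), c) == board.end())
                unseen.push_back(c);   // the 45 cards the opponent could hold

        std::vector<int> heroSeven = board;
        heroSeven.insert(heroSeven.end(), hero.begin(), hero.end());
        const int heroValue = evaluate7(heroSeven);

        double wins = 0.0;
        int count = 0;
        for (std::size_t i = 0; i < unseen.size(); ++i)
            for (std::size_t j = i + 1; j < unseen.size(); ++j) {
                std::vector<int> villSeven = board;
                villSeven.push_back(unseen[i]);
                villSeven.push_back(unseen[j]);
                const int villValue = evaluate7(villSeven);
                if (heroValue > villValue) wins += 1.0;
                else if (heroValue == villValue) wins += 0.5;   // split pot
                ++count;   // ends at 990
            }
        return wins / count;
    }

    int main() {
        std::cout << riverEquity({0, 1}, {4, 8, 12, 16, 20}) << "\n";
    }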

At the flop, there are 4 unknowns: the opponent's two cards and two table cards. That's C(47,2) turn/river combinations times C(45,2) opponent hands, a bit over a million iterations if I didn't make a mistake, and that still goes very fast.

The code from the "2+2 forum" mentioned in my first post uses a generated table that is 123MB in size to evaluate the best 5-card combination formed by 7 cards.

So I don't even need Monte Carlo for a single opponent, I can just check every possible combination. But if I want to extend this to multiple opponents I'll need to take random samples indeed.

So I got a solution that works fast enough now. I'm still curious if there's an analytical way that doesn't involve brute-force iterating over all possibilities or Monte Carlo integration.

I wrote a bot to play online with real money and it used AI rules, not even statistical analysis. I was going to go that route until I stumbled on this little gem:

You need to calculate what everyone might have, so it's not just your 2 hole cards and 5 on the board.

For example, if your best hand is a straight draw to the king, and the king that completes it is also the heart that completes the other guy's flush, then even when you get the king you still lose.

You can't know who needs what, but you surely need to factor in these "I just lost my house by getting the card I wanted" own goals.

There's probably other stuff going on as well that means you really must simulate for each player as well as for yourself, and then the combinations get truly biblical.

Quote:
Original post by Lode

Maybe I didn't explain everything well enough, but the unknowns do matter here. Because they're really unknown, you have to iterate through all the possible combinations they could be.

The probability of winning depends exclusively on the 7 known cards, as far as player is concerned - unknowns do not affect probability of winning, since the unknowns cannot be fed into the algorithm. This is why the table above is 123MB in size: 52 choose 7 = ~133 million. Unknown cards would matter if deck history had to be taken into consideration.

Individual hands might need to be brute forced though, but that only needs to be done once. Same for multiple players.

Quote:
But if I want to extend this to multiple opponents I'll need to take random samples indeed.

If more than 2 players are in a game - do you see more than 7 cards? If not - then the table will be different, but still only 52 choose 7 elements large. So just precompute for 3, 4, 5, 6, 7 players.

Everything that can be calculated is defined by the number of visible cards. Everything else boils down to a single number and doesn't change once computed.

Technically, you do have some information about the "unknown" hands of the other players - their betting patterns. However, that's well beyond the scope of an odds calculating algorithm.

You may also want to find out how card counting works....

The probability of a given hand is well known and documented. A simple look-up will tell you the relative value of a given hand. Eliminating possible hands based on known cards is also a viable method.

To avoid the brute-force method, try working off the assumption that each player received the best possible hand until that possibility can be eliminated. For example, if they have a diamond and a club then a flush is not possible. If they show 2 cards that are too far apart for a straight, then that can also be eliminated.
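
In Hold'em you never see the opponents' hole cards, but the same elimination idea can at least be applied to the board: for example, nobody can make a flush unless at least three board cards share a suit. A minimal sketch (the 0..51 card encoding with suit = card % 4 is just an assumption):

    // Board-based elimination, in the spirit of the post above: rule out whole
    // hand categories that no opponent can possibly hold given the visible board.
    // A flush needs at least three board cards of one suit (the other two can
    // come from the opponent's hole cards).
    #include <array>
    #include <iostream>
    #include <vector>

    bool flushPossible(const std::vector<int>& board) {   // cards encoded 0..51
        std::array<int, 4> suitCount{};                    // suit = card % 4 (assumed encoding)
        for (int c : board) ++suitCount[c % 4];
        for (int n : suitCount)
            if (n >= 3) return true;
        return false;
    }

    int main() {
        // Three cards of suit 0 on the board -> a flush is live for someone.
        std::cout << flushPossible({0, 4, 8, 13, 22}) << "\n";  // 1
        // Rainbow board -> no flush is possible for anyone.
        std::cout << flushPossible({0, 5, 10, 19, 23}) << "\n"; // 0
    }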

Some hands can be evaluated without needing to do a full simulation. Here's a couple of examples:

1. Hand: 2C 2D Table: 2S 2H 3C 3D 3S

You have four twos - the only card someone could beat you with is the three of hearts. It's simple to work out the chance of someone having that card based on the number of players (see the sketch after these examples).

2. Hand: 3C 3D Table: 2H 2C 2D 2S 3S

Everyone has four 2s, and your kicker (a three) is the lowest one possible, so your probability of winning is zero unless all opponents fold.


Note that both of those cases contain the same set of cards.

However I carefully set up those hands to make the maths easy. In most cases the difficult bit is working out what possible hands your opponents might have.
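
A quick sketch of that calculation for example 1: with n opponents, their 2n hole cards are a uniform sample of the 45 cards you cannot see, so the chance that someone holds that one card is 2n/45.

    // Example 1 above: you hold quad twos and only the three of hearts beats you.
    #include <iostream>

    int main() {
        const int unseen = 45;                      // 52 - your 2 cards - the 5 on the board
        for (int opponents = 1; opponents <= 9; ++opponents) {
            double p = 2.0 * opponents / unseen;    // P(the 3 of hearts is in some opponent's hand)
            std::cout << opponents << " opponent(s): " << p * 100.0 << "%\n";
        }
    }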

I think you can also get away with fewer entries in your table than a naive per-ordering count suggests - most orderings of the 7 cards are equivalent, since in poker all you care about is which two cards are in your hand and which five are on the table.

Edit: Thinking more about it you need (7 choose 2) * (52 choose 7) entries in your table, which is getting rather big - 2.8 billion. That allows you to consider all combinations of which of the seven cards are in your hand vs on the table. Unfortunately that makes a lookup table a bit less practical.

[Edited by - Adam_42 on April 26, 2010 1:17:33 PM]

There seems to be a focus here on looking at hands and seeing whether they're winning, rather than what seems to me the more natural approach: looking at each possible type of winning hand, and then working out the chance that your opponent has an example of that hand.

So, for example, if you have three of a kind, your chance of winning is the chance that they don't have a straight flush or four of a kind or a full house or a flush or a straight or a better three of a kind. You don't care if they have a pair, or which flush they have if they have one.

As the hands get higher value, they're easier to evaluate: if there's a pair on the table, the chance of someone having four of a kind is the chance of two of their unrevealed cards being the two specific cards they need. So first check that those cards haven't been revealed elsewhere, and then it's the number of combinations of cards which include those two cards divided by the total number of possible card combinations.

If you're talking about post-flop, instead of post-river, you first look at your chances of making each hand, and then look at the conditional probabilities that your opponent makes each hand.
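
A minimal sketch of that four-of-a-kind case after the river: the opponent holds 2 of the 45 cards you cannot see, so the chance that those are exactly the two specific cards they need is 1 / C(45,2) (the choose() helper is just illustrative).

    #include <cstdint>
    #include <iostream>

    std::uint64_t choose(std::uint64_t n, std::uint64_t k) {
        std::uint64_t result = 1;
        for (std::uint64_t i = 1; i <= k; ++i)
            result = result * (n - k + i) / i;
        return result;
    }

    int main() {
        double p = 1.0 / choose(45, 2);   // 1/990
        std::cout << "P(opponent holds both needed cards) = " << p * 100.0 << "%\n"; // ~0.10%
    }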

Quote:
Original post by Antheus
Quote:
Original post by Lode

Maybe I didn't explain everything well enough, but the unknowns do matter here. Because, they're really unknown, which means, iterating through all possible combinations they can be.

The probability of winning depends exclusively on the 7 known cards, as far as player is concerned - unknowns do not affect probability of winning, since the unknowns cannot be fed into the algorithm. This is why the table above is 123MB in size: 52 choose 7 = ~133 million. Unknown cards would matter if deck history had to be taken into consideration.

Individual hands might need to be brute forced though, but that only needs to be done once. Same for multiple players.

Quote:
But if I want to extend this to multiple opponents I'll need to take random samples indeed.

If more than 2 players are in a game - do you see more than 7 cards? If not - then the table will be different, but still only 52 choose 7 elements large. So just precompute for 3, 4, 5, 6, 7 players.

Everything that can be calculated is defined by the number of visible cards. Everything else boils down to a single number and doesn't change once computed.


Oh, I see now - you just gave me a new insight about using these tables for other things than the combination evaluation.

It would indeed be a table of the same size. But pre-computing this table would take a LONG time, since for each element a huge number of combinations needs to be evaluated.

So I'm still looking for a totally different kind of algorithm that would take an approach of analyzing the possible combinations and seeing how many outs you / the opponent might have for certain combinations.

Quote:
Original post by Seol
There seems to be a focus here on looking at hands and seeing whether they're winning, rather than what seems to me the more natural approach: looking at each possible type of winning hand, and then working out the chance that your opponent has an example of that hand.

So, for example, if you have three of a kind, your chance of winning is the chance that they don't have a straight flush or four of a kind or a full house or a flush or a straight or a better three of a kind. You don't care if they have a pair, or which flush they have if they have one.

As the hands get higher value, they're easier to evaluate: if there's a pair on the table, the chance of someone having four of a kind is the chance of two of their unrevealed cards being the two specific cards they need. So first check that those cards haven't been revealed elsewhere, and then it's the number of combinations of cards which include those two cards divided by the total number of possible card combinations.

If you're talking about post-flop, instead of post-river, you first look at your chances of making each hand, and then look at the conditional probabilities that your opponent makes each hand.


That's indeed how a human does it. And sort of what I was trying to search for in the first place (I mean, searching for an *algorithmic* way to describe this).

A human can do this intuitively. A human can also intuitively recognize whether an object is a piece of fruit. But programming a computer to do that??

It's just that I think it's harder than it looks. For example, after the flop you need to evaluate whether or not the 3 cards that are there might already be part of a possible straight for the opponent, or for you, or for multiple opponents, etc.
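
For instance, just deciding whether the flop can already be part of someone's made straight is a small algorithm of its own. A rough sketch (ranks 2..14, with the ace counted both high and low, is an assumed encoding):

    // On the flop, an opponent can already hold a made straight only if the three
    // flop ranks are distinct and fit inside one window of five consecutive ranks.
    #include <iostream>
    #include <set>
    #include <vector>

    bool straightPossibleOnFlop(const std::vector<int>& flopRanks) { // ranks 2..14, ace = 14
        std::set<int> ranks(flopRanks.begin(), flopRanks.end());
        if (ranks.size() < 3) return false;       // a paired flop blocks a made straight
        if (ranks.count(14)) ranks.insert(1);      // the ace also plays low (A-2-3-4-5)
        for (int low = 1; low <= 10; ++low) {      // windows 1..5 up to 10..14
            int inWindow = 0;
            for (int r : ranks)
                if (r >= low && r <= low + 4) ++inWindow;
            if (inWindow >= 3) return true;        // two hole cards can fill the rest
        }
        return false;
    }

    int main() {
        std::cout << straightPossibleOnFlop({5, 6, 9}) << "\n";   // 1: 5-6-7-8-9 is live
        std::cout << straightPossibleOnFlop({2, 7, 12}) << "\n";  // 0: no 5-rank window fits all three
    }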

I'm wondering if it has been done before this way.

At first sight it seems simulations (brute force or with random samples) are usually used by programs (online odds calculators and such).

Quote:
Original post by Lode
Quote:
Original post by Antheus
Quote:
One other player has two unknown cards in his hand, and there are a few unknown cards left on the table (2, 1 or 0).

The unknowns do not matter for calculation, since until they are revealed they are, well, unknown, and after they are revealed they cannot be changed.


Maybe I didn't explain everything well enough, but the unknowns do matter here. Because they're really unknown, you have to iterate through all the possible combinations they could be.

The chance you win can be calculated like this at the river:

Iterate over all possible hand combinations of the opponent (these are 990 combinations: the two cards come from the 45 cards that are not among the 7 in your hand and on the table). For each combination, check whether you or the opponent wins. Add 1 if you win. At the end, divide by 990.

At the flop, there are 4 unknowns: the opponent's two cards and two table cards. That's C(47,2) turn/river combinations times C(45,2) opponent hands, a bit over a million iterations if I didn't make a mistake, and that still goes very fast.

The code from the "2+2 forum" mentioned in my first post uses a generated table that is 123MB in size to evaluate the best 5-card combination formed by 7 cards.

So I don't even need Monte Carlo for a single opponent, I can just check every possible combination. But if I want to extend this to multiple opponents I'll need to take random samples indeed.

So I got a solution that works fast enough now. I'm still curious if there's an analytical way that doesn't involve brute-force iterating over all possibilities or Monte Carlo integration.


Yep, Monte Carlo type simulation is your best bet for multiple opponents. You can work it out by hand, but it explodes the more people you have, and it's never really accurate due to incomplete information. Every time a card is dealt you have a new set of [conditional] probabilities to use; the past doesn't matter. For example, heads-up, after you and your opponent have been dealt your hole cards and the flop has been dealt, there are 7 cards out: your two, your opponent's two, and the three on the board. Of course you only know 5 of them.

Now you need to work out the probability of winning. There are, what, 10 mutually exclusive ways to win. Let's assume you have a pair, so you can calculate your chance of winning by P(Win) = P(PairWin | KnownCards) + ... + P(RFlush | KnownCards), where KnownCards are the cards that are known to you. Now each probability is itself complex to calculate and is given as a joint probability.

For example, the probability of winning with your pair is the joint probability that you do not end up with any of the 8 higher win combinations, that your opponent doesn't either, and that they hold a lower pair or hand than you: P(PairWin | KnownCards) = (1 - P(HigherHandWin | KnownCards)) * (1 - P(OppHigherHandWin | KnownCards)). The problem is that this probability can't be properly worked out due to lack of information.

For example, suppose you have two 10s (spades and clubs) and there is an ace of spades, a queen of hearts and a jack of spades on the table. Then the probability of winning with a royal flush is the probability that your opponent wasn't dealt the queen of spades or the king of spades and that both of those cards come in the next two rounds. Treating the 47 unseen cards as equally likely to appear on the turn and river, that works out to 1/C(47,2) = 1/1081, or roughly 0.09%.

You need to update the probabilities every time a card is dealt. Suppose the queen of spades comes on the turn: your royal flush probability jumps, because now only the king of spades is needed on the river - 1/46, or about 2.2% - and you also know the queen wasn't dealt to your opponent. Any other turn card and your royal flush chance drops to zero. The opponent's royal flush chance (in hearts, around the queen of hearts on the board) is the chance that they hold, say, the ten and jack of hearts while the king and ace of hearts come on the next two cards. The problem is that, due to lack of information, we effectively treat these as joint probabilities with very low outcomes, when we really need them conditional on everything that has already been dealt - but we can't, since we don't have that information. Hence the need to brute force or use Monte Carlo type methods. You can also build rule-based systems, use game theory, build a learning statistical model (a Bayesian network with belief updating), or some mishmash of the above :)
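
As a quick arithmetic check of that example (treating the 47 cards you can't see as equally likely to land on the turn and river):

    #include <iostream>

    int main() {
        // Pre-turn: both the king and queen of spades must arrive on the next two cards.
        double preTurn = 1.0 / (47.0 * 46.0 / 2.0);   // 1 / C(47,2) = 1/1081
        // After the queen of spades falls on the turn: only the king is still needed.
        double postTurn = 1.0 / 46.0;
        std::cout << "before the turn: " << preTurn * 100.0 << "%\n";   // ~0.09%
        std::cout << "after the turn:  " << postTurn * 100.0 << "%\n";  // ~2.2%
    }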

Quote:
Original post by Lode
It would indeed be a table of the same size. But pre-computing this table would take a LONG time, since for each element a huge number of combinations needs to be evaluated.


If you are dealt 2 cards, the opponent is dealt 2 cards, and the other 5 are on the table, then there are 45*44 permutations of the opponent's unknown cards for each input (2 for you, 5 on the table, 2 more for each additional opponent; the cards left in the deck at this point do not matter) - about 2000 possibilities. For each of these, assuming perfect play, a winner can be determined fairly fast.

So 133 million * 2000 evaluations. This is doable, especially after eliminating symmetric permutations, doing some pruning and perhaps dynamic programming.

The problem above becomes the following:
- Given (P1, P2, T1, T2, T3, T4, T5, O1, O2), determine who wins
P1,P2 - player's cards
T1..T5 - cards on table
O1,O2 - opponent's cards
Let's call this function DetermineWinner.

This probably involves assembling highest hand from P+T and testing it against highest hand from T+O.

Now let's make an additional function:
- HighestDeck(A,B, C,D,E,F,G)
Returns the highest hand that can be assembled from A,B and a combination of C-G. A,B can be sorted, and so can C-G.

Enter dynamic programming:
Function DetermineWinner is now written as:

if (HighestDeck(P1,P2, T1,T2,T3,T4,T5) > HighestDeck(O1,O2, T1,T2,T3,T4,T5)) {
    // the player wins
} else {
    // the opponent wins
}


And it becomes an O(1) operation. And HighestDeck can be precalculated.

Quote:
So I'm still looking for a totally different kind of algorithm that would take an approach of analyzing the possible combinations and seeing how many outs you / the opponent might have for certain combinations.

I believe the consensus is that the problem is too complicated for an analytical solution.


Edit:
After reading the thread originally linked, apparently HighestDeck can be evaluated efficiently (millions of tests per second).

To precalculate the table, this means 133e6 * 1980 / ~20 million per second = ~13000 seconds = ~4 hours. Throw in a multi-core machine, along with the previously mentioned sorting and dynamic programming, and the computation becomes completely manageable.

Without pre-computation, given the 7 known cards (your 2 plus the 5 on the table), only 1980 tests need to be performed to simulate all possible games, which means each hand can be computed in a fraction of a second.

This makes it viable to test for 3 or more players as well. For 3 players, 4 unknown cards must be enumerated, or approximately 3.6 million permutations. Anything above that gets tricky.

[Edited by - Antheus on April 26, 2010 11:34:22 AM]

