With 128 possible tiles, each entry in your lookup table fits in a single byte, so the table is only 16 MB rather than 64 MB. That doesn't seem like a bad trade-off for O(1) lookup times.
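As a rough sketch of that flat-table idea (in Python for illustration): I'm assuming here that a pattern is built from 24 surrounding cells, since 2**24 one-byte entries is exactly 16 MB; adjust NUM_CELLS to however many cells your patterns actually examine.

```python
# Assumption: each pattern is determined by 24 boolean cells,
# so the table has 2**24 entries at one byte each = 16 MB.
NUM_CELLS = 24

# One byte per entry; tile numbers 0..127 fit comfortably.
table = bytearray(1 << NUM_CELLS)

def pack(cells):
    """Pack a sequence of NUM_CELLS booleans into a table index."""
    index = 0
    for bit, filled in enumerate(cells):
        if filled:
            index |= 1 << bit
    return index

def lookup(cells):
    """O(1) lookup: one pack and one array read."""
    return table[pack(cells)]
```

You'd fill the table once at startup (or bake it into a data file) and then every lookup is a single bit-pack plus one array read.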
That said, if the lookup table approach uses too much RAM, then we should be able to get O(log n) lookup with some clever nesting of the "if" statements. The idea is to build a kind of binary search: at each step you examine a particular cell and narrow down the potential patterns until only a single pattern remains. Which cells get examined will differ depending on which branches execute, but a given cell should never be examined more than once, and cells that don't matter to the final pattern may never be examined at all. The result is a monster set of nested if statements like so:
if (ter[1,1]) {
    if (ter[1,2]) {
        if (ter[2,1]) {
            return 1;
        }
        else {
            ...
        }
    }
    else {
        return 2;
    }
}
else {
    if (ter[2,1]) {
        return 3;
    }
    else {
        if (ter[3,1]) {
            ...
        }
        else {
            ...
        }
    }
}
The trick to making this work, of course, is figuring out the optimal order for the checks. I would not want to do that by hand -- I'd suggest you write a tool to do it for you. The tool should examine the possible patterns, figure out which cell position gives the most even split of the possibilities, and output an "if" statement. Then it recurses for the case where the cell is filled and the case where it's empty, using the smaller set of possible patterns each time. Note that if the chosen cell doesn't matter to a particular pattern, that pattern stays as a possibility on both sides of the branch. When the set of possibilities is narrowed to a single pattern, output a "return" statement for that tile number.
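The generator tool described above might look something like this Python sketch. The data representation is my assumption: each pattern is a (tile_number, constraints) pair, where constraints maps a cell position to the value it must have, and cells missing from the dict don't matter to that pattern (so they land on both sides of the branch, as noted above).

```python
def best_split_cell(patterns, used):
    """Pick the unexamined cell that splits the remaining patterns most evenly."""
    candidates = {cell for _, con in patterns for cell in con} - used
    if not candidates:
        raise ValueError("ambiguous patterns: no remaining cell distinguishes them")

    def imbalance(cell):
        # A pattern that doesn't constrain this cell counts on both sides.
        filled = sum(1 for _, con in patterns if con.get(cell, True))
        empty = sum(1 for _, con in patterns if not con.get(cell, False))
        return abs(filled - empty)

    return min(candidates, key=imbalance)

def emit(patterns, used=frozenset(), indent=0):
    """Recursively print the nested if/else tree for the given patterns."""
    pad = "    " * indent
    if len(patterns) == 1:
        print(f"{pad}return {patterns[0][0]};")
        return
    cell = best_split_cell(patterns, used)
    row, col = cell
    # Patterns that don't constrain this cell go down both branches.
    filled = [(t, con) for t, con in patterns if con.get(cell, True)]
    empty = [(t, con) for t, con in patterns if not con.get(cell, False)]
    print(f"{pad}if (ter[{row},{col}]) {{")
    emit(filled, used | {cell}, indent + 1)
    print(f"{pad}}}")
    print(f"{pad}else {{")
    emit(empty, used | {cell}, indent + 1)
    print(f"{pad}}}")
```

Calling emit with your full pattern set prints the whole nested-if function body; the `used` set enforces the rule that no cell is examined twice along any branch.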