Algorithm to reduce string via rules?
I forgot to mention that you need a rule-preprocessing step that turns every rule into the form "AB -> C". Read about Chomsky normal form to learn about it.
OK, here it is:
#include <string>
#include <vector>
#include <iostream>

typedef unsigned long mask;  // one bit per nonterminal 'A'..'Z'

struct Rule {
    mask i1, i2, o;
    Rule(char i1, char i2, char o)
        : i1(1ul << (i1 - 'A')),
          i2(1ul << (i2 - 'A')),
          o(1ul << (o - 'A')) {
    }
};

std::vector<Rule> rules = {
    Rule{'A', 'A', 'A'},
    Rule{'A', 'A', 'C'},
    Rule{'C', 'A', 'B'},
    Rule{'A', 'B', 'B'}
};

void parse(const std::string& s) {
    int n = s.length();
    // conversions[i][j] is the set of symbols that can derive the
    // substring s[i..j). A vector of vectors replaces the original
    // variable-length array, which is not standard C++.
    std::vector<std::vector<mask>> conversions(n + 1, std::vector<mask>(n + 1, 0ul));
    // Base case: each single character derives itself.
    for (int i = 0; i < n; ++i)
        conversions[i][i + 1] = 1ul << (s[i] - 'A');
    // Combine adjacent substrings, shortest lengths first.
    for (int l = 2; l <= n; ++l) {
        for (int i = 0; i <= n - l; ++i) {
            int j = i + l;
            for (int k = i + 1; k < j; ++k) {
                for (const Rule& rule : rules) {
                    if ((rule.i1 & conversions[i][k])
                        && (rule.i2 & conversions[k][j]))
                        conversions[i][j] |= rule.o;
                }
            }
        }
    }
    // Print every symbol the whole string can be reduced to.
    for (mask m = conversions[0][n], i = 0; m; i++, m >>= 1) {
        if (m & 1ul)
            std::cout << char('A' + i) << ' ';
    }
    std::cout << '\n';
}

int main() {
    parse("AAAB");  // with the rules above, this prints "B"
}
And now I have some babies to feed. ;)
EDIT: I found the name of the algorithm: CYK.