Up first is one of my favorite projects I've done so far at school. It was for my machine learning class, and it involved using a neural net for mouse gesture recognition, a la Black & White. It turned out really well, with around an 80-90% recognition rate once the network was fully trained. Now, I should point out that this was a group project, and one of my teammates actually did most of the work (not because the rest of us couldn't or wouldn't; he just had most of it done beforehand), so credit to him and the rest of my team. Before getting into some of the details, here's a screenshot:
You draw on the grid on the left, and if the NN recognizes the pattern, it plays the corresponding animation for Tiny.
If you're interested in some of the technical details, we used a simple multilayer network. I think we ended up with one hidden layer of 6 neurons, and it worked well enough.
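To make the topology concrete, here's a rough Python sketch of the forward pass for a network like that (all names and the bias-as-last-weight convention are mine, not from our actual code):

```python
import math

def sigmoid(x):
    # Standard logistic activation.
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weights):
    """One fully connected layer. weights[j][i] connects input i to
    neuron j; the last entry of each row is the neuron's bias."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs + [1.0])))
            for row in weights]

def forward(inputs, hidden_weights, output_weights):
    # e.g. 24 inputs -> 6 hidden neurons -> one output per gesture
    hidden = layer_forward(inputs, hidden_weights)
    return layer_forward(hidden, output_weights)
```

With all weights at zero, every neuron sits at sigmoid(0) = 0.5, which is a handy sanity check before training.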
The input was gathered simply by sampling mouse points every few fractions of a second and building vectors from the collected points. The input to the NN was a series of 12 normalized vectors representing the gesture itself. The NN used a sigmoid activation function and backpropagation to deal with error. Basically, after an epoch, we calculate the mean squared error and update the weights between the neurons as follows:
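In case the vectorization step isn't clear, here's a rough sketch of the idea in Python (the function name, the index-based resampling, and flattening each vector to x/y pairs are my assumptions, not necessarily what we actually did):

```python
import math

def gesture_to_input(points, n_vectors=12):
    """Turn a sampled mouse stroke (list of (x, y) tuples) into
    n_vectors unit-length direction vectors, flattened into a
    2 * n_vectors list suitable as NN input."""
    # Resample the stroke down to n_vectors + 1 roughly evenly
    # spaced points (index-based; arc-length resampling would be
    # more robust for uneven mouse speed).
    step = (len(points) - 1) / n_vectors
    resampled = [points[round(i * step)] for i in range(n_vectors + 1)]

    inputs = []
    for (x0, y0), (x1, y1) in zip(resampled, resampled[1:]):
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy) or 1.0  # guard against zero-length
        inputs.extend([dx / length, dy / length])
    return inputs
```

Normalizing each vector to unit length means the NN sees only the shape of the gesture, not how big or fast you drew it.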
deltaW(ji) = n * error(j) * output(i)

where n is the learning rate and error(j) is:

output(j) * (1 - output(j)) * (expected(j) - output(j))

if j is an output neuron, and:

output(j) * (1 - output(j)) * sum(over k)(error(k) * w(kj))

if j is a neuron in the hidden layer (note the sum uses the weight w(kj) from j to each output neuron k, not the weight change).
It looks a little confusing typed out rather than with the proper symbols, but hopefully you get the idea. Anyway, we trained the network on 4 gestures until we hit an MSE of less than 0.0003.
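The training loop itself is just "keep running epochs until the MSE drops below the threshold." A rough sketch, assuming the network object exposes `forward` and `backprop` methods (those names and the interface are my invention, not our actual code):

```python
def train(network, training_set, learning_rate=0.3, target_mse=0.0003):
    """Run epochs over (inputs, expected) pairs until the mean
    squared error falls below target_mse. Returns the final MSE."""
    mse = float("inf")
    while mse > target_mse:
        total_sq_error = 0.0
        count = 0
        for inputs, expected in training_set:
            outputs = network.forward(inputs)
            network.backprop(expected, learning_rate)
            total_sq_error += sum((e - o) ** 2
                                  for e, o in zip(expected, outputs))
            count += len(expected)
        mse = total_sq_error / count
    return mse
```

In practice you'd also want a cap on the number of epochs so a network that plateaus doesn't loop forever.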
That's the gist of it. I was very happy with how it turned out, and I'm now racking my brain for game ideas that could use mouse gestures as a control mechanism [grin]. I may post the demo on my website in a few days if there's any interest.