I would just like to know whether, for a given problem (in this case the classic binary XOR problem),
the network will eventually converge to a solution if given infinite time to do so, or whether, due to
learning-rate issues, it can get stuck in overshooting behavior.
If the learning rate poses no problem, can it be said mathematically that the network
must travel toward the solution space?
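To make the question concrete, here is a minimal sketch of the scenario I mean: a tiny sigmoid network trained on XOR with plain gradient descent. All choices here (3 hidden units, MSE loss, the learning rate, the seed) are illustrative assumptions, not a claim about any particular setup; the point is only that the loss tends to decrease for a moderate step size, while it is not obvious this must always hold.

```python
import numpy as np

# Toy 2-3-1 sigmoid network trained on XOR with full-batch gradient
# descent. Architecture, learning rate, and seed are arbitrary choices
# for illustration.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 3))
b1 = np.zeros(3)
W2 = rng.normal(0.0, 1.0, (3, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # moderate step size; a much larger one can oscillate
losses = []
for epoch in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # backward pass: gradients of MSE through the two sigmoid layers
    d_out = 2.0 * (out - y) / len(X) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

With this seed and learning rate the loss drops steadily, but the question stands: is that guaranteed in the limit, or can gradient descent on XOR stall or oscillate indefinitely for some step sizes and initializations?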