I just finished up work on a school project with my classmate Sami, solving the traveling salesman problem (TSP) using neural networks (NNs). And not just any NN, but a chaotic NN! What? Yes. And it’s amazing.
The Code
The code was written in Python. Go try it out! It’s not hard to run, and the README should help you get started.
The Short Story
Hopfield neural networks get souped up for optimization by inducing transient chaotic dynamics. That’s basically all there is (see the long story for more details).
The Long Story
We followed the approach described by Chen and Aihara [1]. They start with a Hopfield NN (HNN), which is fitting, since it was Hopfield and Tank [2] who first applied NNs to combinatorial optimization problems, using the HNN of course. So, we start with an HNN. What’s missing? Well, the HNN is typically used because it avoids chaotic dynamics (assuming the weights are symmetric and the self-connection weights are 0).
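To make that convergence property concrete, here’s a minimal sketch of an HNN in NumPy (illustrative only, not our project code): with symmetric weights and a zero diagonal, each asynchronous update can only lower, or keep, the network energy.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10
W = rng.standard_normal((n, n))
W = (W + W.T) / 2.0          # symmetric weights
np.fill_diagonal(W, 0.0)     # zero self-connection weights

def energy(x):
    # Standard Hopfield energy: E = -1/2 * x^T W x
    return -0.5 * x @ W @ x

x = rng.choice([-1.0, 1.0], size=n)  # random bipolar start state
e_start = energy(x)

for _ in range(200):
    i = rng.integers(n)              # pick one neuron at random
    # Asynchronous sign update: align x_i with its local field
    x[i] = 1.0 if W[i] @ x >= 0.0 else -1.0

e_end = energy(x)
# e_end <= e_start: energy never increases, so the net slides
# into a stable local minimum instead of behaving chaotically.
```

Each flip aligns one neuron with its local field, which can only decrease the energy when the diagonal is zero; that is the whole reason chaos never appears in the vanilla HNN.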
In fact, an HNN is guaranteed to converge to a stable local minimum. Well, that’s a good thing, actually, but how will this help us produce chaotic dynamics? Chen and Aihara simply add some self-feedback (a non-zero self-connection weight). Now we get strange attractors, and hence rich chaotic dynamics. Now we’ve got a mechanism to get our NN to explore the state space (exploratory) instead of going straight for the nearest local minimum (greedy). That’s what we wanted, but of course this throws the guaranteed convergence right out the window. Hmmm…
Not to worry, the simple trick is to decay the self-feedback amplitude, just like the cooling in simulated annealing. The authors call this chaotic simulated annealing (CSA): the cooling schedule is there, but the exploration comes from deterministic chaos rather than stochastic noise. So as the chaos-inducing term decays, the convergent behavior begins to dominate and the system will magically converge to a local minimum (which could also be the global minimum).
The Long Long Story
If you really want it, here it is, in a brief report that Sami and I authored. Or, you can watch our presentation on the same topic.