Watching fractals emerge at the edge of chaos
Chaos is particularly hard to explain, of course. It's easier to describe than to define; one of those you-know-it-when-you-see-it types of things. That's an odd thing to find in mathematics, the one place where we expect everything to be perfectly logical, clear and well-defined. But then, it's called chaos for a reason.
Chaos in this context is something that happens to dynamical systems, and these were the main focus of this paper. A dynamical system is any set of possible events that can happen, together with a 'rule' that dictates how those events develop over time. That covers a vast range of possible scenarios, from a simple pendulum swinging back and forth to the behaviour of a complex weather system. A deterministic dynamical system is one where we should, in theory, be able to predict what happens next based on what came before. But sometimes what should be doable in theory becomes virtually impossible in practice, and that's probably as close as we can get to a definition of chaos. The butterfly effect is a well-known example, where a tiny change in a complex system's starting conditions can have dramatic and unpredictable results.
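To make the butterfly effect concrete, here's a minimal sketch using the logistic map, a standard textbook example of a chaotic deterministic system (my own illustrative choice, not necessarily the system studied in the paper). Two starting points differing by one part in a billion follow the same deterministic rule, yet end up nowhere near each other:

```python
# Sensitive dependence on initial conditions ("the butterfly effect"),
# illustrated with the logistic map x -> r*x*(1-x) at r = 4, a standard
# textbook chaotic system. Both orbits obey exactly the same rule;
# only the starting point differs, by one part in a billion.

def logistic(x, r=4.0):
    """One step of the logistic map."""
    return r * x * (1.0 - x)

def trajectory(x0, steps, r=4.0):
    """Iterate the map from x0, returning the full orbit."""
    orbit = [x0]
    for _ in range(steps):
        orbit.append(logistic(orbit[-1], r))
    return orbit

# Two almost-identical starting points...
a = trajectory(0.400000000, 50)
b = trajectory(0.400000001, 50)

# ...whose orbits diverge until they are macroscopically different.
print(max(abs(x - y) for x, y in zip(a, b)))
```

The initial gap roughly doubles with every step, so after a few dozen iterations the two futures are completely uncorrelated; prediction in practice fails even though the rule is perfectly deterministic.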
But even a chaotic system tends to reach some kind of equilibrium eventually. This 'attractor' is the state the system settles into once it has explored all the possibilities it's going to: when the more things change, the more they stay the same. There are also 'repellors', areas of possibility that the system tends to shy away from. Both change at different points in the system's lifetime. What areas are attracting and repelling our process when it's operating in its normal, non-chaotic state? How does that change when it becomes fully chaotic? And, most interestingly, what's going on at that critical point when it's just about to tip over into chaos? This 'edge of chaos' is a curious place, mathematically. Strange things happen there.
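Attractors and repellors are easy to see in the same logistic map before chaos sets in (again an illustrative choice, not a system from the paper). With the growth parameter set low, almost every starting point is drawn to one equilibrium, while the fixed point at zero pushes orbits away:

```python
# A simple attractor and repellor in the logistic map x -> r*x*(1-x)
# with r = 2.5. The fixed point x* = 1 - 1/r = 0.6 attracts almost
# every orbit; the fixed point at x = 0 repels them.

def settle(x0, r=2.5, steps=200):
    """Run the system long enough to reach its long-term behaviour."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

# Wildly different starting points all end up at the same attractor...
print(settle(0.1), settle(0.5), settle(0.9))

# ...and a point started a hair's breadth from the repellor at x = 0
# is pushed away from it, towards the attractor.
print(settle(1e-6))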
There's more than one route to chaos. A common one, though, is the period-doubling cascade: the system's repeating pattern splits into one that takes twice as long to repeat, then twice as long again, the doublings arriving faster and faster until it tips into chaos and unpredictability. Write down the equation for this process, use it to plot the results once it reaches its attractor, and you'll end up with a startling fractal pattern. This is the famous Feigenbaum attractor, and its existence is well known. What wasn't so clearly understood, however, was exactly how it arises, and what the steps along the way look like.
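The cascade itself can be watched numerically in the logistic map, the system in which Feigenbaum originally found this route to chaos (the specific parameter values below are standard textbook illustrations, not figures from the paper). As the parameter r rises, the long-run behaviour doubles from a single fixed point to a 2-cycle, a 4-cycle, an 8-cycle, and so on:

```python
# The period-doubling cascade in the logistic map x -> r*x*(1-x).
# For each r, we discard a long transient and then count how many
# distinct values the orbit visits: that count is the period.

def attractor_points(r, transient=1000, sample=64):
    """Discard the transient, then sample the long-run orbit."""
    x = 0.5
    for _ in range(transient):
        x = r * x * (1.0 - x)
    points = set()
    for _ in range(sample):
        x = r * x * (1.0 - x)
        points.add(round(x, 6))  # round so near-identical floats merge
    return sorted(points)

# Standard illustrative values: period 1, 2, 4, 8.
for r in (2.9, 3.2, 3.5, 3.55):
    print(f"r = {r}: period {len(attractor_points(r))}")
```

The r values at which each doubling happens bunch up geometrically, accumulating at roughly r ≈ 3.5699; at that limit the period is effectively infinite, and the set of points the orbit visits is the fractal Feigenbaum attractor the article describes.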
The researchers here broke down the process, watching the gradual evolution of the attractor as it changed from the simple limit of the non-chaotic system to the perfectly formed fractal of chaos. The path was not a smooth one. Studying their results, the authors remarked on its complex structure, its many 'rough, jagged features'. And it got more interesting the closer they looked. With each step, the attractor accumulated a hierarchical structure, a pattern emerging on multiple scales; the fractal building up little by little, taking shape before their eyes. We now have the equations and histograms to describe exactly how that process happens.
The press release vaguely mentions that the results might lead to better understanding of chaotic natural phenomena. That's difficult to judge, as it doesn't explain exactly which phenomena, or how.
But if we really have that ability to see beauty in pure mathematics, maybe it doesn't matter so much. Maybe what this story really needs is not a vague practical 'application' tacked onto it to justify its existence, but a visualisation of the data, so we could see the system evolving from simplicity to chaos, along that wild and rugged road into multifractal complexity. The idea of it is beautiful, I think. But seeing is believing. And sometimes it takes more than a histogram.