N. J. Balmforth, P. J. Morrison, J.-L. Thiffeault
Pattern formation in biological, chemical and physical problems has received considerable attention, much of it directed at dissipative systems. For example, the Ginzburg--Landau equation is a normal form that describes pattern formation due to the appearance of a single mode of instability in a wide variety of dissipative problems. In a similar vein, a certain "single-wave model" arises in many physical contexts that share common pattern-forming behavior. These systems have Hamiltonian structure, and the single-wave model is a kind of Hamiltonian mean-field theory describing the patterns that form in phase space. The single-wave model was originally derived in the context of nonlinear plasma theory, where it describes the behavior near threshold and the subsequent nonlinear evolution of unstable plasma waves. However, the single-wave model also arises in fluid mechanics (specifically shear-flow and vortex dynamics), galactic dynamics, the XY and Potts models of condensed matter physics, and other Hamiltonian theories characterized by mean-field interaction. We demonstrate, by a suitable asymptotic analysis, how the single-wave model emerges from a large class of nonlinear advection-transport theories. An essential ingredient for the reduction is that the Hamiltonian system has a continuous spectrum in the linear stability problem, arising not from an infinite spatial domain but from singular resonances along curves in phase space whereat wavespeeds match material speeds (wave-particle resonances in the plasma problem, or critical levels in fluid problems). The dynamics of the continuous spectrum is manifest as the phenomenon of Landau damping when the system is ... Such dynamical phenomena have been rediscovered in different contexts, which is unsurprising in view of the normal-form character of the single-wave model.
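For orientation, the single-wave model couples a Vlasov-like transport equation for a phase-space density to the amplitude of a single mean-field wave. The display below is a schematic form commonly seen in the plasma and vortex-dynamics literature, not the paper's own equations; the scalings, signs, and the linear coefficient gamma are illustrative assumptions that vary with context.

% Schematic single-wave model: phase-space density f(xi, u, tau)
% advected by a single wave of complex amplitude a(tau).
% gamma and the normalizations are placeholder conventions,
% not taken from the paper.
\begin{align}
  \partial_\tau f + u\,\partial_\xi f
      - \partial_\xi \varphi\,\partial_u f &= 0, \\
  \varphi(\xi,\tau) &= a(\tau)\,e^{i\xi} + a^*(\tau)\,e^{-i\xi}, \\
  \frac{da}{d\tau} &= \gamma\,a
      + \frac{i}{2\pi}\int_{0}^{2\pi}\!\!\int_{-\infty}^{\infty}
        f(\xi,u,\tau)\,e^{-i\xi}\,du\,d\xi .
\end{align}

In this schematic reading, the transport equation carries the continuous spectrum (resonances where u matches the wave speed), while the amplitude equation supplies the single unstable or damped mode; the paper's asymptotic analysis shows how such a reduced system emerges from more general advection-transport theories.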
View original:
http://arxiv.org/abs/1303.0065