On stability loss delay for dynamical bifurcations
In classical bifurcation theory, the behavior of a system depending on a parameter is studied for parameter values close to some critical (bifurcational) value. In the theory of dynamical bifurcations, the parameter changes slowly in time and passes through a value that would be bifurcational in the classical static theory. Some of the phenomena arising here differ drastically from the predictions of the static approach. Suppose that at a bifurcational value of the parameter an equilibrium or a limit cycle loses its asymptotic linear stability but remains non-degenerate. It turns out that in analytic systems the loss of stability is inevitably delayed: phase points remain near the unstable equilibrium (cycle) for a long time after the bifurcation, during which the parameter changes by a quantity of order 1. In general, no such delay occurs in non-analytic (even infinitely smooth) systems. This paper surveys some background on the stability loss delay phenomenon.
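The delay can already be seen in the simplest linear model: dx/dt = mu(t) x with the parameter mu drifting slowly through zero. Below is a minimal numerical sketch (illustrative, not taken from the paper; the drift rate `eps`, initial parameter `mu0`, and escape threshold are arbitrary choices) that integrates this model and reports the parameter value at which the trajectory finally leaves a small neighbourhood of the equilibrium.

```python
def escape_parameter(eps=1e-2, mu0=-0.5, x0=1e-3, dt=1e-2):
    """Integrate dx/dt = mu(t) * x with a slowly drifting parameter
    mu(t) = mu0 + eps * t, and return the value of mu at which the
    trajectory first leaves a small neighbourhood of x = 0."""
    mu, x = mu0, x0
    while mu < 1.0:
        x += dt * mu * x   # Euler step of the linearized dynamics
        mu += dt * eps     # slow drift of the parameter
        if abs(x) > 10.0 * abs(x0):
            return mu      # stability loss becomes visible here
    return None

# The equilibrium x = 0 becomes linearly unstable at mu = 0, yet the
# trajectory escapes only near mu = |mu0|: a delay of order 1 in the
# parameter, independent of how small the drift rate eps is.
print(escape_parameter())
```

In this model the delay is exact: x(mu) = x0 exp((mu^2 - mu0^2)/(2 eps)), so the trajectory returns to its initial amplitude only at mu = |mu0|, however slow the drift.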