# Contents

## Idea

Stochastic resonance is an effect whereby the response of a system to an external periodic signal is amplified by the presence of noise. Simply put, the noise makes it easier for the system to hop back and forth between two stable states.

This phenomenon has been suggested as a possible mechanism whereby small periodic changes in the Earth’s orbital parameters are amplified by the presence of random changes in the weather, thus leading to significant glacial cycles. For details, see Milankovitch cycle.

A less controversial example is found in the neurons of crickets, which try to detect an approaching predator, such as a bird, by sensing the periodic up- and downturns of air pressure caused by the bird’s wing beats. The presence of stochastic resonance in such detection systems means that more noise can increase the signal-to-noise ratio.

Systems displaying stochastic resonance are often modeled using stochastic differential equations.

## Examples

### A simple model

We will look at a one-dimensional system, a particle in one space dimension, described by a Langevin equation with a double-well potential and a time-periodic forcing. The time-independent potential is:

$V(x) := \frac{1}{4} x^4 - \frac{1}{2} x^2$

The periodic forcing, or ‘signal’, is

$- A \sin(t) \, x$

Sticking to our example, this is the periodic change in air pressure due to an approaching bird. We have introduced a constant $A$, the amplitude of the periodic forcing.

The time dependent effective potential of the system is:

$V(x, t) := \frac{1}{4} x^4 - \frac{1}{2} x^2 - A \sin(t) \, x$

Including noise leads to this Langevin equation:

$dX_t = -V'(X_t, t) \, dt + \sqrt{2D} \; dW_t = (X_t - X_t^3 + A \sin(t)) \, dt + \sqrt{2D} \; dW_t$
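As a concrete illustration, here is a minimal sketch of how sample paths of this Langevin equation can be simulated with the Euler–Maruyama scheme. The class name, parameter values and step size are illustrative assumptions, not taken from the simulations shown below:

```java
import java.util.Random;

/** Euler-Maruyama sketch for dX_t = (X_t - X_t^3 + A sin(t)) dt + sqrt(2D) dW_t.
 *  Illustrative code, not the Azimuth code project implementation. */
public class StochasticResonance {

    /** Simulate one sample path starting at x = 0 and return the positions. */
    public static double[] simulate(double A, double D, double dt, int steps, long seed) {
        Random rng = new Random(seed);
        double[] x = new double[steps + 1];
        double t = 0.0;
        // A Brownian increment over dt has standard deviation sqrt(dt),
        // so the noise added per step is sqrt(2 D dt) times a standard normal sample.
        double noise = Math.sqrt(2.0 * D * dt);
        for (int i = 0; i < steps; i++) {
            double drift = x[i] - x[i] * x[i] * x[i] + A * Math.sin(t);
            x[i + 1] = x[i] + drift * dt + noise * rng.nextGaussian();
            t += dt;
        }
        return x;
    }

    public static void main(String[] args) {
        // Illustrative parameters: weak forcing, moderate noise.
        double[] path = simulate(0.1, 0.25, 0.01, 100_000, 42L);
        System.out.println("final position: " + path[path.length - 1]);
    }
}
```

With $D = 0$ the same loop reduces to the deterministic Euler scheme, so varying $D$ lets one explore the qualitative regimes discussed below.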

The time-independent potential has two local minima at $x = \pm 1$.

In the context of the example from the introduction, the case $x \lt 0$ is that “the neuron is not active” and the case $x \gt 0$ is that “the neuron is active” (active in the sense that the neuron sends an electrical signal along its axon).

The potential barrier between the minima has a height of $\frac{1}{4}$. If the amplitude of the periodic forcing is below this value, and there is no noise ($D = 0$), the system cannot make a transition from one minimum to the other. Therefore, if we observe the system, i.e. the position of our particle as a function of time $x(t)$, we won’t see any signal. In our example, the neurons will stay in one state, like “not active”, and the poor cricket will have no chance to detect what is approaching.
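This subthreshold behaviour is easy to check numerically. The following sketch uses the explicit Euler method with illustrative parameters ($A = 0.1$, step size $0.001$); it starts the noiseless system in the left well and records how far the particle ever moves:

```java
/** Noiseless check: with weak forcing the particle stays trapped in one well.
 *  Illustrative parameters, not taken from the plots in this article. */
public class SubthresholdCheck {
    public static void main(String[] args) {
        double x = -1.0, t = 0.0;       // start in the left minimum
        double dt = 0.001, A = 0.1;     // forcing amplitude well below the barrier
        double min = x, max = x;
        for (int i = 0; i < 200_000; i++) {   // integrate up to t = 200
            x += (x - x * x * x + A * Math.sin(t)) * dt;  // Euler step, D = 0
            t += dt;
            min = Math.min(min, x);
            max = Math.max(max, x);
        }
        // The trajectory just wobbles around x = -1 and never crosses the barrier at x = 0.
        System.out.println("min = " + min + ", max = " + max);
    }
}
```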

If we increase $D$ to a very high value, the particle will jump randomly back and forth between the two minima, and again we won’t see any signal. Heuristically, there should therefore be an intermediate value of $D$ for which the signal-to-noise ratio is optimal: roughly speaking, resonance occurs when the average noise-induced hopping time between the wells matches half the period of the forcing. At this value, the white noise raises the signal-to-noise ratio to a level at which the signal becomes observable.

### What is the “signal-to-noise ratio”?

There is no general definition of the “signal-to-noise ratio” that applies to all kinds of systems, so one has to either be content with an intuitive understanding of the term, or define it for the specific system under scrutiny.

For our model system, let’s assume that the expectation value of $x(t)$ has a Fourier expansion like this:

$\langle x(t) \rangle = \sum_{n = - \infty}^{\infty} M_n \exp{(i n t)}$

Then one useful definition of the signal-to-noise ratio $\eta$ is

$\eta := \frac{|M_1|^2}{A^2}$
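For a path sampled at discrete times, $M_1$ can be estimated by a Riemann sum for $\frac{1}{T} \int_0^T x(t) e^{-i t} \, dt$ taken over a whole number of forcing periods. Here is a hedged sketch of such an estimator; the class and method names are made up for illustration:

```java
/** Estimate eta = |M_1|^2 / A^2 from a path sampled at times t_k = k * dt.
 *  The samples should cover a whole number of forcing periods (period 2 pi). */
public class SignalToNoiseRatio {

    public static double eta(double[] x, double dt, double A) {
        double T = x.length * dt;
        double re = 0.0, im = 0.0;
        for (int k = 0; k < x.length; k++) {
            double t = k * dt;
            // Riemann sum for M_1 = (1/T) * integral of x(t) * exp(-i t) dt
            re += x[k] * Math.cos(t) * dt;
            im -= x[k] * Math.sin(t) * dt;
        }
        re /= T;
        im /= T;
        return (re * re + im * im) / (A * A);
    }

    public static void main(String[] args) {
        // Sanity check on a pure signal x(t) = 0.1 cos(t), for which M_1 = 0.05:
        int n = 10_000;
        double dt = 2.0 * Math.PI / n;
        double[] x = new double[n];
        for (int k = 0; k < n; k++) x[k] = 0.1 * Math.cos(k * dt);
        System.out.println("eta = " + eta(x, dt, 0.1)); // approximately 0.25
    }
}
```

In practice one would average this estimate over many independent sample paths to approximate the expectation value $\langle x(t) \rangle$.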

### Computer simulation

We’ll look at some sample paths obtained from a numerical simulation. The background in the graphs below indicates the potential $V(x, t)$: dark areas show low values and pale areas show high values (the relationship is not linear, however). The particle always starts at position $X(t=0) = 0$. The plots were made using R.

First, here is a simulation of a sample path with a small value of the diffusion constant $D$. We see that the system does not make any transition to the other minimum in the simulated time range.

Second, here is a simulation with a very high value of $D$, such that the periodic forcing is completely lost in the random fluctuations of the particle.

Finally, here is a simulation using an intermediate value of $D$. Here we see that the signal can be distinguished from the noise. Since the amplitude of the signal is $0.1$, while the “amplitude” of our system is around 1 (ignoring all questions about measurement units for the moment), this shows that it is possible to find systems where the influence of noise actually increases the signal-to-noise ratio of the original signal.

### Interpretation

Our model is so simple and generic that it is open to many interpretations, and has applications to many different systems. With regard to Earth’s climate, we could for example assume for the moment that the x-position represents an average surface temperature, that the periodic forcing comes from periodically recurring astronomical configurations (through gravitation), and that we model everything else as white noise.

According to climate scientists there may be several locally stable states of the climate with differing average temperatures; this is, for example, a result obtained from energy balance models. One possible scenario is a completely frozen Earth, Snowball Earth. Our model is therefore an example of how seemingly random influences can cause a transition from one local minimum to another, by amplifying the effect of an otherwise invisibly weak forcing.

### Implementation

The file R code to make graphs for stochastic resonance was used to create the graphs.

Java source code for the same model is available from the Azimuth code project.

It is a very simple implementation in Java (version 6), using the simplest random number generators and the Euler scheme.

### Interactive online examples

Allan Erskine and Jim Stuttard helped to create an interactive online model which you can find here (last visited on 22.01.2013, hosted by Glynn Adgie?):

This model was written for, and explained in, the blog post Blog - increasing the signal-to-noise ratio with more noise.

Michael Knap modified this model to get one more explicitly connected to climate physics, which is here:

This model was explained in the blog post Mathematics of the environment (part 8).

Stochastic resonance is an active area of research; here are two review papers:

For an application to climate science, see for example this:

For applications to the glacial cycles, see Milankovitch cycle.

For attempts to rigorously define stochastic resonance and prove it exists, see:

In the absence of noise, the Langevin equation described on this page reduces to this first-order differential equation:

$\dot X = X - X^3 + A \sin(t)$

This can be seen as a special limiting case of the Duffing equation

$\ddot X = \alpha X + \beta \dot X + \gamma X^3 + \delta \sin(\omega t)$

describing a damped anharmonic oscillator with a sinusoidal driving force.

The Duffing equation describes complex phenomena such as this:

To regain the stochastic resonance model discussed here, we must first take a high-damping limit in which the $\ddot X$ term of the Duffing equation becomes negligible (setting $\ddot X = 0$ gives $\beta \dot X = - \alpha X - \gamma X^3 - \delta \sin(\omega t)$, a first-order equation of the same form as above), and then add noise. For work on the Duffing equation with noise, see:

Abstract: This paper is concerned with attractors of randomly perturbed dynamical systems, called random attractors. The framework used is provided by the theory of random dynamical systems. We first define, analyze, and prove existence of random attractors. The main result is a technique, similar to Lyapunov’s direct method, to ensure existence of random attractors for random differential equations. This method is formulated as a generally applicable procedure. As an illustration we shall apply it to the random Duffing-van der Pol equation. We then show, by the same example, that random attractors provide an important tool to analyze the bifurcation behavior of stochastically perturbed dynamical systems. We introduce new methods and techniques, and we investigate the Hopf bifurcation behavior of the random Duffing-van der Pol equation in detail. In addition, the relationship of random attractors to invariant measures and unstable sets is studied.

For applications to neurobiology, see: