The Azimuth Project
Blog - network theory (part 24)

This page is a blog article in progress, written by John Baez. To see discussions of this article while it is being written, go to the Azimuth Forum.

Review

The storm is brewing for the final climax, so let’s remember where we are. We start with a stochastic reaction network:

This consists of the following data (a toy example is spelled out in the sketch after this list):

• finite sets of transitions $T$, complexes $K$ and species $S$,

• a map $r: T \to (0,\infty)$ giving a rate constant for each transition,

• source and target maps $s, t : T \to K$ saying where each transition starts and ends,

• a map $Y : K \to \mathbb{N}^S$ saying how each complex is made of species.
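To make this data concrete, here is a minimal Python sketch of one hypothetical network, A + B ⇌ C, with made-up rate constants. It is only an illustration, not an example from this article; the later sketches reuse it.

```python
import numpy as np

# Hypothetical toy network: A + B <-> C.
species     = ['A', 'B', 'C']            # S
complexes   = ['A+B', 'C']               # K
transitions = ['A+B -> C', 'C -> A+B']   # T

r = {'A+B -> C': 2.0, 'C -> A+B': 1.0}   # rate constant for each transition
s = {'A+B -> C': 'A+B', 'C -> A+B': 'C'} # source of each transition
t = {'A+B -> C': 'C',   'C -> A+B': 'A+B'}  # target of each transition

# Y as a (species x complex) matrix of natural numbers: column j says how
# many of each species the jth complex contains.
Y = np.array([[1, 0],    # A
              [1, 0],    # B
              [0, 1]])   # C
```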

Then we extend $s, t$ and $Y$ to linear maps:

$$s, t : \mathbb{R}^T \to \mathbb{R}^K, \qquad Y : \mathbb{R}^K \to \mathbb{R}^S$$

Then we put inner products on these vector spaces as described last time, which lets us ‘turn around’ the maps $s$ and $t$ by taking their adjoints:

$$s^\dagger, t^\dagger : \mathbb{R}^K \to \mathbb{R}^T$$

More surprisingly, we can ‘turn around’ $Y$ and get a nonlinear map using ‘matrix exponentiation’:

$$\begin{array}{ccc} \mathbb{R}^S &\to& \mathbb{R}^K \\ x &\mapsto& x^Y \end{array}$$

This is most easily understood by thinking of $x$ as a row vector and $Y$ as a matrix:

$$\begin{array}{ccl} x^Y &=& {\left( \begin{array}{cccc} x_1 , & x_2 , & \dots, & x_k \end{array} \right)}^{ \left( \begin{array}{cccc} Y_{11} & Y_{12} & \cdots & Y_{1 \ell} \\ Y_{21} & Y_{22} & \cdots & Y_{2 \ell} \\ \vdots & \vdots & \ddots & \vdots \\ Y_{k1} & Y_{k2} & \cdots & Y_{k \ell} \end{array} \right)} \\ \\ &=& \left( x_1^{Y_{11}} \cdots x_k^{Y_{k1}} ,\; \dots, \; x_1^{Y_{1 \ell}} \cdots x_k^{Y_{k \ell}} \right) \end{array}$$

Remember, complexes are made out of species. The matrix entry $Y_{i j}$ says how many things of the $j$th species there are in a complex of the $i$th kind. If $\psi \in \mathbb{R}^K$ says how many complexes there are of each kind, $Y \psi \in \mathbb{R}^S$ says how many things there are of each species. Conversely, if $x \in \mathbb{R}^S$ says how many things there are of each species, $x^Y \in \mathbb{R}^K$ says how many ways we can build each kind of complex from them.
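Here is a small numerical sketch of the matrix exponentiation for the toy $Y$ introduced above; the amounts of each species are arbitrary.

```python
import numpy as np

# Y for the toy network A + B <-> C: rows are the species A, B, C;
# columns are the complexes A+B and C.
Y = np.array([[1, 0],
              [1, 0],
              [0, 1]])

def x_to_the_Y(x, Y):
    """x^Y: the jth component is the product over species i of x_i^(Y_ij)."""
    return np.prod(x[:, None] ** Y, axis=0)

x = np.array([2.0, 3.0, 5.0])    # amounts of A, B, C
print(x_to_the_Y(x, Y))          # [6. 5.]: 2*3 ways to build A+B, 5 ways to build C
```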

So, we get these maps:

$$s, t : \mathbb{R}^T \to \mathbb{R}^K, \qquad s^\dagger, t^\dagger : \mathbb{R}^K \to \mathbb{R}^T$$

$$Y : \mathbb{R}^K \to \mathbb{R}^S, \qquad x \mapsto x^Y : \; \mathbb{R}^S \to \mathbb{R}^K$$

Next, the boundary operator

$$\partial : \mathbb{R}^T \to \mathbb{R}^K$$

describes how each transition causes a change in complexes:

$$\partial = t - s$$

As we saw last time, there is a Hamiltonian

$$H : \mathbb{R}^K \to \mathbb{R}^K$$

describing a Markov process on the set of complexes, given by

$$H = \partial s^\dagger$$
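Continuing the toy network, here is a sketch of $\partial$ and $H$ as explicit matrices. It assumes, as described last time, that the inner product on $\mathbb{R}^T$ builds in the rate constants, so that $s^\dagger$ sends a function on complexes to its value at each transition's source times that transition's rate constant; the particular rates are invented.

```python
import numpy as np

# Toy network A + B <-> C: two complexes (A+B, C) and two transitions.
r = np.array([2.0, 1.0])            # rate constants
s_mat = np.array([[1, 0],           # source map as a (complex x transition) 0/1 matrix
                  [0, 1]])
t_mat = np.array([[0, 1],           # target map as a (complex x transition) 0/1 matrix
                  [1, 0]])

boundary = t_mat - s_mat            # ∂ = t - s : R^T -> R^K

# Adjoint of s with the rate-weighted inner product on R^T:
# (s^† φ)(τ) = r(τ) φ(source of τ).
s_dagger = np.diag(r) @ s_mat.T     # s^† : R^K -> R^T

H = boundary @ s_dagger             # H = ∂ s^† : R^K -> R^K
print(H)
print(H.sum(axis=0))                # columns sum to zero, as for a Markov process generator
```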

But the star of the show is the rate equation. This describes how the number of things of each species changes with time. We write these numbers in a list and get a vector $x \in \mathbb{R}^S$ with nonnegative components. The rate equation says:

$$\displaystyle{ \frac{d x}{d t} = Y H x^Y }$$

We can read this as follows (the sketch after this list steps through each piece for the toy network):

• $x$ says how many things of each species we have now.

• $x^Y$ says how many complexes of each kind we can build from these species.

• $s^\dagger x^Y$ says how many transitions of each kind can start from these complexes, with each transition weighted by its rate.

• $H x^Y = \partial s^\dagger x^Y$ is the rate of change of the number of complexes of each kind, due to these transitions.

• $Y H x^Y$ is the rate of change of the number of things of each species.
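For the toy network, a sketch like the following evaluates each of these pieces and takes one crude Euler step of the rate equation; the state $x$ and the rates are made up for illustration.

```python
import numpy as np

# Toy data from the earlier sketches (hypothetical network A + B <-> C).
Y = np.array([[1, 0], [1, 0], [0, 1]])
r = np.array([2.0, 1.0])
s_mat = np.array([[1, 0], [0, 1]])
t_mat = np.array([[0, 1], [1, 0]])
s_dagger = np.diag(r) @ s_mat.T
H = (t_mat - s_mat) @ s_dagger

def x_to_the_Y(x, Y):
    return np.prod(x[:, None] ** Y, axis=0)

x = np.array([1.0, 2.0, 3.0])      # current amounts of A, B, C

complexes = x_to_the_Y(x, Y)       # x^Y: complexes we can build
outflow   = s_dagger @ complexes   # s^† x^Y: transitions, weighted by their rates
dcomplex  = H @ complexes          # H x^Y: rate of change of complexes
dspecies  = Y @ dcomplex           # Y H x^Y: rate of change of species

# One crude Euler step for the rate equation dx/dt = Y H x^Y:
dt = 0.01
x_next = x + dt * dspecies
print(dspecies, x_next)
```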

The zero deficiency theorem

We are looking for equilibrium solutions of the rate equation, where the number of things of each species doesn’t change at all:

$$Y H x^Y = 0$$

In fact we will find complex balanced equilibrium solutions, where even the number of complexes of each kind doesn’t change:

$$H x^Y = 0$$

More precisely, we have:

Deficiency Zero Theorem (Child’s Version). Suppose we have a reaction network obeying these two conditions:

  1. It is weakly reversible, meaning that whenever there’s a transition from one complex $\kappa$ to another $\kappa'$, there’s a directed path of transitions going back from $\kappa'$ to $\kappa$.

  2. It has deficiency zero, meaning that $\mathrm{im}\, \partial \,\cap\, \mathrm{ker}\, Y = \{ 0 \}$.

Then for any choice of rate constants there exists a complex balanced equilibrium solution of the rate equation where all species are present in nonzero amounts. In other words, there exists $x \in (0,\infty)^S$ with

$$H x^Y = 0$$
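For the toy network A + B ⇌ C both hypotheses are easy to check by inspection, but here is a sketch of how one might verify them numerically, together with a complex balanced equilibrium guessed by hand. None of this is part of the proof; it is only an illustration with invented rates.

```python
import numpy as np

# Toy network A + B <-> C: weakly reversible by inspection, since each
# transition has a reverse transition.
Y = np.array([[1, 0], [1, 0], [0, 1]])
r = np.array([2.0, 1.0])
s_mat = np.array([[1, 0], [0, 1]])
t_mat = np.array([[0, 1], [1, 0]])
boundary = t_mat - s_mat
H = boundary @ (np.diag(r) @ s_mat.T)

def null_space_basis(M, tol=1e-10):
    """Columns spanning ker M, computed from the SVD."""
    _, sing, Vt = np.linalg.svd(M)
    rank = int((sing > tol).sum())
    return Vt[rank:].T

# Deficiency zero means im ∂ ∩ ker Y = {0}.  Use
# dim(L ∩ M) = dim L + dim M - dim(L + M) inside R^K.
ker_Y   = null_space_basis(Y)
dim_im  = np.linalg.matrix_rank(boundary)
dim_ker = ker_Y.shape[1]
dim_sum = np.linalg.matrix_rank(np.hstack([boundary, ker_Y]))
print("deficiency zero?", dim_im + dim_ker - dim_sum == 0)   # True

# A complex balanced equilibrium, guessed so that r1*a*b = r2*c:
x = np.array([1.0, 1.0, r[0] / r[1]])
x_Y = np.prod(x[:, None] ** Y, axis=0)
print(H @ x_Y)                                               # ~ [0, 0]
```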

Proof of the zero deficiency theorem


Proof. Because our reaction network is weakly reversible, the theorems in Part 23 show there exists $\psi \in (0,\infty)^K$ with

$$H \psi = 0$$

This $\psi$ may not be of the form $x^Y$, but we shall adjust $\psi$ so that it becomes of this form, while still remaining a solution of $H \psi = 0$.

We need to use a few facts from linear algebra. If $V$ is a finite-dimensional vector space with inner product, the orthogonal complement $L^\perp$ of a subspace $L \subseteq V$ consists of vectors that are orthogonal to everything in $L$:

$$L^\perp = \{ v \in V : \; \forall w \in L \;\; \langle v, w \rangle = 0 \}$$

We have

$$(L \cap M)^\perp = L^\perp + M^\perp$$

where $L$ and $M$ are subspaces of $V$ and $+$ denotes the sum of subspaces. Also, if $T: V \to W$ is a linear map between finite-dimensional vector spaces with inner product, we have

$$(\mathrm{ker}\, T)^\perp = \mathrm{im}\, T^\dagger$$

and

$$(\mathrm{im}\, T)^\perp = \mathrm{ker}\, T^\dagger$$

Now, because our reaction network has deficiency zero, we know that

$$\mathrm{im}\, \partial \,\cap\, \mathrm{ker}\, Y = \{ 0 \}$$

Taking the orthogonal complement of this subspace of $\mathbb{R}^K$, we get

$$(\mathrm{im}\, \partial \,\cap\, \mathrm{ker}\, Y)^\perp = \mathbb{R}^K$$

but using the rules we mentioned, we obtain

$$\mathrm{ker}\, \partial^\dagger + \mathrm{im}\, Y^\dagger = \mathbb{R}^K$$
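To see this identity in action on the toy network, here is a numerical sketch. It assumes the standard inner product on $\mathbb{R}^K$, so the adjoints below are plain transposes; the rate weighting on $\mathbb{R}^T$ only rescales $\partial^\dagger$ by positive factors, which does not change its kernel.

```python
import numpy as np

# Toy network A + B <-> C again.
Y = np.array([[1, 0], [1, 0], [0, 1]])
s_mat = np.array([[1, 0], [0, 1]])
t_mat = np.array([[0, 1], [1, 0]])
boundary = t_mat - s_mat

def null_space_basis(M, tol=1e-10):
    """Columns spanning ker M, computed from the SVD."""
    _, sing, Vt = np.linalg.svd(M)
    rank = int((sing > tol).sum())
    return Vt[rank:].T

# ker ∂† ⊆ R^K equals the kernel of the transpose, since all rates are positive.
ker_boundary_dagger = null_space_basis(boundary.T)
im_Y_dagger = Y.T                       # columns span im Y† ⊆ R^K

together = np.hstack([ker_boundary_dagger, im_Y_dagger])
print(np.linalg.matrix_rank(together) == boundary.shape[0])   # True: ker ∂† + im Y† = R^K
```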

category: blog