STOCHASTIC PROCESSES, MEAN SQUARE CALCULUS

Let {x(t), t ∈ T} be a stochastic process. For any finite set t1, …, tn, ti ∈ T, the joint distribution function of the random variables x(t1), …, x(tn) is called a finite dimensional distribution of the process. The stochastic process can be characterized by specifying the joint density function p(x1, …, xn; t1, …, tn).

Classification of Stochastic Processes

                          Discrete parameter set      Continuous parameter set
Continuous state space    Random sequence             Stochastic process (random function)
Discrete state space      Discrete parameter chain    Continuous parameter chain

Example 5.1. At times 0, 1, 2, … we toss a fair coin. For each time we define a random variable

wn(ω) = { +Δx ,  ω = h (heads)
        { −Δx ,  ω = t (tails)

We assume that the walk starts at x0 = 0 and define the random variable

xn = Σ_{i=0}^{n−1} wi ,  n = 1, 2, …

It is clear that {xn, n = 0, 1, 2, …} is a discrete parameter chain.
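Example 5.1 is easy to simulate directly. The sketch below (with an assumed step Δx = 1, an assumed seed, and a helper name random_walk of our own choosing) draws fair coin tosses and accumulates the increments wi:

```python
import random

def random_walk(n_steps, dx=1.0, seed=0):
    """One realization of the coin-toss walk of Example 5.1:
    w_i = +dx (heads), -dx (tails), x_n = w_0 + ... + w_{n-1}, x_0 = 0."""
    rng = random.Random(seed)
    x = [0.0]                                    # x_0 = 0
    for _ in range(n_steps):
        w = dx if rng.random() < 0.5 else -dx    # fair coin toss
        x.append(x[-1] + w)
    return x

path = random_walk(10)
print(path)   # 11 values, each step differing by +-1 from the previous
```

Every realization is a sequence on the discrete state space {0, ±Δx, ±2Δx, …}, indexed by the discrete parameter n, as the classification above indicates.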


The most important are the first order and the second order densities p(x; t) and p(x1, x2; t1, t2), t1, t2 ∈ T.

The mean value function (expected value) is defined as

m(t) = E{x(t)} = ∫_{−∞}^{∞} x p(x; t) dx    (5.1)

The (auto) correlation function is defined as

R(t1, t2) = E{x(t1) x(t2)} = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 p(x1, x2; t1, t2) dx1 dx2    (5.2)

The (auto) covariance function is defined as

C(t1, t2) = E{[x(t1) − m(t1)][x(t2) − m(t2)]} = R(t1, t2) − m(t1) m(t2)    (5.3)
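Definitions (5.1)–(5.3) can be estimated from an ensemble of realizations by averaging across the ensemble at fixed times. A minimal sketch, using an assumed toy process x(t) = a·t + noise (not from the text; a is a random slope with unit variance, so C(t1, t2) ≈ t1·t2):

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of realizations of x(t) = a*t + small noise, sampled at t = 0..9
n_real, n_t = 4000, 10
a = rng.standard_normal((n_real, 1))     # random slope, one per realization
t = np.arange(n_t)
x = a * t + 0.1 * rng.standard_normal((n_real, n_t))

m = x.mean(axis=0)                       # estimate of m(t), eq. (5.1)
R = (x.T @ x) / n_real                   # estimate of R(t1, t2), eq. (5.2)
C = R - np.outer(m, m)                   # estimate of C(t1, t2), eq. (5.3)

print(float(C[2, 3]))   # close to t1*t2 = 6 for this toy process
```

The subtraction in the last line is exactly the identity C(t1, t2) = R(t1, t2) − m(t1)m(t2) of (5.3), applied entrywise to the time grid.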

STATIONARY PROCESSES

A process is strictly stationary if its probability laws are invariant under a shift of time:

p(x1, …, xn; t1, …, tn) = p(x1, …, xn; t1 + τ, …, tn + τ)    (5.4)

for all finite sets ti ∈ T and for any τ such that ti + τ ∈ T.

The process is strictly stationary of order k if (5.4) holds for all n ≤ k. For k = 1 the first order density is independent of time,

p(x; t) = p(x) .    (5.5)

Thus the expected value m(t) is constant.

For k = 2 the second order density can depend only on the difference t2 − t1; thus the correlation and covariance functions are

R(t, t+τ) = E{x(t) x(t+τ)} = R(τ) ,    (5.6)
C(t, t+τ) = E{[x(t) − m][x(t+τ) − m]} = C(τ) .

The stochastic process {x(t), t ∈ T} is said to be weakly stationary (or stationary in the wide sense, or covariance stationary) if it has finite second moments, its mean value function is constant, and its correlation function depends only on the time difference τ.

The stochastic process {x(t), t ∈ T} is said to have strictly stationary increments if the process {x(t+s) − x(t), t ∈ T} is strictly stationary for every s ∈ T.

Convergence of Random Sequences

There are a number of ways in which a sequence of random variables {xn} may converge as n → ∞.

The sequence {xn} is said to converge to x with probability 1 if

lim_{n→∞} xn(ω) = x(ω) ;  we write lim_{n→∞} xn = x w.p. 1 ,    (5.7)

for almost all realizations ω. (The convergence may fail only on an event A with probability Pr(A) = 0.)

The sequence {xn} is said to converge to x in probability if, for every ε > 0,

lim_{n→∞} Pr{ |xn(ω) − x(ω)| ≥ ε } = 0 ;  we write plim xn = x .    (5.8)

The sequence {xn} is said to converge to x in mean square if

E{|xn|²} < ∞ ,  E{|x|²} < ∞ ,  and  lim_{n→∞} E{|xn − x|²} = 0 .    (5.9)

We write l.i.m. xn = x .

The Cauchy criterion

lim_{n,m→∞} E{ |xn − xm|² } = 0 ,    (5.10)

is a necessary and sufficient condition for mean square convergence.
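The mean square limit in (5.9) can be illustrated numerically. For the sample mean xn = (u1 + … + un)/n of iid zero-mean, unit-variance variables, E{|xn − 0|²} = 1/n → 0. A Monte Carlo sketch (ensemble size, seed, and the chosen values of n are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# x_n = (u_1 + ... + u_n)/n for iid N(0,1) variables converges to 0
# in mean square, since E{|x_n - 0|^2} = 1/n.  Estimate over an ensemble.
n_real = 20000
u = rng.standard_normal((n_real, 400))

ms_error = {}
for n in (10, 100, 400):
    x_n = u[:, :n].mean(axis=1)
    ms_error[n] = float(np.mean(x_n ** 2))   # Monte Carlo estimate of E{|x_n|^2}
    print(n, ms_error[n])                    # close to 1/n
```

The estimated mean square errors decay like 1/n, exactly the behavior the definition requires.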

Mean Square Calculus

The random function x(t) is said to be continuous in mean square at t ∈ T if

l.i.m._{h→0} x(t+h) = x(t) ,  for t + h ∈ T .    (5.11)

Let R(t1, t2) be the correlation function of the random function. Let us introduce the random function y(t) = x(t+h) − x(t) and calculate the cross correlation function

Rxy(t1, t2) = E{ x(t1) [x(t2+h) − x(t2)] } = R(t1, t2+h) − R(t1, t2)

and then the correlation function

Ryy(t1, t2) = E{ [x(t1+h) − x(t1)] y(t2) } = Rxy(t1+h, t2) − Rxy(t1, t2)

Finally upon substitution

Ryy(t1, t2) = R(t1+h, t2+h) − R(t1+h, t2) − R(t1, t2+h) + R(t1, t2)

It follows that for t1=t2=t

Ryy(t, t) = E{ [x(t+h) − x(t)]² }    (5.12)

and thus the random function is mean square continuous if the correlation function is continuous at t1=t2=t.

Theorem. The random function X(t) is mean square continuous at t ∈ T if, and only if, R(t1, t2) is continuous at t1 = t2 = t.

Let us consider a derivative of a stochastic process.

The problem is simple when we know how to calculate the realizations of the stochastic process X(t) with mean value m(t) and covariance function R(t1, t2). If we can calculate the derivatives of all realizations, then the problem of the derivative of the process reduces to the problem of derivatives of a family of deterministic functions.

In view of the Cauchy criterion, a sufficient condition for mean square differentiability of X(t) is the convergence to zero, as h, s → 0, of

E{ | [X(t+h) − X(t)]/h − [X(t+s) − X(t)]/s |² } .

The final result is the following theorem.

Theorem. The random function X(t) is mean square differentiable at t ∈ T if, and only if, ∂²R(t1, t2)/∂t1∂t2 exists at t1 = t2 = t.

For stationary processes it is easier to prove similar theorems.

A stationary process X(t) is mean square continuous at t ∈ T if, and only if, R(τ) is continuous at τ = 0.

A stationary process X(t) is mean square differentiable at t ∈ T if, and only if, dR(τ)/dτ and d²R(τ)/dτ² exist at τ = 0.

For example, let us consider the stationary random process with the covariance function

C(t1, t2) = σ0² e^{−ητ} ,  τ = |t2 − t1| .

The covariance function is continuous at τ = 0, thus the random process is mean square continuous. The covariance function is not differentiable at τ = 0 (the right side and left side derivatives are different). The second derivative does not exist, so the random function is not mean square differentiable.
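The kink of this covariance function at τ = 0 is easy to exhibit numerically: the one-sided difference quotients approach −σ0²η from the right and +σ0²η from the left. A small sketch with assumed values σ0 = 1, η = 2:

```python
import math

# Assumed parameter values for illustration
sigma0, eta = 1.0, 2.0

def R(tau):
    """Covariance sigma0^2 * exp(-eta * |tau|)."""
    return sigma0**2 * math.exp(-eta * abs(tau))

h = 1e-6
right = (R(h) - R(0.0)) / h      # tends to -sigma0^2 * eta = -2
left = (R(0.0) - R(-h)) / h      # tends to +sigma0^2 * eta = +2
print(right, left)               # the one-sided derivatives disagree at tau = 0
```

Since the two limits differ, dR(τ)/dτ does not exist at τ = 0, confirming that the process is not mean square differentiable.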

Let us consider a Riemann integral of a stochastic process.

We introduce a partition of the interval [a, b], a, b ∈ T, a < b: a = t0 < t1 < … < tn = b. We denote ρ = max_i (ti+1 − ti) and choose intermediate points ti′ ∈ [ti, ti+1).

The random function is mean square Riemann integrable if the following limit, which then defines the integral, exists

l.i.m._{ρ→0} Σ_{i=0}^{n−1} X(ti′) (ti+1 − ti) = ∫_a^b X(t) dt    (5.13)

Theorem. The random function X(t) is mean square Riemann integrable over [a, b] if, and only if, R(t1, t2) is Riemann integrable over [a, b] × [a, b] (the Riemann integral ∫_a^b ∫_a^b R(t1, t2) dt1 dt2 exists).
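The double integral in the theorem is also the second moment of the integral itself: for a zero-mean process, E{Y²} = ∫∫ R(t1, t2) dt1 dt2 with Y = ∫_a^b X(t) dt. A sketch checking this for the exponential covariance above, comparing a Riemann sum against the closed form of the double integral (the parameter values σ0 = 1, η = 2, T = 1 are arbitrary choices):

```python
import numpy as np

# Assumed parameters: sigma0 = 1, eta = 2, interval [0, T] with T = 1
sigma0, eta, T = 1.0, 2.0, 1.0
n = 1000
t = (np.arange(n) + 0.5) * (T / n)           # midpoint grid on [0, T]
t1, t2 = np.meshgrid(t, t, indexing="ij")
R = sigma0**2 * np.exp(-eta * np.abs(t2 - t1))

# E{Y^2} = double Riemann integral of R(t1, t2) over [0,T] x [0,T],
# approximated by the Riemann sum whose existence the theorem requires
var_num = float(R.sum() * (T / n) ** 2)

# Closed form of the same double integral for this covariance
var_exact = sigma0**2 * (2.0 / eta) * (T + (np.exp(-eta * T) - 1.0) / eta)
print(var_num, var_exact)
```

The Riemann sum converges to the closed-form value, so the process is mean square Riemann integrable over [0, T] as the theorem asserts.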

If a random function X(t) is mean square integrable over [a, t] for every t ∈ [a, b], then

Y(t) = ∫_a^t X(τ) dτ    (5.14)

is a random function of t defined on [a, b].

Theorem. Let the random function X(t) be mean square integrable over [a, t] for every t ∈ [a, b]. Then Y(t) = ∫_a^t X(τ) dτ is mean square continuous on [a, b]. If X(t) is ms continuous on [a, b], then Y(t) is ms differentiable on [a, b] and Ẏ(t) = X(t).

The Fundamental Theorem of ms Calculus.

Let Ẋ(t) be ms Riemann integrable on [a, b]. Then

X(t) − X(a) = ∫_a^t Ẋ(τ) dτ    (5.15)

with probability one.

The Brownian Motion Process

A random function {X(t), t ∈ T} has independent increments if, for all finite sets {ti : ti < ti+1} ⊂ T, the random variables (vectors) X(t2) − X(t1), X(t3) − X(t2), …, X(tn) − X(tn−1) are independent.

The process {X(t), t ∈ T} has stationary independent increments if, in addition, X(t+h) − X(τ+h) has the same distribution as X(t) − X(τ) for every t > τ, t, τ ∈ T, and every h.

A random function B(t), t ≥ 0, is a Brownian motion (Wiener or Wiener–Lévy) process if

(i) B(t) has stationary independent increments;
(ii) for every t ≥ 0, B(t) is normally distributed;
(iii) for every t ≥ 0, E B(t) = 0;
(iv) Pr{B(0) = 0} = 1.

It follows from (ii) that B(t) − B(τ) is also normally distributed for every t, τ ≥ 0. It remains to specify the distribution of the increments B(t) − B(τ) for all t > τ ≥ 0. The mean E[B(t) − B(τ)], in view of (iii), is zero. From the definition it follows that

E B²(t) = E[B(t) − B(τ) + B(τ) − B(0)]² = E[B(t) − B(τ)]² + E B²(τ) ,

since the increments B(t) − B(τ) and B(τ) − B(0) are independent and have zero mean.

The function E B²(t) does not decrease when t increases. Thus the equation

E[B(t) − B(τ)]² = E B²(t) − E B²(τ)

has the solution

E B²(t) = σ²t ,  E[B(t) − B(τ)]² = σ²(t − τ) ,    (5.16)

and it is the only one.

Finally, the mean value of the increment is zero, m(t) = 0, and its variance is var[B(t) − B(τ)] = σ²(t − τ).

Let us calculate the correlation function (the covariance function is the same). For t > τ,

R(t, τ) = E B(t)B(τ) = E{[B(t) − B(τ) + B(τ)] B(τ)} = E{[B(t) − B(τ)] B(τ)} + E B²(τ) = 0 + σ²τ

Therefore in general for the Brownian motion process the correlation function is

R(t, τ) = σ² min(t, τ)    (5.17)

The correlation function is continuous for every t, τ. Thus the process is mean square continuous on [0, ∞). The second derivative ∂²R(t, τ)/∂t∂τ exists at no point t = τ, thus the process is nowhere ms differentiable. It is Riemann integrable on every finite t interval.

To compute the values we have to choose the time increment Δt and write the covariance function in terms of the increment:

Cij = σ² min(iΔt, jΔt) = σ² min(ti, tj)    (5.18)

This leads to the following form of the difference equation:

X0 = 0 ,  Xs − Xs−1 = σ √Δt us ,  s = 1, 2, … ,    (5.19)

where us is a term of a Gaussian white noise sequence with zero mean and unit variance. The expressions (5.18) and (5.19) reduce the problem in the continuum to the solution of a difference equation as described in chapter 4 (compare with the relations (4.6) and (4.5)).

It should be noted that (Xs − Xs−1)/√Δt = σ us; this expression is invariant with respect to the choice of the increment Δt, unlike the finite-difference relation for the derivative, (Xs − Xs−1)/Δt, whose variance σ²/Δt grows without bound as Δt → 0.
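The difference scheme (5.19) can be checked against (5.17) by simulating an ensemble of paths and estimating E{X(t)X(τ)}. A sketch (step size, horizon, seed, and ensemble size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)

sigma, dt = 1.0, 0.01
n_steps, n_real = 200, 20000
u = rng.standard_normal((n_real, n_steps))       # Gaussian white noise u_s
incr = sigma * np.sqrt(dt) * u                   # X_s - X_{s-1} = sigma*sqrt(dt)*u_s
X = np.concatenate([np.zeros((n_real, 1)),       # X_0 = 0
                    np.cumsum(incr, axis=1)], axis=1)

# Estimate R(t, tau) = E{X(t) X(tau)} at t = 1.5, tau = 0.6 and compare
# with sigma^2 * min(t, tau) = 0.6 from eq. (5.17)
s, r = 150, 60
R_hat = float(np.mean(X[:, s] * X[:, r]))
print(R_hat)   # close to 0.6
```

The ensemble estimate reproduces σ² min(t, τ), confirming that the discrete scheme (5.19) has the covariance structure (5.18) of the Brownian motion process.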