Let $\{X(t),\, t \in T\}$ be a stochastic process. For any finite set $\{t_1, \dots, t_n\} \subset T$, the joint distribution function of the random variables $X(t_1), \dots, X(t_n)$ is called a finite dimensional distribution of the process. The stochastic process can be characterized by specifying the joint density function $f(x_1, \dots, x_n;\, t_1, \dots, t_n)$.
| | Discrete parameter set | Continuous parameter set |
| --- | --- | --- |
| Discrete state space | Discrete parameter chain | Continuous parameter chain |
| Continuous state space | Random sequence | Stochastic process (random function) |
Example 5.1. At times $n = 0, 1, 2, \dots$ we toss a fair coin. For each time $n$ we define a random variable $Y_n$ which takes the value $+1$ if the toss shows heads and $-1$ if it shows tails. We assume that the start is at $X_0 = 0$ and the random variable $X_n$ is defined as
$$X_n = Y_1 + Y_2 + \dots + Y_n.$$
It is clear that $\{X_n,\, n = 0, 1, 2, \dots\}$ is a discrete parameter chain.
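The chain of Example 5.1 is easy to simulate. The sketch below assumes the usual convention of steps $Y_n = \pm 1$ with equal probability and partial sums $X_n$; the function name `coin_walk` is illustrative.

```python
import random

def coin_walk(n_steps, seed=None):
    """Simulate one realization of the coin-tossing chain:
    X_0 = 0 and X_n = X_{n-1} + Y_n, with Y_n = +1 for heads
    and -1 for tails, each with probability 1/2."""
    rng = random.Random(seed)
    x = 0
    path = [x]
    for _ in range(n_steps):
        y = 1 if rng.random() < 0.5 else -1
        x += y
        path.append(x)
    return path

path = coin_walk(10, seed=42)
print(path)  # one sample realization of X_0, ..., X_10
```

Each realization is a sequence of integers changing by exactly one unit per step, which makes the discrete state space and discrete parameter set explicit.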
The most important are the first order and the second order densities $f(x; t)$ and $f(x_1, x_2;\, t_1, t_2)$.
The mean value function (expected value) is defined as
$$\mu_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x\, f(x; t)\, dx. \qquad (5.1)$$
The (auto)correlation function is defined as
$$R_X(t_1, t_2) = E[X(t_1) X(t_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1 x_2\, f(x_1, x_2;\, t_1, t_2)\, dx_1\, dx_2. \qquad (5.2)$$
The (auto)covariance function is defined as
$$C_X(t_1, t_2) = E\big[(X(t_1) - \mu_X(t_1))(X(t_2) - \mu_X(t_2))\big] = R_X(t_1, t_2) - \mu_X(t_1)\, \mu_X(t_2). \qquad (5.3)$$
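The three functions above can be estimated from an ensemble of realizations by sample averages. The sketch below assumes, purely for illustration, the simple process $X(t) = A + t$ with $A \sim N(0,1)$, for which $\mu_X(t) = t$, $R_X(t_1, t_2) = 1 + t_1 t_2$, and $C_X(t_1, t_2) = 1$.

```python
import random

rng = random.Random(0)
N = 200_000          # number of realizations
t1, t2 = 0.5, 2.0

# Illustrative process: X(t) = A + t, A ~ N(0, 1).
samples = [rng.gauss(0, 1) for _ in range(N)]
x1 = [a + t1 for a in samples]
x2 = [a + t2 for a in samples]

mu1 = sum(x1) / N                            # estimate of mu(t1) = 0.5
mu2 = sum(x2) / N                            # estimate of mu(t2) = 2.0
R = sum(u * v for u, v in zip(x1, x2)) / N   # estimate of R = 1 + t1*t2 = 2.0
C = R - mu1 * mu2                            # estimate of C = 1.0
print(mu1, R, C)
```

The last line is the sample version of the identity $C_X = R_X - \mu_X(t_1)\mu_X(t_2)$ from (5.3).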
A process is strictly stationary if it has the same probability laws
$$f(x_1, \dots, x_n;\, t_1, \dots, t_n) = f(x_1, \dots, x_n;\, t_1 + \tau, \dots, t_n + \tau) \qquad (5.4)$$
for all finite sets $\{t_1, \dots, t_n\}$ and for any $\tau$.
The process is strictly stationary of order $k$ if (5.4) holds for $n \le k$. For $n = 1$ the first order density is independent of time,
$$f(x; t) = f(x). \qquad (5.5)$$
Thus the expected value is constant, $\mu_X(t) = \mu_X$.
The second order density can depend only on $\tau = t_2 - t_1$, thus the correlation and covariance functions are
$$R_X(t_1, t_2) = R_X(\tau), \qquad C_X(t_1, t_2) = C_X(\tau). \qquad (5.6)$$
The stochastic process is said to be weakly stationary (or stationary in the wide sense, or covariance stationary) if it has finite second moments, the mean value function is constant, $\mu_X(t) = \mu_X$, and the correlation function is a function of the distance only, $R_X(t_1, t_2) = R_X(t_2 - t_1)$.
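A standard example of a weakly stationary process (used here only as an illustration) is the random-phase cosine $X(t) = \cos(t + \Theta)$ with $\Theta$ uniform on $[0, 2\pi]$: its mean is $0$ for every $t$ and $R_X(t_1, t_2) = \tfrac{1}{2}\cos(t_2 - t_1)$ depends only on the lag. A Monte Carlo check:

```python
import math
import random

rng = random.Random(1)
N = 200_000
t1, t2 = 1.0, 1.7   # only the lag t2 - t1 = 0.7 should matter

# Random-phase cosine: X(t) = cos(t + Theta), Theta ~ Uniform(0, 2*pi).
thetas = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(N)]
mean_t1 = sum(math.cos(t1 + th) for th in thetas) / N
R_est = sum(math.cos(t1 + th) * math.cos(t2 + th) for th in thetas) / N

print(mean_t1)   # ~ 0 (constant mean)
print(R_est)     # ~ 0.5 * cos(0.7), a function of the lag only
```

Repeating the experiment with the pair $(t_1 + c,\, t_2 + c)$ for any shift $c$ yields the same correlation value, which is exactly the wide-sense stationarity property.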
The stochastic process $X(t)$ is said to have strictly stationary increments if the process $Y(t) = X(t + \tau) - X(t)$ is strictly stationary for every $\tau$.
There are a number of ways in which a sequence $\{X_n\}$ of random variables may converge to a limit $X$ as $n \to \infty$.
The sequence is said to converge to $X$ with probability 1 if
$$P\Big(\lim_{n \to \infty} X_n = X\Big) = 1, \qquad (5.7)$$
i.e. $X_n \to X$ for almost all realizations. (Convergence may fail only on an event with probability $0$.)
The sequence is said to converge to $X$ in probability if, for every $\varepsilon > 0$,
$$\lim_{n \to \infty} P\big(|X_n - X| > \varepsilon\big) = 0. \qquad (5.8)$$
The sequence is said to converge to $X$ in mean square if
$$\lim_{n \to \infty} E\big[|X_n - X|^2\big] = 0. \qquad (5.9)$$
We write $\operatorname{l.i.m.}_{n \to \infty} X_n = X$.
The Cauchy criterion
$$\lim_{m, n \to \infty} E\big[|X_n - X_m|^2\big] = 0 \qquad (5.10)$$
is a necessary and sufficient condition for mean square convergence.
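Mean square convergence (5.9) can be illustrated numerically. Below, $X_n$ is the sample mean of $n$ iid $N(0,1)$ variables, which converges in mean square to $X = 0$ with $E[|X_n|^2] = 1/n$ exactly; the estimates of this expectation shrink toward zero as $n$ grows. The helper name `ms_error` is illustrative.

```python
import random

rng = random.Random(2)

def ms_error(n, reps=20_000):
    """Estimate E[|X_n - 0|^2] where X_n is the mean of n iid
    N(0, 1) variables; the exact value is 1/n."""
    total = 0.0
    for _ in range(reps):
        xn = sum(rng.gauss(0, 1) for _ in range(n)) / n
        total += xn * xn
    return total / reps

for n in (1, 10, 100):
    print(n, ms_error(n))   # ~ 1.0, ~ 0.1, ~ 0.01
```

The same sequence also satisfies the Cauchy criterion (5.10), since $E[|X_n - X_m|^2] \le 2/n + 2/m \to 0$.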
The random function $X(t)$ is said to be continuous in mean square at $t$ if
$$\lim_{h \to 0} E\big[|X(t + h) - X(t)|^2\big] = 0. \qquad (5.11)$$
The correlation function of the random function $X(t)$ is $R(t_1, t_2) = E[X(t_1) X(t_2)]$. Let us introduce a random function $Y(t) = X(t + h) - X(t)$ and calculate the cross correlation function
$$R_{XY}(t_1, t_2) = E[Y(t_1) X(t_2)] = R(t_1 + h, t_2) - R(t_1, t_2),$$
and then the correlation function
$$R_Y(t_1, t_2) = R(t_1 + h, t_2 + h) - R(t_1 + h, t_2) - R(t_1, t_2 + h) + R(t_1, t_2).$$
Finally, upon substitution $t_1 = t_2 = t$,
$$E\big[|X(t + h) - X(t)|^2\big] = R(t + h, t + h) - 2 R(t, t + h) + R(t, t).$$
It follows that for $R(t_1, t_2)$ continuous at the point $(t, t)$
$$\lim_{h \to 0} E\big[|X(t + h) - X(t)|^2\big] = 0, \qquad (5.12)$$
and thus the random function is mean square continuous at $t$ if the correlation function is continuous at $(t, t)$.
Theorem. The random function $X(t)$ is mean square continuous at $t$ if, and only if, its correlation function $R(t_1, t_2)$ is continuous at the point $(t, t)$.
Let us consider a derivative of a stochastic process.
The problem is simple when we know how to calculate the realizations of the stationary process with a mean value $\mu$ and a covariance function $C(\tau)$. If we can calculate the derivatives of all realizations, then the problem of the derivative is the problem of derivatives of a family of deterministic functions.
In view of the Cauchy criterion, a sufficient condition for mean square differentiability of $X(t)$ is the convergence to zero, with $h, h' \to 0$, of
$$E\left[\left|\frac{X(t + h) - X(t)}{h} - \frac{X(t + h') - X(t)}{h'}\right|^2\right].$$
The final result is the following theorem.
Theorem. The random function $X(t)$ is mean square differentiable at $t$ if, and only if, the second mixed derivative $\partial^2 R(t_1, t_2) / \partial t_1 \partial t_2$ exists at $(t, t)$.
For stationary processes it is easier to prove similar theorems.
A stationary process is mean square continuous at $t$ if, and only if, its covariance function $C(\tau)$ is continuous at $\tau = 0$.
A stationary process is mean square differentiable at $t$ if, and only if, $C'(\tau)$ and $C''(\tau)$ exist at $\tau = 0$.
For example, let us consider the stationary random process with the following covariance function:
$$C(\tau) = \sigma^2 e^{-\alpha |\tau|}, \qquad \alpha > 0.$$
The covariance function is continuous at $\tau = 0$, thus the random process is mean square continuous. The covariance function is not differentiable at $\tau = 0$ (the right side and left side derivatives are different). The second derivative $C''(0)$ does not exist, so the random function is not mean square differentiable.
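The failure of differentiability at the origin can be seen numerically. The sketch below assumes an exponential covariance $C(\tau) = \sigma^2 e^{-\alpha|\tau|}$, the standard example of this behavior, and compares one-sided difference quotients at $\tau = 0$.

```python
import math

# Assumed covariance for illustration: C(tau) = sigma2 * exp(-alpha*|tau|).
sigma2, alpha = 1.0, 2.0

def C(tau):
    return sigma2 * math.exp(-alpha * abs(tau))

h = 1e-6
right = (C(h) - C(0)) / h    # -> -sigma2 * alpha (right-side derivative)
left = (C(0) - C(-h)) / h    # -> +sigma2 * alpha (left-side derivative)
print(right, left)           # the two one-sided derivatives disagree
```

Since the one-sided derivatives at $\tau = 0$ differ, $C'(0)$ does not exist, and a fortiori $C''(0)$ does not exist, confirming that the process is mean square continuous but not mean square differentiable.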
Let us consider a Riemann integral of a stochastic process.
We introduce a partition of the interval $[a, b]$: $a = t_0 < t_1 < \dots < t_n = b$. We denote $\Delta t_i = t_{i+1} - t_i$, choose points $t_i \le \tau_i \le t_{i+1}$, and set $\Delta = \max_i \Delta t_i$.
The random function $X(t)$ is mean square Riemann integrable over $[a, b]$ if the following limit, which then defines the integral, exists:
$$\int_a^b X(t)\, dt = \operatorname{l.i.m.}_{\Delta \to 0} \sum_{i=0}^{n-1} X(\tau_i)\, \Delta t_i. \qquad (5.13)$$
Theorem. The random function $X(t)$ is mean square Riemann integrable over $[a, b]$ if, and only if, the correlation function $R(t_1, t_2)$ is Riemann integrable over $[a, b] \times [a, b]$ (the ordinary double Riemann integral exists).
If a random function $X(t)$ is mean square integrable over $[a, t]$ for every $t \in [a, b]$, then
$$Y(t) = \int_a^t X(s)\, ds \qquad (5.14)$$
is a random function of $t$ defined on $[a, b]$.
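The integral (5.14) can be approximated path by path with the Riemann sums of (5.13). The sketch below assumes, for illustration only, the process $X(t) = A \cos t$ with random amplitude $A \sim N(0, 1)$: path-wise $Y(t) = A \sin t$, so $Y(\pi/2) = A$ and $\operatorname{Var}[Y(\pi/2)] = 1$.

```python
import math
import random

rng = random.Random(3)
N, n_grid = 20_000, 200
T = math.pi / 2
dt = T / n_grid

# Midpoint Riemann sum of cos(t) over [0, T] (the deterministic factor):
riemann = sum(math.cos((i + 0.5) * dt) * dt for i in range(n_grid))  # ~ sin(T) = 1

# Illustrative process X(t) = A*cos(t): each path integral is A * riemann.
ys = [rng.gauss(0, 1) * riemann for _ in range(N)]
mean_y = sum(ys) / N
var_y = sum(y * y for y in ys) / N - mean_y ** 2
print(var_y)   # ~ Var[A] = 1.0
```

Because each realization is an ordinary continuous function here, the mean square integral agrees with the path-wise Riemann integral, which is what the theorem above guarantees once $R(t_1, t_2)$ is integrable.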
Theorem. Let the random function $X(t)$ be mean square integrable over $[a, t]$ for every $t \in [a, b]$. Then $Y(t)$ is mean square continuous on $[a, b]$. If $X(t)$ is ms continuous on $[a, b]$, then $Y(t)$ is ms differentiable on $[a, b]$ and $Y'(t) = X(t)$.
The Fundamental Theorem of ms Calculus.
Let $X'(t)$ be ms Riemann integrable on $[a, b]$. Then
$$X(t) - X(a) = \int_a^t X'(s)\, ds, \qquad a \le t \le b, \qquad (5.15)$$
with probability one.
A random function $X(t)$ has independent increments if, for all finite sets $t_0 < t_1 < \dots < t_n$, the random variables (vectors)
$$X(t_1) - X(t_0),\; X(t_2) - X(t_1),\; \dots,\; X(t_n) - X(t_{n-1})$$
are independent.
The process has stationary independent increments if, in addition, $X(t + \tau) - X(s + \tau)$ has the same distribution as $X(t) - X(s)$ for every $s < t$ and every $\tau > 0$.
A random function $\{W(t),\, t \ge 0\}$ is a Brownian motion (Wiener or Wiener-Lévy) process if
(i) $\{W(t),\, t \ge 0\}$ has stationary independent increments;
(ii) for every $t > 0$, $W(t)$ is normally distributed;
(iii) for every $t > 0$, $E[W(t)] = 0$;
(iv) $W(0) = 0$.
It follows from (ii) that the increment $W(t) - W(s)$ is also normally distributed for every $s < t$. It remains to specify the distribution of the increments $W(t) - W(s)$ for all $s < t$. The mean, in view of (iii), is zero. From the definition it follows, for the variance $V(t) = E[W^2(t)]$, that
$$V(t_1 + t_2) = V(t_1) + V(t_2),$$
since $W(t_1 + t_2) = W(t_1) + [W(t_1 + t_2) - W(t_1)]$ is a sum of independent terms. The function $V(t)$ does not decrease when $t$ increases. Thus the equation
has a solution
$$V(t) = E[W^2(t)] = \sigma^2 t. \qquad (5.16)$$
It is the only nondecreasing solution.
Finally, the mean value of the increment $W(t) - W(s)$ is zero and its variance is $\sigma^2 (t - s)$.
Let us calculate the correlation (the covariance is the same) function. For $t_1 \le t_2$,
$$R(t_1, t_2) = E[W(t_1) W(t_2)] = E\big[W(t_1)\,(W(t_2) - W(t_1))\big] + E[W^2(t_1)] = 0 + \sigma^2 t_1.$$
Therefore, in general, for the Brownian motion process the correlation function is
$$R(t_1, t_2) = \sigma^2 \min(t_1, t_2). \qquad (5.17)$$
The correlation function is continuous for every $t_1, t_2$, thus the process is mean square continuous on $[0, \infty)$. The second mixed derivative $\partial^2 R / \partial t_1 \partial t_2$ exists at no point of the diagonal $t_1 = t_2$, thus the process is ms differentiable nowhere. It is ms Riemann integrable on every finite interval.
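The form (5.17) of the correlation function can be checked by Monte Carlo simulation, using the stationary independent increments to sample $W(t_1)$ and $W(t_2)$ jointly:

```python
import random

rng = random.Random(4)
N = 100_000
sigma = 1.0
t1, t2 = 1.0, 2.0   # take t1 < t2

# W(t1) ~ N(0, sigma^2*t1); W(t2) = W(t1) + independent increment
# with variance sigma^2*(t2 - t1).
total = 0.0
for _ in range(N):
    w1 = rng.gauss(0.0, sigma * t1 ** 0.5)
    w2 = w1 + rng.gauss(0.0, sigma * (t2 - t1) ** 0.5)
    total += w1 * w2

R_est = total / N
print(R_est)   # ~ sigma^2 * min(t1, t2) = 1.0
```

The estimate reproduces $\sigma^2 \min(t_1, t_2)$: the later value $W(t_2)$ contributes nothing beyond $W(t_1)$ because the increment past $t_1$ is independent with mean zero.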
To compute the values $W_n = W(n\, \Delta t)$ we have to choose the time increment $\Delta t$ and to write the covariance function in terms of the increment:
$$E\big[(W_{n+1} - W_n)^2\big] = \sigma^2\, \Delta t. \qquad (5.18)$$
This leads to the following form of the difference equation:
$$W_{n+1} = W_n + \sigma \sqrt{\Delta t}\; \xi_n, \qquad (5.19)$$
where $\xi_n$ is a term of the Gaussian white noise sequence with mean value zero and unit variance. The expressions (5.18) and (5.19) reduce the problem in the continuum to the solution of a difference equation as described in chapter 4 (compare with the relations (4.6) and (4.5)).
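The difference scheme (5.19) is straightforward to implement; the sketch below simulates many paths up to $T = n\,\Delta t = 1$ and checks that the terminal variance matches $\sigma^2 T$:

```python
import random

rng = random.Random(5)
sigma, dt, n_steps = 1.0, 0.01, 100   # simulate up to T = n_steps*dt = 1
N = 20_000                            # number of simulated paths

# Scheme (5.19): W_{n+1} = W_n + sigma*sqrt(dt)*xi_n, xi_n iid N(0, 1).
step = sigma * dt ** 0.5
finals = []
for _ in range(N):
    w = 0.0
    for _ in range(n_steps):
        w += step * rng.gauss(0.0, 1.0)
    finals.append(w)

var_T = sum(w * w for w in finals) / N
print(var_T)   # ~ sigma^2 * T = 1.0
```

Note that halving $\Delta t$ (and doubling the number of steps) leaves the terminal variance unchanged, in agreement with the invariance of $E[(W_{n+1} - W_n)^2]/\Delta t$ discussed below.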
It should be noted that $E\big[(W_{n+1} - W_n)^2\big] / \Delta t = \sigma^2$, and this expression is an invariant with respect to the choice of the increment $\Delta t$; the finite-difference relation for the derivative, $(W_{n+1} - W_n)/\Delta t$, is not, since its variance $\sigma^2 / \Delta t$ grows without bound as $\Delta t \to 0$.