Let us consider the following set of differential equations with constant coefficients β₁, β₂, β₃ > 0 and the variance parameter σ².
(8.1)   dX₁(t) = −β₁ X₁(t) dt + σ dW(t)
        dX₂(t)/dt = −β₂ X₂(t) + X₁(t)
        dX₃(t)/dt = −β₃ X₃(t) + X₂(t)
The first equation is a stochastic Itô differential equation; its general solution may easily be written, and the solution X₁ is a continuous, Riemann-integrable random function. The second equation is a stochastic differential equation whose solution X₂ is a once differentiable, Riemann-integrable random function. Finally the function X₃ is twice differentiable.
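The cascade above can be illustrated with a short simulation. The following is a minimal NumPy sketch (the document's companion scripts are in Scilab/Octave); the coefficients β₁ = β₂ = β₃ = 1 and σ = 1 are illustrative assumptions, and the Euler–Maruyama scheme is used for the Itô equation:

```python
import numpy as np

# Illustrative Euler-Maruyama simulation of the cascade: x1 is an
# Ornstein-Uhlenbeck (Ito) process; x2 and x3 are obtained by further
# first-order filtering, gaining one order of differentiability each.
# The coefficients beta1 = beta2 = beta3 = 1 and sigma = 1 are assumptions.
rng = np.random.default_rng(0)
beta1 = beta2 = beta3 = 1.0
sigma = 1.0
dt, n = 5e-3, 100_000

x1 = x2 = x3 = 0.0
xs = np.empty((n, 3))
for k in range(n):
    dw = rng.normal(0.0, np.sqrt(dt))          # Wiener increment
    x1 += -beta1 * x1 * dt + sigma * dw        # continuous, not differentiable
    x2 += (-beta2 * x2 + x1) * dt              # once differentiable
    x3 += (-beta3 * x3 + x2) * dt              # twice differentiable
    xs[k] = (x1, x2, x3)

# The sample variance of x1 should be close to sigma^2 / (2 beta1) = 0.5
print(xs[n // 2:, 0].var())
```

The second half of the trajectory is used for the variance estimate so that the deterministic transient from the zero initial condition has largely died out.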
It is convenient to write the relations in matrix notation

(8.2)   dX(t) = B X(t) dt + S dW(t)

where X(t) and S are column matrices

        X(t) = (X₁(t), X₂(t), X₃(t))ᵀ,    S = (σ, 0, 0)ᵀ,

and B is a lower triangular matrix

        B = [ −β₁    0     0
               1    −β₂    0
               0     1    −β₃ ]
The fundamental solution Φ(t), satisfying the corresponding homogeneous equation and initial conditions

(8.3)   dΦ(t)/dt = B Φ(t),    Φ(0) = I,

is expressed by the matrix exponential

        Φ(t) = exp(Bt).
The general solution is

(8.4)   X(t) = Φ(t) X(0) + ∫₀ᵗ Φ(t−s) S dW(s)
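The fundamental solution can be evaluated numerically as a matrix exponential. The following is a minimal sketch (scaling-and-squaring with a truncated Taylor series, not a production-grade `expm`); the coefficients β = (1, 2, 3) are illustrative:

```python
import numpy as np

def phi(B, t, terms=25):
    """Fundamental solution Phi(t) = exp(B t), computed by scaling-and-
    squaring with a truncated Taylor series (a minimal sketch)."""
    M = B * t
    norm = np.linalg.norm(M, np.inf)
    s = max(0, int(np.ceil(np.log2(norm)))) if norm > 1 else 0
    M = M / 2.0**s                      # scale so the series converges fast
    E = np.eye(B.shape[0])
    term = np.eye(B.shape[0])
    for k in range(1, terms):
        term = term @ M / k             # next Taylor term M^k / k!
        E = E + term
    for _ in range(s):                  # undo the scaling by squaring
        E = E @ E
    return E

# Illustrative lower triangular drift matrix with beta = (1, 2, 3)
B = np.array([[-1., 0., 0.], [1., -2., 0.], [0., 1., -3.]])
print(np.allclose(phi(B, 0.0), np.eye(3)))                   # Phi(0) = I
print(np.allclose(phi(B, 0.7) @ phi(B, 0.5), phi(B, 1.2)))   # semigroup law
```

The two checks verify the defining properties of a fundamental solution: the initial condition Φ(0) = I and the semigroup property Φ(t)Φ(s) = Φ(t+s).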
The mean value vector m(t) = E[X(t)] satisfies the following equation

(8.5)   dm(t)/dt = B m(t)
Let us denote the variance matrix of X(t) as

(8.6)   V(t) = E[(X(t) − m(t)) (X(t) − m(t))ᵀ]
From the differential equation in matrix notation it follows, by the Itô product rule,

        d(X Xᵀ) = (dX) Xᵀ + X (dX)ᵀ + (dX)(dX)ᵀ,    (dX)(dX)ᵀ = S Sᵀ dt,

and thus, taking expectations,

        dE[X Xᵀ]/dt = B E[X Xᵀ] + E[X Xᵀ] Bᵀ + S Sᵀ.

Finally the evolution of V(t) is described by the following differential equation

(8.7)   dV(t)/dt = B V(t) + V(t) Bᵀ + Q,    Q = S Sᵀ
(The third term Q on the right-hand side is due to the Itô integral; it gives a contribution in the first equation only.)
If the asymptotic covariance matrix V∞ = lim_{t→∞} V(t) exists, it is a solution of the following algebraic equation.

(8.8)   B V∞ + V∞ Bᵀ + Q = 0
The matrix Q has the following structure

(8.9)   Q = S Sᵀ = [ σ²   0   0
                      0   0   0
                      0   0   0 ]
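The algebraic equation for the asymptotic variance is a continuous Lyapunov equation, and a small system like this can be solved directly by vectorization. A sketch with illustrative coefficients β = (1, 2, 3) and σ = 1:

```python
import numpy as np

# Solve B V + V B' + Q = 0 by vectorization: using the identity
# vec(B V + V B') = (kron(I, B) + kron(B, I)) vec(V), the asymptotic
# variance follows from a single linear solve.  Coefficients are
# illustrative.
B = np.array([[-1.,  0.,  0.],
              [ 1., -2.,  0.],
              [ 0.,  1., -3.]])
Q = np.zeros((3, 3))
Q[0, 0] = 1.0                      # sigma^2 = 1 in the (1,1) element

I = np.eye(3)
K = np.kron(I, B) + np.kron(B, I)
Vinf = np.linalg.solve(K, -Q.ravel()).reshape(3, 3)

print(Vinf[0, 0])                  # sigma^2 / (2 beta_1) = 0.5
```

For a stable B (all βᵢ > 0) the Kronecker matrix K is nonsingular, so the solution exists and is unique.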
The solution of the differential equation (8.7) for the evolution of the variance is

(8.10)  V(t) = Φ(t) V(0) Φᵀ(t) + ∫₀ᵗ Φ(t−s) Q Φᵀ(t−s) ds
The final expression for the covariance matrix is

(8.11)  C(t+τ, t) = E[(X(t+τ) − m(t+τ)) (X(t) − m(t))ᵀ] = Φ(τ) V(t),    τ ≥ 0,

since the stochastic integral over (t, t+τ) is independent of X(t).
For example, in the stationary case the expressions reduce to

        R(τ) = Φ(τ) V∞   for τ ≥ 0,    R(τ) = V∞ Φᵀ(−τ)   for τ < 0.
The spectral density function of a process stationary in the wide sense is the Fourier transform of the autocorrelation function

(8.12)  S(ω) = (1/2π) ∫₋∞^{+∞} R(τ) e^{−iωτ} dτ

where ω is the angular frequency.
The inverse transformation is

(8.13)  R(τ) = ∫₋∞^{+∞} S(ω) e^{iωτ} dω
For example the spectral densities of the non-differentiable, once and twice differentiable processes X₁, X₂, X₃ are

        S₁(ω) = σ² / (2π (β₁² + ω²)),
        S₂(ω) = σ² / (2π (β₁² + ω²)(β₂² + ω²)),
        S₃(ω) = σ² / (2π (β₁² + ω²)(β₂² + ω²)(β₃² + ω²)).
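The transform pair can be checked numerically for the first, Ornstein–Uhlenbeck-type component, whose autocorrelation R(τ) = σ²/(2β₁) e^{−β₁|τ|} is standard. A sketch with illustrative β₁ = σ = 1:

```python
import numpy as np

beta, sigma, tau = 1.0, 1.0, 0.7

# Spectral density of the non-differentiable (OU-type) component
omega = np.linspace(-200.0, 200.0, 400_001)
S1 = sigma**2 / (2.0 * np.pi * (beta**2 + omega**2))

# Inverse transform (8.13) evaluated by a Riemann sum; the imaginary
# part vanishes by symmetry, so only the cosine term is kept.
d_omega = omega[1] - omega[0]
R_num = (S1 * np.cos(omega * tau)).sum() * d_omega

R_exact = sigma**2 / (2.0 * beta) * np.exp(-beta * abs(tau))
print(R_num, R_exact)
```

The numerical inverse transform agrees with the exponential autocorrelation up to the truncation of the frequency range.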
Let us discuss how to calculate realizations of the above considered process in a recursive formulation. For the considered stationary process we divide the time into equal intervals Δ, and we want to express the column matrix X(t+Δ) in terms of the values of X(t). The process is stationary and thus we may consider one interval from t to t+Δ. It is easy to verify that in our case the general solution
(8.14)  X(t) = Φ(t − t₀) X(t₀) + ∫_{t₀}^{t} Φ(t − s) S dW(s)

for one step from t to t+Δ may be written in the following form

(8.15)  X(t+Δ) = Φ(Δ) X(t) + ∫_{t}^{t+Δ} Φ(t + Δ − s) S dW(s)
The expected value of a stationary process is constant and thus without loss of generality it may be assumed equal to zero. For a stationary process the variance matrices at t and t+Δ must have the same values. Because the stochastic integral in (8.15) is independent of X(t), it follows that V∞ = Φ(Δ) V∞ Φᵀ(Δ) + V_Δ, where V_Δ is the variance matrix of the integral term, and hence

(8.16)  V_Δ = V∞ − Φ(Δ) V∞ Φᵀ(Δ)
All the terms correspond to symmetric matrices; thus V_Δ may be represented by L Lᵀ, where L is a lower triangular matrix.

(8.17)  V_Δ = L Lᵀ
The matrix L may be calculated by the Cholesky procedure or directly from the following representation (given for a 3×3 matrix A):

        [ l₁₁   0    0  ] [ l₁₁  l₂₁  l₃₁ ]   [ a₁₁  a₁₂  a₁₃ ]
        [ l₂₁  l₂₂   0  ] [  0   l₂₂  l₃₂ ] = [ a₁₂  a₂₂  a₂₃ ]
        [ l₃₁  l₃₂  l₃₃ ] [  0    0   l₃₃ ]   [ a₁₃  a₂₃  a₃₃ ]
The matrix on the right-hand side is symmetric and thus has 6 different elements; the matrices on the left likewise contain 6 unknown elements. We form the product of the two matrices on the left. Multiplying the first row by the first column gives the value of l₁₁; multiplication by the second and third columns leads to the values of l₂₁ and l₃₁. Multiplying the second row by the second column gives the value of l₂₂, and multiplication by the third column yields the value of l₃₂. Finally, multiplying the third row by the third column leads to the last unknown value l₃₃. It may happen that one of the diagonal elements lᵢᵢ is zero. This means the matrix is singular; it still corresponds to a covariance matrix, which need only be positive semidefinite, and in such a case we have to reduce the number of elements in the column matrix of the white-noise sequence.
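The elimination just described is the classical Cholesky recursion. A small sketch for a 3×3 matrix; the test matrix A is an arbitrary positive definite example, not taken from the text:

```python
import numpy as np
from math import sqrt

def chol3(A):
    """Explicit Cholesky factorization of a symmetric positive definite
    3x3 matrix, following the row-by-column elimination described above."""
    L = np.zeros((3, 3))
    L[0, 0] = sqrt(A[0, 0])                            # 1st row x 1st column
    L[1, 0] = A[0, 1] / L[0, 0]                        # ... x 2nd column
    L[2, 0] = A[0, 2] / L[0, 0]                        # ... x 3rd column
    L[1, 1] = sqrt(A[1, 1] - L[1, 0]**2)               # 2nd row x 2nd column
    L[2, 1] = (A[1, 2] - L[2, 0] * L[1, 0]) / L[1, 1]  # ... x 3rd column
    L[2, 2] = sqrt(A[2, 2] - L[2, 0]**2 - L[2, 1]**2)  # 3rd row x 3rd column
    return L

A = np.array([[4., 2., 2.],
              [2., 5., 3.],
              [2., 3., 6.]])
L = chol3(A)
print(np.allclose(L @ L.T, A))   # True
```

Each line solves for exactly one unknown using previously computed entries, which is why the procedure can be written without any iteration for a fixed small dimension.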
Finally the values of the realizations in one step may be calculated from the following recursive equation

(8.18)  Xₖ₊₁ = Φ(Δ) Xₖ + L ξₖ

where ξₖ is a column matrix with independent Gaussian random numbers (zero mean, unit variance) in its rows.
To obtain a stationary series the initial conditions X₁(0), X₂(0), X₃(0) must correspond to jointly normally distributed random numbers with mean values equal to zero and covariance matrix equal to the asymptotic variance matrix V∞. To compute the initial conditions it is convenient to represent the asymptotic variance matrix by the product of a lower triangular matrix L₀ by its transpose L₀ᵀ.

(8.19)  V∞ = L₀ L₀ᵀ
For example for a twice differentiable process the initial state may be generated as

        X(0) = L₀ ξ

where ξ is a column matrix with independent Gaussian random numbers in three rows.
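The whole recursive scheme (asymptotic variance, one-step transition matrix, Cholesky factors, stationary initialization) can be sketched in NumPy. Equal illustrative coefficients βᵢ = 1 are assumed so that Φ(t) = e^{Bt} has the closed form e^{−t}(I + Nt + N²t²/2), where N is the nilpotent subdiagonal part of B and commutes with −I:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, delta, nsteps = 1.0, 0.5, 100_000

I = np.eye(3)
N = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])   # nilpotent part
B = -I + N                                                  # beta_i = 1

def Phi(t):
    # exp(Bt) = exp(-t) exp(Nt) because -I and N commute; N^3 = 0
    return np.exp(-t) * (I + N * t + N @ N * (t**2 / 2.0))

# Asymptotic variance from the algebraic equation B V + V B' + Q = 0
Q = np.zeros((3, 3)); Q[0, 0] = sigma**2
K = np.kron(I, B) + np.kron(B, I)
Vinf = np.linalg.solve(K, -Q.ravel()).reshape(3, 3)
Vinf = 0.5 * (Vinf + Vinf.T)                # symmetrize roundoff

F = Phi(delta)
Vd = Vinf - F @ Vinf @ F.T                  # one-step noise covariance (8.16)
L = np.linalg.cholesky(Vd)                  # factor of (8.17)
L0 = np.linalg.cholesky(Vinf)               # factor of (8.19)

x = L0 @ rng.normal(size=3)                 # stationary initial condition
samples = np.empty((nsteps, 3))
for k in range(nsteps):
    x = F @ x + L @ rng.normal(size=3)      # recursion (8.18)
    samples[k] = x

# The sample variance of the first component should stay near Vinf[0,0] = 0.5
print(samples[:, 0].var(), Vinf[0, 0])
```

Because the one-step transition is exact and the initial state is drawn with the asymptotic covariance, the generated series is stationary from the first sample, with no burn-in transient.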
Example 1
The script file pwsemb04 calculates examples of once and
twice differentiable processes with no dominant frequency.
Download
Scilab: pwsemb04.sci
Octave/Matlab: pwsemb04.m
Example 2
The script file pwsemd04 depicts the correlation functions and
the spectral densities of the non-differentiable, once, and twice
differentiable processes.
Download
Scilab: pwsemd04.sci
Octave/Matlab: pwsemd04.m