The probability space is denoted by $\Omega$ and its elements $\omega$ are samples or experimental outcomes. Certain subsets (collections of outcomes) are called events.
Standard set-theory notation is used. If $A$ and $B$ are two sets, then $A \cup B$ is their union, $A \cap B$ is their intersection, $\bar{A}$ is the complement of $A$ with respect to $\Omega$, and the empty set is denoted by $\emptyset$. If $A \cap B = \emptyset$, that is, the sets are disjoint, then the events are mutually exclusive.
For a class of events $\mathcal{F}$ we assign probabilities to events via a probability function $P$. That is, to each event $A \in \mathcal{F}$ we assign a number $P(A)$, called the probability of $A$. The probability function satisfies the following axioms:
(i) $P(A) \ge 0$
(ii) $P(\Omega) = 1$
(iii) if $A \cap B = \emptyset$, then $P(A \cup B) = P(A) + P(B)$
(iv) if $A_i \cap A_j = \emptyset$ for all $i \ne j$, then $P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)$
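For a finite sample space the axioms can be checked directly. The following is a minimal sketch, not from the text; the die sample space and the map `p` are illustrative assumptions.

```python
from fractions import Fraction

# Illustrative example: a probability function on a finite sample space,
# given as a map from each outcome to its probability (fair die).
omega = {1, 2, 3, 4, 5, 6}
p = {w: Fraction(1, 6) for w in omega}

def prob(event):
    """P(A) as the sum of outcome probabilities (finite additivity)."""
    return sum(p[w] for w in event)

# Axiom (i): probabilities are non-negative.
assert all(prob({w}) >= 0 for w in omega)
# Axiom (ii): P(Omega) = 1.
assert prob(omega) == 1
# Axiom (iii): additivity for disjoint events, e.g. odd and even outcomes.
odd, even = {1, 3, 5}, {2, 4, 6}
assert odd & even == set()
assert prob(odd | even) == prob(odd) + prob(even)
```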
The class of events has to be defined. In defining the class of events we want the set operations (unions, intersections, complements) to yield sets that are also events. A class of sets having these properties is called a Borel field. A class of sets $\mathcal{F}$ is called a Borel field if
(i) $\Omega \in \mathcal{F}$
(ii) if $A \in \mathcal{F}$, then $\bar{A} \in \mathcal{F}$
(iii) if $A, B \in \mathcal{F}$, then $A \cup B \in \mathcal{F}$ and $A \cap B \in \mathcal{F}$
(iv) if $A_1, A_2, \ldots \in \mathcal{F}$, then $\bigcup_{i=1}^{\infty} A_i \in \mathcal{F}$ and $\bigcap_{i=1}^{\infty} A_i \in \mathcal{F}$
The triplet $(\Omega, \mathcal{F}, P)$ is called an experiment.
Example: $\Omega = \{\omega_1, \omega_2\}$, Borel field $\mathcal{F} = \{\emptyset, \{\omega_1\}, \{\omega_2\}, \Omega\}$.
Example, rolling a die once: $\Omega = \{1, 2, 3, 4, 5, 6\}$, Borel field $\mathcal{F} = \{\emptyset, \{1, 3, 5\}, \{2, 4, 6\}, \Omega\}$. But $\{\emptyset, \{1, 3, 5\}, \Omega\}$ is not a Borel field, because the complement $\overline{\{1, 3, 5\}} = \{2, 4, 6\}$ does not belong to the class.
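The closure requirements (i)–(iv) can be tested mechanically for a finite $\Omega$. A small sketch (the function name and the two example classes are illustrative, not from the text):

```python
# Check whether a class of subsets of a finite Omega contains Omega and is
# closed under complement, union and intersection, i.e. whether it is a
# Borel field in the sense of conditions (i)-(iii) above.
def is_borel_field(omega, classes):
    sets = {frozenset(a) for a in classes}
    if frozenset(omega) not in sets:
        return False                        # (i) Omega must belong
    for a in sets:
        if frozenset(omega - a) not in sets:
            return False                    # (ii) closed under complement
        for b in sets:
            if (a | b) not in sets or (a & b) not in sets:
                return False                # (iii) closed under union/intersection
    return True

omega = {1, 2, 3, 4, 5, 6}
good = [set(), {1, 3, 5}, {2, 4, 6}, omega]   # odd/even field
bad = [set(), {1, 3, 5}, omega]               # complement {2,4,6} is missing
assert is_borel_field(omega, good)
assert not is_borel_field(omega, bad)
```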
A real finite-valued function $X(\omega)$ defined on $\Omega$ is called a (real) random variable if, for every real number $x$, the inequality $X(\omega) \le x$ defines an $\omega$-set whose probability is defined. A random variable is a Borel measurable function.
For a random variable $X$ the function

$F_X(x) = P(X \le x)$  (3.1)
is defined for all real $x$ and is called the distribution function of the random variable $X$. A random variable is called discrete if there exists a mass function $p(x_i) = P(X = x_i)$ such that

$F_X(x) = \sum_{x_i \le x} p(x_i)$  (3.2)
A random variable is called continuous if there exists a density function $f_X(x)$ such that

$F_X(x) = \int_{-\infty}^{x} f_X(u)\,du$  (3.3)
If the number of points at which $F_X$ is not differentiable is countable, then

$f_X(x) = \dfrac{dF_X(x)}{dx}$  (3.4)

at all $x$ at which the derivative exists.
The expectation, average, mean or first moment of a continuous random variable $X$ is defined by

$E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$  (3.5)
The $n$th moment of $X$ is defined by

$E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$  (3.6)
The second moment is called the mean square value.
The $n$th central moment of $X$ is defined by

$E\left[(X - E[X])^n\right] = \int_{-\infty}^{\infty} (x - E[X])^n f_X(x)\,dx$  (3.7)
The second central moment is called the variance of $X$ and is denoted by $\sigma^2$.
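The integrals (3.5)–(3.7) can be evaluated numerically for any given density. A sketch using a simple midpoint rule; the triangular density $f(x) = x/2$ on $[0, 2]$ is an illustrative assumption, not from the text:

```python
# Illustrative density: f(x) = x/2 on [0, 2], which integrates to 1.
def f(x):
    return x / 2.0 if 0.0 <= x <= 2.0 else 0.0

def integrate(g, a=0.0, b=2.0, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

mean = integrate(lambda x: x * f(x))                   # first moment (3.5)
second = integrate(lambda x: x**2 * f(x))              # second moment (3.6)
variance = integrate(lambda x: (x - mean)**2 * f(x))   # second central moment (3.7)

assert abs(mean - 4.0 / 3.0) < 1e-6       # exact value: 4/3
assert abs(second - 2.0) < 1e-6           # exact value: 2
assert abs(variance - 2.0 / 9.0) < 1e-6   # exact value: 2 - (4/3)^2 = 2/9
```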
Example: rolling a die. We can define a probability space $\Omega$, a random variable $X(\omega)$, and probabilities $P$ as, for example, given in the table

$\omega$ | 1 | 2 | 3 | 4 | 5 | 6
$X(\omega)$ | -30 | -20 | -10 | 10 | 10 | 30
$P(\omega)$ | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6
Let us introduce subsets (events corresponding to odd or even numbers) $A_1 = \{1, 3, 5\}$ and $A_2 = \{2, 4, 6\}$. The corresponding Borel field is $\{\emptyset, A_1, A_2, \Omega\}$ and the corresponding probabilities are the elements of the row matrix $[0, 1/2, 1/2, 1]$. This is also the case of tossing a coin, with the two events heads and tails.
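The moments of the die example can be computed directly from the table above; a short sketch using exact rational arithmetic:

```python
from fractions import Fraction

# Values of X(omega) and their probabilities, taken from the table above.
values = [-30, -20, -10, 10, 10, 30]
probs = [Fraction(1, 6)] * 6

mean = sum(x * p for x, p in zip(values, probs))            # first moment (3.5)
second = sum(x**2 * p for x, p in zip(values, probs))       # mean square value (3.6)
variance = sum((x - mean)**2 * p for x, p in zip(values, probs))  # (3.7)

# Standard identity: Var(X) = E[X^2] - (E[X])^2.
assert variance == second - mean**2
```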
Example: choice of a random phase from the interval $[0, 2\pi]$ in the continuum, thus $\Omega = [0, 2\pi]$. Let us define the random variable $X(\omega) = \omega$, and the probability function is $P(\{\omega : \omega \le x\}) = x/2\pi$ for $0 \le x \le 2\pi$.
The corresponding distribution function is:
if $x < 0$ then $F_X(x) = 0$,
if $0 \le x \le 2\pi$ then $F_X(x) = x/2\pi$,
if $x > 2\pi$ then $F_X(x) = 1$.
The standard uniform density function $f_X(x) = 1/2\pi$ on $[0, 2\pi]$ (and zero elsewhere) results by differentiating the distribution function with respect to $x$.
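The random-phase example can be checked empirically: the fraction of uniform samples below $x$ should approach $F_X(x) = x/2\pi$. A minimal sketch (sample size and test points are illustrative choices):

```python
import math
import random

random.seed(0)
n = 100_000
# Uniform phase samples on [0, 2*pi].
samples = [random.uniform(0.0, 2.0 * math.pi) for _ in range(n)]

def empirical_cdf(x):
    """Fraction of samples not exceeding x."""
    return sum(s <= x for s in samples) / n

# The empirical distribution should be close to F(x) = x / (2*pi).
for x in (1.0, math.pi, 5.0):
    assert abs(empirical_cdf(x) - x / (2.0 * math.pi)) < 0.01
```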
A random variable is Gaussian or normally distributed if its density function is given by

$f_X(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\dfrac{(x - m)^2}{2\sigma^2}\right)$  (3.8)

where $-\infty < m < \infty$ and $\sigma > 0$ ($m$ is the mean and $\sigma^2$ the variance).
The normal distribution function is

$F_X(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{x} \exp\left(-\dfrac{(u - m)^2}{2\sigma^2}\right) du$  (3.9)
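The integral (3.9) has no closed form, but it is a standard fact that it can be expressed through the error function as $F_X(x) = \frac{1}{2}\left(1 + \operatorname{erf}\left(\frac{x - m}{\sigma\sqrt{2}}\right)\right)$. A brief sketch:

```python
import math

def normal_cdf(x, m=0.0, sigma=1.0):
    """Normal distribution function (3.9) via the error function."""
    return 0.5 * (1.0 + math.erf((x - m) / (sigma * math.sqrt(2.0))))

# The density is symmetric about the mean, so F(m) = 1/2.
assert abs(normal_cdf(0.0) - 0.5) < 1e-12
# Standard tabulated value for the unit normal at x = 1.
assert abs(normal_cdf(1.0) - 0.8413447460685429) < 1e-9
```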
It is convenient to specify a random variable by its characteristic function, defined as the Fourier transform of the density function:

$\phi_X(u) = E\left[e^{iuX}\right] = \int_{-\infty}^{\infty} e^{iux} f_X(x)\,dx$  (3.10)
The $n$th moment of the random variable is

$E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$  (3.11)
It is easy to verify the following relation:

$E[X^n] = \dfrac{1}{i^n} \left. \dfrac{d^n \phi_X(u)}{du^n} \right|_{u=0}$  (3.12)
Thus, once the characteristic function has been calculated, the whole set of $n$th moments is easily obtained by differentiation.
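Relation (3.12) can be illustrated numerically. The example below is an assumption for illustration, not from the text: for $X = \pm 1$ with probability $1/2$ each, the characteristic function is $\phi(u) = \cos u$, so $E[X^2] = -\phi''(0) = 1$.

```python
import math

# Characteristic function of X = +/-1 with probability 1/2 each:
# phi(u) = (e^{iu} + e^{-iu}) / 2 = cos(u).
phi = math.cos

# Approximate phi''(0) by a central finite difference.
h = 1e-4
second_derivative = (phi(h) - 2.0 * phi(0.0) + phi(-h)) / h**2

# Relation (3.12) for n = 2: E[X^2] = -phi''(0), which equals 1 here.
assert abs(-second_derivative - 1.0) < 1e-6
```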
The characteristic function for a Gaussian density with expectation $m = 0$ is

$\phi_X(u) = \dfrac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} e^{iux} \exp\left(-\dfrac{x^2}{2\sigma^2}\right) dx$  (3.13)
(The assumption that $m = 0$ is not a serious loss of generality, because from all samples we can subtract the value of $m$ to obtain a random variable with mean equal to zero.)
Completing the square in the exponent, $iux - \dfrac{x^2}{2\sigma^2} = -\dfrac{(x - iu\sigma^2)^2}{2\sigma^2} - \dfrac{\sigma^2 u^2}{2}$, and introducing the change of variables $z = \dfrac{x - iu\sigma^2}{\sqrt{2}\,\sigma}$, $dz = \dfrac{dx}{\sqrt{2}\,\sigma}$, in the integral, the integration finally yields the following expression:

$\phi_X(u) = \exp\left(-\dfrac{\sigma^2 u^2}{2}\right)$  (3.14)
Differentiation and substitution into the expressions for the $n$th central moments lead to the conclusion that for odd values of $n$ the central moments are zero, and it is easy to establish a relation between the expressions for consecutive even values of $n$. The final result is that for a normal distribution the $n$th central moment ($n$ even) is equal to

$E[X^n] = 1 \cdot 3 \cdot 5 \cdots (n-1)\,\sigma^n$  (3.15)
The even central moments grow without bound as $n$ tends to infinity.
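A quick Monte Carlo check of (3.15), under the illustrative assumption of a zero-mean Gaussian with $\sigma = 1$ (sample size and tolerances are arbitrary choices):

```python
import random

random.seed(1)
sigma = 1.0
samples = [random.gauss(0.0, sigma) for _ in range(400_000)]

def empirical_moment(n):
    """Sample estimate of the n-th central moment (mean is zero here)."""
    return sum(s**n for s in samples) / len(samples)

# Formula (3.15): even central moments are 1*3*5*...*(n-1) * sigma^n.
assert abs(empirical_moment(2) - 1.0) < 0.05   # (2-1)!! = 1
assert abs(empirical_moment(4) - 3.0) < 0.15   # (4-1)!! = 1*3 = 3
# Odd central moments vanish by symmetry.
assert abs(empirical_moment(3)) < 0.05
```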
Let us consider two random variables $X$ and $Y$. The two sets $\{X \le x\}$ and $\{Y \le y\}$ are events with probabilities

$F_X(x) = P(X \le x), \qquad F_Y(y) = P(Y \le y)$  (3.16)
where $F_X$ and $F_Y$ are the distribution functions of the random variables $X$ and $Y$. The intersection of these two sets,

$\{X \le x\} \cap \{Y \le y\} = \{X \le x,\; Y \le y\}$  (3.17)
is an event. The probability of this event is the joint distribution function of the jointly distributed random variables $X$ and $Y$:

$F_{XY}(x, y) = P(X \le x,\; Y \le y)$  (3.18)
Example: temperature measured at 6 at night and at 12 (on the same day). It is possible to consider the random variables $X_1$ and $X_2$ separately, or to look at the pair $(X_1, X_2)$ jointly. To estimate the joint probability one has to consider a two-dimensional problem related to the $(x_1, x_2)$ plane.
In general, $n$ continuous random variables $X_1, \ldots, X_n$ defined on the same probability space are said to be jointly distributed. They may be characterized by their joint distribution function

$F(x_1, \ldots, x_n) = P(X_1 \le x_1, \ldots, X_n \le x_n)$  (3.19)

where

$\{X_1 \le x_1, \ldots, X_n \le x_n\} = \{X_1 \le x_1\} \cap \cdots \cap \{X_n \le x_n\}$  (3.20)
or by their joint density function $f(x_1, \ldots, x_n)$:

$F(x_1, \ldots, x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f(u_1, \ldots, u_n)\,du_1 \cdots du_n$  (3.21)
For the differentiable case

$f(x_1, \ldots, x_n) = \dfrac{\partial^n F(x_1, \ldots, x_n)}{\partial x_1 \cdots \partial x_n}$  (3.22)
The marginal distribution function is defined by

$F_{X_1}(x_1) = F(x_1, \infty, \ldots, \infty)$  (3.23)
The marginal density function is

$f_{X_1}(x_1) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f(x_1, u_2, \ldots, u_n)\,du_2 \cdots du_n$  (3.24)
The expectation of $X_i$ is given by

$E[X_i] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} x_i f(x_1, \ldots, x_n)\,dx_1 \cdots dx_n$  (3.25)
The second moment of the pair $X_i$, $X_j$ is

$E[X_i X_j] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} x_i x_j f(x_1, \ldots, x_n)\,dx_1 \cdots dx_n$  (3.26)
Of great importance in applications is the covariance of $X$ and $Y$, which is defined by

$\operatorname{Cov}(X, Y) = E\left[(X - E[X])(Y - E[Y])\right] = E[XY] - E[X]\,E[Y]$  (3.27)
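The covariance (3.27) can be estimated from samples. A minimal sketch with an illustrative dependent pair (the construction $Y = X + \text{noise}$ is an assumption for the example, not from the text):

```python
import random

random.seed(2)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
# Y depends on X, so the covariance is nonzero: Cov(X, X + N) = Var(X) = 1.
ys = [x + random.gauss(0.0, 0.5) for x in xs]

mx = sum(xs) / n
my = sum(ys) / n
# Sample version of (3.27): average of (X - E[X]) * (Y - E[Y]).
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

assert abs(cov - 1.0) < 0.05
```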
The generalization of the higher moments and central moments from the case of one random variable to the joint variables is straightforward.
Two jointly distributed random variables $X$ and $Y$ are independent if any of the following equivalent conditions is satisfied:

$F_{XY}(x, y) = F_X(x)\,F_Y(y), \qquad f_{XY}(x, y) = f_X(x)\,f_Y(y)$  (3.28)
We say that $X_1, \ldots, X_n$ are mutually independent if

$F(x_1, \ldots, x_n) = F_{X_1}(x_1)\,F_{X_2}(x_2) \cdots F_{X_n}(x_n)$  (3.29)
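The factorization (3.28) can be checked on empirical distributions of independently drawn samples. A short sketch, assuming independent uniform variables (an illustrative choice):

```python
import random

random.seed(3)
n = 100_000
# Independent uniform pairs, so F(x, y) = F_X(x) * F_Y(y) should hold.
pairs = [(random.random(), random.random()) for _ in range(n)]

def joint_cdf(x, y):
    return sum(u <= x and v <= y for u, v in pairs) / n

def marginal_x(x):
    return sum(u <= x for u, _ in pairs) / n

def marginal_y(y):
    return sum(v <= y for _, v in pairs) / n

# Condition (3.28): the joint distribution factors into the marginals.
for x, y in [(0.3, 0.7), (0.5, 0.5)]:
    assert abs(joint_cdf(x, y) - marginal_x(x) * marginal_y(y)) < 0.01
```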