A.4 Brownian Motions and Random Walks

Concepts defined in this section:

• Unitary symmetric random walk

• Brownian motion

• First variation

• Second (quadratic) variation

• n-dimensional Brownian motion

• Orthogonal n-dimensional Brownian motion

The essential probabilistic tools needed in order to obtain some important results in stochastic calculus have been presented in the preceding sections. Since the most powerful of these results (Ito's lemma) can only be obtained under the assumption that the filtration representing the accretion of information is generated by a very special process, namely a Brownian motion, a brief review of the fundamental properties of the latter is presented in this section. The topic is well covered in virtually all the texts in stochastic calculus: see, e.g., Oksendal (1995). The conceptual layout of this section and the following owes a lot to Shreve (1997).

Consider a filtered probability space (Ω, ℱ, P). A unitary symmetric random walk can be defined on this space starting from a sequence of random variables, X_k(ω), constructed by the following procedure:

(i) Take a given probabilistic 'experiment', and label its outcomes, ω_k, by an integer k, where k = 0, 1, 2, ... (i.e. the outcomes can be thought of as 'occurring at time k').

(ii) For each integer k, to each possible outcome, ω_k, a random variable X_k(ω) can be associated. This random variable can only assume, depending on the particular elementary event ω_k occurring at 'time' k, the values +1 and −1. Let the sum of the probabilities of the events at time k to which there corresponds the value +1 for the random variable X_k be equal to ½. Similarly for the events associated with X_k = −1.

(iii) Let each realisation be independent of the previous one.

To lighten notation X(k, ω_k) will also be denoted in the following as X(k, ω) or simply X(k). The simplest example of 'experiment' could be the tossing of a coin, or the realisation of an 'up' or 'down' move on a binomial tree. The 'experiment', however, need not necessarily have only two possible outcomes: the rolling of a die could constitute a possible 'experiment' if one associated, say, the value +1 to any even outcome, and the value −1 to any odd outcome. The second condition requires that the coin or the die should be 'fair' (unbiased), or that the 'up' and 'down' probabilities should be equal to ½. The third condition requires that the outcome of the 'experiment' should not be influenced by previous outcomes. The values of ±1 for the random variable X(k) justify the term 'unitary' for the random walk, and the 'fairness' of the experiment characterises the random walk as 'symmetric'. In addition to these properties, let us assume that the filtration ℱ is generated by the random variables X(k).
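As a purely illustrative sketch (not part of the original text), the steps X(k) can be simulated as follows; the use of NumPy, the seed and the variable names are my own choices.

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # fixed seed, only for reproducibility

n_steps = 10_000  # arbitrary number of 'experiments'

# Each X(k) takes the value +1 or -1 with probability 1/2 each,
# independently of all previous realisations (conditions (i)-(iii)).
X = rng.choice([-1, 1], size=n_steps)

# The sample moments should be close to the theoretical values E[X] = 0, Var[X] = 1.
print(X.mean(), X.var())
```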

From properties (i), (ii) and (iii) it is immediate to verify that the expectation of X(k + 1), given any sequence of X(n), n = 0, 1, 2, ..., k, is 0, and that the conditional variance of X(k + 1) is equal to 1:

E[X(k + 1) | ℱ(k)] = 0     (A.1)

Var[X(k + 1) | ℱ(k)] = 1     (A.2)

In order to establish the variance property (A.2), for instance, one can start from the definition of variance, and obtain:

Var[X(k + 1) | ℱ(k)] = E[X(k + 1)^2 | ℱ(k)] − (E[X(k + 1) | ℱ(k)])^2 = 1 − 0 = 1

since X(k + 1) can only assume the values +1 or −1, and use has been made of property (A.1).

Having defined the sequence X(k, ω_k), one can now construct a symmetric random walk, RW(n), by defining

RW(0) = 0,   RW(n) = X(1) + X(2) + ... + X(n) = Σ_{k=1}^{n} X(k)
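Continuing the illustrative sketch above (again my own code, not the book's), the walk itself is just the running sum of the steps:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

n_steps = 10_000
X = rng.choice([-1, 1], size=n_steps)      # X(1), ..., X(n_steps)

# RW(0) = 0 and RW(n) = X(1) + ... + X(n): prepend 0 to the cumulative sum,
# so that RW[n] holds the value of the walk after n steps.
RW = np.concatenate(([0], np.cumsum(X)))

print(RW[:10])  # one realisation of the first few values of the walk
```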

From the properties of X(k), it is easy to establish the fundamental properties of the symmetric random walk. First of all, by construction, the successive steps are independent, and each step has unit variance. In addition, the so-called martingale property relates the expectation of the future value of a symmetric random walk, conditioned on the information available after n steps, to the value RW(n) it has attained after n steps. More precisely, let us consider the filtration ℱ(n) generated by the first n tosses. We want to evaluate E[RW(s) | ℱ(n)], for s > n. RW(s) can be rewritten as

RW(s) = RW(n) + Σ_{j=n+1}^{s} X(j)

Since ℱ(n) is the filtration generated by the first n tosses, RW(n) is an ℱ(n)-measurable quantity (i.e., it is a known number after n tosses). Therefore

E[RW(s) | ℱ(n)] = RW(n) + Σ_{j=n+1}^{s} E[X(j) | ℱ(n)]

All the quantities X(j), j > n, are independent of the previous realisations (i.e. they are independent of ℱ(n)). Therefore, for each j,

E[X(j) | ℱ(n)] = E[X(j)] = 0

It follows that

E[RW(s) | ℱ(n)] = RW(n)     (A.9)

In other terms, the best prediction that can be made after n realisations of the value that a symmetric random walk will assume after s realisations is simply its value after n realisations.
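The martingale property (A.9) lends itself to a simple numerical check, sketched below (my own illustration, with arbitrary choices of n, s and the number of paths): fix one realisation of the first n tosses and average RW(s) over many independent continuations.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n, s = 50, 200            # illustrative choices with s > n
n_paths = 50_000          # number of simulated continuations

# Fix one realisation of the first n tosses, i.e. one state of information in F(n).
prefix = rng.choice([-1, 1], size=n)
RW_n = prefix.sum()

# Simulate many independent continuations X(n+1), ..., X(s) and form RW(s).
RW_s = RW_n + rng.choice([-1, 1], size=(n_paths, s - n)).sum(axis=1)

# The sample average of RW(s) should be close to RW(n), as stated by (A.9).
print(RW_n, RW_s.mean())
```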

If the variance property (A.2) is now coupled with the independence of successive realisations, it is also simple to show that the variance of the difference between the random walks at times n and m (as evaluated at an earlier time i ≤ m) is simply equal to n − m. Since the continuous-time equivalent of this result is of fundamental importance, a sketch of the proof for the discrete case is given below. Starting from the definition of variance,

Var[RW(n) − RW(m) | ℱ(i)] = E[(RW(n) − RW(m))^2 | ℱ(i)] − (E[(RW(n) − RW(m)) | ℱ(i)])^2     (A.10)

But from result (A.9) we know that

(E[(RW(n) − RW(m)) | ℱ(i)])^2 = (E[RW(n) | ℱ(i)] − E[RW(m) | ℱ(i)])^2 = (RW(i) − RW(i))^2 = 0

As for the first term on the RHS of Equation (A.10), the term RW(n) can be written as RW(i) + Σ_{k=i+1}^{n} X(k), and similarly for RW(m); therefore the term inside the expectation becomes (RW(n) − RW(m))^2 = (Σ_{k=m+1}^{n} X(k))^2, and

E[(RW(n) − RW(m))^2 | ℱ(i)] = Σ_{k=m+1}^{n} E[X(k)^2 | ℱ(i)] = n − m

where use has been made of result (A.3) concerning the variance of X, and of the fact that successive realisations are independent. Combining the two terms in Equation (A.10) then gives Var[RW(n) − RW(m) | ℱ(i)] = n − m, as claimed.
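The variance result just derived admits the same kind of numerical check (again only an illustrative sketch of mine, with arbitrary parameter choices):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

m, n = 100, 400           # illustrative times with n > m
n_paths = 50_000

# RW(n) - RW(m) = X(m+1) + ... + X(n): simulate the increment directly.
increments = rng.choice([-1, 1], size=(n_paths, n - m)).sum(axis=1)

# The sample variance should be close to n - m (here 300), and the sample mean to 0.
print(increments.mean(), increments.var())
```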

Having established these important relationships for the 'unitary' RW, we want to be able to 'scale' the process in such a way that the expectation and variance properties (A.10) are retained. In order to do so, for a given integer m and a time t > 0, let us define the new quantity

B_m(t) = (1/√m) [RW(k) + (RW(k + 1) − RW(k)) (mt − k)]

where k = Int(mt) is the integer truncation of the real number mt. In other terms, the quantity [RW(k) + (RW(k + 1) − RW(k)) (mt − k)] constructs the linear interpolation of the random walk RW between the two consecutive integers k and k + 1 such that k ≤ mt ≤ k + 1. Therefore, to consider two special cases, if the quantity mt happened to be the integer k then

B_m(t) = (1/√m) RW(k) = (1/√m) RW(mt)

or, if mt happened to be equal to the integer k + 1, then

B_m(t) = (1/√m) RW(k + 1) = (1/√m) RW(mt)

Figures A.5 and A.6 show the realisations of a unitary symmetric random walk and of a scaled symmetric random walk with m = 3, respectively.

Apart from these limiting cases, notice carefully, however, that the new quantity B_m(t) is defined for any t such that mt lies in between the two 'times' k and k + 1. Let us now choose a sequence of times {t_i},

such that, for a given m, all the terms mt_i are integers. For t ∈ {t_i}, B_m(t) = (1/√m) RW(mt) by construction, and, therefore, the new quantity B_m(t) coincides, up to the 1/√m rescaling, with RW on the set {t_i}. It is therefore easy to show that, for s, t ∈ {t_i} with s < t,

E[B_m(t) | ℱ(ms)] = B_m(s)

Var[B_m(t) − B_m(s) | ℱ(ms)] = t − s
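A minimal sketch of the construction of B_m(t) described above (my own code, not the book's; it assumes the 1/√m normalisation used in the definition given earlier, and the function name scaled_walk is mine):

```python
import numpy as np

def scaled_walk(RW, m, t):
    """B_m(t): linear interpolation of RW between k = Int(mt) and k + 1,
    rescaled by 1/sqrt(m), following the definition above."""
    k = int(np.floor(m * t))                         # integer truncation of mt
    interpolated = RW[k] + (RW[k + 1] - RW[k]) * (m * t - k)
    return interpolated / np.sqrt(m)

rng = np.random.default_rng(seed=2)
m = 3
X = rng.choice([-1, 1], size=3_000)
RW = np.concatenate(([0], np.cumsum(X)))             # RW(0), RW(1), ...

# At a time t with mt an integer, B_m(t) reduces to RW(mt)/sqrt(m).
print(scaled_walk(RW, m, 5.0), RW[15] / np.sqrt(m))
```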

Figure A.5 The realisations of a unitary symmetric random walk

Figure A.6 The realisations of a scaled symmetric random walk B_m(t), m = 3
