In mathematics, a local martingale is a type of stochastic process, satisfying the localized version of the martingale property. Every martingale is a local martingale; every bounded local martingale is a martingale; more generally, every local martingale that is bounded from below is a supermartingale, and every local martingale that is bounded from above is a submartingale. However, a local martingale is not in general a martingale, because its expectation can be distorted by large values of small probability. In particular, a driftless diffusion process is a local martingale, but not necessarily a martingale.

Local martingales are essential in stochastic analysis (see Itō calculus, semimartingale, and Girsanov theorem).

Definition

Let $(\Omega, \mathcal{F}, P)$ be a probability space; let $\mathcal{F}_* = \{\mathcal{F}_t : t \ge 0\}$ be a filtration of $\mathcal{F}$; let $X \colon [0,\infty) \times \Omega \to S$ be an $\mathcal{F}_*$-adapted stochastic process on the set $S$. Then $X$ is called an $\mathcal{F}_*$-local martingale if there exists a sequence of $\mathcal{F}_*$-stopping times $\tau_k \colon \Omega \to [0,\infty)$ such that

  • the $\tau_k$ are almost surely increasing: $P[\tau_k < \tau_{k+1}] = 1$;
  • the $\tau_k$ diverge almost surely: $P[\tau_k \to \infty \text{ as } k \to \infty] = 1$;
  • the stopped process
      $X_t^{\tau_k} := X_{\min\{t,\, \tau_k\}}$
    is an $\mathcal{F}_*$-martingale for every $k$.
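
For example, every martingale $X$ is a local martingale: the deterministic times $\tau_k = k$ form a localizing sequence, since they increase and diverge, and the stopped process $X_t^{\tau_k} = X_{\min\{t,\,k\}}$ is again a martingale.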

Examples

Example 1

Let $W_t$ be the Wiener process and $T = \min\{t : W_t = -1\}$ the time of first hit of $-1$. The stopped process $W_{\min\{t,\,T\}}$ is a martingale. Its expectation is $0$ at all times; nevertheless, its limit (as $t \to \infty$) is equal to $-1$ almost surely (a kind of gambler's ruin). A time change leads to a process

$X_t = \begin{cases} W_{\min\{\frac{t}{1-t},\, T\}} & \text{for } 0 \le t < 1,\\ -1 & \text{for } 1 \le t < \infty. \end{cases}$

The process $X_t$ is continuous almost surely; nevertheless, its expectation is discontinuous,

$\mathbb{E}\, X_t = \begin{cases} 0 & \text{for } 0 \le t < 1,\\ -1 & \text{for } 1 \le t < \infty. \end{cases}$

This process is not a martingale. However, it is a local martingale. A localizing sequence may be chosen as $\tau_k = \min\{t : X_t = k\}$ if there is such $t$, otherwise $\tau_k = k$. This sequence diverges almost surely, since $\tau_k = k$ for all $k$ large enough (namely, for all $k$ that exceed the maximal value of the process $X$). The process stopped at $\tau_k$ is a martingale.[details 1]
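
The displayed expectation can be verified directly: for $0 \le t < 1$ the stopped Wiener process $W_{\min\{s,\,T\}}$ is a martingale started at $0$, so

$\mathbb{E}\, X_t = \mathbb{E}\, W_{\min\{\frac{t}{1-t},\, T\}} = 0,$

while for $t \ge 1$ one has $X_t = W_T = -1$ almost surely, whence $\mathbb{E}\, X_t = -1$.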

Example 2

Let $W_t$ be the Wiener process and $f$ a measurable function such that $\mathbb{E}\, |f(W_1)| < \infty.$ Then the following process is a martingale:

$X_t = \mathbb{E}[f(W_1) \mid \mathcal{F}_t] = \begin{cases} F_{1-t}(W_t) & \text{for } 0 \le t < 1,\\ f(W_1) & \text{for } 1 \le t < \infty; \end{cases}$

where

$F_s(x) = \int_{-\infty}^{+\infty} f(x+y)\, \frac{1}{\sqrt{2\pi s}}\, \mathrm{e}^{-y^2/(2s)} \, \mathrm{d}y.$
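
The equality $\mathbb{E}[f(W_1) \mid \mathcal{F}_t] = F_{1-t}(W_t)$ for $0 \le t < 1$ follows from the independence of increments: writing $W_1 = W_t + (W_1 - W_t)$, where the increment $W_1 - W_t$ is independent of $\mathcal{F}_t$ and has the normal distribution $\mathcal{N}(0,\, 1-t)$, gives

$\mathbb{E}[f(W_1) \mid \mathcal{F}_t] = \int_{-\infty}^{+\infty} f(W_t + y)\, \frac{1}{\sqrt{2\pi (1-t)}}\, \mathrm{e}^{-y^2/(2(1-t))} \, \mathrm{d}y = F_{1-t}(W_t).$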

The Dirac delta function $\delta$ (strictly speaking, not a function), used in place of $f$, leads to a process defined informally as $Y_t = \mathbb{E}[\delta(W_1) \mid \mathcal{F}_t]$ and formally as

$Y_t = \begin{cases} g_{1-t}(W_t) & \text{for } 0 \le t < 1,\\ 0 & \text{for } 1 \le t < \infty, \end{cases}$

where

$g_s(x) = \frac{1}{\sqrt{2\pi s}}\, \mathrm{e}^{-x^2/(2s)}.$

The process $Y_t$ is continuous almost surely (since $W_1 \ne 0$ almost surely); nevertheless, its expectation is discontinuous,

$\mathbb{E}\, Y_t = \begin{cases} \frac{1}{\sqrt{2\pi}} & \text{for } 0 \le t < 1,\\ 0 & \text{for } 1 \le t < \infty. \end{cases}$

This process is not a martingale. However, it is a local martingale. A localizing sequence may be chosen as $\tau_k = \min\{t : Y_t = k\}$ (and $\tau_k = k$ if there is no such $t$, as in Example 1).
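
The value of the expectation before time 1 can be computed from the Gaussian densities: for $0 < t < 1$ the random variable $W_t$ has density $g_t$, so

$\mathbb{E}\, Y_t = \int_{-\infty}^{+\infty} g_{1-t}(x)\, g_t(x) \, \mathrm{d}x = (g_{1-t} * g_t)(0) = g_1(0) = \frac{1}{\sqrt{2\pi}},$

by the symmetry of $g_{1-t}$ and the convolution (semigroup) property of the Gaussian kernels; the same value appears at $t = 0$, since $Y_0 = g_1(0)$.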

Example 3

Let $Z_t$ be the complex-valued Wiener process, and

$X_t = \ln\left|Z_t - 1\right|.$

The process $X_t$ is continuous almost surely (since $Z_t$ does not hit 1, almost surely), and is a local martingale, since the function $u \mapsto \ln|u - 1|$ is harmonic (on the complex plane without the point 1). A localizing sequence may be chosen as $\tau_k = \min\{t : |Z_t - 1| = 1/k\}.$ Nevertheless, the expectation of this process is non-constant; moreover,

    $\mathbb{E}\, X_t \to \infty$ as $t \to \infty,$

which can be deduced from the fact that the mean value of $\ln|u - 1|$ over the circle $|u| = r$ tends to infinity as $r \to \infty$. (In fact, it is equal to $\ln r$ for $r \ge 1$ but to $0$ for $r \le 1$.)
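
The circle average can be computed explicitly. For $0 \le r < 1$ the function $u \mapsto \ln|u - 1|$ is harmonic on a neighbourhood of the disc $|u| \le r$, so its mean over the circle $|u| = r$ equals its value at the centre, $\ln|0 - 1| = 0$. For $r > 1$,

$\frac{1}{2\pi} \int_0^{2\pi} \ln\left|r\mathrm{e}^{\mathrm{i}\theta} - 1\right| \mathrm{d}\theta = \ln r + \frac{1}{2\pi} \int_0^{2\pi} \ln\left|1 - \tfrac{1}{r}\mathrm{e}^{-\mathrm{i}\theta}\right| \mathrm{d}\theta = \ln r,$

since the last integral is the mean of the harmonic function $v \mapsto \ln|1 - v|$ over the circle $|v| = 1/r < 1$ and therefore equals its value at the origin, $\ln 1 = 0$; the borderline case $r = 1$ follows by continuity.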

Martingales via local martingales

Let $M_t$ be a local martingale. In order to prove that it is a martingale it is sufficient to prove that $M_t^{\tau_k} \to M_t$ in $L^1$ (as $k \to \infty$) for every $t$, that is, $\mathbb{E}\left|M_t^{\tau_k} - M_t\right| \to 0$; here $M_t^{\tau_k} = M_{\min\{t,\,\tau_k\}}$ is the stopped process. The relation $\tau_k \to \infty$ almost surely implies that $M_t^{\tau_k} \to M_t$ almost surely. The dominated convergence theorem ensures the convergence in $L^1$ provided that

    $(*)\qquad \mathbb{E}\, \sup_k \left|M_t^{\tau_k}\right| < \infty$ for every $t$.

Thus, Condition (*) is sufficient for a local martingale $M$ to be a martingale. A stronger condition

    $(**)\qquad \mathbb{E}\, \sup_{s \in [0,t]} |M_s| < \infty$ for every $t$

is also sufficient.
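
Indeed, (**) implies (*), since stopping does not increase the running maximum: $\left|M_t^{\tau_k}\right| = \left|M_{\min\{t,\,\tau_k\}}\right| \le \sup_{s \in [0,t]} |M_s|$ for every $k$.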

Caution. The weaker condition

    $\sup_{s \in [0,t]} \mathbb{E}\, |M_s| < \infty$ for every $t$

is not sufficient. Moreover, the condition

    $\sup_{s \in [0,t]} \mathbb{E}\, \mathrm{e}^{|M_s|} < \infty$ for every $t$

is still not sufficient; for a counterexample see Example 3 above.

A special case:

$M_t = f(t,\, W_t),$

where $W_t$ is the Wiener process, and $f \colon [0,\infty) \times \mathbb{R} \to \mathbb{R}$ is twice continuously differentiable. The process $M_t$ is a local martingale if and only if $f$ satisfies the PDE

$\left( \frac{\partial}{\partial t} + \frac12 \frac{\partial^2}{\partial x^2} \right) f(t,x) = 0.$

However, this PDE itself does not ensure that $M_t$ is a martingale. In order to apply (**) the following condition on $f$ is sufficient: for every $\varepsilon > 0$ and $t$ there exists $C = C(\varepsilon, t)$ such that

$|f(s,x)| \le C\, \mathrm{e}^{\varepsilon x^2}$

for all $s \in [0,t]$ and $x \in \mathbb{R}$.
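
For example, the function $f(t,x) = \mathrm{e}^{x - t/2}$ satisfies both requirements: it solves the PDE, since $\frac{\partial f}{\partial t} = -\tfrac12 f$ and $\frac{\partial^2 f}{\partial x^2} = f$, and it obeys the growth bound, since $\mathrm{e}^{x - s/2} \le \mathrm{e}^{|x|} \le \mathrm{e}^{1/(4\varepsilon)}\, \mathrm{e}^{\varepsilon x^2}$ for every $\varepsilon > 0$. Hence the geometric Brownian motion $M_t = \mathrm{e}^{W_t - t/2}$ is not only a local martingale but a martingale.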

Technical details

  1. ^ For times before 1 it is a martingale, since a stopped Brownian motion is. After the instant 1 it is constant. It remains to check it at the instant 1. By the bounded convergence theorem the expectation at 1 is the limit of the expectation at (n − 1)/n (as n tends to infinity), and the latter does not depend on n. The same argument, applied to the conditional expectation given $\mathcal{F}_s$ for $s < 1$ in place of the expectation, establishes the martingale property at the instant 1.
