Martingale (probability theory)

In probability theory, a martingale is a sequence of random variables (i.e., a stochastic process) for which, at a particular time, the conditional expectation of the next value in the sequence is equal to the present value, regardless of all prior values.

Stopped Brownian motion is an example of a martingale. It can model an even coin-toss betting game with the possibility of bankruptcy.

History

Originally, martingale referred to a class of betting strategies that was popular in 18th-century France. The simplest of these strategies was designed for a game in which the gambler wins their stake if a coin comes up heads and loses it if the coin comes up tails. The strategy had the gambler double their bet after every loss, so that the first win would recover all previous losses plus a profit equal to the original stake. As the gambler's wealth and available time jointly approach infinity, their probability of eventually flipping heads approaches 1, which makes the martingale betting strategy seem like a sure thing. However, the exponential growth of the bets eventually bankrupts those who use the strategy, because bankrolls are finite. Stopped Brownian motion, which is a martingale process, can be used to model the trajectory of such games.
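
The arithmetic behind the doubling rule can be made concrete with a short simulation. The following Python sketch is an illustrative addition (the bankroll size, base stake, and number of rounds are arbitrary assumptions, not from the article): it doubles the stake after every loss, resets it after a win, and stops when the remaining bankroll can no longer cover the next bet.

```python
import random

def play_doubling_strategy(bankroll=100, base_stake=1, rounds=1000, seed=0):
    """Simulate the doubling ('martingale') betting strategy on a fair coin:
    double the stake after each loss, reset it after each win, and stop when
    the remaining bankroll can no longer cover the next bet."""
    rng = random.Random(seed)
    stake = base_stake
    for _ in range(rounds):
        if stake > bankroll:      # ruined: the doubled bet can no longer be placed
            return bankroll
        if rng.random() < 0.5:    # heads: win the stake
            bankroll += stake
            stake = base_stake    # a win recovers all prior losses plus one base stake
        else:                     # tails: lose the stake and double the next bet
            bankroll -= stake
            stake *= 2
    return bankroll

print(play_doubling_strategy())
```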

The concept of martingale in probability theory was introduced by Paul Lévy in 1934, though he did not name it. The term "martingale" was introduced later by Ville (1939), who also extended the definition to continuous martingales. Much of the original development of the theory was done by Joseph Leo Doob among others. Part of the motivation for that work was to show the impossibility of successful betting strategies in games of chance.

Definitions

A basic definition of a discrete-time martingale is a discrete-time stochastic process (i.e., a sequence of random variables) X1, X2, X3, ... that satisfies for any time n,

\mathbf{E}(\vert X_n \vert) < \infty
\mathbf{E}(X_{n+1} \mid X_1, \ldots, X_n) = X_n .

That is, the conditional expected value of the next observation, given all the past observations, is equal to the most recent observation.
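
As an illustration of this defining property, the following Python sketch (an illustrative addition; the path count and number of steps are arbitrary choices) estimates the conditional mean of the next value of a symmetric ±1 random walk given its current value. Each conditional average comes out close to the conditioning value, as the identity E(X_{n+1} | X_1, ..., X_n) = X_n predicts.

```python
import random

def conditional_mean_of_next_step(n_paths=100_000, steps=5, seed=1):
    """Group many paths of a symmetric +/-1 random walk by their current value
    X_n and average the next value X_{n+1} within each group."""
    rng = random.Random(seed)
    next_values = {}                        # current value x -> observed next values
    for _ in range(n_paths):
        x = 0
        for _ in range(steps):
            x += rng.choice((-1, 1))
        next_values.setdefault(x, []).append(x + rng.choice((-1, 1)))
    return {x: sum(v) / len(v) for x, v in sorted(next_values.items())}

# Each conditional average is close to the conditioning value x, consistent with
# E(X_{n+1} | X_1, ..., X_n) = X_n for the symmetric random walk.
print(conditional_mean_of_next_step())
```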

Martingale sequences with respect to another sequence

More generally, a sequence Y1, Y2, Y3 ... is said to be a martingale with respect to another sequence X1, X2, X3 ... if for all n

\mathbf{E}(\vert Y_n \vert) < \infty
\mathbf{E}(Y_{n+1} \mid X_1, \ldots, X_n) = Y_n .

Similarly, a continuous-time martingale with respect to the stochastic process Xt is a stochastic process Yt such that for all t

\mathbf{E}(\vert Y_t \vert) < \infty
\mathbf{E}(Y_t \mid \{ X_\tau , \tau \leq s \}) = Y_s \quad \forall s \leq t .

This expresses the property that the conditional expectation of an observation at time t, given all the observations up to time s, is equal to the observation at time s (provided that s ≤ t). The second property implies that Y_n is measurable with respect to X_1, ..., X_n.
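
For a concrete instance of a martingale with respect to another sequence, the sketch below (an illustrative addition; the sample sizes are arbitrary) checks numerically that Y_n = S_n² − n, where S_n is a fair ±1 random walk built from X_1, X_2, ..., satisfies E(Y_{n+1} | X_1, ..., X_n) = Y_n.

```python
import random

def conditional_mean_given_history(n_paths=200_000, n=4, seed=2):
    """For S_n a fair +/-1 random walk driven by X_1, X_2, ..., check that
    Y_n = S_n**2 - n satisfies E(Y_{n+1} | X_1, ..., X_n) = Y_n by grouping
    paths on the full history (X_1, ..., X_n) and averaging Y_{n+1}."""
    rng = random.Random(seed)
    groups = {}   # history -> [sum of observed Y_{n+1}, count, Y_n]
    for _ in range(n_paths):
        xs = tuple(rng.choice((-1, 1)) for _ in range(n + 1))
        s_n = sum(xs[:n])
        entry = groups.setdefault(xs[:n], [0.0, 0, s_n ** 2 - n])
        entry[0] += (s_n + xs[n]) ** 2 - (n + 1)   # Y_{n+1} on this path
        entry[1] += 1
    return [(hist, total / count, y_n) for hist, (total, count, y_n) in groups.items()]

# Within every history, the average of Y_{n+1} is close to Y_n.
for history, avg_next, y_n in conditional_mean_given_history()[:4]:
    print(history, round(avg_next, 3), y_n)
```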

General definition

In full generality, a stochastic process Y : T × Ω → S taking values in a Banach space S with norm ‖·‖_S is a martingale with respect to a filtration Σ_* and probability measure P if

Σ_* is a filtration of the underlying probability space (Ω, Σ, P), and Y is adapted to Σ_*, i.e., for each t in the index set T, the random variable Y_t is Σ_t-measurable;

for all t,
\mathbf{E}_{\mathbb{P}}(\lVert Y_t \rVert_S) < +\infty ;

and for all s and t with s ≤ t and all F ∈ Σ_s,
\mathbf{E}_{\mathbb{P}}\left[ (Y_t - Y_s)\, \chi_F \right] = 0 ,

where χ_F denotes the indicator function of the event F. In Grimmett and Stirzaker's Probability and Random Processes, this last condition is denoted as
Y_s = \mathbf{E}_{\mathbb{P}}(Y_t \mid \Sigma_s) ,
which is a general form of conditional expectation.

It is important to note that the property of being a martingale involves both the filtration and the probability measure (with respect to which the expectations are taken). It is possible that Y could be a martingale with respect to one measure but not another one; the Girsanov theorem offers a way to find a measure with respect to which an Itō process is a martingale.

In the Banach space setting the conditional expectation is also denoted in operator notation as \mathbf{E}^{\Sigma_s} Y_t.

Examples of martingales

Suppose a possibly biased coin shows heads with probability p and tails with probability q = 1 − p, and let X_n be a gambler's fortune after n tosses of a game in which each toss changes the fortune by one unit, so that

X_{n+1} = X_n \pm 1

with "+" in case of "heads" and "−" in case of "tails". Let

Y_n = (q/p)^{X_n} .

Then { Yn : n = 1, 2, 3, ... } is a martingale with respect to { Xn : n = 1, 2, 3, ... } (de Moivre's martingale). To show this,

\begin{aligned}
\mathbf{E}(Y_{n+1} \mid X_1, \ldots, X_n) &= p (q/p)^{X_n + 1} + q (q/p)^{X_n - 1} \\
&= p (q/p)(q/p)^{X_n} + q (p/q)(q/p)^{X_n} \\
&= q (q/p)^{X_n} + p (q/p)^{X_n} = (q/p)^{X_n} = Y_n .
\end{aligned}

In likelihood-ratio testing, suppose a random variable X is thought to be distributed according to either the probability density f or the density g, and X_1, X_2, ..., X_n is a random sample. Let Y_n be the likelihood ratio

Y_n = \prod_{i=1}^{n} \frac{g(X_i)}{f(X_i)} .

If X is actually distributed according to the density f rather than according to g, then { Yn : n = 1, 2, 3, ... } is a martingale with respect to { Xn : n = 1, 2, 3, ... }.
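
The algebraic verification above can also be checked numerically. The following Python sketch (an illustrative addition; the bias p and the range of fortune values are arbitrary) evaluates E(Y_{n+1} | X_n = x) = p(q/p)^{x+1} + q(q/p)^{x−1} and compares it with (q/p)^x.

```python
def de_moivre_martingale_check(p=0.6, fortunes=range(-3, 4)):
    """Evaluate E(Y_{n+1} | X_n = x) = p*(q/p)**(x+1) + q*(q/p)**(x-1) for a coin
    with heads probability p and tails probability q = 1 - p, and compare it with
    Y_n = (q/p)**x; the two agree, confirming the martingale property."""
    q = 1.0 - p
    for x in fortunes:
        conditional_mean = p * (q / p) ** (x + 1) + q * (q / p) ** (x - 1)
        print(x, round(conditional_mean, 12), round((q / p) ** x, 12))

de_moivre_martingale_check()
```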

Submartingales, supermartingales, and relationship to harmonic functions

There are two popular generalizations of a martingale that also include cases when the current observation X_n is not necessarily equal to the future conditional expectation E(X_{n+1} | X_1, ..., X_n) but is instead an upper or lower bound on it. These definitions reflect a relationship between martingale theory and potential theory, which is the study of harmonic functions. Just as a continuous-time martingale satisfies E(X_t | {X_τ : τ ≤ s}) − X_s = 0 for all s ≤ t, a harmonic function f satisfies the partial differential equation Δf = 0, where Δ is the Laplacian operator. Given a Brownian motion process W_t and a harmonic function f, the resulting process f(W_t) is also a martingale.
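
A quick Monte Carlo sketch of this connection (an illustrative addition; the step size, horizon, and path count are arbitrary choices): for the harmonic function f(x, y) = x² − y² and a simulated two-dimensional Brownian motion started at the origin, the sample mean of f(W_t) stays near f(W_0) = 0 at every checkpoint, consistent with f(W_t) being a martingale.

```python
import random

def harmonic_of_brownian_mean(t_steps=200, dt=0.01, n_paths=20_000, seed=5):
    """Monte Carlo check of the constant-expectation consequence of the
    martingale property: f(x, y) = x**2 - y**2 is harmonic, W_t is a 2D
    Brownian motion started at the origin, and the sample mean of f(W_t)
    stays near f(W_0) = 0 at every checkpoint."""
    rng = random.Random(seed)
    checkpoints = (50, 100, 200)               # steps at which to record f(W_t)
    sums = {c: 0.0 for c in checkpoints}
    for _ in range(n_paths):
        x = y = 0.0
        for step in range(1, t_steps + 1):
            x += rng.gauss(0.0, dt ** 0.5)     # independent Gaussian increments
            y += rng.gauss(0.0, dt ** 0.5)
            if step in checkpoints:
                sums[step] += x * x - y * y
    return {round(c * dt, 2): sums[c] / n_paths for c in checkpoints}

print(harmonic_of_brownian_mean())             # each value is close to 0
```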

A discrete-time submartingale is a sequence X_1, X_2, X_3, ... of integrable random variables satisfying

\mathbf{E}(X_{n+1} \mid X_1, \ldots, X_n) \geq X_n .

Likewise, a continuous-time submartingale satisfies

\mathbf{E}(X_t \mid \{ X_\tau : \tau \leq s \}) \geq X_s \quad \forall s \leq t .

In potential theory, a subharmonic function f satisfies Δf ≥ 0. Any subharmonic function that is bounded above by a harmonic function for all points on the boundary of a ball is bounded above by the harmonic function for all points inside the ball. Similarly, if a submartingale and a martingale have equivalent expectations for a given time, the history of the submartingale tends to be bounded above by the history of the martingale. Roughly speaking, the prefix "sub-" is consistent because the current observation X_n is less than (or equal to) the conditional expectation E(X_{n+1} | X_1, ..., X_n). Consequently, the current observation provides support from below for the future conditional expectation, and the process tends to increase in future time.

Analogously, a discrete-time supermartingale satisfies

\mathbf{E}(X_{n+1} \mid X_1, \ldots, X_n) \leq X_n .

Likewise, a continuous-time supermartingale satisfies

\mathbf{E}(X_t \mid \{ X_\tau : \tau \leq s \}) \leq X_s \quad \forall s \leq t .

In potential theory, a superharmonic function f satisfies Δf ≤ 0. Any superharmonic function that is bounded below by a harmonic function for all points on the boundary of a ball is bounded below by the harmonic function for all points inside the ball. Similarly, if a supermartingale and a martingale have equivalent expectations for a given time, the history of the supermartingale tends to be bounded below by the history of the martingale. Roughly speaking, the prefix "super-" is consistent because the current observation X_n is greater than (or equal to) the conditional expectation E(X_{n+1} | X_1, ..., X_n). Consequently, the current observation provides support from above for the future conditional expectation, and the process tends to decrease in future time.
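
A standard example of a submartingale is the square of a fair random walk (more generally, a convex function of a martingale is a submartingale by Jensen's inequality). The short Python sketch below (an illustrative addition) verifies the defining inequality exactly for a fair ±1 walk: E(X_{n+1}² | X_n = x) = x² + 1 ≥ x².

```python
def squared_walk_submartingale_check(values=range(-3, 4)):
    """For a fair +/-1 random walk, E(X_{n+1}**2 | X_n = x) is exactly
    ((x + 1)**2 + (x - 1)**2) / 2 = x**2 + 1, which is >= x**2, so the squared
    walk satisfies the submartingale inequality."""
    for x in values:
        conditional_mean = ((x + 1) ** 2 + (x - 1) ** 2) / 2
        print(x, conditional_mean, x ** 2, conditional_mean >= x ** 2)

squared_walk_submartingale_check()
```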

Martingales and stopping times

A stopping time with respect to a sequence of random variables X1, X2, X3, ... is a random variable τ with the property that for each t, the occurrence or non-occurrence of the event τ = t depends only on the values of X1, X2, X3, ..., Xt. The intuition behind the definition is that at any particular time t, you can look at the sequence so far and tell if it is time to stop. An example in real life might be the time at which a gambler leaves the gambling table, which might be a function of their previous winnings (for example, they might leave only when they go broke), but they can't choose to go or stay based on the outcome of games that haven't been played yet.

In some contexts the concept of stopping time is defined by requiring only that the occurrence or non-occurrence of the event τ = t is probabilistically independent of Xt + 1, Xt + 2, ... but not that it is completely determined by the history of the process up to time t. That is a weaker condition than the one appearing in the paragraph above, but is strong enough to serve in some of the proofs in which stopping times are used.
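
As a concrete illustration (an added sketch, not from the article; the hitting level and step cap are arbitrary), the first time a fair ±1 walk reaches ±3 is a stopping time: whether it has occurred by time t can be read off from the first t steps alone.

```python
import random

def first_hitting_time(level=3, max_steps=10_000, seed=3):
    """The first step at which a fair +/-1 walk reaches +level or -level.
    Whether tau = t can be decided from X_1, ..., X_t alone, so tau is a
    stopping time. (By contrast, 'the step at which the walk attains its
    maximum over the first 100 steps' is not a stopping time, since deciding
    it requires knowledge of steps that have not yet been observed.)"""
    rng = random.Random(seed)
    x = 0
    for t in range(1, max_steps + 1):
        x += rng.choice((-1, 1))
        if abs(x) >= level:
            return t
    return None   # the level was not reached within max_steps

print(first_hitting_time())
```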

One of the basic properties of martingales is that, if (X_t)_{t>0} is a (sub-/super-) martingale and τ is a stopping time, then the corresponding stopped process (X_t^\tau)_{t>0} defined by X_t^\tau := X_{\min\{\tau, t\}} is also a (sub-/super-) martingale.

The concept of a stopped martingale leads to a series of important theorems, including, for example, the optional stopping theorem which states that, under certain conditions, the expected value of a martingale at a stopping time is equal to its initial value.
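
The optional stopping theorem can be illustrated numerically. The Python sketch below (an illustrative addition; the exit levels, time cap, and path count are arbitrary) stops a fair ±1 walk when it first leaves the interval (−5, 5) and averages the stopped values over many paths; the average comes out close to the walk's starting value 0.

```python
import random

def mean_of_stopped_walk(a=5, b=5, cap=10_000, n_paths=20_000, seed=4):
    """Stop a fair +/-1 walk started at 0 the first time it leaves the interval
    (-a, b), or at the time cap, and average the stopped values over many paths."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_paths):
        x, t = 0, 0
        while -a < x < b and t < cap:
            x += rng.choice((-1, 1))
            t += 1
        total += x
    return total / n_paths

print(mean_of_stopped_walk())   # close to 0, the starting value of the walk
```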

Notes

  1. Balsara, N. J. (1992). Money Management Strategies for Futures Traders. Wiley Finance. p. 122. ISBN 978-0-471-52215-7.
  2. Mansuy, Roger (June 2009). "The Origins of the Word 'Martingale'" (PDF). Electronic Journal for History of Probability and Statistics. 5 (1). Archived from the original on 2012-01-31. Retrieved 2011-10-22.
  3. Grimmett, G.; Stirzaker, D. (2001). Probability and Random Processes (3rd ed.). Oxford University Press. ISBN 978-0-19-857223-7.
  4. Bogachev, Vladimir (1998). Gaussian Measures. American Mathematical Society. pp. 372–373. ISBN 978-1470418694.
