In probability theory, a compound Poisson distribution is the probability distribution of the sum of a number of independent, identically distributed random variables, where the number of terms to be added is itself a Poisson-distributed variable. The result can be either a continuous or a discrete distribution.
Suppose that

N ∼ Poisson(λ),

i.e., N is a random variable whose distribution is a Poisson distribution with expected value λ, and that

X₁, X₂, X₃, …

are identically distributed random variables that are mutually independent and also independent of N. Then the probability distribution of the sum of N i.i.d. random variables

Y = ∑_{n=1}^{N} X_n

is a compound Poisson distribution.
In the case N = 0, the sum has 0 terms, so the value of Y is 0. Hence the conditional distribution of Y given that N = 0 is a degenerate distribution concentrated at 0.
The compound Poisson distribution is obtained by marginalising the joint distribution of (Y,N) over N, and this joint distribution can be obtained by combining the conditional distribution Y | N with the marginal distribution of N.
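This marginalisation can be carried out term by term: P(Y = y) = ∑_n P(N = n) P(X₁ + … + X_n = y). As a concrete illustration (an assumed example, not from the text above: Bernoulli(p) summands, for which the n-fold convolution is Binomial(n, p), and in which case Y is known to be Poisson(λp) by Poisson thinning), a minimal sketch:

```python
import math

lam, p = 2.0, 0.4  # illustrative parameter choices

def pmf_y(y, n_max=100):
    """P(Y = y) by marginalising P(N = n) * P(X1 + ... + Xn = y) over n.

    With Bernoulli(p) summands, X1 + ... + Xn is Binomial(n, p).
    The sum over n is truncated at n_max; the Poisson tail is negligible.
    """
    total = 0.0
    for n in range(y, n_max):
        p_n = math.exp(-lam) * lam**n / math.factorial(n)   # P(N = n)
        binom = math.comb(n, y) * p**y * (1 - p)**(n - y)   # P(Bin(n,p) = y)
        total += p_n * binom
    return total

# Thinning a Poisson: in this special case Y ~ Poisson(lam * p).
for y in range(5):
    exact = math.exp(-lam * p) * (lam * p)**y / math.factorial(y)
    print(y, pmf_y(y), exact)
```

The printed columns agree to floating-point accuracy, confirming the marginalisation against the known closed form for this special case.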
The expected value and the variance of the compound distribution can be derived in a simple way from the law of total expectation and the law of total variance. Thus

E(Y) = E[E(Y ∣ N)] = E[N E(X)] = E(N) E(X),

Var(Y) = E[Var(Y ∣ N)] + Var[E(Y ∣ N)]
       = E[N Var(X)] + Var[N E(X)]
       = E(N) Var(X) + (E(X))² Var(N).

Then, since E(N) = Var(N) if N is Poisson-distributed, these formulae can be reduced to

E(Y) = E(N) E(X),

Var(Y) = E(N) (Var(X) + (E(X))²) = E(N) E(X²).

The probability distribution of Y can be determined in terms of characteristic functions:
φ_Y(t) = E(e^{itY}) = E((E(e^{itX} ∣ N))^N) = E((φ_X(t))^N),

and hence, using the probability-generating function of the Poisson distribution, we have

φ_Y(t) = e^{λ(φ_X(t) − 1)}.

An alternative approach is via cumulant generating functions:

K_Y(t) = ln E[e^{tY}] = ln E[E(e^{tY} ∣ N)] = ln E[e^{N K_X(t)}] = K_N(K_X(t)).

Via the law of total cumulance it can be shown that, if the mean of the Poisson distribution is λ = 1, the cumulants of Y are the same as the moments of X₁.
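Both the moment identities and the characteristic-function identity lend themselves to a quick Monte Carlo sanity check. A minimal sketch (the exponential jump distribution with mean 2 and the rate λ = 3 are arbitrary illustrative choices; for an exponential with mean m, φ_X(t) = 1/(1 − itm) is known in closed form):

```python
import cmath
import random
import statistics

random.seed(0)

lam = 3.0      # E(N) = Var(N) = lam
mean_x = 2.0   # exponential jumps: E(X) = 2, E(X^2) = 2 * mean_x**2 = 8

def poisson(rate):
    """Poisson sampler via unit-rate exponential inter-arrival times."""
    count, total = 0, random.expovariate(1.0)
    while total < rate:
        count += 1
        total += random.expovariate(1.0)
    return count

def sample_y():
    """One compound Poisson variate Y = X_1 + ... + X_N."""
    return sum(random.expovariate(1.0 / mean_x) for _ in range(poisson(lam)))

ys = [sample_y() for _ in range(200_000)]

# Moment identities: E(Y) = E(N)E(X) = 6, Var(Y) = E(N)E(X^2) = 24.
print(statistics.fmean(ys))
print(statistics.pvariance(ys))

# Characteristic-function identity: phi_Y(t) = exp(lam * (phi_X(t) - 1)).
t = 0.3
empirical = sum(cmath.exp(1j * t * y) for y in ys) / len(ys)
theoretical = cmath.exp(lam * (1 / (1 - 1j * t * mean_x) - 1))
print(abs(empirical - theoretical))  # small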
Every infinitely divisible probability distribution is a limit of compound Poisson distributions. And compound Poisson distributions is infinitely divisible by the definition.
When X 1 , X 2 , X 3 , … {\displaystyle X_{1},X_{2},X_{3},\dots } (or stuttering-Poisson distribution) . We say that the discrete random variable Y {\displaystyle Y} satisfying probability generating function characterization
P Y ( z ) = ∑ i = 0 ∞ P ( Y = i ) z i = exp ( ∑ k = 1 ∞ α k λ ( z k − 1 ) ) , ( | z | ≤ 1 ) {\displaystyle P_{Y}(z)=\sum \limits _{i=0}^{\infty }P(Y=i)z^{i}=\exp \left(\sum \limits _{k=1}^{\infty }\alpha _{k}\lambda (z^{k}-1)\right),\quad (|z|\leq 1)} are positive integer-valued i.i.d random variables with P ( X 1 = k ) = α k , ( k = 1 , 2 , … ) {\displaystyle P(X_{1}=k)=\alpha _{k},\ (k=1,2,\ldots )} , then this compound Poisson distribution is named discrete compound Poisson distributionhas a discrete compound Poisson(DCP) distribution with parameters ( α 1 λ , α 2 λ , … ) ∈ R ∞ {\displaystyle (\alpha _{1}\lambda ,\alpha _{2}\lambda ,\ldots )\in \mathbb {R} ^{\infty }}
X ∼ DCP ( λ α 1 , λ α 2 , … ) {\displaystyle X\sim {\text{DCP}}(\lambda {\alpha _{1}},\lambda {\alpha _{2}},\ldots )} (where ∑ i = 1 ∞ α i = 1 {\textstyle \sum _{i=1}^{\infty }\alpha _{i}=1} , with α i ≥ 0 , λ > 0 {\textstyle \alpha _{i}\geq 0,\lambda >0} ), which is denoted byMoreover, if X ∼ DCP ( λ α 1 , … , λ α r ) {\displaystyle X\sim {\operatorname {DCP} }(\lambda {\alpha _{1}},\ldots ,\lambda {\alpha _{r}})} Poisson distribution and Hermite distribution, respectively. When r = 3 , 4 {\displaystyle r=3,4} , DCP becomes triple stuttering-Poisson distribution and quadruple stuttering-Poisson distribution, respectively. Other special cases include: shift geometric distribution, negative binomial distribution, Geometric Poisson distribution, Neyman type A distribution, Luria–Delbrück distribution in Luria–Delbrück experiment. For more special case of DCP, see the reviews paper and references therein.
, we say X {\displaystyle X} has a discrete compound Poisson distribution of order r {\displaystyle r} . When r = 1 , 2 {\displaystyle r=1,2} , DCP becomesFeller's characterization of the compound Poisson distribution states that a non-negative integer valued r.v. X {\displaystyle X} infinitely divisible if and only if its distribution is a discrete compound Poisson distribution. The negative binomial distribution is discrete infinitely divisible, i.e., if X has a negative binomial distribution, then for any positive integer n, there exist discrete i.i.d. random variables X1, ..., Xn whose sum has the same distribution that X has. The shift geometric distribution is discrete compound Poisson distribution since it is a trivial case of negative binomial distribution.
isThis distribution can model batch arrivals (such as in a bulk queue). The discrete compound Poisson distribution is also widely used in actuarial science for modelling the distribution of the total claim amount.
When some α k {\displaystyle \alpha _{k}} We define that any discrete random variable Y {\displaystyle Y} satisfying probability generating function characterization
G Y ( z ) = ∑ i = 0 ∞ P ( Y = i ) z i = exp ( ∑ k = 1 ∞ α k λ ( z k − 1 ) ) , ( | z | ≤ 1 ) {\displaystyle G_{Y}(z)=\sum \limits _{i=0}^{\infty }P(Y=i)z^{i}=\exp \left(\sum \limits _{k=1}^{\infty }\alpha _{k}\lambda (z^{k}-1)\right),\quad (|z|\leq 1)} are negative, it is the discrete pseudo compound Poisson distribution.has a discrete pseudo compound Poisson distribution with parameters ( λ 1 , λ 2 , … ) =: ( α 1 λ , α 2 λ , … ) ∈ R ∞ {\displaystyle (\lambda _{1},\lambda _{2},\ldots )=:(\alpha _{1}\lambda ,\alpha _{2}\lambda ,\ldots )\in \mathbb {R} ^{\infty }}
where ∑ i = 1 ∞ α i = 1 {\textstyle \sum _{i=1}^{\infty }{\alpha _{i}}=1} and ∑ i = 1 ∞ | α i | < ∞ {\textstyle \sum _{i=1}^{\infty }{\left|{\alpha _{i}}\right|}<\infty } , with α i ∈ R , λ > 0 {\displaystyle {\alpha _{i}}\in \mathbb {R} ,\lambda >0} .If X has a gamma distribution, of which the exponential distribution is a special case, then the conditional distribution of Y | N is again a gamma distribution. The marginal distribution of Y is a Tweedie distribution with variance power 1 < p < 2 (proof via comparison of characteristic function (probability theory)). To be more explicit, if
N ∼ Poisson ( λ ) , {\displaystyle N\sim \operatorname {Poisson} (\lambda ),}and
X i ∼ Γ ( α , β ) {\displaystyle X_{i}\sim \operatorname {\Gamma } (\alpha ,\beta )}i.i.d., then the distribution of
Y = ∑ i = 1 N X i {\displaystyle Y=\sum _{i=1}^{N}X_{i}}is a reproductive exponential dispersion model E D ( μ , σ 2 ) {\displaystyle ED(\mu ,\sigma ^{2})} with
E = λ α β =: μ , Var = λ α ( 1 + α ) β 2 =: σ 2 μ p . {\displaystyle {\begin{aligned}\operatorname {E} &=\lambda {\frac {\alpha }{\beta }}=:\mu ,\\\operatorname {Var} &=\lambda {\frac {\alpha (1+\alpha )}{\beta ^{2}}}=:\sigma ^{2}\mu ^{p}.\end{aligned}}}The mapping of parameters Tweedie parameter μ , σ 2 , p {\displaystyle \mu ,\sigma ^{2},p}
λ = μ 2 − p ( 2 − p ) σ 2 , α = 2 − p p − 1 , β = μ 1 − p ( p − 1 ) σ 2 . {\displaystyle {\begin{aligned}\lambda &={\frac {\mu ^{2-p}}{(2-p)\sigma ^{2}}},\\\alpha &={\frac {2-p}{p-1}},\\\beta &={\frac {\mu ^{1-p}}{(p-1)\sigma ^{2}}}.\end{aligned}}} to the Poisson and Gamma parameters λ , α , β {\displaystyle \lambda ,\alpha ,\beta } is the following:A compound Poisson process with rate λ > 0 {\displaystyle \lambda >0} and jump size distribution G is a continuous-time stochastic process { Y ( t ) : t ≥ 0 } {\displaystyle \{\,Y(t):t\geq 0\,\}} given by
Y ( t ) = ∑ i = 1 N ( t ) D i , {\displaystyle Y(t)=\sum _{i=1}^{N(t)}D_{i},}where the sum is by convention equal to zero as long as N(t) = 0. Here, { N ( t ) : t ≥ 0 } {\displaystyle \{\,N(t):t\geq 0\,\}} Poisson process with rate λ {\displaystyle \lambda } , and { D i : i ≥ 1 } {\displaystyle \{\,D_{i}:i\geq 1\,\}} are independent and identically distributed random variables, with distribution function G, which are also independent of { N ( t ) : t ≥ 0 } . {\displaystyle \{\,N(t):t\geq 0\,\}.\,}
is aFor the discrete version of compound Poisson process, it can be used in survival analysis for the frailty models.
A compound Poisson distribution, in which the summands have an exponential distribution, was used by Revfeim to model the distribution of the total rainfall in a day, where each day contains a Poisson-distributed number of events each of which provides an amount of rainfall which has an exponential distribution. Thompson applied the same model to monthly total rainfalls.
There have been applications to insurance claims and x-ray computed tomography.