Notation | Pois ( λ ) ∧ λ π ( λ ) {\displaystyle \operatorname {Pois} (\lambda )\,{\underset {\lambda }{\wedge }}\,\pi (\lambda )}
---|---
Parameters | λ ∈ ( 0 , ∞ ) {\displaystyle \lambda \in (0,\infty )}
Support | k ∈ N 0 {\displaystyle k\in \mathbb {N} _{0}}
PMF | ∫ 0 ∞ λ k k ! e − λ π ( λ ) d λ {\displaystyle \int \limits _{0}^{\infty }{\frac {\lambda ^{k}}{k!}}e^{-\lambda }\,\,\pi (\lambda )\,\mathrm {d} \lambda }
Mean | ∫ 0 ∞ λ π ( λ ) d λ {\displaystyle \int \limits _{0}^{\infty }\lambda \,\,\pi (\lambda )\,d\lambda }
Variance | ∫ 0 ∞ ( λ + ( λ − μ π ) 2 ) π ( λ ) d λ {\displaystyle \int \limits _{0}^{\infty }(\lambda +(\lambda -\mu _{\pi })^{2})\,\,\pi (\lambda )\,d\lambda }
Skewness | ( μ π + σ π 2 ) − 3 / 2 [ ∫ 0 ∞ ( λ − μ π ) 3 π ( λ ) d λ + 3 σ π 2 + μ π ] {\displaystyle {\Bigl (}\mu _{\pi }+\sigma _{\pi }^{2}{\Bigr )}^{-3/2}\left[\int \limits _{0}^{\infty }(\lambda -\mu _{\pi })^{3}\,\pi (\lambda )\,\mathrm {d} \lambda +3\sigma _{\pi }^{2}+\mu _{\pi }\right]}
MGF | M π ( e t − 1 ) {\displaystyle M_{\pi }(e^{t}-1)} , with M π {\displaystyle M_{\pi }} the MGF of π
CF | M π ( e i t − 1 ) {\displaystyle M_{\pi }(e^{it}-1)}
PGF | M π ( z − 1 ) {\displaystyle M_{\pi }(z-1)}
A mixed Poisson distribution is a univariate discrete probability distribution in probability theory. It results from assuming that the conditional distribution of a random variable, given the value of the rate parameter, is a Poisson distribution, and that the rate parameter itself is a random variable. Hence it is a special case of a compound probability distribution. Mixed Poisson distributions are used in actuarial mathematics as a general approach to modelling the distribution of the number of claims, and are also examined as epidemiological models. They should not be confused with the compound Poisson distribution or the compound Poisson process.
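The hierarchical construction described above — first draw a rate Λ from the mixing density π, then draw X from a Poisson distribution with that rate — can be sketched in pure Python. The exponential mixing density with mean β = 1 is an illustrative choice, not fixed by the text; for it, the mixed distribution has mean μ_π = 1 and variance μ_π + σ_π² = 2:

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's multiplication method; adequate for the small rates used here."""
    target = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= target:
            return k
        k += 1

def sample_mixed_poisson(rng, beta=1.0):
    lam = rng.expovariate(1.0 / beta)   # step 1: random rate from the mixing density
    return sample_poisson(lam, rng)     # step 2: Poisson count given that rate

rng = random.Random(42)
n = 100_000
xs = [sample_mixed_poisson(rng) for _ in range(n)]
mean = sum(xs) / n
var = sum(x * x for x in xs) / n - mean ** 2
print(mean, var)   # mean near mu_pi = 1, variance near mu_pi + sigma_pi^2 = 2
```

The sample variance exceeding the sample mean illustrates the overdispersion that distinguishes a mixed Poisson distribution from a plain Poisson distribution.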
A random variable X follows a mixed Poisson distribution with mixing density π(λ) if it has the probability distribution
P ( X = k ) = ∫ 0 ∞ λ k k ! e − λ π ( λ ) d λ . {\displaystyle \operatorname {P} (X=k)=\int _{0}^{\infty }{\frac {\lambda ^{k}}{k!}}e^{-\lambda }\,\,\pi (\lambda )\,\mathrm {d} \lambda .}
If we denote the probabilities of the Poisson distribution by qλ(k), then
P ( X = k ) = ∫ 0 ∞ q λ ( k ) π ( λ ) d λ . {\displaystyle \operatorname {P} (X=k)=\int _{0}^{\infty }q_{\lambda }(k)\,\,\pi (\lambda )\,\mathrm {d} \lambda .}
In the following, let μ π = ∫ 0 ∞ λ π ( λ ) d λ {\displaystyle \mu _{\pi }=\int \limits _{0}^{\infty }\lambda \,\,\pi (\lambda )\,d\lambda \,} be the expected value of the density π ( λ ) {\displaystyle \pi (\lambda )\,} and σ π 2 = ∫ 0 ∞ ( λ − μ π ) 2 π ( λ ) d λ {\displaystyle \sigma _{\pi }^{2}=\int \limits _{0}^{\infty }(\lambda -\mu _{\pi })^{2}\,\,\pi (\lambda )\,d\lambda \,} be the variance of the density.
The expected value of the mixed Poisson distribution is
E ( X ) = μ π . {\displaystyle \operatorname {E} (X)=\mu _{\pi }.}
For the variance one gets
Var ( X ) = μ π + σ π 2 . {\displaystyle \operatorname {Var} (X)=\mu _{\pi }+\sigma _{\pi }^{2}.}
The skewness can be represented as
v ( X ) = ( μ π + σ π 2 ) − 3 / 2 [ ∫ 0 ∞ ( λ − μ π ) 3 π ( λ ) d λ + 3 σ π 2 + μ π ] . {\displaystyle \operatorname {v} (X)={\Bigl (}\mu _{\pi }+\sigma _{\pi }^{2}{\Bigr )}^{-3/2}\left[\int \limits _{0}^{\infty }(\lambda -\mu _{\pi })^{3}\,\pi (\lambda )\,\mathrm {d} \lambda +3\sigma _{\pi }^{2}+\mu _{\pi }\right].}
The characteristic function has the form
φ X ( s ) = M π ( e i s − 1 ) . {\displaystyle \varphi _{X}(s)=M_{\pi }(e^{is}-1).\,}
where M π {\displaystyle M_{\pi }} is the moment-generating function of the density π.
For the probability generating function, one obtains
m X ( s ) = M π ( s − 1 ) . {\displaystyle m_{X}(s)=M_{\pi }(s-1).\,}
The moment-generating function of the mixed Poisson distribution is
M X ( s ) = M π ( e s − 1 ) . {\displaystyle M_{X}(s)=M_{\pi }(e^{s}-1).\,}
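The defining integral and the moment identities E(X) = μ_π and Var(X) = μ_π + σ_π² can be checked numerically. A minimal sketch using composite Simpson quadrature; the uniform mixing density on [0, 2] (so μ_π = 1 and σ_π² = 1/3) is an illustrative assumption, not taken from the text:

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule on [a, b] with an even number n of subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Mixing density pi: uniform on [0, 2], so mu_pi = 1 and sigma_pi^2 = 1/3.
def pi(lam):
    return 0.5

def pmf(k):
    """P(X = k) = integral of lam^k / k! * exp(-lam) * pi(lam) d lam."""
    return simpson(lambda lam: lam ** k / math.factorial(k)
                   * math.exp(-lam) * pi(lam), 0.0, 2.0)

mean = sum(k * pmf(k) for k in range(40))     # should approach mu_pi = 1
second = sum(k * k * pmf(k) for k in range(40))
var = second - mean ** 2                      # should approach mu_pi + sigma_pi^2 = 4/3
print(mean, var)
```

Truncating the sum at k = 40 is harmless here because the Poisson tail beyond 40 is negligible for rates λ ≤ 2.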
Theorem — Compounding a Poisson distribution with rate parameter distributed according to a gamma distribution yields a negative binomial distribution.
Proof. Let π ( λ ) = ( p 1 − p ) r Γ ( r ) λ r − 1 e − p 1 − p λ {\displaystyle \pi (\lambda )={\frac {({\frac {p}{1-p}})^{r}}{\Gamma (r)}}\lambda ^{r-1}e^{-{\frac {p}{1-p}}\lambda }} be the density of a Γ ( r , p 1 − p ) {\displaystyle \operatorname {\Gamma } \left(r,{\frac {p}{1-p}}\right)} distributed random variable. Then
P ( X = k ) = 1 k ! ∫ 0 ∞ λ k e − λ ( p 1 − p ) r Γ ( r ) λ r − 1 e − p 1 − p λ d λ = p r ( 1 − p ) − r Γ ( r ) k ! ∫ 0 ∞ λ k + r − 1 e − λ 1 1 − p d λ = p r ( 1 − p ) − r Γ ( r ) k ! ( 1 − p ) k + r ∫ 0 ∞ λ k + r − 1 e − λ d λ ⏟ = Γ ( r + k ) = Γ ( r + k ) Γ ( r ) k ! ( 1 − p ) k p r {\displaystyle {\begin{aligned}\operatorname {P} (X=k)&={\frac {1}{k!}}\int _{0}^{\infty }\lambda ^{k}e^{-\lambda }{\frac {({\frac {p}{1-p}})^{r}}{\Gamma (r)}}\lambda ^{r-1}e^{-{\frac {p}{1-p}}\lambda }\,\mathrm {d} \lambda \\&={\frac {p^{r}(1-p)^{-r}}{\Gamma (r)k!}}\int _{0}^{\infty }\lambda ^{k+r-1}e^{-\lambda {\frac {1}{1-p}}}\,\mathrm {d} \lambda \\&={\frac {p^{r}(1-p)^{-r}}{\Gamma (r)k!}}(1-p)^{k+r}\underbrace {\int _{0}^{\infty }\lambda ^{k+r-1}e^{-\lambda }\,\mathrm {d} \lambda } _{=\Gamma (r+k)}\\&={\frac {\Gamma (r+k)}{\Gamma (r)k!}}(1-p)^{k}p^{r}\end{aligned}}}
Therefore we get X ∼ NegB ( r , p ) . {\displaystyle X\sim \operatorname {NegB} (r,p).}
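The theorem can be verified numerically by integrating the Poisson PMF against the gamma density and comparing with the closed-form negative binomial PMF from the proof. The parameter values r = 3, p = 0.4 and the Simpson quadrature are illustrative assumptions:

```python
import math

def simpson(f, a, b, n=4000):
    """Composite Simpson's rule on [a, b] with an even number n of subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

r, p = 3.0, 0.4          # illustrative shape r and success probability p
rate = p / (1 - p)       # gamma rate parameter p/(1-p) from the proof

def gamma_density(lam):
    return rate ** r / math.gamma(r) * lam ** (r - 1) * math.exp(-rate * lam)

def mixed_pmf(k):
    """P(X = k) by numerical integration of the mixed Poisson integral."""
    return simpson(lambda lam: lam ** k / math.factorial(k)
                   * math.exp(-lam) * gamma_density(lam), 0.0, 60.0)

def negbin_pmf(k):
    """NegB(r, p) PMF: Gamma(r+k) / (Gamma(r) k!) * (1-p)^k * p^r."""
    return (math.gamma(r + k) / (math.gamma(r) * math.factorial(k))
            * (1 - p) ** k * p ** r)

for k in range(6):
    print(k, mixed_pmf(k), negbin_pmf(k))   # the two columns agree
```

The upper integration limit 60 is safe because the integrand decays like e^(−λ/(1−p)) and is negligible well before that point for these parameters.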
Theorem — Compounding a Poisson distribution with rate parameter distributed according to an exponential distribution yields a geometric distribution.
Proof. Let π ( λ ) = 1 β e − λ β {\displaystyle \pi (\lambda )={\frac {1}{\beta }}e^{-{\frac {\lambda }{\beta }}}} be the density of an E x p ( 1 β ) {\displaystyle \mathrm {Exp} \left({\frac {1}{\beta }}\right)} distributed random variable. Integrating by parts k times yields:
P ( X = k ) = 1 k ! ∫ 0 ∞ λ k e − λ 1 β e − λ β d λ = 1 k ! β ∫ 0 ∞ λ k e − λ ( 1 + β β ) d λ = 1 k ! β ⋅ k ! ( β 1 + β ) k ∫ 0 ∞ e − λ ( 1 + β β ) d λ = ( β 1 + β ) k ( 1 1 + β ) {\displaystyle {\begin{aligned}\operatorname {P} (X=k)&={\frac {1}{k!}}\int \limits _{0}^{\infty }\lambda ^{k}e^{-\lambda }{\frac {1}{\beta }}e^{-{\frac {\lambda }{\beta }}}\,\mathrm {d} \lambda \\&={\frac {1}{k!\beta }}\int \limits _{0}^{\infty }\lambda ^{k}e^{-\lambda \left({\frac {1+\beta }{\beta }}\right)}\,\mathrm {d} \lambda \\&={\frac {1}{k!\beta }}\cdot k!\left({\frac {\beta }{1+\beta }}\right)^{k}\int \limits _{0}^{\infty }e^{-\lambda \left({\frac {1+\beta }{\beta }}\right)}\,\mathrm {d} \lambda \\&=\left({\frac {\beta }{1+\beta }}\right)^{k}\left({\frac {1}{1+\beta }}\right)\end{aligned}}}
Therefore we get X ∼ G e o ( 1 1 + β ) . {\displaystyle X\sim \operatorname {Geo} \left({\frac {1}{1+\beta }}\right).}
mixing distribution | mixed Poisson distribution |
---|---|
gamma | negative binomial |
exponential | geometric |
inverse Gaussian | Sichel |
Poisson | Neyman |
generalized inverse Gaussian | Poisson-generalized inverse Gaussian |
generalized gamma | Poisson-generalized gamma |
generalized Pareto | Poisson-generalized Pareto |
inverse-gamma | Poisson-inverse gamma |
log-normal | Poisson-log-normal |
Lomax | Poisson–Lomax |
Pareto | Poisson–Pareto |
Pearson’s family of distributions | Poisson–Pearson family |
truncated normal | Poisson-truncated normal |
uniform | Poisson-uniform |
shifted gamma | Delaporte |
beta with specific parameter values | Yule |