25 Lecture 24, March 08, 2024
25.1 Proof of the moments of exponential distribution
There are several ways to derive the expectation and variance of $X\sim\mathrm{Exp}(\theta)$; here we use the Gamma function $\Gamma(n)=\int_0^\infty y^{n-1}e^{-y}\,dy=(n-1)!$. We will show that $E(X)=\theta$ and $\mathrm{Var}(X)=\theta^2$.
Let $X\sim\mathrm{Exp}(\theta)$. We use the change of variable $y=x/\theta$, so that $x=\theta y$ and $dx=\theta\,dy$:
\[
E[X]=\int_0^\infty x\cdot\frac{1}{\theta}e^{-x/\theta}\,dx
\;\overset{y=x/\theta}{=}\;\theta\underbrace{\int_0^\infty y\,e^{-y}\,dy}_{=\Gamma(2)}
=\theta\,\Gamma(2)=\theta\cdot 1!=\theta,
\]
and similarly
\[
E[X^2]=\int_0^\infty x^2\cdot\frac{1}{\theta}e^{-x/\theta}\,dx
\;\overset{y=x/\theta}{=}\;\theta^2\underbrace{\int_0^\infty y^{3-1}e^{-y}\,dy}_{=\Gamma(3)}
=\theta^2\,\Gamma(3)=\theta^2\cdot 2!=2\theta^2,
\]
so that
\[
\mathrm{Var}(X)=E[X^2]-E[X]^2=2\theta^2-\theta^2=\theta^2.
\]
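As a quick numerical sanity check (not part of the lecture), the moments above can be verified by simulation. The sketch below uses NumPy, assuming $\theta$ is the mean/scale parameter as in these notes (which matches NumPy's `scale` argument):

```python
import numpy as np

# Simulate X ~ Exp(theta), where theta is the mean (scale) parameter,
# and compare the sample mean and variance with theta and theta^2.
rng = np.random.default_rng(0)
theta = 2.0
x = rng.exponential(scale=theta, size=1_000_000)

print(x.mean())  # should be close to theta   = 2
print(x.var())   # should be close to theta^2 = 4
```

With $10^6$ samples, both estimates agree with $\theta=2$ and $\theta^2=4$ to within Monte Carlo error.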
25.1.1 Memoryless property of exponential distribution
Theorem 25.1 (Memoryless property) If $X\sim\mathrm{Exp}(\theta)$, then $P(X>s+t\mid X>s)=P(X>t)$ for all $s,t\ge 0$.
- We have seen the memoryless property for the geometric distribution $\mathrm{Geo}(p)$ earlier (and the geometric distribution is the only discrete distribution with this property).
- Conversely, if a continuous random variable has the memoryless property, it must follow an exponential distribution.
- Intuitively, both the geometric and exponential distributions measure the waiting time until the first success.
Proof. Recall that the cdf of $X\sim\mathrm{Exp}(\theta)$ is
\[
F(x)=P(X\le x)=\int_{-\infty}^{x}f(t)\,dt=\int_0^{x}\theta^{-1}e^{-t/\theta}\,dt=1-e^{-x/\theta}
\]
for $x>0$, and $F(x)=0$ otherwise. Hence, for $s,t\ge 0$,
\[
P(X>s+t\mid X>s)=\frac{P(X>s+t\text{ and }X>s)}{P(X>s)}=\frac{P(X>s+t)}{P(X>s)}=\frac{e^{-(s+t)/\theta}}{e^{-s/\theta}}=e^{-t/\theta}=1-F(t)=P(X>t),
\]
as desired.
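The memoryless property can also be checked empirically. The sketch below (an illustration, not from the lecture) estimates both sides of the identity by Monte Carlo, using arbitrarily chosen values $\theta=2$, $s=1$, $t=1.5$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, s, t = 2.0, 1.0, 1.5
x = rng.exponential(scale=theta, size=1_000_000)

# Left side: P(X > s+t | X > s), as a conditional relative frequency.
cond = (x > s + t).sum() / (x > s).sum()
# Right side: P(X > t), estimated directly.
uncond = (x > t).mean()

print(cond, uncond)  # both should be close to exp(-t/theta)
```

Both frequencies agree with $e^{-t/\theta}=e^{-0.75}\approx 0.472$ up to sampling noise, consistent with the proof above.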