Mathematical Statistics Anna Janicka
Lecture V, 18.03.2019
PROPERTIES OF ESTIMATORS, PART I
Plan for today
1. Maximum likelihood estimation examples – cont.
2. Basic estimator properties:
estimator bias
unbiased estimators
3. Measures of quality: comparing estimators
mean square error
incomparable estimators
minimum-variance unbiased estimator
MLE – Example 1.
Quality control, cont. We maximize
  L(θ) = C(n,x) · θ^x · (1−θ)^(n−x),
or equivalently maximize
  l(θ) = ln L(θ) = ln C(n,x) + x ln θ + (n−x) ln(1−θ),
i.e. solve
  l'(θ) = x/θ − (n−x)/(1−θ) = 0.
Solution: θ̂_ML = x/n.
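The closed-form solution can be cross-checked numerically. A minimal sketch (not from the lecture), assuming NumPy and hypothetical counts of n = 50 inspected items with x = 12 defective:

    import numpy as np

    n, x = 50, 12                            # hypothetical quality-control data
    theta = np.linspace(0.001, 0.999, 9999)
    # log-likelihood up to the constant ln C(n,x), which does not affect the maximizer
    loglik = x * np.log(theta) + (n - x) * np.log(1 - theta)
    print(theta[np.argmax(loglik)])          # ~0.24, the grid point closest to the maximum
    print(x / n)                             # 0.24, the closed-form MLE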
MLE – Example 3.
Normal model: X_1, X_2, ..., X_n are a sample from N(μ, σ²). μ, σ unknown.
We maximize the log-likelihood
  l(μ, σ) = ln Π_i (1/(σ√(2π))) exp(−(x_i − μ)²/(2σ²)) = −(n/2) ln(2π) − n ln σ − (1/(2σ²)) Σ_i (x_i − μ)²,
so we solve
  ∂l/∂μ = (1/σ²) Σ_i (x_i − μ) = 0,
  ∂l/∂σ = −n/σ + (1/σ³) Σ_i (x_i − μ)² = 0.
We get:
  μ̂_ML = X̄,   σ̂²_ML = (1/n) Σ_i (X_i − X̄)².
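As a sanity check, the closed-form MLEs can be computed directly from a simulated sample. A sketch assuming NumPy; the parameter values are hypothetical:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(loc=2.0, scale=3.0, size=1000)   # sample from N(2, 9)

    mu_hat = x.mean()                          # MLE of mu: the sample mean
    sigma2_hat = ((x - mu_hat) ** 2).mean()    # MLE of sigma^2: divides by n, not n-1
    print(mu_hat, sigma2_hat)                  # close to 2 and 9
    print(x.var(ddof=0), x.var(ddof=1))        # ddof=0 reproduces the MLE, ddof=1 gives S^2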
Estimator properties
Aren't the errors too large? Do we estimate what we want?
θ̂(X) is supposed to approximate θ.
In general: ĝ(X) is to approximate g(θ).
What do we want? Small error. But:
  errors are random variables (the data are random variables)
    → we can only control their expected value;
  the error depends on the unknown θ
    → we can't do anything about that...
Estimator bias
If θ̂(X) is an estimator of θ:
  the bias of the estimator is equal to b(θ) = E_θ θ̂(X) − θ.
If ĝ(X) is an estimator of g(θ):
  the bias of the estimator is equal to b(θ) = E_θ ĝ(X) − g(θ).
θ̂ / ĝ is unbiased if b(θ) = 0 for all θ ∈ Θ.
Other notations are also used, e.g. B_θ(ĝ).
The normal model: reminder
Normal model: X_1, X_2, ..., X_n are a sample from the distribution N(μ, σ²). μ, σ unknown.
Theorem. In the normal model, X̄ and S² are independent random variables such that
  X̄ ~ N(μ, σ²/n),   (n−1)S²/σ² ~ χ²_(n−1).
In particular:
  E_{μ,σ} X̄ = μ,   Var_{μ,σ} X̄ = σ²/n,
  E_{μ,σ} S² = σ²,  Var_{μ,σ} S² = 2σ⁴/(n−1).
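The stated moments can be verified by Monte Carlo simulation. A sketch assuming NumPy; μ, σ, n and the number of repetitions are hypothetical:

    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, n, reps = 1.0, 2.0, 10, 200_000
    samples = rng.normal(mu, sigma, size=(reps, n))
    xbar = samples.mean(axis=1)
    s2 = samples.var(axis=1, ddof=1)           # S^2 with the 1/(n-1) factor

    print(xbar.mean(), xbar.var())             # ~ mu = 1,       ~ sigma^2/n = 0.4
    print(s2.mean(), s2.var())                 # ~ sigma^2 = 4,  ~ 2*sigma^4/(n-1) ≈ 3.56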
Estimator bias – Example 1
In a normal model (in fact, in any model with unknown mean μ):
μ̂ = X̄ is an unbiased estimator of μ:
  E_{μ,σ} X̄ = E_{μ,σ} (1/n Σ_i X_i) = (1/n) Σ_i E_{μ,σ} X_i = (1/n) · nμ = μ.
μ̂_1 = X_1 is an unbiased estimator of μ:
  E_{μ,σ} μ̂_1(X) = E_{μ,σ} X_1 = μ.
μ̂_2 = 5 is biased:
  E_{μ,σ} μ̂_2 = E_{μ,σ} 5 = 5 ≠ μ for μ ≠ 5,
  bias: b(μ) = 5 − μ.
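A simulation makes the three cases visible. A sketch assuming NumPy and hypothetical μ = 3, σ = 1.5:

    import numpy as np

    rng = np.random.default_rng(2)
    mu, sigma, n, reps = 3.0, 1.5, 20, 100_000
    samples = rng.normal(mu, sigma, size=(reps, n))

    print(samples.mean(axis=1).mean())   # X-bar: ~3.0 (unbiased)
    print(samples[:, 0].mean())          # X_1:   ~3.0 (unbiased)
    print(5.0)                           # mu_2 = 5 ignores the data; bias 5 - mu = 2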
Estimator bias – Example 1 cont.
σ̂² = (1/n) Σ_i (X_i − X̄)² is a biased estimator of σ²:
  E_{μ,σ} σ̂² = (1/n) E_{μ,σ} (Σ_i X_i² − n X̄²) = (1/n) (n(σ² + μ²) − n(σ²/n + μ²)) = ((n−1)/n) σ² ≠ σ².
S² = (1/(n−1)) Σ_i (X_i − X̄)² is an unbiased estimator of σ²:
  E_{μ,σ} S² = (1/(n−1)) E_{μ,σ} (Σ_i X_i² − n X̄²) = (1/(n−1)) (n(σ² + μ²) − n(σ²/n + μ²)) = (1/(n−1)) (n−1) σ² = σ².
(This holds not only in the normal model.)
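The two variance estimators can be compared on simulated data. A sketch assuming NumPy; σ² = 4 and n = 5 are hypothetical:

    import numpy as np

    rng = np.random.default_rng(3)
    sigma2, n, reps = 4.0, 5, 200_000
    samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

    sigma2_hat = samples.var(axis=1, ddof=0)   # divides by n
    s2 = samples.var(axis=1, ddof=1)           # divides by n - 1

    print(sigma2_hat.mean())   # ~ (n-1)/n * sigma^2 = 3.2 (underestimates on average)
    print(s2.mean())           # ~ sigma^2 = 4.0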
Estimator bias – Example 1 cont. (2)
The bias of the estimator σ̂² = (1/n) Σ_i (X_i − X̄)² is equal to
  b(σ²) = E_{μ,σ} σ̂² − σ² = −σ²/n
(for any distribution with a finite variance).
For n → ∞ the bias tends to 0, so this estimator is also OK for large samples.
Asymptotically unbiased estimator
An estimator ĝ(X) of g(θ) is asymptotically unbiased if
  ∀ θ ∈ Θ:  lim_{n→∞} b_n(θ) = 0.
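For σ̂² the bias −σ²/n indeed shrinks as the sample grows. A simulation sketch assuming NumPy, with hypothetical σ² = 4:

    import numpy as np

    rng = np.random.default_rng(4)
    sigma2, reps = 4.0, 200_000
    for n in (2, 5, 20, 100):
        samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
        simulated_bias = samples.var(axis=1, ddof=0).mean() - sigma2
        print(n, simulated_bias, -sigma2 / n)   # simulated vs. theoretical bias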
How to compare estimators?
We want to minimize the error of the estimator; the estimator which makes smaller mistakes is better.
The error may be either positive or negative, so usually we look at the square of the error (the mean squared difference between the estimator and the estimated value).
Mean Square Error
If θ̂(X) is an estimator of θ:
  the Mean Square Error of the estimator is the function MSE(θ, θ̂) = E_θ (θ̂(X) − θ)².
If ĝ(X) is an estimator of g(θ):
  the MSE of the estimator is the function MSE(θ, ĝ) = E_θ (ĝ(X) − g(θ))².
We will only consider the MSE. Other measures are also possible (e.g. with the absolute value instead of the square).
Properties of the MSE
We have:
  MSE(θ, ĝ) = Var_θ ĝ(X) + b(θ)²
(this follows by writing ĝ(X) − g(θ) = (ĝ(X) − E_θ ĝ(X)) + b(θ) and expanding the square; the cross term has expectation zero).
For unbiased estimators, the MSE is equal to the variance of the estimator.
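The decomposition can be checked numerically on a deliberately biased toy estimator (X_1 + 1, not one from the lecture). A sketch assuming NumPy:

    import numpy as np

    rng = np.random.default_rng(5)
    mu, sigma, reps = 3.0, 1.5, 500_000
    x1 = rng.normal(mu, sigma, size=reps)
    est = x1 + 1.0                               # estimator of mu with bias +1 by construction

    print(((est - mu) ** 2).mean())              # MSE, ~ sigma^2 + 1 = 3.25
    print(est.var() + (est.mean() - mu) ** 2)    # Var + bias^2, approximately the same value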
MSE – Example 1
X_1, X_2, ..., X_n are a sample from a distribution with mean μ and variance σ². μ, σ unknown.
MSE of μ̂ = X̄ (unbiased):
  MSE(μ, σ, X̄) = E_{μ,σ} (X̄ − μ)² = Var_{μ,σ} X̄ = σ²/n.
MSE of μ̂_1 = X_1 (unbiased):
  MSE(μ, σ, μ̂_1) = E_{μ,σ} (X_1 − μ)² = Var_{μ,σ} X_1 = σ².
MSE of μ̂_2 = 5 (biased):
  MSE(μ, σ, μ̂_2) = E_{μ,σ} (5 − μ)² = (5 − μ)².
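The three MSEs match a simulation (here with a normal sample as one instance of a distribution with mean μ and variance σ²). A sketch assuming NumPy and hypothetical μ = 3, σ = 2, n = 10:

    import numpy as np

    rng = np.random.default_rng(6)
    mu, sigma, n, reps = 3.0, 2.0, 10, 200_000
    samples = rng.normal(mu, sigma, size=(reps, n))

    print(((samples.mean(axis=1) - mu) ** 2).mean())  # X-bar: ~ sigma^2/n = 0.4
    print(((samples[:, 0] - mu) ** 2).mean())         # X_1:   ~ sigma^2   = 4.0
    print((5.0 - mu) ** 2)                            # constant 5: (5 - mu)^2 = 4.0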
MSE – Example 2 Normal model
MSE of S² = (1/(n−1)) Σ_i (X_i − X̄)²:
  MSE(μ, σ, S²) = E_{μ,σ} (S² − σ²)² = Var_{μ,σ} S² = 2σ⁴/(n−1).
MSE of σ̂² = (1/n) Σ_i (X_i − X̄)²:
  MSE(μ, σ, σ̂²) = Var_{μ,σ} σ̂² + b(σ²)² = ((n−1)/n)² · 2σ⁴/(n−1) + σ⁴/n² = 2(n−1)σ⁴/n² + σ⁴/n² = (2n−1)σ⁴/n².
Therefore MSE(μ, σ, S²) > MSE(μ, σ, σ̂²).
(In other models the comparison works similarly, just with different expressions.)
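A simulation confirms that in the normal model the biased 1/n estimator has the smaller MSE. A sketch assuming NumPy, with hypothetical σ² = 4 and n = 10:

    import numpy as np

    rng = np.random.default_rng(7)
    sigma2, n, reps = 4.0, 10, 300_000
    samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

    mse_s2 = ((samples.var(axis=1, ddof=1) - sigma2) ** 2).mean()
    mse_hat = ((samples.var(axis=1, ddof=0) - sigma2) ** 2).mean()

    print(mse_s2, 2 * sigma2**2 / (n - 1))            # ~ 3.56
    print(mse_hat, (2 * n - 1) * sigma2**2 / n**2)    # ~ 3.04, smaller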
MSE and bias – Example 2.
Poisson model: X_1, X_2, ..., X_n are a sample from a Poisson distribution with unknown parameter θ.
  θ̂_ML = ... = X̄,   b(θ) = 0,
  MSE(θ, X̄) = Var_θ X̄ = (1/n²) Σ_i Var_θ X_i = θ/n.
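A quick check of θ/n by simulation, assuming NumPy and a hypothetical θ = 2.5, n = 20:

    import numpy as np

    rng = np.random.default_rng(8)
    theta, n, reps = 2.5, 20, 200_000
    samples = rng.poisson(theta, size=(reps, n))

    print(((samples.mean(axis=1) - theta) ** 2).mean())  # ~ theta/n = 0.125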
Comparing estimators
ĝ_1(X) is better than (dominates) ĝ_2(X) if
  ∀ θ ∈ Θ:  MSE(θ, ĝ_1) ≤ MSE(θ, ĝ_2),  and
  ∃ θ ∈ Θ:  MSE(θ, ĝ_1) < MSE(θ, ĝ_2).
An estimator is better than another estimator only if its MSE plot never lies above the MSE plot of the other estimator; if the plots intersect, the estimators are incomparable.
Comparing estimators – cont.
A lot of estimators are incomparable →
comparing arbitrary estimators is pointless; we need to constrain the class of estimators.
If we compare two unbiased estimators, the one with the smaller variance is better.
Comparing estimators – Example 1.
In any model:
  From among μ̂ = X̄ and μ̂_1 = X_1, the estimator μ̂ is better (for n > 1).
  μ̂ = X̄ and μ̂_2 = 5 are incomparable, just like μ̂_1 = X_1 and μ̂_2 = 5.
  From among S² and σ̂², the estimator σ̂² is better.
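The incomparability of X̄ and the constant 5 is easy to see by tabulating the two MSE functions over a few hypothetical values of μ (σ² = 4 and n = 10 assumed):

    sigma2, n = 4.0, 10
    for mu in (4.0, 4.8, 5.0, 5.2, 6.0):
        mse_xbar = sigma2 / n           # sigma^2/n, does not depend on mu
        mse_const = (5.0 - mu) ** 2     # zero at mu = 5, grows as mu moves away
        print(mu, mse_xbar, mse_const)  # neither column dominates the other for all mu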
Minimum-variance unbiased estimator
We constrain comparisons to the class of unbiased estimators. In this class, one can usually find the best estimator:
g*(X) is a minimum-variance unbiased estimator (MVUE) for g(θ) if:
  g*(X) is an unbiased estimator of g(θ), and
  for any unbiased estimator ĝ(X) of g(θ) we have Var_θ g*(X) ≤ Var_θ ĝ(X) for all θ ∈ Θ.
How can we check whether an estimator has minimum variance?