Score
The score is defined as the partial derivative of the log-likelihood function with respect to the parameter:

$$
\frac{\partial}{\partial\theta}\log f(X;\theta)
$$

Under certain regularity conditions, we can show that the expectation of the score, evaluated at the true parameter, equals 0.
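A short derivation of this fact, assuming the regularity conditions allow interchanging differentiation and integration:

$$
\mathbb{E}_\theta\!\left[\frac{\partial}{\partial\theta}\log f(X;\theta)\right]
= \int \frac{\frac{\partial}{\partial\theta} f(x;\theta)}{f(x;\theta)}\, f(x;\theta)\,dx
= \frac{\partial}{\partial\theta}\int f(x;\theta)\,dx
= \frac{\partial}{\partial\theta} 1
= 0.
$$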
Fisher information is defined to be the variance of the score:

$$
I(\theta) := \operatorname{Var}_\theta\!\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right) = -\mathbb{E}_\theta\!\left[\frac{\partial^2}{\partial\theta^2}\log f(X;\theta)\right]
$$

where the second equality also relies on the regularity conditions above.
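As a concrete illustration (a standard example, not from the original notes), take a single observation $X \sim \mathrm{Bernoulli}(p)$ with $f(x;p) = p^x (1-p)^{1-x}$:

$$
\frac{\partial}{\partial p}\log f(X;p) = \frac{X}{p} - \frac{1-X}{1-p},
\qquad
I(p) = -\mathbb{E}_p\!\left[-\frac{X}{p^2} - \frac{1-X}{(1-p)^2}\right]
= \frac{1}{p} + \frac{1}{1-p}
= \frac{1}{p(1-p)}.
$$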
In matrix form, the Fisher information matrix for $k$ parameters is a $k \times k$ matrix whose $(i,j)$ entry is

$$
I(\theta)_{ij} := \operatorname{Cov}_\theta\!\left(\frac{\partial}{\partial\theta_i}\log f(X;\theta),\ \frac{\partial}{\partial\theta_j}\log f(X;\theta)\right) = -\mathbb{E}_\theta\!\left[\frac{\partial^2}{\partial\theta_i\,\partial\theta_j}\log f(X;\theta)\right]
$$
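For a multi-parameter illustration (again a standard example, added here for concreteness), take a single observation $X \sim \mathcal{N}(\mu, \sigma^2)$ with $\theta = (\mu, \sigma^2)$. The resulting $2 \times 2$ Fisher information matrix is

$$
I(\mu, \sigma^2) =
\begin{pmatrix}
\dfrac{1}{\sigma^2} & 0 \\[4pt]
0 & \dfrac{1}{2\sigma^4}
\end{pmatrix},
$$

where the off-diagonal entries vanish because $\mathbb{E}[X - \mu] = 0$.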