Hi,
The question I need to answer is: estimate Lambda for a Poisson distribution, where X1, X2, ..., Xn are iid, using the:
a) Method of Moments.
b) Maximum Likelihood Estimator.
Are the estimators unbiased? Are they the best estimators for Lambda?
I'm really having trouble understanding how these two methods work; I just don't see what's going on. I've read so much about them, though, that I think I at least have the answers: for both the method of moments and the MLE, the estimator is (1/n) * (sum from i=1 to n of x_i), i.e. the sample mean. Both are unbiased, and they are efficient estimators.
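At least for the unbiasedness part, I think I can check it directly, assuming the estimator really is the sample mean and using the fact that E[X_i] = Lambda for a Poisson:

$$
E[\hat{\lambda}] = E\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right]
= \frac{1}{n}\sum_{i=1}^{n} E[X_i]
= \frac{1}{n}\, n\lambda = \lambda
$$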
But I really want to understand what's going on here, so I was wondering if someone could explain how each of these is derived? From what I've gathered, both can be found from the Poisson pmf: P(X = x) = [Lambda^x * e^(-Lambda)] / x!
I'm not sure how to derive them though.
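Here's as far as I can get on my own (I'm not sure whether this is the right way to set it up, so please correct me if it's wrong). For the method of moments I match the first population moment to the first sample moment, and for the MLE I write down the likelihood from the pmf above and maximize its log:

$$
\text{MoM:}\quad E[X] = \lambda \;\;\Rightarrow\;\; \hat{\lambda}_{MM} = \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i
$$

$$
\text{MLE:}\quad L(\lambda) = \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!},
\qquad
\ell(\lambda) = \log L(\lambda) = \left(\sum_{i=1}^{n} x_i\right)\log\lambda \;-\; n\lambda \;-\; \sum_{i=1}^{n}\log(x_i!)
$$

$$
\frac{d\ell}{d\lambda} = \frac{\sum_{i=1}^{n} x_i}{\lambda} - n = 0
\;\;\Rightarrow\;\;
\hat{\lambda}_{MLE} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}
$$

If that's right, both methods give the sample mean, which would match the answer I wrote above. I've seen the Cramér-Rao lower bound mentioned in connection with the "best estimator" part, but I don't know how to apply it here, so any explanation would be appreciated.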