
Get Reliable Computational Statistics Homework Help

Are you overwhelmed by your computational statistics homework? Do you wish for a helping hand to ease your academic burden? Statistics Homework Helper is the perfect platform to visit when you need reliable computational statistics homework help. Our comprehensive service is provided by immensely experienced and adept computational statistics homework helpers. Place your order with us today and secure top grades in your homework.

Computational Statistics

Complete the problems either by hand or using the computer and upload your final document to the Blackboard course site. All final submittals are to be in PDF form. Please document any code used to solve the problems and include it with your submission.
1. The following data are an i.i.d. sample from a Normal(θ, 1) distribution: 28, 33, 22, 35, 31. We wish to estimate θ by minimizing residuals. We will use the L2 norm squared as our metric.
(a) What is the function sp(θ) that we wish to minimize?
(b) Graph sp(θ).
(c) Find the Minimum Residual Estimator for θ using the Bisection Method, correct to 2 decimal places (a minimal bisection sketch follows this problem).
(d) If we were to use Newton's Method to solve this optimization problem, what would the refinement increment h^(t) be?
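For parts (c) and (d), a minimal Python sketch of one possible approach is shown below. It assumes the objective sp(θ) is the sum of squared residuals Σ(xi - θ)^2, so the minimizer can be found by bisecting on the sign change of its derivative; the starting bracket [20, 40] and the tolerance are illustrative choices rather than part of the assignment.

# Illustrative sketch: minimize sp(theta) = sum((x_i - theta)^2) by bisection
# on its derivative sp'(theta) = -2 * sum(x_i - theta). The bracket and the
# tolerance below are assumptions made for this example.
data = [28, 33, 22, 35, 31]

def sp(theta):
    # Sum of squared residuals (the squared-L2-norm objective).
    return sum((x - theta) ** 2 for x in data)

def sp_prime(theta):
    # Derivative of sp; its root is the minimum-residual estimate.
    return -2 * sum(x - theta for x in data)

lo, hi = 20.0, 40.0           # assumed bracket containing the sign change
while hi - lo > 1e-4:         # tight enough for 2-decimal accuracy
    mid = (lo + hi) / 2
    if sp_prime(lo) * sp_prime(mid) <= 0:
        hi = mid
    else:
        lo = mid

print("bisection estimate:", round((lo + hi) / 2, 2))

# For part (d), Newton's Method applied to this same objective would use the
# refinement increment h = -sp'(theta) / sp''(theta), where sp''(theta) = 2n.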

Newton's Method

2. Maximize the function f(x) = -x^4 + x^2 - x + 2 using Newton's Method and the starting values below. For each starting value, state the number of iterations Newton's Method takes to converge if we want our solution to be correct to 2 decimal places (a short Newton iteration sketch follows part (b)).
(a) x0 = 1
(b) x0 = 2
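A minimal Newton's Method sketch for Problem 2 is given below. It takes the polynomial as stated above, f(x) = -x^4 + x^2 - x + 2, and treats the stopping rule and iteration cap as illustrative assumptions rather than the assignment's exact specification.

# Illustrative Newton's Method for maximizing the polynomial
# f(x) = -x^4 + x^2 - x + 2. Each step applies x_new = x + h with the
# refinement increment h = -f'(x) / f''(x).
def f_prime(x):
    return -4 * x**3 + 2 * x - 1

def f_double_prime(x):
    return -12 * x**2 + 2

def newton(x0, tol=0.005, max_iter=100):
    # Iterate until successive values agree to within tol (2-decimal accuracy).
    x = x0
    for i in range(1, max_iter + 1):
        step = -f_prime(x) / f_double_prime(x)
        x_new = x + step
        if abs(step) < tol:
            return x_new, i
        x = x_new
    return x, max_iter

for x0 in (1.0, 2.0):
    estimate, iterations = newton(x0)
    print(f"x0 = {x0}: stopped at {estimate:.2f} after {iterations} iterations")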
3. Problem 2.1. For part (e), you only need to discuss your results; you do not need to reapply the methods to a random data set.

Poisson and Exponential Distribution

4. In each of the following, assume a random sample of size n, x1, x2, ..., xn, and find the Maximum Likelihood Estimator for:
(a) λ for the Poisson distribution.
Note: These are examples of distributions for which the MLE can be found analytically in terms of the data x1, ..., xn, and so no advanced computational methods are required. A brief numerical check is sketched below.
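For Problem 4(a), the analytic answer can be cross-checked numerically. The sketch below uses made-up count data (not part of the assignment), maximizes the Poisson log-likelihood with scipy.optimize.minimize_scalar, and compares the result to the sample mean, which is the standard closed-form Poisson MLE.

# Illustrative check of the Poisson MLE on assumed example data.
# Up to a term that does not involve lam, the log-likelihood is
# sum(x) * log(lam) - n * lam.
import numpy as np
from scipy.optimize import minimize_scalar

x = np.array([3, 1, 4, 2, 2, 5, 0, 3])   # assumed example counts

def neg_log_lik(lam):
    return -(x.sum() * np.log(lam) - len(x) * lam)

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 20), method="bounded")
print("numerical MLE:", round(res.x, 4))
print("sample mean  :", round(float(x.mean()), 4))   # analytic MLE: lam_hat = x_bar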

Independent Bernoulli Trials

5. Consider a sequence of n independent Bernoulli trials in which the probability of success is θ and the probability of failure is 1 - θ. If A represents the observed number of successes and B represents the observed number of failures (with A + B = n), then find I(θ), the Fisher information matrix. (Hint: Recall that the sum of n Bernoulli trials is a Binomial random variable. Also assume that n, A, and B are fixed, so the only unknown parameter is θ, in which case I(θ) will be a scalar.)
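As a companion to Problem 5, the short sketch below checks the curvature of the Binomial log-likelihood A log(θ) + B log(1 - θ) numerically: it compares a finite-difference second derivative with the closed-form observed information A/θ^2 + B/(1 - θ)^2. The counts A and B and the evaluation point are assumed values for illustration only, and the sketch does not claim to be the assignment's final expected-information answer.

# Illustrative curvature check for the Binomial log-likelihood. A, B, theta0,
# and the step size h are assumed values chosen for this demonstration.
import math

A, B = 7, 3        # assumed observed successes and failures (n = A + B = 10)
theta0 = 0.6       # assumed evaluation point in (0, 1)
h = 1e-4           # finite-difference step

def log_lik(theta):
    # Binomial log-likelihood up to an additive constant not involving theta.
    return A * math.log(theta) + B * math.log(1 - theta)

# Observed information: the negative second derivative of the log-likelihood.
numeric = -(log_lik(theta0 + h) - 2 * log_lik(theta0) + log_lik(theta0 - h)) / h**2
closed_form = A / theta0**2 + B / (1 - theta0)**2

print("finite-difference curvature:", round(numeric, 4))
print("closed-form curvature      :", round(closed_form, 4))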