
MA 575 – Fall 2022. Final Exam

Some useful formulas

• The Gaussian distribution N(µ, σ²), µ ∈ ℝ, σ² > 0, has pdf f(x) = (1/√(2πσ²)) exp(−(x − µ)²/(2σ²)), x ∈ ℝ. If X ~ N(µ, σ²), we have E(X) = µ, Var(X) = σ².

• Throughout the exam we consider a multiple linear regression model

y = Xβ + ε,  ε ~ N(0, σ²Iₙ),  (1)

where y ∈ ℝⁿ and X ∈ ℝⁿˣᵖ. The parameters of the model are β ∈ ℝᵖ and σ² > 0, with true values β⋆ and σ⋆² respectively. Throughout we assume that the model includes an intercept.

• (Woodbury identity) Let A ∈ ℝᵐˣᵐ be invertible, and u, v ∈ ℝᵐ be such that 1 + v'A⁻¹u ≠ 0. Then A + uv' is invertible, and

(A + uv')⁻¹ = A⁻¹ − (A⁻¹uv'A⁻¹)/(1 + v'A⁻¹u).
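As a quick sanity check (not part of the exam), the rank-one form of the identity above can be verified numerically; the matrix and vectors below are arbitrary illustrative values.

```python
import numpy as np

# Numeric check of the rank-one (Sherman-Morrison) form stated above:
# (A + u v')^{-1} = A^{-1} - (A^{-1} u v' A^{-1}) / (1 + v' A^{-1} u)
rng = np.random.default_rng(0)
m = 4
A = rng.normal(size=(m, m)) + m * np.eye(m)   # well-conditioned, invertible
u = rng.normal(size=(m, 1))
v = rng.normal(size=(m, 1))

A_inv = np.linalg.inv(A)
denom = 1.0 + (v.T @ A_inv @ u).item()        # must be nonzero
update = (A_inv @ u @ v.T @ A_inv) / denom
lhs = np.linalg.inv(A + u @ v.T)
rhs = A_inv - update
print(np.allclose(lhs, rhs))                  # True
```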

• We recall also that if A = [A11 A12; A21 A22] ∈ ℝ²ˣ² and det(A) = A11A22 − A12A21 ≠ 0, then A is invertible and

A⁻¹ = (1/det(A)) [A22 −A12; −A21 A11].

Problem 1: Consider the linear regression model given in (1).

a. (1pt) TRUE or FALSE: the model assumes that the components of y are independent with the same distribution.

b. (1pt) TRUE or FALSE: the model is not applicable when the explanatory variables are not continuous.

c. (1pt) If β̂ denotes the least squares estimate of β in model (1), use y, X, β̂ to express the vector of fitted values ŷ and its covariance matrix.

d. (2pt) Consider model (1) with p = 2 (simple linear model). Let x = (x1, ..., xn)' denote the unique explanatory variable of the model (recall that the model contains an intercept). Let xc = (x1 − x̄, ..., xn − x̄)' and 1 = (1, ..., 1)' ∈ ℝⁿ, where x̄ = Σⁿᵢ₌₁ xᵢ/n. Use a projection argument to show that the vector of fitted values ŷ of the model can be written as

ŷ = β̃0 1 + β̃1 xc.

Give the expressions of β̃0 and β̃1.
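The projection argument here can be illustrated numerically (a sketch, not part of the exam): since 1 and the centered vector xc are orthogonal, the projection of y onto span{1, xc} splits into two one-dimensional projections, which reproduce the fitted values of the full design. The data below are simulated for illustration only.

```python
import numpy as np

# Since 1 and x_c are orthogonal, projecting y onto span{1, x_c}
# decomposes into two separate 1-D projections.
rng = np.random.default_rng(1)
n = 50
x = rng.normal(loc=2.0, scale=1.5, size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)

ones = np.ones(n)
xc = x - x.mean()                       # centered explanatory variable

# Fitted values from the design matrix [1, x]:
X = np.column_stack([ones, x])
yhat = X @ np.linalg.lstsq(X, y, rcond=None)[0]

# Same fitted values from the orthogonal decomposition:
b0 = y.mean()                           # coefficient on the 1 vector
b1 = (xc @ y) / (xc @ xc)               # coefficient on x_c
print(np.allclose(yhat, b0 * ones + b1 * xc))   # True
```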

e. (1pt) Consider again the case p = 2, and assume that Σⁿᵢ₌₁ xᵢ = 0. Find Var(β̂0) and Var(β̂1).

Problem 2: Consider the linear regression model given in (1).

a. (2pts) The least squares estimator of β is the β̂ that minimizes the function β ↦ ‖y − Xβ‖². Give the expression of β̂ and the expression of an unbiased estimator σ̂² of σ².
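A minimal numeric sketch (not part of the exam), assuming the standard closed forms β̂ = (X'X)⁻¹X'y and σ̂² = ‖y − Xβ̂‖²/(n − p), with simulated data:

```python
import numpy as np

# Sketch of the standard closed forms (assuming X has full column rank):
# beta_hat = (X'X)^{-1} X'y  and  sigma2_hat = ||y - X beta_hat||^2 / (n - p).
rng = np.random.default_rng(2)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
sigma2_hat = (resid @ resid) / (n - p)   # divisor n - p makes it unbiased

# Agrees with the generic least squares solver:
print(np.allclose(beta_hat, np.linalg.lstsq(X, y, rcond=None)[0]))  # True
```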

b. (2pts) Under the assumptions of the model, what are the distributions of β̂ and σ̂²?

c. (2pts) Suppose that we modify model (1) to y = Xβ + ε, where ε ~ N(0, σ²Ω⁻¹) for a known symmetric positive definite matrix Ω ∈ ℝⁿˣⁿ. In that case we estimate β by the β̌ that minimizes the function β ↦ (y − Xβ)'Ω(y − Xβ). Give the expression of β̌ and the expression of an unbiased estimator σ̌² of σ² in this model.
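A hedged sketch of the weighted criterion above (not part of the exam), assuming the minimizer takes the generalized least squares form β̌ = (X'ΩX)⁻¹X'Ωy; the diagonal Ω and simulated data are illustrative choices only.

```python
import numpy as np

# Sketch of the weighted (generalized) least squares estimator for the
# modified model, with Omega a known n x n symmetric positive definite matrix.
rng = np.random.default_rng(3)
n, p = 100, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Omega = np.diag(rng.uniform(0.5, 2.0, size=n))   # simple SPD choice
beta_true = np.array([0.3, 1.7])

# errors with covariance proportional to Omega^{-1}
eps = rng.normal(size=n) / np.sqrt(np.diag(Omega))
y = X @ beta_true + eps

XtO = X.T @ Omega
beta_check = np.linalg.solve(XtO @ X, XtO @ y)   # (X' Omega X)^{-1} X' Omega y

# The natural variance estimate uses the Omega-weighted residual norm:
r = y - X @ beta_check
sigma2_check = (r @ Omega @ r) / (n - p)
print(beta_check)
```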

Problem 3: Let Y = (Y1, Y2, Y3)' be a Gaussian random vector with mean and covariance matrix given by

a. (1pts) Answer TRUE or FALSE: the variables Y1, Y2, Y3 as given are iid.

b. (1pts) Answer TRUE or FALSE: the variables Y1, Y3 are independent.

c. (1pts) Give the expression of the probability density function (pdf) of 2Y1.

d. (1pts) Let Z = Y1² + (Y2 − 1)². Find E(Z).

e. (2pts) Find the expectation and the covariance matrix of

Problem 4: We consider the linear regression model in (1). Consider a sub-model y = X1β1 + ε, where X1 ∈ ℝⁿˣᵖ¹ is a sub-matrix of X that collects only p1 of the p columns of X. Let X2 ∈ ℝⁿˣᵖ² be the remaining columns of X, with p1 + p2 = p. We partition the true value β⋆ accordingly as (β⋆,1ᵀ, β⋆,2ᵀ)ᵀ. The AIC of the sub-model is

AIC1 = n log(‖y − X1β̂1‖²/n) + 2p1,

with a similar expression for the full model.

a. (1pts) Answer TRUE or FALSE: in general, adding more explanatory variables to a linear model tends to produce fitted values with high biases, whereas removing many explanatory variables from the model tends to produce fitted values with high variances.

b. (1pts) Answer TRUE or FALSE: in general, when comparing models, the AIC and the R² typically yield the same conclusion.

c. (1pts) Show that, in the set-up described at the beginning, if β⋆,2 = 0, then we have E(‖y − X1β̂1‖²) = σ⋆²(n − p1).

d. (1pts) By looking at the derivative of the function log(1 − x) + x, show that −x − x² ≤ log(1 − x) ≤ −x for all x ∈ [0, 1/2).

e. (2pts) In the specific set up described at the beginning, use the above to show that when n is larger than p, and β?,2 = 0, the smaller model is typically preferred according to the AIC criterion.
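The phenomenon in part e can be illustrated by simulation (a sketch, not part of the exam), assuming the common Gaussian-likelihood form AIC = n log(RSS/n) + 2p: when the extra coefficients are truly zero, the sub-model usually attains the smaller AIC.

```python
import numpy as np

# Simulation sketch: compare AIC of the true sub-model against a full
# model padded with irrelevant columns (beta_{*,2} = 0).
rng = np.random.default_rng(4)

def aic(y, X):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    rss = np.sum((y - X @ beta) ** 2)
    n, p = X.shape
    return n * np.log(rss / n) + 2 * p

n, p1, p2 = 100, 3, 4
wins = 0
for _ in range(200):
    X1 = np.column_stack([np.ones(n), rng.normal(size=(n, p1 - 1))])
    X2 = rng.normal(size=(n, p2))            # irrelevant columns
    y = X1 @ np.array([1.0, -1.0, 0.5]) + rng.normal(size=n)
    if aic(y, X1) < aic(y, np.column_stack([X1, X2])):
        wins += 1
print(wins)   # typically well above 100 out of 200 replications
```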

Problem 5: We consider the linear regression model in (1). Let y(i) ∈ ℝⁿ⁻¹ be the vector of responses obtained after removing the i-th response. Let X(i) ∈ ℝ⁽ⁿ⁻¹⁾ˣᵖ be the explanatory matrix obtained after removing the i-th row of X, which we denote xi. Let β̂(i) be the least squares estimate in the model y(i) = X(i)β + ε(i).

(a) (1pts) The leverage of the i-th observation is hi = xi(X'X)⁻¹xi'. Answer TRUE or FALSE: a small value of hi means that the i-th observation is likely an outlier in the x-space.

(b) (1pts) Let ε̂ denote the residuals of the model. Use the fact that Var(ε̂) = σ⋆²(In − H) to show that 0 ≤ hi ≤ 1 for all i.

(c) (2pts) Suppose that p = 2 (simple linear model). Let (x1, ..., xn)' denote the unique explanatory variable of the model (recall that the model contains an intercept). We set x̄ = Σⁿᵢ₌₁ xᵢ/n. Show that in this case the leverage of the i-th observation can be written as

hi = 1/n + (xi − x̄)² / Σⁿⱼ₌₁ (xⱼ − x̄)².
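The simple-regression leverage formula can be checked numerically against the diagonal of the hat matrix (a sketch, not part of the exam; the data are simulated):

```python
import numpy as np

# Compare h_i = 1/n + (x_i - xbar)^2 / sum_j (x_j - xbar)^2
# with the diagonal of H = X (X'X)^{-1} X' for an intercept + slope design.
rng = np.random.default_rng(5)
n = 30
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

H = X @ np.linalg.solve(X.T @ X, X.T)
h_diag = np.diag(H)

xc = x - x.mean()
h_formula = 1.0 / n + xc ** 2 / np.sum(xc ** 2)
print(np.allclose(h_diag, h_formula))   # True
```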

(d) (2pts) In the general set-up above, the i-th studentized residual is defined as

ti = ε̂(i) / (σ̂(i) √(1 + xi(X(i)'X(i))⁻¹xi')),

where ε̂(i) = yi − xiβ̂(i) and σ̂(i)² = ‖y(i) − X(i)β̂(i)‖²/(n − p − 1). Use the relation β̂(i) = β̂ − (ε̂i/(1 − hi)) (X'X)⁻¹xi' to show that

ti = ε̂i / (σ̂(i) √(1 − hi)).
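The leave-one-out relation used in part (d) can be verified numerically (a sketch, not part of the exam), assuming the standard deletion formula β̂(i) = β̂ − (ε̂i/(1 − hi))(X'X)⁻¹xi':

```python
import numpy as np

# Check that refitting without observation i matches the rank-one
# downdate of beta_hat via the deletion formula.
rng = np.random.default_rng(6)
n, p = 40, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
H = X @ XtX_inv @ X.T
e = y - X @ beta_hat                     # residuals

i = 7                                    # any index works
Xi = np.delete(X, i, axis=0)             # X with row i removed
yi = np.delete(y, i)
beta_loo = np.linalg.lstsq(Xi, yi, rcond=None)[0]   # refit without obs i

shortcut = beta_hat - (e[i] / (1.0 - H[i, i])) * (XtX_inv @ X[i])
print(np.allclose(beta_loo, shortcut))   # True
```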




