Introductory Econometrics International Edition 5th Edition by Jeffrey M. Wooldridge - Test Bank

Instant Download – Complete Test Bank With Answers

 

 

Sample Questions Are Posted Below

 

Chapter 5

1. Which of the following statements is true?
a. The standard error of a regression, σ̂, is not an unbiased estimator of σ, the standard deviation of the error, u, in a multiple regression model.
b. In time series regressions, OLS estimators are always unbiased.
c. Almost all economists agree that unbiasedness is a minimal requirement for an estimator in regression analysis.
d. All estimators in a regression model that are consistent are also unbiased.

 

Answer: a

Difficulty: Moderate

Bloom’s: Knowledge

A-Head: Consistency

BUSPROG:

Feedback: The standard error of a regression is not an unbiased estimator for the standard deviation of the error in a multiple regression model.

 

2. If β̂j, an unbiased estimator of βj, is consistent, then the:
a. distribution of β̂j becomes more and more loosely distributed around βj as the sample size grows.
b. distribution of β̂j becomes more and more tightly distributed around βj as the sample size grows.
c. distribution of β̂j tends toward a standard normal distribution as the sample size grows.
d. distribution of β̂j remains unaffected as the sample size grows.

 

Answer: b

Difficulty: Moderate

Bloom’s: Knowledge

A-Head: Consistency

BUSPROG:

Feedback: If β̂j, an unbiased estimator of βj, is consistent, then the distribution of β̂j becomes more and more tightly distributed around βj as the sample size grows.
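As a quick illustration (a simulation sketch, not part of the test bank; the model y = 1 + 2x + u is hypothetical), the sampling distribution of the OLS slope can be seen to tighten around the true β as the sample size grows:

```python
import numpy as np

# Simulation sketch (hypothetical model, not from the test bank):
# for y = 1 + 2*x + u, draw many samples at each sample size and record
# the OLS slope; its sampling distribution tightens around beta = 2.
rng = np.random.default_rng(0)
BETA = 2.0

def ols_slope(n):
    x = rng.normal(size=n)
    u = rng.normal(size=n)
    y = 1.0 + BETA * x + u
    # OLS slope estimate: sample Cov(x, y) / sample Var(x)
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

spread_small = np.std([ols_slope(50) for _ in range(500)])
spread_large = np.std([ols_slope(5000) for _ in range(500)])
print(spread_small, spread_large)  # the second spread is much smaller
```

The spread of the estimates at n = 5000 is roughly a tenth of the spread at n = 50, which is the "more and more tightly distributed" behavior the question describes.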

 

3. If β̂j, an unbiased estimator of βj, is also a consistent estimator of βj, then when the sample size tends to infinity:
a. the distribution of β̂j collapses to a single value of zero.
b. the distribution of β̂j diverges away from a single value of zero.
c. the distribution of β̂j collapses to the single point βj.
d. the distribution of β̂j diverges away from βj.

 

Answer: c

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Consistency

BUSPROG:

Feedback: If β̂j, an unbiased estimator of βj, is also a consistent estimator of βj, then when the sample size tends to infinity the distribution of β̂j collapses to the single point βj.

 

4. In a multiple regression model, the OLS estimator is consistent if:
a. there is no correlation between the dependent variable and the error term.
b. there is perfect correlation between the dependent variable and the error term.
c. the sample size is less than the number of parameters in the model.
d. there is no correlation between the independent variables and the error term.

 

Answer: d

Difficulty: Moderate

Bloom’s: Knowledge

A-Head: Consistency

BUSPROG:

Feedback: In a multiple regression model, the OLS estimator is consistent if there is no correlation between the explanatory variables and the error term.

 

5. If the error term is correlated with any of the independent variables, the OLS estimators are:
a. biased and consistent.
b. unbiased and inconsistent.
c. biased and inconsistent.
d. unbiased and consistent.

 

Answer: c

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Consistency

BUSPROG:

Feedback: If the error term is correlated with any of the independent variables, then the OLS estimators are biased and inconsistent.
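To see this concretely, here is a minimal simulation (the data-generating process is an assumption for illustration, not from the text) in which the regressor shares a component with the error term; even with a very large sample, the OLS slope stays away from the true value of 2, showing both bias and inconsistency:

```python
import numpy as np

# Minimal sketch (assumed data-generating process, not from the text):
# the regressor x shares a component z with the error u, so Cov(x, u) > 0.
rng = np.random.default_rng(1)
n = 200_000                          # large n: the inconsistency does not vanish
z = rng.normal(size=n)
u = 0.5 * z + rng.normal(size=n)     # error contains z
x = z + rng.normal(size=n)           # regressor contains z too
y = 1.0 + 2.0 * x + u                # true slope is 2

slope = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
# plim(slope) = 2 + Cov(x, u)/Var(x) = 2 + 0.5/2 = 2.25, not 2
print(slope)
```

Increasing n further only makes the estimate settle more firmly on 2.25 rather than on the true slope, which is exactly what inconsistency means.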

 

6. If δ1 = Cov(x1, x2)/Var(x1), where x1 and x2 are two independent variables in a regression equation, which of the following statements is true?
a. If x2 has a positive partial effect on the dependent variable, and δ1 > 0, then the inconsistency in the simple regression slope estimator associated with x1 is negative.
b. If x2 has a positive partial effect on the dependent variable, and δ1 > 0, then the inconsistency in the simple regression slope estimator associated with x1 is positive.
c. If x1 has a positive partial effect on the dependent variable, and δ1 > 0, then the inconsistency in the simple regression slope estimator associated with x1 is negative.
d. If x1 has a positive partial effect on the dependent variable, and δ1 > 0, then the inconsistency in the simple regression slope estimator associated with x1 is positive.

 

Answer: b

Difficulty: Moderate

Bloom’s: Knowledge

A-Head: Consistency

BUSPROG:

Feedback: Given that δ1 = Cov(x1, x2)/Var(x1), where x1 and x2 are two independent variables in a regression equation, if x2 has a positive partial effect on the dependent variable, and δ1 > 0, then the inconsistency in the simple regression slope estimator associated with x1 is positive.
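A simulation sketch of this omitted-variable result (the coefficient values are hypothetical, not from the text): with β2 > 0 and δ1 > 0, the simple regression of y on x1 alone converges to β1 + β2·δ1, which overshoots β1:

```python
import numpy as np

# Sketch (hypothetical coefficients, not from the text) of
# plim(short-regression slope) = beta1 + beta2 * delta1,
# where delta1 = Cov(x1, x2) / Var(x1).
rng = np.random.default_rng(2)
n = 200_000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)   # delta1 = 0.6 > 0
beta1, beta2 = 1.0, 3.0              # beta2 > 0: positive partial effect of x2
y = beta1 * x1 + beta2 * x2 + rng.normal(size=n)

# Simple (short) regression of y on x1 alone omits x2.
short_slope = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)
# Expected plim: 1 + 3 * 0.6 = 2.8, so the inconsistency is positive.
print(short_slope)
```

The estimate lands near 2.8 rather than the true β1 = 1, so the inconsistency has the sign of β2·δ1, here positive.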

 

7. If OLS estimators satisfy asymptotic normality, it implies that:
a. they are approximately normally distributed in large enough sample sizes.
b. they are approximately normally distributed in samples with fewer than 10 observations.
c. they have a constant mean equal to zero and variance equal to σ².
d. they have a constant mean equal to one and variance equal to σ.

Answer: a

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: If OLS estimators satisfy asymptotic normality, it implies that they are approximately normally distributed in large enough sample sizes.
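A small simulation (simulated data, not from the text) makes the point: even when the errors are strongly skewed rather than normal, the standardized OLS slope behaves approximately like a standard normal variable once the sample is moderately large:

```python
import numpy as np

# Sketch (simulated data, not from the text): with skewed, non-normal
# errors, the standardized OLS slope is still approximately N(0, 1)
# in moderately large samples.
rng = np.random.default_rng(5)
BETA = 2.0

def standardized_slope(n):
    x = rng.normal(size=n)
    u = rng.exponential(1.0, size=n) - 1.0   # mean 0, variance 1, skewed
    y = 1.0 + BETA * x + u
    b = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
    sst_x = (n - 1) * np.var(x, ddof=1)      # sum of squared deviations of x
    return (b - BETA) * np.sqrt(sst_x)       # sd(b | x) = sigma/sqrt(SST_x), sigma = 1

draws = np.array([standardized_slope(500) for _ in range(2000)])
print(draws.mean(), draws.std())             # roughly 0 and 1
```

The mean near 0 and standard deviation near 1 are what asymptotic normality predicts, despite the exponential (non-normal) error distribution.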

 

8. In a regression model, if the variance of the dependent variable, y, conditional on an explanatory variable, x, or Var(y|x), is not constant, _____.
a. the t statistics are invalid and confidence intervals are valid for small sample sizes
b. the t statistics are valid and confidence intervals are invalid for small sample sizes
c. the t statistics and confidence intervals are valid no matter how large the sample size is
d. the t statistics and confidence intervals are both invalid no matter how large the sample size is

 

Answer: d

Difficulty: Moderate

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: If the variance of the dependent variable conditional on an explanatory variable is not constant, the usual t statistics and confidence intervals are both invalid no matter how large the sample size is.

 

9. If β̂j is an OLS estimator of a regression coefficient associated with one of the explanatory variables, where j = 1, 2, …, k, the asymptotic standard error of β̂j refers to the:
a. estimated variance of β̂j when the error term is normally distributed.
b. estimated variance of β̂j when the error term is not normally distributed.
c. square root of the estimated variance of β̂j when the error term is normally distributed.
d. square root of the estimated variance of β̂j when the error term is not normally distributed.

 

Answer: d

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: The asymptotic standard error refers to the square root of the estimated variance of β̂j when the error term is not normally distributed.

 

10. A useful rule of thumb is that standard errors are expected to shrink at a rate that is the inverse of the:
a. square root of the sample size.
b. product of the sample size and the number of parameters in the model.
c. square of the sample size.
d. sum of the sample size and the number of parameters in the model.

 

Answer: a

Difficulty: Moderate

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: Standard errors can be expected to shrink at a rate that is the inverse of the square root of the sample size.
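A rough numerical check of this rule of thumb (a hypothetical simulated regression, not from the text): since 1/√(4n) = (1/2)·1/√n, quadrupling the sample size should approximately halve the slope's standard error:

```python
import numpy as np

# Rough check (hypothetical simulated regression, not from the text):
# quadrupling n should about halve se(slope), since 1/sqrt(4n) = 0.5/sqrt(n).
rng = np.random.default_rng(3)

def slope_se(n):
    x = rng.normal(size=n)
    u = rng.normal(size=n)
    y = 1.0 + 2.0 * x + u
    b = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
    a = y.mean() - b * x.mean()
    resid = y - a - b * x
    sigma2 = resid @ resid / (n - 2)          # sigma-hat squared
    sst_x = (n - 1) * np.var(x, ddof=1)       # sum of squared deviations of x
    return np.sqrt(sigma2 / sst_x)            # standard error of the slope

ratio = slope_se(1000) / slope_se(4000)
print(ratio)  # close to 2
```

The ratio is close to 2, as the √n rule predicts; it is only approximate in any one sample because σ̂² and SST_x are themselves random.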

 

11. An auxiliary regression refers to a regression that is used:
a. when the dependent variables are qualitative in nature.
b. when the independent variables are qualitative in nature.
c. to compute a test statistic but whose coefficients are not of direct interest.
d. to compute coefficients which are of direct interest in the analysis.

 

Answer: c

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: An auxiliary regression refers to a regression that is used to compute a test statistic but whose coefficients are not of direct interest.

 

12. The n-R-squared statistic is also referred to as the:
a. F statistic.
b. t statistic.
c. z statistic.
d. LM statistic.

 

Answer: d

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference
BUSPROG:

Feedback: The n-R-squared statistic is also referred to as the LM statistic.
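For concreteness, here is a minimal sketch (simulated data; the variable names and model are illustrative assumptions) of computing the LM statistic as n·R² from the auxiliary regression, testing H0: β2 = 0:

```python
import numpy as np

# Minimal sketch (simulated data; names are illustrative) of the LM test of
# H0: beta2 = 0 in y = b0 + b1*x1 + b2*x2 + u, computed as LM = n * R-squared.
rng = np.random.default_rng(4)
n = 1000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)      # H0 is true here (beta2 = 0)

def ols_resid(y, cols):
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Step 1: estimate the restricted model (x2 excluded), keep the residuals.
u_tilde = ols_resid(y, [x1])
# Step 2: auxiliary regression of those residuals on ALL regressors;
# its R-squared measures what the excluded variable still explains.
e = ols_resid(u_tilde, [x1, x2])
r2 = 1.0 - (e @ e) / (u_tilde @ u_tilde)
lm = n * r2   # compare with a chi-square(1) critical value (3.84 at 5%)
print(lm)
```

Only the restricted model is estimated directly; the auxiliary regression's coefficients are never reported, which is exactly the sense in which it is "auxiliary."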

 

13. The LM statistic follows a:
a. t distribution.
b. F distribution.
c. χ² distribution.
d. binomial distribution.

 

Answer: c

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: The LM statistic follows a χ² distribution.

 

14. Which of the following statements is true?
a. In large samples there are not many discrepancies between the outcomes of the F test and the LM test.
b. Degrees of freedom of the unrestricted model are necessary for using the LM test.
c. The LM test can be used to test hypotheses with single restrictions only and provides inefficient results for multiple restrictions.
d. The LM statistic is derived on the basis of the normality assumption.

 

Answer: a

Difficulty: Moderate

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: In large samples there are not many discrepancies between the F test and the LM test because, asymptotically, the two statistics have the same probability of a Type I error.

 

15. Which of the following statements is true under the Gauss-Markov assumptions?
a. Among a certain class of estimators, OLS estimators are best linear unbiased, but are asymptotically inefficient.
b. Among a certain class of estimators, OLS estimators are biased but asymptotically efficient.
c. Among a certain class of estimators, OLS estimators are best linear unbiased and asymptotically efficient.
d. The LM test is independent of the Gauss-Markov assumptions.

 

Answer: c

Difficulty: Moderate

Bloom’s: Knowledge

A-Head: Asymptotic Efficiency of OLS

BUSPROG:

Feedback: Under the Gauss-Markov assumptions, among a certain class of estimators, OLS estimators are best linear unbiased and asymptotically efficient.

 

16. If the variance of an independent variable in a regression model, say x1, is greater than 0, or Var(x1) > 0, the inconsistency in β̂1 (the estimator associated with x1) is negative if x1 and the error term are positively related.

 

Answer: False

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Consistency

BUSPROG:

Feedback: If the variance of an independent variable, say x1, is greater than 0, the inconsistency in β̂1 (the estimator associated with x1) is positive if x1 and the error term are positively related.

 

17. Even if the error terms in a regression equation, u1, u2, …, un, are not normally distributed, the estimated coefficients can be normally distributed.

 

Answer: False

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: If the error terms in a regression equation, u1, u2, …, un, are not normally distributed, the estimated coefficients (which are linear functions of the errors) cannot be exactly normally distributed.

 

18. A normally distributed random variable is symmetrically distributed about its mean, it can take on any positive or negative value (but with zero probability of any particular value), and more than 95% of the area under the distribution is within two standard deviations of the mean.

 

Answer: True

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: A normally distributed random variable is symmetrically distributed about its mean, it can take on any positive or negative value (but with zero probability of any particular value), and more than 95% of the area under the distribution is within two standard deviations of the mean.

 

19. The F statistic is also referred to as the score statistic.

 

Answer: False

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference
BUSPROG:

Feedback: The LM statistic is also referred to as the score statistic.

 

20. The LM statistic requires estimation of the unrestricted model only.

 

Answer: False

Difficulty: Easy

Bloom’s: Knowledge

A-Head: Asymptotic Normality and Large Sample Inference

BUSPROG:

Feedback: The LM statistic requires estimation of the restricted model only.
