Solved Statistical Inference MCQs with Answers

These Statistical Inference MCQs are designed to develop theoretical (mathematical) skills in students at the undergraduate level. The course covers Interval Estimation: pivotal and other methods of finding confidence intervals, confidence intervals in large samples, the shortest confidence interval, the optimum confidence interval, and Bayes interval estimation. It also covers Tests of Hypothesis: simple and composite hypotheses, critical regions, the Neyman-Pearson lemma, power functions, and uniformly most powerful tests.

Statistical Inference MCQs

The process of drawing inferences about the population parameter is called:
A. Statistical Inference
B. Statistical Analysis
C. Both A and B
D. None of these
View Answer

A. Statistical Inference

The number of branches of statistical inference is:
A. Three
B. Two
C. Four
D. Five

View Answer

B. Two

Estimation is a branch of:
A. Statistic
B. Statistical Method
C. Both A and B
D. Statistical Inference
View Answer

D. Statistical Inference

Testing of hypothesis is a branch of:
A. Statistical Method
B. Statistical Inference
C. Both A and B
D. None of these
View Answer

B. Statistical Inference

The process of finding the true but unknown value of the population parameter is called:
A. Statistical Inference
B. Estimation
C. Both A and B
D. None of these
View Answer

B. Estimation

A part of the population is called a:
A. Statistical Inference
B. Statistical Analysis
C. Sample
D. None of these
View Answer

C. Sample

The number of types of estimation is:
A. Two
B. Three
C. One
D. Four
View Answer

A. Two

The formula used to estimate the true but unknown value of the population parameter is called an:
A. Estimation
B. Estimate
C. Estimator
D. None of these
View Answer

C. Estimator

The value obtained by applying an estimator to sample information is known as an:
A. Estimation
B. Estimator
C. Both A&B
D. Estimate
View Answer

D. Estimate
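
To make the distinction concrete, here is a minimal sketch in Python/NumPy (the sample values are made up for illustration): the rule applied to the data is the estimator, and the number it returns is the estimate.

import numpy as np

sample = np.array([4.2, 5.1, 3.8, 4.9, 5.0])  # hypothetical sample data

estimator = np.mean            # the estimator: a rule/formula applied to data
estimate = estimator(sample)   # the estimate: the value the rule produces
print(estimate)                # 4.6 for this particular sample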

A statistic may be an:
A. Estimator
B. Estimate
C. Both A & B
D. None of these
View Answer

C. Both A & B

The properties of a good estimator include:
A. Unbiasedness
B. Sufficiency
C. Consistency
D. All of these
View Answer

D. All of these

The different methods of estimation deal with:
A. Point estimation
B. Interval estimation
C. Both A & B
D. None of these
View Answer

C. Both A & B

If the expected value of an estimator is equal to its respective parameter, then it is called an:
A. Biased estimator
B. Unbiased estimator
C. Estimator
D. None of these
View Answer

B. Unbiased estimator

If the expected value of an estimator is greater than the parameter, then the estimator is called:
A. Unbiasedness
B. Positively Biased
C. Efficiency
D. None of these
View Answer

B. Positively Biased

If the expected value of an estimator is equal to its respective parameter, then this property is known as:
A. Biasedness
B. Estimation
C. Unbiasedness
D. Both B & C
View Answer

C. Unbiasedness

If the expected value of an estimator is less than the parameter, then the estimator is called:
A. Negatively biased
B. Positively biased
C. Only biased
D. None of these
View Answer

A. Negatively biased

If an estimator utilizes all the observations of a sample, then it is called:
A. Positively biased
B. Negatively biased
C. Both A & B
D. None of these
View Answer

D. None of these

The mean square error of an estimator is equal to:
A. Variance + (Bias)²
B. E(x) + (Bias)²
C. (Bias)²
D. Variance
View Answer

A. Variance + (Bias)²
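
The identity MSE = Variance + (Bias)² can be verified numerically. Below is a minimal Python/NumPy sketch (the population, sample size, and replication count are arbitrary assumptions) using the biased variance estimator that divides by n:

import numpy as np

rng = np.random.default_rng(0)
true_var, n, reps = 4.0, 10, 100_000

# Biased estimator of the variance: divides by n rather than n - 1.
estimates = np.array([np.var(rng.normal(0, 2, n)) for _ in range(reps)])

bias = estimates.mean() - true_var
print(np.mean((estimates - true_var) ** 2))   # mean square error
print(estimates.var() + bias ** 2)            # variance + (bias)^2 -- same value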

The Neyman-Fisher factorization theorem is also known as the:
A. Theorem of sufficient estimators
B. Rao-Blackwell theorem
C. Estimator
D. None of these
View Answer

A. Theorem of sufficient estimators

A statistic (estimator) s(x) is sufficient for θ if the conditional density is:
A. Dependent of parameter
B. Equal to parameter
C. Independent of parameter
D. None of these
View Answer

C. Independent of parameter

If the sum of all observations of a sample is sufficient for the population mean, then the sample mean is also:
A. Unbiased
B. Non negative
C. Sufficient
D. None of these
View Answer

C. Sufficient

For sufficiency, the factor h(x) in the factorization does not involve the:
A. Parameter
B. Estimator
C. Both A and B
D. None of these
View Answer

A. Parameter

A set of joint sufficient statistics is said to be minimal if it is a function of every other set of sufficient:
A. Parameter
B. Estimator
C. Statistics
D. h (x)
View Answer

C. Statistics

If the conditional pdf is independent of the parameter, then the statistic is said to be:
A. Efficient
B. Sufficient
C. Estimator
D. None of these
View Answer

B. Sufficient

In the Neyman-Fisher factorization theorem:
A. L(x; θ) = g(S; θ) h(x)
B. L(x; θ) = g(S; θ)
C. L(x; θ) = g(S; θ) f(x)
D. None of these
View Answer

A. L(x; θ) = g(S; θ) h(x)
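
As a standard worked example of the criterion in option A (a textbook illustration, not part of the original question): for a random sample X_1, ..., X_n from a Poisson(θ) population,

L(x;\theta)=\prod_{i=1}^{n}\frac{e^{-\theta}\theta^{x_i}}{x_i!}
=\underbrace{e^{-n\theta}\,\theta^{\sum_i x_i}}_{g(S;\,\theta)}\cdot
\underbrace{\left(\prod_{i=1}^{n}x_i!\right)^{-1}}_{h(x)},
\qquad S=\sum_{i=1}^{n}X_i,

so the factorization shows that S = ΣX_i is sufficient for θ.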

If T is sufficient for θ, then any one-to-one function of T is also:
A. Complete
B. Unbiased
C. Both A and B
D. Sufficient
View Answer

D. Sufficient

If, with an increase in sample size, the estimate becomes closer and closer to the parameter, this property is called:
A. Completeness
B. Unbiasedness
C. Consistency
D. Sufficient
View Answer

C. Consistency
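
Consistency can be seen in a short simulation; the sketch below (Python/NumPy, with an assumed population mean of 7 and standard deviation of 3) shows the sample mean drifting toward the parameter as n grows.

import numpy as np

rng = np.random.default_rng(1)
mu = 7.0   # assumed true population mean

for n in (10, 100, 10_000, 1_000_000):
    xbar = rng.normal(mu, 3, n).mean()
    print(n, abs(xbar - mu))   # the estimation error shrinks as n increases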

If X follows a normal distribution with … = 1, then T = … is an:
A. Complete
B. Unbiased
C. Both A and B
D. Sufficient
View Answer

B. Unbiased

If E[g(t)] = 0 for all values of the parameter implies that g(t) = 0 with probability one, then the statistic “t” is called:
A. Complete
B. Unbiased
C. Both A and B
D. Sufficient
View Answer

A. Complete
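
A standard example behind this answer (a textbook illustration, not taken from the question itself): for a Bernoulli(θ) sample, T = ΣX_i ~ Binomial(n, θ) is complete, because

E_\theta\!\left[g(T)\right]
=\sum_{t=0}^{n} g(t)\binom{n}{t}\theta^{t}(1-\theta)^{n-t}=0
\quad\text{for all } 0<\theta<1

forces a polynomial in θ/(1−θ) to vanish identically, so every coefficient g(t)·C(n, t), and hence g(t) itself, must be zero.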

If X follows a normal distribution ( ), then Var( ) =
A. n /(n+1)²
B. /(n+1)²
C. Both A and B
D. Sufficient
View Answer

A. n /(n+1)²

For random sampling from a normal population, s² is a consistent estimator of the:
A. Population mean
B. Population variance
C. Both A and B
D. Sufficient
View Answer

B. Population variance

Among unbiased estimators, the estimator having minimum variance is called an:
A. Efficient estimator
B. Sufficient
C. Both A and B
D. Consistent
View Answer

A. Efficient estimator

If the prior density is given, then to find an estimate we use:
A. Bayes’ method
B. MLE
C. Both A and B
D. None of these
View Answer

A. Bayes’ method
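
A minimal sketch of the idea in Python (a conjugate Beta prior for a Bernoulli proportion; the prior hyperparameters and the data below are purely illustrative assumptions):

# Beta(a, b) prior for a proportion p, with k successes observed in n trials.
a, b = 2.0, 2.0    # assumed prior hyperparameters
k, n = 14, 20      # assumed sample: 14 successes out of 20

# With a conjugate Beta prior the posterior is Beta(a + k, b + n - k), and the
# Bayes estimator under squared-error loss is the posterior mean.
posterior_mean = (a + k) / (a + b + n)
print(posterior_mean)   # 16/24 ≈ 0.667, pulled slightly toward the prior mean 0.5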

If the population has two parameters, then to find the moment estimates we have to calculate the:
A. First two sample raw moments
B. First sample raw moment
C. Both A and B
D. None of these
View Answer

A. First two sample raw moments
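
For instance, assuming a normal population with parameters μ and σ², the method of moments equates the first two sample raw moments to their population counterparts E[X] = μ and E[X²] = σ² + μ². A minimal Python sketch:

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(5.0, 2.0, 5_000)   # sample from an assumed N(5, 4) population

m1 = np.mean(x)       # first sample raw moment
m2 = np.mean(x**2)    # second sample raw moment

mu_hat = m1                  # from E[X] = mu
sigma2_hat = m2 - m1 ** 2    # from E[X^2] = sigma^2 + mu^2
print(mu_hat, sigma2_hat)    # close to 5 and 4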

Read Also >> Probability Distribution MCQs

In the method of least squares, … =
A. …
B. …
C. Both A and B
D. None of these
View Answer

D. None of these

The Bayes estimator is always a function of the:
A. Minimal sufficient statistic
B. Sufficient statistic
C. Both A and B
D. None of these
View Answer

A. Minimal sufficient statistic

If the number of equations is greater than the number of unknowns, the method used for estimation is:
A. MLE
B. Bayes’ method
C. Least squares method of estimation
D. None of these
View Answer

C. Least squares method of estimation
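
A minimal sketch of the situation in Python/NumPy (the design matrix and responses are invented for illustration): five equations in two unknowns have no exact solution, and least squares picks the coefficients that minimize the sum of squared residuals.

import numpy as np

# Five equations (rows) in two unknowns (intercept and slope).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0],
              [1.0, 5.0]])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # minimizes ||y - A b||^2
print(coef)   # fitted intercept and slope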

If f(x; θ) = 1/θ, 0 ≤ x ≤ θ, then we cannot find the MLE of θ by using:
A. Regular (calculus-based) procedures
B. Bayes’ method
C. Least squares method of estimation
D. None of these
View Answer

A. Regular (calculus-based) procedures
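
The reason, sketched for the density above: the likelihood is

L(\theta)=\prod_{i=1}^{n}\frac{1}{\theta}=\theta^{-n},
\qquad \theta \ge \max(x_1,\dots,x_n),

which is strictly decreasing in θ over its admissible range, so setting the derivative of the log-likelihood to zero yields nothing; the maximum occurs at the boundary, giving the MLE θ̂ = Y_n = max(X_1, ..., X_n).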

For t = …, then … =
A. 2
B. 4
C. 0
D. None of these
View Answer

C. 0

In a uniform distribution with parameter θ, Yₙ (the largest observation) is:
A. Complete
B. Consistent
C. Efficient
D. None of these
View Answer

A. Complete

In the Cramér-Rao inequality, the lower bound on Var(T) is called the:
A. Complete
B. Minimum variance bound
C. Efficient
D. None of these
View Answer

B. Minimum variance bound
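
As a standard illustration (the normal mean with known variance, not taken from the question): the Cramér-Rao inequality gives, for any unbiased estimator T of μ,

\operatorname{Var}(T)\;\ge\;\frac{1}{n\,I(\mu)}
=\frac{1}{n\,E\!\left[\left(\tfrac{\partial}{\partial\mu}\ln f(X;\mu)\right)^{2}\right]}
=\frac{\sigma^{2}}{n},

and since Var(X̄) = σ²/n attains this minimum variance bound, the sample mean is an efficient estimator of μ.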

An estimator is a UMVUE if it is unbiased, sufficient, and:
A. Complete
B. Minimum variance bound
C. Efficient
D. None of these
View Answer

A. Complete

The sample median is a more efficient estimator than the:
A. Complete
B. Sample mean
C. Sample proportion
D. None of these
View Answer

B. Sample mean
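
Relative efficiency depends on the parent population, so the claim is worth checking empirically. A minimal Python/NumPy sketch (sample size and replication count are arbitrary choices) compares the sampling variances of the mean and the median for a normal and for a heavier-tailed Laplace population:

import numpy as np

rng = np.random.default_rng(3)
n, reps = 25, 50_000

for name, sampler in [("normal", lambda: rng.normal(0.0, 1.0, (reps, n))),
                      ("laplace", lambda: rng.laplace(0.0, 1.0, (reps, n)))]:
    samples = sampler()
    print(name,
          samples.mean(axis=1).var(),        # variance of the sample mean
          np.median(samples, axis=1).var())  # variance of the sample median

For the normal population the sample mean has the smaller variance, while for the Laplace population the sample median is the more efficient of the two; the ordering asserted in the question holds for heavy-tailed populations rather than universally.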
