Machine Learning: MCQs Set – 02

Q11: A multiple regression model has:

  • (A) Only one independent variable
  • (B) More than one independent variable
  • (C) More than one dependent variable
  • (D) None of the above
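To make the answer concrete, here is a minimal sketch of a multiple regression fit with two independent variables; the data and coefficients below are synthetic, made up for the example:

```python
import numpy as np

# Multiple regression: MORE THAN ONE independent variable (here x1 and x2).
# Synthetic, noiseless data so the true coefficients are recovered exactly.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))             # two independent variables
y = 3.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1]   # one dependent variable

# Add an intercept column and solve the least squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # ≈ [3.0, 2.0, -1.5]
```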

Q12: Consider a binary classification problem. Suppose I trained a model on a linearly separable training set, and I now have a new labeled data point that the model classifies correctly and that lies far from the decision boundary. If I add this point to my previous training set and retrain, in which of the following cases is the learned decision boundary likely to change? When the trained model is:

  • (A) Perceptron and logistic regression
  • (B) Logistic regression and Gaussian discriminant analysis
  • (C) Support vector machine
  • (D) Perceptron

Q13: The average squared difference between the classifier's predicted output and the actual output is called:

  • (A) mean squared error
  • (B) root mean squared error
  • (C) mean absolute error
  • (D) mean relative error

Q14:

  • (A) It can be applied to non-differentiable functions.
  • (B) It can be applied to non-continuous functions
  • (C) It is easy to implement
  • (D) It runs reasonably fast for multiple linear regression
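The mean squared error from Q13 can be computed directly; the values below are illustrative:

```python
import numpy as np

# MSE: the average squared difference between predictions and actual outputs.
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5,  0.0, 2.0, 8.0])

mse = np.mean((y_pred - y_true) ** 2)
print(mse)  # 0.375
```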

Q15: Suppose you have discovered multicollinear features. Which of the following actions would you take next? (1) Remove both collinear variables. (2) Instead of removing both variables, remove only one of them. (3) Removing correlated variables may cause information loss; to retain such variables, we can use penalized regression models such as ridge or lasso regression.

  • (A) Only 1
  • (B) Only 2
  • (C) Either 1 or 3
  • (D) Either 2 or 3
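Option 3 can be sketched with a closed-form ridge fit on synthetic, nearly collinear data (the data, the penalty `lam`, and the coefficients are all made up for illustration):

```python
import numpy as np

# Ridge regression keeps both collinear variables, shrinking their
# coefficients instead of forcing us to drop one.
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=1e-3, size=200)   # nearly collinear with x1
X = np.column_stack([x1, x2])
y = x1 + x2                                  # true coefficients: [1, 1]

lam = 1.0                                    # ridge penalty (illustrative)
ridge_w = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)
print(ridge_w)  # both coefficients stay near 1; neither variable is discarded
```

The `lam * np.eye(2)` term keeps the system well-conditioned even though `X.T @ X` is nearly singular, which is exactly why penalized regression tolerates collinear features.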

Q16: What is a dead unit in a neural network?

  • (A) A unit that is never updated during training by any of its neighbours
  • (B) A unit that does not respond to any of the training patterns
  • (C) The unit that produces the largest sum-squared error
  • (D) None of these
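A dead unit can be shown with a tiny winner-take-all sketch: a unit whose weights start far from all training patterns never wins the competition, so it is never updated. The data, initial weights, and learning rate are all illustrative:

```python
import numpy as np

# Competitive (winner-take-all) learning: only the closest unit is updated.
rng = np.random.default_rng(2)
data = rng.normal(size=(500, 2))              # patterns clustered near the origin
weights = np.array([[0.1, 0.0],
                    [-0.1, 0.0],
                    [50.0, 50.0]])            # unit 2 starts far from all data

update_counts = np.zeros(3, dtype=int)
lr = 0.1
for x in data:
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))
    weights[winner] += lr * (x - weights[winner])   # only the winner moves
    update_counts[winner] += 1

print(update_counts)  # unit 2 never wins, so it is a "dead" unit
```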

Q17: Which of the following approaches is capable of producing zero training error on every linearly separable dataset?

  • (A) Decision tree
  • (B) 15-nearest neighbors
  • (C) Hard-margin SVM
  • (D) Perceptron
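The perceptron case can be demonstrated directly: on a linearly separable dataset, the perceptron learning rule is guaranteed to converge to a separator, i.e. zero training error. The dataset below is synthetic, with a margin enforced so convergence is quick:

```python
import numpy as np

# Perceptron on linearly separable data: train until no mistakes remain.
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 2))
X = X[np.abs(X[:, 0] + X[:, 1]) > 0.5][:100]   # keep a clear margin
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)     # separable by the line x1 + x2 = 0

w, b = np.zeros(2), 0.0
for _ in range(1000):                          # epochs; stops early once converged
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:             # misclassified (or on the boundary)
            w, b = w + yi * xi, b + yi
            mistakes += 1
    if mistakes == 0:
        break

train_error = np.mean(np.sign(X @ w + b) != y)
print(train_error)  # 0.0
```

A hard-margin SVM reaches zero training error on the same data by construction; 15-nearest neighbors does not come with such a guarantee, since a point's 15 neighbors can be dominated by the opposite class near the boundary.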

Q18: A least squares regression of weight (y, in pounds) on height (x, in inches) yielded the least squares line y = 120 + 5x. If height increases by one inch, by how much should the predicted weight increase?

  • (A) increase by 1 pound
  • (B) increase by 5 pounds
  • (C) increase by 125 pounds
  • (D) None of the above
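The slope is what answers this: in y = 120 + 5x, each extra inch of height adds 5 pounds to the predicted weight, as a one-line check shows:

```python
# Fitted line from Q18: y = 120 + 5x (x in inches, y in pounds).
def predicted_weight(height):
    return 120 + 5 * height

# Increasing height by one inch changes the prediction by the slope, 5.
print(predicted_weight(61) - predicted_weight(60))  # 5
```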

Q19: Let A and B be events on the same sample space, with P (A) = 0.6 and P (B) = 0.7. Can these two events be disjoint?

  • (A) Yes
  • (B) No
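The reasoning behind the answer: disjoint (mutually exclusive) events satisfy P(A or B) = P(A) + P(B), and no probability can exceed 1.

```python
# If A and B were disjoint, P(A or B) = P(A) + P(B) would have to be <= 1.
p_a, p_b = 0.6, 0.7
total = p_a + p_b
print(round(total, 2))  # 1.3 > 1, so A and B cannot be disjoint
```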

Q20: A roulette wheel has 38 slots: 18 red, 18 black, and 2 green. You play five games, always betting on red. What is the expected number of games you will win?

  • (A) 1.1165
  • (B) 2.3684
  • (C) 2.6316
  • (D) 4.7368
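The arithmetic behind the answer: each game is an independent trial with win probability 18/38, so the expected number of wins in five games is n × p.

```python
# Expected wins = number of games * probability of red on each spin.
n, p = 5, 18 / 38
expected_wins = n * p
print(round(expected_wins, 4))  # 2.3684
```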

Answers:

  • Q11: B
  • Q12: B
  • Q13: A
  • Q14: D
  • Q15: D
  • Q16: A
  • Q17: C, D
  • Q18: B
  • Q19: B
  • Q20: B


