Q211: How many main types of machine learning are there?
(A) 1
(B) 2
(C) 3
(D) 4
Q212: Feature selection tries to eliminate features which are
(A) Rich
(B) Redundant
(C) Irrelevant
(D) Relevant
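A minimal sketch of the idea behind Q212, assuming scikit-learn and pandas are installed; the column names, data values, and thresholds are illustrative. A zero-variance column stands in for an irrelevant feature, and a duplicated column stands in for a redundant one.

```python
# Minimal sketch (assumes scikit-learn and pandas; data and thresholds are illustrative).
import pandas as pd
from sklearn.feature_selection import VarianceThreshold

X = pd.DataFrame({
    "f_constant": [1, 1, 1, 1],            # irrelevant: carries no information
    "f_useful":   [0.2, 1.5, 3.1, 4.0],
    "f_copy":     [0.2, 1.5, 3.1, 4.0],    # redundant: duplicates f_useful
})

# Drop zero-variance (irrelevant) features.
X_reduced = VarianceThreshold(threshold=0.0).fit_transform(X)

# Flag near-perfectly correlated (redundant) columns.
corr = X.corr().abs()
redundant = [c for i, c in enumerate(X.columns) if any(corr.iloc[:i][c] > 0.99)]
print("redundant columns:", redundant)     # ['f_copy']
```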
Q213: For supervised learning, we have a ____ model.
(A) interactive
(B) predictive
(C) descriptive
(D) prescriptive
Q214: LOOCV in machine learning stands for
(A) Love one-out cross validation
(B) Leave-one-out cross-validation
(C) Leave-object oriented cross-validation
(D) Leave-one-out class-validation
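A minimal sketch of leave-one-out cross-validation, assuming scikit-learn is installed; the dataset and model are illustrative. Each of the n samples is held out once as the test set while the model trains on the remaining n - 1 samples.

```python
# Minimal LOOCV sketch (assumes scikit-learn; dataset and model are illustrative).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

X, y = load_iris(return_X_y=True)

# LeaveOneOut: n folds for n samples, one held-out sample per fold.
loo = LeaveOneOut()
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=loo)
print("LOOCV accuracy:", scores.mean())
```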
Q215: The new features created in PCA are known as
(A) Principal components
(B) Eigenvectors
(C) Secondary components
(D) None of the above
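A minimal sketch of PCA, assuming scikit-learn is installed; the random data and the number of components are illustrative. The transformed columns are the new features, i.e. the principal components.

```python
# Minimal PCA sketch (assumes scikit-learn; data are illustrative).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))        # 100 samples, 3 original features

pca = PCA(n_components=2)
X_new = pca.fit_transform(X)         # the new features: the principal components

print("principal component scores:", X_new.shape)        # (100, 2)
print("component directions:", pca.components_.shape)    # (2, 3)
```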
Q216: Given a joint distribution p(A, B) over two events, which distribution is defined as follows: p(A) = Σ_b p(A, B = b) = Σ_b p(A | B = b) p(B = b)?
(A) Conditional distribution
(B) Marginal distribution
(C) Bayes distribution
(D) Normal distribution
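A numeric check of the marginalization formula in Q216 on a toy joint distribution; the probability values are illustrative.

```python
# Toy joint distribution p(A, B) over A in {0, 1} and B in {0, 1} (values are illustrative).
p_joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Marginal distribution: p(A = a) = sum over b of p(A = a, B = b)
p_A = {a: sum(p for (a2, _), p in p_joint.items() if a2 == a) for a in (0, 1)}
print({a: round(p, 3) for a, p in p_A.items()})   # {0: 0.3, 1: 0.7}
```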
Q217: The naïve Bayes classifier makes the naïve assumption that the attribute values are conditionally dependent given the classification of the instance.
(A) True
(B) False
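A sketch of the naïve Bayes scoring rule relevant to Q217: the classifier treats attributes as conditionally independent given the class, so p(x | c) is approximated by a product of per-attribute probabilities. The class names and probability values below are illustrative.

```python
# Naive Bayes scoring under the conditional-independence assumption (toy numbers).
from math import prod

p_class = {"spam": 0.4, "ham": 0.6}
p_attr_given_class = {                 # p(attribute value | class), illustrative
    "spam": {"has_link": 0.8, "all_caps": 0.5},
    "ham":  {"has_link": 0.2, "all_caps": 0.1},
}

def naive_bayes_score(c, attrs):
    # Independence assumption: multiply the per-attribute conditional probabilities.
    return p_class[c] * prod(p_attr_given_class[c][a] for a in attrs)

attrs = ["has_link", "all_caps"]
print({c: naive_bayes_score(c, attrs) for c in p_class})
# spam: 0.4 * 0.8 * 0.5 = 0.16,  ham: 0.6 * 0.2 * 0.1 = 0.012
```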
Q218: This step involves cleaning/transforming the data set in a supervised learning model.
(A) Problem Identification
(B) Identification of Required Data
(C) Data Pre-processing
(D) Definition of Training Data Set
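A minimal sketch of what the data pre-processing step in Q218 typically covers, assuming scikit-learn is installed; the array values are illustrative.

```python
# Minimal pre-processing sketch (assumes scikit-learn; data are illustrative).
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

X = np.array([[1.0, 200.0],
              [2.0, np.nan],    # missing value to be cleaned
              [3.0, 600.0]])

X_clean = SimpleImputer(strategy="mean").fit_transform(X)   # cleaning: fill missing values
X_scaled = StandardScaler().fit_transform(X_clean)          # transforming: standardize features
print(X_scaled)
```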
Q219: Which of the following is true about SVM?
(A) It is useful only in high-dimensional spaces
(B) It always gives an approximate value
(C) It is accurate
(D) Understanding SVM is difficult
Q220: Which of the following will be the Manhattan distance between the two data points A(8,3) and B(4,3)?