TDSM 11.5

(a) Does there always exist a decision tree classifier which perfectly separates A from B?
No. A perfect decision tree classifier does not exist if two or more feature vectors are identical but carry different labels: a decision tree routes identical inputs to the same leaf, so it must assign them the same label and at least one of them is misclassified.
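
A minimal sketch of this failure mode, assuming scikit-learn and NumPy are available (neither is named in the original answer): the first two rows of X are the same feature vector with conflicting labels, so any decision tree sends them to the same leaf and gets one of them wrong.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Rows 0 and 1 are identical feature vectors but carry different labels.
X = np.array([[0, 1], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 0, 1])

tree = DecisionTreeClassifier()   # no depth limit: the tree grows as deep as it can
tree.fit(X, y)
print(tree.score(X, y))           # 0.75 rather than 1.0: one duplicate is misclassified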

(b) Does there always exist a decision tree classifier which perfectly separates A from B if the n feature vectors are all distinct?
Yes. If all the feature vectors are distinct, a tree can always be constructed in which each vector follows a unique path from the root to its own leaf; labeling each leaf with that vector's class then classifies every point correctly.
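
A minimal sketch of the positive case, again assuming scikit-learn and NumPy: with distinct (here, random continuous) feature vectors, an unrestricted decision tree keeps splitting until every leaf is pure and reproduces the training labels exactly.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.random((50, 3))            # 50 distinct random feature vectors
y = rng.integers(0, 2, size=50)    # arbitrary 0/1 labels

tree = DecisionTreeClassifier()    # unlimited depth by default
tree.fit(X, y)
print(tree.score(X, y))            # 1.0: every training point is classified correctly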

(c) Does there always exist a logistic regression classifier which perfectly separates A from B?
No. A logistic regression classifier splits the feature space into two half-spaces with a single line / plane / hyperplane, so it is perfect only when the two classes are linearly separable. A configuration such as XOR, with class A at (0,0) and (1,1) and class B at (0,1) and (1,0), cannot be split by any hyperplane.
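
A minimal sketch of the XOR counterexample, assuming scikit-learn and NumPy: logistic regression fits a single hyperplane, and no hyperplane gets all four points right.

import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0, 0], [1, 1], [0, 1], [1, 0]])
y = np.array([0, 0, 1, 1])         # class A on the diagonal, class B off it

clf = LogisticRegression()
clf.fit(X, y)
print(clf.score(X, y))             # below 1.0: no hyperplane separates the XOR points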

(d) Does there always exist a logistic regression classifier which perfectly separates A from B if the n feature vectors are all distinct?
No. Distinctness does not help here: the four XOR points in (c) are all distinct yet still not linearly separable, so the same argument applies.
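
To contrast (b) and (d), a minimal sketch assuming scikit-learn: the same four distinct XOR points that defeat logistic regression are separated perfectly by a decision tree, so distinctness rescues trees but not linear classifiers.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X = np.array([[0, 0], [1, 1], [0, 1], [1, 0]])   # four distinct feature vectors
y = np.array([0, 0, 1, 1])

print(LogisticRegression().fit(X, y).score(X, y))      # below 1.0: still not linearly separable
print(DecisionTreeClassifier().fit(X, y).score(X, y))  # 1.0: a two-level tree splits XOR perfectly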