Given that we can select the same feature multiple times during the recursive partitioning of the input space, is it always possible to achieve 100% accuracy on the training data (given that we allow for trees to grow to their maximum size) when building decision trees?
A. Yes
B. No
Answer: Option B
Solution (By Examveda Team)
In decision tree algorithms like ID3 or C4.5, allowing a feature to be selected multiple times during recursive partitioning lets a fully grown tree memorize the training data in most cases. However, it is not *always* possible to reach 100% training accuracy: if the training set contains two instances with identical feature values but different labels, every decision tree, no matter how deep, sends both instances to the same leaf and assigns them the same prediction, so at least one of them is misclassified. When the training data is consistent (no such conflicting duplicates), a maximally grown tree can indeed fit it perfectly, but doing so typically reflects overfitting: the model captures noise specific to the training data and generalizes poorly to unseen data. Therefore, the correct answer is Option B: No.
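The conflicting-duplicates argument can be checked directly. The sketch below (plain Python, with a hypothetical helper `max_train_accuracy`) computes the best training accuracy any deterministic classifier, including a fully grown decision tree, can achieve: for each distinct feature vector, at most the majority label's count can be predicted correctly.

```python
from collections import Counter

def max_train_accuracy(X, y):
    """Upper bound on training accuracy for any deterministic classifier:
    instances with identical features get identical predictions, so at most
    the majority label in each group can be correct."""
    groups = {}
    for xi, yi in zip(X, y):
        groups.setdefault(tuple(xi), []).append(yi)
    correct = sum(Counter(labels).most_common(1)[0][1]
                  for labels in groups.values())
    return correct / len(y)

# Two identical inputs with conflicting labels: 100% is unreachable.
X = [[0, 1], [0, 1], [1, 0]]
y = [0, 1, 1]
print(max_train_accuracy(X, y))  # 2 of 3 correct at best
```

With no duplicated feature vectors the bound is 1.0, which is why a maximally grown tree can usually memorize the training set.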