Examveda

Given that we can select the same feature multiple times during the recursive partitioning of the input space, is it always possible to achieve 100% accuracy on the training data (given that we allow for trees to grow to their maximum size) when building decision trees?

A. Yes

B. No

Answer: Option B

Solution(By Examveda Team)

In decision tree algorithms like ID3 or C4.5, where features can be selected multiple times during recursive partitioning, a tree grown to maximum size can memorize the training data and will correctly classify every instance whose feature vector is unique. However, 100% training accuracy is not *always* achievable: if two training instances have identical feature values but different labels, no sequence of splits can ever separate them, so both end up in the same leaf and at least one must be misclassified. And even when perfect memorization does succeed, it usually amounts to overfitting, where the model captures noise specific to the training data and performs poorly on unseen data.
Therefore, the correct answer is Option B: No.
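A minimal sketch of this limit, using scikit-learn's `DecisionTreeClassifier` (the datasets here are illustrative, not from the question): a full-depth tree fits consistent data perfectly, but contradictory labels on identical rows cap the training accuracy below 100%.

```python
# Illustrative sketch: full-depth trees memorize consistent data
# but cannot resolve identical inputs with conflicting labels.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Consistent data: each feature vector maps to exactly one label
# (XOR pattern, learnable by a tree that reuses features).
X_clean = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_clean = np.array([0, 1, 1, 0])

# Contradictory data: the two identical rows carry different labels,
# so no split can separate them.
X_noisy = np.array([[0, 0], [0, 0], [1, 1]])
y_noisy = np.array([0, 1, 1])

tree = DecisionTreeClassifier(max_depth=None)  # grow to maximum size

acc_clean = tree.fit(X_clean, y_clean).score(X_clean, y_clean)  # 1.0
acc_noisy = tree.fit(X_noisy, y_noisy).score(X_noisy, y_noisy)  # below 1.0
print(acc_clean, acc_noisy)
```

Both identical rows land in one leaf, whose majority vote misclassifies one of them, which is exactly why the answer is "No".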


