Using cross-entropy to create a decision tree classifier

Are entropy and cross-entropy the same thing by their basic definitions?

If there is a difference:

Decision tree splits use entropy or the Gini index. Can we use cross-entropy to split decision trees, or should I use it as an evaluation metric after running the decision tree algorithm?

Also, does the decision tree algorithm assume any distribution?

If yes, then how can we use the KL divergence metric?

I am just trying to link a few concepts in a broader view.

These are my concerns with respect to the multiclass decision tree.

Thanks,


Entropy and cross-entropy are different concepts. Entropy quantifies the uncertainty in a single distribution (a single random variable). Cross-entropy involves two distributions: it is the average number of bits needed to encode samples from a true distribution p using a code optimized for another distribution q.
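
To make the distinction concrete, here is a small numeric sketch (the distributions p and q below are made-up numbers, purely for illustration). Entropy takes one distribution, cross-entropy takes two, and their gap is exactly the KL divergence you mention, since H(p, q) = H(p) + D_KL(p || q).

    import numpy as np

    def entropy(p):
        # Shannon entropy H(p) of a single distribution (0 * log 0 treated as 0)
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def cross_entropy(p, q):
        # Cross-entropy H(p, q): expected code length for p using a code built for q
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0
        return -np.sum(p[mask] * np.log2(q[mask]))

    p = [0.7, 0.2, 0.1]   # "true" class proportions (hypothetical)
    q = [0.5, 0.3, 0.2]   # some predicted class probabilities (hypothetical)

    print(entropy(p))                         # uncertainty of p on its own
    print(cross_entropy(p, q))                # always >= entropy(p)
    print(cross_entropy(p, q) - entropy(p))   # this gap is D_KL(p || q)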

Decision tree algorithms do not make any assumptions about the data distribution; they are non-parametric.
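
As for where each quantity fits in practice, a common pattern (sketched below with scikit-learn and its bundled iris data, as an assumed setup) is to use entropy as the split criterion while the tree is grown, and then score the fitted tree's predicted probabilities with log_loss, which is the multiclass cross-entropy, as an evaluation metric afterwards.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import log_loss

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # "entropy" here is the information-gain split criterion, computed on the
    # single distribution of class labels at each node
    tree = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
    tree.fit(X_train, y_train)

    # log_loss is the multiclass cross-entropy between the true labels and the
    # tree's predicted probabilities
    proba = tree.predict_proba(X_test)
    print("cross-entropy on the test set:", log_loss(y_test, proba))

Cross-entropy needs two distributions (true labels vs. predicted probabilities), so it only becomes available once the model produces predictions; during splitting there is only the single label distribution at each node, which is why entropy or Gini is used at that stage.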
