This is just a short follow-up to last week's StatQuest, where we introduced decision trees. Here we show how decision trees deal with variables that don't improve the tree (feature selection) and how they deal with missing data.
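If you want to poke at the feature-selection idea in code, here is a minimal Python sketch (the data and function names are made up for illustration, not taken from the video): a candidate split is only used if it lowers the weighted Gini impurity below the impurity of the unsplit node.

# Minimal sketch of Gini-based feature selection (toy data, for illustration only).

def gini(labels):
    # Gini impurity = 1 - sum(p_i^2) over the class proportions in a node.
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def split_impurity(feature, labels):
    # Weighted Gini impurity after splitting on a yes/no feature.
    left = [y for x, y in zip(feature, labels) if x]
    right = [y for x, y in zip(feature, labels) if not x]
    n = len(labels)
    return (len(left) / n) * gini(left) + (len(right) / n) * gini(right)

# Toy example: this feature splits the classes no better than chance,
# so it does not reduce impurity and would be left out of the tree.
labels = [1, 1, 1, 0, 0, 0]
loves_popcorn = [True, False, False, True, False, False]

if split_impurity(loves_popcorn, labels) >= gini(labels):
    print("Split does not improve impurity -> feature is not used")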
To learn the basics about Decision Trees, see:
https://youtu.be/_L39rN6gz7Y
For a complete index of all the StatQuest videos, check out:
https://statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Patreon: https://www.patreon.com/statquest
...or...
YouTube Membership: https://www.youtube.com/channel/UCtYLUTtgS3k1Fg4y5tAhLbw/join
...buying one of my books, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
https://statquest.org/statquest-store/
...or just donating to StatQuest!
https://www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
https://twitter.com/joshuastarmer
Correction:
1:35 I mistyped the Gini impurity. I wrote 0.29, but it should be 0.19.
#statquest #ML #decisiontree