LightGBM is a gradient boosting framework that uses tree-based learning algorithms. It is designed to be distributed and efficient, with the following advantages:
- Fast training speed
- Low memory consumption
- Better accuracy
- Efficient parallel learning
- Capability to handle large-scale data
For the details about these advantages, please refer to Feature Highlight.
Experiments on public datasets show that LightGBM can outperform existing boosting tools in both training efficiency and accuracy, with significantly lower memory consumption. The experiments also show that LightGBM can achieve a linear speed-up in parallel learning as the number of machines grows.
To get started quickly, please refer to Installation and Quick Start.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.