General Information

This brick provides an easy interface for creating your own out-of-the-box, distributed gradient boosted decision tree model for regression tasks (if you need to solve a classification task, please check the LGBM Binary or LGBM Multiclass brick). Thanks to its leaf-wise tree-growth strategy, the resulting model can be trained efficiently on large datasets while still producing strong results.
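For context, the brick builds on LightGBM's gradient boosted trees for regression. The sketch below shows a roughly equivalent stand-alone workflow with the LightGBM Python package; the data, parameter values, and metric are illustrative assumptions and do not necessarily match the parameters the brick exposes.

```python
# Minimal, self-contained sketch of the kind of model this brick produces,
# using the LightGBM Python package directly (assumes lightgbm and
# scikit-learn are installed; all values below are illustrative).
from lightgbm import LGBMRegressor
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic regression data stands in for the brick's input dataset.
X, y = make_regression(n_samples=10_000, n_features=20, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Leaf-wise gradient boosted trees with a regression objective.
model = LGBMRegressor(
    objective="regression",   # squared-error loss
    n_estimators=200,         # number of boosting rounds (trees)
    learning_rate=0.05,       # shrinkage applied to each tree's contribution
    num_leaves=31,            # leaf-wise growth is capped by leaf count, not depth
)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print("test MSE:", mean_squared_error(y_test, preds))
```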

The models are built on three important principles: a loss function to be optimized, weak learners that make the individual predictions, and an additive model that combines the weak learners to minimize the loss.

In this case, the weak learners are multiple specialized decision trees built sequentially: each new tree is fitted to the errors left by the ensemble built so far, and its scaled prediction is added to the overall prediction.

All of these trees are trained by propagating the gradients of the errors through the ensemble.
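To make this principle concrete, here is a minimal from-scratch sketch of gradient boosting for squared-error loss, where the negative gradient of the loss is simply the residual that each new tree is fitted to. This illustrates the idea only; the brick relies on LightGBM's far more efficient implementation, and the helper names and parameter values here are hypothetical.

```python
# Hand-written gradient boosting for squared-error loss, to show the principle:
# each shallow tree fits the negative gradient (here, the residual) of the
# current ensemble, and its scaled prediction is added to the running total.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boosted_fit(X, y, n_trees=100, learning_rate=0.1, max_depth=3):
    base = float(np.mean(y))              # start from a constant prediction
    trees, pred = [], np.full(len(y), base)
    for _ in range(n_trees):
        residual = y - pred               # negative gradient of 1/2 * (y - pred)^2
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return base, trees

def boosted_predict(model, X, learning_rate=0.1):
    # The same learning rate used for fitting must be applied here.
    base, trees = model
    pred = np.full(X.shape[0], base)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred
```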

The main drawback of LGBM Regression is that finding the best split point for each tree node is both a time-consuming and a memory-consuming operation.
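The toy sketch below illustrates why exact split finding is expensive: every distinct value of every feature is a candidate threshold, so the work grows with both the number of samples and the number of features. This is not LightGBM's actual algorithm; LightGBM mitigates the cost by bucketing feature values into a fixed number of histogram bins (255 by default).

```python
# Toy illustration of exact best-split search for a single tree node:
# every distinct feature value is a candidate threshold that must be scored.
import numpy as np

def count_exact_split_candidates(X):
    return sum(len(np.unique(X[:, j])) - 1 for j in range(X.shape[1]))

rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 50))       # continuous features: ~all values distinct
print(count_exact_split_candidates(X))   # roughly 5 million candidate splits
print(50 * (255 - 1))                    # with 255 histogram bins per feature: 12,700
```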

Description

Brick Locations

Brick Parameters