
treeboost_autograd

Easy Custom Losses for Tree Boosters using PyTorch

Why calculate first and second derivatives for your objective when you can let PyTorch do it for you?

This package provides an easy-to-use custom PyTorch objective implementation for tree boosters (just add loss).
Supported boosting packages: CatBoost, XGBoost, LightGBM.
Supported tasks: regression, binary classification.
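
For intuition, here is a standalone sketch (illustrative only, not this package's internal code) of how PyTorch autograd can produce the per-example gradient and Hessian diagonal that tree boosters expect from a custom objective:

import torch

preds = torch.tensor([0.5, 1.5, 2.0], requires_grad=True)
targets = torch.tensor([1.0, 1.0, 1.0])

# Squared-error loss, summed over examples
loss = torch.square(preds - targets).sum()

# First derivative w.r.t. each prediction; create_graph allows a second pass
grad, = torch.autograd.grad(loss, preds, create_graph=True)
# Second derivative (Hessian diagonal), obtained by differentiating again
hess, = torch.autograd.grad(grad.sum(), preds)

print(grad)  # values: [-1., 1., 2.]
print(hess)  # values: [2., 2., 2.]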

Check out the post on Towards Data Science: https://towardsdatascience.com/easy-custom-losses-for-tree-boosters-using-pytorch-57ffaa0b2eb3

Usage

Usage is very similar for all boosting libraries. Install from PyPI:

pip install treeboost_autograd

Then import the objective class that matches your booster:

from treeboost_autograd import CatboostObjective, LightGbmObjective, XgboostObjective

Define your loss as a differentiable PyTorch function, wrap it in the objective, and train as usual. Ready-to-run examples are available in the Git repo: https://github.com/TomerRonen34/treeboost_autograd/tree/main/examples

import torch
from catboost import CatBoostRegressor
from treeboost_autograd import CatboostObjective

# Any differentiable PyTorch function of the raw predictions and targets works
def absolute_error_loss(preds: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    return torch.abs(preds - targets).sum()

# The custom objective plugs straight into CatBoost's loss_function parameter
custom_objective = CatboostObjective(loss_function=absolute_error_loss)
model = CatBoostRegressor(loss_function=custom_objective, eval_metric="MAE")
model.fit(X_train, y_train)
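
Switching boosters only changes the objective class and the model constructor. Here is a minimal sketch for XGBoost, assuming (as in the repo's examples) that the objective is passed to the scikit-learn wrapper's objective parameter; the loss function itself is a hypothetical stand-in:

import torch
from xgboost import XGBRegressor
from treeboost_autograd import XgboostObjective

# Hypothetical loss for illustration: squared error, summed over examples
def squared_error_loss(preds: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    return torch.square(preds - targets).sum()

# Assumption: XgboostObjective is a callable accepted by XGBRegressor's
# objective parameter, mirroring the CatBoost example above
custom_objective = XgboostObjective(loss_function=squared_error_loss)
model = XGBRegressor(objective=custom_objective)
model.fit(X_train, y_train)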
