Incremental machine learning in Python

Project description

creme is a library for online machine learning, also known as incremental learning. Online learning is a machine learning regime where a model learns one observation at a time. This is in contrast to batch learning, where all the data is processed in one go. Incremental learning is desirable when the data is too big to fit in memory, or simply when it isn't available all at once. creme's API is heavily inspired by that of scikit-learn, enough so that users who are familiar with it should feel right at home.
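To make the contrast concrete, here is a minimal sketch (independent of creme) of computing a mean in batch versus incrementally, one observation at a time:

```python
# Batch: requires all observations in memory at once.
def batch_mean(xs):
    return sum(xs) / len(xs)

# Incremental: maintains a running mean in O(1) memory,
# updated one observation at a time.
class RunningMean:
    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def update(self, x):
        self.n += 1
        self.mean += (x - self.mean) / self.n
        return self

data = [3.0, 5.0, 7.0, 9.0]
rm = RunningMean()
for x in data:
    rm.update(x)

assert rm.mean == batch_mean(data)  # both give 6.0
```

The incremental version never needs to see the whole dataset, which is exactly the property that makes online learning work on unbounded streams.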

Installation

creme is tested with Python 3.6 and above.

creme relies mostly on Python's standard library, occasionally drawing on numpy, scipy, and scikit-learn so as not to reinvent the wheel. It can be installed with pip.

pip install creme

Quick example

In the following snippet we'll be fitting an online logistic regression. The weights of the model will be optimized with the AdaGrad algorithm. We'll scale the data so that each variable has a mean of 0 and a standard deviation of 1. The standard scaling and the logistic regression are combined into a pipeline using the | operator. We'll be using the stream.iter_sklearn_dataset function for streaming over the Wisconsin breast cancer dataset. We'll measure the F1-score using progressive validation.

>>> from creme import compose
>>> from creme import linear_model
>>> from creme import metrics
>>> from creme import optim
>>> from creme import preprocessing
>>> from creme import stream
>>> from sklearn import datasets

>>> X_y = stream.iter_sklearn_dataset(
...     load_dataset=datasets.load_breast_cancer,
...     shuffle=True,
...     random_state=42
... )

>>> scaler = preprocessing.StandardScaler()
>>> log_reg = linear_model.LogisticRegression(optimizer=optim.AdaGrad())
>>> model = scaler | log_reg

>>> metric = metrics.F1Score()

>>> for x, y in X_y:
...     y_pred = model.predict_one(x)
...     model = model.fit_one(x, y)
...     metric = metric.update(y, y_pred)

>>> metric
F1Score: 0.97191
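The predict-then-learn loop above is what makes this progressive validation: each observation is used for prediction before the model has seen its label, so the model is always evaluated on unseen data. As an illustration of the same pattern independent of creme's API, here is a from-scratch online logistic regression trained with plain SGD on a hypothetical toy stream (all names below are made up for the example):

```python
import math
import random

class OnlineLogReg:
    """Logistic regression updated one observation at a time with SGD."""

    def __init__(self, lr=0.1):
        self.lr = lr
        self.weights = {}  # feature name -> weight, grows as features appear

    def predict_one(self, x):
        z = sum(self.weights.get(k, 0.0) * v for k, v in x.items())
        return 1.0 / (1.0 + math.exp(-z))

    def fit_one(self, x, y):
        # Gradient of the log loss w.r.t. each weight is (p - y) * x_k.
        g = self.predict_one(x) - y
        for k, v in x.items():
            self.weights[k] = self.weights.get(k, 0.0) - self.lr * g * v
        return self

random.seed(42)
model = OnlineLogReg()
correct = total = 0

for _ in range(1000):
    x = {'a': random.uniform(-1, 1), 'b': random.uniform(-1, 1)}
    y = int(x['a'] + x['b'] > 0)          # linearly separable toy stream
    y_pred = model.predict_one(x) > 0.5   # predict first...
    model.fit_one(x, y)                   # ...then learn
    correct += (y_pred == y)
    total += 1

accuracy = correct / total
print(f'progressive validation accuracy: {accuracy:.2f}')
```

Because every prediction happens before the corresponding update, the accuracy accumulated over the stream is an honest estimate of out-of-sample performance, with no separate train/test split needed.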

Comparison with other solutions

  • scikit-learn: Some of its estimators have a partial_fit method which allows them to update themselves with new observations. However, online learning isn't treated as a first-class citizen, which can make things awkward. You should definitely use scikit-learn if your data fits in memory and you can afford retraining your model from scratch every time new data is available.
  • Vowpal Wabbit: VW is probably the fastest out-of-core learning system available. At its core it implements a state-of-the-art adaptive gradient descent algorithm with many tricks. It also has some mechanisms for active learning and bandits. However, it isn't a "true" online learning system, as it assumes the data is available in a file and can be looped over multiple times. It is also somewhat difficult for newcomers to grok.
  • LIBOL: This is a good library written by academics with some great documentation. It's written in C++ and seems to be pretty fast. However it only focuses on the learning aspect of online learning, not on other mundane yet useful tasks such as feature extraction and preprocessing. Moreover it hasn't been updated for a few years.
  • Spark Streaming: This is an extension of Apache Spark which caters to big data practitioners. It processes data in mini-batches instead of actually doing real streaming operations. It also has some compatibility with the MLlib for implementing online learning algorithms, such as streaming linear regression and streaming k-means. However it is a somewhat overwhelming solution which might be a bit overkill for certain use cases.
  • TensorFlow: Deep learning systems are in some sense online learning systems because they use online gradient descent. However, popular libraries are mostly attuned to batch situations. Because frameworks such as Keras and PyTorch are so popular and very well backed, there is no real point in implementing neural networks in creme. Additionally, for a lot of problems neural networks might not be the right tool, and you might want to use a simple logistic regression or a decision tree (for which online algorithms exist).
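To illustrate the scikit-learn point above, a handful of its estimators do expose partial_fit. A minimal sketch (assuming scikit-learn and numpy are installed; the data stream here is made up for the example) with SGDClassifier fed mini-batches as if they arrived over time:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.RandomState(42)
clf = SGDClassifier(random_state=42)
classes = np.array([0, 1])  # partial_fit needs the full label set up front

# Feed the model mini-batches, as if they arrived over time.
for _ in range(100):
    X = rng.randn(20, 3)
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    clf.partial_fit(X, y, classes=classes)

# Check on a fresh batch the model has never seen.
X_test = rng.randn(200, 3)
y_test = (X_test[:, 0] + X_test[:, 1] > 0).astype(int)
accuracy = clf.score(X_test, y_test)
print(f'held-out accuracy: {accuracy:.2f}')
```

The catch is that most of the surrounding machinery (scalers, feature extractors, model selection) still assumes batch semantics, which is the awkwardness the list above refers to.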

Feel free to open an issue if you feel like other solutions are worth mentioning.

Contributing

Like many subfields of machine learning, online learning is far from being an exact science and so there is still a lot to do. Feel free to contribute in any way you like, we're always open to new ideas and approaches. If you want to contribute to the code base please check out the CONTRIBUTING.md file. Also take a look at the issue tracker and see if anything takes your fancy.

Last but not least you are more than welcome to share with us how you're using creme or online learning in general! We believe that online learning solves a lot of pain points in practice and we would love to share experiences.

This project follows the all-contributors specification. Contributions of any kind are welcome!

  • Max Halford 📆 💻
  • AdilZouitine 💻
  • Raphael Sourty 💻
  • Geoffrey Bolmier 💻
  • vincent d warmerdam 💻
  • VaysseRobin 💻
  • Lygon Bowen-West 💻

License

See the license file.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

creme-0.2.0.tar.gz (75.9 kB)

Uploaded: Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

creme-0.2.0-py2.py3-none-any.whl (151.0 kB)

Uploaded: Python 2, Python 3

File details

Details for the file creme-0.2.0.tar.gz.

File metadata

  • Download URL: creme-0.2.0.tar.gz
  • Upload date:
  • Size: 75.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.8.0 tqdm/4.31.1 CPython/3.6.8

File hashes

Hashes for creme-0.2.0.tar.gz
  • SHA256: e5b9ead52e6ab61723014a5e0c3bc3c6b5af696f09723c9d2fd49c1adefb3de8
  • MD5: 319999e5f93c9e5aa975654d04060520
  • BLAKE2b-256: 412f8a1db4bfe6e0ab853bdacc7706e76ea5af3db14b16f3c9b3af3ed38985fd

See more details on using hashes here.

File details

Details for the file creme-0.2.0-py2.py3-none-any.whl.

File metadata

  • Download URL: creme-0.2.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 151.0 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.8.0 tqdm/4.31.1 CPython/3.6.8

File hashes

Hashes for creme-0.2.0-py2.py3-none-any.whl
  • SHA256: 94abba6bfbbf579656397f29c92ac4f325194cf1830f587e0df3baf2734b239b
  • MD5: 48680968a6e46cc21aad1d1a84fdd0fe
  • BLAKE2b-256: 14c48187a53566e2190e43611372e3458042d0131f5a360aa795e4b39794cc77

See more details on using hashes here.
