
An abstraction layer for hyperparameter tuning


Tune


Tune is an abstraction layer for general parameter tuning. It is built on Fugue, so it can seamlessly run on any backend supported by Fugue, such as Spark, Dask, and local execution.

Installation

pip install tune

It's also recommended to install Scikit-Learn (for tuning all compatible models) and Hyperopt (to enable Bayesian optimization):

pip install tune[hyperopt,sklearn]

Quick Start

To get started quickly, go through these tutorials on Kaggle (a short taste of the search space API follows the list):

  1. Search Space
  2. Non-iterative Problems, such as Scikit-Learn model tuning
  3. Iterative Problems, such as Keras model tuning
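
For a taste of what the Search Space tutorial covers, here is a minimal sketch. Space, Grid, and Rand are the names used in Tune's documentation; treat the exact behavior described in the comments as an assumption to verify against the version you install.

from tune import Space, Grid, Rand

# Grid values are enumerated exhaustively; Rand describes a continuous
# range that a level-2 optimizer (e.g. Hyperopt) can sample from.
space = Space(a=Grid("x", "y"), b=Rand(0.0, 1.0))

# A pure grid space materializes into concrete parameter combinations:
for conf in Space(a=Grid("x", "y"), c=Grid(1, 2)):
    print(conf)  # one combination per item, 4 in total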

Design Philosophy

Tune does not follow Scikit-Learn's model selection APIs and does not provide a distributed backend for them. We believe parameter tuning is a general problem that is not limited to machine learning, so our abstractions are built from the ground up: the lower-level APIs do not assume the objective is a machine learning model, while the higher-level APIs are dedicated to solving specific problems, such as Scikit-Learn-compatible model tuning and Keras model tuning.
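
For instance, tuning a plain mathematical function needs no machine learning machinery at all. Below is a hedged sketch: suggest_for_noniterative_objective is the entry point used in Tune's tutorials, but check its exact signature against the version you install.

from tune import Space, Grid, suggest_for_noniterative_objective

def objective(a: float, b: float) -> float:
    # Any callable returning a number can serve as the objective;
    # nothing here assumes a machine learning model.
    return (a - 1) ** 2 + (b + 2) ** 2

# Exhaustively evaluate a small grid and keep the best trial
# (smaller is better, per the tutorials).
space = Space(a=Grid(0, 1, 2), b=Grid(-3, -2, -1))
result = suggest_for_noniterative_objective(objective, space, top_n=1)

Per the tutorials, the same call can run on a distributed backend by supplying a Fugue execution engine such as Spark or Dask, with no change to the objective or the space.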

Although we didn't base our solution on HyperOpt, Optuna, Ray Tune, Nevergrad, or any other framework, we are truly inspired by these wonderful solutions and their designs. We have also integrated with many of them for deeper-level optimizations.

Tuning problems are never easy; here are our goals:

  • Provide the simplest and most intuitive APIs for the major tuning cases. We always start from real tuning cases, figure out the minimal requirements for each of them, and then determine the layers of abstraction. Read this tutorial to see how minimal the interfaces can be.
  • Be scale agnostic and platform agnostic. We want you to worry less about distributed computing and focus on the tuning logic itself. Built on Fugue, Tune lets you develop your tuning process iteratively: you can test with small spaces on a local machine, then switch to larger spaces and run distributedly with no code change. This can effectively save time and cost and make the process fun and rewarding. And to run any tuning logic distributedly, you only need the compute framework itself (Spark, Dask, etc.); you do not need a database, a queue service, or even an embedded cluster.
  • Be highly extendable and flexible at the lower levels. For example:
    • you can extend at the Fugue level, for example by creating an execution engine for Prefect to run the tuning jobs as a Prefect workflow
    • you can integrate third-party optimizers and use Tune purely as a distributed orchestrator. We have integrated HyperOpt, and Optuna and Nevergrad are on the way
    • you can start external instances (e.g. EC2 instances) for different training subtasks to fully utilize your cloud
    • you can combine it with distributed training as long as you have enough compute resources

Focuses

Here are our current focuses:

  • A flexible space design that can describe a hybrid space of grid search, random search, and second-level optimizations such as Bayesian optimization (see the sketch after this list)
  • Integration with third-party tuning frameworks
  • Generalized and distributed versions of Successive Halving, Hyperband, and Asynchronous Successive Halving
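
To illustrate the hybrid space idea in the first bullet, here is a hedged sketch. The + union operator and the sample helper are described in the Search Space tutorial; verify both against your installed version.

from tune import Space, Grid, Rand, RandInt

# Grid-search one model family while random-searching another, then
# union the two spaces with + into a single hybrid space.
grid_part = Space(model="rf", n_estimators=Grid(100, 200))
rand_part = Space(model="lgbm", leaves=RandInt(16, 256), lr=Rand(0.01, 0.3))
hybrid = grid_part + rand_part

# Random expressions can be materialized into concrete configurations
# (assuming the documented sample(n, seed) helper), or left symbolic
# for a Bayesian optimizer such as Hyperopt to consume.
sampled = rand_part.sample(3, seed=0)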

Collaboration

We are looking for collaborators; if you are interested, please let us know by joining our Slack channel.



Download files

Download the file for your platform.

Source Distribution

tune-0.1.1.tar.gz (70.1 kB)

Uploaded Source

Built Distribution


tune-0.1.1-py3-none-any.whl (94.7 kB)

Uploaded Python 3

File details

Details for the file tune-0.1.1.tar.gz.

File metadata

  • Download URL: tune-0.1.1.tar.gz
  • Upload date:
  • Size: 70.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.13

File hashes

Hashes for tune-0.1.1.tar.gz:

  • SHA256: 14c6594c45add8e5e04a9834b029e1fd6e484f578a38f78576b31bf44e29bc9f
  • MD5: eb65fc1b01a64cba41bbce458f946f63
  • BLAKE2b-256: e0acf0f54abdd7a69ec670c520856ce2ad1fa0c4069be02fbd1afd60c4e118bd


File details

Details for the file tune-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: tune-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 94.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.13

File hashes

Hashes for tune-0.1.1-py3-none-any.whl:

  • SHA256: cae551b06a37bc134920b4af6fabcf536b02120f47b6c6280ef7dfbe4bacf426
  • MD5: 8335cf887b897e5025a0f782be9e03f0
  • BLAKE2b-256: 5cf7caf1526ea65d2ee754f5d4a1fe7c37f3c70aaaf8ad6eeb96ab63e7ab5cd7

