A super-easy way to record, search and compare AI experiments.

Project description

An easy-to-use & supercharged open-source experiment tracker

Aim logs your training runs, provides a beautiful UI to compare them, and offers an API to query them programmatically.



Integrates seamlessly with your favorite tools



About Aim

Track and version ML runs. Visualize runs via a beautiful UI. Query run metadata via the SDK.

Aim is an open-source, self-hosted ML experiment tracking tool. It excels at tracking large numbers (1000s) of training runs, and it lets you compare them with a performant and beautiful UI.

You can use not only the Aim UI but also its SDK to query your runs' metadata programmatically. That is especially useful for automation and for additional analysis in a Jupyter notebook.

Aim's mission is to democratize AI dev tools.

Why use Aim?

Compare 100s of runs in a few clicks - build models faster

  • Compare, group and aggregate 100s of metrics thanks to effective visualizations.
  • Analyze correlations and patterns between hyperparameters and metrics.
  • Use easy, pythonic search to query the runs you want to explore.
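
To make this concrete, here is a sketch of what such a pythonic query looks like and the idea behind it. The attribute names (`run.hparams`, `metric.name`) follow Aim's query language; the evaluation below is a plain-Python illustration of the concept, not Aim's actual implementation:

```python
from types import SimpleNamespace

# A query in Aim's pythonic search style: select training-loss series
# from runs whose learning rate exceeds 1e-4.
query = 'run.hparams.learning_rate > 0.0001 and metric.name == "loss"'

# Illustration only: evaluate the expression against one run's metadata
# using plain Python objects standing in for Aim's run/metric proxies.
run = SimpleNamespace(hparams=SimpleNamespace(learning_rate=0.001))
metric = SimpleNamespace(name="loss")

matches = eval(query, {}, {"run": run, "metric": metric})
print(matches)  # True
```

Aim evaluates expressions of this kind against every tracked run, so the same query string can be used in both the UI search box and the SDK.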

Deep dive into details of each run for easy debugging

  • Hyperparameters, metrics, images, distributions, audio, text: everything at hand in an intuitive UI to understand your model's performance.
  • Easily track plots built with your favorite visualization tools, like Plotly and Matplotlib.
  • Analyze system resource usage to effectively utilize computational resources.

Have all relevant information organized and accessible for easy governance

  • Centralized dashboard to holistically view all your runs, their hparams and results.
  • Use SDK to query/access all your runs and tracked metadata.
  • You own your data - Aim is open source and self hosted.

Demos

  • Machine translation: training logs of a neural translation model (from the WMT'19 competition).
  • lightweight-GAN: training logs of the "lightweight" GAN proposed at ICLR 2021.
  • FastSpeech 2: training logs of Microsoft's "FastSpeech 2: Fast and High-Quality End-to-End Text to Speech".
  • Simple MNIST: simple MNIST training logs.

Quick Start

Follow the steps below to get started with Aim.

1. Install Aim on your training environment

pip3 install aim

2. Integrate Aim with your code

from aim import Run

# Initialize a new run
run = Run()

# Log run parameters
run["hparams"] = {
    "learning_rate": 0.001,
    "batch_size": 32,
}

# Log metrics
for i in range(10):
    run.track(i, name='loss', step=i, context={"subset": "train"})
    run.track(i, name='acc', step=i, context={"subset": "train"})

See the full list of supported trackable objects (e.g. images, text, etc.) here.

3. Run the training as usual and start Aim UI

aim up

4. Or query runs programmatically via SDK

from aim import Repo

my_repo = Repo('/path/to/aim/repo')

query = "metric.name == 'loss'" # Example query

# Get collection of metrics
for run_metrics_collection in my_repo.query_metrics(query).iter_runs():
    for metric in run_metrics_collection:
        # Get run params
        params = metric.run[...]
        # Get metric values
        steps, metric_values = metric.values.sparse_numpy()

Integrations

Integrate PyTorch Lightning
from aim.pytorch_lightning import AimLogger

# ...
trainer = pl.Trainer(logger=AimLogger(experiment='experiment_name'))
# ...

See documentation here.

Integrate Hugging Face
from aim.hugging_face import AimCallback

# ...
aim_callback = AimCallback(repo='/path/to/logs/dir', experiment='mnli')
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    callbacks=[aim_callback],
    # ...
)
# ...

See documentation here.

Integrate Keras & tf.keras
import aim

# ...
model.fit(x_train, y_train, epochs=epochs, callbacks=[
    # Use aim.tensorflow.AimCallback instead in the case of tf.keras
    aim.keras.AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')
])
# ...

See documentation here.

Integrate XGBoost
from aim.xgboost import AimCallback

# ...
aim_callback = AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')
bst = xgb.train(param, xg_train, num_round, watchlist, callbacks=[aim_callback])
# ...

See documentation here.

Comparisons to familiar tools

TensorBoard

Training run comparison

Training run comparison with Aim is an order of magnitude faster.

  • Tracked params are first-class citizens in Aim. You can search, group and aggregate via params, and deeply explore all the tracked data (metrics, params, images) in the UI.
  • With TensorBoard, users are forced to encode those parameters in the training run name in order to search and compare. This makes comparison tedious and causes usability issues in the UI when there are many experiments and params. TensorBoard also lacks features to group and aggregate metrics.

Scalability

  • Aim is built to handle 1000s of training runs - both on the backend and on the UI.
  • TensorBoard becomes slow and hard to use once a few hundred training runs are queried or compared.

Beloved TB visualizations to be added to Aim

  • Embedding projector.
  • Neural network visualization.

MLflow

MLflow is an end-to-end ML lifecycle tool, while Aim is focused on training run tracking. The main differences between Aim and MLflow lie in UI scalability and run comparison features.

Run comparison

  • Aim treats tracked parameters as first-class citizens. Users can query runs, metrics, images and filter using the params.
  • MLflow does offer search over tracked config, but there is no grouping, aggregation, or subplotting by hyperparameters, nor other comparison features.

UI Scalability

  • The Aim UI smoothly handles several thousand metrics at a time with 1000s of steps each. It may get shaky when you explore 1000s of metrics with 10,000s of steps each, but we are constantly optimizing!
  • The MLflow UI becomes slow to use once there are a few hundred runs.

Weights and Biases

Hosted vs self-hosted

  • Weights & Biases is a hosted, closed-source MLOps platform.
  • Aim is a self-hosted, free and open-source experiment tracking tool.

Roadmap

Detailed Sprints

:sparkle: The Aim product roadmap

  • The Backlog contains the issues we are going to choose from and prioritize weekly
  • Issues are prioritized mainly by how highly requested the features are

High-level roadmap

The high-level features we are going to work on over the next few months

Done

  • Live updates (Shipped: Oct 18 2021)
  • Images tracking and visualization (Start: Oct 18 2021, Shipped: Nov 19 2021)
  • Distributions tracking and visualization (Start: Nov 10 2021, Shipped: Dec 3 2021)
  • Jupyter integration (Start: Nov 18 2021, Shipped: Dec 3 2021)
  • Audio tracking and visualization (Start: Dec 6 2021, Shipped: Dec 17 2021)
  • Transcripts tracking and visualization (Start: Dec 6 2021, Shipped: Dec 17 2021)
  • Plotly integration (Start: Dec 1 2021, Shipped: Dec 17 2021)
  • Colab integration (Start: Nov 18 2021, Shipped: Dec 17 2021)
  • Centralized tracking server (Start: Oct 18 2021, Shipped: Jan 22 2022)
  • Tensorboard adaptor - visualize TensorBoard logs with Aim (Start: Dec 17 2021, Shipped: Feb 3 2022)
  • Track git info, env vars, CLI arguments, dependencies (Start: Jan 17 2022, Shipped: Feb 3 2022)
  • MLFlow adaptor (visualize MLflow logs with Aim) (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • Activeloop Hub integration (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • PyTorch-Ignite integration (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • Run summary and overview info (system params, CLI args, git info, ...) (Start: Feb 14 2022, Shipped: Mar 9 2022)
  • Add DVC related metadata into aim run (Start: Mar 7 2022, Shipped: Mar 26 2022)
  • Ability to attach notes to Run from UI (Start: Mar 7 2022, Shipped: Apr 29 2022)
  • Fairseq integration (Start: Mar 27 2022, Shipped: Mar 29 2022)
  • LightGBM integration (Start: Apr 14 2022, Shipped: May 17 2022)
  • CatBoost integration (Start: Apr 20 2022, Shipped: May 17 2022)
  • Run execution details (display stdout/stderr logs) (Start: Apr 25 2022, Shipped: May 17 2022)

In Progress

  • Cloud storage support – store runs' blob data (e.g. images) on the cloud (Start: Mar 21 2022)
  • Artifact storage – store files, model checkpoints, and beyond (Start: Mar 21 2022)
  • Long sequences (up to 5M steps) support (Start: Apr 25 2022)

To Do

Aim UI

  • Runs management
    • Runs explorer – query and visualize runs data (images, audio, distributions, ...) in a central dashboard
  • Explorers
    • Audio Explorer
    • Text Explorer
    • Figures Explorer
    • Distributions Explorer
  • Dashboards – customizable layouts with embedded explorers

SDK and Storage

  • Scalability
    • Smooth UI and SDK experience with over 10,000 runs
  • Runs management
    • SDK interfaces
      • Reporting – query and compare runs, explore data with familiar tools such as matplotlib and pandas
      • Manipulations – copy, move, delete runs, params and sequences
    • CLI interfaces
      • Reporting – runs summary and run details in a CLI-compatible format
      • Manipulations – copy, move, delete runs, params and sequences
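
As a sketch of the reporting idea: once step and value arrays have been pulled out of a metric (e.g. via `metric.values.sparse_numpy()` as in the SDK example above), familiar Python tooling can summarize them. A stdlib-only illustration with made-up numbers:

```python
import statistics

# Hypothetical step/value arrays, as might be returned for a tracked
# "loss" metric; the numbers are made up for illustration.
steps = [0, 1, 2, 3, 4]
loss = [0.9, 0.7, 0.55, 0.5, 0.48]

best = min(loss)
print(f"final loss: {loss[-1]:.2f}")              # final loss: 0.48
print(f"mean loss: {statistics.mean(loss):.3f}")  # mean loss: 0.626
print(f"best loss {best:.2f} at step {steps[loss.index(best)]}")
```

From here it is a short step to a pandas DataFrame or a matplotlib plot.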

Integrations

  • ML Frameworks:
    • Shortlist: MONAI, SpaCy, AllenNLP, Raytune, fast.ai, KerasTuner
  • Datasets versioning tools
    • Shortlist: HuggingFace Datasets
  • Resource management tools
    • Shortlist: Kubeflow, Slurm
  • Workflow orchestration tools
  • Others: Hydra, Google MLMD, Streamlit, ...

On hold

  • scikit-learn integration

Community

If you have questions

  1. Read the docs
  2. Open a feature request or report a bug
  3. Join our Slack

Project details



Download files

Download the file for your platform.

Source Distribution

  • aim-3.11.0.tar.gz (1.5 MB, source)

Built Distributions

  • aim-3.11.0-cp310-cp310-manylinux_2_24_x86_64.whl (5.4 MB; CPython 3.10, manylinux glibc 2.24+, x86-64)
  • aim-3.11.0-cp310-cp310-macosx_11_0_arm64.whl (2.3 MB; CPython 3.10, macOS 11.0+, ARM64)
  • aim-3.11.0-cp310-cp310-macosx_10_14_x86_64.whl (2.4 MB; CPython 3.10, macOS 10.14+, x86-64)
  • aim-3.11.0-cp39-cp39-manylinux_2_24_x86_64.whl (5.5 MB; CPython 3.9, manylinux glibc 2.24+, x86-64)
  • aim-3.11.0-cp39-cp39-macosx_11_0_arm64.whl (2.2 MB; CPython 3.9, macOS 11.0+, ARM64)
  • aim-3.11.0-cp39-cp39-macosx_10_14_x86_64.whl (2.3 MB; CPython 3.9, macOS 10.14+, x86-64)
  • aim-3.11.0-cp38-cp38-manylinux_2_24_x86_64.whl (5.7 MB; CPython 3.8, manylinux glibc 2.24+, x86-64)
  • aim-3.11.0-cp38-cp38-macosx_11_0_arm64.whl (2.3 MB; CPython 3.8, macOS 11.0+, ARM64)
  • aim-3.11.0-cp38-cp38-macosx_10_14_x86_64.whl (2.4 MB; CPython 3.8, macOS 10.14+, x86-64)
  • aim-3.11.0-cp37-cp37m-manylinux_2_24_x86_64.whl (5.4 MB; CPython 3.7m, manylinux glibc 2.24+, x86-64)
  • aim-3.11.0-cp37-cp37m-macosx_10_14_x86_64.whl (2.4 MB; CPython 3.7m, macOS 10.14+, x86-64)
  • aim-3.11.0-cp36-cp36m-manylinux_2_24_x86_64.whl (5.2 MB; CPython 3.6m, manylinux glibc 2.24+, x86-64)
  • aim-3.11.0-cp36-cp36m-macosx_10_14_x86_64.whl (2.3 MB; CPython 3.6m, macOS 10.14+, x86-64)
