
Ray provides a simple, universal API for building distributed applications.

Project description



Ray is packaged with the following libraries for accelerating machine learning workloads:

  • Tune: Scalable Hyperparameter Tuning

  • RLlib: Scalable Reinforcement Learning

  • RaySGD: Distributed Training Wrappers

  • Ray Serve: Scalable and Programmable Serving

There are also many community integrations with Ray, including Dask, MARS, Modin, Horovod, Hugging Face, Scikit-learn, and others. Check out the full list of Ray distributed libraries here.

Install Ray with: pip install ray. For nightly wheels, see the Installation page.

Quick Start

Execute Python functions in parallel.

import ray
ray.init()

@ray.remote
def f(x):
    return x * x

futures = [f.remote(i) for i in range(4)]
print(ray.get(futures))  # [0, 1, 4, 9]

To use Ray’s actor model:

import ray
ray.init()

@ray.remote
class Counter(object):
    def __init__(self):
        self.n = 0

    def increment(self):
        self.n += 1

    def read(self):
        return self.n

counters = [Counter.remote() for _ in range(4)]
[c.increment.remote() for c in counters]
futures = [c.read.remote() for c in counters]
print(ray.get(futures))  # [1, 1, 1, 1]

Ray programs can run on a single machine, and can also seamlessly scale to large clusters. To execute the above Ray script in the cloud, just download this configuration file, and run:

ray submit [CLUSTER.YAML] example.py --start

Read more about launching clusters.
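The configuration file referenced above is a Ray cluster YAML. As a rough sketch of its shape, assuming an AWS deployment (all field values below are illustrative placeholders, not taken from the original file):

```yaml
# Minimal illustrative cluster config (values are placeholders)
cluster_name: example
min_workers: 1
max_workers: 4

provider:
    type: aws
    region: us-west-2

auth:
    ssh_user: ubuntu

head_node:
    InstanceType: m5.large

worker_nodes:
    InstanceType: m5.large
```

After editing such a file, `ray up [CLUSTER.YAML]` starts the cluster and `ray down [CLUSTER.YAML]` tears it down again.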

Tune Quick Start


Tune is a library for hyperparameter tuning at any scale.

To run this example, you will need to install the following:

$ pip install "ray[tune]"

This example runs a parallel grid search to optimize an example objective function.

from ray import tune


def objective(step, alpha, beta):
    return (0.1 + alpha * step / 100)**(-1) + beta * 0.1


def training_function(config):
    # Hyperparameters
    alpha, beta = config["alpha"], config["beta"]
    for step in range(10):
        # Iterative training function - can be any arbitrary training procedure.
        intermediate_score = objective(step, alpha, beta)
        # Feed the score back to Tune.
        tune.report(mean_loss=intermediate_score)


analysis = tune.run(
    training_function,
    config={
        "alpha": tune.grid_search([0.001, 0.01, 0.1]),
        "beta": tune.choice([1, 2, 3])
    })

print("Best config: ", analysis.get_best_config(metric="mean_loss", mode="min"))

# Get a dataframe for analyzing trial results.
df = analysis.results_df

If TensorBoard is installed, you can automatically visualize all trial results:

tensorboard --logdir ~/ray_results
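Under the hood, grid_search expands alpha into one trial per value, while tune.choice samples beta per trial. A plain-Python sketch of the same search over alpha (no Tune; beta is fixed at 1 purely for illustration):

```python
def objective(step, alpha, beta):
    # Same toy objective as above: loss shrinks as alpha * step grows
    return (0.1 + alpha * step / 100) ** (-1) + beta * 0.1

def final_loss(alpha, beta, steps=10):
    # Loss reported at the last of the 10 training steps
    return objective(steps - 1, alpha, beta)

grid = [0.001, 0.01, 0.1]
beta = 1  # fixed here; tune.choice would sample it per trial
best_alpha = min(grid, key=lambda a: final_loss(a, beta))
print(best_alpha)  # 0.1
```

The largest alpha wins because it drives the first term down fastest, which is what `analysis.get_best_config(metric="mean_loss", mode="min")` reports in the Tune version.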

RLlib Quick Start


RLlib is an open-source library for reinforcement learning built on top of Ray that offers both high scalability and a unified API for a variety of applications.

To run this example, you will need to install the following:

$ pip install tensorflow  # or tensorflow-gpu
$ pip install "ray[rllib]"

This example trains a PPO agent in a simple custom corridor environment.

import gym
from gym.spaces import Discrete, Box
from ray import tune

class SimpleCorridor(gym.Env):
    def __init__(self, config):
        self.end_pos = config["corridor_length"]
        self.cur_pos = 0
        self.action_space = Discrete(2)
        self.observation_space = Box(0.0, self.end_pos, shape=(1, ))

    def reset(self):
        self.cur_pos = 0
        return [self.cur_pos]

    def step(self, action):
        if action == 0 and self.cur_pos > 0:
            self.cur_pos -= 1
        elif action == 1:
            self.cur_pos += 1
        done = self.cur_pos >= self.end_pos
        return [self.cur_pos], 1 if done else 0, done, {}

tune.run(
    "PPO",
    config={
        "env": SimpleCorridor,
        "num_workers": 4,
        "env_config": {"corridor_length": 5}})
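To see the environment's reward structure without RLlib or gym, here is the same step logic as SimpleCorridor replayed in plain Python with an always-move-right policy; the agent reaches the goal in corridor_length steps and collects a single terminal reward:

```python
def corridor_step(cur_pos, action, end_pos):
    # Mirrors SimpleCorridor.step: action 0 moves left, 1 moves right
    if action == 0 and cur_pos > 0:
        cur_pos -= 1
    elif action == 1:
        cur_pos += 1
    done = cur_pos >= end_pos
    return cur_pos, (1 if done else 0), done

pos, total_reward, steps, done = 0, 0, 0, False
while not done:
    pos, reward, done = corridor_step(pos, 1, 5)
    total_reward += reward
    steps += 1
print(steps, total_reward)  # 5 1
```

This sparse, delayed reward is what makes even a tiny corridor a meaningful sanity check for a reinforcement learning algorithm like PPO.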

Ray Serve Quick Start


Ray Serve is a scalable model-serving library built on Ray. It is:

  • Framework Agnostic: Use the same toolkit to serve everything from deep learning models built with frameworks like PyTorch or TensorFlow and Keras to scikit-learn models or arbitrary business logic.

  • Python First: Configure your model serving with pure Python code - no more YAMLs or JSON configs.

  • Performance Oriented: Turn on batching, pipelining, and GPU acceleration to increase the throughput of your model.

  • Composition Native: Create “model pipelines” by composing multiple models together to drive a single prediction.

  • Horizontally Scalable: Serve can linearly scale as you add more machines. Enable your ML-powered service to handle growing traffic.

To run this example, you will need to install the following:

$ pip install scikit-learn
$ pip install "ray[serve]"

This example serves a scikit-learn gradient boosting classifier.

from ray import serve
import pickle
import requests
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

# Train model
iris_dataset = load_iris()
model = GradientBoostingClassifier()
model.fit(iris_dataset["data"], iris_dataset["target"])

# Define the Ray Serve model.
class BoostingModel:
    def __init__(self):
        self.model = model
        self.label_list = iris_dataset["target_names"].tolist()

    def __call__(self, flask_request):
        payload = flask_request.json["vector"]
        print("Worker: received flask request with data", payload)

        prediction = self.model.predict([payload])[0]
        human_name = self.label_list[prediction]
        return {"result": human_name}


# Deploy model
client = serve.start()
client.create_backend("iris:v1", BoostingModel)
client.create_endpoint("iris_classifier", backend="iris:v1", route="/iris")

# Query it!
sample_request_input = {"vector": [1.2, 1.0, 1.1, 0.9]}
response = requests.get("http://localhost:8000/iris", json=sample_request_input)
print(response.text)
# Result:
# {
#  "result": "versicolor"
# }
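Because BoostingModel simply wraps the classifier trained above, you can sanity-check the prediction path offline, without starting Serve at all (assuming scikit-learn is installed; the label for this particular vector depends on the fitted model):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

# Same training step as the Serve example
iris_dataset = load_iris()
model = GradientBoostingClassifier()
model.fit(iris_dataset["data"], iris_dataset["target"])

vector = [1.2, 1.0, 1.1, 0.9]  # same sample vector as the HTTP request
prediction = model.predict([vector])[0]
human_name = iris_dataset["target_names"].tolist()[prediction]
print({"result": human_name})
```

The HTTP handler in `__call__` does exactly this, just with the input vector pulled out of the Flask request body.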


Project details



Download files


Source Distributions

No source distribution files are available for this release.

Built Distributions


ray-1.4.0rc1-cp38-cp38-win_amd64.whl (15.7 MB)

Uploaded: CPython 3.8, Windows x86-64

ray-1.4.0rc1-cp38-cp38-manylinux2014_x86_64.whl (49.2 MB)

Uploaded: CPython 3.8

ray-1.4.0rc1-cp38-cp38-macosx_10_13_x86_64.whl (50.1 MB)

Uploaded: CPython 3.8, macOS 10.13+ x86-64

ray-1.4.0rc1-cp37-cp37m-win_amd64.whl (15.8 MB)

Uploaded: CPython 3.7m, Windows x86-64

ray-1.4.0rc1-cp37-cp37m-manylinux2014_x86_64.whl (49.4 MB)

Uploaded: CPython 3.7m

ray-1.4.0rc1-cp37-cp37m-macosx_10_13_intel.whl (50.2 MB)

Uploaded: CPython 3.7m, macOS 10.13+ Intel (x86-64, i386)

ray-1.4.0rc1-cp36-cp36m-win_amd64.whl (15.8 MB)

Uploaded: CPython 3.6m, Windows x86-64

ray-1.4.0rc1-cp36-cp36m-manylinux2014_x86_64.whl (49.4 MB)

Uploaded: CPython 3.6m

ray-1.4.0rc1-cp36-cp36m-macosx_10_13_intel.whl (50.3 MB)

Uploaded: CPython 3.6m, macOS 10.13+ Intel (x86-64, i386)

File details

Details for the file ray-1.4.0rc1-cp38-cp38-win_amd64.whl.

File metadata

  • Download URL: ray-1.4.0rc1-cp38-cp38-win_amd64.whl
  • Upload date:
  • Size: 15.7 MB
  • Tags: CPython 3.8, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.9

File hashes

Hashes for ray-1.4.0rc1-cp38-cp38-win_amd64.whl
Algorithm Hash digest
SHA256 ed3dbad4ed6da77312f428fe7d46e1177daeb5e60a08df801c2587990829432b
MD5 3d718b8701b38b143395dec90c08b720
BLAKE2b-256 b53b0ec2f230f3ede69fcc937b7f9c6d3ac77a28af9bed64a84a762380f5d2ac

See more details on using hashes here.
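If you want pip itself to verify a downloaded wheel against a published digest, pip's hash-checking mode accepts the SHA256 above in a requirements file (a sketch; the filename requirements.txt is arbitrary):

```
ray==1.4.0rc1 \
    --hash=sha256:ed3dbad4ed6da77312f428fe7d46e1177daeb5e60a08df801c2587990829432b
```

Installing with `pip install --require-hashes -r requirements.txt` then refuses any file whose digest does not match. Note that hash-checking mode requires hashes for every dependency in the file as well, not just for ray.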

File details

Details for the file ray-1.4.0rc1-cp38-cp38-manylinux2014_x86_64.whl.

File metadata

  • Download URL: ray-1.4.0rc1-cp38-cp38-manylinux2014_x86_64.whl
  • Upload date:
  • Size: 49.2 MB
  • Tags: CPython 3.8
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.9

File hashes

Hashes for ray-1.4.0rc1-cp38-cp38-manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 094d3e38b31ee0607ee24cc0d231bf1f12f4d9aef45bd294f9c7444b136fcd60
MD5 211d52bb99324b1423b5a2f39edc2017
BLAKE2b-256 dd8c10ef530e5c748d8075192c60c4a8adcac5554707a674a1ce58d1b5eabe28

See more details on using hashes here.

File details

Details for the file ray-1.4.0rc1-cp38-cp38-macosx_10_13_x86_64.whl.

File metadata

  • Download URL: ray-1.4.0rc1-cp38-cp38-macosx_10_13_x86_64.whl
  • Upload date:
  • Size: 50.1 MB
  • Tags: CPython 3.8, macOS 10.13+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.9

File hashes

Hashes for ray-1.4.0rc1-cp38-cp38-macosx_10_13_x86_64.whl
Algorithm Hash digest
SHA256 dc8bf55b1fc0f43df205ba8384a6ab091f0993878a809741eaf981b2453744dd
MD5 9291e9087e1802a663c54116fa721b52
BLAKE2b-256 18d8160584ce578958a27ddb026925ea3b91626021da3bc4be69532e1855b031

See more details on using hashes here.

File details

Details for the file ray-1.4.0rc1-cp37-cp37m-win_amd64.whl.

File metadata

  • Download URL: ray-1.4.0rc1-cp37-cp37m-win_amd64.whl
  • Upload date:
  • Size: 15.8 MB
  • Tags: CPython 3.7m, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.9

File hashes

Hashes for ray-1.4.0rc1-cp37-cp37m-win_amd64.whl
Algorithm Hash digest
SHA256 a679a45b4d98195a605f41d473bb08d6515ef340d6399fa3fabc020518c055d1
MD5 3ae410c111d165bcdb214b32120318ef
BLAKE2b-256 5f2c2a1c715bda12dda0b7d81aaa462d8e9abf7a2403c51340f24c3c817f1137

See more details on using hashes here.

File details

Details for the file ray-1.4.0rc1-cp37-cp37m-manylinux2014_x86_64.whl.

File metadata

  • Download URL: ray-1.4.0rc1-cp37-cp37m-manylinux2014_x86_64.whl
  • Upload date:
  • Size: 49.4 MB
  • Tags: CPython 3.7m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.9

File hashes

Hashes for ray-1.4.0rc1-cp37-cp37m-manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 9112e2cf48b167d0d6715a1557b677e210ba6ef57bbef11286e188c8d925a30f
MD5 b933f303ff25503ed4e54b4efff2f715
BLAKE2b-256 2ff3ebc1244fb9a211f4f10b19e77dc3356b32b50cce92d4bdc33265a1fb9d5a

See more details on using hashes here.

File details

Details for the file ray-1.4.0rc1-cp37-cp37m-macosx_10_13_intel.whl.

File metadata

  • Download URL: ray-1.4.0rc1-cp37-cp37m-macosx_10_13_intel.whl
  • Upload date:
  • Size: 50.2 MB
  • Tags: CPython 3.7m, macOS 10.13+ Intel (x86-64, i386)
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.9

File hashes

Hashes for ray-1.4.0rc1-cp37-cp37m-macosx_10_13_intel.whl
Algorithm Hash digest
SHA256 7118f58af02726b2115142fe552f64aca62387cec00f01e6b8233755cdce9d9e
MD5 91cca19a80bb57d1b45d12b5962c4d9f
BLAKE2b-256 0a687d123a63959df13b0d2cf04ab3270be46cc3033a97a515131255bac18641

See more details on using hashes here.

File details

Details for the file ray-1.4.0rc1-cp36-cp36m-win_amd64.whl.

File metadata

  • Download URL: ray-1.4.0rc1-cp36-cp36m-win_amd64.whl
  • Upload date:
  • Size: 15.8 MB
  • Tags: CPython 3.6m, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.9

File hashes

Hashes for ray-1.4.0rc1-cp36-cp36m-win_amd64.whl
Algorithm Hash digest
SHA256 a9b9fffff6f095443b9d8a7267e2b62f2b4524ba0f89342c1ce03086ea38fad1
MD5 a567b694859aa36f6053098537235d9d
BLAKE2b-256 b256c112a06c0c79356a2a4be0ce4030f3c322cfe782e94aa524e2310eb9ea95

See more details on using hashes here.

File details

Details for the file ray-1.4.0rc1-cp36-cp36m-manylinux2014_x86_64.whl.

File metadata

  • Download URL: ray-1.4.0rc1-cp36-cp36m-manylinux2014_x86_64.whl
  • Upload date:
  • Size: 49.4 MB
  • Tags: CPython 3.6m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.9

File hashes

Hashes for ray-1.4.0rc1-cp36-cp36m-manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 042f0d6a1839c6b2b73db4bfad50792cd5b766b9fe0c3adac29d2d5bc4b2622f
MD5 3a3e15097edc65e4a576e394f744287c
BLAKE2b-256 a18eed864c02430834641637c4aa6a27934c17fe262cb8df9be7cd0aa67f75f4

See more details on using hashes here.

File details

Details for the file ray-1.4.0rc1-cp36-cp36m-macosx_10_13_intel.whl.

File metadata

  • Download URL: ray-1.4.0rc1-cp36-cp36m-macosx_10_13_intel.whl
  • Upload date:
  • Size: 50.3 MB
  • Tags: CPython 3.6m, macOS 10.13+ Intel (x86-64, i386)
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.9

File hashes

Hashes for ray-1.4.0rc1-cp36-cp36m-macosx_10_13_intel.whl
Algorithm Hash digest
SHA256 4e980354240a46a3cc5cac83074215f8fc662dd5af9d009a35142b0cc811355a
MD5 89581da3b66c66c095d340e963ba5d1c
BLAKE2b-256 dd8c386b84826298cff7022cc9166aaee43c9811afd2eeb9dfe9d3a402a92125

See more details on using hashes here.
