
Lablup Backend.AI Meta-package

Project description

Backend.AI


Backend.AI is a streamlined, container-based computing cluster orchestrator that hosts diverse programming languages and popular computing/ML frameworks, with pluggable heterogeneous accelerator support including CUDA and ROCm. It allocates and isolates the underlying computing resources for multi-tenant computation sessions on-demand or in batches with customizable job schedulers. All its functions are exposed as REST/GraphQL/WebSocket APIs.

Server-side Components

If you want to run a Backend.AI cluster on your own, you need to install and configure the following server-side components. All server-side components are licensed under LGPLv3 to promote non-proprietary open innovation in the open-source community.

There is no obligation to open your service/system code if you just run the server-side components as-is (e.g., run them as daemons or import the components without modification in your code). Please contact us (contact-at-lablup-com) for commercial consulting and more licensing details/options for individual use cases.

For details about server installation and configuration, please visit our documentation.

Manager with API Gateway

The manager routes external API requests from front-end services to individual agents. It also monitors and scales the cluster of multiple agents (from a few tens to hundreds).

  • https://github.com/lablup/backend.ai-manager
    • Package namespaces: ai.backend.gateway and ai.backend.manager
    • Plugin interfaces
      • backendai_scheduler_v10
      • backendai_hook_v10
      • backendai_webapp_v10
      • backendai_monitor_stats_v10
      • backendai_monitor_error_v10
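
These plugin interfaces are typically discovered through setuptools entry points, so a third-party package can advertise an implementation under the matching group name. A minimal sketch, assuming a hypothetical plugin package (the package, module, and class names below are illustrative, not part of Backend.AI):

```ini
# setup.cfg of a hypothetical third-party plugin package
[options.entry_points]
backendai_monitor_stats_v10 =
    my_stats = my_backendai_plugin.stats:StatsMonitor
```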

Agent

The agent manages an individual server instance and launches/destroys Docker containers where REPL daemons (kernels) run. Each agent on a new EC2 instance registers itself to the instance registry via heartbeats.

Server-side common plugins (for both manager and agents)

Kernels

A set of small ZeroMQ-based REPL daemons in various programming languages and configurations.

Jail

A programmable sandbox implemented with ptrace-based system call filtering, written in Go.

Hook

A set of libc overrides for resource control and web-based interactive stdin (paired with agents).

Commons

A collection of utility modules commonly shared throughout Backend.AI projects.

Client-side Components

Client SDK Libraries

We offer client SDKs in popular programming languages. These SDKs are freely available under the MIT License to ease integration with both commercial and non-commercial software products and services.
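
For example, the Python SDK bundles a backend.ai command-line tool. A minimal sketch of running a one-off snippet in a session, assuming the backend.ai-client PyPI package; the endpoint and keys are placeholders, and exact flags may differ between SDK versions:

```console
$ pip install backend.ai-client
$ export BACKEND_AI_ENDPOINT=https://api.example.com
$ export BACKEND_AI_ACCESS_KEY=...
$ export BACKEND_AI_SECRET_KEY=...
$ backend.ai run python -c 'print("hello")'
```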

Media

The front-end support libraries to handle multimedia outputs (e.g., SVG plots, animated vector graphics).

  • The Python package (lablup) is installed inside kernel containers.
  • To interpret and display media generated by the Python package, you need to load the JavaScript part in the front-end.
  • https://github.com/lablup/backend.ai-media

Interacting with computation sessions

Backend.AI provides WebSocket tunneling into individual computation sessions (containers), so that users can securely access in-container applications directly from their browsers or the client CLI.

  • Jupyter Kernel: data scientists' favorite tool
    • Most container sessions have intrinsic Jupyter and JupyterLab support.
  • Web-based terminal
    • All container sessions have intrinsic ttyd support.
  • SSH
    • All container sessions have intrinsic SSH/SFTP/SCP support with an auto-generated per-user SSH keypair. PyCharm and other IDEs can attach to on-demand sessions via SSH remote interpreters.
  • VSCode (coming soon)
    • Most container sessions have intrinsic web-based VSCode support.
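
Assuming the client CLI from the SDK above is installed and configured, the tunneling features described here might be driven as follows (the session name, app name, and port are placeholders; subcommands may vary by version):

```console
$ backend.ai app mysession jupyter -b 8080   # proxy the in-container Jupyter to localhost:8080
$ backend.ai ssh mysession                   # open an SSH shell into the session container
```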

Integrations with IDEs and Editors

Storage management

Backend.AI provides an abstraction layer on top of existing network-based storages (e.g., NFS/SMB), called vfolders (virtual folders). Each vfolder works like cloud storage that can be mounted into any computation session and shared between users and user groups with differentiated privileges.
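
As a sketch of the vfolder workflow using the client CLI (the folder and file names are placeholders, and the in-container mount path is an assumption; exact subcommands may vary by version):

```console
$ backend.ai vfolder create mydata
$ backend.ai vfolder upload mydata ./dataset.csv
$ backend.ai run -m mydata python -c 'print(open("mydata/dataset.csv").readline())'
```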

License

Refer to LICENSE file.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

backend.ai-20.9.0a1.dev0.tar.gz (6.4 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

backend.ai-20.9.0a1.dev0-py3-none-any.whl (7.4 kB)


File details

Details for the file backend.ai-20.9.0a1.dev0.tar.gz.

File metadata

  • Download URL: backend.ai-20.9.0a1.dev0.tar.gz
  • Upload date:
  • Size: 6.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.8.0

File hashes

Hashes for backend.ai-20.9.0a1.dev0.tar.gz

  • SHA256: 0a2af7da47d96916c3e97fb469da68f3845eb34cd1ba42a6f211087e00d8e8f2
  • MD5: 14be621f5dc575eab99c6df0f3b490d6
  • BLAKE2b-256: 48fb6b14382a3204d67713eea90e19896455136ade66c27ae67c6d907ec3f375

See more details on using hashes here.
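
To check a downloaded distribution against the published digests, you can compute the SHA-256 hash locally; a minimal sketch in Python using only the standard library:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the digest published above, e.g.
# sha256_of("backend.ai-20.9.0a1.dev0.tar.gz") should equal
# "0a2af7da47d96916c3e97fb469da68f3845eb34cd1ba42a6f211087e00d8e8f2"
```

pip can also verify hashes automatically when they are pinned in a requirements file (e.g., backend.ai==20.9.0a1.dev0 --hash=sha256:...).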

File details

Details for the file backend.ai-20.9.0a1.dev0-py3-none-any.whl.

File metadata

  • Download URL: backend.ai-20.9.0a1.dev0-py3-none-any.whl
  • Upload date:
  • Size: 7.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.8.0

File hashes

Hashes for backend.ai-20.9.0a1.dev0-py3-none-any.whl

  • SHA256: 68d2aae09b9167373469c7451e97d22eafb67fa1885e811b9781814f10508a84
  • MD5: a4e54260b3484108bcf49a38d54e1860
  • BLAKE2b-256: 6a6e406ca58a9536e237940bab38bb124dcbc3689b010605220d6f792f8594ca

See more details on using hashes here.
