
Databricks API client auto-generated from the official databricks-cli package

Project description


[This documentation is auto-generated]

This package provides a simplified interface to the Databricks REST API. The interface is auto-generated on instantiation using the underlying client library from the official databricks-cli Python package.

Install using

pip install databricks-api

The docs here describe the interface for version 0.10.0 of the databricks-cli package for API version 2.0. Provided no new major or minor version of databricks-cli changes the package structure, this package should continue to work without requiring an update.

The databricks-api package contains a DatabricksAPI class which provides instance attributes for the databricks-cli ApiClient, as well as each of the available service instances. The attributes of a DatabricksAPI instance are:

  • DatabricksAPI.client <databricks_cli.sdk.api_client.ApiClient>

  • DatabricksAPI.jobs <databricks_cli.sdk.service.JobsService>

  • DatabricksAPI.cluster <databricks_cli.sdk.service.ClusterService>

  • DatabricksAPI.managed_library <databricks_cli.sdk.service.ManagedLibraryService>

  • DatabricksAPI.dbfs <databricks_cli.sdk.service.DbfsService>

  • DatabricksAPI.workspace <databricks_cli.sdk.service.WorkspaceService>

  • DatabricksAPI.secret <databricks_cli.sdk.service.SecretService>

  • DatabricksAPI.groups <databricks_cli.sdk.service.GroupsService>

  • DatabricksAPI.instance_pool <databricks_cli.sdk.service.InstancePoolService>

To instantiate the client, provide the Databricks host and either a token or a user and password. The full signature of the underlying ApiClient.__init__ is also shown below.

from databricks_api import DatabricksAPI

# Provide a host and token
db = DatabricksAPI(
    host="example.cloud.databricks.com",
    token="dpapi123..."
)

# OR a host and user and password
db = DatabricksAPI(
    host="example.cloud.databricks.com",
    user="me@example.com",
    password="password"
)

# Full __init__ signature
db = DatabricksAPI(
    user=None,
    password=None,
    host=None,
    token=None,
    apiVersion=2.0,
    default_headers={},
    verify=True,
    command_name=''
)
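
For example, the credentials can be read from environment variables instead of being hard-coded. This is only a sketch; the variable names below are illustrative and are not read automatically by this package:

import os

from databricks_api import DatabricksAPI

# Illustrative environment variable names; use whatever your deployment provides.
db = DatabricksAPI(
    host=os.environ["DATABRICKS_HOST"],
    token=os.environ["DATABRICKS_TOKEN"],
)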

Refer to the official Databricks API documentation for the functionality and required arguments of each method below.

Each of the service instance attributes provides the following public methods:

DatabricksAPI.jobs

DatabricksAPI.jobs.cancel_run(
    run_id,
    headers=None,
)

DatabricksAPI.jobs.create_job(
    name=None,
    existing_cluster_id=None,
    new_cluster=None,
    libraries=None,
    email_notifications=None,
    timeout_seconds=None,
    max_retries=None,
    min_retry_interval_millis=None,
    retry_on_timeout=None,
    schedule=None,
    notebook_task=None,
    spark_jar_task=None,
    spark_python_task=None,
    spark_submit_task=None,
    max_concurrent_runs=None,
    headers=None,
)

DatabricksAPI.jobs.delete_job(
    job_id,
    headers=None,
)

DatabricksAPI.jobs.delete_run(
    run_id=None,
    headers=None,
)

DatabricksAPI.jobs.export_run(
    run_id,
    views_to_export=None,
    headers=None,
)

DatabricksAPI.jobs.get_job(
    job_id,
    headers=None,
)

DatabricksAPI.jobs.get_run(
    run_id=None,
    headers=None,
)

DatabricksAPI.jobs.get_run_output(
    run_id,
    headers=None,
)

DatabricksAPI.jobs.list_jobs(headers=None)

DatabricksAPI.jobs.list_runs(
    job_id=None,
    active_only=None,
    completed_only=None,
    offset=None,
    limit=None,
    headers=None,
)

DatabricksAPI.jobs.reset_job(
    job_id,
    new_settings,
    headers=None,
)

DatabricksAPI.jobs.run_now(
    job_id=None,
    jar_params=None,
    notebook_params=None,
    python_params=None,
    spark_submit_params=None,
    headers=None,
)

DatabricksAPI.jobs.submit_run(
    run_name=None,
    existing_cluster_id=None,
    new_cluster=None,
    libraries=None,
    notebook_task=None,
    spark_jar_task=None,
    spark_python_task=None,
    spark_submit_task=None,
    timeout_seconds=None,
    headers=None,
)
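
For example, a minimal sketch of triggering an existing job and checking the state of the resulting run. The job ID and notebook parameters are placeholders, and the response fields assume the standard Jobs API 2.0 response shape:

# Trigger a run of an existing job (job_id and params are placeholders).
run = db.jobs.run_now(
    job_id=42,
    notebook_params={"run_date": "2020-01-01"},
)

# The run-now response is expected to contain the new run's ID.
status = db.jobs.get_run(run_id=run["run_id"])
print(status["state"])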

DatabricksAPI.cluster

DatabricksAPI.cluster.create_cluster(
    num_workers=None,
    autoscale=None,
    cluster_name=None,
    spark_version=None,
    spark_conf=None,
    aws_attributes=None,
    node_type_id=None,
    driver_node_type_id=None,
    ssh_public_keys=None,
    custom_tags=None,
    cluster_log_conf=None,
    init_scripts=None,
    spark_env_vars=None,
    autotermination_minutes=None,
    enable_elastic_disk=None,
    cluster_source=None,
    instance_pool_id=None,
    headers=None,
)

DatabricksAPI.cluster.delete_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.edit_cluster(
    cluster_id,
    num_workers=None,
    autoscale=None,
    cluster_name=None,
    spark_version=None,
    spark_conf=None,
    aws_attributes=None,
    node_type_id=None,
    driver_node_type_id=None,
    ssh_public_keys=None,
    custom_tags=None,
    cluster_log_conf=None,
    init_scripts=None,
    spark_env_vars=None,
    autotermination_minutes=None,
    enable_elastic_disk=None,
    cluster_source=None,
    instance_pool_id=None,
    headers=None,
)

DatabricksAPI.cluster.get_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.get_events(
    cluster_id,
    start_time=None,
    end_time=None,
    order=None,
    event_types=None,
    offset=None,
    limit=None,
    headers=None,
)

DatabricksAPI.cluster.list_available_zones(headers=None)

DatabricksAPI.cluster.list_clusters(headers=None)

DatabricksAPI.cluster.list_node_types(headers=None)

DatabricksAPI.cluster.list_spark_versions(headers=None)

DatabricksAPI.cluster.permanent_delete_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.pin_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.resize_cluster(
    cluster_id,
    num_workers=None,
    autoscale=None,
    headers=None,
)

DatabricksAPI.cluster.restart_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.start_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.unpin_cluster(
    cluster_id,
    headers=None,
)
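
As a sketch, here is creating a small auto-terminating cluster and then deleting it. All field values (cluster name, Spark version, node type) are placeholders; use list_spark_versions and list_node_types to find valid values for your workspace:

# Create a cluster (all values below are placeholders).
cluster = db.cluster.create_cluster(
    cluster_name="example-cluster",
    spark_version="5.5.x-scala2.11",
    node_type_id="i3.xlarge",
    num_workers=2,
    autotermination_minutes=30,
)

# The create response is expected to contain the new cluster's ID.
db.cluster.delete_cluster(cluster_id=cluster["cluster_id"])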

DatabricksAPI.managed_library

DatabricksAPI.managed_library.all_cluster_statuses(headers=None)

DatabricksAPI.managed_library.cluster_status(
    cluster_id,
    headers=None,
)

DatabricksAPI.managed_library.install_libraries(
    cluster_id,
    libraries=None,
    headers=None,
)

DatabricksAPI.managed_library.uninstall_libraries(
    cluster_id,
    libraries=None,
    headers=None,
)
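
For example, a sketch of installing a PyPI library on a running cluster and checking its status. The cluster ID and package name are placeholders; the library specification follows the Databricks Libraries API format:

# Install a PyPI package on a cluster (ID and package name are placeholders).
db.managed_library.install_libraries(
    cluster_id="1234-567890-abcde123",
    libraries=[{"pypi": {"package": "requests"}}],
)
print(db.managed_library.cluster_status(cluster_id="1234-567890-abcde123"))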

DatabricksAPI.dbfs

DatabricksAPI.dbfs.add_block(
    handle,
    data,
    headers=None,
)

DatabricksAPI.dbfs.close(
    handle,
    headers=None,
)

DatabricksAPI.dbfs.create(
    path,
    overwrite=None,
    headers=None,
)

DatabricksAPI.dbfs.delete(
    path,
    recursive=None,
    headers=None,
)

DatabricksAPI.dbfs.get_status(
    path,
    headers=None,
)

DatabricksAPI.dbfs.list(
    path,
    headers=None,
)

DatabricksAPI.dbfs.mkdirs(
    path,
    headers=None,
)

DatabricksAPI.dbfs.move(
    source_path,
    destination_path,
    headers=None,
)

DatabricksAPI.dbfs.put(
    path,
    contents=None,
    overwrite=None,
    headers=None,
)

DatabricksAPI.dbfs.read(
    path,
    offset=None,
    length=None,
    headers=None,
)
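
For example, a sketch of writing a small file to DBFS and listing its directory. The path and contents are placeholders; the DBFS API expects the contents field to be base64-encoded:

import base64

# Write a small file to DBFS (path and data are placeholders).
db.dbfs.put(
    path="/tmp/example.txt",
    contents=base64.b64encode(b"hello world").decode("utf-8"),
    overwrite=True,
)
print(db.dbfs.list(path="/tmp"))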

DatabricksAPI.workspace

DatabricksAPI.workspace.delete(
    path,
    recursive=None,
    headers=None,
)

DatabricksAPI.workspace.export_workspace(
    path,
    format=None,
    direct_download=None,
    headers=None,
)

DatabricksAPI.workspace.get_status(
    path,
    headers=None,
)

DatabricksAPI.workspace.import_workspace(
    path,
    format=None,
    language=None,
    content=None,
    overwrite=None,
    headers=None,
)

DatabricksAPI.workspace.list(
    path,
    headers=None,
)

DatabricksAPI.workspace.mkdirs(
    path,
    headers=None,
)
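
For example, a sketch of importing a Python notebook source file into the workspace. The path and content are placeholders; as with DBFS, the content is base64-encoded:

import base64

# Import a notebook from source (path and content are placeholders).
db.workspace.import_workspace(
    path="/Users/me@example.com/example",
    format="SOURCE",
    language="PYTHON",
    content=base64.b64encode(b"print('hello')").decode("utf-8"),
    overwrite=True,
)
print(db.workspace.list(path="/Users/me@example.com"))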

DatabricksAPI.secret

DatabricksAPI.secret.create_scope(
    scope,
    initial_manage_principal=None,
    scope_backend_type=None,
    headers=None,
)

DatabricksAPI.secret.delete_acl(
    scope,
    principal,
    headers=None,
)

DatabricksAPI.secret.delete_scope(
    scope,
    headers=None,
)

DatabricksAPI.secret.delete_secret(
    scope,
    key,
    headers=None,
)

DatabricksAPI.secret.get_acl(
    scope,
    principal,
    headers=None,
)

DatabricksAPI.secret.list_acls(
    scope,
    headers=None,
)

DatabricksAPI.secret.list_scopes(headers=None)

DatabricksAPI.secret.list_secrets(
    scope,
    headers=None,
)

DatabricksAPI.secret.put_acl(
    scope,
    principal,
    permission,
    headers=None,
)

DatabricksAPI.secret.put_secret(
    scope,
    key,
    string_value=None,
    bytes_value=None,
    headers=None,
)
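
For example, a sketch of creating a secret scope and storing a secret in it. The scope name, key, and value are placeholders:

# Create a scope and store a secret (names and values are placeholders).
db.secret.create_scope(
    scope="example-scope",
    initial_manage_principal="users",
)
db.secret.put_secret(
    scope="example-scope",
    key="example-key",
    string_value="s3cr3t",
)
print(db.secret.list_secrets(scope="example-scope"))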

DatabricksAPI.groups

DatabricksAPI.groups.add_to_group(
    parent_name,
    user_name=None,
    group_name=None,
    headers=None,
)

DatabricksAPI.groups.create_group(
    group_name,
    headers=None,
)

DatabricksAPI.groups.get_group_members(
    group_name,
    headers=None,
)

DatabricksAPI.groups.get_groups(headers=None)

DatabricksAPI.groups.get_groups_for_principal(
    user_name=None,
    group_name=None,
    headers=None,
)

DatabricksAPI.groups.remove_from_group(
    parent_name,
    user_name=None,
    group_name=None,
    headers=None,
)

DatabricksAPI.groups.remove_group(
    group_name,
    headers=None,
)
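
For example, a sketch of creating a group and adding a user to it. The group and user names are placeholders:

# Create a group and add a member (names are placeholders).
db.groups.create_group(group_name="data-engineers")
db.groups.add_to_group(
    parent_name="data-engineers",
    user_name="me@example.com",
)
print(db.groups.get_group_members(group_name="data-engineers"))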

DatabricksAPI.instance_pool

DatabricksAPI.instance_pool.create_instance_pool(
    instance_pool_name=None,
    min_idle_instances=None,
    max_capacity=None,
    aws_attributes=None,
    node_type_id=None,
    custom_tags=None,
    idle_instance_autotermination_minutes=None,
    enable_elastic_disk=None,
    disk_spec=None,
    preloaded_spark_versions=None,
    headers=None,
)

DatabricksAPI.instance_pool.delete_instance_pool(
    instance_pool_id=None,
    headers=None,
)

DatabricksAPI.instance_pool.edit_instance_pool(
    instance_pool_id,
    instance_pool_name=None,
    min_idle_instances=None,
    max_capacity=None,
    aws_attributes=None,
    node_type_id=None,
    custom_tags=None,
    idle_instance_autotermination_minutes=None,
    enable_elastic_disk=None,
    disk_spec=None,
    preloaded_spark_versions=None,
    headers=None,
)

DatabricksAPI.instance_pool.get_instance_pool(
    instance_pool_id=None,
    headers=None,
)

DatabricksAPI.instance_pool.list_instance_pools(headers=None)
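
For example, a sketch of creating an instance pool and launching a cluster backed by it. All values are placeholders, and the create response is assumed to contain the new pool's ID per the Instance Pools API:

# Create an instance pool (all values are placeholders).
pool = db.instance_pool.create_instance_pool(
    instance_pool_name="example-pool",
    node_type_id="i3.xlarge",
    min_idle_instances=1,
    max_capacity=10,
)

# Launch a cluster that draws its nodes from the pool.
db.cluster.create_cluster(
    cluster_name="pooled-cluster",
    spark_version="5.5.x-scala2.11",
    instance_pool_id=pool["instance_pool_id"],
    num_workers=2,
)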

Download files


Source Distribution

databricks_api-0.4.0.tar.gz (7.4 kB)

Uploaded Source

Built Distribution


databricks_api-0.4.0-py2.py3-none-any.whl (5.8 kB)

Uploaded Python 2, Python 3

File details

Details for the file databricks_api-0.4.0.tar.gz.

File metadata

  • Download URL: databricks_api-0.4.0.tar.gz
  • Upload date:
  • Size: 7.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.0.5 CPython/3.7.4 Darwin/17.7.0

File hashes

Hashes for databricks_api-0.4.0.tar.gz

  • SHA256: c7b530fc2788bad2298e57a7e6e8ca2fb0fb5f327f59e5af224019e0eaf141d6

  • MD5: 0fd79cc6981f030f265ae0d7b3d1de33

  • BLAKE2b-256: edc2c6f73b0e05fb84e8dd941485efcc4f64bae9eb696eb7c23cbb3bd307f555


File details

Details for the file databricks_api-0.4.0-py2.py3-none-any.whl.

File metadata

  • Download URL: databricks_api-0.4.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 5.8 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.0.5 CPython/3.7.4 Darwin/17.7.0

File hashes

Hashes for databricks_api-0.4.0-py2.py3-none-any.whl

  • SHA256: 7a0df5a6ab535b03e18705d3aab7ee78e1628a5f55f22ba161c55869f406ee87

  • MD5: 85fa726e29371c9ddcb09ae48ea1e3d9

  • BLAKE2b-256: cba50d543d465e8457ab5b2ef4a250d174ae9b18f386a1bf3cb075808546acc1

