Databricks API client auto-generated from the official databricks-cli package

Project description

[This documentation is auto-generated]

This package provides a simplified interface to the Databricks REST API. The interface is auto-generated on instantiation using the underlying client library from the official databricks-cli Python package.

Install using pip:

pip install databricks-api

The docs here describe the interface for version 0.12.0 of the databricks-cli package, targeting API version 2.0. Assuming the databricks-cli package structure sees no new major or minor version changes, this package should continue to work without requiring an update.

The databricks-api package contains a DatabricksAPI class which provides instance attributes for the databricks-cli ApiClient, as well as each of the available service instances. The attributes of a DatabricksAPI instance are:

  • DatabricksAPI.client <databricks_cli.sdk.api_client.ApiClient>

  • DatabricksAPI.jobs <databricks_cli.sdk.service.JobsService>

  • DatabricksAPI.cluster <databricks_cli.sdk.service.ClusterService>

  • DatabricksAPI.policy <databricks_cli.sdk.service.PolicyService>

  • DatabricksAPI.managed_library <databricks_cli.sdk.service.ManagedLibraryService>

  • DatabricksAPI.dbfs <databricks_cli.sdk.service.DbfsService>

  • DatabricksAPI.workspace <databricks_cli.sdk.service.WorkspaceService>

  • DatabricksAPI.secret <databricks_cli.sdk.service.SecretService>

  • DatabricksAPI.groups <databricks_cli.sdk.service.GroupsService>

  • DatabricksAPI.token <databricks_cli.sdk.service.TokenService>

  • DatabricksAPI.instance_pool <databricks_cli.sdk.service.InstancePoolService>

  • DatabricksAPI.delta_pipelines <databricks_cli.sdk.service.DeltaPipelinesService>

To instantiate the client, provide the Databricks host and either a token or a user and password. The full signature of the underlying ApiClient.__init__ is also shown below.

from databricks_api import DatabricksAPI

# Provide a host and token
db = DatabricksAPI(
    host="example.cloud.databricks.com",
    token="dpapi123..."
)

# OR a host and user and password
db = DatabricksAPI(
    host="example.cloud.databricks.com",
    user="me@example.com",
    password="password"
)

# Full __init__ signature
db = DatabricksAPI(
    user=None,
    password=None,
    host=None,
    token=None,
    apiVersion=2.0,
    default_headers={},
    verify=True,
    command_name=''
)
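
Once the client is instantiated, API calls are made through the service attributes and return the parsed JSON response as a dict. A minimal sketch, assuming the REST API 2.0 list_clusters response shape:

# List clusters in the workspace; the response dict follows the
# REST API 2.0 payload, with a "clusters" key when any exist
clusters = db.cluster.list_clusters()
for cluster in clusters.get("clusters", []):
    print(cluster["cluster_id"], cluster["state"])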

Refer to the official documentation for the functionality and required arguments of each method listed below.

Each of the service instance attributes provides the following public methods:

DatabricksAPI.jobs

DatabricksAPI.jobs.cancel_run(
    run_id,
    headers=None,
)

DatabricksAPI.jobs.create_job(
    name=None,
    existing_cluster_id=None,
    new_cluster=None,
    libraries=None,
    email_notifications=None,
    timeout_seconds=None,
    max_retries=None,
    min_retry_interval_millis=None,
    retry_on_timeout=None,
    schedule=None,
    notebook_task=None,
    spark_jar_task=None,
    spark_python_task=None,
    spark_submit_task=None,
    max_concurrent_runs=None,
    headers=None,
)

DatabricksAPI.jobs.delete_job(
    job_id,
    headers=None,
)

DatabricksAPI.jobs.delete_run(
    run_id=None,
    headers=None,
)

DatabricksAPI.jobs.export_run(
    run_id,
    views_to_export=None,
    headers=None,
)

DatabricksAPI.jobs.get_job(
    job_id,
    headers=None,
)

DatabricksAPI.jobs.get_run(
    run_id=None,
    headers=None,
)

DatabricksAPI.jobs.get_run_output(
    run_id,
    headers=None,
)

DatabricksAPI.jobs.list_jobs(headers=None)

DatabricksAPI.jobs.list_runs(
    job_id=None,
    active_only=None,
    completed_only=None,
    offset=None,
    limit=None,
    headers=None,
)

DatabricksAPI.jobs.reset_job(
    job_id,
    new_settings,
    headers=None,
)

DatabricksAPI.jobs.run_now(
    job_id=None,
    jar_params=None,
    notebook_params=None,
    python_params=None,
    spark_submit_params=None,
    headers=None,
)

DatabricksAPI.jobs.submit_run(
    run_name=None,
    existing_cluster_id=None,
    new_cluster=None,
    libraries=None,
    notebook_task=None,
    spark_jar_task=None,
    spark_python_task=None,
    spark_submit_task=None,
    timeout_seconds=None,
    headers=None,
)
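
For example, a minimal sketch of creating a notebook job and triggering a run. The notebook path and cluster spec are illustrative; the nested dicts follow the REST API 2.0 payload shapes:

# Create a job that runs a notebook on a fresh cluster per run
job = db.jobs.create_job(
    name="nightly-etl",
    new_cluster={
        "spark_version": "6.3.x-scala2.11",
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
    },
    notebook_task={"notebook_path": "/Users/me@example.com/etl"},
    timeout_seconds=3600,
    max_retries=1,
)

# create_job returns the new job's ID, which run_now accepts
run = db.jobs.run_now(
    job_id=job["job_id"],
    notebook_params={"run_date": "2020-01-01"},
)
print(run["run_id"])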

DatabricksAPI.cluster

DatabricksAPI.cluster.create_cluster(
    num_workers=None,
    autoscale=None,
    cluster_name=None,
    spark_version=None,
    spark_conf=None,
    aws_attributes=None,
    node_type_id=None,
    driver_node_type_id=None,
    ssh_public_keys=None,
    custom_tags=None,
    cluster_log_conf=None,
    init_scripts=None,
    spark_env_vars=None,
    autotermination_minutes=None,
    enable_elastic_disk=None,
    cluster_source=None,
    instance_pool_id=None,
    headers=None,
)

DatabricksAPI.cluster.delete_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.edit_cluster(
    cluster_id,
    num_workers=None,
    autoscale=None,
    cluster_name=None,
    spark_version=None,
    spark_conf=None,
    aws_attributes=None,
    node_type_id=None,
    driver_node_type_id=None,
    ssh_public_keys=None,
    custom_tags=None,
    cluster_log_conf=None,
    init_scripts=None,
    spark_env_vars=None,
    autotermination_minutes=None,
    enable_elastic_disk=None,
    cluster_source=None,
    instance_pool_id=None,
    headers=None,
)

DatabricksAPI.cluster.get_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.get_events(
    cluster_id,
    start_time=None,
    end_time=None,
    order=None,
    event_types=None,
    offset=None,
    limit=None,
    headers=None,
)

DatabricksAPI.cluster.list_available_zones(headers=None)

DatabricksAPI.cluster.list_clusters(headers=None)

DatabricksAPI.cluster.list_node_types(headers=None)

DatabricksAPI.cluster.list_spark_versions(headers=None)

DatabricksAPI.cluster.permanent_delete_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.pin_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.resize_cluster(
    cluster_id,
    num_workers=None,
    autoscale=None,
    headers=None,
)

DatabricksAPI.cluster.restart_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.start_cluster(
    cluster_id,
    headers=None,
)

DatabricksAPI.cluster.unpin_cluster(
    cluster_id,
    headers=None,
)
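
A minimal sketch of creating and managing a cluster; the Spark version and node type are illustrative:

cluster = db.cluster.create_cluster(
    cluster_name="dev-cluster",
    spark_version="6.3.x-scala2.11",
    node_type_id="i3.xlarge",
    num_workers=2,
    autotermination_minutes=60,
)
cluster_id = cluster["cluster_id"]

# Inspect the cluster state, then resize it
print(db.cluster.get_cluster(cluster_id)["state"])
db.cluster.resize_cluster(cluster_id, num_workers=4)

# delete_cluster terminates the cluster but keeps its configuration;
# permanent_delete_cluster removes it entirely
db.cluster.delete_cluster(cluster_id)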

DatabricksAPI.policy

DatabricksAPI.policy.create_policy(
    policy_name,
    definition,
    headers=None,
)

DatabricksAPI.policy.delete_policy(
    policy_id,
    headers=None,
)

DatabricksAPI.policy.edit_policy(
    policy_id,
    policy_name,
    definition,
    headers=None,
)

DatabricksAPI.policy.get_policy(
    policy_id,
    headers=None,
)

DatabricksAPI.policy.list_policies(headers=None)
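
Policy definitions are JSON documents passed as strings. A sketch, assuming the cluster policy definition schema from the Databricks docs:

import json

# Pin the Spark version via a fixed policy rule (illustrative)
definition = json.dumps({
    "spark_version": {"type": "fixed", "value": "6.3.x-scala2.11"},
})
policy = db.policy.create_policy(
    policy_name="pinned-spark-version",
    definition=definition,
)
print(db.policy.get_policy(policy["policy_id"]))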

DatabricksAPI.managed_library

DatabricksAPI.managed_library.all_cluster_statuses(headers=None)

DatabricksAPI.managed_library.cluster_status(
    cluster_id,
    headers=None,
)

DatabricksAPI.managed_library.install_libraries(
    cluster_id,
    libraries=None,
    headers=None,
)

DatabricksAPI.managed_library.uninstall_libraries(
    cluster_id,
    libraries=None,
    headers=None,
)
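
The libraries argument takes a list of library specs in the REST API 2.0 format. A sketch with an illustrative cluster ID:

# Install a PyPI package and a JAR on an existing cluster
db.managed_library.install_libraries(
    cluster_id="1234-567890-abcde123",
    libraries=[
        {"pypi": {"package": "simplejson==3.17.0"}},
        {"jar": "dbfs:/mnt/libraries/library.jar"},
    ],
)

# Check per-library installation status on that cluster
print(db.managed_library.cluster_status("1234-567890-abcde123"))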

DatabricksAPI.dbfs

DatabricksAPI.dbfs.add_block(
    handle,
    data,
    headers=None,
)

DatabricksAPI.dbfs.close(
    handle,
    headers=None,
)

DatabricksAPI.dbfs.create(
    path,
    overwrite=None,
    headers=None,
)

DatabricksAPI.dbfs.delete(
    path,
    recursive=None,
    headers=None,
)

DatabricksAPI.dbfs.get_status(
    path,
    headers=None,
)

DatabricksAPI.dbfs.list(
    path,
    headers=None,
)

DatabricksAPI.dbfs.mkdirs(
    path,
    headers=None,
)

DatabricksAPI.dbfs.move(
    source_path,
    destination_path,
    headers=None,
)

DatabricksAPI.dbfs.put(
    path,
    contents=None,
    overwrite=None,
    headers=None,
)

DatabricksAPI.dbfs.read(
    path,
    offset=None,
    length=None,
    headers=None,
)
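
The DBFS API transfers file contents as base64-encoded strings. A sketch of the streaming upload pattern (create, add_block, close) with an illustrative path:

import base64

# Open a streaming upload handle, write a block, then close it.
# Blocks must be base64-encoded strings per the REST API 2.0 docs.
handle = db.dbfs.create("/tmp/example.txt", overwrite=True)["handle"]
block = base64.b64encode(b"hello from databricks-api\n").decode("utf-8")
db.dbfs.add_block(handle, block)
db.dbfs.close(handle)

# For small files, put() uploads contents in a single call
print(db.dbfs.get_status("/tmp/example.txt"))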

DatabricksAPI.workspace

DatabricksAPI.workspace.delete(
    path,
    recursive=None,
    headers=None,
)

DatabricksAPI.workspace.export_workspace(
    path,
    format=None,
    direct_download=None,
    headers=None,
)

DatabricksAPI.workspace.get_status(
    path,
    headers=None,
)

DatabricksAPI.workspace.import_workspace(
    path,
    format=None,
    language=None,
    content=None,
    overwrite=None,
    headers=None,
)

DatabricksAPI.workspace.list(
    path,
    headers=None,
)

DatabricksAPI.workspace.mkdirs(
    path,
    headers=None,
)
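
import_workspace likewise expects base64-encoded content. A sketch importing a Python source file; the path is illustrative, and SOURCE/PYTHON are format/language values from the Workspace REST API 2.0:

import base64

source = b"print('hello')\n"
db.workspace.import_workspace(
    path="/Users/me@example.com/hello",
    format="SOURCE",
    language="PYTHON",
    content=base64.b64encode(source).decode("utf-8"),
    overwrite=True,
)

# export_workspace returns the content base64-encoded as well
exported = db.workspace.export_workspace(
    "/Users/me@example.com/hello",
    format="SOURCE",
)
print(base64.b64decode(exported["content"]))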

DatabricksAPI.secret

DatabricksAPI.secret.create_scope(
    scope,
    initial_manage_principal=None,
    scope_backend_type=None,
    headers=None,
)

DatabricksAPI.secret.delete_acl(
    scope,
    principal,
    headers=None,
)

DatabricksAPI.secret.delete_scope(
    scope,
    headers=None,
)

DatabricksAPI.secret.delete_secret(
    scope,
    key,
    headers=None,
)

DatabricksAPI.secret.get_acl(
    scope,
    principal,
    headers=None,
)

DatabricksAPI.secret.list_acls(
    scope,
    headers=None,
)

DatabricksAPI.secret.list_scopes(headers=None)

DatabricksAPI.secret.list_secrets(
    scope,
    headers=None,
)

DatabricksAPI.secret.put_acl(
    scope,
    principal,
    permission,
    headers=None,
)

DatabricksAPI.secret.put_secret(
    scope,
    key,
    string_value=None,
    bytes_value=None,
    headers=None,
)
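
A sketch of creating a scope, storing a secret, and granting read access; the scope, key, principal, and values are placeholders:

db.secret.create_scope("my-scope", initial_manage_principal="users")
db.secret.put_secret("my-scope", "db-password", string_value="s3cret")
db.secret.put_acl("my-scope", "data-team", permission="READ")

# list_secrets returns key metadata only, never the secret values
print(db.secret.list_secrets("my-scope"))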

DatabricksAPI.groups

DatabricksAPI.groups.add_to_group(
    parent_name,
    user_name=None,
    group_name=None,
    headers=None,
)

DatabricksAPI.groups.create_group(
    group_name,
    headers=None,
)

DatabricksAPI.groups.get_group_members(
    group_name,
    headers=None,
)

DatabricksAPI.groups.get_groups(headers=None)

DatabricksAPI.groups.get_groups_for_principal(
    user_name=None,
    group_name=None,
    headers=None,
)

DatabricksAPI.groups.remove_from_group(
    parent_name,
    user_name=None,
    group_name=None,
    headers=None,
)

DatabricksAPI.groups.remove_group(
    group_name,
    headers=None,
)
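
Group membership calls take either a user_name or a group_name, so groups can be nested. A sketch with placeholder names:

# Create a group, then add a user and a nested group to it
db.groups.create_group("data-team")
db.groups.add_to_group("data-team", user_name="me@example.com")
db.groups.add_to_group("data-team", group_name="interns")

print(db.groups.get_group_members("data-team"))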

DatabricksAPI.token

DatabricksAPI.token.create_token(
    lifetime_seconds=None,
    comment=None,
    headers=None,
)

DatabricksAPI.token.list_tokens(headers=None)

DatabricksAPI.token.revoke_token(
    token_id,
    headers=None,
)
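
A sketch of creating and revoking a token; the lifetime and comment are illustrative, and the response shape follows the Token REST API 2.0:

# Create a token valid for one day; the response includes the
# token value and its metadata
resp = db.token.create_token(
    lifetime_seconds=86400,
    comment="temporary automation token",
)
print(resp["token_value"])

# Revoke it by ID when no longer needed
db.token.revoke_token(resp["token_info"]["token_id"])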

DatabricksAPI.instance_pool

DatabricksAPI.instance_pool.create_instance_pool(
    instance_pool_name=None,
    min_idle_instances=None,
    max_capacity=None,
    aws_attributes=None,
    node_type_id=None,
    custom_tags=None,
    idle_instance_autotermination_minutes=None,
    enable_elastic_disk=None,
    disk_spec=None,
    preloaded_spark_versions=None,
    headers=None,
)

DatabricksAPI.instance_pool.delete_instance_pool(
    instance_pool_id=None,
    headers=None,
)

DatabricksAPI.instance_pool.edit_instance_pool(
    instance_pool_id,
    instance_pool_name=None,
    min_idle_instances=None,
    max_capacity=None,
    aws_attributes=None,
    node_type_id=None,
    custom_tags=None,
    idle_instance_autotermination_minutes=None,
    enable_elastic_disk=None,
    disk_spec=None,
    preloaded_spark_versions=None,
    headers=None,
)

DatabricksAPI.instance_pool.get_instance_pool(
    instance_pool_id=None,
    headers=None,
)

DatabricksAPI.instance_pool.list_instance_pools(headers=None)
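
A sketch of creating an instance pool; the node type and sizing are placeholders:

pool = db.instance_pool.create_instance_pool(
    instance_pool_name="dev-pool",
    node_type_id="i3.xlarge",
    min_idle_instances=1,
    max_capacity=10,
    idle_instance_autotermination_minutes=30,
)

# Clusters can reference the pool via the instance_pool_id
# argument of create_cluster
print(pool["instance_pool_id"])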

DatabricksAPI.delta_pipelines

DatabricksAPI.delta_pipelines.create(
    id=None,
    name=None,
    storage=None,
    configuration=None,
    clusters=None,
    libraries=None,
    trigger=None,
    filters=None,
    allow_duplicate_names=None,
    headers=None,
)

DatabricksAPI.delta_pipelines.delete(
    pipeline_id=None,
    headers=None,
)

DatabricksAPI.delta_pipelines.deploy(
    pipeline_id=None,
    id=None,
    name=None,
    storage=None,
    configuration=None,
    clusters=None,
    libraries=None,
    trigger=None,
    filters=None,
    allow_duplicate_names=None,
    headers=None,
)

DatabricksAPI.delta_pipelines.get(
    pipeline_id=None,
    headers=None,
)

DatabricksAPI.delta_pipelines.reset(
    pipeline_id=None,
    headers=None,
)

DatabricksAPI.delta_pipelines.run(
    pipeline_id=None,
    headers=None,
)

DatabricksAPI.delta_pipelines.stop(
    pipeline_id=None,
    headers=None,
)
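
A minimal sketch of driving an existing pipeline; the pipeline ID is a placeholder:

pipeline_id = "1234-abcd-5678"

# Inspect, trigger, and stop the pipeline by ID
print(db.delta_pipelines.get(pipeline_id=pipeline_id))
db.delta_pipelines.run(pipeline_id=pipeline_id)
db.delta_pipelines.stop(pipeline_id=pipeline_id)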

