Databricks API client auto-generated from the official databricks-cli package
Project description
[This documentation is auto-generated]
This package provides a simplified interface to the Databricks REST API. The interface is auto-generated on instantiation using the underlying client library from the official databricks-cli Python package.
Install using pip:
pip install databricks-api
The docs here describe the interface for version 0.8.7 of the databricks-cli package for API version 2.0. Assuming there are no major changes to the databricks-cli package structure, this package should continue to work without requiring an update.
The databricks-api package contains a DatabricksAPI class which provides instance attributes for the databricks-cli ApiClient, as well as each of the available service instances. The attributes of a DatabricksAPI instance are:
DatabricksAPI.client <databricks_cli.sdk.api_client.ApiClient>
DatabricksAPI.jobs <databricks_cli.sdk.service.JobsService>
DatabricksAPI.cluster <databricks_cli.sdk.service.ClusterService>
DatabricksAPI.managed_library <databricks_cli.sdk.service.ManagedLibraryService>
DatabricksAPI.dbfs <databricks_cli.sdk.service.DbfsService>
DatabricksAPI.workspace <databricks_cli.sdk.service.WorkspaceService>
DatabricksAPI.secret <databricks_cli.sdk.service.SecretService>
DatabricksAPI.groups <databricks_cli.sdk.service.GroupsService>
To instantiate the client, provide the Databricks host and either a token or a user and password. The full signature of the underlying ApiClient.__init__ is also shown below.
```python
from databricks_api import DatabricksAPI

# Provide a host and token
db = DatabricksAPI(
    host="example.cloud.databricks.com",
    token="dpapi123..."
)

# OR a host and user and password
db = DatabricksAPI(
    host="example.cloud.databricks.com",
    user="me@example.com",
    password="password"
)

# Full __init__ signature
db = DatabricksAPI(
    user=None,
    password=None,
    host=None,
    token=None,
    apiVersion=2.0,
    default_headers={},
    verify=True,
    command_name=''
)
```
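Once instantiated, the service methods can be called directly on the client. As a minimal sketch (the host and token below are placeholders), listing the clusters and jobs in a workspace looks like this:

```python
from databricks_api import DatabricksAPI

# Placeholder host and personal access token
db = DatabricksAPI(
    host="example.cloud.databricks.com",
    token="<personal-access-token>"
)

# Each service method returns the parsed JSON response from the REST API
clusters = db.cluster.list_clusters()
jobs = db.jobs.list_jobs()
```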
Refer to the official Databricks API documentation for the functionality and required arguments of each method listed below.
Each of the service instance attributes provides the following public methods:
DatabricksAPI.jobs
DatabricksAPI.jobs.cancel_run(
run_id,
headers=None,
)
DatabricksAPI.jobs.create_job(
name=None,
existing_cluster_id=None,
new_cluster=None,
libraries=None,
email_notifications=None,
timeout_seconds=None,
max_retries=None,
min_retry_interval_millis=None,
retry_on_timeout=None,
schedule=None,
notebook_task=None,
spark_jar_task=None,
spark_python_task=None,
spark_submit_task=None,
max_concurrent_runs=None,
headers=None,
)
DatabricksAPI.jobs.delete_job(
job_id,
headers=None,
)
DatabricksAPI.jobs.delete_run(
run_id=None,
headers=None,
)
DatabricksAPI.jobs.export_run(
run_id,
views_to_export=None,
headers=None,
)
DatabricksAPI.jobs.get_job(
job_id,
headers=None,
)
DatabricksAPI.jobs.get_run(
run_id=None,
headers=None,
)
DatabricksAPI.jobs.get_run_output(
run_id,
headers=None,
)
DatabricksAPI.jobs.list_jobs(headers=None)
DatabricksAPI.jobs.list_runs(
job_id=None,
active_only=None,
completed_only=None,
offset=None,
limit=None,
headers=None,
)
DatabricksAPI.jobs.reset_job(
job_id,
new_settings,
headers=None,
)
DatabricksAPI.jobs.run_now(
job_id=None,
jar_params=None,
notebook_params=None,
python_params=None,
spark_submit_params=None,
headers=None,
)
DatabricksAPI.jobs.submit_run(
run_name=None,
existing_cluster_id=None,
new_cluster=None,
libraries=None,
notebook_task=None,
spark_jar_task=None,
spark_python_task=None,
spark_submit_task=None,
timeout_seconds=None,
headers=None,
)
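For example, a minimal sketch of triggering and submitting runs with the jobs service; the job ID, cluster ID, and notebook path below are hypothetical:

```python
from databricks_api import DatabricksAPI

db = DatabricksAPI(host="example.cloud.databricks.com", token="<personal-access-token>")

# Trigger an existing job, passing notebook widget parameters
run = db.jobs.run_now(
    job_id=42,  # hypothetical job ID
    notebook_params={"env": "dev"},
)

# Or submit a one-time run of a notebook on an existing cluster
db.jobs.submit_run(
    run_name="ad-hoc run",
    existing_cluster_id="0123-456789-abcde",  # hypothetical cluster ID
    notebook_task={"notebook_path": "/Users/me@example.com/MyNotebook"},
)

# Poll the state of the triggered run
state = db.jobs.get_run(run_id=run["run_id"])
```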
DatabricksAPI.cluster
DatabricksAPI.cluster.create_cluster(
num_workers=None,
autoscale=None,
cluster_name=None,
spark_version=None,
spark_conf=None,
aws_attributes=None,
node_type_id=None,
driver_node_type_id=None,
ssh_public_keys=None,
custom_tags=None,
cluster_log_conf=None,
spark_env_vars=None,
autotermination_minutes=None,
enable_elastic_disk=None,
cluster_source=None,
instance_pool_id=None,
headers=None,
)
DatabricksAPI.cluster.delete_cluster(
cluster_id,
headers=None,
)
DatabricksAPI.cluster.edit_cluster(
cluster_id,
num_workers=None,
autoscale=None,
cluster_name=None,
spark_version=None,
spark_conf=None,
aws_attributes=None,
node_type_id=None,
driver_node_type_id=None,
ssh_public_keys=None,
custom_tags=None,
cluster_log_conf=None,
spark_env_vars=None,
autotermination_minutes=None,
enable_elastic_disk=None,
cluster_source=None,
instance_pool_id=None,
headers=None,
)
DatabricksAPI.cluster.get_cluster(
cluster_id,
headers=None,
)
DatabricksAPI.cluster.get_events(
cluster_id,
start_time=None,
end_time=None,
order=None,
event_types=None,
offset=None,
limit=None,
headers=None,
)
DatabricksAPI.cluster.list_available_zones(headers=None)
DatabricksAPI.cluster.list_clusters(headers=None)
DatabricksAPI.cluster.list_node_types(headers=None)
DatabricksAPI.cluster.list_spark_versions(headers=None)
DatabricksAPI.cluster.permanent_delete_cluster(
cluster_id,
headers=None,
)
DatabricksAPI.cluster.pin_cluster(
cluster_id,
headers=None,
)
DatabricksAPI.cluster.resize_cluster(
cluster_id,
num_workers=None,
autoscale=None,
headers=None,
)
DatabricksAPI.cluster.restart_cluster(
cluster_id,
headers=None,
)
DatabricksAPI.cluster.start_cluster(
cluster_id,
headers=None,
)
DatabricksAPI.cluster.unpin_cluster(
cluster_id,
headers=None,
)
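A minimal sketch of the cluster service; the spark_version and node_type_id values are hypothetical and depend on the workspace and cloud provider:

```python
from databricks_api import DatabricksAPI

db = DatabricksAPI(host="example.cloud.databricks.com", token="<personal-access-token>")

# Create a small autoterminating cluster
cluster = db.cluster.create_cluster(
    num_workers=2,
    cluster_name="example-cluster",
    spark_version="5.5.x-scala2.11",  # hypothetical runtime version
    node_type_id="i3.xlarge",         # hypothetical node type
    autotermination_minutes=30,
)
cluster_id = cluster["cluster_id"]

# Inspect the cluster, then terminate it when finished
info = db.cluster.get_cluster(cluster_id)
db.cluster.delete_cluster(cluster_id)  # terminates; permanent_delete_cluster removes it entirely
```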
DatabricksAPI.managed_library
DatabricksAPI.managed_library.all_cluster_statuses(headers=None)
DatabricksAPI.managed_library.cluster_status(
cluster_id,
headers=None,
)
DatabricksAPI.managed_library.install_libraries(
cluster_id,
libraries=None,
headers=None,
)
DatabricksAPI.managed_library.uninstall_libraries(
cluster_id,
libraries=None,
headers=None,
)
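A sketch of installing libraries on a running cluster; the cluster ID is hypothetical, and the library specifications follow the Databricks Libraries API format:

```python
from databricks_api import DatabricksAPI

db = DatabricksAPI(host="example.cloud.databricks.com", token="<personal-access-token>")

cluster_id = "0123-456789-abcde"  # hypothetical cluster ID

# Install a PyPI package and a Maven coordinate on the cluster
db.managed_library.install_libraries(
    cluster_id,
    libraries=[
        {"pypi": {"package": "simplejson"}},
        {"maven": {"coordinates": "org.jsoup:jsoup:1.7.2"}},
    ],
)

# Check installation status for the cluster
status = db.managed_library.cluster_status(cluster_id)
```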
DatabricksAPI.dbfs
DatabricksAPI.dbfs.add_block(
handle,
data,
headers=None,
)
DatabricksAPI.dbfs.close(
handle,
headers=None,
)
DatabricksAPI.dbfs.create(
path,
overwrite=None,
headers=None,
)
DatabricksAPI.dbfs.delete(
path,
recursive=None,
headers=None,
)
DatabricksAPI.dbfs.get_status(
path,
headers=None,
)
DatabricksAPI.dbfs.list(
path,
headers=None,
)
DatabricksAPI.dbfs.mkdirs(
path,
headers=None,
)
DatabricksAPI.dbfs.move(
source_path,
destination_path,
headers=None,
)
DatabricksAPI.dbfs.put(
path,
contents=None,
overwrite=None,
headers=None,
)
DatabricksAPI.dbfs.read(
path,
offset=None,
length=None,
headers=None,
)
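A sketch of writing and reading a small file with the dbfs service; the DBFS path is hypothetical, and the DBFS REST API expects file contents as base64-encoded strings:

```python
import base64

from databricks_api import DatabricksAPI

db = DatabricksAPI(host="example.cloud.databricks.com", token="<personal-access-token>")

# Write a small file; contents must be base64-encoded
payload = base64.b64encode(b"hello from databricks-api\n").decode("utf-8")
db.dbfs.put(
    "/tmp/example.txt",  # hypothetical DBFS path
    contents=payload,
    overwrite=True,
)

# Read it back; the response carries base64-encoded data
resp = db.dbfs.read("/tmp/example.txt", offset=0, length=1024)
data = base64.b64decode(resp["data"])
```

For files too large for a single put request, the create, add_block, and close methods can be used to stream the upload in blocks.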
DatabricksAPI.workspace
DatabricksAPI.workspace.delete(
path,
recursive=None,
headers=None,
)
DatabricksAPI.workspace.export_workspace(
path,
format=None,
direct_download=None,
headers=None,
)
DatabricksAPI.workspace.get_status(
path,
headers=None,
)
DatabricksAPI.workspace.import_workspace(
path,
format=None,
language=None,
content=None,
overwrite=None,
headers=None,
)
DatabricksAPI.workspace.list(
path,
headers=None,
)
DatabricksAPI.workspace.mkdirs(
path,
headers=None,
)
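A sketch of importing and exporting a notebook with the workspace service; the workspace path is hypothetical, and the Workspace REST API expects notebook content as a base64-encoded string:

```python
import base64

from databricks_api import DatabricksAPI

db = DatabricksAPI(host="example.cloud.databricks.com", token="<personal-access-token>")

# Import a Python notebook from source code
source = b"# Databricks notebook source\nprint('hello')\n"
db.workspace.import_workspace(
    "/Users/me@example.com/Example",  # hypothetical workspace path
    format="SOURCE",
    language="PYTHON",
    content=base64.b64encode(source).decode("utf-8"),
    overwrite=True,
)

# List a directory and export the notebook back out
objects = db.workspace.list("/Users/me@example.com")
exported = db.workspace.export_workspace("/Users/me@example.com/Example", format="SOURCE")
```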
DatabricksAPI.secret
DatabricksAPI.secret.create_scope(
scope,
initial_manage_principal=None,
scope_backend_type=None,
headers=None,
)
DatabricksAPI.secret.delete_acl(
scope,
principal,
headers=None,
)
DatabricksAPI.secret.delete_scope(
scope,
headers=None,
)
DatabricksAPI.secret.delete_secret(
scope,
key,
headers=None,
)
DatabricksAPI.secret.get_acl(
scope,
principal,
headers=None,
)
DatabricksAPI.secret.list_acls(
scope,
headers=None,
)
DatabricksAPI.secret.list_scopes(headers=None)
DatabricksAPI.secret.list_secrets(
scope,
headers=None,
)
DatabricksAPI.secret.put_acl(
scope,
principal,
permission,
headers=None,
)
DatabricksAPI.secret.put_secret(
scope,
key,
string_value=None,
bytes_value=None,
headers=None,
)
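A sketch of the secret service; the scope and key names are hypothetical:

```python
from databricks_api import DatabricksAPI

db = DatabricksAPI(host="example.cloud.databricks.com", token="<personal-access-token>")

# Create a scope manageable by all users, then store a secret in it
db.secret.create_scope("example-scope", initial_manage_principal="users")
db.secret.put_secret(
    "example-scope",
    "db-password",
    string_value="s3cr3t",
)

# Secret keys can be listed, but their values cannot be read back through the API
keys = db.secret.list_secrets("example-scope")
```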
DatabricksAPI.groups
DatabricksAPI.groups.add_to_group(
parent_name,
user_name=None,
group_name=None,
headers=None,
)
DatabricksAPI.groups.create_group(
group_name,
headers=None,
)
DatabricksAPI.groups.get_group_members(
group_name,
headers=None,
)
DatabricksAPI.groups.get_groups(headers=None)
DatabricksAPI.groups.get_groups_for_principal(
user_name=None,
group_name=None,
headers=None,
)
DatabricksAPI.groups.remove_from_group(
parent_name,
user_name=None,
group_name=None,
headers=None,
)
DatabricksAPI.groups.remove_group(
group_name,
headers=None,
)
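A sketch of the groups service; the group and user names are hypothetical:

```python
from databricks_api import DatabricksAPI

db = DatabricksAPI(host="example.cloud.databricks.com", token="<personal-access-token>")

# Create a group, then add a user and another group to it
db.groups.create_group("data-engineers")
db.groups.add_to_group("data-engineers", user_name="me@example.com")
db.groups.add_to_group("data-engineers", group_name="admins")

# List members and look up a user's group memberships
members = db.groups.get_group_members("data-engineers")
memberships = db.groups.get_groups_for_principal(user_name="me@example.com")
```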
File details
Details for the file databricks_api-0.2.0.tar.gz.
File metadata
- Download URL: databricks_api-0.2.0.tar.gz
- Upload date:
- Size: 6.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.33.0 CPython/3.7.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6c4397cf790129947df1654d3eb2d96c2560defa0de1ff27bf8ce1fc353ec49b |
| MD5 | 5f9d1a95043eb32e15c417e720ca393e |
| BLAKE2b-256 | b901b72afd655bb93bef4a48605624d5b1605340ddbb500396c4e65f8d5f5a9f |
File details
Details for the file databricks_api-0.2.0-py2.py3-none-any.whl.
File metadata
- Download URL: databricks_api-0.2.0-py2.py3-none-any.whl
- Upload date:
- Size: 5.5 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.33.0 CPython/3.7.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 54c10540c0c056447791a7b473c6285eaa33541d1b11e2435cebc28810a94d7e |
| MD5 | b56a24549841bea146b7560cfbc872b9 |
| BLAKE2b-256 | 77683775cd41db067d77710a9877e9fb61c82217e09d20f9408fb8c31da045b8 |