odc.apps.dc_tools
Command line utilities for working with a Datacube index
Installation
pip install odc-apps-dc-tools
Usage
dc-index-export-md
Metadata transformer
Simple usage:
TODO:
Extended usage:
TODO:
dc-index-from-tar
Index ODC metadata that is contained in a .tar file
Simple usage:
dc-index-from-tar 'path/to/file.tar'
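As a minimal sketch, an archive of metadata documents can be fetched and then indexed in one go (the URL and filename below are placeholders, not real endpoints):

```shell
# Fetch an archive of ODC metadata documents (placeholder URL)
wget https://example.com/metadata-docs.tar

# Index every metadata document contained in the archive
dc-index-from-tar 'metadata-docs.tar'
```

Note this assumes a configured Datacube database is reachable and the matching products have already been added to it.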
Extended usage:
TODO:
sqs-to-dc
A tool to index from an SQS queue
Simple usage:
sqs-to-dc example-queue-name 'product-name-a product-name-b'
Extended usage:
Usage: sqs-to-dc [OPTIONS] QUEUE_NAME PRODUCT
Iterate through messages on an SQS queue and add them to datacube
Options:
--skip-lineage Default is not to skip lineage. Set to skip
lineage altogether.
--fail-on-missing-lineage / --auto-add-lineage
Default is to fail if lineage documents not
present in the database. Set auto add to try
to index lineage documents.
--verify-lineage Default is no verification. Set to verify
parent dataset definitions.
--stac Expect STAC 1.0 metadata and attempt to
transform to ODC EO3 metadata
--odc-metadata-link TEXT Expect metadata doc with ODC EO3 metadata
link. Either provide '/' separated path to
find metadata link in a provided metadata
doc e.g. 'foo/bar/link', or if metadata doc
is STAC, provide 'rel' value of the 'links'
object having metadata link, e.g.
'STAC-LINKS-REL:odc_yaml'
--limit INTEGER Stop indexing after n datasets have been
indexed.
--update If set, update datasets instead of adding them
--update-if-exists If the dataset already exists, update it
instead of skipping it.
--archive If set, archive datasets
--allow-unsafe Allow unsafe changes to a dataset. Take
care!
--record-path TEXT Filtering option for s3 path, i.e.
'L2/sentinel-2-nrt/S2MSIARD/*/*/ARD-METADATA.yaml'
--region-code-list-uri TEXT A path to a list (one item per line, in txt
or gzip format) of valid region_codes to
include
--absolute Use absolute paths when converting from STAC
--help Show this message and exit.
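Combining a few of the options documented above, a sketch of indexing STAC messages from a queue might look like the following (the queue and product names are placeholders):

```shell
# Transform incoming STAC 1.0 messages to ODC EO3 metadata,
# update datasets that already exist, and stop after 100 datasets
sqs-to-dc --stac --update-if-exists --limit 100 \
    example-queue-name 'product-name-a'
```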
s3-to-dc
A tool for indexing from S3.
Simple usage:
s3-to-dc 's3://bucket/path/**/*.yaml' 'product-name-a product-name-b'
Extended usage:
The following command updates the datasets instead of adding them and allows unsafe changes. Be careful!
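A sketch of such a command, using the --update and --allow-unsafe flags documented below (the bucket path and product name are placeholders):

```shell
# Update already-indexed datasets in place, permitting unsafe changes
s3-to-dc --update --allow-unsafe 's3://bucket/path/**/*.yaml' 'product-name-a'
```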
Usage: s3-to-dc [OPTIONS] URI PRODUCT
Iterate through files in an S3 bucket and add them to datacube
Options:
--skip-lineage Default is not to skip lineage. Set to skip
lineage altogether.
--fail-on-missing-lineage / --auto-add-lineage
Default is to fail if lineage documents not
present in the database. Set auto add to try
to index lineage documents.
--verify-lineage Default is no verification. Set to verify
parent dataset definitions.
--stac Expect STAC 1.0 metadata and attempt to
transform to ODC EO3 metadata
--update If set, update datasets instead of adding them
--update-if-exists If the dataset already exists, update it
instead of skipping it.
--allow-unsafe Allow unsafe changes to a dataset. Take
care!
--skip-check Assume file exists when listing exact file
rather than wildcard.
--no-sign-request Do not sign AWS S3 requests
--request-payer Needed when accessing requester pays public
buckets
--help Show this message and exit.
thredds-to-dc
Index from a THREDDS server
Simple usage:
TODO:
Extended usage:
TODO:
esri-lc-to-dc
Index the global 10 m ESRI Land Cover data.
Simple usage:
Add the product and then index all the data:
esri-lc-to-dc --add-product
Extended usage:
Index all the data, add the product, set a limit, and update scenes that are already indexed:
esri-lc-to-dc --add-product --limit 1000 --update