A decorator that allows users to run SQL queries natively in Airflow.

Project description

astro

workflows made easy

Astro SDK Python allows rapid and clean development of {Extract, Load, Transform} workflows using Python. It helps DAG authors to achieve more with less code. It is powered by Apache Airflow and maintained by Astronomer.

:warning: Disclaimer: This project is in a preview release state; in other words, it is not yet production-ready and the interfaces may change. We welcome users to try out the interfaces and provide us with feedback.

Install

Astro SDK Python is available on PyPI and can be installed with the standard Python installation tools.

To install a cloud-agnostic version of Astro SDK Python, run:

pip install astro-sdk-python

If you are using cloud providers, install the optional dependencies of interest:

pip install astro-sdk-python[amazon,google,snowflake,postgres]

Quick-start

After installing Astro SDK Python, save the following example DAG as calculate_popular_movies.py in a local directory named dags:

from datetime import datetime
from airflow import DAG
from astro import sql as aql
from astro.files import File
from astro.sql.table import Table

@aql.transform()
def top_five_animations(input_table: Table):
    return """
        SELECT Title, Rating
        FROM {{input_table}}
        WHERE Genre1='Animation'
        ORDER BY Rating DESC
        LIMIT 5;
    """

with DAG(
    "calculate_popular_movies",
    schedule_interval=None,
    start_date=datetime(2000, 1, 1),
    catchup=False,
) as dag:
    imdb_movies = aql.load_file(
        File("https://raw.githubusercontent.com/astronomer/astro-sdk/main/tests/data/imdb.csv"),
        output_table=Table(
            name="imdb_movies", conn_id="sqlite_default"
        ),
    )
    top_five_animations(
        input_table=imdb_movies,
        output_table=Table(
            name="top_animation"
        ),
    )

Set up a local instance of Airflow by running:

export AIRFLOW_HOME=`pwd`
export AIRFLOW__CORE__ENABLE_XCOM_PICKLING=True
airflow db init

Create the SQLite database that the example uses, then run the DAG:

# The sqlite_default connection has a different host on macOS vs. Linux
export SQL_TABLE_NAME=`airflow connections get sqlite_default -o yaml | grep host | awk '{print $2}'`
sqlite3 "$SQL_TABLE_NAME" "VACUUM;"
airflow dags test calculate_popular_movies `date -Iseconds`

Check the top five animations calculated by your first Astro DAG by running:

sqlite3 "$SQL_TABLE_NAME" "select * from top_animation;" ".exit"

You should see the following output:

$ sqlite3 "$SQL_TABLE_NAME" "select * from top_animation;" ".exit"
Toy Story 3 (2010)|8.3
Inside Out (2015)|8.2
How to Train Your Dragon (2010)|8.1
Zootopia (2016)|8.1
How to Train Your Dragon 2 (2014)|7.9

Requirements

Astro SDK Python depends on Apache Airflow >= 2.1.0.

Supported technologies

Databases:

  • Google BigQuery
  • Postgres
  • Snowflake
  • SQLite

File types:

  • CSV
  • JSON
  • NDJSON
  • Parquet

File stores:

  • Amazon S3
  • Filesystem
  • Google GCS
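
Any of the stores and databases above can be targeted with the same load_file pattern shown in the quick-start by swapping the File location and the Table connection. The snippet below is only a sketch: the connection IDs and bucket are hypothetical placeholders that must exist in your Airflow instance, and the conn_id argument on File is an assumption to be verified against the reference guide.

from datetime import datetime
from airflow import DAG
from astro import sql as aql
from astro.files import File
from astro.sql.table import Table

with DAG(
    "load_imdb_to_snowflake",
    schedule_interval=None,
    start_date=datetime(2000, 1, 1),
    catchup=False,
) as dag:
    # Hypothetical connection IDs ("aws_default", "snowflake_default") and bucket;
    # File(..., conn_id=...) is assumed from the reference guide.
    imdb_movies = aql.load_file(
        File("s3://my-bucket/imdb.csv", conn_id="aws_default"),
        output_table=Table(name="imdb_movies", conn_id="snowflake_default"),
    )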

Available operations

A summary of the currently available operations in Astro SDK Python is given below; more details are available in the reference guide, and a short sketch of how a few of them compose follows the list.

  • load_file: load a given file into a SQL table
  • transform: apply a SQL SELECT statement to a source table and save the result to a destination table
  • truncate: remove all records from a SQL table
  • run_raw_sql: run any SQL statement without handling its output
  • append: insert rows from the source SQL table into the destination SQL table, if there are no conflicts
  • merge: insert rows from the source SQL table into the destination SQL table, depending on conflicts:
    • ignore: do not add rows that already exist
    • update: replace existing rows with new ones
  • export_file: export SQL table rows into a destination file
  • dataframe: load a given SQL table into an in-memory pandas DataFrame
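
For a rough sense of how these operations compose, the sketch below extends the quick-start DAG with a dataframe step. This is illustrative only: the bare @aql.dataframe decorator usage is an assumption based on the description above, and the exact signature should be checked against the reference guide.

import pandas as pd

from astro import sql as aql


# Sketch only: @aql.dataframe is assumed to hand the decorated function the SQL
# table as an in-memory pandas DataFrame, as described in the list above.
@aql.dataframe
def summarize(animations: pd.DataFrame):
    # Runs as an ordinary Python task once the upstream table is materialized.
    print(animations.head())
    return animations


# Inside the quick-start DAG body, the transform result can be passed straight in:
#     top_animations = top_five_animations(
#         input_table=imdb_movies,
#         output_table=Table(name="top_animation"),
#     )
#     summarize(animations=top_animations)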

Documentation

The documentation is a work in progress; we aim to follow the Diátaxis system:

  • Tutorial: a hands-on introduction to Astro SDK Python
  • How-to guides: simple step-by-step user guides to accomplish specific tasks
  • Reference guide: commands, modules, classes and methods
  • Explanation: clarification and discussion of key decisions made when designing the project

Changelog

We follow Semantic Versioning for releases. Check the changelog for the latest changes.

Release Management

To learn more about our release philosophy and steps, check here.

Contribution Guidelines

All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.

Read the Contribution Guideline for a detailed overview on how to contribute.

As contributors and maintainers of this project, you are expected to abide by the Contributor Code of Conduct.

License

Apache License 2.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

astro-sdk-python-0.9.0.tar.gz (48.5 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

astro_sdk_python-0.9.0-py3-none-any.whl (70.3 kB)

Uploaded Python 3

File details

Details for the file astro-sdk-python-0.9.0.tar.gz.

File metadata

  • Download URL: astro-sdk-python-0.9.0.tar.gz
  • Upload date:
  • Size: 48.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.8.12

File hashes

Hashes for astro-sdk-python-0.9.0.tar.gz
Algorithm Hash digest
SHA256 e1691fa9fbeb3e44552c93859769498857dc83b95791dd8fe8df4578b5167781
MD5 e4e9003e5d5b588306e048c22cd7ea5e
BLAKE2b-256 71bdc4a756be6d87bb0847820d7f06af25227edcb90e1fa7c9f7a563ecaf8438

See more details on using hashes here.
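
As a convenience, the published digest can be checked against a local download using the standard library alone; the sketch below assumes the source distribution listed above has been downloaded into the current directory.

# Sketch: compare the published SHA256 digest against a downloaded sdist.
import hashlib

EXPECTED_SHA256 = "e1691fa9fbeb3e44552c93859769498857dc83b95791dd8fe8df4578b5167781"

with open("astro-sdk-python-0.9.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("hash matches" if digest == EXPECTED_SHA256 else f"hash mismatch: {digest}")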

File details

Details for the file astro_sdk_python-0.9.0-py3-none-any.whl.

File metadata

File hashes

Hashes for astro_sdk_python-0.9.0-py3-none-any.whl
Algorithm Hash digest
SHA256 676b7fa948f3382e39eeaa2b41b0271d769c595cd0cd7cb9339b7cd6d88060b1
MD5 bfcac3d1fc687fb9cf2021d683cddc3c
BLAKE2b-256 d9bde44206c0208b18c66c2a4bfeba64257a05086d2c7f205d5863258d1c2ad1

See more details on using hashes here.
