
Provider for Apache Airflow. Implements the apache-airflow-providers-apache-beam package.

Project description

Package apache-airflow-providers-apache-beam

Release: 4.1.1

Apache Beam.

Provider package

This is a provider package for the apache.beam provider. All classes for this provider package are in the airflow.providers.apache.beam Python package.

You can find package information and changelog for the provider in the documentation.

Installation

You can install this package on top of an existing Airflow 2 installation (see Requirements below for the minimum Airflow version supported):

pip install apache-airflow-providers-apache-beam

The package supports the following Python versions: 3.7, 3.8, 3.9, 3.10

Requirements

PIP package      Version required
---------------  ----------------
apache-airflow   >=2.3.0
apache-beam      >=2.33.0

Cross provider package dependencies

These are dependencies that may be needed in order to use all the features of the package. You need to install the specified provider packages in order to use them.

You can install such cross-provider dependencies when installing from PyPI. For example:

pip install apache-airflow-providers-apache-beam[google]

Dependent package                Extra
-------------------------------  ------
apache-airflow-providers-google  google

Changelog

4.1.1

Bug Fixes

  • Ensure Beam Go file downloaded from GCS still exists when referenced (#28664)

4.1.0

This release of the provider is only available for Airflow 2.3+ as explained in the Apache Airflow providers support policy.

Misc

  • Move min airflow version to 2.3.0 for all providers (#27196)

Features

  • Add backward compatibility with old versions of Apache Beam (#27263)

4.0.0

Breaking changes

Features

  • Added missing project_id to the wait_for_job (#24020)

  • Support impersonation service account parameter for Dataflow runner (#23961)

Misc

  • chore: Refactoring and Cleaning Apache Providers (#24219)

3.4.0

Features

  • Support serviceAccount attr for Dataflow in Apache Beam

3.3.0

Features

  • Add recipe for BeamRunGoPipelineOperator (#22296)

Bug Fixes

  • Fix mistakenly added install_requires for all providers (#22382)

3.2.1

Misc

  • Add Trove classifiers in PyPI (Framework :: Apache Airflow :: Provider)

3.2.0

Features

  • Add support for BeamGoPipelineOperator (#20386)

Misc

  • Support for Python 3.10

3.1.0

Features

  • Use google cloud credentials when executing beam command in subprocess (#18992)

3.0.1

Misc

  • Optimise connection importing for Airflow 2.2.0

3.0.0

Breaking changes

  • Auto-apply apply_default decorator (#15667)

2.0.0

Breaking changes

Integration with the google provider

In version 2.0.0 of the provider we’ve changed the way of integrating with the google provider. The previous versions of both providers caused conflicts when trying to install them together using PIP > 20.2.4. The conflict is not detected by PIP 20.2.4 and below, but it was there, and the versions of the Google BigQuery Python client did not match on both sides. As a result, when both the apache.beam and google providers were installed, some features of the BigQuery operators might not have worked properly. This was caused by apache-beam not yet supporting the new Google Python clients when the apache-beam[gcp] extra was used. The apache-beam[gcp] extra is used by the Dataflow operators, and while they might work with the newer version of the Google BigQuery Python client, it is not guaranteed.

This version introduces an additional extra requirement for the apache.beam extra of the google provider and, symmetrically, an additional requirement for the google extra of the apache.beam provider. Neither the google nor the apache.beam provider uses those extras by default, but you can specify them when installing the providers. The consequence is that some functionality of the Dataflow operators might not be available.

Unfortunately, the only complete solution to the problem is for apache.beam to migrate to the new (>=2.0.0) Google Python clients.
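The conflict described above can be sketched with hypothetical version bounds. The exact client versions below are illustrative, not taken from either package's metadata; the point is that one side pinned the old (<2.0.0) Google BigQuery client while the other required the new (>=2.0.0) one, so no single version satisfies both:

```python
def satisfies(version, lower=None, upper=None):
    """Check a (major, minor, patch) tuple against an optional
    inclusive lower bound and exclusive upper bound."""
    if lower is not None and version < lower:
        return False
    if upper is not None and version >= upper:
        return False
    return True

def beam_gcp_ok(v):
    # illustrative constraint from apache-beam[gcp]: client < 2.0.0
    return satisfies(v, upper=(2, 0, 0))

def google_provider_ok(v):
    # illustrative constraint from the google provider: client >= 2.0.0
    return satisfies(v, lower=(2, 0, 0))

# Two hypothetical candidate client versions, one on each side of 2.0.0.
candidates = [(1, 28, 0), (2, 34, 4)]
resolvable = [v for v in candidates if beam_gcp_ok(v) and google_provider_ok(v)]
print(resolvable)  # → [] : no version satisfies both constraints
```

Because the two specifier ranges are disjoint, a resolver such as PIP > 20.2.4 correctly reports the set of mutually installable versions as empty, while older PIP versions silently installed a mismatched pair.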

This is the extra for the google provider:

extras_require = {
    # ...
    "apache.beam": ["apache-airflow-providers-apache-beam", "apache-beam[gcp]"],
    # ...
}

And likewise this is the extra for the apache.beam provider:

extras_require = {"google": ["apache-airflow-providers-google", "apache-beam[gcp]"]}

You can still run this with PIP version <= 20.2.4 and go back to the previous behaviour:

pip install apache-airflow-providers-google[apache.beam]

or

pip install apache-airflow-providers-apache-beam[google]

But be aware that some BigQuery operator functionality might not be available in this case.

1.0.1

Bug Fixes

  • Improve Apache Beam operators - refactor operator - common Dataflow logic (#14094)

  • Corrections in docs and tools after releasing provider RCs (#14082)

  • Remove WARNINGs from BeamHook (#14554)

1.0.0

Initial version of the provider.


Download files


Source Distribution

apache-airflow-providers-apache-beam-4.1.1.tar.gz (23.2 kB)

Built Distribution

apache_airflow_providers_apache_beam-4.1.1-py3-none-any.whl

File details

Details for the file apache-airflow-providers-apache-beam-4.1.1.tar.gz.

File hashes

Hashes for apache-airflow-providers-apache-beam-4.1.1.tar.gz

Algorithm    Hash digest
-----------  ----------------------------------------------------------------
SHA256       04b8a20f755f7be441d0a986a467295b96b2bfc22681334f4406dd84208b9a8e
MD5          9d3dbdb2b1cb11ee050f7a9ef5224029
BLAKE2b-256  6fce17a0ba04a72ffef0147d87156ad1e5fabe4888ca5031f53306b067af3012

File details

Details for the file apache_airflow_providers_apache_beam-4.1.1-py3-none-any.whl.

File hashes

Hashes for apache_airflow_providers_apache_beam-4.1.1-py3-none-any.whl

Algorithm    Hash digest
-----------  ----------------------------------------------------------------
SHA256       80109c191f13572befaa25de8714087b4e6ff3055a3bab2b07b67f3859dc688d
MD5          fefb41885a21f6541d052efe11618ce2
BLAKE2b-256  e76a9457b3c89270aa6cf7fb2913b566137b6730cbca1534a8f6f5c713a64e6d
