
HDX Data Freshness

Project description


The implementation of HDX freshness in Python reads all the datasets from the Humanitarian Data Exchange website (using the HDX Python library) and then iterates through them one by one, performing the following sequence of steps:

  1. It gets the dataset’s update frequency if it has one. If that update frequency is Never, then the dataset is always fresh.

  2. If not, it checks whether the dataset and resource metadata have changed; this qualifies as an update from a freshness perspective. It compares the difference between the current time and the update time against the update frequency and sets a status: fresh, due, overdue or delinquent.

  3. If the dataset is not fresh based on metadata, then the urls of the resources are examined. If they are internal urls (data.humdata.org - the HDX filestore, manage.hdx.rwlabs.org - CPS), then no further checking can be done, because when the files pointed to by these urls are updated, the HDX metadata is updated as well.

  4. If they are urls with an adhoc update frequency (proxy.hxlstandard.org, ourairports.com), then freshness cannot be determined. Currently, there is no mechanism in HDX to specify adhoc update frequencies, but there is a proposal to add this to the update frequency options. At the moment, the freshness value for adhoc datasets is based on whatever has been set as the update frequency, but these datasets can easily be identified and excluded from results if needed.

  5. If the url is externally hosted and not adhoc, then we can make an HTTP GET request for the file and check the returned header for the Last-Modified field. If that field exists, we read the date and time from it and check whether it is more recent than the dataset or resource metadata modification date. If it is, we recalculate freshness.

  6. If the resource is not fresh by this measure, then we download the file and calculate an MD5 hash of it. Previous hash values are stored in our database, so we can check whether the hash has changed since the last time it was computed.

  7. There are some resources whose hash changes constantly because they connect to an API which generates a file on the fly. To identify these, we hash the file again and check whether the hash changes in the few seconds since the previous hash calculation.
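Steps 1 and 2 reduce to comparing the dataset's age against its update frequency. A minimal sketch, with illustrative thresholds (the function name and the 2x/3x status boundaries are assumptions for illustration, not the project's actual configuration):

```python
from datetime import datetime, timedelta

def calculate_status(update_frequency_days, last_update, now):
    """Classify freshness by comparing the time since the last update
    with the dataset's update frequency. The 2x/3x multipliers are
    assumed here for illustration, not the project's actual values."""
    if update_frequency_days is None:  # step 1: update frequency "Never"
        return "fresh"                 # -> always fresh
    age = now - last_update
    due = timedelta(days=update_frequency_days)
    if age < due:
        return "fresh"
    if age < 2 * due:
        return "due"
    if age < 3 * due:
        return "overdue"
    return "delinquent"
```

For example, a weekly dataset last updated ten days ago would come out as "due" under these assumed thresholds.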
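The Last-Modified comparison in step 5 comes down to parsing the HTTP date in the response header and checking it against the stored metadata date. A sketch with a hypothetical helper name (in the real code the header would be read from the GET response for the resource url):

```python
from datetime import datetime
from email.utils import parsedate_to_datetime

def header_is_newer(last_modified_header, metadata_date):
    """Return True if the Last-Modified header (an HTTP date string)
    is more recent than the metadata modification date, in which case
    freshness would be recalculated from it."""
    if not last_modified_header:
        return False  # no Last-Modified field: fall through to hashing
    last_modified = parsedate_to_datetime(last_modified_header)
    # Compare as naive UTC; HTTP dates are always expressed in GMT.
    return last_modified.replace(tzinfo=None) > metadata_date
```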
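Steps 6 and 7 can be sketched as follows; the function names are illustrative, and the two byte strings stand in for two back-to-back downloads of the same resource:

```python
import hashlib

def md5_hash(data):
    """MD5 digest of a downloaded file's contents."""
    return hashlib.md5(data).hexdigest()

def check_hash(previous_hash, first_download, second_download):
    """Compare a fresh hash with the stored one (step 6), then hash a
    second download to spot resources generated on the fly by an API,
    whose hash changes on every request (step 7)."""
    first = md5_hash(first_download)
    if first == previous_hash:
        return "unchanged"
    if md5_hash(second_download) != first:
        return "api"  # content differs between back-to-back downloads
    return "updated"  # a genuine change since the previous run
```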

Since there can be temporary connection and download issues with urls, the code retries failed requests several times with increasing delays. Also, as there are many requests to be made, rather than performing them one by one, they are executed concurrently using the asynchronous functionality added in recent versions of Python.
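A minimal sketch of this retry and concurrency pattern using asyncio (the helper names and the linear backoff schedule are assumptions; the project's actual retry logic may differ):

```python
import asyncio

async def fetch_with_retries(fetch, url, retries=3, base_delay=1.0):
    """Retry a flaky request coroutine, sleeping for an increasing
    delay between attempts."""
    for attempt in range(retries):
        try:
            return await fetch(url)
        except Exception:
            if attempt == retries - 1:
                raise  # give up after the final attempt
            await asyncio.sleep(base_delay * (attempt + 1))

async def check_all(fetch, urls, **kwargs):
    """Issue all requests concurrently rather than one by one."""
    return await asyncio.gather(
        *(fetch_with_retries(fetch, url, **kwargs) for url in urls))
```

With this shape, a url that fails once with a transient connection error is retried and still yields a result, while the whole batch of urls is in flight at the same time.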

Usage

python run.py

Project details


Release history

This version: 0.7

Download files

Download the file for your platform.

Source Distribution

hdx-data-freshness-0.7.tar.gz (14.9 kB)


Built Distribution


hdx_data_freshness-0.7-py2.py3-none-any.whl (17.6 kB)


File details


Hashes for hdx-data-freshness-0.7.tar.gz
Algorithm Hash digest
SHA256 e678698e6174628e5ceb2fd467c84f0b01978e5fa0e75dcaa31ab5c40527019d
MD5 d5716e64fba5245757c1158587e6c32a
BLAKE2b-256 ad7aa962cf4b0b626099d7772c8a876f7b4ad4d2d32df9ee5377d7080e9b0cf3


File details


Hashes for hdx_data_freshness-0.7-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 c084bd4802c9358522d1b0e4e2050f89be84ebca6feecc5df57b901be99875d8
MD5 eff36c887d63c4f566e7c4646023eb03
BLAKE2b-256 5daa46005ca0f5d2e434328bd1769a465f7922d23806d9d6154eba353374c12f

