
python-bvk

Water consumption scraper for BVK (Brněnské vodárny a kanalizace, bvk.cz)

Note that you need to have the "smart" water meter installed. If you don't know what that is, you probably don't have one. In that case you have to ask BVK to install it for you, and you may have to wait a long time: they are rolling the meters out gradually.

Install

pip install python-bvk

Usage

To create the client object you need to provide your BVK username and password (the ones you use on the customer portal https://zis.bvk.cz/).

from bvk import Bvk
from dateutil import parser

# Create client
bvk = Bvk('username', 'password')

Use the getWaterConsumption() method to get the water consumption data. It accepts a date_from and optionally a date_to, both of which must be datetime.date objects. If date_to is not specified, the method returns data up to today.

Examples:

# Get water consumption data from the specified date to now
date_from = parser.parse('2020-08-01').date()
deferred_data = bvk.getWaterConsumption(date_from)

# Get water consumption data for a date interval
date_from = parser.parse('2020-08-01').date()
date_to = parser.parse('2020-08-11').date()
deferred_data = bvk.getWaterConsumption(date_from, date_to)

# Get water consumption data for a specific date (just 1 day)
date = parser.parse('2020-08-01').date()
deferred_data = bvk.getWaterConsumption(date, date)

You may call getWaterConsumption multiple times with different parameters. It returns a twisted.internet.defer.Deferred object that can be used to retrieve the consumption data later via a callback you provide.

def process_consumption(consumption):
    print(consumption)

deferred_data.addCallback(process_consumption)

If you have multiple Deferreds from multiple calls to getWaterConsumption, you can use Bvk.join() to get a Deferred that resolves once all crawlers have finished. For example:
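The sketch below schedules two crawls and attaches the same callback to both; it only uses the calls shown above and assumes join() is called after the crawls have been scheduled.

# Schedule two crawls, each returning its own Deferred
d1 = bvk.getWaterConsumption(parser.parse('2020-08-01').date(),
                             parser.parse('2020-08-11').date())
d2 = bvk.getWaterConsumption(parser.parse('2020-09-01').date())
d1.addCallback(process_consumption)
d2.addCallback(process_consumption)

# Deferred that fires once both crawlers have finished
d_all = bvk.join()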

The last callback should stop the reactor so it shuts down cleanly. The reactor should only be stopped after all crawlers are done, which is where the join() method comes in handy. Note that the reactor cannot be restarted, so make sure this is the last thing you do:

from twisted.internet import reactor

d = bvk.join()
d.addBoth(lambda _: reactor.stop())

The last thing you need to do is run the reactor. The script will block until the crawling is finished and all configured callbacks have executed.

reactor.run(installSignalHandlers=False)
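Putting it all together, a complete script might look like the following sketch. It only uses the calls shown above; the credentials and date range are placeholders.

from dateutil import parser
from twisted.internet import reactor

from bvk import Bvk


def process_consumption(consumption):
    print(consumption)


# Create the client with your customer portal credentials
bvk = Bvk('username', 'password')

# Schedule a crawl for a date interval and register the callback
date_from = parser.parse('2020-08-01').date()
date_to = parser.parse('2020-08-11').date()
d = bvk.getWaterConsumption(date_from, date_to)
d.addCallback(process_consumption)

# Stop the reactor once all crawlers have finished
bvk.join().addBoth(lambda _: reactor.stop())

# Blocks until crawling is done and all callbacks have run
reactor.run(installSignalHandlers=False)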

Keep in mind that the library uses Scrapy internally, which means it scrapes the BVK customer portal to get the data. If BVK suspects you are abusing the website, they may block your IP address and/or account.

License

See LICENSE.
