Read GCS and local paths with the same interface, clone of tensorflow.io.gfile

Project description

blobfile

This is a standalone clone of TensorFlow's gfile, supporting both local paths and gs:// (Google Cloud Storage) paths.

The main function is BlobFile, a replacement for GFile. There are also a few additional functions: basename, dirname, and join, which mostly behave like their os.path namesakes, except that they also support gs:// paths.

Installation:

pip install blobfile

Usage:

import blobfile as bf

with bf.BlobFile("gs://my-bucket-name/cats", "wb") as w:
    w.write(b"meow!")

Here are the functions:

  • BlobFile - like open() but works with gs:// paths too, data is streamed to/from the remote file.
    • Reading is done without downloading the entire remote file.
    • Writing is done to the remote file directly, but only in chunks of a few MB in size. flush() will not cause an early write.
    • Appending is not implemented.
    • You can specify a buffer_size on creation to buffer more data and potentially make reading more efficient.
  • LocalBlobFile - like BlobFile() but operations take place on a local file.
    • Reading is done by downloading the file during the constructor.
    • Writing is done by uploading the file on close() or during destruction.
    • Appending is done by downloading the file during construction and uploading on close().
    • You can pass a cache_dir parameter to cache files for reading. You are responsible for cleaning up the cache directory.

Some are inspired by existing os.path and shutil functions:

  • copy - copy a file from one path to another; performs a server-side copy when both paths are on the same blob storage service
  • exists - returns True if the file or directory exists
  • glob - return files matching a pattern; on GCS only a single * operator is supported. It can also be slow when the * appears early in the pattern, since GCS can only do prefix matches and all remaining filtering must happen locally
  • isdir - returns True if the path is a directory
  • listdir - list contents of a directory as a generator
  • makedirs - ensure that a directory and all parent directories exist
  • remove - remove a file
  • rmdir - remove an empty directory
  • rmtree - remove a directory tree
  • stat - get the size and modification time of a file
  • walk - walk a directory tree with a generator that yields (dirpath, dirnames, filenames) tuples
  • basename - get the final component of a path
  • dirname - get the path except for the final component
  • join - join 2 or more paths together, inserting directory separators between each component

There are a few bonus functions:

  • get_url - returns a url for a path along with the expiration for that url (or None)
  • md5 - get the md5 hash for a path, for GCS this is fast, but for other backends this may be slow
  • set_log_callback - set a log callback function log(msg: str) to use instead of printing to stdout

Examples

Write and read a file:

import blobfile as bf

with bf.BlobFile("gs://my-bucket/file.name", "wb") as f:
    f.write(b"meow")

print("exists:", bf.exists("gs://my-bucket/file.name"))

with bf.BlobFile("gs://my-bucket/file.name", "rb") as f:
    print("contents:", f.read())

Parallel execution:

import blobfile as bf
import multiprocessing as mp
import tqdm

filenames = [f"{i}.ext" for i in range(1000)]

with mp.Pool() as pool:
    for filename, exists in zip(filenames, pool.imap(bf.exists, filenames)):
        print(filename, exists)

Project details



Download files


Source Distributions

No source distribution files available for this release.

Built Distribution


blobfile-0.11.0-py3-none-any.whl (33.0 kB)

Uploaded Python 3

File details

Details for the file blobfile-0.11.0-py3-none-any.whl.

File metadata

  • Download URL: blobfile-0.11.0-py3-none-any.whl
  • Size: 33.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.3

File hashes

Hashes for blobfile-0.11.0-py3-none-any.whl
Algorithm Hash digest
SHA256 ff3001473f8ad526fc41740348a468865dd681cbe70793c9a28a7aa611529de9
MD5 339b19aa53db0c5c8f1fea3fb54d41d6
BLAKE2b-256 071c31a8aa2459aae179dc1db895e02b395a110393d4ba0a10ca84c1cbfcff56

