Python SDK for Vercel

Project description

Vercel Python SDK

Installation

pip install vercel

Requirements

  • Python 3.10+

Usage

This package provides both synchronous and asynchronous clients to interact with the Vercel API.
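
Sync and async variants live in parallel modules and largely mirror each other's APIs. A minimal sketch of the parallel layout, using only the entry points shown in the sections below:

# Sync entry points
from vercel.blob import BlobClient
from vercel.cache import get_cache

# Async counterparts
from vercel.blob import AsyncBlobClient
from vercel.cache.aio import get_cache as get_async_cache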



Headers and request context

from typing import Callable

from fastapi import FastAPI, Request
from vercel.headers import geolocation, ip_address, set_headers

app = FastAPI()

@app.middleware("http")
async def vercel_context_middleware(request: Request, call_next: Callable):
    # Make the Vercel request headers available to the SDK helpers for this request.
    set_headers(request.headers)
    return await call_next(request)

@app.get("/api/headers")
async def headers_info(request: Request):
    # Resolve the client IP and geolocation for the current request.
    ip = ip_address(request.headers)
    geo = geolocation(request)
    return {"ip": ip, "geo": geo}


Runtime Cache

Sync

from vercel.cache import get_cache

def main():
    cache = get_cache(namespace="demo")

    cache.delete("greeting")
    cache.set("greeting", {"hello": "world"}, {"ttl": 60, "tags": ["demo"]})
    value = cache.get("greeting")  # dict or None
    cache.expire_tag("demo")        # invalidate by tag

Sync Client

from vercel.cache import RuntimeCache

cache = RuntimeCache(namespace="demo")

def main():
    cache.delete("greeting")
    cache.set("greeting", {"hello": "world"}, {"ttl": 60, "tags": ["demo"]})
    value = cache.get("greeting")  # dict or None
    cache.expire_tag("demo")        # invalidate by tag

Async

from vercel.cache.aio import get_cache

async def main():
    cache = get_cache(namespace="demo")

    await cache.delete("greeting")
    await cache.set("greeting", {"hello": "world"}, {"ttl": 60, "tags": ["demo"]})
    value = await cache.get("greeting")  # dict or None
    await cache.expire_tag("demo")        # invalidate by tag

Async Client

from vercel.cache import AsyncRuntimeCache

cache = AsyncRuntimeCache(namespace="demo")

async def main():
    await cache.delete("greeting")
    await cache.set("greeting", {"hello": "world"}, {"ttl": 60, "tags": ["demo"]})
    value = await cache.get("greeting")  # dict or None
    await cache.expire_tag("demo")        # invalidate by tag



Vercel OIDC Tokens

from typing import Callable

from fastapi import FastAPI, Request
from vercel.headers import set_headers
from vercel.oidc import decode_oidc_payload, get_vercel_oidc_token
# async
# from vercel.oidc.aio import get_vercel_oidc_token

app = FastAPI()

@app.middleware("http")
async def vercel_context_middleware(request: Request, call_next: Callable):
    set_headers(request.headers)
    return await call_next(request)

@app.get("/oidc")
def oidc():
    token = get_vercel_oidc_token()
    payload = decode_oidc_payload(token)
    user_id = payload.get("user_id")
    project_id = payload.get("project_id")

    return {
        "user_id": user_id,
        "project_id" project_id,
    }

Notes:

  • When running locally, token refresh requires a valid Vercel CLI login on the machine running the code.
  • Project info is resolved from .vercel/project.json.
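
For async routes, the commented-out import above points at an async variant of the token helper. A minimal sketch of using it from an async endpoint, assuming the aio variant is awaitable and returns the same token string:

from fastapi import FastAPI
from vercel.oidc import decode_oidc_payload
from vercel.oidc.aio import get_vercel_oidc_token

app = FastAPI()

@app.get("/oidc-async")
async def oidc_async():
    token = await get_vercel_oidc_token()  # assumed awaitable (aio variant)
    payload = decode_oidc_payload(token)
    return {"project_id": payload.get("project_id")}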



Blob Storage

Requires BLOB_READ_WRITE_TOKEN to be set as an environment variable, or a token to be passed explicitly when constructing a client.
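
For example, either rely on the environment variable or pass the token yourself (the MY_BLOB_TOKEN name below is purely illustrative):

import os

from vercel.blob import BlobClient

client_from_env = BlobClient()  # reads BLOB_READ_WRITE_TOKEN from the environment
client_explicit = BlobClient(token=os.environ["MY_BLOB_TOKEN"])  # illustrative variable name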

Sync

from vercel.blob import BlobClient

client = BlobClient()  # or BlobClient(token="...")

# Create a folder entry, upload a local file, list, then download
client.create_folder("examples/assets", overwrite=True)
uploaded = client.upload_file(
    "./README.md",
    "examples/assets/readme-copy.txt",
    access="public",
    content_type="text/plain",
)
listing = client.list_objects(prefix="examples/assets/")
client.download_file(uploaded.url, "/tmp/readme-copy.txt", overwrite=True)

Async

import asyncio
from vercel.blob import AsyncBlobClient

async def main():
    client = AsyncBlobClient()  # uses BLOB_READ_WRITE_TOKEN from env

    # Upload bytes
    uploaded = await client.put(
        "examples/assets/hello.txt",
        b"hello from python",
        access="public",
        content_type="text/plain",
    )

    # Inspect metadata, list, download bytes, then delete
    meta = await client.head(uploaded.url)
    listing = await client.list_objects(prefix="examples/assets/")
    content = await client.get(uploaded.url)
    await client.delete([b.url for b in listing.blobs])

asyncio.run(main())


Multipart Uploads

For large files, the SDK provides three approaches with different trade-offs:

1. Automatic (Simplest)

The SDK handles everything automatically:

# The async helper is assumed to be exported alongside the sync one.
from vercel.blob import auto_multipart_upload, auto_multipart_upload_async

# Synchronous
result = auto_multipart_upload(
    "large-file.bin",
    large_data,  # bytes, file object, or iterator
    part_size=8 * 1024 * 1024,  # 8MB parts (default)
)

# Asynchronous
result = await auto_multipart_upload_async(
    "large-file.bin",
    large_data,
)
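
Since the data argument can also be a file object or an iterator, a large file can be streamed from disk without loading it into memory first. A minimal sketch with the sync helper (file name and part size are illustrative):

from vercel.blob import auto_multipart_upload

with open("large-file.bin", "rb") as fh:
    result = auto_multipart_upload(
        "large-file.bin",
        fh,                          # file object instead of in-memory bytes
        part_size=8 * 1024 * 1024,   # 8MB parts (default)
    )
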
2. Uploader Pattern (Recommended)

A middle-ground that provides a clean API while giving you control over parts and concurrency:

from vercel.blob import BlobClient

# Create the uploader (initializes the upload)
client = BlobClient()
uploader = client.create_multipart_uploader("large-file.bin", content_type="application/octet-stream")

# Upload parts (you control when and how)
parts = []
for i, chunk in enumerate(chunks, start=1):
    part = uploader.upload_part(i, chunk)
    parts.append(part)

# Complete the upload
result = uploader.complete(parts)

Async version with concurrent uploads:

import asyncio

from vercel.blob import AsyncBlobClient

client = AsyncBlobClient()
uploader = await client.create_multipart_uploader("large-file.bin")

# Upload parts concurrently
tasks = [uploader.upload_part(i, chunk) for i, chunk in enumerate(chunks, start=1)]
parts = await asyncio.gather(*tasks)

# Complete
result = await uploader.complete(parts)

The uploader pattern is ideal when you:

  • Want to control how parts are created (e.g., stream from disk, manage memory; see the sketch after this list)
  • Need custom concurrency control
  • Want a cleaner API than the manual approach
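
For instance, a minimal sketch of streaming parts from a local file with the sync uploader (file name and 8MB chunk size are illustrative):

from vercel.blob import BlobClient

client = BlobClient()
uploader = client.create_multipart_uploader("large-file.bin")

# Read fixed-size chunks so only one part is held in memory at a time.
parts = []
with open("large-file.bin", "rb") as fh:
    part_number = 1
    while chunk := fh.read(8 * 1024 * 1024):
        parts.append(uploader.upload_part(part_number, chunk))
        part_number += 1

result = uploader.complete(parts)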

Notes:

  • Part numbers must be in the range 1..10,000.
  • add_random_suffix defaults to True for the uploader (matches TS SDK); manual create defaults to False.
  • Abort/cancel: an abortable uploader API is not yet exposed (future enhancement).
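
If the random suffix is not wanted, it can presumably be disabled when creating the uploader. A sketch, assuming add_random_suffix is accepted as a keyword argument by create_multipart_uploader:

from vercel.blob import BlobClient

client = BlobClient()
uploader = client.create_multipart_uploader(
    "large-file.bin",
    add_random_suffix=False,  # assumption: keyword mirrors the default described above
)
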
3. Manual (Most Control)

Full control over each step, but more verbose:

from vercel.blob import (
    create_multipart_upload,
    upload_part,
    complete_multipart_upload,
)

# Phase 1: Create
resp = create_multipart_upload("large-file.bin")
upload_id = resp["uploadId"]
key = resp["key"]

# Phase 2: Upload parts
part1 = upload_part(
    "large-file.bin",
    chunk1,
    upload_id=upload_id,
    key=key,
    part_number=1,
)
part2 = upload_part(
    "large-file.bin",
    chunk2,
    upload_id=upload_id,
    key=key,
    part_number=2,
)

# Phase 3: Complete
result = complete_multipart_upload(
    "large-file.bin",
    [part1, part2],
    upload_id=upload_id,
    key=key,
)

See examples/multipart_uploader.py for complete working examples.

Development

  • Lint/typecheck/tests:
      uv pip install -e .[dev]
      uv run ruff format --check && uv run ruff check . && uv run mypy src && uv run pytest -v
  • CI runs lint, typecheck, examples as smoke tests, and builds wheels.
  • Publishing: push a tag (vX.Y.Z) that matches project.version to publish via PyPI Trusted Publishing.

License

MIT


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

vercel-0.5.0.tar.gz (60.2 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

vercel-0.5.0-py3-none-any.whl (73.9 kB)

Uploaded Python 3

File details

Details for the file vercel-0.5.0.tar.gz.

File metadata

  • Download URL: vercel-0.5.0.tar.gz
  • Upload date:
  • Size: 60.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for vercel-0.5.0.tar.gz
Algorithm Hash digest
SHA256 876f7c5f97eb1acb3cc2a027ec73809dae50b07e3ba4f4f3cb22a60486401839
MD5 15467af0b872ab8be3e50b721bfa3aa8
BLAKE2b-256 7950f5abe32c1ca3f92f6bb41e1dd732d515042f30baa3e7610c1f1ed8dd1a5f

See more details on using hashes here.

Provenance

The following attestation bundles were made for vercel-0.5.0.tar.gz:

Publisher: publish.yml on vercel/vercel-py

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file vercel-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: vercel-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 73.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for vercel-0.5.0-py3-none-any.whl
Algorithm Hash digest
SHA256 2de1bcabee2123e0d7edb5bd5a9bc496aa93a5742e15b0ca073bd0536ecb914f
MD5 9e01285f02b19ef04a29d96845d06f3b
BLAKE2b-256 50514a7a1045c3cda5233fd9429e05cc26c661203679e30e4c385a9da1f740ea

See more details on using hashes here.

Provenance

The following attestation bundles were made for vercel-0.5.0-py3-none-any.whl:

Publisher: publish.yml on vercel/vercel-py

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
