# cachetory

Caching library with support for multiple cache backends.

## Quickstart
```python
from cachetory import serializers
from cachetory.backends import async_ as async_backends
from cachetory.caches.async_ import Cache

cache = Cache[int](
    serializer=serializers.from_url("pickle+zstd://?pickle-protocol=4&compression-level=3"),
    backend=(await async_backends.from_url("redis://localhost:6379")),
)

async with cache:
    await cache.set("foo", 42)
    assert await cache.get("foo") == 42
```
## Non-async
```python
from cachetory import serializers
from cachetory.backends import sync as sync_backends
from cachetory.caches.sync import Cache

cache = Cache[int](
    serializer=serializers.from_url("pickle+zstd://"),
    backend=sync_backends.from_url("redis://localhost:6379"),
)

with cache:
    cache.set("foo", 42)
    assert cache.get("foo") == 42
```
## Tutorial

### Supported operations
| Operation | Description |
|---|---|
| `get(key, default)` | Retrieve a value (or return a default one) |
| `__getitem__(key)` | Retrieve a value or raise `KeyError` (sync `Cache` only) |
| `get_many(*keys)` | Retrieve many values as a dictionary |
| `set(key, value, *, time_to_live, if_not_exists)` | Set a value and return whether the value has been changed |
| `__setitem__(key, value)` | Set a value (sync `Cache` only) |
| `set_many(items)` | Set many values |
| `expire_in(key, time_to_live)` | Set an expiration duration on a key |
| `delete(key)` | Delete a key and return whether the key existed |
| `__delitem__(key)` | Delete a key (sync `Cache` only) |
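The `set` semantics above, in particular the `if_not_exists` flag and the returned changed-flag, can be sketched with a plain dictionary standing in for a real backend (`set_value` is a hypothetical helper for illustration, not part of cachetory's API):

```python
# A plain dict stands in for a real cache backend in this sketch.
store: dict = {}


def set_value(key, value, if_not_exists: bool = False) -> bool:
    """Set a value and return whether the value has been changed."""
    if if_not_exists and key in store:
        return False  # the existing value is kept untouched
    store[key] = value
    return True


assert set_value("foo", 42) is True                       # key was created
assert set_value("foo", 43, if_not_exists=True) is False  # key already exists
assert store["foo"] == 42                                 # original value kept
```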
### Instantiating a `Cache`

Both the sync and async `Cache` require at least these parameters to work:

- `backend`: functions as the actual storage
- `serializer`: responsible for converting values to and from a form the backend is able to store

A `Cache` may be annotated with a value type, as in `Cache[ValueT]`, which provides type hints for the cache methods.
### Instantiating a backend

There are a few ways to instantiate a backend:

- By directly instantiating a backend class via its `__init__`
- By instantiating a specific backend class via its `from_url` class method; in this case the URL is forwarded to the underlying client (if any)
- By using the `cachetory.backends.[sync|async_].from_url` function; in this case the specific backend class is chosen by the URL's scheme (see the supported backends below), and the URL is forwarded to its `from_url` class method. This is especially useful for configuring an arbitrary backend from a single configuration option instead of hard-coding a specific backend class.
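The scheme-based selection in the last option can be sketched as a simple lookup (a hypothetical registry for illustration, not cachetory's actual internals):

```python
from urllib.parse import urlparse

# Hypothetical scheme-to-backend registry; cachetory's real dispatch is internal.
BACKEND_BY_SCHEME = {
    "memory": "MemoryBackend",
    "redis": "RedisBackend",
    "dummy": "DummyBackend",
}


def backend_name_for(url: str) -> str:
    """Pick a backend class name by the URL's scheme."""
    return BACKEND_BY_SCHEME[urlparse(url).scheme]


assert backend_name_for("redis://localhost:6379/1") == "RedisBackend"
assert backend_name_for("memory://") == "MemoryBackend"
```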
#### Examples

```python
import redis

import cachetory.backends.async_
import cachetory.backends.sync

backend = cachetory.backends.sync.from_url("memory://")
backend = await cachetory.backends.async_.from_url("dummy://")
backend = cachetory.backends.sync.RedisBackend(redis.Redis(...))
backend = await cachetory.backends.async_.from_url("redis://localhost:6379/1")
```
### Instantiating a serializer

Instantiating a serializer is very similar to instantiating a backend. To instantiate one by a URL, use `cachetory.serializers.from_url`; unlike the backend case, there are no separate sync and async versions.

`cachetory.serializers.from_url` supports joining schemes with `+`, as in `pickle+zlib://`. In that case multiple serializers are instantiated and applied sequentially: in this example, a value is first serialized by `pickle`, and the serialized value is then compressed by `zlib`. Deserialization happens, of course, in the opposite order.
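This chaining can be sketched with the standard library alone: serialize first, compress second, and invert the steps in reverse order when reading. This is a sketch of the idea, not cachetory's implementation:

```python
import pickle
import zlib


def serialize(value) -> bytes:
    # Apply the chain left to right: pickle first, then zlib compression.
    return zlib.compress(pickle.dumps(value))


def deserialize(blob: bytes):
    # Invert the chain in the opposite order: decompress first, then unpickle.
    return pickle.loads(zlib.decompress(blob))


assert deserialize(serialize({"answer": 42})) == {"answer": 42}
```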
#### Examples

```python
import pickle

import cachetory.serializers

serializer = cachetory.serializers.from_url("pickle+zstd://")
serializer = cachetory.serializers.from_url("pickle+zstd://?pickle-protocol=4&compression-level=3")
serializer = cachetory.serializers.from_url("null://")
serializer = cachetory.serializers.NoopSerializer()
serializer = cachetory.serializers.PickleSerializer(pickle_protocol=pickle.DEFAULT_PROTOCOL)
```
## Decorators

### Decorate a function with `@cached`

`@cached` performs memoization of the wrapped function:
```python
from cachetory.caches.sync import Cache
from cachetory.decorators.shared import make_default_key
from cachetory.decorators.sync import cached

cache = Cache[int](backend=..., serializer=...)
another_cache = Cache[int](backend=..., serializer=...)


@cached(
    cache,  # may also be a callable that returns a specific cache for each call, e.g.:
    # `cache=lambda wrapped_callable, *args, **kwargs: cache if … else another_cache`
    # The following parameters are optional (the defaults are shown):
    make_key=make_default_key,  # cache key generator
    time_to_live=None,  # forwarded to `Cache.set`
    if_not_exists=False,  # forwarded to `Cache.set`
)
def expensive_function() -> int:
    return 42
```
## Supported backends

Each backend supports particular URL schemes and may require a package extra to be installed:
### Redis

| Sync | Async |
|---|---|
| `cachetory.backends.sync.RedisBackend` | `cachetory.backends.async_.RedisBackend` |
The URL is forwarded to the underlying client, which means you can use whatever options the client provides. The only special case is `redis+unix://`: the leading `redis+` is stripped first, and the rest is forwarded to the client.

All cache operations are atomic in both the sync and async versions, including `get_many` and `set_many`.

The `hiredis-async` extra uses `aioredis` with the `hiredis` parser.
### Memory

| Sync | Async |
|---|---|
| `cachetory.backends.sync.MemoryBackend` | `cachetory.backends.async_.MemoryBackend` |
Simple memory backend that stores values in a plain dictionary.
Note the following caveats:
- This backend does not copy values, meaning that mutating a stored value mutates it in the backend too. If this is not desirable, consider using another serializer, or write your own serializer which copies values in its `serialize` method.
- Expired items are actually deleted only when accessed. If you put a value into the backend and never try to retrieve it, it will stay in memory forever.
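The first caveat can be demonstrated with a plain dict standing in for `MemoryBackend`'s internal dictionary, together with the copying workaround suggested above:

```python
import copy

backend: dict = {}  # stands in for MemoryBackend's internal dictionary

value = [1, 2, 3]
backend["key"] = value  # no copy is made...
value.append(4)  # ...so mutating the value mutates the stored one too
assert backend["key"] == [1, 2, 3, 4]

# A serializer that copies values in its `serialize` method avoids this:
backend["safe"] = copy.deepcopy(value)
value.append(5)
assert backend["safe"] == [1, 2, 3, 4]  # the stored copy is unaffected
```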
### Dummy

| Sync | Async |
|---|---|
| `cachetory.backends.sync.DummyBackend` | `cachetory.backends.async_.DummyBackend` |
Dummy backend that always succeeds but never stores anything. Any value is forgotten immediately, and all operations behave as if the cache were always empty.
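That contract can be sketched in a few lines (a sketch of the behavior, not the actual class):

```python
class DummySketch:
    """Every write "succeeds", every read misses."""

    def set(self, key, value) -> bool:
        return True  # pretends the value was stored

    def get(self, key, default=None):
        return default  # nothing is ever actually stored


cache = DummySketch()
assert cache.set("foo", 42) is True
assert cache.get("foo") is None  # the value was forgotten immediately
```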
## Supported serializers

### Pickle

Serializes using the standard `pickle` module.

| Class |
|---|
| `cachetory.serializers.PickleSerializer` |

| URL parameter | Description |
|---|---|
| `pickle-protocol` | Version of the pickle protocol |
### No-operation

| Class |
|---|
| `cachetory.serializers.NoopSerializer` |

`NoopSerializer` does nothing and simply passes values back and forth unchanged. Most backends do not support that and require some kind of serialization.

However, it is possible to use `NoopSerializer` with `MemoryBackend`, because the latter just stores all values in a Python dictionary and does not necessarily require them to be serialized.
## Supported compressors

A compressor is just a special case of serializer: a serializer from `bytes` to `bytes` which, by definition, provides some kind of data compression.
It also means that you can use a compressor alone, effectively making a cache of compressed blobs:

```python
from datetime import timedelta

from cachetory.backends.sync import RedisBackend
from cachetory.caches.sync import Cache
from cachetory.serializers.compressors import ZlibCompressor

cache = Cache[bytes](
    serializer=ZlibCompressor(),
    backend=RedisBackend(...),
)

cache.set(
    "my-blob",
    b"this huge blob will be compressed and stored in Redis for an hour",
    time_to_live=timedelta(hours=1),
)
```
### Zlib

Uses the built-in `zlib` module.

| Class |
|---|
| `cachetory.serializers.compressors.ZlibCompressor` |

| URL parameter | Description |
|---|---|
| `compression-level` | From 0 (no compression) to 9 (best compression) |
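The effect of `compression-level` can be observed directly with the built-in module:

```python
import zlib

data = b"cachetory " * 1_000  # highly repetitive, so it compresses well

stored = zlib.compress(data, level=0)  # level 0: stored essentially uncompressed
best = zlib.compress(data, level=9)    # level 9: best (and slowest) compression

assert len(stored) >= len(data)        # no compression, only framing overhead
assert len(best) < len(data)
assert zlib.decompress(best) == data   # lossless either way
```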
### Zstandard

Uses [python-zstd](https://github.com/sergey-dryabzhinsky/python-zstd) for Zstandard compression.

| Class |
|---|
| `cachetory.serializers.compressors.ZstdCompressor` |

| URL parameter | Description |
|---|---|
| `compression-level` | See the [python-zstd API](https://github.com/sergey-dryabzhinsky/python-zstd#api) |
| `compression-threads` | See the [python-zstd API](https://github.com/sergey-dryabzhinsky/python-zstd#api) |
## File details

Details for the file `cachetory-0.2.0.tar.gz`.

- Download URL: cachetory-0.2.0.tar.gz
- Size: 28.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.13

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ad448cf8f4bbc9cca0c717954fbe0ef396f9fd89d120fc5350ea5b3c1477e89a` |
| MD5 | `e0248a5f03bd927d19671a16b72007c7` |
| BLAKE2b-256 | `e5e2e27603f64ca2a891136fe3e9fbd5d6b6d4ff611572b179de7b784fda746e` |
Details for the file `cachetory-0.2.0-py3-none-any.whl`.

- Download URL: cachetory-0.2.0-py3-none-any.whl
- Size: 31.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.13

| Algorithm | Hash digest |
|---|---|
| SHA256 | `b58d17c6d62b1634f1d05e7a3839df3dcd9debddf19b1b398d50e1b4bdd1b471` |
| MD5 | `90586301a6eb2c9c45ba40e1a2c33589` |
| BLAKE2b-256 | `1428f258991e1e444b93c66854ed534025735baf3ba2d9ffef88d8adaaf7a4f7` |