Bio image reading, metadata, and some affine registration.

Project description

ndbioimage - Work in progress

Exposes (bio) images as a numpy ndarray-like object without loading the whole image into memory: data is read from the file only when needed. Some metadata is read and stored in an OME structure. Additionally, it can automatically calculate an affine transform that corrects for chromatic aberrations etc. and apply it to the image on the fly.
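
Such an affine correction can be pictured with a minimal numpy sketch (a hypothetical illustration, not ndbioimage's actual implementation): each output pixel of the corrected channel is sampled from the position the affine map points at.

```python
import numpy as np

def apply_affine(frame, matrix, offset):
    """Nearest-neighbour resampling of a 2D frame under an affine map.

    Output pixel (r, c) is taken from input position matrix @ (r, c) + offset;
    pixels that map outside the frame are filled with 0.
    """
    rows, cols = np.indices(frame.shape)
    coords = np.stack([rows.ravel(), cols.ravel()])          # shape (2, N)
    src = matrix @ coords + np.asarray(offset)[:, None]      # mapped positions
    src = np.rint(src).astype(int)                           # nearest neighbour
    valid = ((src[0] >= 0) & (src[0] < frame.shape[0])
             & (src[1] >= 0) & (src[1] < frame.shape[1]))
    out = np.zeros_like(frame)
    out.ravel()[valid] = frame[src[0][valid], src[1][valid]]
    return out

# identity matrix with a one-pixel shift: row r samples from row r + 1
frame = np.arange(9.).reshape(3, 3)
corrected = apply_affine(frame, np.eye(2), (1, 0))
```

A real chromatic correction would use a matrix and offset estimated from bead calibration data and subpixel interpolation; the sketch only shows where the transform enters.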

Currently supports ImageJ tif files, czi files, Micro-Manager tif sequences, and anything Bio-Formats can handle.

Installation

pip install ndbioimage

Usage

  • Reading an image file and plotting the frame at channel=2, time=1
import matplotlib.pyplot as plt
from ndbioimage import Imread
with Imread('image_file.tif', axes='ctxy', dtype=int) as im:
    plt.imshow(im[2, 1])
  • Showing some image metadata
from ndbioimage import Imread
from pprint import pprint
with Imread('image_file.tif') as im:
    pprint(im)
  • Slicing the image without loading the image into memory
from ndbioimage import Imread
with Imread('image_file.tif', axes='cztxy') as im:
    sliced_im = im[1, :, :, 100:200, 100:200]

sliced_im is an instance of Imread; it will load image data from the file only when needed
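
The deferred-read behaviour can be sketched with a toy wrapper (hypothetical, not ndbioimage's actual code): slicing only records the requested key, and the backing store is touched when the data are materialised, for example by np.asarray.

```python
import numpy as np

class LazyFrames:
    """Toy ndarray-like wrapper: slicing is recorded, reading is deferred."""

    def __init__(self, loader, shape, key=()):
        self._loader = loader   # callable that actually reads the data
        self.shape = shape
        self._key = key         # chain of slices recorded so far

    def __getitem__(self, key):
        # no data is read here; we only remember what was asked for
        return LazyFrames(self._loader, self.shape, self._key + (key,))

    def __array__(self, dtype=None, copy=None):
        # np.asarray ends up here and triggers the real read
        data = self._loader()
        for key in self._key:
            data = data[key]
        return np.asarray(data, dtype=dtype)

reads = []
def loader():
    reads.append(1)                           # count actual reads
    return np.arange(24).reshape(2, 3, 4)

im = LazyFrames(loader, (2, 3, 4))
sliced = im[1][0:2]          # nothing has been read yet
array = np.asarray(sliced)   # the read happens here, once
```

ndbioimage additionally reads only the required planes rather than the whole file; the sketch just shows the record-now, read-later pattern.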

  • Converting (part) of the image to a numpy ndarray
from ndbioimage import Imread
import numpy as np
with Imread('image_file.tif', axes='cztxy') as im:
    array = np.asarray(im[0, 0])

Adding more formats

Readers for image formats subclass Imread. When an image reader is imported, Imread automatically recognizes it and uses it to open the appropriate file format. Every reader is required to implement the following methods:

  • staticmethod _can_open(path): return True if path can be opened by this reader
  • property ome: reads metadata from the file and adds it to an OME object imported from the ome-types library
  • __frame__(self, c, z, t): return the frame at channel=c, z-slice=z, time=t from the file

Optional methods:

  • open(self): open any file handles needed for reading
  • close(self): close any file handles

Optional fields:

  • priority (int): Imread will try readers with a lower number first; default: 99
  • do_not_pickle (strings): any attributes that should not be included when the object is pickled, for example file handles
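
The registration mechanism described above can be sketched like this (a hypothetical stand-in for Imread's internals, assuming subclasses are collected via __init_subclass__; the _can_open and priority names come from the list above, while open_file and the registry details are illustrative):

```python
class Reader:
    """Base class: subclasses register themselves and are tried by priority."""
    _readers = []
    priority = 99

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        Reader._readers.append(cls)
        # lower priority number is tried first
        Reader._readers.sort(key=lambda r: r.priority)

    def __init__(self, path):
        self.path = path

    @staticmethod
    def _can_open(path):
        return False

    @classmethod
    def open_file(cls, path):
        for reader in cls._readers:
            if reader._can_open(path):
                return reader(path)
        raise ValueError(f'no reader can open {path}')

class TifReader(Reader):
    priority = 10
    @staticmethod
    def _can_open(path):
        return path.endswith('.tif')

class FallbackReader(Reader):
    priority = 99
    @staticmethod
    def _can_open(path):
        return True

reader = Reader.open_file('image.tif')   # picks TifReader, not the fallback
```

Merely defining a subclass is enough to register it, which is why importing a reader module makes its format available.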

TODO

  • more image formats
  • re-implement transforms

Download files

Download the file for your platform.

Source Distribution

ndbioimage-2023.7.3.tar.gz (128.2 kB)

Built Distribution


ndbioimage-2023.7.3-py3-none-any.whl (136.0 kB)

File hashes

ndbioimage-2023.7.3.tar.gz (source, uploaded via twine/3.4.1 on CPython/3.8.10)

  SHA256      e7a619d9c4ab663fab45c62c591b60eb646a25e1b7117edcec05d77f610edc86
  MD5         a2afcab9e7d37a2ad321ab481fbabb18
  BLAKE2b-256 457df11c7deae7c6b7b88772990de7cbc430daf15324e41f00315fcf1ccbe745

ndbioimage-2023.7.3-py3-none-any.whl (Python 3, uploaded via twine/3.4.1 on CPython/3.8.10)

  SHA256      89b48e4ceeeb8ac6edc81a2cf41d807dbd9fa1ae8b16f1cddcb1f30de6a963e0
  MD5         01a22b5bec410cc9a951ed197ed7aa00
  BLAKE2b-256 26d59699012ede02dd340169197e49cf572d82bed4fa0bb0931b86de5b627e4b
