Bio image reading, metadata and some affine registration.

Project description

ndbioimage - Work in progress

Exposes (bio) images as a numpy ndarray-like object without loading the whole image into memory: data is read from the file only when needed. Metadata is read and stored in an OME structure. Additionally, it can automatically calculate an affine transform that corrects for chromatic aberration and similar artifacts, and apply it to the image on the fly.

Currently, it supports ImageJ tif files, czi files, Micro-Manager tif sequences and anything Bio-Formats can handle.
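The chromatic-aberration correction mentioned above amounts to warping a channel with a small affine transform. The following is a conceptual sketch in plain NumPy, not ndbioimage's internal API; the identity matrix and one-pixel offset are hypothetical values chosen for illustration:

```python
import numpy as np


def apply_affine(frame, matrix, offset):
    """Warp a 2D frame by inverse-mapping each output pixel through an
    affine transform (nearest-neighbour sampling, zeros outside the image)."""
    h, w = frame.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing='ij')
    dst = np.stack([ys.ravel(), xs.ravel()])   # output pixel coords, shape (2, h*w)
    src = matrix @ dst + offset[:, None]       # corresponding input coords
    src = np.rint(src).astype(int)
    inside = (src[0] >= 0) & (src[0] < h) & (src[1] >= 0) & (src[1] < w)
    out = np.zeros_like(frame)
    out_flat = out.reshape(-1)                 # writable view into out
    out_flat[inside] = frame[src[0][inside], src[1][inside]]
    return out


# hypothetical per-channel correction: shift a channel up by one pixel
frame = np.zeros((32, 32), dtype=int)
frame[10, 10] = 255
corrected = apply_affine(frame, np.eye(2), np.array([1.0, 0.0]))
```

In ndbioimage this kind of transform is computed automatically and applied transparently during reading, so frames come out already registered.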

Installation

pip install ndbioimage

Usage

  • Reading an image file and plotting the frame at channel=2, time=1
import matplotlib.pyplot as plt
from ndbioimage import Imread
with Imread('image_file.tif', axes='ctxy', dtype=int) as im:
    plt.imshow(im[2, 1])
  • Showing some image metadata
from ndbioimage import Imread
from pprint import pprint
with Imread('image_file.tif') as im:
    pprint(im)
  • Slicing the image without loading it into memory
from ndbioimage import Imread
with Imread('image_file.tif', axes='cztxy') as im:
    sliced_im = im[1, :, :, 100:200, 100:200]

sliced_im is an instance of Imread that will load image data from the file only when needed.

  • Converting (part of) the image to a numpy ndarray
from ndbioimage import Imread
import numpy as np
with Imread('image_file.tif', axes='cztxy') as im:
    array = np.asarray(im[0, 0])

Adding more formats

Readers for image formats subclass AbstractReader. When an image reader is imported, Imread will automatically recognize it and use it to open files in the appropriate format. Image readers are required to implement the following methods:

  • staticmethod _can_open(path): return True if path can be opened by this reader
  • property ome: reads metadata from the file and returns it as an OME object (from the ome-types library)
  • __frame__(self, c, z, t): return the frame at channel=c, z-slice=z, time=t from the file

Optional methods:

  • open(self): maybe open some file handle
  • close(self): close any file handles

Optional fields:

  • priority (int): Imread will try readers with a lower number first (default: 99)
  • do_not_pickle (strings): names of any attributes that should not be included when the object is pickled, for example file handles
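As a rough illustration of this interface, here is a minimal sketch of a reader for a hypothetical `.myf` format. The base class is omitted so the sketch stays self-contained (a real reader would subclass ndbioimage's AbstractReader), and synthetic frame data stands in for actual file reads:

```python
import numpy as np


class MyFormatReader:  # in practice: class MyFormatReader(AbstractReader)
    """Sketch of a reader for a hypothetical '.myf' format."""

    priority = 50            # tried before readers with the default of 99
    do_not_pickle = ('fh',)  # file handles cannot be pickled

    def __init__(self, path):
        self.path = path
        self.fh = None

    @staticmethod
    def _can_open(path):
        # claim only files with our extension
        return str(path).endswith('.myf')

    def open(self):
        # a real reader would open a file handle here
        self.fh = object()

    def close(self):
        self.fh = None

    @property
    def ome(self):
        # a real reader would populate and return an ome_types.OME object
        # with metadata read from the file
        raise NotImplementedError

    def __frame__(self, c, z, t):
        # return the 2D frame at channel=c, z-slice=z, time=t;
        # synthetic data stands in for reading from the file
        return np.full((64, 64), c + 10 * z + 100 * t, dtype=int)
```

Once such a class is imported, Imread asks each registered reader (in priority order) whether it `_can_open` the path and delegates frame access to `__frame__`.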

TODO

  • more image formats

Download files

Source Distribution

ndbioimage-2023.12.2.tar.gz (127.0 kB)
SHA256: 8954dbec736a78d1cff2b966a84d442a454061f1f82f2413e415bc1651cab284

Built Distribution

ndbioimage-2023.12.2-py3-none-any.whl (134.2 kB)
SHA256: 092eebd9a013cbf0adfa377dfab1a1d1ab45b168babdd626d3938bd7f9c5ab5d
