Bio image reading, metadata and some affine registration.

Project description

ndbioimage - Work in progress

Exposes (bio) images as a numpy ndarray-like object without loading the whole image into memory: data is read from the file only when needed. Some metadata is read and stored in an OME structure. Additionally, it can automatically calculate an affine transform that corrects for chromatic aberrations etc. and apply it to the image on the fly.

Currently supports ImageJ tif files, Zeiss czi files, Micro-Manager tif sequences, and anything Bio-Formats can handle.

Installation

pip install ndbioimage

Usage

  • Reading an image file and plotting the frame at channel=2, time=1
import matplotlib.pyplot as plt
from ndbioimage import Imread
with Imread('image_file.tif', axes='ctxy', dtype=int) as im:
    plt.imshow(im[2, 1])
  • Showing some image metadata
from ndbioimage import Imread
from pprint import pprint
with Imread('image_file.tif') as im:
    pprint(im)
  • Slicing the image without loading the image into memory
from ndbioimage import Imread
with Imread('image_file.tif', axes='cztxy') as im:
    sliced_im = im[1, :, :, 100:200, 100:200]

sliced_im is an instance of Imread that loads image data from the file only when needed
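The lazy-view pattern behind this can be illustrated in plain numpy (a schematic sketch, not ndbioimage's actual internals): a view records the requested slice and defers reading until it is materialized.

```python
# Sketch of a lazy view: the slice is recorded immediately, but the
# underlying data is read only when the view is converted to an array.
import numpy as np

class LazyView:
    def __init__(self, loader, index):
        self.loader = loader  # callable that produces the full array
        self.index = index    # slice recorded here; nothing is read yet

    def __array__(self):
        # called by np.asarray: only now is the data actually loaded
        return self.loader()[self.index]

view = LazyView(lambda: np.arange(100).reshape(10, 10),
                (slice(2, 4), slice(0, 3)))
arr = np.asarray(view)  # materializes rows 2-3, columns 0-2
print(arr.shape)  # (2, 3)
```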

  • Converting (part) of the image to a numpy ndarray
from ndbioimage import Imread
import numpy as np
with Imread('image_file.tif', axes='cztxy') as im:
    array = np.asarray(im[0, 0])

Adding more formats

Readers for image formats subclass Imread. When an image reader is imported, Imread automatically recognizes it and uses it to open the appropriate file format. Readers are required to implement the following methods:

  • staticmethod _can_open(path): return True if the path can be opened by this reader
  • property ome: reads metadata from the file and adds it to an OME object imported from the ome-types library
  • __frame__(self, c, z, t): returns the frame at channel=c, z-slice=z, time=t from the file

Optional methods:

  • open(self): open any file handles needed for reading
  • close(self): close any file handles

Optional fields:

  • priority (int): Imread tries readers with lower numbers first (default: 99)
  • do_not_pickle (strings): names of attributes that should not be included when the object is pickled, for example file handles
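Put together, a reader might look like the following sketch. A real reader would subclass ndbioimage.Imread; here a stand-in base class keeps the example self-contained, and MyFormatReader, the .myf extension, and the fake file contents are all hypothetical.

```python
import numpy as np

class Imread:  # stand-in for ndbioimage.Imread, so the sketch runs alone
    pass

class MyFormatReader(Imread):
    priority = 10            # tried before readers with higher numbers
    do_not_pickle = 'file',  # file handles cannot be pickled

    @staticmethod
    def _can_open(path):
        # claim only files with our hypothetical extension
        return str(path).endswith('.myf')

    def open(self):
        # a real reader would open the file here; we fake 5x4 frames
        self.file = {'shape': (5, 4)}

    def close(self):
        self.file = None

    def __frame__(self, c, z, t):
        # return the 2D frame at channel=c, z-slice=z, time=t
        return np.full(self.file['shape'], c + z + t, dtype=int)

reader = MyFormatReader()
reader.open()
frame = reader.__frame__(1, 0, 2)
print(MyFormatReader._can_open('example.myf'))  # True
print(frame.shape)  # (5, 4)
reader.close()
```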

TODO

  • more image formats
  • re-implement transforms
