
MonaiAdapter

Bases: Transform

Wraps a MONAI transform for use in TorchIO pipelines.

This adapter allows using MONAI transforms within TorchIO workflows. Both dictionary transforms (e.g., NormalizeIntensityd) and array transforms (e.g., NormalizeIntensity) are supported.

The adapter handles conversion between TorchIO's Subject (where values are Image objects) and MONAI's expected format (where values are tensors or MetaTensor objects). Image tensors are passed as MetaTensor instances with the affine matrix embedded, so spatial transforms (e.g., cropping, resizing) correctly propagate affine changes.

Dictionary transforms (subclasses of MONAI's MapTransform) operate on the full subject dictionary — only the keys specified in the MONAI transform are modified.

Array transforms (all other callables) are applied to each image in the subject individually, respecting the include and exclude parameters inherited from Transform.
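The two dispatch modes above can be sketched in plain Python. This is an illustrative toy, not TorchIO's actual implementation: `apply_adapter`, `is_map_transform`, and the list-based "images" are all hypothetical stand-ins.

```python
# Toy sketch of the adapter's two dispatch modes (hypothetical names;
# plain lists stand in for image tensors).

def apply_adapter(transform, subject, is_map_transform):
    if is_map_transform:
        # Dictionary transform: receives the whole subject dict and
        # modifies only the keys it was configured with.
        return transform(subject)
    # Array transform: applied to each image value independently.
    return {key: transform(image) for key, image in subject.items()}

# A "dictionary transform" that touches only the 't1' key
# (here: shift intensities so the minimum is zero).
def normalize_t1_only(subject):
    out = dict(subject)
    out['t1'] = [v - min(out['t1']) for v in out['t1']]
    return out

subject = {'t1': [2, 4, 6], 'seg': [1, 1, 1]}

# Dict mode: only 't1' changes.
dict_result = apply_adapter(normalize_t1_only, subject, is_map_transform=True)
# Array mode: the callable runs on every image.
array_result = apply_adapter(
    lambda img: [v * 2 for v in img], subject, is_map_transform=False
)
```

In the real adapter, the dict branch corresponds to MONAI `MapTransform` subclasses and the array branch to any other callable, filtered by the `include`/`exclude` parameters.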

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| monai_transform | Callable | A MONAI transform (dictionary or array) or a MONAI-compatible callable. Requires MONAI to be installed, e.g., via `pip install torchio[monai]`. | required |
| **kwargs | | See Transform for additional keyword arguments. | {} |

Examples:

>>> import torch
>>> import torchio as tio
>>> from monai.transforms import NormalizeIntensity
>>> from monai.transforms import NormalizeIntensityd
>>> from monai.transforms import RandSpatialCropd
>>> subject = tio.Subject(
...     t1=tio.ScalarImage(tensor=torch.randn(1, 64, 64, 64)),
...     seg=tio.LabelMap(tensor=torch.ones(1, 64, 64, 64)),
... )
>>> # Array transform — applied to each image
>>> adapter = tio.MonaiAdapter(NormalizeIntensity())
>>> transformed = adapter(subject)
>>> # Dictionary transform — applied to specified keys
>>> adapter = tio.MonaiAdapter(NormalizeIntensityd(keys=["t1"]))
>>> transformed = adapter(subject)
>>> # Spatial dict transform (affine is updated)
>>> adapter = tio.MonaiAdapter(
...     RandSpatialCropd(keys=["t1", "seg"], roi_size=[32, 32, 32]),
... )
>>> transformed = adapter(subject)
>>> # Inside a Compose pipeline
>>> pipeline = tio.Compose([
...     tio.ToCanonical(),
...     tio.MonaiAdapter(NormalizeIntensity()),
...     tio.RandomFlip(),
... ])
>>> transformed = pipeline(subject)

__call__(data)

__call__(data: Subject) -> Subject
__call__(data: ImageT) -> ImageT
__call__(data: torch.Tensor) -> torch.Tensor
__call__(data: np.ndarray) -> np.ndarray
__call__(data: sitk.Image) -> sitk.Image
__call__(data: dict[str, object]) -> dict[str, object]
__call__(data: nib.Nifti1Image) -> nib.Nifti1Image

Transform data and return a result of the same type.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| data | TypeTransformInput | Instance of torchio.Subject, 4D torch.Tensor or numpy.ndarray with dimensions \((C, W, H, D)\), where \(C\) is the number of channels and \(W, H, D\) are the spatial dimensions. If the input is a tensor, the affine matrix will be set to identity. Other valid input types are a SimpleITK image, a torchio.Image, a NiBabel Nifti1 image or a dict. The output type is the same as the input type. | required |
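The identity-affine behavior for bare tensor input can be sketched with numpy. This is a minimal illustration; `wrap_bare_array` is a hypothetical helper, not part of the TorchIO API.

```python
import numpy as np

def wrap_bare_array(data):
    # Sketch: a bare 4D array (C, W, H, D) carries no spatial metadata,
    # so it is paired with an identity affine, mirroring what TorchIO
    # does when __call__ receives a plain tensor or ndarray.
    if data.ndim != 4:
        raise ValueError(f'expected 4D (C, W, H, D) input, got {data.ndim}D')
    return {'data': data, 'affine': np.eye(4)}

image = wrap_bare_array(np.zeros((1, 8, 8, 8)))
```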

to_hydra_config()

Not supported — MONAI transforms are not serializable.

Raises:

Type Description
NotImplementedError

Always.

apply_transform(subject)

Apply the wrapped MONAI transform to a subject.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| subject | Subject | TorchIO subject to transform. | required |

Returns:

| Type | Description |
| --- | --- |
| Subject | The transformed subject with updated tensor data and, for spatial transforms, updated affine matrices. |