
ToReferenceSpace

Bases: SpatialTransform

Modify the spatial metadata so it matches a reference space.

This is useful, for example, to set meaningful spatial metadata of a neural network embedding, for visualization or further processing such as resampling a segmentation output.

Examples:

```python
import torchio as tio

image = tio.datasets.FPG().t1
embedding_tensor = my_network(image.tensor)  # we lose metadata here
embedding_image = tio.ToReferenceSpace.from_tensor(embedding_tensor, image)
```

__call__(data)

Transform data and return a result of the same type.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `data` | `InputType` | Instance of `torchio.Subject`, 4D `torch.Tensor` or `numpy.ndarray` with dimensions \((C, W, H, D)\), where \(C\) is the number of channels and \(W, H, D\) are the spatial dimensions. If the input is a tensor, the affine matrix will be set to identity. Other valid input types are a SimpleITK image, a `torchio.Image`, a NiBabel Nifti1 image or a `dict`. The output type is the same as the input type. | required |
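As a minimal sketch of the expected input layout for a bare array (shapes chosen arbitrarily for illustration; only the \((C, W, H, D)\) convention and the identity-affine behavior come from the description above):

```python
import numpy as np

# A 4D input with one channel and 8x8x8 spatial dimensions: (C, W, H, D)
data = np.zeros((1, 8, 8, 8), dtype=np.float32)

# When a bare tensor/array is passed, no spatial metadata is available,
# so the affine matrix is set to the 4x4 identity
identity_affine = np.eye(4)
```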

get_base_args()

Provides easy access to the arguments used to instantiate the base class (Transform) of any transform.

This method is particularly useful when a new transform can be represented as a variant of an existing transform (e.g. all random transforms), allowing for seamless instantiation of the existing transform with the same arguments as the new transform during apply_transform.

Note

The p argument (probability of applying the transform) is excluded to avoid applying the probability twice, once in the existing transform and once in the new one.
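A hedged sketch of the idea in plain Python. The base-argument names (`copy`, `include`, `exclude`) are illustrative assumptions, not necessarily the real `Transform` signature; the point is that `p` is deliberately left out of the returned dict:

```python
class BaseTransform:
    # Illustrative base class; argument names are assumptions
    def __init__(self, p=1.0, copy=True, include=None, exclude=None):
        self.p = p
        self.copy = copy
        self.include = include
        self.exclude = exclude

    def get_base_args(self):
        # p is excluded: if it were forwarded, the probability would be
        # applied by both the new transform and the reused one
        return {
            'copy': self.copy,
            'include': self.include,
            'exclude': self.exclude,
        }

transform = BaseTransform(p=0.5, copy=False)
args = transform.get_base_args()
```

A variant transform can then be built as `ExistingTransform(**args, degrees=10)` inside `apply_transform`, without compounding `p`.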

add_base_args(arguments, overwrite_on_existing=False)

Add the base-class init arguments to the given arguments, optionally overwriting entries that already exist.

validate_keys_sequence(keys, name) staticmethod

Ensure that the input is not a string but a sequence of strings.
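A sketch of the check this helper performs, under the stated contract (the function body and exception type here are assumptions, not TorchIO's actual implementation). The pitfall it guards against: a lone string is itself iterable, so it would silently be treated as a sequence of single characters.

```python
def validate_keys_sequence(keys, name):
    # Reject a bare string explicitly: iterating 't1' yields 't', '1'
    if isinstance(keys, str):
        raise TypeError(f'"{name}" must be a sequence of strings, not a string')
    if not all(isinstance(key, str) for key in keys):
        raise TypeError(f'All elements of "{name}" must be strings')

validate_keys_sequence(['t1', 't2'], 'include')  # accepted
rejected = False
try:
    validate_keys_sequence('t1', 'include')  # bare string: rejected
except TypeError:
    rejected = True
```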

to_hydra_config()

Return a dictionary representation of the transform for Hydra instantiation.
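Hydra instantiates objects from dictionaries keyed by `_target_` (the fully qualified class name). A sketch of the general shape such a dictionary takes; the exact keys TorchIO emits are not shown in this page and may differ:

```python
def to_hydra_config_sketch(obj, **kwargs):
    # hydra.utils.instantiate() expects a dict whose _target_ entry is the
    # fully qualified class name; remaining entries become constructor kwargs
    cls = type(obj)
    return {'_target_': f'{cls.__module__}.{cls.__qualname__}', **kwargs}

class Dummy:
    pass

config = to_hydra_config_sketch(Dummy(), degrees=10)
```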

from_tensor(tensor, reference) staticmethod

Build a TorchIO image from a tensor and a reference image.
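Conceptually, `from_tensor` pairs a raw tensor with the spatial metadata of the reference image, so the result can be visualized or resampled in the reference space. A NumPy-only sketch of that pairing, with the metadata reduced to the affine matrix (the dict return type and helper name are illustrative, not TorchIO's API):

```python
import numpy as np

def from_tensor_sketch(tensor, reference_affine):
    # Attach the reference image's affine to the otherwise metadata-free
    # tensor; a real torchio.Image would also carry orientation, path, etc.
    return {'data': tensor, 'affine': np.array(reference_affine, copy=True)}

embedding = np.zeros((16, 4, 4, 4), dtype=np.float32)  # (C, W, H, D)
reference_affine = np.diag([2.0, 2.0, 2.0, 1.0])  # 2 mm isotropic spacing
image = from_tensor_sketch(embedding, reference_affine)
```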