darts_segmentation
Image segmentation of thaw-slumps for the DARTS dataset.
Classes:

- SMPSegmenter – An actor that keeps a model as its state and segments tiles.
- SMPSegmenterConfig – Configuration for the segmenter.

Functions:

- create_patches – Create patches from a tensor.
- patch_coords – Yield patch coordinates based on height, width, patch size and margin size.
- predict_in_patches – Predict on a tensor.
SMPSegmenter

SMPSegmenter(
    model_checkpoint: pathlib.Path | str,
    device: torch.device = darts_segmentation.segment.DEFAULT_DEVICE,
)

An actor that keeps a model as its state and segments tiles.

Initialize the segmenter.
Parameters:

- model_checkpoint (pathlib.Path | str) – The path to the model checkpoint.
- device (torch.device, default: darts_segmentation.segment.DEFAULT_DEVICE) – The device to run the model on. Defaults to torch.device("cuda") if CUDA is available, else torch.device("cpu").
Methods:

- __call__ – Run inference on a single tile or a list of tiles.
- segment_tile – Run inference on a tile.
- segment_tile_batched – Run inference on a list of tiles.
- tile2tensor – Take a tile and convert it to a pytorch tensor.
- tile2tensor_batched – Take a list of tiles and convert them to a pytorch tensor.
Attributes:

- config (darts_segmentation.segment.SMPSegmenterConfig)
- device (torch.device)
- model (torch.nn.Module)
Source code in darts-segmentation/src/darts_segmentation/segment.py
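A minimal usage sketch for constructing the segmenter. It assumes the class is importable from darts_segmentation.segment (as the source path above suggests) and that a trained checkpoint exists; the checkpoint path below is hypothetical.

```python
import torch

# Import path assumed from the source location noted above.
from darts_segmentation.segment import SMPSegmenter

# "models/thaw_slump.ckpt" is a hypothetical checkpoint path.
segmenter = SMPSegmenter(
    model_checkpoint="models/thaw_slump.ckpt",
    device=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
)
```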
config instance-attribute

config: darts_segmentation.segment.SMPSegmenterConfig = validate_config(ckpt["config"])
device instance-attribute

device: torch.device = device
model instance-attribute

model: torch.nn.Module = segmentation_models_pytorch.create_model(**self.config["model"])
__call__

__call__(
    input: xarray.Dataset | list[xarray.Dataset],
    patch_size: int = 1024,
    overlap: int = 16,
    batch_size: int = 8,
    reflection: int = 0,
) -> xarray.Dataset | list[xarray.Dataset]

Run inference on a single tile or a list of tiles.
Parameters:

- input (xarray.Dataset | list[xarray.Dataset]) – A single tile or a list of tiles.
- patch_size (int, default: 1024) – The size of the patches. Defaults to 1024.
- overlap (int, default: 16) – The size of the overlap. Defaults to 16.
- batch_size (int, default: 8) – The batch size for the prediction, NOT the batch size of the input tiles. The tensor will be sliced into patches, which are then inferred in batches. Defaults to 8.
- reflection (int, default: 0) – Reflection padding applied to the edges of the tensor. Defaults to 0.
Returns:

- xarray.Dataset | list[xarray.Dataset] – A single tile or a list of tiles augmented by a predicted probabilities layer, depending on the input. Each probability has type float32 and range [0, 1].
Raises:

- ValueError – If the input is neither an xr.Dataset nor a list of xr.Dataset.
Source code in darts-segmentation/src/darts_segmentation/segment.py
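A hedged sketch of calling the segmenter on a single preprocessed tile. Here `tile` stands in for a preprocessed, harmonized xarray.Dataset produced upstream and is not constructed here; the "probabilities" variable name follows the return description above.

```python
import xarray as xr

tile: xr.Dataset = ...  # a preprocessed, harmonized tile (not constructed here)

# Single tile in, single tile out; passing a list of tiles would return a list.
result = segmenter(tile, patch_size=1024, overlap=16, batch_size=8, reflection=0)
probabilities = result["probabilities"]  # float32, values in [0, 1]
```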
segment_tile

segment_tile(
    tile: xarray.Dataset,
    patch_size: int = 1024,
    overlap: int = 16,
    batch_size: int = 8,
    reflection: int = 0,
) -> xarray.Dataset

Run inference on a tile.
Parameters:

- tile (xarray.Dataset) – The input tile, containing preprocessed, harmonized data.
- patch_size (int, default: 1024) – The size of the patches. Defaults to 1024.
- overlap (int, default: 16) – The size of the overlap. Defaults to 16.
- batch_size (int, default: 8) – The batch size for the prediction, NOT the batch size of the input tiles. The tensor will be sliced into patches, which are then inferred in batches. Defaults to 8.
- reflection (int, default: 0) – Reflection padding applied to the edges of the tensor. Defaults to 0.
Returns:

- xarray.Dataset – The input tile augmented by a predicted probabilities layer with type float32 and range [0, 1].
Source code in darts-segmentation/src/darts_segmentation/segment.py
segment_tile_batched

segment_tile_batched(
    tiles: list[xarray.Dataset],
    patch_size: int = 1024,
    overlap: int = 16,
    batch_size: int = 8,
    reflection: int = 0,
) -> list[xarray.Dataset]

Run inference on a list of tiles.
Parameters:

- tiles (list[xarray.Dataset]) – The input tiles, containing preprocessed, harmonized data.
- patch_size (int, default: 1024) – The size of the patches. Defaults to 1024.
- overlap (int, default: 16) – The size of the overlap. Defaults to 16.
- batch_size (int, default: 8) – The batch size for the prediction, NOT the batch size of the input tiles. The tensor will be sliced into patches, which are then inferred in batches. Defaults to 8.
- reflection (int, default: 0) – Reflection padding applied to the edges of the tensor. Defaults to 0.
Returns:

- list[xarray.Dataset] – A list of input tiles augmented by a predicted probabilities layer with type float32 and range [0, 1].
Source code in darts-segmentation/src/darts_segmentation/segment.py
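A sketch of the batched variant, assuming `tiles` is a list of preprocessed xarray.Dataset tiles; each returned tile carries its own probabilities layer.

```python
# `tiles` is assumed to be a list of preprocessed, harmonized xr.Dataset tiles.
segmented_tiles = segmenter.segment_tile_batched(
    tiles,
    patch_size=1024,
    overlap=16,
    batch_size=8,
    reflection=0,
)
for segmented in segmented_tiles:
    print(segmented["probabilities"].dtype)  # float32, range [0, 1]
```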
tile2tensor

Take a tile and convert it to a pytorch tensor.

Respects the input combination from the config.

Returns:

- torch.Tensor – A torch tensor for the full tile consisting of the bands specified in self.band_combination.
Source code in darts-segmentation/src/darts_segmentation/segment.py
tile2tensor_batched

Take a list of tiles and convert them to a pytorch tensor.

Respects the input combination from the config.

Returns:

- torch.Tensor – A torch tensor for the full tile consisting of the bands specified in self.band_combination.
Source code in darts-segmentation/src/darts_segmentation/segment.py
SMPSegmenterConfig

Configuration for the segmenter.
create_patches

create_patches(
    tensor_tiles: torch.Tensor,
    patch_size: int,
    overlap: int,
    return_coords: bool = False,
) -> torch.Tensor

Create patches from a tensor.
Parameters:

- tensor_tiles (torch.Tensor) – The input tensor. Shape: (BS, C, H, W).
- patch_size (int) – The size of the patches.
- overlap (int) – The size of the overlap.
- return_coords (bool, default: False) – Whether to return the coordinates of the patches. Can be used for debugging. Defaults to False.
Returns:
Source code in darts-segmentation/src/darts_segmentation/utils.py
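A small sketch of patching a batch of tiles. The import path is assumed from the source location above, and the exact output shape is not documented here, so the comments only state the documented input shape and the intent.

```python
import torch

# Import path assumed from the source location noted above.
from darts_segmentation.utils import create_patches

# Two tiles with 5 channels of 1024 x 1024 pixels each (documented input shape: (BS, C, H, W)).
tensor_tiles = torch.rand(2, 5, 1024, 1024)

# Slice the tiles into overlapping 256-pixel patches.
patches = create_patches(tensor_tiles, patch_size=256, overlap=16)
```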
patch_coords

patch_coords(
    h: int, w: int, patch_size: int, overlap: int
) -> collections.abc.Generator[tuple[int, int, int, int], None, None]

Yield patch coordinates based on height, width, patch size and margin size.
Parameters:

- h (int) – Height of the image.
- w (int) – Width of the image.
- patch_size (int) – Patch size.
- overlap (int) – Margin size.
Yields:

- tuple[int, int, int, int] – The patch coordinates y, x, patch_idx_y and patch_idx_x.
Source code in darts-segmentation/src/darts_segmentation/utils.py
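A sketch of iterating over the generated coordinates for a 1024 x 1024 image; the tuple layout (y, x, patch_idx_y, patch_idx_x) follows the Yields description above, and the import path is assumed from the source location.

```python
# Import path assumed from the source location noted above.
from darts_segmentation.utils import patch_coords

for y, x, patch_idx_y, patch_idx_x in patch_coords(h=1024, w=1024, patch_size=256, overlap=16):
    # Each iteration yields patch coordinates (y, x) plus the patch's grid indices.
    print(f"patch ({patch_idx_y}, {patch_idx_x}) at y={y}, x={x}")
```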
predict_in_patches

predict_in_patches(
    model: torch.nn.Module,
    tensor_tiles: torch.Tensor,
    patch_size: int,
    overlap: int,
    batch_size: int,
    reflection: int,
    device=torch.device,
    return_weights: bool = False,
) -> torch.Tensor

Predict on a tensor.
Parameters:

- model (torch.nn.Module) – The model to use for prediction.
- tensor_tiles (torch.Tensor) – The input tensor. Shape: (BS, C, H, W).
- patch_size (int) – The size of the patches.
- overlap (int) – The size of the overlap.
- batch_size (int) – The batch size for the prediction, NOT the batch size of the input tiles. The tensor will be sliced into patches, which are then inferred in batches.
- reflection (int) – Reflection padding applied to the edges of the tensor.
- device (torch.device, default: torch.device) – The device to use for the prediction.
- return_weights (bool, default: False) – Whether to return the weights. Can be used for debugging. Defaults to False.
Returns:
Source code in darts-segmentation/src/darts_segmentation/utils.py
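A hedged sketch of running the patched prediction directly on raw tensors, using a trivial stand-in model. In the segmenter itself, the real model and tensors produced by tile2tensor_batched would be passed instead; the import path is assumed from the source location above.

```python
import torch

# Import path assumed from the source location noted above.
from darts_segmentation.utils import predict_in_patches

model = torch.nn.Conv2d(5, 1, kernel_size=1)  # trivial stand-in for a real segmentation model
tensor_tiles = torch.rand(2, 5, 1024, 1024)   # documented input shape: (BS, C, H, W)

# Returns a torch.Tensor (see the signature above).
prediction = predict_in_patches(
    model,
    tensor_tiles,
    patch_size=256,
    overlap=16,
    batch_size=8,
    reflection=0,
    device=torch.device("cpu"),
)
```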