SemanticSegmentation

class SemanticSegmentationResults(class_map, duration, image)

The results of semantic segmentation from SemanticSegmentation.

Parameters
  • class_map (ndarray) – The class label with the highest probability for each (x, y) coordinate in the image.

  • duration (float) – The duration of the inference in seconds.

  • image (ndarray) – The image that the inference was performed on.

property duration

The duration of the inference in seconds.

Return type

float

property class_map

The class label with the highest probability for each (x, y) coordinate in the image.

Return type

ndarray

property image

The image the results were processed on.

Return type

ndarray
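
For example, the fields of a results object can be read directly (a minimal sketch; assumes results was returned by SemanticSegmentation.segment_image()):

print('Inference time: {:.3f} s'.format(results.duration))
# class_map holds the winning class for each (x, y) coordinate, so its
# shape matches the height and width of the input image.
print(results.class_map.shape)
print(results.image.shape[:2])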

class SemanticSegmentation(model_id, model_config=None)

Classify every pixel in an image.

The build_legend() method is useful when used with the Streamer.

Typical usage:

semantic_segmentation = edgeiq.SemanticSegmentation('alwaysai/enet')
semantic_segmentation.load(engine=edgeiq.Engine.DNN)

with edgeiq.Streamer() as streamer:
    <get image>
    results = semantic_segmentation.segment_image(image)

    text = ['Inference time: {:1.3f} s'.format(results.duration)]
    text.append('Legend:')
    text.append(semantic_segmentation.build_legend())

    mask = semantic_segmentation.build_image_mask(results.class_map)
    blended = edgeiq.blend_images(image, mask, alpha=0.5)

    streamer.send_data(blended, text)
Parameters

model_id (str) – The ID of the model you want to use for semantic segmentation.

segment_image(image)

Classify every pixel within the specified image.

Parameters

image (ndarray) – The image to analyze.

Return type

SemanticSegmentationResults
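
A minimal single-image sketch (the file name is hypothetical; assumes the model has already been loaded as in the typical usage above):

import cv2

image = cv2.imread('street.jpg')  # hypothetical input file
results = semantic_segmentation.segment_image(image)
print('Segmented {} pixels in {:.3f} s'.format(
    results.class_map.size, results.duration))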

build_image_mask(class_map)

Create an image mask by mapping colors to the class map. Colors can be set with the colors property.

Parameters

class_map (ndarray) – The class label with the highest probability for each (x, y) coordinate in the image.

Return type

ndarray

Returns

Class color visualization for each pixel
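
For example, the mask can be overlaid on the source frame to visualize the segmentation (a sketch; assumes results came from segment_image()):

# The mask has the same height and width as the input image; each pixel is
# filled with the color assigned to its predicted class (see the colors property).
mask = semantic_segmentation.build_image_mask(results.class_map)
overlay = edgeiq.blend_images(results.image, mask, alpha=0.5)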

build_legend()

Create a class legend that associates a color with each class label.

Return type

str

Returns

An HTML table with class labels and colors that can be used with the streamer.

build_object_map(class_map, class_list)

Create an object map by isolating classes within the class map.

Parameters
  • class_map (ndarray) – The class label with the highest probability for each (x, y) coordinate in the image.

  • class_list (List[str]) – The list of labels to include in the object map.

Return type

ndarray

Returns

The specific classes from the class list for each (x, y) coordinate in the original image. Classes not in the specified class list are rendered as non-labeled or background.
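
A sketch of isolating a subset of classes; the label names below are hypothetical and must match entries in the labels property of the loaded model:

# Keep only the (hypothetical) 'Person' and 'Car' classes; every other pixel
# is treated as background in the returned map.
object_map = semantic_segmentation.build_object_map(
    results.class_map, ['Person', 'Car'])
object_mask = semantic_segmentation.build_image_mask(object_map)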

property accelerator

The accelerator being used.

Return type

Optional[Accelerator]

property colors

The auto-generated colors for the loaded model.

Note: Initialized to None when the model doesn’t have any labels.

Note: To update, the new colors list must be the same length as the label list.

Return type

Optional[ndarray]
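
For example, the color of a single class can be changed by editing a copy of the array and assigning it back (a sketch; the 'Road' label and BGR value are assumptions):

colors = semantic_segmentation.colors.copy()
# Replace the color for the hypothetical 'Road' class; the assigned array
# must keep one color per label.
colors[semantic_segmentation.labels.index('Road')] = (0, 255, 0)
semantic_segmentation.colors = colors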

property engine

The engine being used.

Return type

Optional[Engine]

property labels

The labels for the loaded model.

Note: Initialized to None when the model doesn’t have any labels.

Return type

Optional[List[str]]

load(engine=<Engine.DNN: 'DNN'>, accelerator=<Accelerator.DEFAULT: 'DEFAULT'>)

Load the model to an engine and accelerator.

Parameters
  • engine (Engine) – The engine to load the model to.

  • accelerator (Accelerator) – The accelerator to load the model to.
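
A sketch of loading to an explicit engine and accelerator pair (Accelerator.DEFAULT lets the engine pick; other accelerators depend on the installed hardware and runtime):

semantic_segmentation = edgeiq.SemanticSegmentation('alwaysai/enet')
# Load with the DNN engine and the default accelerator selection.
semantic_segmentation.load(
    engine=edgeiq.Engine.DNN, accelerator=edgeiq.Accelerator.DEFAULT)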

property model_config

The configuration of the model that was loaded.

Return type

ModelConfig

property model_id

The ID of the loaded model.

Return type

str

property model_purpose

The purpose of the model being used.

Return type

str