Model Config

class ModelConfig(model_json, base_dir=None, labels=None, colors=None)

The model configuration parameters.

Parameters
  • model_json (dict) – The parsed alwaysai.model.json.

  • base_dir (Optional[str]) – The base directory containing the model files.

  • labels (Optional[List[str]]) – The label list for the model.

  • colors (Optional[ndarray]) – The color list for the model.
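
A minimal construction sketch. The import path for ModelConfig, the file location, and the directory layout below are assumptions for illustration, not part of this reference:

    import json

    from edgeiq import ModelConfig  # assumed import path

    # Parse the model's JSON description and build the config by hand.
    model_dir = "models/alwaysai/mobilenet_ssd"  # hypothetical model directory
    with open(f"{model_dir}/alwaysai.model.json") as f:
        model_json = json.load(f)

    config = ModelConfig(model_json, base_dir=model_dir)
    print(config.id, config.purpose)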

classmethod from_model_id(model_id)

Load the model configuration for the model with the given ID.
Return type

ModelConfig
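
A usage sketch for from_model_id; the model ID below is a placeholder, and having that model installed locally is an assumption:

    from edgeiq import ModelConfig  # assumed import path

    config = ModelConfig.from_model_id("alwaysai/mobilenet_ssd")  # placeholder model ID
    print(config.model_file)       # path to the model weights
    print(config.framework_type)   # the DNN framework the model targets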

property config

The config loaded from the model JSON file

Return type

dict

property model_parameters

The model parameters in the config

Return type

dict

property id

The model ID

Return type

str

property label_file

Path to the label file

Return type

Optional[str]

property colors_file

Path to the colors file

Return type

Optional[str]

property model_file

Path to the model weights file

Return type

str

property config_file

Relative path to the model framework config file

Return type

Optional[str]
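
The file-path properties above can be used to load the label and color data directly. A sketch, assuming the label file holds one label per line, the colors file holds one comma-separated R,G,B triple per line (both file formats are assumptions), and relative paths are resolved against the model's base directory:

    import os

    import numpy as np

    def load_labels_and_colors(config, base_dir=""):
        """Read labels and colors from the files referenced by the config."""
        labels = None
        if config.label_file is not None:
            with open(os.path.join(base_dir, config.label_file)) as f:
                labels = [line.strip() for line in f if line.strip()]

        colors = None
        if config.colors_file is not None:
            # Assumed format: one "R,G,B" triple per line.
            colors = np.loadtxt(os.path.join(base_dir, config.colors_file),
                                delimiter=",", dtype=np.uint8)
        return labels, colors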

property mean

The RGB/BGR mean values for the model

Return type

Tuple[float, float, float]

property scalefactor

The scale factor for the model input

Return type

float

property size

The input image size of the model

Return type

Tuple[int, int]

property purpose

The purpose of the model

Return type

str
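
The purpose string can be used to choose an inference wrapper. A sketch where the purpose values being compared against, and the use of edgeiq.ObjectDetection / edgeiq.Classification for this dispatch, are assumptions about the surrounding API:

    import edgeiq

    def make_engine(config):
        """Pick an inference wrapper based on the model's purpose (strings assumed)."""
        if config.purpose == "ObjectDetection":
            return edgeiq.ObjectDetection(config.id)
        if config.purpose == "Classification":
            return edgeiq.Classification(config.id)
        raise ValueError(f"Unhandled purpose: {config.purpose}")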

property framework_type

The framework type of the model

Return type

str

property crop

Whether to crop the image prior to inference

Return type

bool

property colors_dtype

The data type of the color values

Return type

str

property labels

The labels of the model

Return type

Optional[List[str]]

property colors

The colors for each label of the model.

Each element is a 3-element array of 8-bit integers representing the red, green, and blue values

Return type

Optional[ndarray]
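
A sketch that maps a predicted class index to a label and a drawing color using the two properties above; drawing with OpenCV and passing the color triple straight to cv2 are assumptions:

    import cv2

    def draw_detection(image, class_id, box, config):
        """Draw a labeled box for one detection using the config's labels and colors."""
        label = config.labels[class_id] if config.labels is not None else str(class_id)
        if config.colors is not None:
            color = tuple(int(c) for c in config.colors[class_id])
        else:
            color = (0, 255, 0)  # fallback color when no colors file is provided
        x1, y1, x2, y2 = box
        cv2.rectangle(image, (x1, y1), (x2, y2), color, 2)
        cv2.putText(image, label, (x1, y1 - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, color, 1)
        return image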

property swaprb

Whether to swap the red and blue channels of the image prior to inference

Return type

bool
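
The preprocessing properties (scalefactor, size, mean, swaprb, crop) line up with the arguments of OpenCV's cv2.dnn.blobFromImage. A minimal sketch, assuming the model is run through OpenCV's DNN module and that size is ordered as (width, height); both are assumptions here:

    import cv2

    def preprocess(image, config):
        """Build a DNN input blob from a BGR image using the config's preprocessing values."""
        return cv2.dnn.blobFromImage(
            image,
            scalefactor=config.scalefactor,
            size=config.size,        # assumed (width, height) ordering
            mean=config.mean,
            swapRB=config.swaprb,    # convert BGR -> RGB when the model expects RGB
            crop=config.crop,
        )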

property architecture

The architecture of the model

Return type

Optional[str]

property softmax

Whether to perform a softmax on the output after inference

Return type

bool

property device

The device the model was built for

Return type

Optional[SupportedDevices]

property output_layer_names

The output layer names of the model

Return type

Optional[List[str]]

property hailo_quantize_input

Whether to quantize the input of a Hailo model

Return type

Optional[bool]

property hailo_quantize_output

Whether to quantize the output of a Hailo model

Return type

Optional[bool]

property hailo_input_format

The input format for a Hailo model

Return type

Optional[str]

property hailo_output_format

The output format of a Hailo model

Return type

Optional[str]
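
A sketch that collects the Hailo-specific settings into a single options dictionary before handing them to a Hailo runtime wrapper; the wrapper and the option names used as keys are assumptions, not part of this API:

    def hailo_options(config):
        """Gather the Hailo-related settings, dropping any that are unset."""
        options = {
            "quantize_input": config.hailo_quantize_input,
            "quantize_output": config.hailo_quantize_output,
            "input_format": config.hailo_input_format,
            "output_format": config.hailo_output_format,
        }
        return {key: value for key, value in options.items() if value is not None}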

property dnn_support

Whether the DNN Engine supports the model

Return type

bool

property dnn_cuda_support

Whether the DNN CUDA Engine supports the model

Return type

bool

property tensor_rt_support

Whether the TensorRT Engine supports the model

Return type

bool

property hailo_support

Whether the Hailo RT Engine supports the model

Return type

bool

property qaic_support

Whether the QAIC RT Engine supports the model

Return type

bool

property onnx_rt_support

Whether the ONNX RT Engine supports the model

Return type

bool
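
The support flags can drive a simple engine-selection routine. A sketch where both the preference order and the returned engine names are assumptions, not values defined by this API:

    def pick_engine(config):
        """Return the name of the first supported engine in an assumed preference order."""
        preferences = [
            ("tensor_rt", config.tensor_rt_support),
            ("hailo_rt", config.hailo_support),
            ("qaic_rt", config.qaic_support),
            ("dnn_cuda", config.dnn_cuda_support),
            ("onnx_rt", config.onnx_rt_support),
            ("dnn", config.dnn_support),
        ]
        for name, supported in preferences:
            if supported:
                return name
        raise RuntimeError("No inference engine supports this model")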

property batch_size

The inference batch size of the model

Return type

int
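
A sketch showing how the batch size could be used to group inputs before inference; the list of frames is a placeholder for whatever input source is in use:

    def batches(frames, config):
        """Yield slices of frames sized to the model's inference batch size."""
        size = max(1, config.batch_size)
        for start in range(0, len(frames), size):
            yield frames[start:start + size]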