Convert Model
- convert_model(self, input_model_path: str, output_dir: str, target_framework: str | Framework, target_device_name: str | DeviceName, target_data_type: str | DataType = DataType.FP16, target_software_version: str | SoftwareVersion | None = None, input_layer: InputLayer | None = None, dataset_path: str | None = None, wait_until_done: bool = True, sleep_interval: int = 30) → ConverterMetadata
Convert a model to the specified framework.
- Parameters:
input_model_path (str) – The file path where the model is located.
output_dir (str) – The local folder path to save the converted model.
target_framework (Union[str, Framework]) – The target framework name.
target_device_name (Union[str, DeviceName]) – The name of the target device to convert the model for.
target_data_type (Union[str, DataType]) – Data type of the model. Default is DataType.FP16.
target_software_version (Union[str, SoftwareVersion], optional) – Target software version. Required if target_device_name is one of the Jetson devices.
input_layer (InputLayer, optional) – Target input layer for conversion (e.g., converting a dynamic batch dimension to a static one).
dataset_path (str, optional) – Path to the dataset. Useful for certain conversions.
wait_until_done (bool) – If True, wait for the conversion to finish before returning. If False, submit the conversion request and return immediately. Default is True.
sleep_interval (int) – Interval, in seconds, between status checks while waiting for the conversion to finish. Default is 30.
- Raises:
Exception – If an error occurs during the model conversion.
- Returns:
Metadata of the conversion task.
- Return type:
ConverterMetadata
Example
from netspresso import NetsPresso
from netspresso.enums import DeviceName, Framework, SoftwareVersion

# 1. Log in to NetsPresso
netspresso = NetsPresso(email="YOUR_EMAIL", password="YOUR_PASSWORD")

# 2. Declare the converter
converter = netspresso.converter_v2()

# 3. Convert the ONNX model to TensorRT for a Jetson AGX Orin running JetPack 5.0.1
conversion_task = converter.convert_model(
    input_model_path="./examples/sample_models/test.onnx",
    output_dir="./outputs/converted/TENSORRT_JETSON_AGX_ORIN_JETPACK_5_0_1",
    target_framework=Framework.TENSORRT,
    target_device_name=DeviceName.JETSON_AGX_ORIN,
    target_software_version=SoftwareVersion.JETPACK_5_0_1,
)
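For long-running conversions, the call can also be made non-blocking by passing wait_until_done=False. The sketch below reuses the converter object from the example above; it submits the request and returns immediately with the ConverterMetadata for the submitted task. How to poll the task afterwards is not covered on this page, so the snippet only prints the returned metadata.

# Non-blocking sketch: submit the conversion request and return immediately.
# Reuses `converter` from the example above; polling for completion is omitted here.
conversion_task = converter.convert_model(
    input_model_path="./examples/sample_models/test.onnx",
    output_dir="./outputs/converted/TENSORRT_JETSON_AGX_ORIN_JETPACK_5_0_1",
    target_framework=Framework.TENSORRT,
    target_device_name=DeviceName.JETSON_AGX_ORIN,
    target_software_version=SoftwareVersion.JETPACK_5_0_1,
    wait_until_done=False,  # return as soon as the request is accepted
)
print(conversion_task)  # ConverterMetadata for the submitted task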