Configuration

ORTConfig

class optimum.onnxruntime.ORTConfig

( opset: typing.Optional[int] = None use_external_data_format: bool = False optimization_config: typing.Optional[optimum.onnxruntime.configuration.OptimizationConfig] = None quantization_config: typing.Optional[optimum.onnxruntime.configuration.QuantizationConfig] = None )

Parameters

  • opset (int, optional) — ONNX opset version to export the model with.
  • use_external_data_format (bool, optional, defaults to False) — Whether to allow exporting models larger than 2GB.
  • optimization_config (OptimizationConfig, optional, defaults to None) — Configuration specifying how to optimize the ONNX Runtime model.
  • quantization_config (QuantizationConfig, optional, defaults to None) — Configuration specifying how to quantize the ONNX Runtime model.

ORTConfig is the configuration class that holds all the ONNX Runtime optimization and quantization parameters.