ml-bench 🔬

A tool for benchmarking the inference time of ONNX models using the OpenVINO runtime.

Benchmarks are run using criterion, which performs a forward pass on the model --samples times. Once the benchmark completes, the results are printed to stdout.
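The snippet below is a minimal sketch of how such a criterion benchmark could be wired up; the bench_forward_pass and run_forward_pass names are hypothetical stand-ins for the actual model invocation, and the sample size shown is only an example of how --samples could be mapped onto criterion's sample count.

use criterion::{criterion_group, criterion_main, Criterion};

// Hypothetical stand-in for a single OpenVINO forward pass on the loaded model.
fn run_forward_pass() {
    // ... run inference on the compiled model here ...
}

fn bench_forward_pass(c: &mut Criterion) {
    let mut group = c.benchmark_group("inference");
    // --samples would map to criterion's sample size (100 here is just an example).
    group.sample_size(100);
    group.bench_function("forward_pass", |b| b.iter(run_forward_pass));
    group.finish();
}

criterion_group!(benches, bench_forward_pass);
criterion_main!(benches);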

Usage

./ml-bench <model> [--samples <num_samples>]
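For example, to benchmark a model with 200 samples (the model file name and sample count here are placeholders):

./ml-bench model.onnx --samples 200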

Building

macOS

When cross-compiling from macOS to x86_64-unknown-linux-gnu, create a .cargo/config.toml file with the following content:

[target.x86_64-unknown-linux-gnu]
linker = "x86_64-unknown-linux-gnu-gcc"
ar = "x86_64-unknown-linux-gnu-ar"

[env]
TARGET_CC = "x86_64-unknown-linux-gnu-gcc"
TARGET_CXX = "x86_64-unknown-linux-gnu-g++"
CC = "/opt/homebrew/opt/x86_64-unknown-linux-gnu/bin/x86_64-linux-gnu-gcc"
CXX = "/opt/homebrew/opt/x86_64-unknown-linux-gnu/bin/x86_64-linux-gnu-g++"
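With the cross toolchain installed (the paths above assume a Homebrew-provided x86_64 Linux GNU toolchain) and this config in place, a release build for the Linux target can then be produced with:

rustup target add x86_64-unknown-linux-gnu
cargo build --release --target x86_64-unknown-linux-gnu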
