ONNX Runtime
:fire: ONNX Runtime - the high-performance scoring engine for ML models - for Ruby
Installation
First, install ONNX Runtime (see the ONNX Runtime Installation section below).
Then add this line to your application’s Gemfile:
gem 'onnxruntime'
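And then run:
bundle install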
Getting Started
Load a model and make predictions
model = OnnxRuntime::Model.new("model.onnx")
model.predict({x: [1, 2, 3]})
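predict returns a hash mapping each output name to its value, for example (the output names and values below are illustrative and depend on the model):
{"label" => [1], "probabilities" => [[0.04, 0.96]]}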
Get inputs
model.inputs
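Each input is described by its name, type, and shape, along the lines of (exact values depend on the model):
[{name: "x", type: "tensor(float)", shape: [nil, 3]}]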
Get outputs
model.outputs
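Outputs are described the same way (again, names and shapes depend on the model):
[{name: "label", type: "tensor(int64)", shape: [nil]}]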
Load a model from a string
byte_str = File.binread("model.onnx")
model = OnnxRuntime::Model.new(byte_str)
Get specific outputs
model.predict({x: [1, 2, 3]}, output_names: ["label"])
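The returned hash contains only the requested outputs.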
Inference Session API
You can also use the Inference Session API, which follows the Python API.
session = OnnxRuntime::InferenceSession.new("model.onnx")
session.run(nil, {x: [1, 2, 3]})
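The first argument is the list of output names to fetch (nil fetches all outputs), and run returns an array of values in that order. For example, to fetch a single output (the name "label" is illustrative):
session.run(["label"], {x: [1, 2, 3]})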
ONNX Runtime Installation
ONNX Runtime provides prebuilt libraries for Mac, Linux, and Windows.
Mac
wget https://github.com/microsoft/onnxruntime/releases/download/v0.5.0/onnxruntime-osx-x64-0.5.0.tgz
tar xf onnxruntime-osx-x64-0.5.0.tgz
cd onnxruntime-osx-x64-0.5.0
cp lib/libonnxruntime.0.5.0.dylib /usr/local/lib/libonnxruntime.dylib
Linux
wget https://github.com/microsoft/onnxruntime/releases/download/v0.5.0/onnxruntime-linux-x64-0.5.0.tgz
tar xf onnxruntime-linux-x64-0.5.0.tgz
cd onnxruntime-linux-x64-0.5.0
cp lib/libonnxruntime.0.5.0.so /usr/local/lib/libonnxruntime.so
Windows
Download ONNX Runtime. Unzip and move lib/onnxruntime.dll to C:\Windows\System32\onnxruntime.dll.
History
View the changelog
Contributing
Everyone is encouraged to help improve this project. Here are a few ways you can help:
- Report bugs
- Fix bugs and submit pull requests
- Write, clarify, or fix documentation
- Suggest or add new features