Questions tagged [onnx]
ONNX is an open format to represent deep learning models and enable interoperability between different frameworks.
onnx
907
questions
0
votes
0
answers
7
views
How to convert ONNX with onnx.data to OpenVINO IR format
I am using mo to convert an ONNX model to OpenVINO IR format, but when the model comes as a pair of .onnx and .onnx.data files, it reports an error.
mo --input_model G:\convert_model\onnx-fp16\text_encoder\model.onnx --input_shape [1,77]...
0
votes
0
answers
32
views
Getting console errors when trying to run model in browser using rust, wasm and wonnx
I'm attempting to embed some text as vectors in the browser using rust, webassembly and wonnx.
When I try and run my model in the demo, I get these errors in the console:
wonnx_embeddings_repro_bg.js:...
0
votes
0
answers
24
views
How to convert YOLOX ONNX to YOLOX DLC
Trying to convert a trained YOLOX model from ONNX to DLC. The ONNX model works, but the conversion to DLC doesn't.
YOLOX trained model --> convert .ONNX --> ONNXRUNTIME --> decode --> Visualization ...
0
votes
0
answers
17
views
Failure to load external data file "onnx model" in ONNX Runtime in a DevExpress application
I have put onnx model in my wwwroot/js folder but it is not being detected and I am getting this error.
error = Error: failed to load external data file: js/sevensegment.onnx
at gn (https://cdn....
0
votes
1
answer
31
views
Getting errors in `wgpu` crate when used as a dependency of `wonnx`
When I try and build my project, I get these errors in the wgpu crate, which is a dependency of the wonnx crate that I am using.
❯ $env:RUSTFLAGS="--cfg=web_sys_unstable_apis"; wasm-pack ...
1
vote
0
answers
21
views
OpenCV dnn module inference with batch size greater than 1
I want to perform inference using the dnn module of OpenCV. This is my code:
static cv::Mat inputBlob2;
static std::vector<cv::Mat> outputs2;
cv::dnn::blobFromImages(images, inputBlob2, 1.0 / ...
0
votes
0
answers
24
views
Export a teknium/OpenHermes-2.5-Mistral-7B model to ONNX
I am trying to export teknium/OpenHermes-2.5-Mistral-7B to ONNX.
This is my code:
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
import onnx
model_name = "teknium/...
1
vote
1
answer
20
views
NNModel (ONNX) Assertion Error in Barracuda
I am trying to use an ONNX model for inference in Unity using Barracuda, following this repo. When I try to call model inference, I get the following assertion error.
AssertionException: Assertion ...
-1
votes
0
answers
12
views
OpenVINO does not support the following ONNX operations: SequenceConstruct, ConcatFromSequence
MODEL_YOLOV8_PATH = 'yolov8x_person_face.onnx.prototxt'
WEIGHT_YOLOV8_PATH = 'yolov8x_person_face.onnx'
MODEL_MIVOLO_PATH = 'mivolo.onnx.prototxt'
WEIGHT_MIVOLO_PATH = 'mivolo.onnx'
...
0
votes
0
answers
10
views
Can the InstanceNormalization layer be fused with Convolution?
Can the InstanceNormalization layer be fused with Convolution when
scale = 1 and B = 0
InstanceNormalization
Carries out instance normalization as described in the paper https://...
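The short answer is: not into the weights. Unlike BatchNorm, whose statistics are frozen after training, InstanceNormalization computes its mean and variance from the input at run time, so even with scale = 1 and B = 0 there is no constant transform to fold into the preceding convolution (an engine may still fuse the two ops into one kernel, but that is a scheduling optimization, not a weight rewrite). A minimal numpy sketch of the op's definition makes the input dependence explicit:

```python
import numpy as np

def instance_norm(x, scale, B, eps=1e-5):
    # ONNX InstanceNormalization: per-sample, per-channel statistics
    # computed over the spatial axes of the input itself.
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (scale[None, :, None, None] * (x - mean) / np.sqrt(var + eps)
            + B[None, :, None, None])

x = np.random.randn(2, 3, 8, 8).astype(np.float32)
y = instance_norm(x, np.ones(3, np.float32), np.zeros(3, np.float32))

# Even with scale = 1 and B = 0, mean and var above depend on x at run
# time, so there is nothing constant to fold into the Conv weights
# (BatchNorm folds precisely because its statistics are frozen).
print(np.allclose(y.mean(axis=(2, 3)), 0.0, atol=1e-4))  # → True
```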
1
vote
0
answers
12
views
Convert a quantized model to ONNX
I am new to this and want to try converting models to the ONNX format, and I have the following issue: I have a model that has been quantized to 4-bit, and I then converted this model to ONNX. My quantized model ...
0
votes
0
answers
29
views
How to use ONNX Transformers model with Spring AI
I am trying to use the ONNX export of the model intfloat/multilingual-e5-large:
https://huggingface.co/intfloat/multilingual-e5-large/tree/main/onnx
I have successfully downloaded this model, using it ...
0
votes
0
answers
13
views
How to export a Temporal Fusion Transformer (TFT) model as an ONNX model?
I have a TFT model that is performing a timeseries forecasting using the get_stallion dataset. See the link below for reference:
https://pytorch-forecasting.readthedocs.io/en/stable/tutorials/stallion....
0
votes
0
answers
15
views
Android ONNX Runtime: running multiple models in multiple threads
I am trying to run two different models on an Android device, and I have successfully created two sessions and run both models.
The problem is that they run in a serialized manner, which is ...
-1
votes
0
answers
30
views
I converted a torch model to ONNX with a dynamic input size, but when I run inference with the C++ ONNX Runtime SDK, it raises an error
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Mul node. Name:'/encoders0/encoders0.0/self_attn/...