WinML ORT compile and InferenceSession error when the model path contains any non-ASCII Unicode characters #6173

Open

Label: area-WinML (Topics related to Windows Machine Learning (WinML) in Windows App SDK)
@rnagata0

Description

Describe the bug

When using WinML with ONNX Runtime (ORT), both model compilation and execution fail if the model path includes non-ASCII Unicode characters. Specifically, when a model file path containing such characters is provided, either while compiling the model for an Execution Provider or when loading and running a precompiled model, WinML throws the following exception.

[E:onnxruntime:, inference_session.cc:2545 onnxruntime::InferenceSession::Initialize::<lambda_b590a375cc4159bef6c92b76b4894c14>::operator ()] Exception during initialization: No mapping for the Unicode character exists in the target multi-byte code page.

The issue does not occur with the CPU Execution Provider; inference completes successfully. With the OpenVINO Execution Provider, however, the exception above is thrown. I have not tested other Execution Providers, but they may be affected as well.
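For context, the exception text appears to be the standard Win32 ERROR_NO_UNICODE_TRANSLATION message, which suggests the path is being narrowed from UTF-16 to the system ANSI code page somewhere in the EP path handling. A minimal Python sketch of that narrowing failure (not the WinML code itself; the code page chosen here is an assumption, and any ANSI code page that cannot represent the characters behaves the same way):

```python
# Sketch: narrowing a Unicode path to an ANSI code page fails for
# characters that the code page cannot represent - the same "no mapping"
# condition reported in the exception above.
model_path = "C:\\モデル\\model.onnx"  # directory name in Japanese

def narrow(path: str, codepage: str) -> bytes:
    # Mimics a strict (non-substituting) WideCharToMultiByte call:
    # raise instead of replacing unmappable characters with '?'.
    return path.encode(codepage, errors="strict")

try:
    narrow(model_path, "cp1252")  # assumption: a Western ANSI code page
except UnicodeEncodeError as exc:
    print("no mapping:", exc.reason)
```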

Steps to reproduce the bug

  1. Select the OpenVINO Execution Provider.
  2. Follow the C# compilation and inference instructions in the documentation: Run ONNX models using the ONNX Runtime included in Windows ML

     // Prepare compilation options
     OrtModelCompilationOptions compileOptions = new(sessionOptions);
     compileOptions.SetInputModelPath(modelPath);
     compileOptions.SetOutputModelPath(compiledModelPath);

     // Compile the model
     compileOptions.CompileModel();

     // Create an inference session using the compiled model
     using InferenceSession session = new(compiledModelPath, sessionOptions);

  3. Provide a model path that contains non-ASCII characters (in my case, Japanese characters) when compiling the model.
  4. Use the same model path when running inference with the compiled model.

Expected behavior

The model should compile successfully, and inference should run without errors.

Screenshots

No response

NuGet package version

None

Packaging type

No response

Windows version

No response

IDE

No response

Additional context

Windows App SDK 1.8.251106002
Windows ML Runtime Intel OpenVINO Execution Provider 1.8.26.0
