# NNAPI Execution Provider

Android Neural Networks API (NNAPI) is a unified interface to CPU, GPU, and NN accelerators on Android. It is supported by onnxruntime via DNNLibrary.

## Minimum requirements

The NNAPI EP requires Android devices running Android 8.1 (API level 27) or higher.
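For example, an application can guard NNAPI registration with a runtime API-level check. The sketch below is illustrative only; it assumes the NDK helper `android_get_device_api_level()` from `<android/api-level.h>` is available (NDK r20 and later):

```c++
#include <android/api-level.h>

// Returns true if the device is new enough for the NNAPI EP
// (Android 8.1 corresponds to API level 27).
// On older NDKs without android_get_device_api_level(), the
// ro.build.version.sdk system property can be read instead.
bool CanUseNnapiEp() {
  return android_get_device_api_level() >= 27;
}
```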

## Build NNAPI EP

For build instructions, please see the BUILD page.

## Using NNAPI EP in C/C++

To use the NNAPI EP for inferencing, register it with the inference session as shown below.

```c++
using namespace onnxruntime;
using namespace onnxruntime::logging;

std::string log_id = "Foo";
auto logging_manager = std::make_unique<LoggingManager>(
    std::unique_ptr<ISink>{new CLogSink{}},
    Severity::kWARNING,  // default log severity
    false,
    LoggingManager::InstanceType::Default,
    &log_id);

// Create the environment that owns the logging manager.
std::unique_ptr<Environment> env;
Environment::Create(std::move(logging_manager), env);

// Create a session and register the NNAPI execution provider before loading the model.
SessionOptions so;
InferenceSession session_object{so, *env};
session_object.RegisterExecutionProvider(std::make_unique<::onnxruntime::NnapiExecutionProvider>());
auto status = session_object.Load(model_file_name);
```
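After `Load` succeeds, the session still has to be initialized before inference can run. A minimal continuation of the snippet above, assuming the same internal `InferenceSession` API:

```c++
// Partition the graph across the registered execution providers
// (NNAPI where supported, with the default CPU provider as fallback)
// and allocate the session's execution state.
status = session_object.Initialize();
```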

The C API details are here.
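As an illustration only, registering the NNAPI EP through the public C/C++ API typically looks like the sketch below. The `nnapi_provider_factory.h` header, the `OrtSessionOptionsAppendExecutionProvider_Nnapi` entry point, and its `nnapi_flags` parameter are assumptions based on later ONNX Runtime releases; check the headers shipped with your build for the exact names.

```c++
#include "onnxruntime_cxx_api.h"
#include "nnapi_provider_factory.h"  // assumed header name; verify against your build

// Create a session that prefers NNAPI; nodes NNAPI cannot handle
// fall back to the default CPU execution provider.
Ort::Session CreateNnapiSession(const char* model_path) {
  static Ort::Env ort_env(ORT_LOGGING_LEVEL_WARNING, "Foo");
  Ort::SessionOptions session_options;
  Ort::ThrowOnError(
      OrtSessionOptionsAppendExecutionProvider_Nnapi(session_options, /*nnapi_flags=*/0));
  return Ort::Session(ort_env, model_path, session_options);
}
```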

## Performance

Figure: NNAPI EP on RK3399

Figure: NNAPI EP on OnePlus 6T

Figure: NNAPI EP on Huawei Honor V10