
WASI-NN

How to use

Enable WASI-NN in WAMR by specifying it in the cmake build configuration as follows,

set (WAMR_BUILD_WASI_NN  1)
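The same flag can also be passed on the cmake command line instead of editing a cmake file; the source and build directories below are illustrative (product-mini/platforms/linux is one of WAMR's standard runtime targets):

```shell
# Configure and build a WAMR runtime with WASI-NN enabled
cmake -S product-mini/platforms/linux -B build -DWAMR_BUILD_WASI_NN=1
cmake --build build
```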

The definitions of the functions provided by WASI-NN are in the header file core/iwasm/libraries/wasi-nn/wasi_nn.h.

Simply including this file in your WASM application binds WASI-NN into your module.
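As a rough sketch (not taken from this repository's tests), the typical call sequence follows the wasi-nn proposal: load a graph, create an execution context, set the input tensor, compute, and read the output. The exact type and function signatures are defined in wasi_nn.h and wasi_nn_types.h; the model buffer, tensor shape, and output size below are hypothetical placeholders.

```c
#include <stdint.h>
#include <stdio.h>
#include "wasi_nn.h" /* WASI-NN bindings provided by WAMR */

int main(void)
{
    /* In practice, read a .tflite model file into memory first;
       this empty buffer is a placeholder. */
    static uint8_t model_bytes[1024];
    graph_builder builder = { .buf = model_bytes, .size = sizeof(model_bytes) };
    graph_builder_array builders = { .buf = &builder, .size = 1 };

    /* Load the model for the tensorflowlite encoding on the cpu target. */
    graph g;
    if (load(&builders, tensorflowlite, cpu, &g) != success)
        return 1;

    graph_execution_context ctx;
    if (init_execution_context(g, &ctx) != success)
        return 1;

    /* 1x224x224x3 fp32 input tensor (shape is illustrative). */
    static float input[224 * 224 * 3];
    uint32_t dims[] = { 1, 224, 224, 3 };
    tensor_dimensions t_dims = { .buf = dims, .size = 4 };
    tensor t = { .dimensions = &t_dims, .type = fp32,
                 .data = (uint8_t *)input };
    if (set_input(ctx, 0, &t) != success)
        return 1;

    if (compute(ctx) != success)
        return 1;

    /* Output buffer size is model-dependent; 1000 floats is a guess
       for a typical classifier. */
    static float output[1000];
    uint32_t output_size = sizeof(output);
    if (get_output(ctx, 0, (uint8_t *)output, &output_size) != success)
        return 1;

    printf("inference done, %u output bytes\n", output_size);
    return 0;
}
```

Compiled to WASM (for example with wasi-sdk) and run on a WAMR runtime built with WAMR_BUILD_WASI_NN=1, the host resolves these calls to the TensorFlow Lite backend.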

Tests

To run the tests, the commands below assume the current directory is the root of the repository.

Build the runtime

Build the runtime base image,

docker build -t wasi-nn-base -f core/iwasm/libraries/wasi-nn/test/Dockerfile.base .

Build the runtime image for your execution target type.

EXECUTION_TYPE can be:

  • cpu
  • nvidia-gpu

EXECUTION_TYPE=cpu
docker build -t wasi-nn-${EXECUTION_TYPE} -f core/iwasm/libraries/wasi-nn/test/Dockerfile.${EXECUTION_TYPE} .

Build wasm app

docker build -t wasi-nn-compile -f core/iwasm/libraries/wasi-nn/test/Dockerfile.compile .
docker run -v $PWD/core/iwasm/libraries/wasi-nn:/wasi-nn wasi-nn-compile

Run wasm app

If all the tests have run properly, you will see the following message in the terminal,

Tests: passed!

  • CPU

docker run \
    -v $PWD/core/iwasm/libraries/wasi-nn/test:/assets wasi-nn-cpu \
    --dir=/assets \
    --env="TARGET=cpu" \
    /assets/test_tensorflow.wasm

  • (NVIDIA) GPU

docker run \
    --runtime=nvidia \
    -v $PWD/core/iwasm/libraries/wasi-nn/test:/assets wasi-nn-nvidia-gpu \
    --dir=/assets \
    --env="TARGET=gpu" \
    /assets/test_tensorflow.wasm

Requirements:

What is missing

Supported:

  • Only 1 WASM app at a time.
  • Only 1 model at a time.
    • graph and graph-execution-context are ignored.
  • Graph encoding: tensorflowlite.
  • Execution target: cpu and gpu.
  • Tensor type: fp32.