| author | Marat Dukhan <maratek@google.com> | 2020-02-03 22:05:43 -0800 |
|---|---|---|
| committer | XNNPACK Team <xnnpack-github-robot@google.com> | 2020-02-03 22:06:17 -0800 |
| commit | 7278a95e3cfae6eac73f363c4fda5db53e1b2a87 (patch) | |
| tree | a44d6501b7d190849062164801c756a59b7d6b16 | |
| parent | 8170d3768a33c0d3dc0558197a5d13b57b1fc9b2 (diff) | |
| download | XNNPACK-7278a95e3cfae6eac73f363c4fda5db53e1b2a87.tar.gz | |
Fix description in README
PiperOrigin-RevId: 293076552
-rw-r--r-- | README.md | 2 |

1 file changed, 1 insertion(+), 1 deletion(-)
```diff
@@ -1,6 +1,6 @@
 # XNNPACK
 
-XNNPACK is a highly optimized library of floating-point neural network inference operators for ARM, WebAssembly, and x86 (SSE2 level) platforms. XNNPACK is not intended for direct use by deep learning practitioners and researchers; instead it provides low-level performance primitives for accelerating high-level machine learning frameworks, such as [MediaPipe](https://mediapipe.dev), [TensorFlow Lite](https://www.tensorflow.org/lite), and [TensorFlow.js](https://www.tensorflow.org/js).
+XNNPACK is a highly optimized library of floating-point neural network inference operators for ARM, WebAssembly, and x86 platforms. XNNPACK is not intended for direct use by deep learning practitioners and researchers; instead it provides low-level performance primitives for accelerating high-level machine learning frameworks, such as [MediaPipe](https://mediapipe.dev), [TensorFlow Lite](https://www.tensorflow.org/lite), and [TensorFlow.js](https://www.tensorflow.org/js).
 
 ## Supported Architectures
```