From b1cbcede54c7600c389d2ce8344edc50c7479c6d Mon Sep 17 00:00:00 2001
From: majianjia
Date: Sat, 30 Mar 2019 14:17:25 +0000
Subject: [PATCH 1/2] Update README.md

---
 README.md | 19 +++++++++++--------
 1 file changed, 11 insertions(+), 8 deletions(-)

diff --git a/README.md b/README.md
index e02e047..389b251 100644
--- a/README.md
+++ b/README.md
@@ -26,7 +26,7 @@ Examples:
 
 [MNIST-DenseNet example](examples/mnist-densenet)
 
----
+
 ## Why NNoM?
 
 The aims of NNoM is to provide a light-weight, user-friendly and flexible interface for fast deploying.
@@ -40,11 +40,14 @@ Nowadays, neural networks are **wider**, **deeper**, and **denser**.
 
 >[3] Huang, G., Liu, Z., Van Der Maaten, L., & Weinberger, K. Q. (2017). Densely connected convolutional networks. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 4700-4708).
 
-If you would like to try those more up-to-date, decent and complex structures on MCU
+Since 2014, neural network research has focused more on optimising structures to improve efficiency and performance, which matters even more on small-footprint platforms such as MCUs.
+However, the available NN libraries for MCUs are too low-level, which makes them very difficult to use in these complex structures.
+Therefore, we built NNoM to manage these structures for developers, along with automatic tools for deployment.
+
+Now, with NNoM, you are free to play with these more up-to-date, decent and complex structures on MCUs.
 
-NNoM can help you to build them with only a few lines of C codes, same as you did with Python in [**Keras**](https://keras.io/)
+With [**Keras**](https://keras.io/) and our tools, deploying a model takes only a few lines of code.
 
----
 
 ## Available Operations
 
@@ -66,7 +69,7 @@ NNoM can help you to build them with only a few lines of C codes, same as you di
 
 **Activations**
 
-Activation can be used by itself as layer, or can be attached to the previous layer as ["actail"](docs/A%20Temporary%20Guide%20to%20NNoM.md#addictionlly-activation-apis) to reduce memory cost.
+Activation can be used by itself as a layer, or can be attached to the previous layer as an ["actail"](docs/A_Temporary_Guide_to_NNoM.md#addictionlly-activation-apis) to reduce memory cost.
 
 | Actrivation | Status |Layer API|Activation API|Comments|
 | ------ |-- |--|--|--|
@@ -95,20 +98,20 @@ Activation can be used by itself as layer, or can be attached to the previous la
 | Substraction | Beta|Sub()||
 | Dot | Under Dev. |||
 
----
+
 ## Dependencies
 
 NNoM now use the local pure C backend implementation by default. Thus, there is no special dependency needed.
 
----
+
 ## Optimization
 
 You can select [CMSIS-NN/DSP](https://github.com/ARM-software/CMSIS_5/tree/develop/CMSIS/NN) as the backend for about 5x performance with ARM-Cortex-M4/7/33/35P.
 
 Check [Porting and optimising Guide](docs/Porting_and_Optimisation_Guide.md) for detail.
 
----
+
 ## Contacts
 
 Jianjia Ma
 
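For context on the Optimization hunk above: switching NNoM from the default pure C backend to CMSIS-NN/DSP is a compile-time choice on the MCU side, plus adding the CMSIS-NN/DSP sources and include paths to the build. The sketch below is illustrative only; the macro name and its usual location in the port header are assumptions here, so follow the [Porting and optimising Guide](docs/Porting_and_Optimisation_Guide.md) for the exact switch.

```c
/* Sketch only: backend selection for NNoM (macro name is an assumption, not confirmed).
 * Defined:   supported kernels are dispatched to CMSIS-NN/DSP (the CMSIS-NN/DSP
 *            sources must be part of the firmware build).
 * Undefined: NNoM keeps using its local pure C backend, with no extra dependency.
 * The define normally belongs in the project's NNoM port/config header. */
#define NNOM_USING_CMSIS_NN

#include "nnom.h"   /* NNoM public API, included after the backend configuration */
```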
From b82a30aa1da7a41ac2f7d6473846ce638867f383 Mon Sep 17 00:00:00 2001
From: majianjia
Date: Sat, 30 Mar 2019 14:23:03 +0000
Subject: [PATCH 2/2] Update README.md

---
 README.md | 13 ++++++-------
 1 file changed, 6 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index 389b251..72ae586 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,6 @@
 # Neural Network on Microcontroller (NNoM)
 
+[![Build Status](https://travis-ci.org/majianjia/nnom.svg?branch=master)](https://travis-ci.org/majianjia/nnom)
 NNoM is a higher-level layer-based Neural Network library specifically for microcontrollers.
 
 
@@ -21,11 +22,9 @@ Guides:
 
 Examples:
 
-[RT-Thread-MNIST example (中文)](https://majianjia.github.io/nnom/example_mnist_simple_cn/)
-
 [MNIST-DenseNet example](examples/mnist-densenet)
 
-
+[RT-Thread-MNIST example (中文)](https://majianjia.github.io/nnom/example_mnist_simple_cn/)
 
 ## Why NNoM?
@@ -40,13 +39,13 @@ Nowadays, neural networks are **wider**, **deeper**, and **denser**.
 
 >[3] Huang, G., Liu, Z., Van Der Maaten, L., & Weinberger, K. Q. (2017). Densely connected convolutional networks. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 4700-4708).
 
-Since 2014, neural network research has focused more on optimising structures to improve efficiency and performance, which matters even more on small-footprint platforms such as MCUs.
-However, the available NN libraries for MCUs are too low-level, which makes them very difficult to use in these complex structures.
-Therefore, we built NNoM to manage these structures for developers, along with automatic tools for deployment.
+Since 2014, the development of neural networks has focused more on optimising structures to improve efficiency and performance, which matters even more on small-footprint platforms such as MCUs.
+However, the available NN libraries for MCUs are too low-level, which makes them very difficult to use with these complex structures.
+Therefore, we built NNoM to help developers manage structures, memory and parameters, along with automatic tools for fast deployment.
 
 Now, with NNoM, you are free to play with these more up-to-date, decent and complex structures on MCUs.
 
-With [**Keras**](https://keras.io/) and our tools, deploying a model takes only a few lines of code.
+With [**Keras**](https://keras.io/) and our tools, deploying a model takes only a few lines of code; please check the [examples](examples/).
 
 ## Available Operations
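To make the "few lines of code" claim in the last hunk concrete, here is a minimal MCU-side sketch of the intended flow. It assumes the Keras-side NNoM script has already generated `weights.h` (as in the [MNIST-DenseNet example](examples/mnist-densenet)); the input and output buffer names mentioned in the comments are placeholders rather than the names the generator actually emits.

```c
/* Minimal deployment sketch, not verbatim example code.
 * Assumption: "weights.h" was generated from the trained Keras model by the
 * NNoM tools and provides nnom_model_create(). */
#include "nnom.h"      /* NNoM runtime API */
#include "weights.h"   /* generated model structure and quantised weights */

int main(void)
{
    nnom_model_t *model = nnom_model_create();   /* build and compile the network */

    while (1)
    {
        /* 1. copy a quantised (q7) input sample into the model's input buffer
         *    (the buffer name is defined by the generated header; placeholder here) */
        /* 2. run one forward pass */
        model_run(model);
        /* 3. read the prediction from the model's output buffer */
    }
}
```

On the PC side, the model is built and trained in Keras as usual; the NNoM generator script then quantises it and writes `weights.h`, which is the only generated artefact the firmware needs to include.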