TensorFlow Lite for Microcontrollers is an open-source deep learning framework designed to run machine learning models on microcontrollers with only a few kilobytes of memory. Optimized models can run on many Arm Cortex-M series processors.

TensorFlow Lite for Microcontrollers provides on-device inference on 32-bit microcontrollers and other devices with only a few kilobytes of memory. It is written in C++11 and requires a 32-bit platform. It has been tested extensively with many processors based on the Arm Cortex-M series; the core runtime fits in just 16 KB on an Arm Cortex-M3 and can run many basic models.
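Part of how the runtime stays small is that it performs no dynamic allocation: the application hands the interpreter one fixed buffer (the "tensor arena") and all tensor memory is carved out of it. A minimal bump-allocator sketch of that idea (hypothetical names, not the actual TFLM allocator):

```cpp
#include <cstddef>
#include <cstdint>

// Sketch of a fixed "tensor arena" allocator: the caller supplies one
// static buffer up front, and every request is carved out of it, so the
// runtime never needs malloc on the microcontroller.
class ArenaAllocator {
 public:
  ArenaAllocator(uint8_t* buffer, size_t size)
      : buffer_(buffer), size_(size), used_(0) {}

  // Returns nullptr when the arena is exhausted instead of aborting.
  void* Allocate(size_t bytes, size_t alignment = alignof(std::max_align_t)) {
    size_t aligned = (used_ + alignment - 1) & ~(alignment - 1);
    if (aligned + bytes > size_) return nullptr;
    used_ = aligned + bytes;
    return buffer_ + aligned;
  }

  size_t used() const { return used_; }

 private:
  uint8_t* buffer_;
  size_t size_;
  size_t used_;
};
```

In real TFLM applications the same pattern appears as a statically declared `uint8_t tensor_arena[kArenaSize]` passed to the interpreter; sizing that buffer for the model is part of deployment.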

There are example applications demonstrating the use of microcontrollers for tasks including wake word detection, gesture classification, and image classification. Arm's engineers have worked closely with the TensorFlow team to develop optimized versions of the TensorFlow Lite kernels that use CMSIS-NN to deliver fast performance on Arm Cortex-M cores. As a result, developers get optimal performance without any hand tuning, along with portability across embedded hardware.
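CMSIS-NN accelerates the quantized int8 kernels that these models typically use. As a rough illustration of the arithmetic such a kernel performs, here is a plain C++ sketch (not CMSIS-NN itself; the scale and zero point are assumed per-tensor quantization parameters): accumulate int8 products in a 32-bit register, then requantize back to int8.

```cpp
#include <algorithm>
#include <cstdint>

// Sketch of a quantized dot product, the core operation inside an int8
// fully-connected or convolution kernel: multiply-accumulate in 32 bits,
// then scale the wide accumulator and clamp it back into the int8 range.
int8_t QuantizedDot(const int8_t* a, const int8_t* b, int n,
                    float scale, int32_t zero_point) {
  int32_t acc = 0;
  for (int i = 0; i < n; ++i) {
    acc += static_cast<int32_t>(a[i]) * static_cast<int32_t>(b[i]);
  }
  // Requantize: apply the output scale and zero point, then saturate.
  int32_t out = static_cast<int32_t>(acc * scale) + zero_point;
  return static_cast<int8_t>(
      std::min<int32_t>(127, std::max<int32_t>(-128, out)));
}
```

An optimized CMSIS-NN kernel performs the same computation but packs several 8-bit multiplies into single Cortex-M SIMD instructions, which is where the speedup over a portable loop comes from.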



EVENT PARTNER SESSIONS


To hear more from this Event Partner, check out their Arm DevSummit sessions.

ON-DEMAND TECHNICAL SESSION

AI Solutions in Emerging Markets - Highlighting African Use-Cases

Across the African continent, developers face particular infrastructure challenges (including power and connectivity) as they build applications that address unique local problems. The Google Develop...

Ahmad Bature, Manager, Google Developer's Group
Stephen Ozoigbo, Senior Director, Emerging Economies, Arm
Pete Warden, Technical Lead, TensorFlow Mobile
Robert John, Google Development Expert, Independent
