In the IoT industry, "edge" refers to devices that perform computation locally instead of relying on cloud computing. The latest development, Tiny Edge, brings computation even closer to where data is generated, such as sensor nodes. This shift moves from a centralized, cloud-based solution to a distributed network of edge nodes that collect, process, and run inference on data locally. By 2027, over 3 billion devices are expected to be sold with TinyML, a subset of AI focused on deploying machine learning models on Tiny Edge devices. This growth is driven by societal trends such as the need for speed, privacy, and connectivity. Additionally, the transition from wired to wireless technology is further accelerating the adoption of Tiny Edge devices.
Silicon Labs' Wireless SoCs support a range of ML applications, such as sensor signal processing for predictive and preventative maintenance, bio-signal analysis for healthcare, and cold chain monitoring. They also enable audio pattern matching for security applications, voice commands for smart device control, and low-resolution vision for tasks like people counting and presence detection. The SoCs offer various RAM sizes to accommodate different application requirements. Machine learning models are applied to data from sensors such as microphones, cameras, and those measuring time-series data like acceleration and temperature. These models include audio pattern matching, wake word/command word detection, fingerprint reading, always-on vision, and image/object classification and detection. The detected events can then be further processed according to the application's requirements.
Silicon Labs can accelerate the development of AI/ML devices, starting by outlining each step in the process and helping you along each stage of your project. We are here to simplify your development journey and help you get your devices to market faster and more efficiently.
We have outlined below three key stages of the AI/ML Developer Journey, along with what is required to successfully complete each stage.
Silicon Labs offers several development and explorer kits, from ultra-low-cost, small form factor boards to compact, feature-packed platforms designed for robust networks. We have several exciting demos, including wake-word detection, Pac-Man, and gesture control. These feature-rich kits support multiple protocols and come in different memory configurations with a wide variety of sensors and peripherals for quick debugging and rapid prototyping. The demos are hardware agnostic, so based on the demos you are interested in, please select the kit below that best fits your needs.
While you wait for your Development Kit, we recommend setting up your user accounts.
Silicon Labs Account: This account will offer you access to our developer community, Getting Started guides, private GitHub repositories and our Simplicity Studio development environment. You can create your account or verify access to your account here.
We know you have many options when it comes to choosing your development environment, but we believe Simplicity Studio is the right choice for developing your device with Bluetooth. Here’s why:
Need help setting up your environment? Our Getting Started Guide will have you up and running in no time.
Download the Full Online Installer Version of Simplicity Studio v5:
Here is a list of additional ideas that can be realized with minimal coding by modifying the referenced example applications as suggested below. These use cases are not provided as ready-to-go demos; instead, they provide a perfect context for further evaluation.
Detects the spoken keywords "on" and "off" to turn an LED on the board on and off.
Suggested Kits:
Get up and running quickly with a pre-built application in 10 minutes.
Learn to create an ML application from a trained model in 30 minutes.
Because starting application development from scratch is difficult, our Simplicity SDK comes with a number of built-in demos and examples covering the most frequent use cases.
Play the popular Pac-Man game using keywords spoken aloud: Go, Left, Right, Up, Down, Stop. The application uses keyword detection, and the board can be controlled using Simplicity Studio. The demo is also available as part of Simplicity Studio.
Suggested Kit:
This application uses TensorFlow Lite for Microcontrollers to classify audio data recorded from the microphone in a Micrium OS kernel task. The classification is used to control an LED on the board. The demo is also available as part of Simplicity Studio.
Suggested Kit:
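As a rough illustration of the classify-then-act flow described above, here is a minimal Python sketch of how raw per-inference classifier scores might be smoothed before toggling an LED. The `DetectionSmoother` class and its parameters are hypothetical illustrations, not code from the demo itself, which runs in C/C++ on the device.

```python
from collections import deque

class DetectionSmoother:
    """Hypothetical post-processing for an on-device audio classifier:
    smooth raw per-inference scores so a single noisy spike does not
    trigger the LED."""

    def __init__(self, window=5, threshold=0.8):
        self.scores = deque(maxlen=window)
        self.threshold = threshold

    def update(self, score):
        """Add one classifier score (0..1); return True when the
        moving-average score crosses the detection threshold."""
        self.scores.append(score)
        avg = sum(self.scores) / len(self.scores)
        return avg >= self.threshold

smoother = DetectionSmoother(window=3, threshold=0.8)
# One score per inference pass over a new audio frame.
results = [smoother.update(s) for s in [0.2, 0.9, 0.95, 0.9]]
```

In the real demo this decision would drive a GPIO; here the boolean result stands in for the LED state.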
This application demonstrates a model trained to recognize various hand gestures with an accelerometer. The detected gestures are printed to the serial port. The demo is also available as part of Simplicity Studio.
Suggested Kit:
This application demonstrates a model trained to replicate a sine function. The model is continuously fed with values ranging from 0 to 2pi, and the output of the model is used to control the intensity of an LED. The demo is also available as part of Simplicity Studio.
Suggested Kit:
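To illustrate the idea behind the sine demo, here is a minimal Python sketch in which `math.sin` stands in for the trained model's prediction and the output is mapped to an 8-bit PWM duty value. The helper names are illustrative, not taken from the demo source.

```python
import math

def led_duty(x):
    """Map a model output in [-1, 1] to an 8-bit PWM duty cycle.
    math.sin stands in for the trained model's inference call."""
    y = math.sin(x)  # on-device, this would be the model's prediction
    return round((y + 1.0) / 2.0 * 255)

# Sweep the input range 0..2*pi, as the demo does continuously.
duties = [led_duty(2 * math.pi * i / 8) for i in range(8)]
```

On the board, each duty value would be written to the LED's PWM peripheral, producing a smooth breathing effect.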
Already have your .tflite file ready to go? Skip to the next step: "Test and Validate".
Train your model and prepare it for conversion into a deployable format.
Begin by designing and training your AI/ML model. This involves gathering and preprocessing data, selecting an appropriate model, and setting up training parameters.
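As one concrete example of the preprocessing step, the sketch below slices a 1-D sensor stream into fixed-size training windows and removes each window's mean, a common preparation for time-series models. The helper names are hypothetical, plain-Python illustrations.

```python
def make_windows(samples, size, stride):
    """Slice a 1-D sensor stream into fixed-size training windows."""
    return [samples[i:i + size]
            for i in range(0, len(samples) - size + 1, stride)]

def normalize(window):
    """Shift one window to zero mean, which helps small models train."""
    mean = sum(window) / len(window)
    return [s - mean for s in window]

# e.g. six accelerometer samples, windowed with 50% overlap
stream = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
windows = [normalize(w) for w in make_windows(stream, size=4, stride=2)]
```

Real pipelines typically add scaling, filtering, or feature extraction on top of this, but the windowing pattern is the same.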
To help you build your model from scratch, we provide a Python package with command-line utilities and scripts that assist with model development.
Refer to the TensorFlow documentation for support building the machine learning model, and to the LiteRT documentation for support converting the model to .tflite.
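Assuming a Keras model trained with TensorFlow, conversion to a .tflite flatbuffer typically looks like the following sketch. The tiny single-layer model here exists only so the snippet is self-contained; in practice you would load your own trained model.

```python
import tensorflow as tf

# Stand-in model so the snippet runs on its own;
# replace with your trained model in a real workflow.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(1),
])

# Convert to a .tflite flatbuffer suitable for an embedded target.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enable size/latency optimizations
tflite_bytes = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_bytes)
```

For full integer quantization, which most microcontroller deployments need, the converter additionally takes a representative dataset; see the LiteRT documentation for the details.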
We've partnered with top AI platforms to help you design and build models with minimal coding. These platforms provide user-friendly GUI and automated workflows to simplify the process.
Evaluate your model's performance on the embedded target and validate that it meets the required performance metrics.
The MLTK model profiler provides information about how efficiently a model may run on an embedded target. It can execute a .tflite model file in a simulator or on a physical embedded target.
Note: This tool is optional and not yet officially supported by Silicon Labs.
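The sketch below is not the MLTK profiler itself; it only illustrates the kind of estimate profiling produces, deriving a rough latency figure from a hypothetical multiply-accumulate (MAC) count and clock rate. The numbers and helper name are illustrative assumptions.

```python
def estimate_latency_ms(macs, clock_hz, macs_per_cycle=1):
    """Back-of-the-envelope latency estimate: cycles needed / clock rate.
    A real profiler measures this on the target; this formula only
    illustrates why MAC count and clock speed dominate inference time."""
    cycles = macs / macs_per_cycle
    return cycles / clock_hz * 1000.0

# e.g. a 2.5M-MAC model on a 78 MHz Cortex-M class core (illustrative)
ms = estimate_latency_ms(2_500_000, 78_000_000)
```

Actual latency also depends on memory access patterns, operator kernels, and any hardware acceleration, which is exactly why measuring on the target matters.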
Integrate and deploy your validated model onto the embedded device.
Pre-built, ready-to-deploy AI/ML solutions on Silicon Labs SoCs that simplify the development process and accelerate time-to-market.
Silicon Labs has pre-screened and certified the following third-party AI/ML design service companies to help you design and develop your customized AI/ML solution.