This guide will teach you how to use the Machine Learning Plugin to capture and transmit your data to our ML partner platforms where you can develop machine learning solutions that can be deployed in your embedded application. The ML Plugin works within MPLAB® Data Visualizer so that you can capture a live data stream from your target device, select the desired data region, and then upload it to the platform of your choice or log the data to a file for later use.
Collecting and curating data is one of the most critical steps in developing an ML solution. It is generally the most time-consuming activity in the design cycle, and the performance of the resultant model is highly dependent on the quality of the data. This is why it is crucial to have effective tools and processes for gathering data. The goal of the ML Plugin is to simplify the data collection process and to enable the rapid development of embedded ML solutions.
Microchip's ML Design Partners
Microchip has partnered with experts in embedded ML to empower our developers with the latest tools and technologies in the field. We work with multiple partners because each has its own unique strengths and areas of expertise.
With Edge Impulse and SensiML, developers can create solutions for a wide array of use-cases utilizing various sensor types. With Motion Gestures, developers can easily create new recognition models for custom, user-defined gestures. Our partners provide the necessary tools within their development platform to validate and optimize model performance before deployment.
Edge Impulse
- Solution Type: Classification, Regression, and Anomaly Detection
- Dev Platform: Edge Impulse Studio
- Output Format: C++ Inferencing SDK (Open Source C/C++)
Upload your data to the Edge Impulse Studio and start creating the next generation of intelligent devices with embedded machine learning. Edge Impulse is an open-source TensorFlow Lite-based framework for classification, regression, and anomaly detection. To learn more, visit Edge Impulse's website.
Motion Gestures
- Solution Type: Complex 2D Gesture Recognition
- Dev Platform: Motion Gestures SDK
- Output Format: Gesture Recognition Engine (Static C Library)
Define new touch gestures on your target hardware and upload them to the Motion Gestures SDK to train a gesture recognition model that can be deployed back to your embedded application. Motion Gestures offers an advanced, out-of-the-box solution for complex gesture recognition that enables rapid development of high-accuracy gesture recognition systems. To learn more, visit Motion Gestures' website.
SensiML
- Solution Type: Classification and Anomaly Detection
- Dev Platform: SensiML Analytics Toolkit
- Output Format: Knowledge Pack (Static C Library or Source Code)
Log your data to a file for import into SensiML's Analytics Toolkit to get started developing classification and anomaly detection solutions that can be deployed in your embedded application. The Analytics Toolkit is great for beginners and experts alike, as it provides AutoML tools as well as fully customizable ML pipelines. To learn more, visit SensiML's website.
Partner solutions are suitable for deployment on Microchip Arm® Cortex®-based 32-bit microcontrollers and microprocessors.
This guide covers using the SAMD21 ML Kit for data collection as an example; however, any time-series data can be used for building ML solutions with Edge Impulse and SensiML.
The ML Plugin and MPLAB® Data Visualizer can be installed as plugins to MPLAB® X via the Plugins Manager. Alternatively, the ML Plugin can be installed as a plugin to the standalone version of MPLAB® Data Visualizer.
- SAMD21 ML Kit Data Logger Firmware - 6-axis IMU data for vibration, rotation, and motion-based solutions
To use the ML Plugin, MPLAB® Data Visualizer must first be configured to receive data from the desired target device. This involves the configuration of the serial connection as well as the variable streamer, which will parse variables from the serial stream. Once data sources are plotted in the Time pane, click the Mark button to tag the visible data for use in the ML Plugin. The Mark button places the cursors (A & B) at the bounds of the visible graph so that the viewable data can then be used within the ML Plugin.
To select a new region of data, scroll and zoom as needed in the Time pane and then click Mark when the desired segment is precisely within the viewable window. Alternatively, the bounds of the time window can be set manually in the Time Axis menu on the right before pressing the Mark button.
Capturing sensor data with MPLAB® Data Visualizer
Program the Kit with Data Logger Firmware
Use MPLAB® X IDE to program the SAMD21 ML Kit with the provided example project. Be sure to select the correct project configuration in MPLAB® X before programming the device. There are two configurations to support both versions of the SAMD21 ML Kit.
- IMU2 (Bosch): SAMD21_IOT_WG_BMI160
- IMU14 (TDK): SAMD21_IOT_WG_ICM42688
The general application settings for the SAMD21 Data Logger, such as the sensor sampling rate and the data logging format, can be found in app_config.h. This is also where the individual axes of the IMU can be enabled or disabled. This guide will use only the three axes from the accelerometer; however, the logged axes can be reconfigured as needed based on the application.
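For illustration, such a configuration header might look like the following. The macro names and values here are hypothetical; check the actual app_config.h in the firmware repository for the real option names.

```c
/* app_config.h -- illustrative sketch only; macro names are hypothetical */

/* Sensor sampling rate in Hz */
#define SNSR_SAMPLE_RATE        100

/* Enable/disable individual IMU axes (1 = log, 0 = skip) */
#define SNSR_USE_ACCEL_X        1
#define SNSR_USE_ACCEL_Y        1
#define SNSR_USE_ACCEL_Z        1
#define SNSR_USE_GYRO_X         0
#define SNSR_USE_GYRO_Y         0
#define SNSR_USE_GYRO_Z         0

/* Data logging format: ASCII CSV stream */
#define DATA_LOGGER_FORMAT_ASCII 1
```

Disabling unused axes reduces serial bandwidth and keeps the logged data focused on the signals relevant to the application.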
Once the kit is programmed with the desired configuration, you are ready to move on to collecting the serial data stream with MPLAB® Data Visualizer.
Configure MPLAB® Data Visualizer
Leave the board connected to the computer and open MPLAB® Data Visualizer. Load the Data Visualizer workspace file 3dof-imu-acc.dvws found in the example firmware repository. This workspace already contains the variable streamer required to parse the IMU data, and it will plot each variable once the serial port is configured.
After loading the Data Visualizer workspace file, select the Serial/CDC Connection that corresponds to the SAMD21 ML Kit. Adjust the baud rate to 115200 and click Apply. The DGI connection can also be disabled since we will not use any debug data.
Use the play button on the Serial/CDC Connection to start data collection from the kit. Once data is streaming, it is available for use with the variable streamer.
Now select the same Serial/CDC Connection as the input data source for the IMU variable streamer, so that the data axes can be parsed from the stream.
The IMU data should now be available in the time plot. Double-click anywhere within the time plot to start/stop scrolling of the time axis.
Select Data Region and Mark Time Window
Once the desired data sources are plotted in the graph, select a region of interest in the data by focusing the Time Plot on that region. You can drag the plots in the time window to the desired region of data while scrolling to zoom in or out as needed. When you are satisfied with the data viewable in the Time Plot, click the Mark button to tag this region of data for use in the ML plugin.
Pressing Mark places the cursors at the bounds of the visible window. To select a new region of data, reposition the desired data within the Time Plot and then press Mark again.
After marking the Time pane, the data is ready to be used within the ML Plugin. This general process of configuring, plotting, and marking the serial data stream can be followed for any type of time-series data that is available in Data Visualizer.
Data can also be logged directly to a file by clicking Snapshot in the Time Axis menu of Data Visualizer. This will automatically mark the visible data and allow for saving in .csv or .json format.
Using the ML Plugin
When using the ML Plugin with Edge Impulse, you can upload new data to the Data Acquisition tab in the Edge Impulse Studio. This is used to collect the training and testing data that will be used to develop the model. Once you have trained a model within the Edge Impulse Studio, the ML Plugin can be used to test model performance by uploading new data segments to the testing endpoint and then viewing the classification results within the ML Plugin.
For a detailed guide on the Edge Impulse functionality, see "Using the ML Plugin with Edge Impulse".
When using the ML Plugin with Motion Gestures, you can upload new user-defined 2D gestures to the Motion Gestures SDK. This is used to add new gestures to the Gesture Library within your account and also to upload new gestures for testing model performance.
For a detailed guide on the Motion Gestures functionality, see "Using the ML Plugin with Motion Gestures".
When using the ML Plugin with SensiML, you can save data to a CSV file that is formatted for import into SensiML's Data Capture Lab. The ML Plugin also provides an option to generate a .dcli file (a JSON-based SensiML metadata file) that contains additional metadata about your sample.
For a detailed guide on the SensiML functionality, see "Using the ML Plugin with SensiML".
You should now understand the general process of collecting live sensor data from your target device with MPLAB® Data Visualizer and sending it to one of our ML partner platforms with the Machine Learning Plugin. With data collection handled, you can get to work developing edge-optimized ML solutions that can be deployed in your embedded application.