Motion Classification and Anomaly Detection
Overview
The XIAO ESP32S3 Sense, with its built-in camera and mic, is a versatile device. But what if you need to add another type of sensor, such as an IMU? No problem! One of the standout features of the XIAO ESP32S3 is its multiple pins that can be used as an I2C bus (SDA/SCL pins), making it a suitable platform for sensor integration.
Installing the IMU
When selecting your IMU, the market offers a wide range of devices, each with unique features and capabilities. You could choose, for example, the ADXL362 (3-axis), MAX21100 (6-axis), MPU6050 (6-axis), LIS3DHTR (3-axis), or the LCM20600 (6-axis), which is part of the Seeed Grove IMU 9DOF (lcm20600+AK09918). This variety allows you to tailor your choice to your project’s specific needs.
For this project, we will use an IMU, the MPU6050 (or 6500), a low-cost (less than 2.00 USD) 6-axis Accelerometer/Gyroscope unit.
At the end of the lab, we will also comment on using the LCM20600.
The MPU-6500 is a 6-axis Motion Tracking device that combines a 3-axis gyroscope, a 3-axis accelerometer, and a Digital Motion Processor™ (DMP) in a small 3 × 3 × 0.9 mm package. It also features a 4096-byte FIFO that can lower the traffic on the serial bus interface and reduce power consumption by allowing the system processor to burst-read sensor data and then go into a low-power mode.
With its dedicated I2C sensor bus, the MPU-6500 directly accepts inputs from external I2C devices. MPU-6500, with its 6-axis integration, on-chip DMP, and run-time calibration firmware, enables manufacturers to eliminate the costly and complex selection, qualification, and system-level integration of discrete devices, guaranteeing optimal motion performance for consumers. MPU-6500 is also designed to interface with multiple non-inertial digital sensors, such as pressure sensors, on its auxiliary I2C port.
Usually, the libraries available are for MPU6050, but they work for both devices.
Connecting the HW
Connect the IMU to the XIAO according to the diagram below:
- MPU6050 SCL –> XIAO D5
- MPU6050 SDA –> XIAO D4
- MPU6050 VCC –> XIAO 3.3V
- MPU6050 GND –> XIAO GND
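Before installing the library, it may be worth confirming that the XIAO can actually see the IMU on the I2C bus. The short sketch below is not part of the original lab, just a common sanity check: it scans the bus and prints the address of any device that answers. The MPU6050 should appear at 0x68 (or 0x69 if its AD0 pin is pulled high):
#include "Wire.h"

void setup() {
  Serial.begin(115200);
  Wire.begin();  // XIAO ESP32S3 default I2C pins (D4 = SDA, D5 = SCL)
  delay(1000);
  Serial.println("Scanning I2C bus...");
  for (byte addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) {   // a device ACKed at this address
      Serial.print("Device found at 0x");
      Serial.println(addr, HEX);         // expect 0x68 for the MPU6050
    }
  }
}

void loop() {}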
Install the Library
Go to Arduino Library Manager and type MPU6050. Install the latest version.
Download the sketch MPU6050_Acc_Data_Acquisition.ino:
/*
* Based on I2C device class (I2Cdev) Arduino sketch for MPU6050 class
by Jeff Rowberg <jeff@rowberg.net>
* and Edge Impulse Data Forwarder Example (Arduino)
- https://docs.edgeimpulse.com/docs/cli-data-forwarder
*
* Developed by M.Rovai @11May23
*/
#include "I2Cdev.h"
#include "MPU6050.h"
#include "Wire.h"
#define FREQUENCY_HZ 50
#define INTERVAL_MS (1000 / (FREQUENCY_HZ + 1))
#define ACC_RANGE 1 // 0: -/+2G; 1: +/-4G
// conversion factor from g to m/s^2 ==> [-32768, +32767] ==> [-2g, +2g]
#define CONVERT_G_TO_MS2 (9.81/(16384.0/(1.+ACC_RANGE)))
static unsigned long last_interval_ms = 0;
MPU6050 imu;
int16_t ax, ay, az;
void setup() {
Serial.begin(115200);
// initialize device
Serial.println("Initializing I2C devices...");
Wire.begin();
imu.initialize();
delay(10);
// // verify connection
// if (imu.testConnection()) {
// Serial.println("IMU connected");
// }
// else {
// Serial.println("IMU Error");
// }
delay(300);
// Set MPU6050 offset calibration values
imu.setXAccelOffset(-4732);
imu.setYAccelOffset(4703);
imu.setZAccelOffset(8867);
imu.setXGyroOffset(61);
imu.setYGyroOffset(-73);
imu.setZGyroOffset(35);
/* Set full-scale accelerometer range.
* 0 = +/- 2g
* 1 = +/- 4g
* 2 = +/- 8g
* 3 = +/- 16g
*/
imu.setFullScaleAccelRange(ACC_RANGE);
}
void loop() {
if (millis() > last_interval_ms + INTERVAL_MS) {
last_interval_ms = millis();
// read raw accel/gyro measurements from device
imu.getAcceleration(&ax, &ay, &az);
// converting to m/s^2
float ax_m_s2 = ax * CONVERT_G_TO_MS2;
float ay_m_s2 = ay * CONVERT_G_TO_MS2;
float az_m_s2 = az * CONVERT_G_TO_MS2;
Serial.print(ax_m_s2);
Serial.print("\t");
Serial.print(ay_m_s2);
Serial.print("\t");
Serial.println(az_m_s2);
}
}
Some comments about the code:
Note that the accelerometer and gyroscope output raw 16-bit values in the range [-32768, +32767]. So, for example, if the default accelerometer range (±2 g) is used, that span maps to [-2 g, +2 g], and “1 g” corresponds to 16384.
For conversion to m/s², you can then define, for example:
#define CONVERT_G_TO_MS2 (9.81/16384.0)
In the code, I left an option (ACC_RANGE) that can be set to 0 (±2 g) or 1 (±4 g). We will use ±4 g, which should be enough for this application.
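As a quick sanity check of that conversion (a minimal sketch, assuming ACC_RANGE is set to 1, i.e., ±4 g), note that 1 g then corresponds to 16384/2 = 8192 raw counts, so the factor becomes 9.81/8192:
// Sanity check of the conversion factor (assumes ACC_RANGE = 1, i.e., +/-4G).
// With 16-bit output, +/-4g maps to [-32768, +32767], so 1g = 16384/2 = 8192 counts.
#define ACC_RANGE 1
#define CONVERT_G_TO_MS2 (9.81 / (16384.0 / (1. + ACC_RANGE)))

void setup() {
  Serial.begin(115200);
  int16_t raw_one_g = 8192;                      // raw reading equivalent to 1 g at +/-4G
  Serial.println(raw_one_g * CONVERT_G_TO_MS2);  // prints ~9.81 (m/s^2)
}

void loop() {}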
We will capture the accelerometer data at a frequency of 50 Hz, and the acceleration data will be sent to the Serial Port in meters per second squared (m/s²).
When you run the code with the IMU resting on your table, the accelerometer data shown on the Serial Monitor should be around 0.00, 0.00, and 9.81. If the values differ significantly, you should calibrate the IMU.
The MPU6050 can be calibrated using the sketch: mcu6050-calibration.ino.
Run the code. The following will be displayed on the Serial Monitor:
Send any character (in the above example, “x”), and the calibration should start.
Note that a message MPU6050 connection failed may appear. Ignore it; for some reason, imu.testConnection() is not returning a correct result.
In the end, you will receive the offset values to be used on all your sketches:
Take the values and use them on the setup:
// Set MPU6050 offset calibration values
imu.setXAccelOffset(-4732);
imu.setYAccelOffset(4703);
imu.setZAccelOffset(8867);
imu.setXGyroOffset(61);
imu.setYGyroOffset(-73);
imu.setZGyroOffset(35);
Now, run the sketch MPU6050_Acc_Data_Acquisition.ino:
Once you run the above sketch, open the Serial Monitor:
Or check the Plotter:
Move your device in the three axes. You should see the variation on Plotter:
The TinyML Motion Classification Project
For our lab, we will simulate mechanical stresses in transport. Our problem will be to classify four classes of movement:
- Maritime (pallets in boats)
- Terrestrial (pallets in a truck or train)
- Lift (pallets being handled by a forklift)
- Idle (pallets in storage houses)
So, to start, we should collect data. Accelerometers mounted on the pallets (or containers) will provide the data.
From the above images, we can see that primarily horizontal movements should be associated with the Terrestrial class, vertical movements with the Lift class, no activity with the Idle class, and movement on all three axes with the Maritime class.
Connecting the device to Edge Impulse
For data collection, we should first connect our device to the Edge Impulse Studio, which will also be used for data pre-processing, model training, testing, and deployment.
Follow the instructions here to install Node.js and the Edge Impulse CLI on your computer.
Since the XIAO ESP32S3 is not a fully supported development board by Edge Impulse, we will use the CLI Data Forwarder to capture data from our sensor and send it to the Studio, as shown in this diagram:
Alternatively, you can capture your data “offline,” storing it on an SD card or sending it to your computer via Bluetooth or Wi-Fi. In this video, you can learn about alternative ways to send data to the Edge Impulse Studio.
Connect your device to the serial port and run the previous code to capture IMU (Accelerometer) data, “printing them” on the serial. This will allow the Edge Impulse Studio to “capture” them.
Go to the Edge Impulse page and create a project.
The maximum length for an Arduino library name is 63 characters. Note that the Studio will name the final library using your project name and appending “_inferencing” to it. The name I initially chose did not work when I tried to deploy the Arduino library because it resulted in 64 characters. So, I needed to change it by taking out the “anomaly detection” part.
Start the CLI Data Forwarder on your terminal, entering (if it is the first time) the following command:
edge-impulse-data-forwarder --clean
Next, enter your EI credentials and choose your project, variables, and device names:
Go to your EI Project and verify if the device is connected (the dot should be green):
Data Collection
As discussed before, we should capture data from all four Transportation Classes. Imagine that you have a container with a built-in accelerometer:
Now imagine your container is on a boat, facing an angry ocean, on a truck, etc.:
- Maritime (pallets in boats)
- Move the XIAO in all directions, simulating an undulatory boat movement.
- Terrestrial (pallets in a truck or train)
- Move the XIAO over a horizontal line.
- Lift (pallets being handled by a forklift)
- Move the XIAO over a vertical line.
- Idle (pallets in storage houses)
- Leave the XIAO over the table.
Below is one sample (raw data) of 10 seconds:
You can capture, for example, 2 minutes (twelve samples of 10 seconds each) for each of the four classes. Using the “3 dots” after each one of the samples, select 2 of them, moving them to the Test set (or use the automatic Train/Test Split tool on the Danger Zone of the Dashboard tab). Below, you can see the resulting datasets:
Data Pre-Processing
The raw data captured by the accelerometer is a “time series” and should be converted to “tabular data”. We can do this conversion using a sliding window over the sample data. For example, in the figure below, we can see 10 seconds of accelerometer data captured at a sample rate (SR) of 50 Hz. A 2-second window captures 300 data points (3 axes × 2 seconds × 50 samples). We slide this window every 200 ms, creating a larger dataset where each instance has 300 raw features.
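To make that arithmetic concrete (a small sketch using the values assumed in this lab: 50 Hz, a 2-second window, a 200 ms stride, 3 axes, and 10-second samples), the snippet below computes the number of raw features per window and how many windows one sample yields:
// Sliding-window arithmetic for this lab's capture settings.
const int sample_rate_hz = 50;   // accelerometer sample rate
const int window_s = 2;          // window length in seconds
const int stride_ms = 200;       // how far the window slides each step
const int axes = 3;              // x, y, z
const int sample_s = 10;         // length of one captured sample

void setup() {
  Serial.begin(115200);
  int raw_features = axes * window_s * sample_rate_hz;           // 3 * 2 * 50 = 300
  int windows = ((sample_s - window_s) * 1000) / stride_ms + 1;  // 8000 / 200 + 1 = 41
  Serial.printf("Raw features per window: %d\n", raw_features);
  Serial.printf("Windows per %d-second sample: %d\n", sample_s, windows);
}

void loop() {}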
You should use the best SR for your case, considering Nyquist’s theorem, which states that a periodic signal must be sampled at more than twice its highest frequency component. At SR = 50 Hz, for example, the captured movements should have no significant components above 25 Hz.
Data preprocessing is a challenging area for embedded machine learning. Still, Edge Impulse helps overcome this with its digital signal processing (DSP) preprocessing step and, more specifically, the Spectral Features.
On the Studio, this dataset will be the input of a Spectral Analysis block, which is excellent for analyzing repetitive motion, such as data from accelerometers. This block will perform a DSP (Digital Signal Processing), extracting features such as “FFT” or “Wavelets”. In the most common case, FFT, the Time Domain Statistical features per axis/channel are:
- RMS
- Skewness
- Kurtosis
And the Frequency Domain Spectral features per axis/channel are:
- Spectral Power
- Skewness
- Kurtosis
For example, for an FFT length of 32 points, the Spectral Analysis Block’s resulting output will be 21 features per axis (a total of 63 features).
Those 63 features will be the Input Tensor of a Neural Network Classifier and the Anomaly Detection model (K-Means).
You can learn more by digging into the lab DSP Spectral Features.
Model Design
Our classifier will be a Dense Neural Network (DNN) with 63 neurons in its input layer, two hidden layers with 20 and 10 neurons, and an output layer with four neurons (one per class), as shown here:
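To get a feel for how small this model is (a back-of-the-envelope sketch, assuming fully connected layers with biases, as described above), we can count its trainable parameters:
// Parameter count for a fully connected 63-20-10-4 network (weights + biases).
const int layers[] = {63, 20, 10, 4};

void setup() {
  Serial.begin(115200);
  int params = 0;
  for (int i = 0; i < 3; i++) {
    params += layers[i] * layers[i + 1] + layers[i + 1];  // weights + biases of each layer
  }
  Serial.printf("Trainable parameters: %d\n", params);  // (63*20+20)+(20*10+10)+(10*4+4) = 1534
}

void loop() {}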
Impulse Design
An impulse takes raw data, uses signal processing to extract features, and then uses a learning block to classify new data.
We will also take advantage of a second model, K-means, which can be used for anomaly detection. If we imagine that our known classes form clusters, any sample that does not fit into those clusters could be an outlier, an anomaly (for example, a container rolling out of a ship on the ocean).
Imagine our XIAO rolling or moving upside-down, in a movement completely different from the ones it was trained on.
Below is our final Impulse design:
Generating features
At this point in our project, we have defined the pre-processing method and designed the model. Now, it is time to get the job done. First, let’s take the raw data (time series) and convert it to tabular data. Go to the Spectral Features tab and select Save Parameters:
At the top menu, select the Generate Features option and then the Generate Features button. Each 2-second data window will be converted into one data point of 63 features.
The Feature Explorer will show those data in 2D using UMAP. Uniform Manifold Approximation and Projection (UMAP) is a dimension reduction technique that can be used for visualization similarly to t-SNE but also for general non-linear dimension reduction.
The visualization allows one to verify that the classes present an excellent separation, which indicates that the classifier should work well.
Optionally, you can analyze the relative importance of each feature for one class compared with other classes.
Training
Our model has four layers, as shown below:
As hyperparameters, we will use a learning rate of 0.005, 20% of the data for validation, and 30 epochs. After training, we can see that the accuracy is 97%.
For anomaly detection, we should choose the suggested features, which are precisely the most important ones found during feature extraction. The number of clusters will be 32, as suggested by the Studio:
Testing
Using the 20% of the data set aside during the data capture phase, we can verify how our model behaves with unknown data; if not 100% (which would be expected), the result was not that good (8%), mainly due to the Terrestrial class. Since we have four classes (whose outputs should add up to 1.0), we can set a lower threshold for a class to be considered valid (for example, 0.4):
Now, the Test accuracy will go up to 97%.
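The same idea can be applied on the device when reading the classifier output. The fragment below is only an illustration (it assumes the standard structures of the generated Edge Impulse library, such as the ei_impulse_result_t result filled by run_classifier(), EI_CLASSIFIER_LABEL_COUNT, and ei_printf()) of how a 0.4 minimum-confidence threshold could be enforced inside the example sketch’s loop:
// Illustrative post-processing: accept a prediction only if the top class score
// clears a minimum confidence (0.4, matching the threshold set in the Studio).
float best_score = 0.0f;
int best_idx = -1;
for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
  if (result.classification[i].value > best_score) {
    best_score = result.classification[i].value;
    best_idx = (int)i;
  }
}
if (best_idx >= 0 && best_score >= 0.4f) {
  ei_printf("Prediction: %s (%.2f)\n", result.classification[best_idx].label, best_score);
} else {
  ei_printf("No class above the 0.4 threshold (top score: %.2f)\n", best_score);
}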
You should also use your device (which is still connected to the Studio) and perform some Live Classification.
Be aware that here you will capture real data with your device and upload it to the Studio, where the inference will be run using the trained model (but the model is NOT yet on your device).
Deploy
Now it is time for magic! The Studio will package all the needed libraries, preprocessing functions, and trained models, downloading them to your computer. You should select the option Arduino Library, and at the bottom, choose Quantized (Int8) and Build. A Zip file will be created and downloaded to your computer.
On your Arduino IDE, go to the Sketch tab, select the option Add .ZIP Library, and choose the .zip file downloaded by the Studio:
Inference
Now, it is time for a real test. We will make inferences that are wholly disconnected from the Studio. Let’s change one of the code examples created when you deploy the Arduino Library.
In your Arduino IDE, go to the File/Examples tab, look for your project, and among its examples, select nano_ble_sense_accelerometer:
Of course, this is not your board, but we can have the code working with only a few changes.
For example, at the beginning of the code, you have the library related to Arduino Sense IMU:
/* Includes --------------------------------------------------------------- */
#include <XIAO-ESP32S3-Motion-Classification_inferencing.h>
#include <Arduino_LSM9DS1.h>
Change the “includes” portion with the code related to the IMU:
#include <XIAO-ESP32S3-Motion-Classification_inferencing.h>
#include "I2Cdev.h"
#include "MPU6050.h"
#include "Wire.h"
Change the Constant Defines
/* Constant defines ------------------------------------------------------- */
MPU6050 imu;
int16_t ax, ay, az;
#define ACC_RANGE 1 // 0: -/+2G; 1: +/-4G
#define CONVERT_G_TO_MS2 (9.81/(16384/(1.+ACC_RANGE)))
#define MAX_ACCEPTED_RANGE ((2*9.81)+(2*9.81)*ACC_RANGE)  // 2g in m/s^2 for ACC_RANGE=0; 4g for ACC_RANGE=1
In the setup function, initialize the IMU and set the offset values and range:
// initialize device
Serial.println("Initializing I2C devices...");
Wire.begin();
imu.initialize();
delay(10);
// Set MPU6050 offset calibration values
imu.setXAccelOffset(-4732);
imu.setYAccelOffset(4703);
imu.setZAccelOffset(8867);
imu.setXGyroOffset(61);
imu.setYGyroOffset(-73);
imu.setZGyroOffset(35);
imu.setFullScaleAccelRange(ACC_RANGE);
In the loop function, the buffer positions buffer[ix], buffer[ix + 1], and buffer[ix + 2] will receive the 3-axis data captured by the accelerometer. In the original code, you have the line:
IMU.readAcceleration(buffer[ix], buffer[ix + 1], buffer[ix + 2]);
Replace it with this block of code:
imu.getAcceleration(&ax, &ay, &az);
buffer[ix + 0] = ax;
buffer[ix + 1] = ay;
buffer[ix + 2] = az;
You should also swap the order of the following two blocks of code. First, convert the raw data to meters per second squared (m/s²), and then perform the test against the maximum accepted range (which here is in m/s², whereas in the original Arduino example it was in g):
buffer[ix + 0] *= CONVERT_G_TO_MS2;
buffer[ix + 1] *= CONVERT_G_TO_MS2;
buffer[ix + 2] *= CONVERT_G_TO_MS2;
for (int i = 0; i < 3; i++) {
if (fabs(buffer[ix + i]) > MAX_ACCEPTED_RANGE) {
buffer[ix + i] = ei_get_sign(buffer[ix + i]) * MAX_ACCEPTED_RANGE;
}
}
And that is it! You can now upload the code to your device and proceed with the inferences. The complete code is available on the project’s GitHub.
Now you should try your movements, seeing the result of the inference of each class on the images:
And, of course, some “anomaly”, for example, putting the XIAO upside-down. The anomaly score will be over 1:
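If you want to act on that score in your own code, the generated library exposes it through the inference result. The fragment below is only a sketch (assuming the library’s standard EI_CLASSIFIER_HAS_ANOMALY flag and result.anomaly field) showing how an alert could be raised, placed after run_classifier() in the example:
// Illustrative anomaly check: scores above 1.0 are treated as abnormal movement here.
#if EI_CLASSIFIER_HAS_ANOMALY == 1
  ei_printf("Anomaly score: %.3f\n", result.anomaly);
  if (result.anomaly > 1.0f) {
    ei_printf("WARNING: movement does not match any trained class!\n");
  }
#endif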
Conclusion
Regarding the IMU, this project used the low-cost MPU6050, but it could also use other IMUs, for example, the LCM20600 (6-axis), which is part of the Seeed Grove - IMU 9DOF (lcm20600+AK09918). You can take advantage of this sensor’s integrated Grove connector, which can be helpful if you use the XIAO with an extension board, as shown below:
You can follow the instructions here to connect the IMU to the MCU. Just note that to use the Grove ICM20600 accelerometer, it is essential to update the I2Cdev.cpp and I2Cdev.h files that you will download from the library provided by Seeed Studio. To do so, replace both files with the ones from this link. You can find a sketch for testing the IMU on the GitHub project: accelerometer_test.ino.
On the project’s GitHub repository, you will find the latest version of all the code and other docs: XIAO-ESP32S3 - IMU.