
Machine Learning Core Repository on GitHub, 4 Application Examples to Try Machine Learning in Sensors in Minutes

ST has published its machine learning core repository on GitHub, with examples and configuration files, to significantly improve the developer experience. Artificial intelligence is notoriously difficult to get started with because it relies on data science expertise. Additionally, creating the right algorithm, such as a decision tree, and configuring it correctly can also be tricky.

Unfortunately, these issues tend to limit the number of engineers who can easily start working on machine learning applications. Hence, we published a repository on GitHub to solve this problem. The package includes subsets of data logs as well as application and configuration examples for the LSM6DSOX, LSM6DSRX, ISM330DHCX, and IIS2ICLX inertial sensors. It has already served key ST customers, who used it to develop their commercial solutions, and we thought other members of our community could benefit from it as well.

Another Way to Make Machine Learning More Accessible

The supported sensors are unique because they all have a machine learning core that can run one or more decision trees in parallel. ST was the first to provide such a component and received awards for it. It remains unique because a machine learning core can provide decision-making capabilities at a fraction of a microcontroller’s power consumption. As a result, ST has expanded its offering with new devices since 2019, such as the LSM6DSRX and ISM330DHCX. We also lowered the barrier to entry by releasing tools like Unico-GUI, a utility with a graphical interface that helps with data collection and the machine learning core’s configuration. The GitHub repository is thus another initiative that aims to make machine learning more accessible: anyone can simply follow the steps outlined in the software package and test applications in minutes.
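To make this concrete, below is a minimal Python sketch of the kind of logic a machine learning core evaluates: a few features computed over a window of accelerometer samples, compared against thresholds in a small decision tree. The feature names, thresholds, and class labels are illustrative assumptions; real configurations are built with Unico-GUI and loaded into the sensor as register settings.

# Minimal sketch of the kind of logic an MLC decision tree evaluates.
# Features, thresholds, and class labels are hypothetical; real trees are
# generated with Unico-GUI and stored as sensor register configurations.
import numpy as np

def window_features(accel_xyz):
    """Simple features from an (N, 3) accelerometer window, in g."""
    norm = np.linalg.norm(accel_xyz, axis=1)
    return {"mean_norm": norm.mean(), "var_norm": norm.var()}

def toy_decision_tree(f):
    """Three-node tree: each node is a single threshold comparison."""
    if f["var_norm"] < 0.01:      # barely any motion
        return "stationary"
    if f["mean_norm"] > 1.5:      # strong dynamic acceleration
        return "vigorous_motion"
    return "moderate_motion"

# One second of synthetic 26 Hz data, device resting flat (about 1 g on Z)
window = np.tile([0.0, 0.0, 1.0], (26, 1)) + np.random.normal(0, 0.002, (26, 3))
print(toy_decision_tree(window_features(window)))   # -> "stationary"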

The Machine Learning Core Repository and Sport

Gym Activity

Setting a wristband in a gym

One application example in the machine learning core repository is gym activity recognition running on the LSM6DSOX. The program enables a wristband to automatically distinguish between bicep curls, lateral raises, squats, and a resting position. Users must, however, tell the system whether the wearable is worn on the right or left hand. The application relies on data collected with a wristband built around an LSM6DSOX inertial module. ST gathered data with the wearable on a right hand and then a left hand, and now provides a subset of the “left hand” data in the repository, along with two configuration files, one for each hand. Additionally, developers will find examples to help them design a similar algorithm and study the filters we applied to the accelerometer’s signal.
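As a rough illustration of that offline workflow, the Python sketch below low-pass filters a logged accelerometer signal, cuts it into windows, and extracts per-axis features that could feed a decision tree. The sampling rate, cutoff frequency, file name, and column names are assumptions rather than the actual layout of the repository’s data logs.

# Hedged sketch: filter a logged accelerometer signal, split it into
# windows, and compute per-axis features for a decision tree.
# Sampling rate, cutoff, file name, and column names are assumptions.
import numpy as np
import pandas as pd
from scipy.signal import butter, filtfilt

FS = 26.0  # assumed output data rate in Hz

def lowpass(signal, cutoff_hz=5.0):
    """Zero-phase low-pass filter to suppress high-frequency noise."""
    b, a = butter(2, cutoff_hz / (FS / 2), btype="low")
    return filtfilt(b, a, signal)

def features_per_window(df, win=52):
    """Mean and peak-to-peak of each filtered axis over 2-second windows."""
    rows = []
    for start in range(0, len(df) - win + 1, win):
        w = df.iloc[start:start + win]
        row = {}
        for axis in ("acc_x", "acc_y", "acc_z"):    # hypothetical column names
            filtered = lowpass(w[axis].to_numpy())
            row[axis + "_mean"] = filtered.mean()
            row[axis + "_p2p"] = np.ptp(filtered)
        rows.append(row)
    return pd.DataFrame(rows)

# Usage with a hypothetical log file:
# log = pd.read_csv("bicep_curl_left_hand.csv")
# print(features_per_window(log).head())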

Yoga Pose

A woman performing a meditative pose

The other sport application is fascinating because it runs on a SensorTile.box and can recognize 12 yoga positions as well as two non-yoga standing positions (standing still and standing in motion). The device attaches to the user’s left leg and uses its inertial sensor to run a decision tree with 20 nodes. As the user holds a pose, the system detects it in less than a second, distinguishing between a plank, the child’s pose, the downward-facing dog, or the meditation pose, among many others. The repository also offers the Unico-GUI data logs that helped create the decision tree classifier. The system determines a posture by tracking the mean values of the accelerometer on the X, Y, and Z axes.
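For readers who want to prototype a similar classifier offline before turning it into a machine learning core configuration, the sketch below trains a small scikit-learn decision tree on mean accelerometer vectors. The synthetic data merely stands in for the repository’s Unico-GUI logs, and the pose labels and gravity directions are assumptions.

# Hedged sketch: prototype a static-pose classifier on mean accelerometer
# vectors with a deliberately small decision tree. Synthetic data, pose
# labels, and gravity directions are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def synth_pose(gravity_xyz, n=200):
    """Mean accelerometer vector per window for a static pose, in g."""
    return rng.normal(gravity_xyz, 0.05, size=(n, 3))

# Static poses mostly differ by how gravity projects onto the three axes.
X = np.vstack([
    synth_pose([0.0, 0.0, 1.0]),   # e.g. standing upright
    synth_pose([1.0, 0.0, 0.0]),   # e.g. plank, device rotated 90 degrees
    synth_pose([0.0, 0.7, 0.7]),   # e.g. downward-facing dog
])
y = np.repeat(["standing", "plank", "down_dog"], 200)

# Cap the tree size, mirroring the small node budget of the MLC example
clf = DecisionTreeClassifier(max_leaf_nodes=10, random_state=0).fit(X, y)
print(clf.tree_.node_count, clf.predict([[0.95, 0.05, 0.02]]))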

The Machine Learning Core Repository and Motion Detection

Vehicle Stationary Detection

The stationary detection algorithm determines whether a car is moving thanks to the more precise LSM6DSRX. The application uses data from both the accelerometer and the gyroscope and works regardless of the sensor’s orientation. The GitHub repository even offers a subset of the data logs collected to build this program, along with configuration examples to help developers work on similar algorithms. The example helps beginners understand how a few filters can make a big difference to the input signal, and the configuration shows how we implemented a decision tree with 30 nodes. ST used a similar algorithm in its Baby Crying Detector: a moving car implies the presence of a driver, which means that even if a baby is crying, there is no need to sound an alarm because an adult is present in the car.
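One way to stay orientation-independent, sketched below in Python, is to build features from the vector norm of the accelerometer and gyroscope, since the norm does not change when the sensor is mounted at a different angle. The window length and thresholds are illustrative assumptions, not the values used in the repository’s configuration.

# Hedged sketch of an orientation-independent motion cue: the variance of
# the accelerometer and gyroscope vector norms over one window.
# Thresholds and window length are illustrative assumptions.
import numpy as np

def is_moving(accel_xyz, gyro_xyz, acc_var_thr=1e-3, gyro_var_thr=2.0):
    """accel in g, gyro in dps; both arrays shaped (N, 3) for one window."""
    acc_var = np.linalg.norm(accel_xyz, axis=1).var()
    gyro_var = np.linalg.norm(gyro_xyz, axis=1).var()
    return acc_var > acc_var_thr or gyro_var > gyro_var_thr

# Parked car: roughly 1 g of gravity, almost no rotation
acc = np.tile([0.0, 0.0, 1.0], (52, 1)) + np.random.normal(0, 0.001, (52, 3))
gyr = np.random.normal(0, 0.1, (52, 3))
print(is_moving(acc, gyr))   # expected: False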

Head Gesture

The head gesture recognition application also uses an LSM6DSRX. The sensor, embedded in a headphone, can determine whether users are nodding, shaking or swinging their head, walking, or staying still. ST collected data for this algorithm with this particular inertial sensor, and a subset of the data logs is available. The application uses data from both the accelerometer and the gyroscope on the X, Y, and Z axes. However, not all data sources receive the same filters. For instance, the system monitors the accelerometer’s Y axis for a maximum threshold while it looks for a minimum threshold on the X axis of the same sensing element. It is thus a great example of the importance of signal processing in machine learning applications. Additionally, the decision tree itself is fairly straightforward, with only seven nodes to detect five classes.
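The Python sketch below mimics that idea of axis-specific features: a maximum threshold on the accelerometer’s Y axis, a minimum threshold on its X axis, and a gyroscope energy term feeding a few nested comparisons. The thresholds, class names, and exact structure are illustrative assumptions rather than the repository’s actual seven-node tree.

# Hedged sketch of axis-specific features for head-gesture classification.
# Thresholds, class names, and structure are illustrative assumptions.
import numpy as np

def head_gesture(acc_xyz, gyro_xyz):
    """Classify one window of (N, 3) accel [g] and gyro [dps] samples."""
    acc_y_max = acc_xyz[:, 1].max()            # maximum threshold on accel Y
    acc_x_min = acc_xyz[:, 0].min()            # minimum threshold on accel X
    gyro_energy = (gyro_xyz ** 2).sum(axis=1).mean()

    if gyro_energy < 5.0:
        return "stationary"
    if acc_y_max > 1.3:
        return "walking"                       # step impacts raise accel peaks
    if acc_x_min < -0.3:
        return "nod"
    if gyro_energy > 500.0:
        return "shake"
    return "swing"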

For more information, visit blog.st.com

