How sensor hubs on smartphones helped bring artificial intelligence to the edge

AIoT, TinyML, EdgeAI, and MLSensors are the latest buzzwords in the Artificial Intelligence and Embedded Systems communities. There’s plenty of hype around designing highly optimized, low-footprint machine learning algorithms that run on ultra-low-power microcontrollers and DSPs targeted at sensing applications, including but not limited to audio and video.

This approach is particularly useful in ‘always-on’ applications where power consumption needs to be minimized and invasion of privacy is a serious concern.

Some existing applications already leverage embedded intelligence:

- Wake word detection: ‘Ok Google’ and ‘Hey Siri’ are prime examples that we use almost every day

- Car crash detection: newer smartphones can fuse data from multiple sensors like the microphone, inertial sensors, and GPS to detect a crash and alert emergency services

- Step counting and workout detection: wearables use data from inertial sensors and biosensors along with intelligent algorithms to track daily activities (a simple step-counting sketch follows this list)
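
To make the step-counting idea concrete, here is a minimal sketch of the kind of algorithm a wearable’s sensor hub might run: a debounced peak detector over the accelerometer magnitude. It is plain C, and the thresholds, sampling rate, and synthetic test signal are illustrative assumptions, not any vendor’s actual implementation.

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Illustrative thresholds; a real device tunes these per user and sensor. */
#define STEP_THRESHOLD_G 1.15f /* magnitude (in g) a step peak must exceed */
#define MIN_STEP_GAP_MS  300   /* debounce: ignore peaks closer than this  */

typedef struct {
    uint32_t last_step_ms;    /* timestamp of the last counted step */
    uint32_t step_count;      /* running total                      */
    int      above_threshold; /* are we currently inside a peak?    */
} step_counter_t;

/* Feed one accelerometer sample (in g) plus a timestamp in milliseconds. */
static void step_counter_update(step_counter_t *sc,
                                float ax, float ay, float az,
                                uint32_t now_ms)
{
    float mag = sqrtf(ax * ax + ay * ay + az * az);

    if (!sc->above_threshold && mag > STEP_THRESHOLD_G) {
        /* Rising edge of a peak: count it unless it follows too soon. */
        if (now_ms - sc->last_step_ms > MIN_STEP_GAP_MS) {
            sc->step_count++;
            sc->last_step_ms = now_ms;
        }
        sc->above_threshold = 1;
    } else if (sc->above_threshold && mag < STEP_THRESHOLD_G) {
        sc->above_threshold = 0; /* falling edge: wait for the next peak */
    }
}

int main(void)
{
    step_counter_t sc = {0};

    /* Synthetic "walking" signal at 50 Hz: one peak roughly every 500 ms. */
    for (uint32_t t = 0; t < 5000; t += 20) {
        float mag = 1.0f + 0.3f * sinf(2.0f * 3.14159f * (float)t / 500.0f);
        step_counter_update(&sc, mag, 0.0f, 0.0f, t);
    }
    printf("steps counted: %u\n", (unsigned)sc.step_count);
    return 0;
}
```

On real hardware the same update function would be driven by the accelerometer’s data-ready interrupt rather than a synthetic loop.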

What’s common to all of these cases is dedicated low-power processing hardware, tightly coupled with the sensors, running highly optimized algorithms to make the relevant inferences. Another commonality is that they are all subsystems of more complex devices.

Owing to these developments, in the near future we may see embedded intelligence in other standalone devices such as:

- Completely offline smart cameras to detect human presence [1]

- Environmental sensors to detect forest fires [2] and illegal tree felling [3]

The growing popularity and widespread applications of Embedded Intelligence are inspiring. But where did this all begin?

Here’s my quick take on how ‘sensor hubs’, particularly those in smartphones, helped kickstart this movement.

But first, what is a sensor hub?

By definition, it’s a co-processor, microcontroller, or DSP responsible for integrating data from multiple low-power sensors in a device and providing simplified insights, with the intent of freeing up processing bandwidth on the main application processor.
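
A rough way to picture that job in firmware is a duty-cycled loop: sample the sensors at a low rate, run the algorithms locally, and wake the application processor only when there is a meaningful event to report. The sketch below is purely conceptual; the types and functions (read_sensors, run_local_algorithms, wake_application_processor, enter_low_power_sleep) are placeholders standing in for a real sensor HAL and inter-processor interface.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Placeholder types and stubs standing in for a real sensor HAL and IPC layer. */
typedef struct { float ax, ay, az; float lux; bool mic_activity; } sensor_frame_t;
typedef enum { EVENT_NONE, EVENT_STEP, EVENT_PICKUP, EVENT_WAKE_WORD } hub_event_t;

static sensor_frame_t read_sensors(void)                  { sensor_frame_t f = {0}; return f; }
static hub_event_t run_local_algorithms(sensor_frame_t f) { (void)f; return EVENT_NONE; }
static void wake_application_processor(hub_event_t e)     { printf("AP woken, event %d\n", (int)e); }
static void enter_low_power_sleep(uint32_t ms)            { (void)ms; /* e.g. WFI + timer wakeup */ }

/* The hub's whole job in one loop: sample, infer locally, and interrupt
 * the big application processor only when something actually happened. */
int main(void)
{
    for (int i = 0; i < 100; i++) {            /* forever, on real hardware */
        sensor_frame_t frame = read_sensors();
        hub_event_t event = run_local_algorithms(frame);

        if (event != EVENT_NONE) {
            wake_application_processor(event); /* the expensive part, done rarely */
        }
        enter_low_power_sleep(20);             /* e.g. a 50 Hz duty cycle */
    }
    return 0;
}
```

The power win comes from how rarely wake_application_processor is called: the big SoC stays asleep unless the hub has something worth saying.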

The sensor hub started off as a neat power-optimization trick in smartphones. Phones as early as the iPhone 5s and the Nexus 5X and 6P featured the Apple M7 coprocessor [4] and the Android Sensor Hub [5], respectively.

Apple used the M7 to handle demanding sensors like the accelerometer, gyroscope, and compass, as well as the sensor-fusion algorithms, while the Android Sensor Hub did the same and also ran algorithms for advanced activity recognition. Motorola further innovated on inertial-sensor features with the catchy “chop-chop” [6] gesture to turn on the flashlight.
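
As an illustration of what such sensor fusion can look like, here is a minimal complementary filter that blends gyroscope and accelerometer readings into a pitch estimate. This is a textbook technique shown for intuition only; the fusion algorithms Apple and Google actually ship are far more sophisticated (and proprietary), and every constant below is an assumption.

```c
#include <math.h>
#include <stdio.h>

/* Complementary filter: trust the gyro for fast changes (integration) and the
 * accelerometer's gravity reading for the slow, drift-free reference. */
#define ALPHA 0.98f /* gyro weight; (1 - ALPHA) is the accelerometer correction */

static float fuse_pitch(float pitch_prev_deg,
                        float gyro_rate_dps,          /* pitch rate from gyro, deg/s */
                        float ax, float ay, float az, /* accelerometer reading in g  */
                        float dt_s)
{
    /* Pitch implied by the direction of gravity as seen by the accelerometer. */
    float accel_pitch_deg = atan2f(-ax, sqrtf(ay * ay + az * az)) * (180.0f / 3.14159265f);

    /* Integrate the gyro, then gently pull the result toward the accel estimate. */
    float gyro_pitch_deg = pitch_prev_deg + gyro_rate_dps * dt_s;
    return ALPHA * gyro_pitch_deg + (1.0f - ALPHA) * accel_pitch_deg;
}

int main(void)
{
    float pitch = 0.0f;

    /* Simulate holding the phone tilted ~30 degrees with a slightly biased gyro,
     * sampled at 100 Hz for 10 seconds. */
    for (int i = 0; i < 1000; i++) {
        pitch = fuse_pitch(pitch, 0.5f /* deg/s of gyro bias */,
                           -0.5f, 0.0f, 0.866f, 0.01f);
    }
    printf("estimated pitch: %.1f deg\n", pitch);
    return 0;
}
```

The gyroscope is trusted for fast changes while the accelerometer’s gravity reading slowly corrects the drift, which is why the simulated gyro bias above stays bounded instead of accumulating.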

We also began to see the overlap of sensors (the microphone) and machine learning running on a low-power processor become popular with wake-word detection like “Hey Siri” and “Ok Google”. These features are being pushed to greater heights with quick phrases [7] and ‘Now Playing’ on the latest Pixel phones [11].
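
Part of why wake-word detection fits on such small processors is its cascaded, ‘always-on’ structure: a nearly free detector (often just frame energy) runs continuously and only invokes the heavier keyword model when there is something to hear. The sketch below shows only that gating pattern; run_keyword_model is a made-up placeholder where a real product would run a small neural network, and the threshold and frame size are assumptions.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define FRAME_LEN        320       /* 20 ms of audio at 16 kHz           */
#define ENERGY_THRESHOLD 1000000LL /* illustrative; tuned per microphone */

/* Stage 1: nearly free. Sum of squares over one frame of 16-bit PCM. */
static int64_t frame_energy(const int16_t *samples, int n)
{
    int64_t e = 0;
    for (int i = 0; i < n; i++)
        e += (int64_t)samples[i] * samples[i];
    return e;
}

/* Stage 2: the "expensive" part. A real hub would run a small neural network
 * here; this placeholder merely stands in for it. */
static bool run_keyword_model(const int16_t *samples, int n)
{
    (void)samples; (void)n;
    return false;
}

static void process_audio_frame(const int16_t *frame)
{
    /* Only spend cycles on the model when the cheap gate fires. */
    if (frame_energy(frame, FRAME_LEN) > ENERGY_THRESHOLD) {
        if (run_keyword_model(frame, FRAME_LEN))
            printf("wake word detected: wake the application processor\n");
    }
}

int main(void)
{
    int16_t silence[FRAME_LEN] = {0};
    int16_t loud[FRAME_LEN];
    for (int i = 0; i < FRAME_LEN; i++)
        loud[i] = (int16_t)((i % 2) ? 8000 : -8000); /* crude loud test signal */

    process_audio_frame(silence); /* gate stays closed; the model never runs */
    process_audio_frame(loud);    /* gate opens; the model is consulted      */
    return 0;
}
```

In a real hub the second stage, not the gate, dominates the power budget, which is exactly why it runs as rarely as possible.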

Thus, over the last 6–7 years, smartphones and their sensor hubs have served as the perfect proof of concept, showing the world that machine learning algorithms for sensors can be deployed on very low-power computing platforms like microcontrollers and DSPs.

It’s great to see this movement get its own name and independent audience in the form of the TinyML, EdgeAI, and MLSensors communities.

Interestingly enough, semiconductor giants like Analog Devices, TDK, and Robert Bosch, who design and manufacture many of these smartphone sensors, have their own unique take on sensor hubs.

While the goal is still the same (providing useful insights from multiple sensors while consuming as little power as possible), the applications are much broader. Since smartphones already have sensor hubs of their own, independent ones are being developed for wearables, automobiles, and other smart gadgets.

Sensor hubs on smartphones were initially discrete components. The M7 motion coprocessor, for example, is an independent chip based on the NXP LPC18A1, but over time these coprocessors were integrated into the main smartphone SoC.

However, discrete sensor hubs are still available from semiconductor and sensor manufacturers. They combine AI and sensors to enable niche use cases like swim coaching [8].

These are often just microcontrollers, typically from the ARM Cortex-M series, tightly coupled with the sensor and pre-loaded with algorithms to enable a specific use case. This is great for the manufacturers of these sensors because they are able to monetize not only the hardware but also the algorithms they develop for it.

It’s also great for companies using these ‘smart sensors’ to develop their own gadgets, as they don’t need to spend time developing niche algorithms and can focus on system integration instead.

Sensor hubs are still at a very nascent stage, mostly using general-purpose microcontrollers. But as hardware continues to improve, the possibilities are endless. Sensors themselves are getting more accurate and precise, and ARMv9, with its focus on DSP and ML capabilities, will greatly expand the set of models that can be implemented on an embedded device.

The Ethos-U55 is a microNPU (Neural Processing Unit) from ARM that could soon find its way into sensor hubs that already implement ARM IP [9]. Many startups, like Syntiant, are also developing specialized hardware for neural-network inference at the edge [10].

Exciting times ahead in the world of sensors! Stay tuned for more musings on EdgeAI and Smart Sensors…

[1] https://blog.tensorflow.org/2019/10/visual-wake-words-with-tensorflow-lite_30.html

[2] https://www.bosch.com/stories/early-forest-fire-detection-sensors/

[3] https://www.mdpi.com/1424-8220/21/22/7593

[4] https://en.wikipedia.org/wiki/Apple_motion_coprocessors

[5] https://www.androidpolice.com/2015/09/29/the-new-android-sensor-hub-will-significantly-improve-idle-battery-life-while-doing-more-with-sensor-data/

[6] https://www.facebook.com/MotorolaIN/videos/chop-chop-to-light-up-do-more-with-the-intuitive-moto-actions-on-the-motog5s-buy/1062923173744146/

[7] https://support.google.com/assistant/answer/9475056?hl=en&co=GENIE.Platform%3DAndroid

[8] https://www.bosch-sensortec.com/white-paper-swimming.html

[9] https://developer.arm.com/Processors/Ethos-U55

[10] https://www.syntiant.com/

[11] https://blog.esper.io/how-android-listens-to-you-with-ultra-low-power-sensors-w-kieron-quinn/


Check out more of Kenneth Joel’s writing on Medium, where he covers sensors, embedded systems, artificial intelligence, and how they converge.