Running small machine learning models on embedded devices has become a rich topic of research and investment, both at large tech companies and in academia. It's not mainstream yet, but the use cases where TinyML delivers the most value may not be found in rich countries with robust infrastructure and bigger budgets for computing projects.
This is why Pete Warden, a staff research engineer at Google, thinks developing countries might offer the best opportunities to prove the technology and provide compelling use cases. In a talk hosted by the tinyML Foundation, Warden shared his thinking about the benefits of TinyML for poor and remote areas, and asked participants to share their project ideas and use cases.
TinyML is a somewhat fuzzy concept. Some people use the term to refer to machine learning models running on smartphones or low-performance edge gateway devices. But in this newsletter, TinyML refers to machine learning models running on embedded devices that have microcontrollers and limited memory. Even Warden admitted there’s no formal classification, and said that he looks at how much memory a device has and whether or not it’s an embedded computer.
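Memory is the constraint that makes this class of device different. One of the standard tricks for squeezing a model into a microcontroller's few hundred kilobytes of RAM is 8-bit quantization, as used by frameworks like TensorFlow Lite. A minimal sketch of the affine quantization scheme (the weight values and calibration parameters here are made up for illustration):

```python
# Affine quantization: real_value ≈ scale * (quantized_value - zero_point).
# Storing weights as int8 instead of float32 cuts memory use by 4x,
# which is often the difference between fitting on an MCU or not.

def quantize(values, scale, zero_point):
    """Map float values to int8 codes, clamped to the int8 range."""
    q = [round(v / scale) + zero_point for v in values]
    return [max(-128, min(127, x)) for x in q]

def dequantize(q_values, scale, zero_point):
    """Recover approximate floats from the int8 codes."""
    return [scale * (q - zero_point) for q in q_values]

weights = [0.12, -0.5, 0.33, 0.07]      # hypothetical model weights
scale, zero_point = 0.005, 0            # hypothetical calibration parameters
q = quantize(weights, scale, zero_point)
approx = dequantize(q, scale, zero_point)
# Each weight now occupies 1 byte instead of 4, at the cost of a
# rounding error of at most half a quantization step (scale / 2).
```

The trade-off is a small, bounded rounding error per weight, which in practice costs these tiny models little accuracy.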
These embedded MCUs cost less than the chips inside smartphones or traditional computers, consume less power, and are typically designed for rugged environments. All of these qualities make them well suited to deployment in poor, remote areas of what Warden called “the Global South.” And because many of these areas are remote and/or lack high-quality connectivity, they also provide a natural use case for TinyML: when sending data to the cloud for processing isn’t possible, running machine learning locally on the device is the only option.
Warden pointed to a machine learning project in South America where researchers have mounted smartphones in trees to detect illegal logging by listening for the sound of chainsaws or heavy equipment, and a project in Tanzania that uses computer vision on a smartphone to track the health of cassava plants. Warden said both tasks could be done locally on the device, with cheaper and more rugged components, using TinyML.
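The chainsaw example gives a feel for the kind of always-on loop such a device runs. A production system would use a trained audio classifier, but as a toy sketch of sustained-noise detection on a raw audio buffer (the frame length, threshold, and function names are all invented for illustration):

```python
import math

def rms_frames(samples, frame_len):
    """Split a signal into frames and return the RMS energy of each frame."""
    frames = [samples[i:i + frame_len] for i in range(0, len(samples), frame_len)]
    return [math.sqrt(sum(s * s for s in f) / len(f)) for f in frames if f]

def detect_sustained_noise(samples, frame_len=400, threshold=0.3, min_frames=5):
    """Flag the signal if enough consecutive frames exceed the energy threshold.

    Requiring a run of loud frames (rather than one loud frame) filters out
    brief spikes like a falling branch, keeping sustained machinery noise.
    """
    run = 0
    for energy in rms_frames(samples, frame_len):
        run = run + 1 if energy > threshold else 0
        if run >= min_frames:
            return True
    return False

# A quiet forest buffer vs. a sustained 440 Hz tone at 16 kHz sampling:
quiet = [0.01] * 4000
loud = [0.8 * math.sin(2 * math.pi * 440 * i / 16000) for i in range(4000)]
```

The point of the sketch is the shape of the workload, not the method: a few kilobytes of buffer, simple per-frame arithmetic, and a local yes/no decision, with no network round trip anywhere in the loop.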
But several obstacles stand in the way. First, the embedded computing world is heterogeneous, with many popular devices and platforms, so a lot of customization has to happen to tie into the varied hardware out there.
There are lots of free resources out there for people who want to play with TinyML, but it’s still such a new field that everyone is learning together. Warden fielded several questions about common software resources, how to gather data, and the best ways to build models for embedded devices.
Finally, as is always the case with machine learning, access to good training data is a challenge. Warden highlighted efforts to build voice recognition for multiple languages, and the difficulty of gathering training data for the problems that need solving in areas where computers are both scarce and expensive.
And yet, because these countries have critical problems that TinyML could solve, Warden expressed confidence that people would be able to overcome those challenges and build cool things. He also urged anyone building TinyML projects in developing countries to reach out for help.
Personally, I think Warden’s assessment of why TinyML makes sense in the developing world can also apply in the U.S. As climate change ravages forests and floods our cities, many parts of the country would likely benefit from local machine learning on cheap sensors to predict the direction of forest fires or storms. Another use case where TinyML could help is counting the number of people in places like public shelters to help ensure compliance with social distancing.
One way or another, I’m excited to hear what projects people are building, because having cheaper devices, local processing, and a way to handle limited or nonexistent connectivity will make the internet of things smarter — even without the internet or with relatively dumb things.