Google works out ML inference stack for Android
Author: EIS | Release Date: Jul 21, 2021
Aimed at on-device machine learning, Google has announced what it calls the Android ML Platform. The company describes it as an updateable, fully integrated ML inference stack, and its first partner in this area will be Qualcomm.
Developers will get a “consistent API that spans Android versions”, promises the search giant, with regular updates delivered via Google Play Services. These will be made available outside of the Android OS release cycle.
Google says it will also ship on-device inference binaries with Android (TensorFlow Lite for Android) to help reduce APK sizes. TensorFlow Lite for Android will also use metadata embedded in the model to automatically enable hardware acceleration. This will allow “developers to get the best performance possible on each Android device”, says Google.
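For context, TensorFlow Lite already lets developers opt into hardware acceleration by hand via the Neural Networks API delegate; the metadata-driven approach Google describes would make that choice automatic. The sketch below shows the manual wiring in Kotlin, assuming the standard `org.tensorflow:tensorflow-lite` dependency and a model file named `model.tflite` bundled in the app's assets (both placeholders, not named in the article).

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Hypothetical helper: memory-map a .tflite model shipped in the app's assets.
fun loadModel(context: Context, assetName: String): MappedByteBuffer {
    context.assets.openFd(assetName).use { fd ->
        FileInputStream(fd.fileDescriptor).channel.use { channel ->
            return channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
        }
    }
}

// Manual acceleration today: explicitly attach the NNAPI delegate, which
// routes supported ops to the device's neural-network drivers. The Android
// ML Platform aims to make this selection automatic, driven by metadata
// embedded in the model itself.
fun createInterpreter(context: Context): Interpreter {
    val nnApiDelegate = NnApiDelegate()
    val options = Interpreter.Options().addDelegate(nnApiDelegate)
    return Interpreter(loadModel(context, "model.tflite"), options)
}
```

In a real app, the interpreter and delegate should be closed when inference is finished to release native resources.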
Oli Gaymond, Product Manager of Android ML at Google, writes:
“Besides keeping TensorFlow Lite for Android up to date via regular updates, we’re also going to be updating the Neural Networks API outside of OS releases while keeping the API specification the same across Android versions.”
“In addition we are working with chipset vendors to provide the latest drivers for their hardware directly to devices, outside of OS updates. This will let developers dramatically reduce testing from thousands of devices to a handful of configurations. We’re excited to announce that we’ll be launching later this year with Qualcomm as our first partner.”
The features of Android ML Platform will roll out “later this year”, but developers can sign up for the company’s early access programme online.
The image above shows Android Studio – Google’s IDE for Android development – and ML Model binding.