Firebase has really been at it with new features this time around, and perhaps the biggest among them is the new Firebase service, ML Kit. From the name, we can work out it's got machine learning shenanigans, but what does that mean for us? How can we use it? Why should we use it?
This is just an introduction, so there won't be any in-depth tutorials in this article. Over the following weeks, though, I'll be doing a mini-series on each of ML Kit's features; if you don't know what those are, keep reading.
So what does ML Kit actually do?
I find these short videos from Firebase quite handy at explaining things, but if you don't want to spend a whopping two minutes watching them, here's the gist.
ML Kit provides you with easy-to-use implementations of common ML features, namely text recognition, face detection, barcode scanning, image labelling, and landmark recognition. The models behind these features can run either on the device (fast, and works offline) or in the cloud (more accurate, but needs a network connection), so there are advantages to each.
As for the more experienced ML practitioners: if Firebase's built-in models don't suit your app's needs, you can upload your own TensorFlow Lite model and Firebase will take care of hosting and serving it to your users, making the overall process easier.
How does it work?
In the Firebase Console, the ML Kit section contains little more than references and info (unless you're using custom models). All the setup is done through code.
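For reference, the setup in your app-level build.gradle looks roughly like this (the artifact versions here are illustrative; check the Firebase docs for the current ones):

```groovy
dependencies {
    // ML Kit's vision APIs (text, face, barcode, labelling, landmarks)
    implementation 'com.google.firebase:firebase-ml-vision:16.0.0'

    // Only needed if you're using custom TensorFlow Lite models
    implementation 'com.google.firebase:firebase-ml-model-interpreter:16.0.0'
}
```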
Basically, you add the dependency, and you get access to classes like FirebaseVisionImage, which wraps a bitmap, and FirebaseVisionTextDetector, which runs text detection. You call a method like detector.detectInImage(image), attach a success (and failure) listener to it, and the success listener hands you a FirebaseVisionText object containing all the text detected in the image. Complex machine learning techniques made simple!
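To make that concrete, here's a minimal sketch of on-device text recognition in Kotlin. It assumes you already have a Bitmap of the image you want to scan; the API names match the current ML Kit beta:

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage
import com.google.firebase.ml.vision.text.FirebaseVisionText

fun recognizeText(bitmap: Bitmap) {
    // Wrap the bitmap so ML Kit can work with it
    val image = FirebaseVisionImage.fromBitmap(bitmap)

    // Grab the on-device text detector
    val detector = FirebaseVision.getInstance().visionTextDetector

    detector.detectInImage(image)
        .addOnSuccessListener { result: FirebaseVisionText ->
            // FirebaseVisionText groups the detected text into blocks
            for (block in result.blocks) {
                Log.d("MLKit", block.text)
            }
        }
        .addOnFailureListener { e ->
            Log.e("MLKit", "Text detection failed", e)
        }
}
```

Note that detectInImage runs asynchronously and returns a Task, which is why the results arrive through listeners rather than a return value.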
That was a very basic explanation of how ML Kit is used in one case, but I hope you get the gist of it. The other features should follow a very similar process.
Implementing Custom Models
As for how it works with your own custom TensorFlow Lite models: you upload your model to the console (or bundle it locally with your app), load it in your code, specify its input and output formats, and then run inference on your input data.
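Those steps can be sketched with the beta's custom-model API. The model name, input shape, and output shape below are all hypothetical; yours depend on the model you trained:

```kotlin
import com.google.firebase.ml.custom.FirebaseModelDataType
import com.google.firebase.ml.custom.FirebaseModelInputOutputOptions
import com.google.firebase.ml.custom.FirebaseModelInputs
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelOptions
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource
import com.google.firebase.ml.custom.model.FirebaseModelManager

fun runCustomModel(input: Array<Array<Array<FloatArray>>>) {
    // 1. Point ML Kit at the model you uploaded to the console
    //    ("my_model" is a hypothetical name)
    val cloudSource = FirebaseCloudModelSource.Builder("my_model")
        .enableModelUpdates(true)
        .build()
    FirebaseModelManager.getInstance().registerCloudModelSource(cloudSource)

    // 2. Load an interpreter for that model
    val options = FirebaseModelOptions.Builder()
        .setCloudModelName("my_model")
        .build()
    val interpreter = FirebaseModelInterpreter.getInstance(options)

    // 3. Describe the input and output tensors (shapes are hypothetical;
    //    this one assumes a 224x224 RGB image in and 1000 scores out)
    val ioOptions = FirebaseModelInputOutputOptions.Builder()
        .setInputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 224, 224, 3))
        .setOutputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 1000))
        .build()

    // 4. Run inference on the input data
    val inputs = FirebaseModelInputs.Builder().add(input).build()
    interpreter?.run(inputs, ioOptions)
        ?.addOnSuccessListener { result ->
            // output[0] holds the model's raw scores for this input
            val output = result.getOutput<Array<FloatArray>>(0)
        }
}
```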
By no means am I a machine learning expert, but I’ll do my best to present this in a way that would make your custom ML with Firebase as easy as it can be.
This is utterly impressive if I must say so myself. Firebase has always been about making the complex simple (John Sonmez would love this). With ML Kit, they’ve done it once again in a way that doesn’t fail to impress. What we are seeing is ML Kit in beta, so I’m really looking forward to seeing how it grows.
Like I said at the top, I'll be starting a mini-series on ML Kit's different features over the following weeks. Look forward to that!