Custom Vision, Xamarin.Forms and Simpsons - The Community AI Show

This week I had the honor of being the first guest on the Community AI Show with Henk Boelman, Cloud Advocate at Microsoft. In this first episode we look at how we can use Azure Custom Vision in a Xamarin.Forms app.

Custom Vision Simpsons Recognition #

Our demo app can recognize Simpsons-themed LEGO figures. The model is trained with Azure Custom Vision. From there, you can either call the available prediction REST API or export the model to CoreML, TensorFlow or ONNX.

These models can then be loaded into iOS, Android or UWP respectively. Because these models reside locally on your device, you do not need an internet connection and the results come back super fast!
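If you go for the REST route instead of exporting the model, a prediction is a single HTTP call. Here is a minimal sketch in C#; the endpoint URL, project id, iteration name and prediction key below are placeholders, so copy the exact prediction URL for your own project from the customvision.ai portal.

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class CustomVisionRestClient
{
    static readonly HttpClient Http = new HttpClient();

    // Placeholders: take the real prediction URL and key from the Custom Vision portal.
    const string PredictionUrl =
        "https://<region>.api.cognitive.microsoft.com/customvision/v3.0/" +
        "Prediction/<projectId>/classify/iterations/<iterationName>/image";
    const string PredictionKey = "<your-prediction-key>";

    public static async Task<string> ClassifyAsync(Stream imageStream)
    {
        // Send the raw image bytes to the prediction endpoint.
        var request = new HttpRequestMessage(HttpMethod.Post, PredictionUrl)
        {
            Content = new StreamContent(imageStream)
        };
        request.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        request.Headers.Add("Prediction-Key", PredictionKey);

        var response = await Http.SendAsync(request);
        response.EnsureSuccessStatusCode();

        // The JSON response contains a "predictions" array with tagName/probability pairs.
        return await response.Content.ReadAsStringAsync();
    }
}
```

Because this is plain HttpClient code, it lives happily in the shared Xamarin.Forms project with no platform-specific work at all.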

Watch Henk and me have some fun while building this solution in the video below. You will learn how to train your model, the overall structure of a Xamarin.Forms app and what routes you can take to use this platform-specific feature in a Forms app.
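One of those routes is the classic Xamarin.Forms DependencyService: define an interface in the shared project and implement it on each platform with the exported CoreML or TensorFlow model. A minimal sketch is below; the interface and class names are just illustrative, not the ones from the video.

```csharp
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Xamarin.Forms;

// Shared project: a platform-agnostic contract for the classifier.
public interface IImageClassifier
{
    // Returns tag -> probability for the supplied image.
    Task<IReadOnlyDictionary<string, double>> ClassifyAsync(Stream image);
}

// Shared project: consume the classifier without knowing which platform implements it.
public static class SimpsonsRecognizer
{
    public static async Task<string> GetBestGuessAsync(Stream image)
    {
        var classifier = DependencyService.Get<IImageClassifier>();
        var results = await classifier.ClassifyAsync(image);

        // Pick the tag with the highest probability.
        string bestTag = null;
        double bestScore = 0;
        foreach (var result in results)
        {
            if (result.Value > bestScore)
            {
                bestScore = result.Value;
                bestTag = result.Key;
            }
        }
        return bestTag;
    }
}

// iOS project (sketch): a CoreML-backed implementation would be registered like this.
// [assembly: Dependency(typeof(CoreMlImageClassifier))]
// public class CoreMlImageClassifier : IImageClassifier { ... }
```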

Spoiler alert: use this amazing plugin by Jim Bennett to make your life a lot easier.

All the source code for this app can be found on my GitHub page. If you want to get started with Custom Vision yourself, go and check out customvision.ai.

More Resources #

If you can’t get enough of all the AI goodness, I have also recorded a video course on Azure Cognitive Services and how to use them. You can find that here.

On my YouTube channel you can also find a session where I incorporate Cognitive Services into a Xamarin.Forms app. While you are there, don’t forget to like and subscribe!

In Closing #

Please let me know what you think of my appearance on the AI Show or of the other resources mentioned in this post. I’m also planning to write some more extensive posts about using the CoreML and TensorFlow models in your Xamarin app, so reach out if there are specific things you would like to see covered there.