Expert: Integration of Huawei ML Kit for Scene Detection in Xamarin (Android)

Manoj Kumar
10 min read · Jul 9, 2021

Overview

In this article, I will create a demo app that integrates ML Kit Scene Detection using the cross-platform technology Xamarin. The service classifies images by scenario so that intelligent album sets can be generated, and it lets the app select camera parameters based on the photographing scene so users can take better-looking photos.

Scene Detection Service Introduction

The ML Kit Scene Detection service classifies the scenario content of images and adds labels, such as outdoor scenery, indoor places, and buildings, which helps you understand the image content. Based on the detected information, you can create a more personalized app experience for users. Currently, on-device detection supports 102 scenarios.
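
For context, the same service can also be run on a single image. The snippet below is a minimal sketch of on-device detection for one bitmap; it reuses the analyzer and result types shown later in this article, while MLFrame.FromBitmap, the synchronous AnalyseFrame call, and the Result/Confidence properties are assumed to mirror the Java SDK and may differ slightly in the NuGet binding.

// Minimal sketch: classify one bitmap on-device (API names assumed from the Java SDK).
using Android.Graphics;
using Android.Util;
using Huawei.Hms.Mlsdk.Common;
using Huawei.Hms.Mlsdk.Scd;

public static class SceneDetectionSample
{
    public static void DetectScenes(Bitmap bitmap)
    {
        // Obtain the default on-device scene detection analyzer.
        MLSceneDetectionAnalyzer analyzer =
            MLSceneDetectionAnalyzerFactory.Instance.SceneDetectionAnalyzer;
        // Wrap the bitmap in an MLFrame and run synchronous analysis (assumed binding name).
        MLFrame frame = MLFrame.FromBitmap(bitmap);
        SparseArray results = analyzer.AnalyseFrame(frame);
        for (int i = 0; i < results.Size(); i++)
        {
            MLSceneDetection scene = (MLSceneDetection)results.ValueAt(i);
            // Result and Confidence are assumed from the Java SDK's getResult()/getConfidence().
            Log.Info("SceneDetection", "Scene: " + scene.Result + ", confidence: " + scene.Confidence);
        }
        // Release analyzer resources when done.
        analyzer.Stop();
    }
}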

Prerequisite

  1. Xamarin Framework
  2. Huawei phone
  3. Visual Studio 2019

App Gallery Integration process

  1. Sign in and create or choose a project on the AppGallery Connect portal.
  2. Navigate to Project settings and download the configuration file.
  3. Navigate to General Information, and then provide the Data Storage location.
  4. Navigate to Manage APIs and enable ML Kit.

Installing the Huawei ML NuGet package

  1. Navigate to Solution Explorer > Project > Right Click > Manage NuGet Packages.
  2. Install Huawei.Hms.MlComputerVisionScenedetection in reference.
  3. Install Huawei.Hms.MlComputerVisionScenedetectionInner in reference.
  4. Install Huawei.Hms.MlComputerVisionScenedetectionModel in reference.

Xamarin App Development

  1. Open Visual Studio 2019 and create a new project.
  2. Configure the Manifest file and add the following permissions and tags.
<uses-feature android:name="android.hardware.camera" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />

  3. Create the Activity class with its XML UI.

GraphicOverlay.cs

This class handles scaling and mirroring of the graphics relative to the camera's preview properties.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Android.App;
using Android.Content;
using Android.Graphics;
using Android.OS;
using Android.Runtime;
using Android.Util;
using Android.Views;
using Android.Widget;
using Huawei.Hms.Mlsdk.Common;
namespace SceneDetectionDemo
{
public class GraphicOverlay : View
{
private readonly object mLock = new object();
public int mPreviewWidth;
public float mWidthScaleFactor = 1.0f;
public int mPreviewHeight;
public float mHeightScaleFactor = 1.0f;
public int mFacing = LensEngine.BackLens;
private HashSet<Graphic> mGraphics = new HashSet<Graphic>();
public GraphicOverlay(Context context, IAttributeSet attrs) : base(context,attrs)
{

}
/// <summary>
/// Removes all graphics from the overlay.
/// </summary>
public void Clear()
{
lock(mLock) {
mGraphics.Clear();
}
PostInvalidate();
}
/// <summary>
/// Adds a graphic to the overlay.
/// </summary>
public void Add(Graphic graphic)
{
lock(mLock) {
mGraphics.Add(graphic);
}
PostInvalidate();
}
/// <summary>
/// Removes a graphic from the overlay.
/// </summary>
public void Remove(Graphic graphic)
{
lock(mLock)
{
mGraphics.Remove(graphic);
}
PostInvalidate();
}
/// <summary>
/// Sets the camera attributes for size and facing direction, which informs how to transform image coordinates later.
/// </summary>
public void SetCameraInfo(int previewWidth, int previewHeight, int facing)
{
lock(mLock) {
mPreviewWidth = previewWidth;
mPreviewHeight = previewHeight;
mFacing = facing;
}
PostInvalidate();
}
/// <summary>
/// Draws the overlay with its associated graphic objects.
/// </summary>
protected override void OnDraw(Canvas canvas)
{
base.OnDraw(canvas);
lock (mLock)
{
if ((mPreviewWidth != 0) && (mPreviewHeight != 0))
{
mWidthScaleFactor = (float)canvas.Width / (float)mPreviewWidth;
mHeightScaleFactor = (float)canvas.Height / (float)mPreviewHeight;
}
foreach (Graphic graphic in mGraphics)
{
graphic.Draw(canvas);
}
}
}

}
/// <summary>
/// Base class for a custom graphics object to be rendered within the graphic overlay. Subclass
/// this and implement the {Graphic#Draw(Canvas)} method to define the
/// graphics element. Add instances to the overlay using {GraphicOverlay#Add(Graphic)}.
/// </summary>
public abstract class Graphic
{
private GraphicOverlay mOverlay;
public Graphic(GraphicOverlay overlay)
{
mOverlay = overlay;
}
/// <summary>
/// Draw the graphic on the supplied canvas. Drawing should use the following methods to
/// convert to view coordinates for the graphics that are drawn:
/// <ol>
/// <li>{Graphic#ScaleX(float)} and {Graphic#ScaleY(float)} adjust the size of
/// the supplied value from the preview scale to the view scale.</li>
/// <li>{Graphic#TranslateX(float)} and {Graphic#TranslateY(float)} adjust the
/// coordinate from the preview's coordinate system to the view coordinate system.</li>
/// </ol>
/// </summary>
/// <param name="canvas">The drawing canvas.</param>
public abstract void Draw(Canvas canvas);
/// <summary>
/// Adjusts a horizontal value of the supplied value from the preview scale to the view
/// scale.
/// </summary>
public float ScaleX(float horizontal)
{
return horizontal * mOverlay.mWidthScaleFactor;
}
public float UnScaleX(float horizontal)
{
return horizontal / mOverlay.mWidthScaleFactor;
}
/// <summary>
/// Adjusts a vertical value of the supplied value from the preview scale to the view scale.
/// </summary>
public float ScaleY(float vertical)
{
return vertical * mOverlay.mHeightScaleFactor;
}
public float UnScaleY(float vertical)
{
return vertical / mOverlay.mHeightScaleFactor;
}
/// <summary>
/// Adjusts the x coordinate from the preview's coordinate system to the view coordinate system.
/// </summary>
public float TranslateX(float x)
{
if (mOverlay.mFacing == LensEngine.FrontLens)
{
return mOverlay.Width - ScaleX(x);
}
else
{
return ScaleX(x);
}
}
/// <summary>
/// Adjusts the y coordinate from the preview's coordinate system to the view coordinate system.
/// </summary>
public float TranslateY(float y)
{
return ScaleY(y);
}
public void PostInvalidate()
{
this.mOverlay.PostInvalidate();
}
}
}
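
SceneDetectionActivity (shown later) renders its results through an MLSceneDetectionGraphic, a Graphic subclass that is not listed in this article. Below is a minimal sketch of such a class, assuming the MLSceneDetection binding exposes Result and Confidence properties as in the Java SDK.

using System.Collections.Generic;
using Android.Graphics;
using Huawei.Hms.Mlsdk.Scd;

namespace SceneDetectionDemo
{
    public class MLSceneDetectionGraphic : Graphic
    {
        private readonly IList<MLSceneDetection> results;
        private readonly Paint textPaint;

        public MLSceneDetectionGraphic(GraphicOverlay overlay, IList<MLSceneDetection> results) : base(overlay)
        {
            this.results = results;
            this.textPaint = new Paint { Color = Color.White, TextSize = 48f };
        }

        public override void Draw(Canvas canvas)
        {
            // Draw each detected scene label with its confidence, stacked down the overlay.
            // Result and Confidence are assumed from the Java SDK's getResult()/getConfidence().
            float y = 100f;
            foreach (MLSceneDetection scene in results)
            {
                canvas.DrawText(scene.Result + " : " + scene.Confidence, 100f, y, textPaint);
                y += 60f;
            }
        }
    }
}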

LensEnginePreview.cs

This class hosts the camera preview surface and runs the LensEngine that feeds camera frames to the detection analyzer.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Android.App;
using Android.Content;
using Android.Graphics;
using Android.OS;
using Android.Runtime;
using Android.Util;
using Android.Views;
using Android.Widget;
using Huawei.Hms.Mlsdk.Common;
namespace HmsXamarinMLDemo.Camera
{
public class LensEnginePreview :ViewGroup
{
private const string Tag = "LensEnginePreview";
private Context mContext;
protected SurfaceView mSurfaceView;
private bool mStartRequested;
private bool mSurfaceAvailable;
private LensEngine mLensEngine;
private GraphicOverlay mOverlay;
public LensEnginePreview(Context context, IAttributeSet attrs) : base(context,attrs)
{
this.mContext = context;
this.mStartRequested = false;
this.mSurfaceAvailable = false;
this.mSurfaceView = new SurfaceView(context);
this.mSurfaceView.Holder.AddCallback(new SurfaceCallback(this));
this.AddView(this.mSurfaceView);
}
public void start(LensEngine lensEngine)
{
if (lensEngine == null)
{
this.stop();
}
this.mLensEngine = lensEngine;
if (this.mLensEngine != null)
{
this.mStartRequested = true;
this.startIfReady();
}
}
public void start(LensEngine lensEngine, GraphicOverlay overlay)
{
this.mOverlay = overlay;
this.start(lensEngine);
}
public void stop()
{
if (this.mLensEngine != null)
{
this.mLensEngine.Close();
}
}
public void release()
{
if (this.mLensEngine != null)
{
this.mLensEngine.Release();
this.mLensEngine = null;
}
}
private void startIfReady()
{
if (this.mStartRequested && this.mSurfaceAvailable) {
this.mLensEngine.Run(this.mSurfaceView.Holder);
if (this.mOverlay != null)
{
// The demo fixes the overlay's preview size to 640 x 480 instead of using the engine's display dimension.
Huawei.Hms.Common.Size.Size size = this.mLensEngine.DisplayDimension;
int min = Math.Min(640, 480);
int max = Math.Max(640, 480);
if (this.isPortraitMode())
{
// Swap width and height sizes when in portrait, since it will be rotated by 90 degrees.
this.mOverlay.SetCameraInfo(min, max, this.mLensEngine.LensType);
}
else
{
this.mOverlay.SetCameraInfo(max, min, this.mLensEngine.LensType);
}
this.mOverlay.Clear();
}
this.mStartRequested = false;
}
}
private class SurfaceCallback : Java.Lang.Object, ISurfaceHolderCallback
{
private LensEnginePreview lensEnginePreview;
public SurfaceCallback(LensEnginePreview LensEnginePreview)
{
this.lensEnginePreview = LensEnginePreview;
}
public void SurfaceChanged(ISurfaceHolder holder, [GeneratedEnum] Format format, int width, int height)
{

}
public void SurfaceCreated(ISurfaceHolder holder)
{
this.lensEnginePreview.mSurfaceAvailable = true;
try
{
this.lensEnginePreview.startIfReady();
}
catch (Exception e)
{
Log.Info(LensEnginePreview.Tag, "Could not start camera source.", e);
}
}
public void SurfaceDestroyed(ISurfaceHolder holder)
{
this.lensEnginePreview.mSurfaceAvailable = false;
}
}
protected override void OnLayout(bool changed, int l, int t, int r, int b)
{
int previewWidth = 480;
int previewHeight = 360;
if (this.mLensEngine != null)
{
Huawei.Hms.Common.Size.Size size = this.mLensEngine.DisplayDimension;
if (size != null)
{
previewWidth = 640;
previewHeight = 480;
}
}
// Swap width and height sizes when in portrait, since it will be rotated 90 degrees
if (this.isPortraitMode())
{
int tmp = previewWidth;
previewWidth = previewHeight;
previewHeight = tmp;
}
int viewWidth = r - l;
int viewHeight = b - t;
int childWidth;
int childHeight;
int childXOffset = 0;
int childYOffset = 0;
float widthRatio = (float)viewWidth / (float)previewWidth;
float heightRatio = (float)viewHeight / (float)previewHeight;
// To fill the view with the camera preview, while also preserving the correct aspect ratio,
// it is usually necessary to slightly oversize the child and to crop off portions along one
// of the dimensions. We scale up based on the dimension requiring the most correction, and
// compute a crop offset for the other dimension.
if (widthRatio > heightRatio)
{
childWidth = viewWidth;
childHeight = (int)((float)previewHeight * widthRatio);
childYOffset = (childHeight - viewHeight) / 2;
}
else
{
childWidth = (int)((float)previewWidth * heightRatio);
childHeight = viewHeight;
childXOffset = (childWidth - viewWidth) / 2;
}
for (int i = 0; i < this.ChildCount; ++i)
{
// One dimension will be cropped. We shift child over or up by this offset and adjust
// the size to maintain the proper aspect ratio.
this.GetChildAt(i).Layout(-1 * childXOffset, -1 * childYOffset, childWidth - childXOffset,
childHeight - childYOffset);
}
try
{
this.startIfReady();
}
catch (Exception e)
{
Log.Info(LensEnginePreview.Tag, "Could not start camera source.", e);
}
}
private bool isPortraitMode()
{
// The demo always treats the preview as portrait; query the display orientation here if landscape must be supported.
return true;
}
}


}

activity_scene_detection.xml

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#000"
android:fitsSystemWindows="true"
android:keepScreenOn="true"
android:orientation="vertical">
<ToggleButton
android:id="@+id/facingSwitch"
android:layout_width="65dp"
android:layout_height="65dp"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:layout_marginBottom="5dp"
android:background="@drawable/facingswitch_stroke"
android:textOff=""
android:textOn="" />
<com.huawei.mlkit.sample.camera.LensEnginePreview
android:id="@+id/preview"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentStart="true"
android:layout_alignParentTop="true">
<com.huawei.mlkit.sample.views.overlay.GraphicOverlay
android:id="@+id/overlay"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="20dp"
android:layout_marginEnd="20dp" />
</com.huawei.mlkit.sample.camera.LensEnginePreview>
<RelativeLayout
android:id="@+id/rl_select_album_result"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#000"
android:visibility="gone">
<ImageView
android:id="@+id/iv_result"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_alignParentRight="true" />
<TextView
android:id="@+id/result"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:layout_marginBottom="100dp"
android:textColor="@color/upsdk_white" />
</RelativeLayout>
<ImageView
android:id="@+id/iv_select_album"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentRight="true"
android:layout_marginTop="20dp"
android:layout_marginEnd="20dp"
android:src="@drawable/select_album" />
<ImageView
android:id="@+id/iv_return_back"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="20dp"
android:layout_marginTop="20dp"
android:src="@drawable/return_back" />
<ImageView
android:id="@+id/iv_left_top"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@id/iv_return_back"
android:layout_marginStart="20dp"
android:layout_marginTop="20dp"
android:src="@drawable/left_top_arrow" />
<ImageView
android:id="@+id/iv_right_top"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@id/iv_select_album"
android:layout_alignParentRight="true"
android:layout_marginTop="23dp"
android:layout_marginEnd="20dp"
android:src="@drawable/right_top_arrow" />
<ImageView
android:id="@+id/iv_left_bottom"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_marginStart="20dp"
android:layout_marginBottom="70dp"
android:src="@drawable/left_bottom_arrow" />
<ImageView
android:id="@+id/iv_right_bottom"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentRight="true"
android:layout_alignParentBottom="true"
android:layout_marginEnd="20dp"
android:layout_marginBottom="70dp"
android:src="@drawable/right_bottom_arrow" />
</RelativeLayout>

SceneDetectionActivity.cs

This activity performs all the operations for live scene detection.

using Android.App;
using Android.Content;
using Android.OS;
using Android.Runtime;
using Android.Support.V7.App;
using Android.Views;
using Android.Widget;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Huawei.Hms.Mlsdk.Common;
using Huawei.Hms.Mlsdk.Scd;
using HmsXamarinMLDemo.Camera;
using Android.Support.V4.App;
using Android;
using Android.Content.PM;
using Android.Util;
namespace SceneDetectionDemo
{
[Activity(Label = "SceneDetectionActivity")]
public class SceneDetectionActivity : AppCompatActivity, View.IOnClickListener, MLAnalyzer.IMLTransactor
{
private const string Tag = "SceneDetectionLiveAnalyseActivity";
private const int CameraPermissionCode = 0;
private MLSceneDetectionAnalyzer analyzer;
private LensEngine mLensEngine;
private LensEnginePreview mPreview;
private GraphicOverlay mOverlay;
private int lensType = LensEngine.FrontLens;
private bool isFront = true;
protected override void OnCreate(Bundle savedInstanceState)
{
base.OnCreate(savedInstanceState);
// The layout name and view IDs must match activity_scene_detection.xml shown above.
this.SetContentView(Resource.Layout.activity_scene_detection);
this.mPreview = (LensEnginePreview)this.FindViewById(Resource.Id.preview);
this.mOverlay = (GraphicOverlay)this.FindViewById(Resource.Id.overlay);
this.FindViewById(Resource.Id.facingSwitch).SetOnClickListener(this);
if (savedInstanceState != null)
{
this.lensType = savedInstanceState.GetInt("lensType");
}
this.CreateSegmentAnalyzer();
// Checking Camera Permissions
if (ActivityCompat.CheckSelfPermission(this, Manifest.Permission.Camera) == Android.Content.PM.Permission.Granted)
{
this.CreateLensEngine();
}
else
{
this.RequestCameraPermission();
}
}
private void CreateLensEngine()
{
Context context = this.ApplicationContext;
// Create LensEngine.
this.mLensEngine = new LensEngine.Creator(context, this.analyzer).SetLensType(this.lensType)
.ApplyDisplayDimension(960, 720)
.ApplyFps(25.0f)
.EnableAutomaticFocus(true)
.Create();
}
public override void OnRequestPermissionsResult(int requestCode, string[] permissions, [GeneratedEnum] Permission[] grantResults)
{
if (requestCode != CameraPermissionCode)
{
base.OnRequestPermissionsResult(requestCode, permissions, grantResults);
return;
}
if (grantResults.Length != 0 && grantResults[0] == Permission.Granted)
{
this.CreateLensEngine();
return;
}
}
protected override void OnSaveInstanceState(Bundle outState)
{
outState.PutInt("lensType", this.lensType);
base.OnSaveInstanceState(outState);
}
protected override void OnResume()
{
base.OnResume();
if (ActivityCompat.CheckSelfPermission(this, Manifest.Permission.Camera) == Permission.Granted)
{
this.CreateLensEngine();
this.StartLensEngine();
}
else
{
this.RequestCameraPermission();
}
}
public void OnClick(View v)
{
this.isFront = !this.isFront;
if (this.isFront)
{
this.lensType = LensEngine.FrontLens;
}
else
{
this.lensType = LensEngine.BackLens;
}
if (this.mLensEngine != null)
{
this.mLensEngine.Close();
}
this.CreateLensEngine();
this.StartLensEngine();
}
private void StartLensEngine()
{
if (this.mLensEngine != null)
{
try
{
this.mPreview.start(this.mLensEngine, this.mOverlay);
}
catch (Exception e)
{
Log.Error(Tag, "Failed to start lens engine.", e);
this.mLensEngine.Release();
this.mLensEngine = null;
}
}
}
private void CreateSegmentAnalyzer()
{
this.analyzer = MLSceneDetectionAnalyzerFactory.Instance.SceneDetectionAnalyzer;
this.analyzer.SetTransactor(this);
}
protected override void OnPause()
{
base.OnPause();
this.mPreview.stop();
}
protected override void OnDestroy()
{
base.OnDestroy();
if (this.mLensEngine != null)
{
this.mLensEngine.Release();
}
if (this.analyzer != null)
{
this.analyzer.Stop();
}
}
//Request permission
private void RequestCameraPermission()
{
string[] permissions = new string[] { Manifest.Permission.Camera };
if (!ActivityCompat.ShouldShowRequestPermissionRationale(this, Manifest.Permission.Camera))
{
ActivityCompat.RequestPermissions(this, permissions, CameraPermissionCode);
return;
}
}
/// <summary>
/// Implemented from MLAnalyzer.IMLTransactor interface
/// </summary>
public void Destroy()
{
// Nothing to release here; the analyzer itself is stopped in OnDestroy.
}
/// <summary>
/// Implemented from MLAnalyzer.IMLTransactor interface.
/// Process the results returned by the analyzer.
/// </summary>
public void TransactResult(MLAnalyzer.Result result)
{
mOverlay.Clear();
SparseArray sceneDetectionResults = result.AnalyseList;
IList<MLSceneDetection> list = new List<MLSceneDetection>();
for (int i = 0; i < sceneDetectionResults.Size(); i++)
{
list.Add((MLSceneDetection)sceneDetectionResults.ValueAt(i));
}
MLSceneDetectionGraphic sceneDetectionGraphic = new MLSceneDetectionGraphic(mOverlay, list);
mOverlay.Add(sceneDetectionGraphic);
mOverlay.PostInvalidate();
}
}
}
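
To open this screen from elsewhere in the app, a plain Intent is enough. The sketch below assumes a hypothetical main layout (activity_main) with a single button (btn_scene_detection); adjust the resource names to your own project.

using Android.App;
using Android.Content;
using Android.OS;
using Android.Support.V7.App;
using Android.Widget;

namespace SceneDetectionDemo
{
    [Activity(Label = "MainActivity", MainLauncher = true)]
    public class MainActivity : AppCompatActivity
    {
        protected override void OnCreate(Bundle savedInstanceState)
        {
            base.OnCreate(savedInstanceState);
            SetContentView(Resource.Layout.activity_main);                        // hypothetical layout
            Button open = FindViewById<Button>(Resource.Id.btn_scene_detection);  // hypothetical button id
            // Launch the live scene detection screen on tap.
            open.Click += (sender, e) => StartActivity(new Intent(this, typeof(SceneDetectionActivity)));
        }
    }
}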

Xamarin App Build Result

  1. Navigate to Build > Build Solution.
  2. Navigate to Solution Explorer > Project > Right Click > Archive/View Archive to generate the SHA-256 for the release build, and click Distribute.
  3. Choose Archive > Distribute.
  4. Choose Distribution Channel > Ad Hoc to sign the APK.
  5. Choose the demo keystore to release the APK.
  6. When the build succeeds, click Save.
  7. Result.

Tips and Tricks

The minimum supported image resolution is 224 x 224, and the maximum is 4096 x 4960.
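
If your input images can exceed this range, you may want to scale them down before analysis; a minimal sketch under that assumption:

using Android.Graphics;

public static class ImageSizeHelper
{
    // Scale a bitmap down so it stays within the maximum supported resolution (4096 x 4960).
    public static Bitmap EnsureSupportedSize(Bitmap source)
    {
        const int MaxWidth = 4096;
        const int MaxHeight = 4960;
        if (source.Width <= MaxWidth && source.Height <= MaxHeight)
        {
            return source;
        }
        float scale = System.Math.Min((float)MaxWidth / source.Width, (float)MaxHeight / source.Height);
        int newWidth = (int)(source.Width * scale);
        int newHeight = (int)(source.Height * scale);
        return Bitmap.CreateScaledBitmap(source, newWidth, newHeight, true);
    }
}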

You can set a confidence threshold for scene detection. Call the synchronous or asynchronous API to obtain the result set; results whose confidence is lower than the threshold are filtered out.
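
For example, a threshold can be supplied when the analyzer is created. The sketch below assumes the NuGet binding mirrors the Java SDK's MLSceneDetectionAnalyzerSetting.Factory (setConfidence/create) and the factory's getSceneDetectionAnalyzer(setting) overload.

// Sketch: create an analyzer that drops results below 0.6 confidence (binding names assumed).
// Requires: using Huawei.Hms.Mlsdk.Scd;
MLSceneDetectionAnalyzerSetting setting = new MLSceneDetectionAnalyzerSetting.Factory()
        .SetConfidence(0.6f)
        .Create();
MLSceneDetectionAnalyzer analyzer =
        MLSceneDetectionAnalyzerFactory.Instance.GetSceneDetectionAnalyzer(setting);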

Conclusion

In this article, we have learned how to integrate ML Kit Scene Detection into a Xamarin-based Android application. Users can detect indoor and outdoor places and objects in real time with the help of the Scene Detection API.

Thanks for reading this article. Please like and comment if you found it helpful; it means a lot to me.

References

HMS Core ML Scene Detection Docs: https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides-V5/scene-detection-0000001055162807-V5
