In this article, I am going to show how to implement the Face API using C# with the help of Microsoft Azure and Visual Studio. This article also shows how to create a WPF application that implements the Face API. The application detects the faces in an image, draws a red frame around each face, and shows a description of each face in the status bar when the cursor is moved over its frame.
Prerequisites
- An active Azure Subscription.
- Visual Studio 2015 or higher.
The flow of this article is as follows.
- Creating the Face API in Azure Portal.
- Accessing & Managing the Face API Keys in WPF Applications.
- Creating the WPF Application using the Face API in Visual Studio 2017.
Follow the steps to create the Face API on Azure portal.
Step 1
Sign into Azure portal.
Step 2
Press "+New", then choose "AI + Cognitive Services", and select "Face API".
Step 3
In the "Create Face API" blade, enter the name of your API, choose the desired subscription, and select the desired location for the API. Select the appropriate pricing tier required for our use. Create the resource group for the API. Then, press "Create".
Copy the endpoint and press "Show/Manage Keys" to display the keys that will be used in the WPF application.
Step 4
The "Manage Keys" blade shows the keys that serve as the access keys for our Face API. Click the "Copy" button and paste the key into Notepad.
Implementing Face API in Visual Studio using C#
Step 5
Follow the steps to create a Windows-based WPF application using Visual Studio 2017.
- Open Visual Studio 2015 or 2017.
- From the menu bar, click File >> New >> New Project.
- In Visual Studio, choose Installed --> Templates --> Visual C# --> Windows Classic Desktop --> WPF App (.NET Framework).
- Enter the name of our application as FaceApiDemo and press OK. The project opens in Solution Explorer.
Step 6
We need to add two packages to our WPF application. Right-click the project in Solution Explorer and select "Manage NuGet Packages" to add them.
Json.NET is a popular JSON framework for .NET applications. We will use it because the Face API returns its results as JSON documents describing the detected faces.
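For instance, each face detected by the service comes back as a JSON document roughly like the following. The field names match the Face API v1.0 detect response; the values here are purely illustrative.

```json
[
  {
    "faceId": "c5c24a82-6845-4031-9d5d-978df9175426",
    "faceRectangle": { "top": 131, "left": 177, "width": 162, "height": 162 },
    "faceAttributes": {
      "gender": "male",
      "age": 27.0,
      "smile": 0.75,
      "glasses": "NoGlasses",
      "emotion": {
        "anger": 0.0, "contempt": 0.0, "disgust": 0.0, "fear": 0.0,
        "happiness": 0.75, "neutral": 0.25, "sadness": 0.0, "surprise": 0.0
      },
      "hair": {
        "bald": 0.1,
        "invisible": false,
        "hairColor": [ { "color": "brown", "confidence": 0.99 } ]
      }
    }
  }
]
```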
In the NuGet package manager, browse for Newtonsoft.Json and press "Install".
Also, browse for Microsoft.ProjectOxford.Face and press "Install". This package connects our application to the Face API over HTTPS: it provides a .NET client library that encapsulates the Face API REST requests.
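If you prefer, the same two packages can be installed from the Package Manager Console (Tools >> NuGet Package Manager >> Package Manager Console) instead of the browse dialog:

```powershell
Install-Package Newtonsoft.Json
Install-Package Microsoft.ProjectOxford.Face
```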
Step 7
Copy and paste the following code into MainWindow.xaml. This is the layout code for our Windows application.
```xml
<Window x:Class="FaceApiDemo.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="MainWindow" Height="700" Width="960">
    <Grid x:Name="BackPanel">
        <Image x:Name="FacePhoto" Stretch="Uniform" Margin="0,0,0,50" MouseMove="FacePhoto_MouseMove" />
        <DockPanel DockPanel.Dock="Bottom">
            <Button x:Name="BrowseButton" Width="72" Height="20" VerticalAlignment="Bottom" HorizontalAlignment="Left"
                    Content="Browse..."
                    Click="BrowseButton_Click" />
            <StatusBar VerticalAlignment="Bottom">
                <StatusBarItem>
                    <TextBlock Name="faceDescriptionStatusBar" />
                </StatusBarItem>
            </StatusBar>
        </DockPanel>
    </Grid>
</Window>
```
Step 8
Open MainWindow.xaml.cs and replace its contents with the following code.
```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using System.Windows;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using Microsoft.ProjectOxford.Common.Contract;
using Microsoft.ProjectOxford.Face;
using Microsoft.ProjectOxford.Face.Contract;

namespace FaceApiDemo
{
    public partial class MainWindow : Window
    {
        // Replace "keys" and "Endpoints" with the key and endpoint URL
        // copied from the Azure portal.
        private readonly IFaceServiceClient faceServiceClient =
            new FaceServiceClient("keys", "Endpoints");

        Face[] faces;              // The list of detected faces.
        String[] faceDescriptions; // The descriptions for the detected faces.
        double resizeFactor;       // The resize factor for the displayed image.

        public MainWindow()
        {
            InitializeComponent();
        }

        // Displays the selected image and calls the face detection method.
        private async void BrowseButton_Click(object sender, RoutedEventArgs e)
        {
            // Get the image file to scan from the user.
            var openDlg = new Microsoft.Win32.OpenFileDialog();

            openDlg.Filter = "JPEG Image(*.jpg)|*.jpg";
            bool? result = openDlg.ShowDialog(this);

            // Return if the dialog was canceled.
            if (!(bool)result)
            {
                return;
            }

            // Display the image file.
            string filePath = openDlg.FileName;

            Uri fileUri = new Uri(filePath);
            BitmapImage bitmapSource = new BitmapImage();

            bitmapSource.BeginInit();
            bitmapSource.CacheOption = BitmapCacheOption.None;
            bitmapSource.UriSource = fileUri;
            bitmapSource.EndInit();

            FacePhoto.Source = bitmapSource;

            // Detect any faces in the image.
            Title = "Detecting...";
            faces = await UploadAndDetectFaces(filePath);
            Title = String.Format("Detection Finished. {0} face(s) detected", faces.Length);

            if (faces.Length > 0)
            {
                // Prepare to draw rectangles around the faces.
                DrawingVisual visual = new DrawingVisual();
                DrawingContext drawingContext = visual.RenderOpen();
                drawingContext.DrawImage(bitmapSource,
                    new Rect(0, 0, bitmapSource.Width, bitmapSource.Height));
                double dpi = bitmapSource.DpiX;
                resizeFactor = 96 / dpi;
                faceDescriptions = new String[faces.Length];

                for (int i = 0; i < faces.Length; ++i)
                {
                    Face face = faces[i];

                    // Draw a red rectangle on the face.
                    drawingContext.DrawRectangle(
                        Brushes.Transparent,
                        new Pen(Brushes.Red, 2),
                        new Rect(
                            face.FaceRectangle.Left * resizeFactor,
                            face.FaceRectangle.Top * resizeFactor,
                            face.FaceRectangle.Width * resizeFactor,
                            face.FaceRectangle.Height * resizeFactor
                            )
                        );

                    // Store the face description.
                    faceDescriptions[i] = FaceDescription(face);
                }

                drawingContext.Close();

                // Display the image with the rectangles around the faces.
                RenderTargetBitmap faceWithRectBitmap = new RenderTargetBitmap(
                    (int)(bitmapSource.PixelWidth * resizeFactor),
                    (int)(bitmapSource.PixelHeight * resizeFactor),
                    96,
                    96,
                    PixelFormats.Pbgra32);

                faceWithRectBitmap.Render(visual);
                FacePhoto.Source = faceWithRectBitmap;

                // Set the status bar text.
                faceDescriptionStatusBar.Text = "Place the mouse pointer over a face to see the face description.";
            }
        }

        // Displays the face description when the mouse is over a face rectangle.
        private void FacePhoto_MouseMove(object sender, MouseEventArgs e)
        {
            // If the REST call has not completed, return from this method.
            if (faces == null)
                return;

            // Find the mouse position relative to the image.
            Point mouseXY = e.GetPosition(FacePhoto);

            ImageSource imageSource = FacePhoto.Source;
            BitmapSource bitmapSource = (BitmapSource)imageSource;

            // Scale adjustment between the actual size and the displayed size.
            var scale = FacePhoto.ActualWidth / (bitmapSource.PixelWidth / resizeFactor);

            // Check if the mouse position is over a face rectangle.
            bool mouseOverFace = false;

            for (int i = 0; i < faces.Length; ++i)
            {
                FaceRectangle fr = faces[i].FaceRectangle;
                double left = fr.Left * scale;
                double top = fr.Top * scale;
                double width = fr.Width * scale;
                double height = fr.Height * scale;

                // Display the description of this face if the mouse is over its rectangle.
                if (mouseXY.X >= left && mouseXY.X <= left + width && mouseXY.Y >= top && mouseXY.Y <= top + height)
                {
                    faceDescriptionStatusBar.Text = faceDescriptions[i];
                    mouseOverFace = true;
                    break;
                }
            }

            // Restore the hint text if the mouse is not over any face rectangle.
            if (!mouseOverFace)
                faceDescriptionStatusBar.Text = "Place the mouse pointer over a face to see the face description.";
        }

        // Uploads the image file and calls the Face API detect operation.
        private async Task<Face[]> UploadAndDetectFaces(string imageFilePath)
        {
            // The list of face attributes to return.
            IEnumerable<FaceAttributeType> faceAttributes =
                new FaceAttributeType[] { FaceAttributeType.Gender, FaceAttributeType.Age, FaceAttributeType.Smile, FaceAttributeType.Emotion, FaceAttributeType.Glasses, FaceAttributeType.Hair };

            // Call the Face API.
            try
            {
                using (Stream imageFileStream = File.OpenRead(imageFilePath))
                {
                    Face[] faces = await faceServiceClient.DetectAsync(imageFileStream, returnFaceId: true, returnFaceLandmarks: false, returnFaceAttributes: faceAttributes);
                    return faces;
                }
            }
            // Catch and display Face API errors.
            catch (FaceAPIException f)
            {
                MessageBox.Show(f.ErrorMessage, f.ErrorCode);
                return new Face[0];
            }
            // Catch and display all other errors.
            catch (Exception e)
            {
                MessageBox.Show(e.Message, "Error");
                return new Face[0];
            }
        }

        // Returns a string that describes the given face.
        private string FaceDescription(Face face)
        {
            StringBuilder sb = new StringBuilder();

            sb.Append("Face: ");

            // Add the gender, age, and smile.
            sb.Append(face.FaceAttributes.Gender);
            sb.Append(", ");
            sb.Append(face.FaceAttributes.Age);
            sb.Append(", ");
            sb.Append(String.Format("smile {0:F1}%, ", face.FaceAttributes.Smile * 100));

            // Add the emotions. Display all emotions over 10%.
            sb.Append("Emotion: ");
            EmotionScores emotionScores = face.FaceAttributes.Emotion;
            if (emotionScores.Anger >= 0.1f) sb.Append(String.Format("anger {0:F1}%, ", emotionScores.Anger * 100));
            if (emotionScores.Contempt >= 0.1f) sb.Append(String.Format("contempt {0:F1}%, ", emotionScores.Contempt * 100));
            if (emotionScores.Disgust >= 0.1f) sb.Append(String.Format("disgust {0:F1}%, ", emotionScores.Disgust * 100));
            if (emotionScores.Fear >= 0.1f) sb.Append(String.Format("fear {0:F1}%, ", emotionScores.Fear * 100));
            if (emotionScores.Happiness >= 0.1f) sb.Append(String.Format("happiness {0:F1}%, ", emotionScores.Happiness * 100));
            if (emotionScores.Neutral >= 0.1f) sb.Append(String.Format("neutral {0:F1}%, ", emotionScores.Neutral * 100));
            if (emotionScores.Sadness >= 0.1f) sb.Append(String.Format("sadness {0:F1}%, ", emotionScores.Sadness * 100));
            if (emotionScores.Surprise >= 0.1f) sb.Append(String.Format("surprise {0:F1}%, ", emotionScores.Surprise * 100));

            // Add glasses.
            sb.Append(face.FaceAttributes.Glasses);
            sb.Append(", ");

            // Add hair.
            sb.Append("Hair: ");

            // Display the baldness confidence if it is over 1%.
            if (face.FaceAttributes.Hair.Bald >= 0.01f)
                sb.Append(String.Format("bald {0:F1}% ", face.FaceAttributes.Hair.Bald * 100));

            // Display all hair colors with confidence over 10%.
            HairColor[] hairColors = face.FaceAttributes.Hair.HairColor;
            foreach (HairColor hairColor in hairColors)
            {
                if (hairColor.Confidence >= 0.1f)
                {
                    sb.Append(hairColor.Color.ToString());
                    sb.Append(String.Format(" {0:F1}% ", hairColor.Confidence * 100));
                }
            }

            // Return the built string.
            return sb.ToString();
        }
    }
}
```
In the faceServiceClient declaration near the top of the class, enter the key and the endpoint URL that we acquired from the Azure Face API.
```csharp
private readonly IFaceServiceClient faceServiceClient =
    new FaceServiceClient("_key_", "End-point URL");
```
Replace the key and endpoint with the values from the Face API resource on the Azure portal.
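As a sketch, if the Face API resource was created in the West US region, the declaration would look something like the following. The key shown is a placeholder, and the region segment of the URL (here, westus) must match your own resource's endpoint.

```csharp
// Placeholder values: substitute your own key and your resource's region.
private readonly IFaceServiceClient faceServiceClient =
    new FaceServiceClient(
        "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
        "https://westus.api.cognitive.microsoft.com/face/v1.0");
```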
Step 9
Save and start the program. Then, click the "Browse" button to import an image in which to detect faces.
Step 10
Wait a few seconds while the cloud API processes the image; the detected faces are then marked with red rectangles. Moving the cursor over a face rectangle shows the description of that face in the status bar.
Conclusion
I hope you understood how to create an application using Face API on the Azure portal and implement the Face API using C# in Visual Studio.