Emotion Identification Using Emotions API In Android App - Part Two

Read Part One of this Emotion API series first.

Introduction

Android is one of the most popular mobile operating systems. In this article, I will show you how to identify emotions in an image with the help of the Microsoft Emotion API.

Requirements

  • Android Studio
  • Basic knowledge of XML and Java
  • Android emulator or an Android phone
  • Stable internet connection
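
The Emotion and Face client libraries also need to be on the project's classpath. The sketch below shows the app-module build.gradle dependencies I would expect; the artifact coordinates and versions are assumptions, so verify them against the SDK release you actually use.

dependencies {
    // Microsoft Project Oxford (Cognitive Services) client SDKs.
    // Coordinates/versions are assumptions -- check Maven Central.
    compile 'com.microsoft.projectoxford:emotion:1.0.0'
    compile 'com.microsoft.projectoxford:face:1.0.0'
    // Gson is used by RecognizeActivity to log the results as JSON.
    compile 'com.google.code.gson:gson:2.8.0'
}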

Steps to be followed

Carefully follow these steps to use the Emotion API in an Android application built with Android Studio. I've included the full source code below.

Step 1

Open activity_main.xml and click the Text tab at the bottom. This XML file contains the design code for the Android app. Copy and paste the code below into activity_main.xml.

activity_main.xml code

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools" android:layout_width="match_parent"
    android:layout_height="match_parent" android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    android:paddingBottom="@dimen/activity_vertical_margin" tools:context=".MainActivity">

    <LinearLayout
        android:orientation="vertical"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:weightSum="1">

        <Button
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="Recognize Image"
            android:id="@+id/button_recognize"
            android:layout_gravity="center_horizontal"
            android:onClick="activityRecognize" />

        <TextView
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_marginTop="20dp"
            android:text="Microsoft will receive the images you upload and may use them to improve Emotion API and related services. By submitting an image, you confirm that you have consent from everyone in it."/>
    </LinearLayout>

</RelativeLayout>


Step 2

Create a new activity_recognize.xml file (File ⇒ New ⇒ Activity ⇒ Empty Activity).

Open activity_recognize.xml and click the Text tab at the bottom. This XML file contains the design code for the Android app. Copy and paste the code below into activity_recognize.xml.

activity_recognize.xml code

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools" android:layout_width="match_parent"
    android:layout_height="match_parent" android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    android:paddingBottom="@dimen/activity_vertical_margin"
    tools:context="ganeshannt.emotionapi.RecognizeActivity">

    <LinearLayout
        android:orientation="vertical"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:weightSum="1">

        <TextView
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_margin="4dp"
            android:text="Select an image to analyze"/>

        <LinearLayout
            android:orientation="horizontal"
            android:layout_width="fill_parent"
            android:layout_height="wrap_content">

            <Button
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:text="Select Image"
                android:id="@+id/buttonSelectImage"
                android:onClick="selectImage"/>

            <ImageView
                android:id="@+id/selectedImage"
                android:layout_width="200dp"
                android:layout_height="200dp"
                android:background="#E0E0E0" />

        </LinearLayout>

        <LinearLayout
            android:orientation="horizontal"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:layout_gravity="right"
            android:layout_weight="1.03">

            <EditText
                android:layout_width="wrap_content"
                android:layout_height="match_parent"
                android:inputType="textMultiLine"
                android:ems="10"
                android:id="@+id/editTextResult"
                android:layout_weight="1" />
        </LinearLayout>
    </LinearLayout>

</RelativeLayout>


Step 3

Create a new activity_select_image.xml file (File ⇒ New ⇒ Activity ⇒ Empty Activity).

Open activity_select_image.xml and click the Text tab at the bottom. Copy and paste the code below into activity_select_image.xml.

activity_select_image.xml code

<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    android:paddingBottom="@dimen/activity_vertical_margin"
    android:baselineAligned="false"
    android:orientation="vertical"
    tools:context="ganeshannt.emotionapi.helper.SelectImageActivity">

    <RelativeLayout android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_weight="2">

        <TextView
            android:id="@+id/info"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_centerHorizontal="true"
            android:layout_above="@+id/button_take_a_photo"
            android:layout_gravity="center" />

        <Button
            android:id="@+id/button_take_a_photo"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:text="@string/take_photo"
            android:layout_centerHorizontal="true"
            android:layout_alignParentBottom="true"
            android:onClick="takePhoto"
            style="@style/ButtonStyle" />
    </RelativeLayout>

    <RelativeLayout android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_weight="1">

        <Button
            android:id="@+id/button_select_a_photo_in_album"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:text="@string/select_image_in_album"
            android:layout_centerHorizontal="true"
            android:layout_centerVertical="true"
            android:onClick="selectImageInAlbum"
            style="@style/ButtonStyle" />
    </RelativeLayout>

</LinearLayout>

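The manifest in Step 7 and the layout above assume a SelectImageActivity class inside the helper package; its full source (including the camera flow) ships with Microsoft's EmotionAPI Android sample. If you are not copying that file, here is a minimal sketch that supports only picking a photo from the album; the request code and the empty takePhoto body are my assumptions:

package ganeshannt.emotionapi.helper;

import android.content.Intent;
import android.os.Bundle;
import android.support.v7.app.ActionBarActivity;
import android.view.View;

import ganeshannt.emotionapi.R;

public class SelectImageActivity extends ActionBarActivity {

    // Arbitrary request code for the gallery picker (assumption).
    private static final int REQUEST_SELECT_IMAGE_IN_ALBUM = 0;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_select_image);
    }

    // Wired to button_select_a_photo_in_album via android:onClick.
    public void selectImageInAlbum(View view) {
        Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
        intent.setType("image/*");
        startActivityForResult(intent, REQUEST_SELECT_IMAGE_IN_ALBUM);
    }

    // Wired to button_take_a_photo; the camera flow is omitted in this sketch.
    public void takePhoto(View view) {
        // See Microsoft's sample for the full camera-capture implementation.
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == REQUEST_SELECT_IMAGE_IN_ALBUM && resultCode == RESULT_OK) {
            // Hand the selected image URI back to RecognizeActivity.
            setResult(RESULT_OK, data);
            finish();
        }
    }
}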

Step 4

Create an Android package folder (name: helper) inside the java folder (java ⇒ New ⇒ Folder ⇒ Package).

Step 5

Create the class files (class names: MainActivity and RecognizeActivity) in the main package, ganeshannt.emotionapi (File ⇒ New ⇒ Java Class). The SelectImageActivity and ImageHelper classes go into the helper package.

Copy and paste the code below into MainActivity.java. Java is the backend language for Android. Keep the package declaration consistent with your own package name; otherwise, the app will not run.

MainActivity.java code

package ganeshannt.emotionapi;

import android.app.AlertDialog;
import android.content.Intent;
import android.support.v7.app.ActionBarActivity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;

public class MainActivity extends ActionBarActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        if (getString(R.string.subscription_key).startsWith("Please")) {
            new AlertDialog.Builder(this)
                    .setTitle(getString(R.string.add_subscription_key_tip_title))
                    .setMessage(getString(R.string.add_subscription_key_tip))
                    .setCancelable(false)
                    .show();
        }
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;
    }

    public void activityRecognize(View v) {
        Intent intent = new Intent(this, RecognizeActivity.class);
        startActivity(intent);
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();

        //noinspection SimplifiableIfStatement
        if (id == R.id.action_settings) {
            return true;
        }

        return super.onOptionsItemSelected(item);
    }
}
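
MainActivity reads the subscription key from res/values/strings.xml; the startsWith("Please") check above fires while the placeholder value is still in place. Below is a sketch of the string resources that the code, layouts, and manifest reference. The two placeholder keys follow the sample's convention, but the wording of the other strings is my assumption; replace the key entries with your own Azure keys. The menu resources (menu_main, menu_recognize) and the ButtonStyle style referenced by the layouts must also exist in your project.

<resources>
    <string name="app_name">EmotionAPI</string>

    <!-- Replace these placeholders with your own Azure subscription keys. -->
    <string name="subscription_key">Please_add_the_subscription_key_here</string>
    <string name="faceSubscription_key">Please_add_the_face_subscription_key_here</string>

    <string name="add_subscription_key_tip_title">Subscription key missing</string>
    <string name="add_subscription_key_tip">Please add your Emotion API subscription key in res/values/strings.xml.</string>

    <string name="title_activity_analyze">Recognize</string>
    <string name="select_an_image">Select an image</string>
    <string name="take_photo">Take Photo</string>
    <string name="select_image_in_album">Select Image in Album</string>
</resources>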
Step 6

Copy and paste the code below into RecognizeActivity.java. Again, keep the package declaration consistent with your own package name; otherwise, the app will not run.

RecognizeActivity.java code

package ganeshannt.emotionapi;

import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.drawable.BitmapDrawable;
import android.net.Uri;
import android.os.AsyncTask;
import android.os.Bundle;
import android.support.v7.app.ActionBarActivity;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.ImageView;

import com.google.gson.Gson;
import com.microsoft.projectoxford.emotion.EmotionServiceClient;
import com.microsoft.projectoxford.emotion.EmotionServiceRestClient;
import com.microsoft.projectoxford.emotion.contract.FaceRectangle;
import com.microsoft.projectoxford.emotion.contract.RecognizeResult;
import com.microsoft.projectoxford.emotion.rest.EmotionServiceException;

import com.microsoft.projectoxford.face.FaceServiceRestClient;
import com.microsoft.projectoxford.face.contract.Face;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.List;

import ganeshannt.emotionapi.helper.ImageHelper;

public class RecognizeActivity extends ActionBarActivity {

    // Flag to indicate which task is to be performed.
    private static final int REQUEST_SELECT_IMAGE = 0;

    // The button to select an image.
    private Button mButtonSelectImage;

    // The URI of the image selected to detect.
    private Uri mImageUri;

    // The image selected to detect.
    private Bitmap mBitmap;

    // The edit text to show status and results.
    private EditText mEditText;

    private EmotionServiceClient client;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_recognize);

        if (client == null) {
            client = new EmotionServiceRestClient(getString(R.string.subscription_key));
        }

        mButtonSelectImage = (Button) findViewById(R.id.buttonSelectImage);
        mEditText = (EditText) findViewById(R.id.editTextResult);
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.menu_recognize, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();

        //noinspection SimplifiableIfStatement
        if (id == R.id.action_settings) {
            return true;
        }

        return super.onOptionsItemSelected(item);
    }

    public void doRecognize() {
        mButtonSelectImage.setEnabled(false);

        // Do emotion detection using auto-detected faces.
        try {
            new doRequest(false).execute();
        } catch (Exception e) {
            mEditText.append("Error encountered. Exception is: " + e.toString());
        }

        String faceSubscriptionKey = getString(R.string.faceSubscription_key);
        if (faceSubscriptionKey.equalsIgnoreCase("Please_add_the_face_subscription_key_here")) {
            mEditText.append("\n\nThere is no face subscription key in res/values/strings.xml. Skip the sample for detecting emotions using face rectangles\n");
        } else {
            // Do emotion detection using face rectangles provided by the Face API.
            try {
                new doRequest(true).execute();
            } catch (Exception e) {
                mEditText.append("Error encountered. Exception is: " + e.toString());
            }
        }
    }

    // Called when the "Select Image" button is clicked.
    public void selectImage(View view) {
        mEditText.setText("");

        Intent intent = new Intent(RecognizeActivity.this, ganeshannt.emotionapi.helper.SelectImageActivity.class);
        startActivityForResult(intent, REQUEST_SELECT_IMAGE);
    }

    // Called when image selection is done.
    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        Log.d("RecognizeActivity", "onActivityResult");
        switch (requestCode) {
            case REQUEST_SELECT_IMAGE:
                if (resultCode == RESULT_OK) {
                    // If the image is selected successfully, set the image URI and bitmap.
                    mImageUri = data.getData();

                    mBitmap = ImageHelper.loadSizeLimitedBitmapFromUri(
                            mImageUri, getContentResolver());
                    if (mBitmap != null) {
                        // Show the image on screen.
                        ImageView imageView = (ImageView) findViewById(R.id.selectedImage);
                        imageView.setImageBitmap(mBitmap);

                        // Add detection log.
                        Log.d("RecognizeActivity", "Image: " + mImageUri + " resized to " + mBitmap.getWidth()
                                + "x" + mBitmap.getHeight());

                        doRecognize();
                    }
                }
                break;
            default:
                break;
        }
    }

    private List<RecognizeResult> processWithAutoFaceDetection() throws EmotionServiceException, IOException {
        Log.d("emotion", "Start emotion detection with auto-face detection");

        Gson gson = new Gson();

        // Put the image into an input stream for detection.
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        mBitmap.compress(Bitmap.CompressFormat.JPEG, 100, output);
        ByteArrayInputStream inputStream = new ByteArrayInputStream(output.toByteArray());

        long startTime = System.currentTimeMillis();
        // -----------------------------------------------------------------------
        // KEY SAMPLE CODE STARTS HERE
        // -----------------------------------------------------------------------

        List<RecognizeResult> result = null;
        //
        // Detect emotion by auto-detecting faces in the image.
        //
        result = this.client.recognizeImage(inputStream);

        String json = gson.toJson(result);
        Log.d("result", json);

        Log.d("emotion", String.format("Detection done. Elapsed time: %d ms", (System.currentTimeMillis() - startTime)));
        // -----------------------------------------------------------------------
        // KEY SAMPLE CODE ENDS HERE
        // -----------------------------------------------------------------------
        return result;
    }

    private List<RecognizeResult> processWithFaceRectangles() throws EmotionServiceException, com.microsoft.projectoxford.face.rest.ClientException, IOException {
        Log.d("emotion", "Do emotion detection with known face rectangles");
        Gson gson = new Gson();

        // Put the image into an input stream for detection.
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        mBitmap.compress(Bitmap.CompressFormat.JPEG, 100, output);
        ByteArrayInputStream inputStream = new ByteArrayInputStream(output.toByteArray());

        long timeMark = System.currentTimeMillis();
        Log.d("emotion", "Start face detection using Face API");
        FaceRectangle[] faceRectangles = null;
        String faceSubscriptionKey = getString(R.string.faceSubscription_key);
        FaceServiceRestClient faceClient = new FaceServiceRestClient(faceSubscriptionKey);
        Face[] faces = faceClient.detect(inputStream, false, false, null);
        Log.d("emotion", String.format("Face detection is done. Elapsed time: %d ms", (System.currentTimeMillis() - timeMark)));

        if (faces != null) {
            faceRectangles = new FaceRectangle[faces.length];

            for (int i = 0; i < faceRectangles.length; i++) {
                // Face API and Emotion API have different FaceRectangle definitions. Do the conversion.
                com.microsoft.projectoxford.face.contract.FaceRectangle rect = faces[i].faceRectangle;
                faceRectangles[i] = new com.microsoft.projectoxford.emotion.contract.FaceRectangle(rect.left, rect.top, rect.width, rect.height);
            }
        }

        List<RecognizeResult> result = null;
        if (faceRectangles != null) {
            inputStream.reset();

            timeMark = System.currentTimeMillis();
            Log.d("emotion", "Start emotion detection using Emotion API");
            // -----------------------------------------------------------------------
            // KEY SAMPLE CODE STARTS HERE
            // -----------------------------------------------------------------------
            result = this.client.recognizeImage(inputStream, faceRectangles);

            String json = gson.toJson(result);
            Log.d("result", json);
            // -----------------------------------------------------------------------
            // KEY SAMPLE CODE ENDS HERE
            // -----------------------------------------------------------------------
            Log.d("emotion", String.format("Emotion detection is done. Elapsed time: %d ms", (System.currentTimeMillis() - timeMark)));
        }
        return result;
    }

    private class doRequest extends AsyncTask<String, String, List<RecognizeResult>> {
        // Store error message.
        private Exception e = null;
        private boolean useFaceRectangles = false;

        public doRequest(boolean useFaceRectangles) {
            this.useFaceRectangles = useFaceRectangles;
        }

        @Override
        protected List<RecognizeResult> doInBackground(String... args) {
            if (this.useFaceRectangles == false) {
                try {
                    return processWithAutoFaceDetection();
                } catch (Exception e) {
                    this.e = e;    // Store error
                }
            } else {
                try {
                    return processWithFaceRectangles();
                } catch (Exception e) {
                    this.e = e;    // Store error
                }
            }
            return null;
        }

        @Override
        protected void onPostExecute(List<RecognizeResult> result) {
            super.onPostExecute(result);
            // Display based on error existence.

            if (this.useFaceRectangles == false) {
                mEditText.append("\n\nRecognizing emotions with auto-detected face rectangles...\n");
            } else {
                mEditText.append("\n\nRecognizing emotions with existing face rectangles from Face API...\n");
            }
            if (e != null) {
                mEditText.setText("Error: " + e.getMessage());
                this.e = null;
            } else {
                if (result.size() == 0) {
                    mEditText.append("No emotion detected :(");
                } else {
                    Integer count = 0;
                    // Convert the bitmap to a mutable bitmap by copying it.
                    Bitmap bitmapCopy = mBitmap.copy(Bitmap.Config.ARGB_8888, true);
                    Canvas faceCanvas = new Canvas(bitmapCopy);
                    faceCanvas.drawBitmap(mBitmap, 0, 0, null);
                    Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
                    paint.setStyle(Paint.Style.STROKE);
                    paint.setStrokeWidth(5);
                    paint.setColor(Color.RED);

                    for (RecognizeResult r : result) {
                        mEditText.append(String.format("\nFace #%1$d \n", count));
                        mEditText.append(String.format("\t anger: %1$.5f\n", r.scores.anger));
                        mEditText.append(String.format("\t contempt: %1$.5f\n", r.scores.contempt));
                        mEditText.append(String.format("\t disgust: %1$.5f\n", r.scores.disgust));
                        mEditText.append(String.format("\t fear: %1$.5f\n", r.scores.fear));
                        mEditText.append(String.format("\t happiness: %1$.5f\n", r.scores.happiness));
                        mEditText.append(String.format("\t neutral: %1$.5f\n", r.scores.neutral));
                        mEditText.append(String.format("\t sadness: %1$.5f\n", r.scores.sadness));
                        mEditText.append(String.format("\t surprise: %1$.5f\n", r.scores.surprise));
                        mEditText.append(String.format("\t face rectangle: %d, %d, %d, %d", r.faceRectangle.left, r.faceRectangle.top, r.faceRectangle.width, r.faceRectangle.height));
                        faceCanvas.drawRect(r.faceRectangle.left,
                                r.faceRectangle.top,
                                r.faceRectangle.left + r.faceRectangle.width,
                                r.faceRectangle.top + r.faceRectangle.height,
                                paint);
                        count++;
                    }
                    ImageView imageView = (ImageView) findViewById(R.id.selectedImage);
                    // Show the copy that has the face rectangles drawn on it.
                    imageView.setImageDrawable(new BitmapDrawable(getResources(), bitmapCopy));
                }
                mEditText.setSelection(0);
            }

            mButtonSelectImage.setEnabled(true);
        }
    }
}
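
RecognizeActivity depends on ImageHelper.loadSizeLimitedBitmapFromUri from the helper package, which also comes from Microsoft's sample. If you are not copying that file, a minimal sketch follows; the 1024-pixel size limit is my assumption, and unlike the original it does not correct for EXIF rotation:

package ganeshannt.emotionapi.helper;

import android.content.ContentResolver;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;

import java.io.IOException;
import java.io.InputStream;

public class ImageHelper {

    // Hypothetical upper bound on the decoded image's larger dimension.
    private static final int IMAGE_MAX_SIDE_LENGTH = 1024;

    public static Bitmap loadSizeLimitedBitmapFromUri(Uri imageUri, ContentResolver contentResolver) {
        try {
            // First pass: read only the image bounds without decoding pixels.
            BitmapFactory.Options options = new BitmapFactory.Options();
            options.inJustDecodeBounds = true;
            InputStream stream = contentResolver.openInputStream(imageUri);
            BitmapFactory.decodeStream(stream, null, options);
            stream.close();

            // Compute a power-of-two sample size that fits within the limit.
            int maxSide = Math.max(options.outWidth, options.outHeight);
            int sampleSize = 1;
            while (maxSide / sampleSize > IMAGE_MAX_SIDE_LENGTH) {
                sampleSize *= 2;
            }

            // Second pass: decode the down-sampled bitmap.
            options.inJustDecodeBounds = false;
            options.inSampleSize = sampleSize;
            stream = contentResolver.openInputStream(imageUri);
            Bitmap bitmap = BitmapFactory.decodeStream(stream, null, options);
            stream.close();
            return bitmap;
        } catch (IOException e) {
            return null;
        }
    }
}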

 

Step 7

As the app makes network requests, we must add the INTERNET permission in AndroidManifest.xml. Add the code below to AndroidManifest.xml. Note that the activity names in the manifest must match your package, ganeshannt.emotionapi.

AndroidManifest.xml code

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="ganeshannt.emotionapi" >

    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:theme="@style/AppTheme" >
        <activity
            android:name="ganeshannt.emotionapi.MainActivity"
            android:label="@string/app_name" >
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
        <activity
            android:name="ganeshannt.emotionapi.RecognizeActivity"
            android:label="@string/title_activity_analyze"
            android:parentActivityName="ganeshannt.emotionapi.MainActivity" >
            <meta-data
                android:name="android.support.PARENT_ACTIVITY"
                android:value="ganeshannt.emotionapi.MainActivity" />
        </activity>
        <activity
            android:name="ganeshannt.emotionapi.helper.SelectImageActivity"
            android:label="@string/select_an_image"
            android:screenOrientation="portrait" />
    </application>

</manifest>
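
One caveat: on Android 6.0 (API 23) and above, READ_EXTERNAL_STORAGE is a dangerous permission that must also be granted at runtime; the manifest entry alone is not enough. A minimal sketch using the support library, placed for example in RecognizeActivity.onCreate (the request code 100 is arbitrary):

import android.Manifest;
import android.content.pm.PackageManager;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;

// Inside onCreate(), after setContentView(...):
if (ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE)
        != PackageManager.PERMISSION_GRANTED) {
    // Ask the user for storage access before any image can be picked.
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.READ_EXTERNAL_STORAGE}, 100);
}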
Step 8

 

This is the user interface of the application. Click the Make Project option to build it.


Step 9

Run the application, choose the virtual device (or your connected phone), and click OK.


Deliverables

Here, the emotions were successfully detected by the Emotion API in the Android app we created and executed.


Don't forget to like and follow me. If you have any doubts, just comment below.
