[JAVA] Try using the Emotion API from Android

I originally posted this article on my own blog, but since hardly anyone visits it, I decided to repost it here on Qiita.

I participated in the Ishinomaki Hackathon held from July 28th to July 30th.

There I had occasion to call the Emotion API of Microsoft Cognitive Services from Android, so this is a memo of what I did at the time.

I referred to the following sites for the implementation:

- http://qiita.com/kosfuji/items/575408ae17113d7b58e9
- http://qiita.com/a_nishimura/items/19cf3f60ad1dd3f66a84

The official implementation sample had Java code, but since the Apache HTTP Client is no longer supported as of Android 6.0, I used HttpURLConnection instead.

# Preparation

Since you need a Subscription Key to use the API, log in with your Microsoft account on the Microsoft Cognitive Services site to get your Subscription Key.

Since the app performs HTTP communication, add the INTERNET permission to the manifest.

```xml
<uses-permission android:name="android.permission.INTERNET" />
```
# Implementation example

First, create a class that inherits AsyncTask to set up a thread for communication.

```java
public class ConnectToEmotionAPI extends AsyncTask<Void, Void, JSONObject> {
    @Override
    protected void onPreExecute() { /* pre-processing before the request */ }

    @Override
    protected JSONObject doInBackground(Void... params) { /* API call */ return null; }

    @Override
    protected void onPostExecute(JSONObject result) { /* handle the result */ }
}
```

Since the Emotion API returns the result of analyzing the image as JSON, the return value of doInBackground should be JSONObject.

Next, we will write the communication process in doInBackground.

```java
HttpURLConnection con = null;
URL url;
String urlStr = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize";
String key = "{Your Key}";                  // Subscription Key
DataOutputStream os = null;
BufferedReader reader;
JSONObject json = null;

try {
    url = new URL(urlStr);
    con = (HttpURLConnection) url.openConnection();

    // Request header settings
    con.setRequestMethod("POST");
    con.setDoOutput(true);
    con.setRequestProperty("Content-Type", "application/octet-stream");
    con.setRequestProperty("Ocp-Apim-Subscription-Key", key);

    // Creating the request body
    Resources r = main_.getResources();
    Bitmap bmp = BitmapFactory.decodeResource(r, R.drawable.face_small);

    // Convert the image to binary data
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    bmp.compress(Bitmap.CompressFormat.JPEG, 100, bos);
    byte[] byteArray = bos.toByteArray();

    os = new DataOutputStream(con.getOutputStream());
    os.write(byteArray);
    os.flush();

    // Connect to the API
    int status = con.getResponseCode();

    switch (status) {
        case HttpURLConnection.HTTP_OK:
            InputStream in = con.getInputStream();
            reader = new BufferedReader(new InputStreamReader(in));
            String line;
            StringBuilder readStr = new StringBuilder();
            while (null != (line = reader.readLine())) {
                readStr.append(line);
            }
            Log.d("EmotionAPI", "read string: " + readStr);

            // The response is a JSON array, one element per detected face
            json = new JSONArray(readStr.toString()).getJSONObject(0);
            break;
        case HttpURLConnection.HTTP_UNAUTHORIZED:
            Log.d("EmotionAPI", "Unauthorized: check your Subscription Key");
            break;
    }
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (JSONException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
} finally {
    if (os != null) {
        try { os.close(); } catch (IOException ignored) {}
    }
    if (con != null) con.disconnect();
}

return json;
```

When sending the image, we use ByteArrayOutputStream to convert it to binary data and write that to the request body.
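The ByteArrayOutputStream / DataOutputStream combination can be tried in plain Java, independent of Android. This is only a sketch: a fake byte array stands in for the bytes produced by `bmp.compress(...)`, and a second ByteArrayOutputStream stands in for `con.getOutputStream()`.

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class BinaryBodyDemo {
    // Collect bytes in a ByteArrayOutputStream, then stream them to the "connection".
    public static byte[] roundTrip(byte[] imageBytes) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        bos.write(imageBytes);                 // stands in for bmp.compress(..., bos)
        byte[] byteArray = bos.toByteArray();

        ByteArrayOutputStream sink = new ByteArrayOutputStream();  // stands in for con.getOutputStream()
        DataOutputStream os = new DataOutputStream(sink);
        os.write(byteArray);
        os.flush();
        return sink.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] fake = {(byte) 0xFF, (byte) 0xD8, 0x00, 0x10};  // pretend JPEG bytes
        System.out.println(roundTrip(fake).length);  // 4: all bytes were written
    }
}
```

Writing the whole array at once with `os.write(byteArray)` is equivalent to the byte-by-byte loop and avoids per-byte overhead.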

Since the returned JSON is in array format, use JSONArray to parse it.
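For reference, the body returned by the `/recognize` endpoint looks roughly like the following. This is from my memory of the v1.0 API, and only a subset of the score fields is shown here; check the official reference for the exact schema.

```json
[
  {
    "faceRectangle": { "left": 68, "top": 97, "width": 64, "height": 97 },
    "scores": {
      "anger": 0.02,
      "happiness": 0.91,
      "neutral": 0.05,
      "sadness": 0.01
    }
  }
]
```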

Finally, in onPostExecute, the obtained JSON is parsed by object name.

```java
@Override
protected void onPostExecute(JSONObject result) {
    JSONObject jsonData;
    String[] str = new String[2];
    try {
        jsonData = result.getJSONObject("scores");
        str[0] = jsonData.getString("happiness");
        str[1] = jsonData.getString("anger");
    } catch (Exception e) {
        e.printStackTrace();
        return;
    }

    if (isSmile(str[0])) {
        Log.d("EmotionAPI", "It's a nice smile!");
    } else if (isAnger(str[1])) {
        Log.d("EmotionAPI", "Don't get so angry~");
    } else {
        Log.d("EmotionAPI", "It's boring. Please react somehow");
    }
}

public boolean isSmile(String strValue) {
    double value = Double.parseDouble(strValue);
    return value > 0.5;
}

public boolean isAnger(String strValue) {
    double value = Double.parseDouble(strValue);
    return value > 0.5;
}
```
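The isSmile / isAnger helpers just compare a score string against a 0.5 threshold; the Emotion API reports each score as a value in [0, 1]. The same logic can be exercised as a standalone sketch:

```java
public class ScoreCheckDemo {
    // Same thresholding as isSmile/isAnger: scores are 0..1 strings from the JSON.
    public static boolean overThreshold(String strValue) {
        return Double.parseDouble(strValue) > 0.5;
    }

    public static void main(String[] args) {
        System.out.println(overThreshold("0.91"));  // true: treated as a smile
        System.out.println(overThreshold("0.03"));  // false
    }
}
```

The 0.5 cutoff is just the value chosen in this article; tune it to taste, since scores for subtle expressions tend to sit well below 0.5.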

The code implemented at the hackathon is published on GitHub, so please see it for the full implementation: GitHub EmotionAPI Sample.
