Make an Eye tracking and Face detection app as a beginner

Written by pradyumandixit | Published 2019/01/10
Tech Story Tags: android | eye-tracking | facedetection | machine-learning | apps


We all know how cool an Android app looks when it can detect our face or track whether our eyes are closed or open. It becomes even cooler when the app can also detect whether we are smiling, reading on the phone, or not looking at it at all.

Well…

I believe… whatever appeals to me simply makes me build it!

Sorry, tried a “The Dark Knight” pun :)

So let us make an Android eye tracking and face detection app using the Google Vision API.

From Google:

Cloud Vision API enables developers to understand the content of an image by encapsulating powerful machine learning models in an easy-to-use REST API. It quickly classifies images into thousands of categories (such as, “sailboat”), detects individual objects and faces within images, and reads printed words contained within images. You can build metadata on your image catalog, moderate offensive content, or enable new marketing scenarios through image sentiment analysis.

Here we will be making an Android app that can track our face and detect whether our eyes are closed or open.

Sounds cool? It’s not even that hard.

So let’s dive into understanding how this would work in a simple flow chart.

Our Android app → uses the camera → detects a face → starts an operation → checks if the viewer’s eyes are open → continues the operation → if the eyes are closed → stops the operation.

This is the basic idea for our Android app. For learning purposes we will only do this much, but far more advanced features can be added using the Google Vision API.

The operation we’ll be carrying out in our app is playing a simple video. Well, I chose a clip from a Pixar animation.

You can find the full source code for the app here.

Start a new Android project in your IDE of choice, such as Android Studio.

Go to the manifest file and add the following permission, as this app needs permission to use the device’s front camera.

<uses-permission android:name="android.permission.CAMERA" />
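If you’re unsure where that line goes: it sits directly inside the root <manifest> element, as a sibling of <application>. A minimal sketch, assuming the package name com.pd.trackeye used later in the Gradle file:

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.pd.trackeye">

    <uses-permission android:name="android.permission.CAMERA" />

    <application>
        <!-- activities go here -->
    </application>
</manifest>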

We also have to import the Google Vision API into our Android app, so head to the app-level Gradle build file, usually shown as build.gradle (Module: app).

You will see a file with text that resembles this:

apply plugin: 'com.android.application'

android {
    compileSdkVersion 28
    defaultConfig {
        applicationId "com.pd.trackeye"
        minSdkVersion 15
        targetSdkVersion 28
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
    }
    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
}

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'com.android.support:appcompat-v7:28.0.0'
    implementation 'com.android.support.constraint:constraint-layout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'com.android.support.test:runner:1.0.2'
    androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'
    implementation 'com.google.android.gms:play-services:12.0.1'
}
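A side note on that last dependency: com.google.android.gms:play-services:12.0.1 pulls in the entire Play Services suite. If you want a leaner build, depending only on the vision artifact should be enough for this app (an assumption worth verifying against your setup):

implementation 'com.google.android.gms:play-services-vision:12.0.1'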

If it looks like this, you’re in the correct file. Copy the whole code above and paste it in, replacing the existing contents.

You’ll get a top-bar pop-up asking you to sync files. Click Sync Now and wait for the project to build.

After your build log shows all-green OK signs, you can move ahead.

Now let’s write some Android code that will do the magic. Head to the MainActivity file and declare the following variables, like this:

private static final String TAG = "MainActivity";
VideoView videoView;
EditText textView;

// For looking at logs
ArrayAdapter<String> adapter;
ArrayList<String> list = new ArrayList<>();

CameraSource cameraSource;

These lines of code should be beneath the line that states something like this:

public class MainActivity extends AppCompatActivity {
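For reference, these declarations and the code that follows assume imports along these lines above the class declaration (matching the support-library setup in the Gradle file):

import android.Manifest;
import android.content.pm.PackageManager;
import android.net.Uri;
import android.os.Bundle;
import android.support.v4.app.ActivityCompat;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.widget.ArrayAdapter;
import android.widget.EditText;
import android.widget.Toast;
import android.widget.VideoView;

import com.google.android.gms.vision.CameraSource;
import com.google.android.gms.vision.Detector;
import com.google.android.gms.vision.MultiProcessor;
import com.google.android.gms.vision.Tracker;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;

import java.io.IOException;
import java.util.ArrayList;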

Now let’s write the magic code inside the onCreate() method that you will see in the same file.

Let’s first add code that asks the user to grant the camera permission to the app; without it, our app will crash and not work.

if (ActivityCompat.checkSelfPermission(this, android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, 1);
    Toast.makeText(this, "Grant Permission and restart app", Toast.LENGTH_SHORT).show();
}
else {
    videoView = findViewById(R.id.videoView);
    textView = findViewById(R.id.textView);
    adapter = new ArrayAdapter<>(this, android.R.layout.simple_list_item_1, list);
    videoView.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.videoplayback));
    videoView.start();
    createCameraSource();
}

Add the above code inside the **onCreate()** method that we talked about above.
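Two quick notes on this block. First, R.raw.videoplayback refers to a video file named videoplayback (for example videoplayback.mp4) that you need to place in res/raw, creating the folder if it doesn’t exist; any short clip will do. Second, the code above just asks the user to restart the app after granting permission. If you’d rather recover without a restart, a minimal sketch of a permission callback, matching the request code 1 used above, could look like this:

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    // Request code 1 matches the requestPermissions() call in onCreate()
    if (requestCode == 1 && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        // Re-run onCreate() so the setup branch executes with permission granted
        recreate();
    }
}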

Now we should add the code that tracks the eyes using the Google Vision API. We will make a new private class for that, named **EyesTracker**. The code for the class looks like this:

private class EyesTracker extends Tracker<Face> {

    private final float THRESHOLD = 0.75f;

    private EyesTracker() {

    }

    @Override
    public void onUpdate(Detector.Detections<Face> detections, Face face) {
        if (face.getIsLeftEyeOpenProbability() > THRESHOLD || face.getIsRightEyeOpenProbability() > THRESHOLD) {
            Log.i(TAG, "onUpdate: Eyes Detected");
            showStatus("Eyes Detected and open, so video continues");
            if (!videoView.isPlaying())
                videoView.start();
        }
        else {
            if (videoView.isPlaying())
                videoView.pause();

            showStatus("Eyes Detected and closed, so video paused");
        }
    }

    @Override
    public void onMissing(Detector.Detections<Face> detections) {
        super.onMissing(detections);
        showStatus("Face Not Detected yet!");
    }

    @Override
    public void onDone() {
        super.onDone();
    }
}
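The class above calls a showStatus() helper that the snippet doesn’t define. A minimal sketch of it, assuming it just writes the message into the EditText; note that the detector’s callbacks fire off the main thread, so the update has to be posted back to the UI thread:

private void showStatus(final String message) {
    // Tracker callbacks arrive on a background thread; views may only
    // be touched from the main thread, so post the update there.
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            textView.setText(message);
        }
    });
}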

Here videoView is the reference to the VideoView element in activity_main.xml, which defines the UI of the app.

Now we can add the code that detects the user’s face in a new private class named **FaceTrackerFactory**. The code will look something like this:

private class FaceTrackerFactory implements MultiProcessor.Factory<Face> {

    private FaceTrackerFactory() {

    }

    @Override
    public Tracker<Face> create(Face face) {
        return new EyesTracker();
    }
}
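MultiProcessor calls create() once for each new face the detector starts tracking, so every detected face gets its own EyesTracker instance. The createCameraSource() method below wires this factory into a FaceDetector and a CameraSource: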

public void createCameraSource() {
    FaceDetector detector = new FaceDetector.Builder(this)
            .setTrackingEnabled(true)
            .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
            .setMode(FaceDetector.FAST_MODE)
            .build();
    detector.setProcessor(new MultiProcessor.Builder<>(new FaceTrackerFactory()).build());

    cameraSource = new CameraSource.Builder(this, detector)
            .setRequestedPreviewSize(1024, 768)
            .setFacing(CameraSource.CAMERA_FACING_FRONT)
            .setRequestedFps(30.0f)
            .build();

    try {
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
            // Permission was already requested in onCreate(); if it is
            // still missing, do not start the camera.
            return;
        }
        cameraSource.start();
    }
    catch (IOException e) {
        e.printStackTrace();
    }
}
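Note that cameraSource.start() is called without a SurfaceHolder argument, so the camera feeds frames to the detector without drawing a preview on screen. That suits this app, since the screen is occupied by the video.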

We also need to pause the video and release the camera, which we will take care of in the onDestroy() and onPause() methods of MainActivity.
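The post doesn’t show those overrides; a minimal sketch of what they might look like, pausing the video in onPause() and releasing the camera in onDestroy():

@Override
protected void onPause() {
    super.onPause();
    // Stop playback while the app is in the background
    if (videoView != null && videoView.isPlaying()) {
        videoView.pause();
    }
}

@Override
protected void onDestroy() {
    super.onDestroy();
    // Free the camera so other apps can use it
    if (cameraSource != null) {
        cameraSource.release();
    }
}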

Now let’s add some XML code for basic UI of the app.

Head to the file named activity_main.xml and replace the code in it with the following:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity"
    android:orientation="vertical">

    <VideoView
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:id="@+id/videoView"/>

    <EditText
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:id="@+id/textView"
        android:text="@string/face_not_found"
        android:textSize="20sp"/>

</LinearLayout>

All this code does is add an EditText that displays text telling the user what is and isn’t being detected, and a VideoView that plays the video.
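One loose end: the layout references @string/face_not_found, which the post never shows. Assuming a simple placeholder message (the exact wording is my guess), add an entry like this to res/values/strings.xml:

<resources>
    <string name="face_not_found">Face not found yet!</string>
</resources>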


And that’s pretty much it. You’ve built your own eye tracking and face detection Android app.

With everything done, it looks like this:

Eyes open

Eyes closed

Now go and show it off!

Read my previous post about ChocoBar, an Android library that helps you customize Snackbars with very few lines of code.

