We all know how cool an Android app looks when it can detect our face, or track whether our eyes are closed or open. It becomes even cooler when the app can also tell whether we are smiling, reading on the phone, or not looking at it at all. Well… I believe… whatever appeals to me simply makes me… build it! Sorry, tried a “The Dark Knight” pun :)

So let’s make an Android eye-tracking and face-detection app using the Google Vision API. From Google:

> Cloud Vision API enables developers to understand the content of an image by encapsulating powerful machine learning models in an easy-to-use REST API. It quickly classifies images into thousands of categories (such as “sailboat”), detects individual objects and faces within images, and reads printed words contained within images. You can build metadata on your image catalog, moderate offensive content, or enable new marketing scenarios through image sentiment analysis.

Here we will be making an Android app that can track our face and detect whether our eyes are closed or open. Sounds cool? It isn’t even that hard. So let’s dive into how this would work, as a simple flow chart:

Our Android app → Uses camera → Detects face → Starts some operation → Checks if the viewer’s eyes are open → Continues the operation → If eyes are closed → Stops the operation.

This is the basic idea for our Android app. For learning purposes we will only do this much here, but far more advanced features can be added using the Google Vision API. The operation we’ll be carrying out in our app is playing a simple video. Well, I chose a video from PIXAR Animation. You can find the full source code for the app here.

Start a new Android project in your IDE, like Android Studio or any other. Go to the manifest file and add the following permission, as this app needs permission to use the front camera of the device.
```xml
<uses-permission android:name="android.permission.CAMERA" />
```

We also have to import the Google Vision API into our Android app, so head to the app module’s build Gradle file, usually listed as build.gradle (Module: app). You will see a file with text that resembles this:

```groovy
apply plugin: 'com.android.application'

android {
    compileSdkVersion 28
    defaultConfig {
        applicationId "com.pd.trackeye"
        minSdkVersion 15
        targetSdkVersion 28
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
    }
    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
}

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'com.android.support:appcompat-v7:28.0.0'
    implementation 'com.android.support.constraint:constraint-layout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'com.android.support.test:runner:1.0.2'
    androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'
    implementation 'com.google.android.gms:play-services:12.0.1'
}
```

If it looks like this, then you’re in the correct file. Copy the whole code above and paste it in. You’ll get a top-bar pop-up asking you to sync files. Click on Sync Now and wait for the project to build. After you get all green OK signs in your build log, you can move ahead.

Now let’s write some Android code that will do the magic. Head to MainActivity and declare the following variables:

```java
private static final String TAG = "MainActivity"; //For looking at logs
VideoView videoView;
EditText textView;
ArrayAdapter adapter;
ArrayList<String> list = new ArrayList<>();
CameraSource cameraSource;
```

These lines of code should sit beneath the line that states something like this:

```java
public class MainActivity extends AppCompatActivity {
```
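A quick aside on the dependencies block above, offered as a suggestion rather than something this project does: `play-services` pulls in every Play Services feature. If you only need face detection, the Mobile Vision classes also ship in a smaller artifact, so a slimmer dependency might look like this (the version number is illustrative; check the latest one):

```groovy
dependencies {
    // Only the Mobile Vision part of Play Services instead of the whole bundle
    implementation 'com.google.android.gms:play-services-vision:15.0.0'
}
```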
Now let’s do the magic code inside the onCreate() method that you will see in the same file. First, let’s add something that lets the user grant the camera permission to the app, as without it our app will crash and not work.

```java
if (ActivityCompat.checkSelfPermission(this, android.Manifest.permission.CAMERA)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, 1);
    Toast.makeText(this, "Grant Permission and restart app", Toast.LENGTH_SHORT).show();
}
else {
    videoView = findViewById(R.id.videoView);
    textView = findViewById(R.id.textView);
    adapter = new ArrayAdapter<>(this, android.R.layout.simple_list_item_1, list);
    videoView.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.videoplayback));
    videoView.start();
    createCameraSource();
}
```

Add the above code inside the onCreate() method we talked about.

Now we should add the code that tracks the eyes using the Google Vision API. We will make a new private class for that, named EyesTracker. The code for the class looks like this:

```java
private class EyesTracker extends Tracker<Face> {

    private final float THRESHOLD = 0.75f;

    private EyesTracker() {
    }

    @Override
    public void onUpdate(Detector.Detections<Face> detections, Face face) {
        if (face.getIsLeftEyeOpenProbability() > THRESHOLD || face.getIsRightEyeOpenProbability() > THRESHOLD) {
            Log.i(TAG, "onUpdate: Eyes Detected");
            showStatus("Eyes Detected and open, so video continues");
            if (!videoView.isPlaying())
                videoView.start();
        }
        else {
            if (videoView.isPlaying())
                videoView.pause();
            showStatus("Eyes Detected and closed, so video paused");
        }
    }

    @Override
    public void onMissing(Detector.Detections<Face> detections) {
        super.onMissing(detections);
        showStatus("Face Not Detected yet!");
    }

    @Override
    public void onDone() {
        super.onDone();
    }
}
```

Here videoView refers to the VideoView attribute in activity_main.xml, the layout that defines the UI of the app.
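A subtlety worth knowing about onUpdate() above: when the detector cannot compute a classification, getIsLeftEyeOpenProbability() returns -1 (to my understanding of the Mobile Vision API; worth verifying), and since -1 > THRESHOLD is false, the video conveniently stays paused on such frames. The decision rule can be pulled out into a plain function to see this off the device — EyeOpenRule and eyesLikelyOpen are hypothetical names for illustration, not part of the actual project:

```java
// Standalone sketch of the eye-open decision used by EyesTracker.
class EyeOpenRule {
    // Same threshold as in EyesTracker.
    static final float THRESHOLD = 0.75f;

    // True if at least one eye is confidently open.
    static boolean eyesLikelyOpen(float leftOpenProb, float rightOpenProb) {
        // Uncomputed probabilities arrive as -1, which never exceeds the
        // threshold, so they fall through to "not open".
        return leftOpenProb > THRESHOLD || rightOpenProb > THRESHOLD;
    }
}
```

So a frame with one eye 90% likely open keeps the video playing, while a frame where both probabilities come back uncomputed pauses it, which is a sensible default.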
Now we can add the code that detects the user’s face, in a new private class named FaceTrackerFactory. The code will look something like this:

```java
private class FaceTrackerFactory implements MultiProcessor.Factory<Face> {

    private FaceTrackerFactory() {
    }

    @Override
    public Tracker<Face> create(Face face) {
        return new EyesTracker();
    }
}

public void createCameraSource() {
    FaceDetector detector = new FaceDetector.Builder(this)
            .setTrackingEnabled(true)
            .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
            .setMode(FaceDetector.FAST_MODE)
            .build();
    detector.setProcessor(new MultiProcessor.Builder<>(new FaceTrackerFactory()).build());

    cameraSource = new CameraSource.Builder(this, detector)
            .setRequestedPreviewSize(1024, 768)
            .setFacing(CameraSource.CAMERA_FACING_FRONT)
            .setRequestedFps(30.0f)
            .build();
    try {
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
            // TODO: Consider calling ActivityCompat#requestPermissions here to request
            // the missing permission, and then overriding
            //   public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults)
            // to handle the case where the user grants the permission. See the
            // documentation for ActivityCompat#requestPermissions for more details.
            return;
        }
        cameraSource.start();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
```

We also need to pause and stop the video when the app goes into the background, which we will take care of in the onDestroy() and onPause() methods of MainActivity.

Now let’s add some XML code for the basic UI of the app. Head to the file named something like activity_main.xml and replace the code in it with the following.
```xml
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity"
    android:orientation="vertical">

    <VideoView
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:id="@+id/videoView"/>

    <EditText
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:id="@+id/textView"
        android:text="@string/face_not_found"
        android:textSize="20sp"/>

</LinearLayout>
```

All this code does is add an EditText that tells the user what is and isn’t being detected, and a VideoView that plays the video for the user.

You can find the full source code for the app here. And that’s pretty much it: you’ve built your own eye-tracking and face-detection Android app. Everything done, it looks like this:

Eyes open

Eyes closed

Now go and show off!!

Read my previous post about ChocoBar, an Android library that helps you customise Snackbars with very few lines of code.
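One loose end from the walkthrough: the onPause() and onDestroy() handling was mentioned but never shown. Here is a rough sketch of the idea with the Android types abstracted away — Player and LifecycleGuard are hypothetical names used only for illustration; in the real MainActivity you would call videoView.pause() in onPause() and cameraSource.release() in onDestroy():

```java
// Hypothetical stand-ins for VideoView and CameraSource.
interface Player {
    void pause();
    void release();
}

// Sketch of the lifecycle handling: pause on onPause(), free the camera once on onDestroy().
class LifecycleGuard {
    private final Player video;
    private final Player camera;
    private boolean released = false;

    LifecycleGuard(Player video, Player camera) {
        this.video = video;
        this.camera = camera;
    }

    void onPause() {
        video.pause(); // stop playback while the app is backgrounded
    }

    void onDestroy() {
        if (!released) {      // guard against releasing the camera twice
            camera.release();
            released = true;
        }
    }

    boolean isReleased() {
        return released;
    }
}
```

Releasing the camera source matters because the camera is a shared device resource; holding it after the activity dies blocks other apps from using it.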