
How to Build Landmark Recognition App with React Native and Vision AI

by Mathias Rahikainen, December 31st, 2020


In this article, we cover the second part of our two-part series: we will create a React Native application that uses the Firebase Cloud Function we created in the first part. Here I will cover building the application in detail. If you want to skip ahead to the finish line, the full code is available here.

Sample image of the application:

What is React Native?

React Native lets coders develop applications for both Android and iOS with just one codebase. React Native applications are built using JavaScript or TypeScript, which simplifies the transition to mobile development for web developers. Unsurprisingly, the framework has gained popularity over the past few years.

What is Expo?

Expo is a set of tools and services built around React Native that help you quickly develop, build and deploy both iOS and Android apps from the same JavaScript/TypeScript codebase.

Setting up React Native with Expo

In this project we will be using Expo with React Native. We will go through setting it up step by step.

If you run into problems, you can take a look at the Expo docs, which go into more detail.

npm install -g expo-cli

Make sure that you’re in the landmark root folder that we created earlier in part 1.

expo init frontend

After running the init command, you will be given a choice of templates. Select the blank (TypeScript) option.

You can now start the application by running:

cd frontend
expo start

This opens the Expo developer tools in your default browser.

You will have to test and run the application on your mobile phone. You can do so through the Expo mobile application: just scan the QR code to start. You can shake the phone to open the developer menu; the Expo docs describe it in more detail.


NOTE: Emulators will not work with this application because the expo-camera dependency does not support them.

Installing dependencies

First, make sure that you are in the frontend folder Expo created, and that Expo is not running while you run these commands.

We will be using Expo to install most dependencies since that’s the recommended way to do it.

expo install firebase expo-camera expo-file-system

Setting up Firebase in the frontend

Go ahead and create a helpers folder within the frontend folder, and inside it create firebase_init.ts.

Start by coding the firebase_init.ts file.
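Below is a minimal sketch of what firebase_init.ts could look like, assuming the namespaced (v8-style) Firebase JS SDK that expo install firebase pulled in at the time; the config values are placeholders, not real settings.

import firebase from "firebase/app";
import "firebase/auth";
import "firebase/functions";
import "firebase/storage";

// Replace these placeholder values with the config from your own
// Firebase console (Project settings).
const firebaseConfig = {
  apiKey: "YOUR_API_KEY",
  authDomain: "YOUR_PROJECT.firebaseapp.com",
  projectId: "YOUR_PROJECT_ID",
  storageBucket: "YOUR_PROJECT.appspot.com",
  messagingSenderId: "YOUR_SENDER_ID",
  appId: "YOUR_APP_ID",
};

// Only initialize Firebase if it has not been initialized already,
// no matter how many modules import this file.
if (firebase.apps.length === 0) {
  firebase.initializeApp(firebaseConfig);
}

export default firebase;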

You need to replace the firebase config settings with your own. They can be found in the Firebase console in the landmark project you created in part 1. Click on the cog and select Project settings.

This code should be pretty self-explanatory. One thing to take note of here is that checking firebase.apps before calling initializeApp is a good way to make sure that you are only initializing Firebase once within the project.

Components

In this chapter we will go through all the components we will be creating for this application. They will all live inside the frontend/components folder. Start by creating the components folder and adding the following files to it: CameraPage.tsx, LandmarkCamera.tsx, CircleButton.tsx, and LandmarkModal.tsx.

Setting up CameraPage.tsx

Start by setting up the CameraPage component, which is responsible for taking the images and uploading them for validation.

CameraPage.tsx:
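A minimal sketch of what this component could look like. The storage path, the fetch-to-blob conversion, and the { landmarks } result shape are assumptions here, not the article's exact code:

import React, { useRef, useState } from "react";
import { StyleSheet, View } from "react-native";
import { Camera } from "expo-camera";
import firebase from "../helpers/firebase_init";
import { callVisionApi } from "../helpers/api/vision";
import LandmarkCamera from "./LandmarkCamera";
import CircleButton from "./CircleButton";
import LandmarkModal from "./LandmarkModal";

const CameraPage = () => {
  const camera = useRef<Camera>(null);
  const [landmarks, setLandmarks] = useState<string[]>([]);
  const [modalVisible, setModalVisible] = useState(false);

  const onTakePicture = async () => {
    if (!camera.current) return;

    // Capture the photo and read it back as a blob for uploading.
    const photo = await camera.current.takePictureAsync();
    const response = await fetch(photo.uri);
    const blob = await response.blob();

    // Store the image under the anonymous user's id in Firebase Storage.
    const uid = firebase.auth().currentUser?.uid;
    const imagePath = `${uid}/${Date.now()}.jpg`;
    await firebase.storage().ref(imagePath).put(blob);

    // Ask the Cloud Function to run the Vision API on the uploaded
    // image. The { landmarks } result shape is an assumption.
    const result = await callVisionApi(imagePath);
    setLandmarks(result.landmarks ?? []);
    setModalVisible(true);
  };

  return (
    <View style={styles.container}>
      <LandmarkCamera ref={camera} />
      <CircleButton onPress={onTakePicture} />
      <LandmarkModal
        visible={modalVisible}
        landmarks={landmarks}
        onClose={() => setModalVisible(false)}
      />
    </View>
  );
};

const styles = StyleSheet.create({
  container: { flex: 1 },
});

export default CameraPage;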

This component has several dependencies. Let’s go through them all so we can get this component working, starting with the LandmarkCamera component.

LandmarkCamera.tsx:
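A sketch of a thin wrapper around expo-camera; forwarding the ref lets CameraPage call takePictureAsync on the underlying Camera:

import React from "react";
import { StyleSheet } from "react-native";
import { Camera } from "expo-camera";

// A thin wrapper around expo-camera. Forwarding the ref lets the
// parent (CameraPage) call takePictureAsync on the underlying Camera.
const LandmarkCamera = React.forwardRef<Camera>((_props, ref) => (
  <Camera ref={ref} style={styles.camera} type={Camera.Constants.Type.back} />
));

const styles = StyleSheet.create({
  camera: { flex: 1 },
});

export default LandmarkCamera;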

Next, add a button to the CameraPage that takes the image.

CircleButton.tsx:
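A sketch of a simple round shutter button; the styling values are just one way to draw it:

import React from "react";
import { StyleSheet, TouchableOpacity } from "react-native";

interface Props {
  onPress: () => void;
}

// A round shutter-style button rendered on top of the camera view.
const CircleButton = ({ onPress }: Props) => (
  <TouchableOpacity style={styles.button} onPress={onPress} />
);

const styles = StyleSheet.create({
  button: {
    position: "absolute",
    bottom: 32,
    alignSelf: "center",
    width: 72,
    height: 72,
    borderRadius: 36,
    borderWidth: 4,
    borderColor: "#ffffff",
    backgroundColor: "rgba(255, 255, 255, 0.4)",
  },
});

export default CircleButton;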

Next up is the LandmarkModal component, which displays the results for the image you took.

LandmarkModal.tsx:
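A sketch of the modal; the landmarks: string[] prop is an assumption made to match the CameraPage sketch above:

import React from "react";
import { Button, Modal, StyleSheet, Text, View } from "react-native";

interface Props {
  visible: boolean;
  landmarks: string[]; // assumed result shape, matching the CameraPage sketch
  onClose: () => void;
}

// Shows the landmark names returned by the Vision API, with an OK
// button that dismisses the modal so another photo can be taken.
const LandmarkModal = ({ visible, landmarks, onClose }: Props) => (
  <Modal visible={visible} transparent animationType="slide">
    <View style={styles.backdrop}>
      <View style={styles.card}>
        {landmarks.length > 0 ? (
          landmarks.map((name) => <Text key={name}>{name}</Text>)
        ) : (
          <Text>No landmark was recognized.</Text>
        )}
        <Button title="OK" onPress={onClose} />
      </View>
    </View>
  </Modal>
);

const styles = StyleSheet.create({
  backdrop: { flex: 1, justifyContent: "center", alignItems: "center" },
  card: { backgroundColor: "#ffffff", padding: 24, borderRadius: 8 },
});

export default LandmarkModal;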

Then we have the vision function, which lives in the helpers folder. Inside the helpers folder, create an api folder and then vision.ts inside it:

api/vision.ts:
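A sketch that assumes the part 1 function is exposed as an HTTPS callable named visionAPI; the name and the { imagePath } payload are assumptions, and if your function is a plain HTTP endpoint a fetch call would do instead:

import firebase from "../firebase_init";

// "visionAPI" and the { imagePath } payload are assumptions; use the
// name and request shape of the Cloud Function you created in part 1.
export const callVisionApi = async (imagePath: string) => {
  const visionApi = firebase.functions().httpsCallable("visionAPI");
  const result = await visionApi({ imagePath });
  return result.data;
};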

This function sends the request to the Cloud Function we built in part 1.

Updating App.tsx

Let’s not forget to update App.tsx, which Expo created in the frontend folder. In this file we will sign in anonymously and request the camera permission from the user.

App.tsx:
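A sketch of App.tsx under the same assumptions as above; Camera.requestPermissionsAsync was the permission API in expo-camera at the time:

import React, { useEffect, useState } from "react";
import { Text, View } from "react-native";
import { Camera } from "expo-camera";
import firebase from "./helpers/firebase_init";
import CameraPage from "./components/CameraPage";

export default function App() {
  const [hasPermission, setHasPermission] = useState(false);

  useEffect(() => {
    (async () => {
      // Sign in anonymously first so the user id is available as early
      // as possible, then ask the user for camera permission.
      await firebase.auth().signInAnonymously();
      const { status } = await Camera.requestPermissionsAsync();
      setHasPermission(status === "granted");
    })();
  }, []);

  if (!hasPermission) {
    return (
      <View>
        <Text>Camera permission is required to use this app.</Text>
      </View>
    );
  }

  return <CameraPage />;
}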

In the useEffect hook we first sign in anonymously and then request camera permissions. We start this process as early as possible so that the user id is available as soon as possible.

And that should cover all code needed by the application.

Wrapping up

To test the application, take a picture of this image of the Eiffel Tower. When snapping the pic, make sure that the Eiffel Tower is fully included in the picture but that nothing else on your screen is. This increases the chances of the landmark being identified.

After the Vision API has validated the image, a modal pops up with the results. Press OK to dismiss it and take a picture of something else.

You should now have a fully functional React Native mobile application that you can take images with. The images are uploaded to Firebase Storage and validated by the Cloud Vision API through the Cloud Function.

Have fun with the app!

Mathias Rahikainen is a Full Stack Developer and Consultant at Acelvia, where we build web and mobile apps for our awesome customers. We’re always interested in taking on new interesting projects. Read more about us at https://www.acelvia.com/en or get in touch via [email protected].