In this article, we cover the second part of our two-part series. We will create a React Native application that uses the Firebase Cloud Function we created in the first part. I will cover building the application in detail; if you want to skip ahead to the finish line, the full code is available here.
In this project we will be using Expo with React Native, and we will go through the setup step by step. If you run into problems, take a look at the Expo docs, which go into more detail. First, install the Expo CLI globally:
npm install -g expo-cli
Make sure you are in the landmark root folder we created in part 1, then initialize the frontend project:
expo init frontend
After running the init command you will be asked to choose a template. Select the blank (TypeScript) option.
You can now start the application by running:
cd frontend
expo start
This opens your default browser with the Expo developer tools view.
You will have to test and run the application on your mobile phone. You can do this through the Expo mobile application by simply scanning the QR code. You can shake the phone to open the developer menu; read more in the Expo docs.
NOTE: Emulators will not work with this application because the expo-camera dependency does not support them.
First make sure that you are in the frontend folder Expo created, and that Expo is not running while you run these commands.
We will be using Expo to install most dependencies since that’s the recommended way to do it.
expo install firebase expo-camera expo-file-system
Go ahead and create a helpers folder within the frontend folder, and within it create firebase_init.ts. Start by coding the firebase_init.ts file.
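The original snippet is not shown here, so below is a minimal sketch of what firebase_init.ts can look like, assuming the v8 firebase SDK that `expo install firebase` pulls in; the config values are placeholders you must replace with your own Firebase project's keys.

```typescript
// helpers/firebase_init.ts — sketch; replace the placeholder config
// with the values from your own Firebase project settings.
import * as firebase from "firebase";

const firebaseConfig = {
  apiKey: "YOUR_API_KEY",
  authDomain: "your-project.firebaseapp.com",
  projectId: "your-project",
  storageBucket: "your-project.appspot.com",
  appId: "YOUR_APP_ID",
};

// Guarding on firebase.apps ensures Firebase is initialized only once,
// even if this module is imported from several places (or hot-reloaded).
export const init = (): void => {
  if (firebase.apps.length === 0) {
    firebase.initializeApp(firebaseConfig);
  }
};
```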
This code should be pretty self-explanatory. One thing to take note of here is that this is a good way to make sure that you are only initializing Firebase once within the project.
In this section we will go through all the components we will be creating for this application. The components all live inside the frontend/components folder. Start by creating the components folder and add the following files to it: CameraPage.tsx, LandmarkCamera.tsx and LandmarkModal.tsx.
Start by setting up the CameraPage component, which is responsible for taking the images and uploading them for validation.
This component has several dependencies. Let's go through them all so we can get this component working, starting with the LandmarkCamera component.
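As the original snippet is not included here, the following is a hedged sketch of a thin LandmarkCamera wrapper around expo-camera; the `onRef` prop name is an assumption, chosen so that CameraPage can trigger captures on the underlying camera.

```typescript
// components/LandmarkCamera.tsx — sketch of a thin expo-camera wrapper.
import * as React from "react";
import { StyleSheet } from "react-native";
import { Camera } from "expo-camera";

interface Props {
  // Lets the parent (CameraPage) hold a reference to the camera
  // so it can call takePictureAsync() on it.
  onRef: (ref: Camera | null) => void;
}

export default function LandmarkCamera({ onRef }: Props) {
  return (
    <Camera
      ref={onRef}
      style={StyleSheet.absoluteFill}
      type={Camera.Constants.Type.back}
    />
  );
}
```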
Next add a button to the CameraPage that will take the image.
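Putting the camera and the button together, a CameraPage along these lines would work; the helper names (`takePicture`, `callVisionApi`) and the modal props are assumptions based on the surrounding text, not the article's exact code.

```typescript
// components/CameraPage.tsx — hedged sketch combining camera, button
// and result modal.
import * as React from "react";
import { useRef, useState } from "react";
import { View, Button, StyleSheet } from "react-native";
import { Camera } from "expo-camera";
import LandmarkCamera from "./LandmarkCamera";
import LandmarkModal from "./LandmarkModal";
import { callVisionApi } from "../helpers/api/vision";

export default function CameraPage() {
  const camera = useRef<Camera | null>(null);
  const [results, setResults] = useState<string[]>([]);
  const [modalVisible, setModalVisible] = useState(false);

  // Capture a photo, send it for validation, then show the results.
  const takePicture = async () => {
    if (!camera.current) return;
    const photo = await camera.current.takePictureAsync();
    const landmarks = await callVisionApi(photo.uri);
    setResults(landmarks);
    setModalVisible(true);
  };

  return (
    <View style={StyleSheet.absoluteFill}>
      <LandmarkCamera onRef={(ref) => (camera.current = ref)} />
      <Button title="Take picture" onPress={takePicture} />
      <LandmarkModal
        visible={modalVisible}
        results={results}
        onClose={() => setModalVisible(false)}
      />
    </View>
  );
}
```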
Next up is the LandmarkModal component, which will display the results for the taken image.
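A minimal sketch of such a modal is shown below; the `visible`/`results`/`onClose` props are assumptions that match how CameraPage is described, and the styling is deliberately bare.

```typescript
// components/LandmarkModal.tsx — sketch of the results modal.
import * as React from "react";
import { Modal, View, Text, Button } from "react-native";

interface Props {
  visible: boolean;
  results: string[]; // landmark names returned by the Cloud Function
  onClose: () => void;
}

export default function LandmarkModal({ visible, results, onClose }: Props) {
  return (
    <Modal visible={visible} transparent animationType="slide">
      <View style={{ margin: 32, padding: 16, backgroundColor: "white" }}>
        <Text>Detected landmarks:</Text>
        {results.map((name) => (
          <Text key={name}>{name}</Text>
        ))}
        {/* Pressing OK dismisses the modal so another picture can be taken. */}
        <Button title="OK" onPress={onClose} />
      </View>
    </Modal>
  );
}
```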
Then we have the vision function, which lives in the helpers folder. Inside the helpers folder create an api folder, and then vision.ts inside it:
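One way to implement it is sketched below: read the captured image as base64 with expo-file-system and POST it to the Cloud Function from part 1. The endpoint URL, request body and response shape (`{ landmarks: string[] }`) are assumptions, so adapt them to whatever your Cloud Function actually expects.

```typescript
// helpers/api/vision.ts — hedged sketch of the request to the
// Cloud Function created in part 1.
import * as FileSystem from "expo-file-system";

// Replace with the URL of your deployed Cloud Function.
const VISION_FUNCTION_URL =
  "https://<region>-<project>.cloudfunctions.net/vision";

export async function callVisionApi(imageUri: string): Promise<string[]> {
  // Read the captured image from disk as base64 so it can be sent as JSON.
  const base64 = await FileSystem.readAsStringAsync(imageUri, {
    encoding: FileSystem.EncodingType.Base64,
  });

  const response = await fetch(VISION_FUNCTION_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ image: base64 }),
  });

  if (!response.ok) {
    throw new Error(`Vision request failed: ${response.status}`);
  }

  // Assumed response shape: { landmarks: string[] }
  const json = await response.json();
  return json.landmarks ?? [];
}
```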
This function will send the request to the cloud function.
Let's not forget to update App.tsx, which Expo created in the frontend folder. In this file we sign in and request the camera permission from the user.
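A sketch of that flow follows, assuming the v8 firebase SDK and the `init` helper from firebase_init.ts; `Camera.requestPermissionsAsync` is the permission API from the expo-camera versions of that era.

```typescript
// App.tsx — hedged sketch of the sign-in and permission flow.
import * as React from "react";
import { useEffect, useState } from "react";
import { Text } from "react-native";
import { Camera } from "expo-camera";
import * as firebase from "firebase";
import { init } from "./helpers/firebase_init";
import CameraPage from "./components/CameraPage";

export default function App() {
  const [ready, setReady] = useState(false);

  useEffect(() => {
    (async () => {
      init();
      // Sign in anonymously first so we get a user id as early as possible.
      await firebase.auth().signInAnonymously();
      // Then ask the user for camera access.
      const { status } = await Camera.requestPermissionsAsync();
      setReady(status === "granted");
    })();
  }, []);

  return ready ? <CameraPage /> : <Text>Waiting for camera permission…</Text>;
}
```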
In the useEffect hook we first sign in anonymously and then request camera permissions. We start this process as early as possible so that we get the user id as quickly as possible.
And that should cover all code needed by the application.
To test the application, take a picture of this image of the Eiffel Tower. When snapping the picture, make sure the Eiffel Tower is fully in the frame and nothing else on your screen is; this increases the chances of the landmark being identified.
After the Vision API has validated the image, a modal should pop up with the results. Press OK to dismiss it and take a picture of something else.
You should now have a fully functional React Native mobile application that you can take pictures with. The images are pushed to Firebase Storage and validated by the Cloud Vision API through the Cloud Function.
Have fun with the app!
Mathias Rahikainen is a Full Stack Developer and Consultant at Acelvia, where we build web and mobile apps for our awesome customers. We’re always interested in taking on new interesting projects. Read more about us at https://www.acelvia.com/en or get in touch via [email protected].