As a software developer, I spend my time building websites and web apps, but for a long time I've also had a side interest in virtual reality. I've named my Oculus Go "Betty" and get giddy talking about experiencing gondola rides through Venice, hopping on breathtaking roller coasters, and traveling through veins in the human body. Since I mainly use React, I was excited to learn I could develop virtual reality experiences with a library I already know and love.
To try my hand at React VR, I recently created a virtual reality application called Find Your Zen, which allows the user to select an immersive meditation environment, each of which comes with its own mantra inspired by the very excellent show “The Good Place.” In May 2018, shortly after I built my app, Facebook released a revamped and rebranded version of React VR called React 360 with multiple changes and significant improvements.
As I ported my application over to React 360, I took note of some important differences between React VR and React 360. I wrote the following article for developers who possess a working knowledge of React. If you’re unfamiliar with the library, I recommend starting here first.
If you want an introduction to React VR (as well as Recompose, whose utility functions help manage my application’s state), you can find that here and here.
Viewing the finished demo code
$ git clone https://github.com/lilybarrett/find-your-zen.git
$ cd find-your-zen
$ npm i
$ npm start
File structure
The basic file structure for React VR was as follows:

- index.vr.js = entry point for my app
- vr folder = stores the code that launches my app, including the index.html and client.js files
- static_assets = stores images, audio files, and other external resources

And here's the new file structure for React 360:
- index.js = entry point for my app
- client.js = sets up the "runtime," which turns my React components into 3D elements in our VR landscape
- index.html = as in a typical React application, provides a place for me to mount my React code
- static_assets = stores images, audio files, and other external resources

I set up the rest of my folder structure as follows:
- components // shared components
  - base-button
  - content
- consts
- providers // Recompose providers live here
- scenes
  - home-environment
    - components
      - menu
      - title
      - zen-button
      - zens
  - zen-environment
    - components
      - home-button
      - mantra
- static_assets
  - images
  - sounds
Shared components live in the top-level components folder. Stored in scenes, my HomeEnvironment — the first environment to load, where my user accesses a menu of meditation environments to explore — and ZenEnvironment scenes each have their own sets of relevant components. My state management is handled by Recompose providers and functionally composed into each component that needs access to state.
Mounting the app
In React VR, my client.js was pretty simple and didn't give me too many configuration options:

// React VR application -- vr/client.js
// Auto-generated content.
// This file contains the boilerplate to set up your React app.
// If you want to modify your application, start in "index.vr.js"

import { VRInstance } from "react-vr-web";

function init(bundle, parent, options) {
  // the app name must match the component registered in index.vr.js
  const vr = new VRInstance(bundle, "FindYourZen", parent, {
    // Add custom options here
    ...options,
  });
  vr.render = function() {
    // Any custom behavior you want to perform on each frame goes here
  };
  // Begin the animation loop
  vr.start();
  return vr;
}

window.ReactVR = { init };
In React 360, I can mount my application's content to a surface or a location. Surfaces, as the docs say, "allow you to add 2D interfaces in 3D space, letting you work in pixels instead of physical dimensions." In my case, I wrap the visual content of my application in an AppContent component, which I mount to React 360's default cylindrical surface. This surface projects the content onto the inside of a cylinder — centered in front of the user — with a 4-meter radius.
I can create my own custom surfaces in React 360, increasing or decreasing the radius or making the surface flat rather than cylindrical.
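For instance, here's a minimal sketch of what a flat custom surface might look like in client.js; the "SidePanel" root, its pixel dimensions, and its angle are hypothetical:

// client.js -- sketch of mounting content to a custom flat surface
import { ReactInstance, Surface } from "react-360-web";

function init(bundle, parent, options = {}) {
  const r360 = new ReactInstance(bundle, parent, { ...options });

  // a flat 600 x 600-pixel panel instead of the default cylinder
  const sidePanel = new Surface(600, 600, Surface.SurfaceShape.Flat);

  // rotate it 30 degrees to the user's left (angles are in radians)
  sidePanel.setAngle(-Math.PI / 6, 0);

  r360.renderToSurface(r360.createRoot("SidePanel"), sidePanel);
}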
I also mount the entire app itself to React 360’s default location, which allows my app to take advantage of React 360’s runtime.
The new runtime is one of React 360's significant advantages over React VR. Why? Separating the rendering or "runtime" aspects of the application from the application code reduces latency: the time between a user action and the moment the pixels in the view update in response to that action. If the data transfer is too slow, the result is a choppy, disorienting view for the user — similar to buffering on a YouTube video or static on a television screen.
As the React 360 docs further explain, web browsers are single-threaded, which means that as parts of the app update behind the scenes, those updates could block or delay data transfer. "This is especially problematic for users viewing your 360 experience on a VR headset, where significant rendering latency can break the sense of immersion," the docs tell us. "By running your app code in a separate context, we allow the rendering loop to consistently update at a high frame rate."
In my index.js, I register my MeditationApp (see the second code block below) to mount to the default location — giving my entire application access to the runtime — while I register the content I want to display (again, stored in AppContent) to the default cylindrical surface.
// components/content.js
import React from "react";
import { View } from "react-360";
import HomeEnvironment from "../scenes/home-environment";
import ZenEnvironment from "../scenes/zen-environment";
import withAppContext from "../providers/withAppContext";

const AppContent = withAppContext(() => (
  <View>
    <HomeEnvironment />
    <ZenEnvironment />
  </View>
));

export default AppContent;
// index.js
import React from "react";
import { AppRegistry, View } from "react-360";
import withAppContext from "./providers/withAppContext";
import AppContent from "./components/content";

const MeditationApp = withAppContext(() => (
  <View style={{ transform: [{ translate: [0, 0, -2] }] }}>
    <AppContent />
  </View>
));

AppRegistry.registerComponent("AppContent", () => AppContent);
AppRegistry.registerComponent("MeditationApp", () => MeditationApp);
My client.js deals with mounting my components to locations and surfaces:
// client.js
import { ReactInstance } from "react-360-web";

function init(bundle, parent, options = {}) {
  const r360 = new ReactInstance(bundle, parent, {
    // Add custom options here
    fullScreen: true,
    ...options,
  });

  // mount the visual content to the default cylindrical surface
  r360.renderToSurface(
    r360.createRoot("AppContent", { /* initial props */ }),
    r360.getDefaultSurface()
  );

  // mount the app itself to the default location, giving it access to the runtime
  r360.renderToLocation(
    r360.createRoot("MeditationApp", { /* initial props */ }),
    r360.getDefaultLocation()
  );

  // set the initial 360-degree background image
  r360.compositor.setBackground(
    r360.getAssetURL("images/homebase.png")
  );
}

window.React360 = { init };
Playing audio
In my consts folder, I created a zens.js file in which to store my data — including the correct audio file and image — for each environment:
const zens = [
  {
    id: 1,
    mantra: "Find your inner motherforking peace",
    image: "images/hawaii_beach.jpg",
    audio: "sounds/waves.mp3",
    text: "I'm feeling beachy keen",
  },
  {
    id: 2,
    mantra: "Breathe in peace, breathe out bullshirt",
    image: "images/horseshoe_bend.jpg",
    audio: "sounds/birds.mp3",
    text: "Ain't no mountain high enough",
  },
  {
    id: 3,
    mantra: "Benches will be benches",
    image: "images/sunrise_paris_2.jpg",
    audio: "sounds/chimes.mp3",
    text: "I want a baguette",
  },
  {
    id: 4,
    image: "images/homebase.png",
    text: "Home",
  },
];
export default zens;
To play audio in my React VR scenes, I used a Sound component, which took in a URL for a sound file in the static_assets folder as a source prop. To prevent audio from playing in environments where it didn't belong — such as the home environment — I implemented logic via Recompose for "hiding" and "showing" the Sound component based on whether the current environment had an audio file associated with it.
// React VR -- components/audio.js
import React from "react";
import { Sound, asset } from "react-vr";
import { branch, renderNothing } from "recompose";
import zens from "../consts/zens";

// render nothing when the predicate is true
const hideIf = (predicate) => branch(predicate, renderNothing);

const hideIfNoAudioUrl = hideIf(({ selectedZen }) => {
  const zenAudio = zens[selectedZen - 1].audio;
  return !zenAudio;
});

const Audio = hideIfNoAudioUrl(({ selectedZen }) => (
  <Sound source={asset(zens[selectedZen - 1].audio)} />
));

export default Audio;
React 360 greatly improves upon this. For playing audio, I use the AudioModule Native Module. Its playEnvironmental method allows me to provide a path (to the audio in our assets folder) and a volume at which to play said audio on a loop: once the audio file stops playing, it starts again.
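The call looks like this (the volume value is just an example):

// play looping environmental audio from static_assets
import { NativeModules, asset } from "react-360";

const { AudioModule } = NativeModules;

AudioModule.playEnvironmental({
  source: asset("sounds/waves.mp3"),
  volume: 0.3,
});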
Along the way, I realized I needed to tell my application to stop playing a particular audio file when switching scenes. (Otherwise, while immersed in Find Your Zen, you may wind up listening to audio from your previous environment — i.e., church bells in a city square in Paris — after you navigate back to the home environment.) I accomplish this with the AudioModule's stopEnvironmental method.
Keep reading to see this in action…
Using images
In React VR, I used a Pano component to display a 360-degree photo. To display a specific image, Pano, like Sound, took in an asset URL as a source prop. Based on which environment the user selected, the app's state updated to display the image for that environment.
// React VR -- components/wrapped-pano.js
import React from "react";
import { Pano, asset } from "react-vr";
import zens from "../consts/zens";

// displays the 360-degree image for the currently selected environment
const WrappedPano = ({ selectedZen }) => (
  <Pano source={asset(zens[selectedZen - 1].image)} />
);

export default WrappedPano;
You may have noticed that, in my React 360 application's client.js, I write the following line after rendering my application's components:
r360.compositor.setBackground(r360.getAssetURL("images/homebase.png"));
This line of code, which sets the background image as soon as the app is first mounted, uses React 360's getAssetURL utility to automatically look inside my static_assets folder for the correct image.
That's all well and good, but I still want to change the image based on which environment the user selects. Thankfully, I can handle dynamic images from within a React event by using React 360's Environment module. Here's some sample usage:
Environment.setBackgroundImage(asset(someImage));
To pull it all together, here's how I dynamically set my background image and audio based on which environment the user selects, using Recompose's withState and withHandlers functions:
// providers/withStateAndHandlers.js
import { compose, withState, withHandlers } from "recompose";
import { Environment, NativeModules, asset } from "react-360";
import zens from "../consts/zens";

const { AudioModule } = NativeModules;

const withStateAndHandlers = compose(
  // zen 4 is the home environment, so it's the initial state
  withState("selectedZen", "zenClicked", 4),
  withHandlers({
    zenClicked: (props) => (id, evt) => {
      Environment.setBackgroundImage(asset(zens[id - 1].image));
      if (zens[id - 1].audio) {
        AudioModule.playEnvironmental({
          source: asset(zens[id - 1].audio),
          volume: 0.3,
        });
      } else {
        // environments without audio (i.e., home) silence the previous scene
        AudioModule.stopEnvironmental();
      }
      props.zenClicked(() => id);
    },
  })
);

export default withStateAndHandlers;
Styling the app
React 360, like React VR, uses Flexbox to easily adapt the application's layout to any display, whether it be a laptop's web browser, a phone screen, or a VR headset. However, for parts of the application mounted to a location — like MeditationApp in my case — React 360 switches from Flexbox layout to a three-dimensional, meter-based coordinate system. That's why you see this code in my index.js:
// index.js
// other code goes here

const MeditationApp = withAppContext(() => (
  <View style={{ transform: [{ translate: [0, 0, -2] }] }}>
    <AppContent />
  </View>
));
// other code goes here
The values passed to translate are x, y, and z, in that order. x represents an offset to the user's right; y represents an offset upward or downward; and z represents depth, with negative values moving the object away from the user.
In the example above, the View should sit centered and 2 perceived meters ahead of the user. Transforms are all positioned relative to their parents.
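For instance, in this illustrative sketch, the inner View inherits its parent's translation and winds up a perceived 3 meters away:

// translations compose down the tree: -2 meters from the outer View,
// plus -1 more from the inner one
<View style={{ transform: [{ translate: [0, 0, -2] }] }}>
  <View style={{ transform: [{ translate: [0, 0, -1] }] }}>
    <Text>Hello from 3 meters away</Text>
  </View>
</View>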
Practices that worked well for me
StyleSheets
React 360 ships the StyleSheet API familiar from React Native, which allows me to use JavaScript to style my React components. See my code below:
// scenes/home-environment/components/zen-button/style.js
import { StyleSheet } from "react-360";

// representative styles for the zen selection button
export default StyleSheet.create({
  zenButton: {
    backgroundColor: "#29ECCE",
    borderRadius: 10,
    margin: 10,
    padding: 10,
  },
  zenText: {
    color: "#FFFFFF",
    fontSize: 40,
    textAlign: "center",
  },
});
Here, I create and export a StyleSheet object that allows me to reference styles in a terse, DRY manner in my component itself.
// scenes/home-environment/components/zen-button/index.js
import React from "react";
import { View, Text, VrButton } from "react-360";
import styles from "./style";

const ZenButton = ({ text, buttonClick, selectedZen }) => {
  return (
    <VrButton onClick={(evt) => buttonClick(selectedZen, evt)}>
      <View style={styles.zenButton}>
        <Text style={styles.zenText}>{text}</Text>
      </View>
    </VrButton>
  );
};

export default ZenButton;
State management
Because, at the end of the day, this is still just React, you can approach handling state in the same way you would in a typical React application: Redux, Recompose, MobX, etc. I chose to use Recompose because I love how it allows me to build functional components. As mentioned earlier, I wrote some posts about Recompose in the context of React VR, which you can find here and here. I did not need to change anything about my state management approach when porting my application from React VR over to React 360.
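To give a flavor of the approach, wiring a component up to state is just function application. Here's a minimal sketch, assuming the withStateAndHandlers provider above and a hypothetical Menu component:

// the wrapped Menu receives selectedZen and zenClicked as props
import withStateAndHandlers from "../providers/withStateAndHandlers";
import Menu from "./menu";

export default withStateAndHandlers(Menu);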
Debugging React 360
When you Inspect Element on the application, you'll see that React 360 bundles all its files into one giant blob that isn't super easy to grok. Fortunately, because React 360 supports sourcemaps, we can still access the original files, use debugger statements, and so on.
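For example, I can drop a debugger statement into one of my handlers, and the browser's dev tools will pause in the original source file rather than in the bundled blob:

// providers/withStateAndHandlers.js -- pausing inside the click handler
zenClicked: (props) => (id, evt) => {
  debugger; // dev tools stop here and show the original file, thanks to sourcemaps
  Environment.setBackgroundImage(asset(zens[id - 1].image));
  // ...
},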