When building Sketch Elements for React Native, it was always clear that this would be the most exciting part of the project: implementing Instagram-style filters and photo editing. The first part requires Expo's OpenGL support, and the second part uses the React Native animation and pan-handling APIs to provide photo editing capabilities at 60 fps.
Parts of this UI kit are live-coded on YouTube.
There are many libraries that can be used with the GLView component from Expo. Pixi.js is one of them; people seem to be using it to do some really interesting things, and its API really fits the use case of building image filters. But the library that really got my attention is gl-react. I did feel some national pride when realizing that this library was built by a fellow Frenchman. The best introduction to gl-react is the talk by its creator, Gaetan Renaudeau, at React.js Conf. I'm a complete beginner with OpenGL, and gl-react enabled me to build something quickly anyway. It definitely made me hungry to learn more about the topic.
Gl-react offers a really efficient paradigm for integrating React and OpenGL. The library lets you easily build scenes using React composability. For instance, consider an image with two filters, Brannan and Valencia: it can be expressed as below. The GLImage implementation is based on gl-react-image.
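A minimal sketch of that composition, assuming Brannan and Valencia are filter components wrapping gl-react Nodes and GLImage is the gl-react-image adaptation mentioned above; the file paths and prop names are hypothetical:

```jsx
import React from "react";
import { Surface } from "gl-react-expo";

// Hypothetical local modules: filter components and the GLImage adaptation
import GLImage from "./GLImage";
import { Brannan, Valencia } from "./Filters";

// The filters compose like any other React components:
// the inner node becomes the texture of the outer one.
const FilteredImage = ({ uri, width, height }) => (
  <Surface style={{ width, height }}>
    <Brannan on>
      <Valencia on>
        <GLImage source={{ uri }} {...{ width, height }} />
      </Valencia>
    </Brannan>
  </Surface>
);

export default FilteredImage;
```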
In the current implementation of GLImage, it takes a few seconds for the scene to be drawn, and screen touches won't be responsive while this happens. So in order to provide a decent UX, we display a loading indicator while the scene is computed, and we use the onDraw property to remove it. Judging by some of the OpenGL issues fixed in Expo 26, there is a good chance it will be much faster in SDK 26.
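Below is a sketch of that pattern; I'm assuming here that onDraw is passed down to the Surface and fires once the scene has actually been drawn, and the component and prop names are mine:

```jsx
import React from "react";
import { View, ActivityIndicator, StyleSheet } from "react-native";
import { Surface } from "gl-react-expo";

export default class LoadingSurface extends React.Component {
  state = { loading: true };

  // Called once the GL scene has been drawn: hide the indicator
  onDraw = () => this.setState({ loading: false });

  render() {
    const { width, height, children } = this.props;
    return (
      <View style={{ width, height }}>
        <Surface style={{ width, height }} onDraw={this.onDraw}>
          {children}
        </Surface>
        {this.state.loading && (
          <ActivityIndicator size="large" style={StyleSheet.absoluteFill} />
        )}
      </View>
    );
  }
}
```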
UPDATE: I just upgraded to SDK 26 and I can confirm that the performance improvements are substantial.
Each filter uses the on property to find out whether it should be applied or not. Below is an example of what such a filter looks like.
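A sketch of such a filter using the gl-react v3 API; the actual Brannan and Valencia shaders are more involved, so a simple sepia matrix stands in here as a hypothetical example:

```jsx
import React from "react";
import { Shaders, Node, GLSL } from "gl-react";

const shaders = Shaders.create({
  sepia: {
    frag: GLSL`
precision highp float;
varying vec2 uv;
uniform sampler2D t;
uniform bool on;

void main() {
  vec4 c = texture2D(t, uv);
  if (on) {
    // Classic sepia color matrix
    gl_FragColor = vec4(
      dot(c.rgb, vec3(0.393, 0.769, 0.189)),
      dot(c.rgb, vec3(0.349, 0.686, 0.168)),
      dot(c.rgb, vec3(0.272, 0.534, 0.131)),
      c.a
    );
  } else {
    // Filter is off: pass the texture through untouched
    gl_FragColor = c;
  }
}`
  }
});

// The children of the filter become its input texture `t`
export const Sepia = ({ on, children: t }) => (
  <Node shader={shaders.sepia} uniforms={{ t, on }} />
);
```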
All animations in React Native Sketch Elements use the React Native native driver in order to guarantee a super smooth user experience. The rotation slider is implemented using a scroll view on top of a linear gradient. Below is the linear gradient.
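A sketch of the gradient backing the slider, using Expo's LinearGradient; the colors are placeholders, and the start/end points run horizontally to match the scroll direction:

```jsx
import React from "react";
import { StyleSheet } from "react-native";
import { LinearGradient } from "expo";

const SliderBackground = () => (
  <LinearGradient
    colors={["#1e1e1e", "#4a4a4a"]}
    start={[0, 0.5]}
    end={[1, 0.5]}
    style={StyleSheet.absoluteFill}
  />
);

export default SliderBackground;
```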
And then we place a scroll view on top of it. The scroll view contains white rectangles separated from each other by a thin margin. Below is how it looks.
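A sketch of those tick marks; the tick count and dimensions are placeholders:

```jsx
import React from "react";
import { ScrollView, View, StyleSheet } from "react-native";

const Ticks = () => (
  <ScrollView horizontal showsHorizontalScrollIndicator={false}>
    {new Array(72).fill(0).map((_, index) => (
      // White rectangles separated by a thin margin
      <View key={index} style={styles.tick} />
    ))}
  </ScrollView>
);

const styles = StyleSheet.create({
  tick: {
    width: 6,
    height: 30,
    marginRight: 2,
    backgroundColor: "white"
  }
});

export default Ticks;
```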
Finally, we bind a scroll animation value to the onScroll property, and we use it to interpolate the rotation value that is passed to the transform property.
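A sketch of that wiring; I'm assuming here that the full scroll range maps to ±45 degrees, and sliderWidth and the output range are placeholders:

```jsx
import { Animated, Dimensions } from "react-native";

const { width: sliderWidth } = Dimensions.get("window");
const x = new Animated.Value(0);

// Map the scroll offset into the animated value, on the native thread
const onScroll = Animated.event(
  [{ nativeEvent: { contentOffset: { x } } }],
  { useNativeDriver: true }
);

// Interpolate the scroll position into a rotation
const rotate = x.interpolate({
  inputRange: [0, sliderWidth],
  outputRange: ["-45deg", "45deg"],
  extrapolate: "clamp"
});

// Used as:
// <Animated.ScrollView horizontal scrollEventThrottle={1} {...{ onScroll }}>
//   ...ticks...
// </Animated.ScrollView>
// <Animated.Image style={{ transform: [{ rotate }] }} source={source} />
```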
For displaying the degrees of the rotation, we add a listener to the animated value. In order not to drop any frames when rotating the image, we do this in a subcomponent, which prevents the scroll view or the picture from being re-rendered while rotating. Below is how it looks.
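A sketch of such a subcomponent; the prop names and the offset-to-degrees mapping (matching the ±45° range assumed above) are mine:

```jsx
import React from "react";
import { Text } from "react-native";

export default class RotationLabel extends React.Component {
  state = { degrees: 0 };

  componentDidMount() {
    const { animatedValue, sliderWidth } = this.props;
    // Only this small component re-renders as the value changes
    this.listener = animatedValue.addListener(({ value }) =>
      this.setState({
        degrees: Math.round((value / sliderWidth) * 90 - 45)
      })
    );
  }

  componentWillUnmount() {
    this.props.animatedValue.removeListener(this.listener);
  }

  render() {
    return <Text style={this.props.style}>{`${this.state.degrees}°`}</Text>;
  }
}
```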
The image cropping is done using the React Native gesture responder system. It was my very first use of the PanResponder API; after reading the documentation and checking out this great example, it was straightforward to implement. The crop component is positioned using absolute coordinates, and each corner has its own pan responder. Translating the x and y coordinates of a corner gesture into an absolute position is a bit of a tough exercise for the brain, especially when computing the boundaries of the gesture: the corners cannot go outside the picture, a right corner cannot have an x coordinate lower than a left corner, and so on. But if you structure your code well, it is nicely doable.
Finally, once the absolute position is computed, we use setNativeProps to set the crop position without triggering a new render from React. Below is how the code looks for the top-left corner.
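A sketch of the top-left corner handle under those constraints; the prop names (bounds, size, onChange) are hypothetical:

```jsx
import React from "react";
import { View, PanResponder } from "react-native";

const clamp = (value, min, max) => Math.min(Math.max(value, min), max);

export default class TopLeftCorner extends React.Component {
  panResponder = PanResponder.create({
    onStartShouldSetPanResponder: () => true,
    onPanResponderGrant: () => {
      // Remember where the corner was when the gesture started
      const { x, y } = this.props;
      this.start = { x, y };
    },
    onPanResponderMove: (event, { dx, dy }) => {
      const { bounds, size, onChange } = this.props;
      // The corner cannot leave the picture and cannot cross
      // the opposite (bottom-right) corner of the crop frame
      const x = clamp(this.start.x + dx, bounds.minX, bounds.maxX - size);
      const y = clamp(this.start.y + dy, bounds.minY, bounds.maxY - size);
      // Update the native view directly, without a React re-render
      this.handle.setNativeProps({ style: { left: x, top: y } });
      onChange({ x, y });
    }
  });

  render() {
    const { x, y, size } = this.props;
    return (
      <View
        ref={ref => (this.handle = ref)}
        style={{
          position: "absolute",
          left: x,
          top: y,
          width: size,
          height: size
        }}
        {...this.panResponder.panHandlers}
      />
    );
  }
}
```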
Hopefully you will find these small recipes useful; I'm looking forward to reading your feedback. And in the meantime: Happy Hacking 🎉