As a frontend developer you can never go more than a year without being tasked with building a dashboard of some sort. This year it’s a real-time dashboard, providing live performance insights throughout the day.
From a technology perspective we only had one constraint: the data must come from Elasticsearch; the rest was up for grabs. So grab a fair trade, organic, made-with-love tea and let’s go on a nerdy adventure…
Hold on, what’s Elasticsearch (ES)? Simply put, it’s a RESTful API that sits on top of a search and analytics engine. At its core it stores your data and provides a way of accessing it through queries. For the purposes of this project, all we care about is asking ES for some data and the correct data being returned.
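To make that concrete, here’s a minimal sketch of what talking to ES looks like. The host, index and field names are made up for illustration; the point is that a search is just an HTTP POST with a JSON body.

```javascript
// Hypothetical example: the index ('articles'), field ('section') and host
// are assumptions, not our real setup.
const searchBody = {
  query: { match: { section: 'sport' } }, // find documents about sport
  size: 5, // return at most five hits
};

// Elasticsearch exposes search as a REST endpoint: POST /<index>/_search
async function search(host, index, body) {
  const res = await fetch(`${host}/${index}/_search`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });
  return res.json(); // matching documents live under hits.hits
}
```

(`fetch` assumes Node 18+ or a polyfill; any HTTP client works the same way.)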
Scope: the dashboard will be made up of two counters, two lists of ten articles and a bar chart. Widgets of the same type will vary in either the kind of data shown or the timeframe.
In order to keep our ES queries simple and quick to process, each will only return an ID and a related stat. We’ll then be able to use this ID to query a faster and more comprehensive API.
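As a sketch of what such a query might look like (field and aggregation names here are hypothetical), a terms aggregation keeps the ES response down to just IDs and counts, ready to be enriched through the faster API:

```javascript
// size: 0 means ES returns no raw documents, only aggregation buckets.
const topArticlesQuery = {
  size: 0,
  aggs: {
    top_articles: {
      terms: { field: 'articleId', size: 10 }, // ten most frequent article IDs
    },
  },
};

// ES answers with buckets of { key, doc_count }; map them to { id, stat }
// pairs ready for the follow-up lookup against the article API.
function toIdStatPairs(esResponse) {
  return esResponse.aggregations.top_articles.buckets.map((bucket) => ({
    id: bucket.key,
    stat: bucket.doc_count,
  }));
}
```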
The first decision was whether the frontend should make calls directly to ES or whether there should be an API in the middle. An API would allow us to manage the load on the ES cluster and create a more scalable solution. Imagine 10+ clients each asking ES to run queries: things are going to slow down and cost a lot!
With this in mind we now needed an API that could fetch all the necessary data and cache the responses. A simple Node.js app can achieve this for us; however, we’ll then need to expose the data to the frontend through an API. This is where another debate came in: REST vs GraphQL. I’m not going to go into the details here as there are plenty of articles online. For us this presented a great opportunity to learn GraphQL, but also to simplify the frontend.
The web app is purely powered by the data from this API; other than fetching data there is minimal application state. By opting for GraphQL and pairing it with Apollo Client we were able to eliminate the need for a state management system like Redux. With this structure we are simply able to wrap our components in higher-order components that inject the fetch state (loading, error, data) and some other useful functions.
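A sketch of that wiring, using react-apollo’s graphql() helper (the query and component names here are hypothetical):

```javascript
import React from 'react';
import { graphql } from 'react-apollo';
import gql from 'graphql-tag';

// Hypothetical query; field names are for illustration only.
const TOP_ARTICLES = gql`
  query TopArticles {
    topArticles {
      id
      title
      stat
    }
  }
`;

// The injected `data` prop carries loading, error and the resolved fields.
const ArticleList = ({ data: { loading, error, topArticles } }) => {
  if (loading) return <p>Loading…</p>;
  if (error) return <p>Something went wrong.</p>;
  return (
    <ul>
      {topArticles.map(({ id, title, stat }) => (
        <li key={id}>{title} ({stat})</li>
      ))}
    </ul>
  );
};

// graphql() is the higher-order component: Apollo Client does the fetching
// and injects the state as props, no Redux required.
export default graphql(TOP_ARTICLES)(ArticleList);
```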
To summarise, we have a React web app with three core components, each of which can be injected with data through Apollo Client. Apollo gets this data by querying a GraphQL server that simply resolves data it holds in memory from previous calls to ES and another API. Something like the below:
This tutorial has everything you need to build a basic GraphQL server. Using this as a base we were then able to make queries which resolve data directly from ES.
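In sketch form, the server’s contract is a schema plus a resolver map. The type and field names below are assumptions for illustration, and the ES call is stubbed out:

```javascript
// Stand-in for the real ES call described earlier; in the actual server
// this would POST a query to the index's _search endpoint.
async function queryElasticsearch(queryName) {
  return null; // stubbed for the sketch
}

// Schema in GraphQL SDL: counters, article lists and chart data.
const typeDefs = `
  type Article {
    id: ID!
    title: String
    stat: Int
  }

  type Query {
    pageViews: Int
    topArticles: [Article]
  }
`;

// Each Query field resolves by running the matching ES query.
const resolvers = {
  Query: {
    pageViews: () => queryElasticsearch('page_views'),
    topArticles: () => queryElasticsearch('top_articles'),
  },
};
```

The schema and resolvers then get handed to whatever GraphQL server library you’re using (the tutorial linked above walks through that part).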
Moving forward we needed to cache this data and then resolve it upon request. To do this we have an object called responses which is populated with data every 30 seconds from the ES queries; this runs as a continuous process on the server. The GraphQL resolvers simply return responses.theDataItNeeds.
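That polling-plus-cache loop looks roughly like this (a sketch with hypothetical names; the batch of real ES queries is stubbed out):

```javascript
// Stand-in for the batch of ES queries; the real version POSTs to _search.
async function runEsQueries() {
  return { pageViews: 42, topArticles: [{ id: 'a1', stat: 7 }] };
}

// One shared in-memory cache, refreshed on an interval.
const responses = {};

async function refresh() {
  Object.assign(responses, await runEsQueries());
}

// Kick off the continuous process; 30s matches the polling window above.
function startPolling(intervalMs = 30000) {
  refresh();
  return setInterval(refresh, intervalMs);
}

// Resolvers never hit ES on request; they just read the cache.
const resolvers = {
  Query: {
    pageViews: () => responses.pageViews,
    topArticles: () => responses.topArticles,
  },
};
```

This is what keeps the ES cluster happy: however many clients connect, ES only ever sees one set of queries every 30 seconds.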
As an extra I’ve also added subscriptions. These allow the web app to fetch the data it needs with an initial query and then simply subscribe to any changes. This saves each component re-fetching data every 30 seconds, avoids sending a whole new response every time and ensures everything stays in sync. It also means everything up until the REST section in the drawing above is real-time, giving us the option to tweak our other API calls to make the data as close to real-time as possible.
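The core idea is change detection on the polling loop: only publish when the data actually changed. Here’s a sketch using a minimal hand-rolled pub/sub (a real server would use something like graphql-subscriptions’ PubSub over WebSockets; all names below are illustrative):

```javascript
// Tiny pub/sub: every subscriber gets every published payload.
const listeners = new Set();
const pubsub = {
  publish: (payload) => listeners.forEach((fn) => fn(payload)),
  subscribe: (fn) => {
    listeners.add(fn);
    return () => listeners.delete(fn); // unsubscribe
  },
};

let previous = null;

// Called after each 30-second ES refresh: only publish when the data
// actually changed, so clients stay in sync without redundant payloads.
function onRefresh(newData) {
  if (JSON.stringify(newData) !== JSON.stringify(previous)) {
    previous = newData;
    pubsub.publish({ topArticlesUpdated: newData });
  }
}
```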
We’re using React to build our view and bootstrapped the project with a boilerplate I previously made (more details here). However, we removed Redux as discussed above and added Flow for type checking in an effort to ensure code quality.
The dashboard’s index page is built up of multiple components; each of these can take custom data by wrapping it in the desired GraphQL higher-order component.
We’re utilising our continuous integration and deployment setup powered by Jenkins, Docker and AWS. Jenkins allows us to build our project remotely while Docker ensures our environments are consistent and easily deployable both to our local machines for testing and AWS for production.
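For flavour, the kind of Dockerfile this setup tends to use looks something like the sketch below. The base image, port and file names are assumptions, not our actual config:

```dockerfile
# Hypothetical sketch, not the project's real Dockerfile.
FROM node:8-alpine
WORKDIR /app

# Install dependencies first so Docker can cache this layer between builds.
COPY package.json package-lock.json ./
RUN npm install --production

COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

The same image runs locally for testing and on AWS in production, which is what keeps the environments consistent.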
With a deployed app and API, that’s a wrap: we’ve created a real-time dashboard sitting on an easily scalable frontend and backend architecture.