A few months ago I came up with the idea of creating a service that would give users exactly the porn they were looking for, whether it had just been uploaded to a service like YouPorn or Tumblr, or it was a specific channel on Reddit. A link collector. This service is now online at thenextporn.com.
My challenge, like every startup challenge, was, in addition to learning new technologies, to make everything work while running as little as possible, but organized so it could scale if it ever became necessary.
The variables to manage when starting a technological project (whether you are a startup or an individual who wants to bring an idea to life) are:
To meet my needs I opted for these technologies:

Yes, I love JavaScript.
The microservices architecture is widespread today and promoted by large companies such as Uber, Google, Airbnb, and Square.
The diagram above explains the difference between a monolithic architecture and a very, very simple microservices architecture. With a monolithic architecture, you have one large server responsible for handling all the requests. This is going to hit you at scale. It’s going to hit you hard. Microservices, however, can balance traffic according to your business’s needs. If you are receiving a large number of payments, you can scale up your payment service and keep the other services on a smaller amount of resources. It’s horizontal scaling at its finest.
While it is true that a microservices architecture solves many problems, it also creates new ones, such as:
The best method (at the moment) to create microservices that are secure and simple to scale is Docker.
Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package. By doing so, thanks to the container, the developer can rest assured that the application will run on any other Linux machine regardless of any customized settings that machine might have that could differ from the machine used for writing and testing the code.
Unlike a VM which provides hardware virtualization, a container provides operating-system-level virtualization by abstracting the “user space”.
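To make this concrete, here is a minimal sketch of how one of these Node.js microservices could be containerized. The base image, scripts and port are illustrative assumptions, not the actual setup of thenextporn.com:

```dockerfile
# Dockerfile for a single Node.js / Next.js microservice (illustrative)
FROM node:8-alpine

WORKDIR /usr/src/app

# Install dependencies first, so Docker can cache this layer between builds
COPY package.json yarn.lock ./
RUN yarn install

# Copy the application code and build it (assumes the usual `next build` script)
COPY . .
RUN yarn build

EXPOSE 3000
CMD ["yarn", "start"]
```

Each microservice gets its own image, built with `docker build -t my-service .` and run with `docker run -p 3000:3000 my-service`, which is exactly what makes it possible to scale them independently.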
MongoDB it’s great but if you do not have to handle it and someone does it, gratis, for you, it’s better.
Mlab it’s an online service that allows you to have some free database and scale it simply. The sandbox account allows you to create infinite database with the limit of 0.5gb of space. Space and the machine are shared so it’s perfect if you need just write and simply query but if you need to use reduce and a lot of data the sandbox it’s not the better solution. You will have access to the database without limit, so, if you want to pass on your infrastructure you can execute every type of backup and export of the data without limits.
Client-side rendering — Normally when using React, your browser will download a minimal HTML page, and the content will be filled in by JavaScript.
With Server-side rendering, the initial content is generated on the server, so your browser can download a page with HTML content already in place. Updates to the content are still handled in the browser.
_Next.js is a minimalistic framework for server-rendered React applications._
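A minimal page shows how that works in practice. This uses the `pages/` convention and the `getInitialProps` API of the Next.js versions of that era; the page itself is just an illustration:

```js
// pages/index.js — a minimal server-rendered Next.js page
import React from 'react'

export default class Home extends React.Component {
  // Runs on the server for the first request, so the HTML arrives already
  // rendered; on client-side navigation it runs in the browser instead.
  static async getInitialProps({ req }) {
    return { renderedOn: req ? 'server' : 'client' }
  }

  render() {
    return <p>This page was rendered on the {this.props.renderedOn}.</p>
  }
}
```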
5 reasons to love Next.js:
When you’re testing new technologies, you always come across problems that only show up in production. These are the issues I’ve had to deal with:
How can I keep my services under control?
Monitoring services are always useful for discovering bottlenecks or sudden crashes. Using Node.js, the best way for me was to install a free Node.js performance monitoring tool, in my case provided by New Relic.
To monitor your microservices, just add the `newrelic` package and put `require('newrelic');` as the first line of your server.js.
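A minimal sketch of how the top of server.js could look (the newrelic.js config file, with your license key and app name, is the one you copy from the package’s template):

```js
// server.js — New Relic must be required before any other module,
// so it can instrument everything that gets loaded afterwards
require('newrelic')

const express = require('express')
const next = require('next')
// ...the rest of the server setup stays the same
```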
After 5 minutes, in your dashboard you will see a situation like this:
But inside the detail of the microservice the magic happens:
You will find all the details of the slower calls, the breakdown of the response layers, the error rates and other wonderful parameters.
Caching helps us avoid re-rendering pages that have just been rendered and limits machine usage when many requests arrive in a short time. To enable caching in Next.js I have used the lru-cache package:
yarn add lru-cache
changing my server.js to this:
```js
const express = require('express')
const next = require('next')
const LRUCache = require('lru-cache')

const port = parseInt(process.env.PORT, 10) || 3000
const dev = process.env.NODE_ENV !== 'production'
const app = next({ dir: '.', dev })
const handle = app.getRequestHandler()

// This is where we cache our rendered HTML pages
const ssrCache = new LRUCache({
  max: 100,
  maxAge: 1000 * 60 * 60 // 1 hour
})

app.prepare().then(() => {
  const server = express()

  // Use the `renderAndCache` utility defined below to serve pages
  server.get('*', (req, res) => {
    renderAndCache(req, res, '/')
  })

  server.listen(port, (err) => {
    if (err) throw err
    console.log(`> Ready on http://localhost:${port}`)
  })
})

const getCacheKey = req => `${req.url}`

const renderAndCache = (req, res, pagePath, queryParams) => {
  const key = getCacheKey(req)

  // If we have a page in the cache, let's serve it
  if (ssrCache.has(key)) {
    console.log(`CACHE HIT: ${key}`)
    res.send(ssrCache.get(key))
    return
  }

  // If not, let's render the page into HTML
  app.renderToHTML(req, res, pagePath, queryParams)
    .then(html => {
      // Let's cache this page
      console.log(`CACHE MISS: ${key}`)
      ssrCache.set(key, html)
      res.send(html)
    })
    .catch(err => {
      app.renderError(err, req, res, pagePath, queryParams)
    })
}
```
So simple. Every call will be served from the cache if the same URL was requested within the previous hour; otherwise it is rendered directly and inserted into the cache.
Of course, if you have a lot of pages and little physical memory on your server, it’s better to use another microservice with Redis to cache your calls, but fortunately that’s not my case.
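For reference, a hedged sketch of what that Redis variant could look like, using the ioredis package and the same one-hour expiry; the names and environment variable are illustrative, not what runs on thenextporn.com:

```js
const Redis = require('ioredis')
const redis = new Redis(process.env.REDIS_URL)

// Same idea as renderAndCache above, but the cache lives in Redis,
// so several instances of the service can share it
const renderAndCacheWithRedis = async (req, res, pagePath, queryParams) => {
  const key = req.url
  const cached = await redis.get(key)
  if (cached) {
    res.send(cached)
    return
  }
  const html = await app.renderToHTML(req, res, pagePath, queryParams)
  // EX sets the expiry in seconds: one hour, like the LRU version
  await redis.set(key, html, 'EX', 60 * 60)
  res.send(html)
}
```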
Next.js is based on React, so to use Google Analytics we can use the react-ga package. To use it, just add it as a dependency to your project with yarn or npm:
yarn add react-ga
And use the package in your footer component:
```js
import React, { Component } from 'react'
import ReactGA from 'react-ga'

class NextFooter extends Component {
  constructor(props) {
    super(props)
    // Initialize Google Analytics with your tracking ID
    ReactGA.initialize('XX-XXXXX-XX')
  }

  componentDidMount() {
    // Record a pageview for the current URL on the first load
    ReactGA.pageview(window.location.pathname)
  }

  render() {
    // ...the footer markup goes here
    return null
  }
}

export default NextFooter
```
Yes, so simple :)
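One small note: componentDidMount fires only on the initial page load, so client-side route changes would not be tracked by the snippet above. A possible addition, using the Router hook that the Next.js versions of this era exposed:

```js
import Router from 'next/router'
import ReactGA from 'react-ga'

// Record a pageview every time Next.js completes a client-side route change
Router.onRouteChangeComplete = url => {
  ReactGA.pageview(url)
}
```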
Google optimizations
One of the key points to keep in mind when you start a new project is Google optimization. Making sure your SEO is good will help you adopt the most effective marketing solutions possible. Nobody wants to throw money away, and today, thanks to all the tools we have, it is really very simple.
1- Keep the Lighthouse Chrome tool under control.
In Chrome we have a really nice tool that helps us improve the quality of our site. Lighthouse is in the Audits tab inside the Chrome developer console, and using it is very simple: just a click. For Google optimization we just need to keep in mind three of the four Lighthouse categories: having a good score on Performance, Accessibility and Best Practices will help us achieve better optimization on Google.
Want to know how to optimize your code to get a 100% score in each Lighthouse category with NextJS? Just read this guide :)
How to get a 100% Lighthouse score with NextJS (developers.caffeina.com)
2- Create **robots.txt** and **sitemap.xml**

We have two ways to serve these files. If we just want to serve them and we do not care about the path, we can simply put them inside the `static` folder and serve them at http://oursite/static/sitemap.xml and http://oursite/static/robots.txt (it’s possible to instruct Google to check the new paths inside the Webmaster Tools). But if we want to use the /robots.txt path, we need to rewrite our server.js:
```js
if ('/robots.txt' === req.url) {
  // sendFile here is a small helper that streams the static file; with plain
  // Express you could use res.sendFile('robots.txt', { root: './static' }) instead
  return sendFile(res, './static/robots.txt')
}
```
3- Gzip your files
We have two ways to gzip our files. The first is in the nginx configuration, inside /etc/nginx/nginx.conf:
```nginx
# ...

###
# `gzip` Settings
###

gzip on;
gzip_disable "msie6";
gzip_vary on;
gzip_proxied any;
gzip_comp_level 6;
gzip_buffers 16 8k;
gzip_http_version 1.1;
gzip_min_length 256;
gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript application/vnd.ms-fontobject application/x-font-ttf font/opentype image/svg+xml image/x-icon;

# ...
```
or inside our server.js using the compression package:
```js
const compression = require('compression');
const express = require('express');

// ...

const server = express();
server.use(compression());

// ...
```
Having a certificate for https is always full of pitfalls, where do I buy my certificate? Do I need a wildcard certificate since I’m on a microservices architecture? How much does a certificate cost?
The answer to our problems is Let’s Encrypt (https://letsencrypt.org/)
_Let’s Encrypt is a free, automated, and open Certificate Authority_ that gives people the digital certificates they need in order to enable HTTPS (SSL/TLS) for websites, for free. Each certificate is valid for 90 days, but thanks to certbot the certificate renewal is transparent and automated.
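On a typical nginx setup the whole flow is a couple of commands, assuming certbot and its nginx plugin are installed (the domain is just an example):

```bash
# Obtain and install a certificate for the domain served by nginx
sudo certbot --nginx -d example.com

# Certbot sets up automatic renewal; this verifies that it works
sudo certbot renew --dry-run
```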
It’s just a test. But my philosophy is “learning by doing” and this was a great exercise.
The microservices architecture helps you divide what you have to do into smaller, isolated tasks, and using React with a “components first” approach helps us break the macro tasks into smaller pieces. This gives us the chance to keep the status of the project in hand, know exactly where the bottlenecks are, and choose precisely what to develop sooner or later.
For example, the use of the free plan of mLab, or the use of a local cache, are our bottlenecks, but we don’t care because at this stage of the project they are not our focus. It’s our decision, and it’s all designed to be solved in the most painless way possible.
NextJS is, at this time, the best solution for less complex projects that need special attention to speed (both of development and of page views) and must be SEO oriented. Do we really need to be tied to WordPress just because it has a well-made CMS, when we could use WordPress only to create content and have a well-structured, maintainable and scalable frontend using only the API?
Some great sites made with NextJS:
Thanks and stay tuned :)