Building User Service With GRPC, Node.JS, and MongoDB: The Complete Microservice Tutorial — [Part 2]

by MD Ahad Hasan, December 13th, 2020

Too Long; Didn't Read

This is Part 2 of the Complete Microservice Tutorial. We build the user service of the application: a GRPC server in NodeJS that handles registration and authentication and stores its data in MongoDB. You will need NodeJS and NPM installed, along with Protobuf and the GRPC tools for the transport layer.


Microservices are about decoupling your system. In a monolith, you build all the components of the software in one large codebase and deploy it at once. In the microservice world, by contrast, each component of a large system is built and deployed independently of the others.

In Part 1, we explored the project’s architecture; now we are going to build it, starting with the authentication and user service module of the application. We will use NodeJS for the logic and MongoDB for the data layer.

Prerequisites

Since we are going to build this service in NodeJS, you will need NodeJS and NPM installed. We will also need GRPC and Protobuf for the transport layer, and Docker to run everything.

  1. NodeJS with NPM
  2. ProtoBuf
  3. GRPC
  4. Docker

Essentially, we will build a GRPC server in NodeJS that accepts incoming RPC requests. We need the GRPC tools installed globally via NPM, so let’s do that first.

npm install -g grpc-tools

Proto

Let’s make a new directory Microservice-Demo and cd into it. This is the directory structure we will follow:

Microservice-Demo
├── userService
│   ├── proto
│   │   ├── **/*.js
│   ├── node_modules
│   ├── api.js
│   ├── auth.js
│   ├── .env
│   ├── Dockerfile
│   ├── index.js
│   ├── package.json
│   └── testClient.js
├── protos
│   ├── user
│   │   ├── user.proto
└── docker-compose.yml
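
If you like to scaffold the folders up front, two commands from the parent directory will do it; the individual files are created as we go through the article:

mkdir -p Microservice-Demo/userService/proto Microservice-Demo/protos/user
cd Microservice-Demo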

We are going to keep all our proto files outside of our NodeJS application so that it’s easier to reuse them in other services. If you are wondering what a proto file is, it is a format introduced by Google for serializing structured data for API use; it has to be compiled with the protoc compiler. The compiler generates source files in the desired language, and GRPC uses them to communicate between services. So let’s look at the user.proto file.

// protos/user/user.proto

syntax = "proto3";

package demo_user;

option go_package = "github.com/Joker666/microservice-demo/protos/user";

service UserSvc {
    rpc register (RegisterRequest) returns (UserResponse);
    rpc login (LoginRequest) returns (UserResponse);
    rpc verify (VerifyRequest) returns (VerifyResponse);
    rpc getUser (GetUserRequest) returns (VerifyResponse);
}

message VerifyRequest {
    string token = 1;
}

message GetUserRequest {
    string user_id = 1;
}

message LoginRequest {
    string email = 1;
    string password = 2;
}

message RegisterRequest {
    string name = 1;
    string email = 2;
    string password = 3;
}

message UserResponse {
    string id = 1;
    string name = 2;
    string email = 3;
    string token = 4;
}

message VerifyResponse {
    string id = 1;
    string name = 2;
    string email = 3;
}

The proto file uses the proto3 syntax. There are a couple of messages in this file representing request and response data, and a service UserSvc with four methods that use these messages. Essentially, these are the four APIs we will build today. There is a way to load the proto file’s definition at runtime without compiling the file, but we are going to compile it here because that will make our life much easier when we build other services. Let’s compile this proto file and store the results in the userService/proto directory. Run the next command from the root Microservice-Demo directory.

grpc_tools_node_protoc \
    --js_out=import_style=commonjs,binary:userService/proto/ \
    --grpc_out=grpc_js:userService/proto \
    --proto_path=./protos/user ./protos/user/*.proto

Running this command outputs two files in the userService/proto directory: user_pb.js and user_grpc_pb.js. We will require them in our code next to build the APIs.
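
As an aside, the runtime-loading approach mentioned above works too, in case you ever want to skip code generation. A minimal sketch with the @grpc/proto-loader package (not used in this project, shown only for comparison) would look something like this:

// Load user.proto at runtime instead of requiring pre-generated files
const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');

const packageDefinition = protoLoader.loadSync('./protos/user/user.proto', {
    keepCase: true,
    longs: String,
    enums: String,
    defaults: true,
    oneofs: true,
});

// "demo_user" is the package name declared in user.proto
const userPackage = grpc.loadPackageDefinition(packageDefinition).demo_user;
// userPackage.UserSvc.service could then be passed to server.addService()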

Building the service

So, we have some APIs to build; let’s start with user registration. We will install bcrypt for password hashing and jsonwebtoken for generating a JWT token for authentication.
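
Along with those two, the service needs the GRPC runtime, the protobuf runtime for the generated files, the MongoDB driver, and dotenv. From inside the userService directory, installing the dependencies looks roughly like this (the exact versions pinned in the repository’s package.json may differ):

npm install @grpc/grpc-js google-protobuf mongodb dotenv bcrypt jsonwebtoken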

// userService/index.js

require('dotenv').config();
const grpc = require('@grpc/grpc-js');
const { MongoClient } = require("mongodb");
const services = require('./proto/user_grpc_pb');
const API = require("./api");

// Mongo Connection
const dbClient = new MongoClient(process.env.DB_URI, { useUnifiedTopology: true });
let api = null;

async function connectDB() {
    try {
        await dbClient.connect();
        let db = await dbClient.db(process.env.DB_NAME);
        await db.command({ ping: 1 });
        console.log("Connected successfully to mongo server");
        // Create index
        await db.collection("users").createIndex({ email: 1 });

        // Init api
        api = new API(db, grpc);
    } catch (e) {
        console.error(e);
    }
}

async function main() {
    await connectDB().catch(console.dir);
    let server = new grpc.Server();
    server.addService(services.UserSvcService, {
        register: api.register,
        login: api.login,
        verify: api.verify,
        getUser: api.getUser,
    });
    let address = process.env.HOST + ":" + process.env.PORT;
    server.bindAsync(address, grpc.ServerCredentials.createInsecure(), () => {
        server.start();
        console.log("Server running at " + address);
    });
}

main();

This is a very basic NodeJS setup. Here we are importing the generated user_grpc_pb.js file, which gives us access to the UserSvcService definition we declared earlier in the proto file. We initialize a new GRPC server and register our API methods as the handlers for that service. Next, we bind the address that we get from .env and start the server. There is also some boilerplate code to connect to MongoDB and pass the db and grpc instances to the API class. Let’s code out the API class.
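
Before we do, a quick note on configuration: the .env file could contain something like the following for local development. The values below are illustrative assumptions rather than the repository’s exact contents; the PORT matches the port the Dockerfile exposes later.

# userService/.env (illustrative values)
HOST=0.0.0.0
PORT=50051
DB_URI=mongodb://localhost:27017/
DB_NAME=Microservice-demo-user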

// userService/api.js

const bcrypt = require('bcrypt');
const auth = require("./auth");
const messages = require('./proto/user_pb');
const ObjectId = require('mongodb').ObjectID;

module.exports = class API {
    constructor(db, grpc) {
        this.db = db;
        this.grpc = grpc;
    }

    register = (call, callback) => {
        const users = this.db.collection("users");

        bcrypt.hash(call.request.getPassword(), 10, (err, hash) => {
            let user = { name: call.request.getName(), email: call.request.getEmail(), password: hash }
            users.insertOne(user).then(r => {
                let resp = new messages.UserResponse();
                resp.setId(user._id.toString());
                resp.setName(user.name);
                resp.setEmail(user.email);
                resp.setToken(auth.generateToken(user));
                callback(null, resp);
            });
        });
    }

    // See the rest of the methods in
    // https://github.com/Joker666/microservice-demo/blob/main/userService/api.js
};

In the API class, we implement the register method. Two parameters are passed to us by the GRPC service definition: call and callback. The call parameter contains the request information, which we can access with call.request.get{ParamName}(), and callback is how the method returns its result. It takes two parameters: the first is an error object (null on success; passing a non-null error here is how failures are reported to the client), and the second is the response message.

We hash the password the user has provided and then save the user to MongoDB. We then create the UserResponse message we defined in the proto file earlier, set the necessary fields, and return it through the callback. You can explore the token generation code (auth.js) and the rest of the APIs of this service in the repository; the full code is linked at the end of this article.
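
The auth module itself is small. A minimal sketch of what userService/auth.js could look like with jsonwebtoken is shown below; the verifyToken function and the JWT_SECRET variable name are assumptions for illustration, so check the repository for the actual implementation.

// userService/auth.js (illustrative sketch; see the repository for the real code)

const jwt = require('jsonwebtoken');

// Sign a JWT carrying the user's basic identity.
// JWT_SECRET is an assumed environment variable name.
function generateToken(user) {
    return jwt.sign(
        { id: user._id.toString(), name: user.name, email: user.email },
        process.env.JWT_SECRET,
        { expiresIn: '1d' }
    );
}

// Verify a token and return the decoded payload, or null if it is invalid or expired.
function verifyToken(token) {
    try {
        return jwt.verify(token, process.env.JWT_SECRET);
    } catch (e) {
        return null;
    }
}

module.exports = { generateToken, verifyToken };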

So we have coded our first API; before testing it, let’s get it running in Docker.

Docker Deploy

We have coded the application; now let’s write the Dockerfile to deploy it.

# userService/Dockerfile

FROM node:15

WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .

EXPOSE 50051
CMD [ "node", "index.js" ]

We are copying everything from the service directory into the image and installing the packages there. Since we also need MongoDB, running only this container in Docker would not be enough, so let’s write the docker-compose.yml file.

# docker-compose.yml

version: '3.8'

services:
    user:
        build:
            context: ./userService
        image: microservice/demo/user
        restart: "no"
        environment:
            - DB_URI=mongodb://mongo:27017/
            - DB_NAME=Microservice-demo-user
        ports:
            - 8080:50051
        depends_on:
            - mongo

    mongo:
        image: mongo
        restart: always
        environment:
            MONGO_INITDB_DATABASE: Microservice-demo-user
        ports:
            - 27017:27017
        volumes:
            - mongodb:/data/db
            - mongodb_config:/data/configdb

volumes:
    postgresdb:
    mysqldb:
    mongodb:
    mongodb_config:

Let’s run this with docker-compose up --build. We should see both MongoDB and our service running successfully.
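
If you want to confirm that both containers are up before moving on, the standard docker-compose commands help:

docker-compose ps            # both "user" and "mongo" should be listed as Up
docker-compose logs -f user  # should show the "Server running at ..." line from index.js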

Testing

Since we have written a GRPC service, we cannot test it directly with a tool like Postman, at least not yet. There are some tools out there that ease the process, like BloomRPC, but I like to test the service with real code.

So, we have a server and now we have to write a client to test it.

// userService/testClient.js

const messages = require('./proto/user_pb');
const services = require('./proto/user_grpc_pb');
const grpc = require('@grpc/grpc-js');

function main() {
    const client = new services.UserSvcClient('localhost:8080', grpc.credentials.createInsecure());

    let registerReq = new messages.RegisterRequest();
    registerReq.setName("Hello");
    registerReq.setEmail("hello@example.com");
    registerReq.setPassword("Password");
    client.register(registerReq, function(err, response) {
        console.log(response);
    });
}

main();

Here, we are importing the message and service files and creating a client that connects to port 8080, since that is the port we mapped in the docker-compose file. When we run this client with node testClient.js, we should see the user being registered and a new entry created in MongoDB, and the console should print the response containing the created user’s information.

Whoa! That was a lot. But now we have a fully functioning microservice written in NodeJS, running a GRPC server that can accept incoming RPC requests and interact with the database.

Conclusion

Here we have explored user registration and authentication. In the next article, we will build the project service with Python and MySQL. Till then, stay tuned.

Project Link: https://github.com/Joker666/microservice-demo/

Also published at https://medium.com/swlh/the-complete-microservice-tutorial-part-1-building-user-service-with-grpc-node-js-and-mongodb-73e70ed80148