Microservices are about decoupling your system. In a monolith, you build all the components of the software in one large codebase and deploy it all at once. In the microservice world, we build each component of a large system decoupled from the others. In the previous article, we explored the project's architecture; now we are going to build it. We will build the authentication and user service module of the application, using NodeJS for the logic and MongoDB for the data layer.

## Prerequisites

Since we are going to build this service in NodeJS, you will need NodeJS and NPM installed. We will also need GRPC installed along with Protobuf for the transport layer.

- NodeJS with NPM
- ProtoBuf
- GRPC
- Docker

Essentially, we will build a GRPC server in NodeJS that accepts incoming RPC requests. We need the GRPC tools installed globally via NPM, so let's do that first.

```
npm install -g grpc-tools
```

## Proto

Let's make a new directory named `Microservice-Demo` and `cd` into it. This is the directory structure we will follow:

```
Microservice-Demo
├── userService
│   ├── proto
│   │   ├── **/*.js
│   ├── node_modules
│   ├── api.js
│   ├── auth.js
│   ├── .env
│   ├── Dockerfile
│   ├── index.js
│   ├── package.json
│   └── testClient.js
├── protos
│   ├── user
│   │   └── user.proto
└── docker-compose.yml
```

We are going to keep all our proto files outside of our NodeJS application so that it's easier to reuse them in other services. If you are wondering what a proto file is, it is a format introduced by Google to serialize data for API usage, and it needs to be compiled with the `protoc` compiler. The compiler outputs generated files in the desired language, and GRPC uses them to communicate between services. So let's look at the `user.proto` file.

```protobuf
// protos/user/user.proto
syntax = "proto3";

package demo_user;
option go_package = "github.com/Joker666/microservice-demo/protos/user";

service UserSvc {
    rpc register (RegisterRequest) returns (UserResponse);
    rpc login (LoginRequest) returns (UserResponse);
    rpc verify (VerifyRequest) returns (VerifyResponse);
    rpc getUser (GetUserRequest) returns (VerifyResponse);
}

message VerifyRequest {
    string token = 1;
}

message GetUserRequest {
    string user_id = 1;
}

message LoginRequest {
    string email = 1;
    string password = 2;
}

message RegisterRequest {
    string name = 1;
    string email = 2;
    string password = 3;
}

message UserResponse {
    string id = 1;
    string name = 2;
    string email = 3;
    string token = 4;
}

message VerifyResponse {
    string id = 1;
    string name = 2;
    string email = 3;
}
```

The proto file uses the `proto3` syntax. There are a couple of messages in this file representing request and response data. Then there is a `UserSvc` service defined with four methods that leverage these messages. Essentially, these are the four APIs we will be building today. There is a way to load the proto file's definition at runtime without compiling the file (a sketch of that approach is shown after the compile step below), but we are going to compile the file here because that will make our life much easier when we build other services. Let's compile this proto file and store the results in the `userService/proto` directory. Run the next command from the `Microservice-Demo` root directory.

```
grpc_tools_node_protoc \
    --js_out=import_style=commonjs,binary:userService/proto/ \
    --grpc_out=grpc_js:userService/proto \
    --proto_path=./protos/user ./protos/user/*.proto
```

Running this command will output two files in the `userService/proto` directory: `user_pb.js` and `user_grpc_pb.js`. We will need to require them in our code next to build our APIs.
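For reference, here is a minimal sketch of the runtime-loading alternative mentioned above, using the `@grpc/proto-loader` package instead of pre-compiled stubs. The option values shown are common defaults and are assumptions for illustration; this tutorial does not use this approach.

```javascript
// Alternative (not used in this tutorial): load the proto definition at runtime.
const protoLoader = require('@grpc/proto-loader');
const grpc = require('@grpc/grpc-js');

// Parse the proto file directly instead of requiring generated code.
const packageDefinition = protoLoader.loadSync('./protos/user/user.proto', {
    keepCase: true,   // keep field names exactly as written in the .proto file
    longs: String,
    enums: String,
    defaults: true,
    oneofs: true,
});

// The package name comes from `package demo_user;` in user.proto.
const userPackage = grpc.loadPackageDefinition(packageDefinition).demo_user;

// userPackage.UserSvc.service could then be passed to server.addService(...),
// with plain objects (not generated message classes) as requests and responses.
```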
## Building the Service

So, we have some APIs to build; let's start with user registration. We will install `bcrypt` for password hashing and `jsonwebtoken` for generating a JWT token for authentication.

```javascript
// userService/index.js
require('dotenv').config();
const grpc = require('@grpc/grpc-js');
const { MongoClient } = require("mongodb");
const services = require('./proto/user_grpc_pb');
const API = require("./api");

// Mongo Connection
const dbClient = new MongoClient(process.env.DB_URI, { useUnifiedTopology: true });
let api = null;

async function connectDB() {
    try {
        await dbClient.connect();
        let db = dbClient.db(process.env.DB_NAME);
        await db.command({ ping: 1 });
        console.log("Connected successfully to mongo server");

        // Create index
        await db.collection("users").createIndex({ email: 1 });

        // Init api
        api = new API(db, grpc);
    } catch (e) {
        console.error(e);
    }
}

async function main() {
    await connectDB().catch(console.dir);

    let server = new grpc.Server();
    server.addService(services.UserSvcService, {
        register: api.register,
        login: api.login,
        verify: api.verify,
        getUser: api.getUser,
    });

    let address = process.env.HOST + ":" + process.env.PORT;
    server.bindAsync(address, grpc.ServerCredentials.createInsecure(), () => {
        server.start();
        console.log("Server running at " + address);
    });
}

main();
```

This is a very basic NodeJS setup. Here we are importing the generated `user_grpc_pb.js` file, which gives us access to the `UserSvcService` we defined earlier in the proto file. We initialize a new GRPC server and add our API methods to it as services. Next, we bind the address that we get from `.env` and start the server. There is some boilerplate code to connect to MongoDB and pass the `db` and `grpc` instances to the `API` class. Let's code out the `API` class.

```javascript
// userService/api.js
const bcrypt = require('bcrypt');
const auth = require("./auth");
const messages = require('./proto/user_pb');
const ObjectId = require('mongodb').ObjectID;

module.exports = class API {
    constructor(db, grpc) {
        this.db = db;
        this.grpc = grpc;
    }

    register = (call, callback) => {
        const users = this.db.collection("users");

        bcrypt.hash(call.request.getPassword(), 10, (err, hash) => {
            let user = {
                name: call.request.getName(),
                email: call.request.getEmail(),
                password: hash
            }
            users.insertOne(user).then(r => {
                let resp = new messages.UserResponse();
                resp.setId(user._id.toString());
                resp.setName(user.name);
                resp.setEmail(user.email);
                resp.setToken(auth.generateToken(user));
                callback(null, resp);
            });
        });
    }

    // See the rest of the methods in
    // https://github.com/Joker666/microservice-demo/blob/main/userService/api.js
};
```

In the `API` class, we implement the `register` method. The GRPC service definition passes two parameters to it: `call` and `callback`. The `call` parameter contains the request information, which we can access with `call.request.get{ParamName}()`, and `callback` is what gets returned from the method. It takes two arguments: the first is an error object and the second a response object.

We hash the password the user has provided and then save the user to MongoDB. We then create the `UserResponse` message we defined in the proto file earlier, set the necessary fields, and return it through the `callback`. You can explore the token generation code and the rest of the APIs of this service in the repository; the full code is available at https://github.com/Joker666/microservice-demo/.

So we have coded our first API; next, let's deploy and test it.
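Before moving on, here is a rough idea of the two pieces the code above references but that we have not shown: the `.env` file and the `generateToken` helper in `auth.js`. Both are illustrative sketches rather than copies from the repository, so treat the variable names, the `JWT_SECRET` value, the token payload, and the expiry as assumptions.

```
# userService/.env (sample values only; assumed, not taken from the repo)
HOST=0.0.0.0
PORT=50051
DB_URI=mongodb://localhost:27017/
DB_NAME=Microservice-demo-user
JWT_SECRET=some-long-random-secret
```

```javascript
// userService/auth.js: an illustrative sketch, not the repository's exact code.
const jwt = require('jsonwebtoken');

// The JWT_SECRET variable, payload shape, and expiry below are assumptions.
module.exports.generateToken = function (user) {
    return jwt.sign(
        { id: user._id.toString(), email: user.email },
        process.env.JWT_SECRET,
        { expiresIn: '1d' }
    );
};
```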
## Docker Deploy

We have coded the application; now let's write the `Dockerfile` to deploy it.

```dockerfile
# userService/Dockerfile
FROM node:15

WORKDIR /app

COPY package*.json ./
RUN npm install

COPY . .

EXPOSE 50051
CMD [ "node", "index.js" ]
```

We copy everything from the service directory and install the packages. Since we also need MongoDB, running only this container would not be enough, so let's write the `docker-compose.yml` file.

```yaml
# docker-compose.yml
version: '3.8'
services:
  user:
    build:
      context: ./userService
    image: microservice/demo/user
    restart: "no"
    environment:
      - DB_URI=mongodb://mongo:27017/
      - DB_NAME=Microservice-demo-user
    ports:
      - 8080:50051
    depends_on:
      - mongo
  mongo:
    image: mongo
    restart: always
    environment:
      MONGO_INITDB_DATABASE: Microservice-demo-user
    ports:
      - 27017:27017
    volumes:
      - mongodb:/data/db
      - mongodb_config:/data/configdb

volumes:
  postgresdb:
  mysqldb:
  mongodb:
  mongodb_config:
```

Let's run this with `docker-compose up --build`. We should see both MongoDB and our service running successfully.

## Testing

Since we have written a GRPC service, we cannot test it directly with a tool like Postman, at least not yet. There are tools that somewhat ease the process, such as BloomRPC, but I like to test the service with real code. So we have a server, and now we have to write a client to test it.

```javascript
// userService/testClient.js
const messages = require('./proto/user_pb');
const services = require('./proto/user_grpc_pb');
const grpc = require('@grpc/grpc-js');

function main() {
    const client = new services.UserSvcClient('localhost:8080', grpc.credentials.createInsecure());

    let registerReq = new messages.RegisterRequest();
    registerReq.setName("Hello");
    registerReq.setEmail("hello@world.com");
    registerReq.setPassword("Password");

    client.register(registerReq, function (err, response) {
        console.log(response);
    });
}

main();
```

Here, we import the message and service files and create a client that connects to port 8080, since we port-forwarded it in the docker-compose file. When we run this client with `node testClient.js`, we should see the user being registered and a new user entry created in MongoDB. The console should print the response containing the created user's information.

Whoa! That was a lot. But now we have a fully functioning microservice written in NodeJS that runs a GRPC server, accepts incoming RPC requests, and interacts with the database.

## Conclusion

Here we have explored user registration/authentication. In the next article, we will build the project service with Python and MySQL. Till then, stay tuned.

Project Link: https://github.com/Joker666/microservice-demo/

Also published at https://medium.com/swlh/the-complete-microservice-tutorial-part-1-building-user-service-with-grpc-node-js-and-mongodb-73e70ed80148