
Empowering Your API Performance with Proxy

by ST Dev June 29th, 2023

Too Long; Didn't Read

In summary, improving API performance can lead to enhanced user experience, increased productivity, improved scalability, cost savings, and a competitive advantage, making it a desirable goal for businesses and developers alike. Therefore, we will explain the basic architecture of an API gateway, show how to set up a proxy to route RPC services with Nginx, and discuss the advantages and disadvantages of doing so.

There are several compelling reasons why people might want to improve API performance:

  1. Enhanced User Experience: Faster and more responsive APIs can provide a smoother user experience for clients or end-users. Improved performance can result in reduced latency, faster response times, and quicker data retrieval, leading to a more efficient and satisfying user experience.
  2. Improved Scalability: High-performance servers can handle a larger number of requests and concurrent users, making them more scalable. Scalability is crucial for growing businesses that need to handle increasing amounts of traffic and data as their user base expands. Improved API performance can help businesses meet growing demands without compromising on response times or service quality.
  3. Competitive Advantage: High-performance APIs can give businesses a competitive edge. Users or clients tend to prefer services that offer faster response times, better reliability, and superior overall performance. By providing a superior API experience, businesses can differentiate themselves from competitors and attract more users or customers.

Tutorial Learning Objective

We will be setting up a Proxy as a central management system to achieve improved API performance. A proxy acts as an intermediary between clients and servers. It sits between the client making API requests and the server that hosts the APIs. When a client makes an API request, it goes through the proxy first, which then forwards the request to the server. The server processes the request and sends the response back to the proxy, which then forwards it to the client. This allows the proxy to intercept, modify, or cache the request or response as needed, providing opportunities to optimize API performance.

Prerequisite

For macOS (or Linux), install Homebrew on your system.

Installing Nginx on Mac

Follow these steps to install Nginx on macOS:


1️⃣Download Homebrew

To install Nginx on macOS, Homebrew must be installed on the system. Homebrew is a package manager for macOS that makes it easy to install various Unix applications. If you don't have Homebrew, use the following link to install it: https://brew.sh/

Or simply type the following command on the terminal:

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"


2️⃣Install Nginx

The Homebrew package manager will install the Nginx web server on macOS. To install Nginx, use the following command:

brew install nginx


3️⃣Edit configuration file

By default, Homebrew's Nginx configuration file is located at:

/opt/homebrew/etc/nginx/nginx.conf

To edit the Nginx configuration file, you can use any text editor of your choice. For example, you can use nano, vim, or emacs. Here's an example command to edit the Nginx configuration file using nano:

nano /opt/homebrew/etc/nginx/nginx.conf

We will be editing the server block that listens on port 80.

Search for the server block.
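For orientation, the relevant part of nginx.conf looks roughly like the sketch below. Note that Homebrew's default configuration often listens on port 8080 so Nginx can run without root; this tutorial assumes the port has been changed to 80 to match the curl commands used later:

```nginx
http {
    server {
        listen       80;          # change from Homebrew's default 8080 if needed
        server_name  localhost;

        # the RPC location blocks from the next step go inside this server block
    }
}
```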


Next, add the three Ethereum mainnet RPC endpoints below inside the server block:

location /nodereal { 
	proxy_pass https://eth-mainnet.nodereal.io/v1/<API KEY>;
	proxy_set_header Content-Type "application/json";
}

location /RPCProviderA {
	proxy_pass <https URI endpoint>;
	proxy_set_header Content-Type "application/json";
}

location /RPCProviderB {
	proxy_pass <https URI endpoint>;
	proxy_set_header Content-Type "application/json";
}

You can include as many RPC endpoints as needed, then save the configuration file.

To make sure there are no syntax errors, test the Nginx configuration file:

nginx -t

🎊 If there are no errors, you will see the following output:


nginx: the configuration file /opt/homebrew/etc/nginx/nginx.conf syntax is ok

nginx: configuration file /opt/homebrew/etc/nginx/nginx.conf test is successful

To restart the Nginx server, run the following command:

brew services restart nginx


4️⃣Sending API method via Nginx Proxy

To test the Nginx proxy, we will check the gas price on Ethereum via the eth_gasPrice method. The curl command below sends an HTTP POST request with a JSON payload to the "/nodereal" location of the server running on the local machine (at "http://localhost"):

curl -X POST --data '{"jsonrpc":"2.0","method":"eth_gasPrice","params":[],"id":1}' -H "Content-Type: application/json" http://localhost/nodereal


✅ {"jsonrpc":"2.0","id":1,"result":"0xdec36a8d1"}

The response you received after running the curl command is a JSON-RPC response from an Ethereum node. Here's a brief explanation of the response:

  • "jsonrpc":"2.0": This field indicates the version of the JSON-RPC protocol used in the response.
  • "id":1: This field is the identifier of the request to which this response corresponds. In this case, the request had an "id" of 1.
  • "result":"0xdec36a8d1": This field is the result of the JSON-RPC request. In this case, the requested result was the current gas price on the Ethereum network, returned as a hexadecimal string.
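If you want to script against these responses, the "result" field can be extracted from the shell. A minimal sketch using python3's standard json module (any JSON tool such as jq works equally well):

```shell
# Extract the "result" field from a JSON-RPC response
RESP='{"jsonrpc":"2.0","id":1,"result":"0xdec36a8d1"}'
echo "$RESP" | python3 -c 'import json, sys; print(json.load(sys.stdin)["result"])'
```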

Interpreting the decimal value of "0xdec36a8d1"

The decimal value of "0xdec36a8d1" is 59797579985. Therefore, the gas price on the Ethereum network at the time the request was made was 59797579985 wei (the smallest denomination of Ether), or roughly 59.8 Gwei.
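You can do this conversion directly in the shell using arithmetic expansion (1 Gwei = 10^9 wei; the division below truncates to an integer):

```shell
# Convert the hex gas price to decimal wei
echo $((0xdec36a8d1))

# And to Gwei, truncated to an integer
echo $(( 0xdec36a8d1 / 1000000000 ))
```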

Test it out with the remaining locations, /RPCProviderA and /RPCProviderB, by running the same request against the local machine (at "http://localhost"):

curl -X POST --data '{"jsonrpc":"2.0","method":"eth_gasPrice","params":[],"id":1}' -H "Content-Type: application/json" http://localhost/RPCProviderA  


🥳 You can now start testing your own proxy.


Advantages and Disadvantages of Using Nginx as a Proxy for API Gateway


Advantages of using Nginx as a proxy for an API gateway:

  1. Load Balancing: Nginx can distribute incoming API requests across multiple backend servers, ensuring efficient load balancing and improved performance.
  2. Caching: Nginx can cache API responses, reducing the load on backend servers and improving API response times.
  3. Scalability: Nginx can easily scale horizontally to accommodate increased API traffic and handle large numbers of concurrent connections.
  4. Security: Nginx provides various security features, such as SSL termination, DDoS protection, and request filtering, which help protect the API from security threats.
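As a sketch of the load-balancing point, Nginx can pool several RPC providers behind a single location with an upstream block. The provider hostnames below are hypothetical placeholders, not endpoints from this tutorial:

```nginx
# Sketch: distribute RPC traffic across two hypothetical backends
upstream rpc_pool {
    server providerA.example.com:443;
    server providerB.example.com:443 backup;  # used only if the primary fails
}

server {
    listen 80;

    location /rpc {
        proxy_pass https://rpc_pool;
        proxy_ssl_server_name on;   # send SNI so HTTPS backends accept the connection
        proxy_set_header Content-Type "application/json";
    }
}
```

By default, Nginx distributes requests round-robin across the servers in an upstream block; marking one server as backup keeps it idle until the others are unavailable.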


Disadvantages of using Nginx as a proxy for an API gateway:

  1. Limited API Management Features: Nginx primarily acts as a proxy and lacks some advanced API management features, such as API documentation, developer portal, and API versioning, which may be required in complex API ecosystems.
  2. Configuration Complexity: Configuring Nginx as an API gateway requires a good understanding of Nginx configuration, which may be complex for users who are not familiar with Nginx.
  3. Lack of Advanced Authentication and Authorization: Nginx provides basic authentication and authorization features, but may not have advanced capabilities, such as OAuth, JWT validation, and fine-grained access control, which may be required in some API scenarios.

Conclusion

In conclusion, improving API performance is crucial for businesses and developers. Using Nginx as a proxy for an API gateway offers advantages like load balancing, caching, scalability, and security. However, there are limitations such as limited API management features, configuration complexity, and a lack of advanced authentication and authorization capabilities. Careful consideration of these pros and cons is essential. Overall, leveraging Nginx as a proxy can be a powerful tool for improving API performance. Stay tuned for the next tutorial in the series, where we will cover common issues and how to debug them.

