One of the most exciting topics in frontend development recently is edge computing. Being able to run code at the edge is a new capability that will radically change how we write applications. This article aims to provide a brief overview of serverless edge computing, why it is beneficial, and how you can use it.
Until recently, when we talked about the edge, we usually meant content delivery networks (CDNs), which cache static assets such as HTML, CSS, images, and videos across hundreds of servers around the world. Because that content sits close to users, access is very fast. Serverless edge computing extends this idea by letting developers run code at the edge without having to manage any servers. It gives developers a third location for their code, alongside the two traditional options: the backend and the device.
The two platforms most talked about these days are Cloudflare Workers and Deno Deploy. At a high level, they are similar: both run JavaScript, TypeScript, and WebAssembly on V8-based runtimes spread across globally distributed data centers, and both expose Web-standard APIs such as fetch, Request, and Response.
On the other hand, the most apparent difference between the two platforms is that Deno Deploy doesn’t offer any storage solution, while Cloudflare Workers has Durable Objects for strongly consistent storage and Workers KV for low-latency storage with eventual consistency.
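To make the Workers KV side of this concrete, here is a minimal sketch of a module Worker that reads a per-language greeting from a KV namespace. The GREETINGS binding name is hypothetical, and the KVNamespace type comes from the @cloudflare/workers-types package.

```ts
// A sketch of reading from Workers KV in a module Worker.
// "GREETINGS" is a hypothetical KV namespace binding configured in wrangler.toml;
// the KVNamespace type is provided by @cloudflare/workers-types.
export interface Env {
  GREETINGS: KVNamespace;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Pick a language from the Accept-Language header, defaulting to English.
    const lang = request.headers.get("Accept-Language")?.slice(0, 2) ?? "en";
    // KV reads are low-latency at the edge, but only eventually consistent.
    const greeting = (await env.GREETINGS.get(lang)) ?? "Hello";
    return new Response(greeting);
  },
};
```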
Edge computing is most often used to implement middleware: simple scripts that rewrite URLs, redirect requests, or manipulate cookies and headers. For example, you can perform A/B testing by serving different versions of a page depending on a cookie’s value. You can localize content so that users in different countries see different versions of a page. You can also grant or block access to certain pages based on a user’s authentication status.
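As an illustration, here is a minimal A/B-testing sketch written in Cloudflare Workers’ module syntax (the Deno Deploy version would look very similar). The two origin URLs are hypothetical placeholders, and for simplicity only the method and headers of the incoming request are forwarded.

```ts
// A minimal A/B-testing middleware sketch (Cloudflare Workers module syntax).
// The two origin URLs below are hypothetical placeholders.
const CONTROL_ORIGIN = "https://control.example.com";
const EXPERIMENT_ORIGIN = "https://experiment.example.com";

export default {
  async fetch(request: Request): Promise<Response> {
    const cookies = request.headers.get("Cookie") ?? "";
    let variant = cookies.match(/ab-variant=(control|experiment)/)?.[1];
    const isNewVisitor = !variant;

    // First visit: assign a variant at random.
    if (!variant) {
      variant = Math.random() < 0.5 ? "control" : "experiment";
    }

    // Proxy the request to the origin that serves this variant.
    const url = new URL(request.url);
    const origin = variant === "experiment" ? EXPERIMENT_ORIGIN : CONTROL_ORIGIN;
    const upstream = await fetch(origin + url.pathname + url.search, {
      method: request.method,
      headers: request.headers,
    });

    // Remember the assignment so the user keeps seeing the same variant.
    const response = new Response(upstream.body, upstream);
    if (isNewVisitor) {
      response.headers.append("Set-Cookie", `ab-variant=${variant}; Path=/`);
    }
    return response;
  },
};
```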
While the above use cases could be handled by the backend and/or the client, the edge handles them with better performance and less complexity. If you implement A/B testing or localization on the client, you end up with a larger bundle and a flickering UI; implementing the same functionality on the backend usually requires extra server configuration and results in slower responses.
Edge computing is much more than middleware, though. In fact, you can use it to replace your backend altogether. Full-stack frameworks such as Remix and Fresh now let you write functions that access databases directly and render HTML at the edge. By doing so, you probably won’t need to develop a separate API as you traditionally would. (But should you want to serve a RESTful API or even a GraphQL API at the edge, you could do that as well.)
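As a rough illustration of the idea (not tied to any particular framework), here is a sketch of a Deno Deploy handler that fetches data and renders HTML in a single step. The data endpoint and its response shape are hypothetical placeholders; in a real application this would typically be a database or KV lookup.

```ts
// A sketch of rendering HTML directly at the edge, with no separate API layer.
Deno.serve(async (_req: Request): Promise<Response> => {
  // Hypothetical data source; a real app would query a database or KV store here.
  const res = await fetch("https://data.example.com/products");
  const products: { name: string; price: number }[] = await res.json();

  const html = `<!doctype html>
<html>
  <body>
    <ul>
      ${products.map((p) => `<li>${p.name}: $${p.price}</li>`).join("\n      ")}
    </ul>
  </body>
</html>`;

  return new Response(html, {
    headers: { "content-type": "text/html; charset=utf-8" },
  });
});
```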
But why should you consider replacing your backend with serverless edge computing? There are several reasons. First, because it’s serverless, there is no server maintenance or autoscaling to worry about. Second, it’s much faster than a traditional backend because the code runs very close to users and, compared with conventional Function-as-a-Service platforms, suffers little or no cold-start delay. Third, it’s cheaper than conventional serverless options because edge platforms use lighter-weight technology for multi-tenancy, such as V8 isolates rather than virtual machines or containers, which lets them use resources more efficiently and handle far more requests.
If you already deploy web applications with a platform such as Netlify or Vercel, opting in to edge computing is very simple. For example, you can use Netlify’s Edge Functions (which are built on Deno Deploy) or Vercel’s Edge Middleware (which probably runs atop Cloudflare Workers). When you deploy a Next.js application (version 12.2 or later) to either Vercel or Netlify, you can configure some or all pages to use the edge runtime for server rendering, a capability known as Edge SSR.
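For instance, with the Next.js pages router you can opt a single page into the edge runtime roughly like this. This is only a sketch: the data URL is a hypothetical placeholder, and the runtime flag was called "experimental-edge" in Next.js 12.2 before being renamed to "edge" in later releases.

```tsx
// pages/products.tsx — a sketch of opting one page into Edge SSR.
// The runtime flag was "experimental-edge" in Next.js 12.2; later releases renamed it to "edge".
export const config = {
  runtime: "experimental-edge",
};

// Runs at the edge on every request; the data URL is a hypothetical placeholder.
export async function getServerSideProps() {
  const res = await fetch("https://data.example.com/products");
  const products: { name: string }[] = await res.json();
  return { props: { products } };
}

export default function ProductsPage({ products }: { products: { name: string }[] }) {
  return (
    <ul>
      {products.map((p) => (
        <li key={p.name}>{p.name}</li>
      ))}
    </ul>
  );
}
```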
If you don’t deploy to Netlify, Vercel, or a similar platform, consider using an edge-native full-stack framework and deploying your application directly to the edge computing service of your choice. For example, you can build your app with the Fresh framework and publish it to Deno Deploy. Another example is Remix, which provides adapters for both Cloudflare Workers and Deno Deploy (along with many other deployment targets).
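To give a feel for the edge-native approach, here is a minimal Fresh route sketch, assuming Fresh 1.x conventions (the $fresh import alias comes from the project’s import map, and the rendered data is a hard-coded placeholder).

```tsx
// routes/index.tsx — a minimal Fresh route sketch (Fresh 1.x conventions).
import { Handlers, PageProps } from "$fresh/server.ts";

interface Data {
  message: string;
}

export const handler: Handlers<Data> = {
  GET(_req, ctx) {
    // The data could come from a database call here; this is a hard-coded placeholder.
    return ctx.render({ message: "Hello from the edge" });
  },
};

export default function Home({ data }: PageProps<Data>) {
  return <p>{data.message}</p>;
}
```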
If you’d rather not use a full-stack framework, you can develop your frontend and backend separately. With Cloudflare, you can serve the frontend and static assets with Pages or Workers Sites. With Deno Deploy, you can serve static assets from the filesystem. For the backend, you will probably want an HTTP framework such as Sunder (if you target Cloudflare Workers) or oak, Router, or Sift (if you target Deno Deploy).
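For example, a minimal oak application deployed to Deno Deploy could look like the following sketch; the route and payload are hypothetical placeholders.

```ts
// A minimal oak sketch for a JSON API on Deno Deploy.
import { Application, Router } from "https://deno.land/x/oak/mod.ts";

const router = new Router();
router.get("/api/hello", (ctx) => {
  // The route and payload are hypothetical placeholders.
  ctx.response.body = { message: "Hello from the edge" };
});

const app = new Application();
app.use(router.routes());
app.use(router.allowedMethods());

await app.listen({ port: 8000 });
```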
While edge computing is perfect for middleware scripts, if you want to build your whole backend on it, you should be aware of the target platform’s limits on CPU time, memory, bundle size, and so on. You can find detailed information on these limits in the Cloudflare Workers and Deno Deploy documentation.
In summary, with edge computing platforms such as Cloudflare Workers and Deno Deploy, developers can write middleware and backends that are both fast and cheap. Edge computing is a game changer which has enabled a new generation of full-stack frameworks such as Remix and Fresh. You can use these frameworks to develop and deploy applications directly to the edge. Additionally, edge computing can be used indirectly through platforms like Netlify and Vercel.
It's just the beginning. In the near future, we can expect a lot more frameworks, libraries, and tools to be developed for the edge. Hopefully, this article has piqued your interest and given you some high-level information on edge computing so you can get started exploring its potential.