
How Does Vite Achieve Constant Time Builds?

by Jerry, November 14th, 2022

Too Long; Didn't Read

Three decisions that make Vite faster than webpack: the use of native ESM, rethinking recompilation, and pre-bundling.

Content

  • Introduction

  • 3 Key Design Decisions

    1. Use of Native ESM
    2. Rethinking Recompilation
    3. Pre-bundling
  • Conclusion

Introduction

If you've used webpack or rollup on large projects, then you probably know the struggle of long build times between each change.


This was one of the pain points that Evan You thought about addressing when he created Vite.


He took a look at the existing solutions and thought of ways to iterate on the design.


These new ideas were made possible because of:


  • Better tooling (esbuild, swc)
  • Native ESM support (in browsers)


With these new capabilities, he had a vision for a new tool (Vite) that achieves:


  • Faster build times - Keeping build times near-constant as the codebase size grows.


  • Better Developer Experience (DX) - Building a tool from the ground up to improve DX: a faster feedback cycle, a smoother overall experience, and extensibility.


  • A more integrated server-side rendering (SSR) experience - Providing first-class tooling for SSR and solving pain points like juggling multiple configuration files



Build time vs Codebase size



From a technical standpoint, there were 3 key design decisions that made it possible for Vite to achieve these optimization goals.


Let’s go through them one by one.

3 Key Design Decisions

When webpack was introduced, it was the best tool at that time.


Then rollup came on the scene: a new bundler with a focus on simplicity.


Both of these tools took a very similar approach to bundling: when any file changed, the whole bundle was rebuilt.


This means that as your codebase grows, build times grow roughly linearly with it.

This leads to an obvious question: why can't we rebuild only the files that have changed, rather than the whole bundle?


And that’s the exact approach that Evan took when designing Vite.


All of these decisions were made possible because modern browsers now support native ESM.


Let’s go through these 3 key design decisions.

1. Use of Native ESM

The first decision is the one that makes all the others possible: the decision to use native ESM.


In doing so, Vite delegates the module system to the browser and only handles the processing (transpilation, transformation, etc.) of each requested module.




Example using native ESM syntax
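To make this concrete, here is a minimal, hypothetical illustration of native ESM syntax (the file names and the add helper are made up for this example). The browser itself resolves and fetches each imported module, so no bundling step is needed before the code can run:

```ts
// main.ts - loaded in index.html via <script type="module" src="/main.ts">.
// The import below is native ESM: during development, the browser issues a
// separate request for "./math", and Vite transpiles just that file before
// serving it.
import { add } from "./math";

console.log(add(1, 2)); // 3
```

```ts
// math.ts - a plain ES module; `export` is native ESM syntax.
export function add(a: number, b: number): number {
  return a + b;
}
```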


This small change also allows Vite to rethink how recompilation works in the development environment.

2. Rethinking Recompilation

With native ESM in place, Evan could rethink how the build lifecycle works.


One of those ideas was to separate source code from dependencies.


This was one of the main bottlenecks in bundlers like webpack and rollup.


Separating the two allowed the build to be redesigned around the lifecycle of each use case independently.


The use cases are:


  • Source code - Changes frequently

  • Dependencies - Changes less frequently


When changes occur, instead of rebuilding the whole bundle each time, Vite simply serves the affected modules on demand.


This is done by leveraging Vite's dev-server middleware and native ESM in the browser, which leads to better overall performance.
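To make the on-demand idea concrete, here is a heavily simplified sketch (this is not Vite's actual implementation; the file layout and extension handling are only illustrative). Each request for a source file is transpiled individually with esbuild, so editing one file never triggers a full-project rebuild:

```ts
// A toy dev server illustrating on-demand module serving (not Vite's real code).
import http from "node:http";
import { readFile } from "node:fs/promises";
import { transform } from "esbuild";

const server = http.createServer(async (req, res) => {
  try {
    const filePath = `.${req.url}`; // e.g. "./src/main.ts" (illustrative layout)
    if (filePath.endsWith(".ts")) {
      const source = await readFile(filePath, "utf8");
      // Transpile only the requested module, at request time.
      const { code } = await transform(source, { loader: "ts", format: "esm" });
      res.writeHead(200, { "Content-Type": "application/javascript" });
      res.end(code);
      return;
    }
    res.writeHead(404);
    res.end();
  } catch {
    res.writeHead(500);
    res.end();
  }
});

server.listen(5173); // 5173 happens to be Vite's default dev port
```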


Comparison of typical to ideal bundling process


Vite: An illustration of the process of serving modules on demand


So far, we've only talked about recompiling source code.


What about the dependencies? How does Vite handle that part?


That’s where pre-bundling comes in.

3. Pre-bundling

To further optimize the build process, Vite will pre-bundle the dependencies in your project.


It does this by crawling the source code, figuring out which dependencies need to be pre-bundled, and then running them through esbuild.


The outputs are cached on the filesystem and keyed to the lockfile, so they are only invalidated when the dependencies actually change.


So, that means no more re-bundling dependencies on every change!
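As a rough sketch of the caching idea (illustrative only; Vite's real check also considers things like relevant config fields and works with whichever lockfile your package manager uses):

```ts
// Sketch of lockfile-keyed cache invalidation (not Vite's actual code).
import { createHash } from "node:crypto";
import { existsSync, readFileSync } from "node:fs";

function lockfileHash(): string {
  // package-lock.json is one example; yarn.lock or pnpm-lock.yaml work the same way.
  const lockfile = readFileSync("package-lock.json", "utf8");
  return createHash("sha256").update(lockfile).digest("hex");
}

function isPrebundleCacheValid(previousHash: string): boolean {
  // Reuse the pre-bundled output in node_modules/.vite while the hash is unchanged.
  return existsSync("node_modules/.vite") && previousHash === lockfileHash();
}
```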


Other than pre-bundling dependencies, Vite also performs the following optimizations in the process:


  • Conversion to native ESM - Vite converts dependencies shipped as CommonJS or UMD into native ESM


  • Optimizing performance - Vite merges dependencies with many internal modules (e.g., lodash-es) into a single module to prevent request waterfalls when using native ESM
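Pre-bundling is mostly automatic, but it can be tuned via the optimizeDeps options in vite.config.ts. A minimal sketch (the excluded package name is hypothetical):

```ts
// vite.config.ts - a minimal sketch of tuning dependency pre-bundling.
import { defineConfig } from "vite";

export default defineConfig({
  optimizeDeps: {
    // Ensure lodash-es is pre-bundled into a single module with esbuild,
    // avoiding a waterfall of requests for its many internal files.
    include: ["lodash-es"],
    // Skip pre-bundling for a dependency that already ships browser-ready ESM
    // (package name is hypothetical).
    exclude: ["some-esm-only-lib"],
  },
});
```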


Vite: pre-bundling dependencies and caching in filesystem


Conclusion

So, to recap, there are 3 key decisions made by Evan You (the creator of Vite) that keep build times roughly constant as your codebase grows.


The key decisions are:


  • Use of Native ESM - Module systems are now natively supported in browsers, allowing Vite to go from building everything at once to serving modules on demand.


  • Rethinking Recompilation - Separating source code from dependencies and optimizing the build for each independently.


  • Pre-bundling - Dependencies are pre-bundled using esbuild and cached on the filesystem; the cache is only invalidated when the lockfile changes.


And that’s it! I hope you learned something new!


If you found this helpful or learned something new, please share this article with a friend or co-worker 🙏❤️! (Thanks!)


Also published here