tl;dr: It is now officially supported!

As part of an AWS to GCP migration, we wanted to leverage Google Cloud Functions (https://cloud.google.com/functions) to help us scale our infrastructure in a more manageable manner. Cloud Functions are written in JavaScript and execute in a standard Node.js runtime environment. How great is that? Or so we thought.

Instead of migrating our AWS monolith, we decided to extract isolated pieces and deploy them as independent services, one piece at a time. Our existing application is written in Node.js, and migrating to GCF (Google Cloud Functions) sounded straightforward. If it weren't for our private npm packages…

A note for clarity: GCF has always worked beautifully if you only rely on public npm packages.

Private npm Packages: The Problem

In order for our teams to share common code, we use private npm packages. GCF did not like that… There's a StackOverflow discussion with a workaround. It works great if your code has top-level dependencies that are private npm packages. If, like in our case, you have transitive dependencies that are also private npm packages, you either get really creative with your scripts or look for alternatives. In our case, we deployed to Google App Engine instead, which was only a temporary workaround as it becomes very expensive very quickly…

Private npm Packages: The Solution

Yesterday, I ended up on the GCF docs by accident (yeah, stuff like that happens) and found a new chapter: "Using private modules". Wait, what? Whaaaat?

"In order to use a private npm module, you have to provide credentials (auth token) for the npm registry in a .npmrc file located in the function's directory. You can simply copy the .npmrc file that was created in your home directory when you logged into npm using the npm login command."

That's the entire chapter, and it is such a game changer!

Let me know in the comments if this makes your decision to migrate to GCF simpler. Have questions? Ask away, I'm happy to help out wherever I can.
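In practice, that chapter boils down to staging an .npmrc next to your function source before deploying. Here's a minimal sketch; the function directory name, runtime flag, and the NPM_TOKEN environment variable are my own placeholders, not anything from the docs, and you'd normally just copy your real ~/.npmrc rather than write one by hand (and never commit a real token to source control):

```shell
# Stage npm registry credentials in the function's directory so the
# GCF build can install private packages (top-level and transitive).
mkdir -p my-function

# Equivalent to `cp ~/.npmrc my-function/.npmrc` after `npm login`;
# written out here with an env-var token reference for illustration.
cat > my-function/.npmrc <<'EOF'
//registry.npmjs.org/:_authToken=${NPM_TOKEN}
EOF

# Then deploy as usual (flags hedged; check your gcloud version):
# gcloud functions deploy myFunction --trigger-http --source my-function
```

The `${NPM_TOKEN}` form is npm's own env-var substitution, which keeps the token itself out of the file you ship.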