Today, we’ll learn how to integrate Sanity CMS into a Node.js application and use it to handle file uploads.
Sanity is a fully customizable, headless CMS that allows you to manage and deliver content to any platform or device.
We’ll take a step-by-step approach to demonstrate how you can create a Sanity project, integrate Sanity with your Node.js application, and handle files with it for easy CRUD (Create, Read, Update, Delete) operations.
By the end of this guide, you should have a solid understanding of file handling in Node.js and of how to use Sanity CMS.
We’ll be using Express.js, a framework for Node.js that gives developers an easy, manageable way to create a server and API routing. If you find the article interesting, bookmark it or keep reading.
So, let’s start.
If you’re new to Sanity CMS, I recommend you sign up before continuing with this guide. Here’s a link.
So, let’s create a folder and call it file_db. Inside it, we create two folders: one for the Node.js/Express.js application, called backend, and one for Sanity, called sanity_files.
file_db
├───backend
└───sanity_files
Next, cd sanity_files and run the following command in your terminal.
npm install -g @sanity/cli
Next, we initialize a new Sanity project by running sanity init in our terminal, from inside the sanity_files directory.
sanity init
After we run this command, a clean project is created, our dataset is set up (by default it is called production), and Sanity generates the files needed to run the application.
Some of you will be asked to log in before continuing. Also, don’t panic if you get the warning message below; just type y, and you’re good to go.
╭──────────────────────────────────────────╮
│ │
│ Update available 3.1.2 → 3.9.0 │
│ Run npm update @sanity/cli to update │
│ │
╰──────────────────────────────────────────╯
╔═══════════════════════════════════════════════════════════════════════════════════════╗
║ ⚠ Welcome to Sanity! Looks like you're following instructions for Sanity Studio v2, ║
║ but the version you have installed is the latest, Sanity Studio v3. ║
║ In Sanity Studio v3, new projects are created with [npm create sanity@latest]. ║
║ ║
║ Learn more about Sanity Studio v3: https://www.sanity.io/help/studio-v2-vs-v3 ║
╚═══════════════════════════════════════════════════════════════════════════════════════╝
? Continue creating a Sanity Studio v3 project? (Y/n) y
Next, we select Create new project. You’ll then be asked to name the Sanity project; let’s give it the same name as the Sanity folder we created, sanity_files. We type y to use the default dataset configuration, and for our project output path, we enter ./, meaning we want to use the directory we are currently in. After that, a list of project templates is offered, and we select Clean project with no predefined schemas. Finally, we are asked two things: whether we want to use TypeScript (of course we do, so type y), and which package manager to use for installing our dependencies (npm, of course).
The terminal should look similar to mine here.
✔ Fetching existing projects
? Select project to use Create new project
? Your project name: sanity_files
Your content will be stored in a dataset that can be public or private, depending on
whether you want to query your content with or without authentication.
The default dataset configuration has a public dataset named "production".
? Use the default dataset configuration? Yes
✔ Creating dataset
? Project output path: C:\Users\essel_r\Desktop\file_db\sanity_files
? Select project template Clean project with no predefined schemas
? Do you want to use TypeScript? Yes
✔ Bootstrapping files from template
✔ Resolving latest module versions
✔ Creating default project files
? Package manager to use for installing dependencies? npm
Running 'npm install --legacy-peer-deps'
After the dependencies are installed, the files should be ready.
The next step is to create a schema file and import it into index.ts in the schemas folder.
sanity_files
└───schemas
├───doc.ts
└───index.ts
Inside the doc.ts file, write the following code:
// doc.ts
import {defineField, defineType} from 'sanity'

export default defineType({
  name: 'doc',
  title: 'Doc',
  type: 'document',
  fields: [
    defineField({
      name: 'file',
      title: 'File',
      type: 'file',
    }),
  ],
})
If you’re not using TypeScript, you can still follow this guide. For TypeScript users, defineField and defineType add types to the fields we declare, so we can spot type errors when we make one.
With the code above, we create a schema document and name it doc, so later, when we decide to GET or DELETE a file, we can use _type == "doc" to retrieve the data and do whatever we want with it. Inside the defineType object’s fields, defineField creates a field called file, which has a type of file.
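As a quick, hypothetical example of the same pattern: if you later wanted each document to carry a title alongside the file, the schema could be extended like this (the title field is my own addition for illustration, not part of this guide’s project):

```javascript
// Hypothetical extension of doc.ts — the 'title' field is illustrative only.
import {defineField, defineType} from 'sanity'

export default defineType({
  name: 'doc',
  title: 'Doc',
  type: 'document',
  fields: [
    defineField({name: 'file', title: 'File', type: 'file'}),
    // Assumed extra field, not in the original tutorial:
    defineField({name: 'title', title: 'Title', type: 'string'}),
  ],
})
```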
Now, Sanity has a lot of reserved types; here are a few: url, slug, file, string, and image.
When you’re done, import the doc.ts file into the index.ts file.
import doc from './doc'
export const schemaTypes = [doc]
Then, in the terminal and inside the sanity_files directory, we run npm run dev to access the local Sanity Studio.
$ npm run dev
> [email protected] dev
> sanity dev
✔ Checking configuration files...
Sanity Studio using [email protected] ready in 7136ms and running at http://localhost:3333/
Note: You might be asked to sign in again to access the Studio. Once you’re in, your first assignment is to upload a file yourself. Here is mine:
Done? If so, click on Vision, and you should see something like this. In the query field, we can test a query to make sure we are requesting data correctly; after writing the query, we press Enter, and the data is displayed for us to read and debug in case there’s an error. So, let’s make our first query.
*[_type == "doc"]{
"file": file.asset->,
}
With the above query, we are saying: “Look for documents in our schema whose type is doc; if any exist, return the file information.”
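If GROQ is new to you, here’s a rough sketch of the same filtering idea in plain JavaScript over a made-up in-memory array — just an analogy to build intuition, not how Sanity actually executes queries:

```javascript
// Hedged sketch: emulating what *[_type == "doc"] selects, using a fake
// in-memory dataset. These documents are made up for illustration.
const dataset = [
  { _type: 'doc', file: { asset: { _ref: 'file-abc' } } },
  { _type: 'post', title: 'unrelated' },
];

// *[_type == "doc"] keeps only documents whose _type is "doc"
const result = dataset.filter((d) => d._type === 'doc');

console.log(result.length); // 1
```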
And with that, we are done with our Sanity project setup. Here’s the code, so let’s move on to our backend.
We cd into our backend folder and run npm init -y to initialize a new Node.js project.
$ npm init -y
Wrote to C:\Users\essel_r\Desktop\file_db\backend\package.json:
{
"name": "backend",
"version": "1.0.0",
"description": "",
"main": "server.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
By default, we get "main": "index.js", but I changed it to server.js; it’s your choice to keep or change it. Next, we install our dependencies; these libraries will help us run the application and implement certain logic without having to write all the code ourselves.
$ npm install express next-sanity dotenv multer cors cron mongoose nodemon morgan
After the installation is done, let’s move on to how we’ll structure our folders and files.
backend
├───src
│   ├───controller
│   │   └───files.js
│   ├───model
│   │   └───files.js
│   ├───routes
│   │   └───files.js
│   ├───uploads
│   ├───utils
│   │   ├───config
│   │   │   ├───database.js
│   │   │   └───sanity.js
│   │   └───services
│   │       └───index.js
│   └───server.js
└───.env
In our backend project, we organize our code into different folders inside the src directory. These folders include:

controller: Here, we write functions that perform specific tasks, such as creating or deleting a post. This is where the logic of our application resides.

model: This folder defines the schema structure of our API. It specifies the data type of each field, such as boolean, string, number, or an array of numbers and strings.

routes: In this folder, we define the paths for our API that developers can access. For example, we may define a path http://domain-name/api/v1/files which returns an array of objects, objects, strings, booleans, or numbers.

uploads: This folder is where all our files will be uploaded.

utils: This folder contains configuration files and helper functions we’ll use later for specific tasks. Inside utils, we have a folder called config with configuration files like cron-schedule.js, database.js, and sanity.js. We also have a folder called services that contains an index.js file where we define some utility functions.

server.js: This file is the main entry point of our application.

Overall, this folder structure helps us organize our code and separate concerns. The controller folder handles the business logic of our application, the model folder specifies the data types of our API, and the routes folder defines the paths for our API. The uploads folder is where we store files uploaded to our application, and the utils folder contains helpful configuration files and functions. Finally, server.js is the entry point of our application, where we start the server and mount the routes.
We head into our package.json to add dev and start commands to the scripts.
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"dev": "nodemon ./src/server",
"start": "node ./src/server"
},
Then we run npm run dev.
$ npm run dev
> [email protected] dev
> nodemon ./src/server
[nodemon] 2.0.20
[nodemon] to restart at any time, enter `rs`
[nodemon] watching path(s): *.*
[nodemon] watching extensions: js,mjs,json
[nodemon] starting `node ./src/server.js`
[nodemon] clean exit - waiting for changes before restart
Now, let’s get started with the coding. Note: I assume you know Node.js, and we are using Express.js, a framework for Node.js. Though I will explain the code, whether simple or complex, it is important that you know the basic inner workings of Node.js.
Head over to server.js.
// @desc file path
// server.js
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const dotenv = require('dotenv');
const { connectDatabase } = require('./utils/config/database');

// Load environment variables before reading them
dotenv.config({ path: '.env' });

const PORT = process.env.PORT || 8000;
const app = express();

connectDatabase();

app.use(morgan('tiny'));
app.use(cors());
app.use(express.json());
app.use(express.urlencoded({ extended: false }));

app.use('/api/v1/files', require('./routes/files'));

app.listen(PORT, () =>
  console.log(`Server running on: http://localhost:${PORT}`)
);
In the above code, we create an Express server to handle HTTP requests. Let's break down the code:
- We import express, cors, morgan, and dotenv.
- We load environment variables from the .env file using dotenv.config().
- We set the PORT variable to the value of the PORT environment variable, or default to 8000 if PORT is not set.
- We create an express application and call it app.
- We connect to the database by calling the connectDatabase() function from the ./utils/config/database module.
- We add middleware with app.use(): the morgan middleware logs HTTP requests in a concise format, the cors middleware enables cross-origin resource sharing, the express.json() middleware parses JSON request bodies, and the express.urlencoded() middleware parses URL-encoded request bodies.
- We mount the /api/v1/files route using app.use() with the require('./routes/files') module. This means any requests to the path /api/v1/files will be handled by the code defined in the ./routes/files module.
- Finally, we start the server listening on the specified PORT using app.listen(), and log a message to the console indicating that the server is running.
Overall, this code sets up an Express server, establishes a database connection, adds middleware, and mounts a route to handle HTTP requests.
Next, we connect our Node.js application to MongoDB, so head over to MongoDB Atlas.
If you already have an account, sign in, create a project, add your username and password, and set your IP address access. If you’re new, create your account, and let me show you how it’s done.
After registering, complete the form and click Finish; then, in the second step, we deploy our database.
Select the free tier and any provider you want, then either give the cluster a name or use the default name MongoDB gives you, and click on Create.
We then create a user by filling out the form above. Note: it is not advisable to use the username as the password; since this is a tutorial, I decided to do it anyway, and I will delete this user after the guide.
After, click on Create User.
When you scroll down a little, you will see the IP Access List; add the entry 0.0.0.0/0 (this means anyone, from any part of the world, can reach the database). Then click on Add Entry.
We’ll then be redirected to our Database Deployment section. So to create the connection between our database and the Node.js application, we click on Connect.
Then select Drivers.
mongodb+srv://filedb:<password>@cluster0.m3itchu.mongodb.net/?retryWrites=true&w=majority
We copy a connection string that looks a lot like this. Remember the password we gave the user? We replace <password> with the password we set earlier, which was filedb. Then, in our .env file, we should have something like this:
MONGODB_URI="mongodb+srv://filedb:[email protected]/?retryWrites=true&w=majority"
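If you’d rather not paste the password by hand, the substitution can be sketched as a tiny helper. The function name and the cluster host below are made up for illustration:

```javascript
// Hedged sketch: substituting <password> into the Atlas connection string.
// The helper name and example values are illustrative, not tutorial code.
function withPassword(uriTemplate, password) {
  // encodeURIComponent handles special characters in the password
  return uriTemplate.replace('<password>', encodeURIComponent(password));
}

const uri = withPassword(
  'mongodb+srv://filedb:<password>@cluster0.example.mongodb.net/?retryWrites=true&w=majority',
  'filedb'
);

console.log(uri.includes('<password>')); // false
```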
Now, we connect the database to our application, so head over to backend/src/utils/config/database.js. This is the code that creates the bridge from our application to MongoDB (our database).
// @desc file path
// /utils/config/database.js
const mongoose = require('mongoose');
const dotenv = require('dotenv');

dotenv.config({ path: '.env' });

const url = `${process.env.MONGODB_URI}`;

const connectDatabase = async () => {
  try {
    await mongoose.connect(url);
    mongoose.set('strictQuery', true);
    console.log(`MONGODB is connected`);
  } catch (error) {
    console.log(error);
  }
};

module.exports = { connectDatabase };
Overall, this code sets up a connection to a MongoDB database using mongoose and exports a function that can be used to establish the connection. The strictQuery option ensures that queries are validated against the defined schema, providing an extra layer of data integrity. Here's a breakdown of the code:
- We import mongoose and dotenv.
- We load environment variables from the .env file using dotenv.config().
- We create a url variable containing the value of the MONGODB_URI environment variable. This is the URL we will use to connect to our MongoDB database.
- We define an async function called connectDatabase.
- Inside the connectDatabase function, we use a try/catch block to attempt to connect to the database using mongoose.connect(). If the connection succeeds, we log a message to the console saying so.
- We set the strictQuery option on the mongoose object to true. This ensures that queries with fields not specified in the schema are rejected, preventing data loss or corruption.
- Finally, we export the connectDatabase function so that it can be used in our server.js.
Done!!
Now you know how to set up and deploy a MongoDB database and connect it to a Node.js application. Next, we create our API route, and then move on to the controller.
// @desc file path
// routes/files.js
const express = require('express');
const { upload } = require('../utils/services');
const { PostFile } = require('../controller/files');
const router = express.Router();
router.post('/', upload.single('file'), PostFile);
module.exports = router;
The above code sets up a route for handling file uploads using the multer library and the upload middleware function. The PostFile function is responsible for processing the uploaded file and creating a new file record in the database.
The router is exported so that it can be mounted in our main Express app and handle incoming requests. Here's a breakdown of the code:
- We import express, upload from ../utils/services, and PostFile from ../controller/files.
- We create a router object using express.Router().
- We define a POST route using router.post().
- The route path is /, which means it will handle requests to the root of the path it is mounted on.
- We use the upload middleware function to handle file uploads. upload is defined in the ../utils/services module and is responsible for processing and validating file uploads using the multer library.
- On the upload middleware we call the single method, which specifies that we are expecting a single file under the field name file.
- After the file is uploaded, the PostFile function from ../controller/files is called. This function is responsible for creating a new file record in the database and returning a response to the client.
- Finally, the router is exported so it can be mounted with app.use() in the server.js file.
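To make the middleware chaining idea concrete, here’s a hedged sketch that emulates how Express runs upload.single('file') and then PostFile, using plain functions and a fake request/response. All names here are illustrative; this is not Express’s actual implementation:

```javascript
// Hedged sketch: a minimal middleware runner, in the spirit of Express.
// Each middleware receives (req, res, next) and calls next() to continue.
function runChain(middlewares, req, res) {
  let i = 0;
  const next = () => {
    const mw = middlewares[i++];
    if (mw) mw(req, res, next);
  };
  next();
}

// Fake stand-ins for upload.single('file') and PostFile:
const fakeUpload = (req, _res, next) => {
  req.file = { originalname: 'a.txt' }; // multer attaches req.file like this
  next();
};
const fakePostFile = (req, res) => {
  res.body = `saved ${req.file.originalname}`; // terminal handler, no next()
};

const res = {};
runChain([fakeUpload, fakePostFile], {}, res);
console.log(res.body); // saved a.txt
```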
With that, we have defined our API route, but before we go to the controller folder, we first need to set up the multer middleware upload.single('file') to handle the file upload. To implement that, we head to /utils/services/index.js.
// @desc file path
// services/index.js
const multer = require('multer');
const path = require('path');

const storageEngine = multer.diskStorage({
  destination: (_req, _file, cb) => {
    cb(null, path.join(__dirname, '../../uploads'));
  },
  filename: (_req, file, cb) => {
    cb(null, `${Date.now()}-${file.originalname.replaceAll(' ', '-')}`);
  },
});

const upload = multer({ storage: storageEngine });

module.exports = { upload };
First, let’s talk about the Multer library, so that if you’re new to it, you’ll at least know what it is and how it works.
Multer is a popular Node.js middleware library for handling multipart/form-data, which is typically used for file uploads. It parses the form data and stores the uploaded files on the server.

It provides a disk storage engine that you can use to specify where uploaded files should be stored on the server. You configure the storage engine with the multer.diskStorage() method, which takes an object with two properties: destination and filename.

The destination property specifies the directory or path where uploaded files should be stored, and the filename property specifies how the filename should be generated. I think that’s enough for basic knowledge; to dive deeper, check the documentation here.
So here is a breakdown of what this code does:
- We import multer and path.
- We create a storageEngine object using multer.diskStorage(). This object configures the storage engine Multer uses to store uploaded files on the server. It has two properties: destination and filename.
- The destination property specifies the directory where uploaded files will be stored. In this case, we set it to the uploads directory, resolved relative to the current file using path.join().
- The filename property specifies the name the uploaded file gets on the server. We combine the current timestamp with the original filename to create a unique name for each uploaded file, and we replace any spaces in the original filename with hyphens using replaceAll() to avoid issues with spaces in filenames.
- We create the multer middleware function using the storageEngine configuration object we created earlier.
- Finally, we export the upload middleware function so it can be used in our routes to handle file uploads.
In summary, this code sets up a multer middleware function for handling file uploads. The storageEngine configuration object specifies where uploaded files are stored and how their names are generated, and the upload middleware function created from it is exported so that our routes can use it to handle file uploads.
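The naming scheme itself can be pulled out and checked in isolation. Here’s a small sketch — the helper name is mine, not part of the project code:

```javascript
// Hedged sketch of the naming scheme from services/index.js:
// timestamp + original name, with spaces replaced by hyphens.
function makeStoredFilename(originalname, now = Date.now()) {
  return `${now}-${originalname.replaceAll(' ', '-')}`;
}

console.log(makeStoredFilename('my report.pdf', 1700000000000));
// 1700000000000-my-report.pdf
```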
In this section, we design the API data structure in the model folder, after creating a files.js inside it.
// @desc file path
// model/files.js
const Mongoose = require('mongoose');

const FileModel = new Mongoose.Schema({
  url: { type: String, unique: true, required: [true, 'This field is required'] },
  size: { type: Number, required: true },
  name: { type: String, required: [true, 'This field is required'] },
  mimeType: { type: String, required: [true, 'This field is required'] },
  extension: { type: String, required: [true, 'This field is required'] },
  cms_id: { type: String, unique: true, required: [true, 'This field is required'] },
  createdAt: { type: String, required: [true, 'This field is required'] },
  updatedAt: { type: String, required: [true, 'This field is required'] },
});

const File = Mongoose.model('FileModel', FileModel);

module.exports = { File };
This code defines a schema for the files that will be uploaded through the API. It uses the Mongoose library to define the schema and create a model from it.
The schema includes fields such as the file’s URL, size, name, MIME type, file extension, and the ID it has in the content management system (CMS).
It also includes timestamps for when the file was created and last updated. The File constant holds the Mongoose model for this schema, and it is exported so that it can be used in other parts of the code, such as the controller.
To connect our Sanity CMS, we’ll need a token from Sanity that allows us to write to our CMS without any problems, and we’ll also add our server URL so we don’t get blocked by the CORS policy. After that, we import the token, create a Sanity config file, and then create the connection.
So that’s pretty much the whole process, so let’s get on with it. Head over to the Sanity manager here,
Select the project we created earlier in this guide; for me, it is sanity_files. Then copy your Project ID; in my case, it is l4h1g3vt. Remember to store it, and even if you forget, you can always find it here.
The next step is to locate and create our Sanity token and set the access level and CORS.
So click on the API
tab; then on the left sidebar, click on Tokens, then click on Add API Token.
Then set the name, and select Editor under Permissions to gain both write and read access tokens.
Then after clicking on Save, copy your token, and add it to .env
.
SANITY_SECRET_TOKEN="skuDjxH0Psmf6uwQTgixOMJjfwMFzQx43TmFICReklDASWGtwQAURw9njY8qoA99IV0jVLUTG2M1EwcCFlTk8JGk0bEinuZE01pGSOvDRY5rOYxqDNRKVCRLD02R6QgEkCJdlQpxTqHQKW2ilzJdZ2Hvaykf9bbVaEW1MkqsKXb5ZSgoxHtb"
MONGODB_URI="mongodb+srv://filedb:[email protected]/?retryWrites=true&w=majority"
Furthermore, we set our CORS origins: on the left sidebar, click on CORS origins, then click on Add CORS origin.
After clicking on Save, it is done. So let’s head back to our backend, go into /utils/config/sanity.js, and connect using the information we got.
// @desc file path
// utils/config/sanity.js
const { createClient } = require('@sanity/client');

const config = {
  projectId: 'l4h1g3vt',
  dataset: 'production',
  useCdn: true,
  apiVersion: '2023-02-17',
  token: process.env.SANITY_SECRET_TOKEN,
};

const Client = createClient(config);

module.exports = { Client };
Overall, this code exports a client object that can be used to interact with a specific Sanity project and dataset, using the provided configuration options for authentication and authorization. Thus, connecting to the Sanity content management system (CMS).
The client is created using the @sanity/client package, which provides a set of methods for interacting with the Sanity API.

The config object specifies the configuration options for the client instance. It includes the projectId and dataset fields, which determine which project and dataset the client connects to.

The useCdn option specifies whether to use the Sanity CDN, which serves cached responses for faster content delivery. The apiVersion field specifies the version of the Sanity API that the client will use.

Finally, the token field is used for authentication and authorization of API requests and is retrieved from the environment variables via process.env.SANITY_SECRET_TOKEN.

The Client object is created by passing the configuration object to the createClient() method from the @sanity/client package. This object can then be used to make requests to the Sanity API to fetch and modify content.
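As a side note, you could avoid hardcoding the project ID by reading it from the environment. Here’s a hedged sketch — the SANITY_PROJECT_ID and SANITY_DATASET variable names are my own assumptions, not something this guide sets up:

```javascript
// Hedged sketch: building the client config from environment variables
// with fallbacks. The env var names below are hypothetical.
function makeSanityConfig(env) {
  return {
    projectId: env.SANITY_PROJECT_ID || 'l4h1g3vt', // fallback to the guide's ID
    dataset: env.SANITY_DATASET || 'production',
    useCdn: true,
    apiVersion: '2023-02-17',
    token: env.SANITY_SECRET_TOKEN,
  };
}

console.log(makeSanityConfig({}).dataset); // production
```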
So let's move on to our controller since we are done here.
Controller
So, in our routes folder’s files.js, we saw that PostFile is responsible for handling file uploads on the / route. We’ll now create PostFile in the controller folder, and then we’ll be able to make a request and get back a response.
So, in the controller folder, we create a files.js:
// @desc file path
// controller/files.js
const { File } = require('../model/files');
const fs = require('fs');
const { Client } = require('../utils/config/sanity');

const PostFile = async (request, response) => {
  try {
    const postToSanity = await Client.assets.upload(
      'file',
      fs.createReadStream(`${request.file?.path}`),
      { filename: `${request.file?.originalname.replaceAll(' ', '-')}` }
    );
    const file = await File.create({
      url: postToSanity.url,
      size: postToSanity?.size,
      name: postToSanity?.originalFilename,
      mimeType: postToSanity?.mimeType,
      extension: postToSanity?.extension,
      cms_id: postToSanity?._id,
      createdAt: postToSanity?._createdAt,
      updatedAt: postToSanity?._updatedAt,
    });
    response.json(file);
  } catch (error) {
    console.log(error);
    // Let the client know the upload failed instead of leaving the request hanging
    response.status(500).json({ message: 'File upload failed' });
  }
};

module.exports = { PostFile };
This is a controller function that is responsible for handling a POST request for uploading a file to the server. Here's what the code does:
The function first tries to upload the file to Sanity, a content management system, using the Sanity Client library. The file is read from the server's file system using the fs
library and passed to the Client.assets.upload
method along with some metadata.
If the upload to Sanity is successful, the function creates a new document in the MongoDB database using the Mongoose File
model. The document includes information about the uploaded file such as its URL, size, name, MIME type, extension, and some metadata from Sanity.
Finally, the function sends a JSON response back to the client with the newly created file
document.
If any errors occur during this process, they are caught and logged to the console.
Let’s explain this part in more detail.
To upload a file of any type, we use Client.assets.upload().
The upload() function expects three parameters: the asset type, which in this case is file; the asset content, which can be a browser File instance, a Blob, a Node.js Buffer instance, or a Node.js ReadableStream; and lastly an options object with the filename, which is how we want the file to be saved and named in the file URL in our CMS.
await Client.assets.upload(
'file',
fs.createReadStream(`${request.file?.path}`),
{ filename: `${request.file?.originalname.replaceAll(' ', '-')}` }
);
I guess the rest is self-explanatory.
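That said, the “rest” — mapping the Sanity asset response onto the fields of our Mongoose model — can be sketched as a pure function, which makes it easy to see what ends up in MongoDB. The sample asset below is made up for illustration:

```javascript
// Hedged sketch: the asset-to-record mapping from controller/files.js,
// pulled out as a pure function. The sample asset is fabricated.
function toFileRecord(asset) {
  return {
    url: asset?.url,
    size: asset?.size,
    name: asset?.originalFilename,
    mimeType: asset?.mimeType,
    extension: asset?.extension,
    cms_id: asset?._id,
    createdAt: asset?._createdAt,
    updatedAt: asset?._updatedAt,
  };
}

const record = toFileRecord({
  url: 'https://cdn.sanity.io/files/abc/production/xyz.pdf',
  size: 1024,
  originalFilename: 'xyz.pdf',
  mimeType: 'application/pdf',
  extension: 'pdf',
  _id: 'file-xyz',
  _createdAt: '2023-02-17T00:00:00Z',
  _updatedAt: '2023-02-17T00:00:00Z',
});

console.log(record.cms_id); // file-xyz
```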
Done! Here is a gif file to see the actual thing working.
Now we know how to create a Sanity project, configure it, and connect it to our Node.js (Express.js) application; how to configure and connect MongoDB to our backend; and finally, how to handle file uploads with Sanity CMS, MongoDB, Multer, and Node.js (Express.js).
I hope you enjoyed it. I’ll update this later to include DELETE, PUT, and GET all together, or you can be a little adventurous and try to implement them yourself. And here is the whole code.
Also, I implemented it in this project, which is live here.