What Does a Server Do?
As we mentioned before, Node.js allows us to interact with the operating system. One part of this is the file system, which software engineers have to work with constantly. For example, when you send a post on Instagram, the server first needs to receive the post and write it to disk. The same thing happens when you scroll through your feed: the user requests an image, the server finds the right file, and it sends that data back to the user.
In this lesson, we'll talk about how to work with files on the server and teach you how to read data from files and folders, write new data to a file, create directories, and delete files. Let's go!
What Module Do We Need?
Node.js comes with the fs module, which allows us to access and manipulate the file system. There's a built-in method for each operation you might need to perform. Let's start off with reading files. For that, we have the readFile() function.
This function works asynchronously and takes three arguments: the path of the file we want to read, an optional options object, and a callback. In the callback, we describe what should be done with the data.
The callback has two parameters. Normally, the first parameter of a Node.js callback is an err parameter, which is used to handle potential errors. Second, we have the data parameter, which represents the contents of the file.
const fs = require('fs');
fs.readFile('data.json', (err, data) => {
  if (err) {
    console.log(err);
    return;
  }

  console.log('data: ', data.toString('utf8'));
});
The first callback parameter can have one of the following two values:
If an error occurs while reading the file, the value of this parameter will be an object containing the error information
If the file is read successfully and there's nothing wrong with it, this parameter will have a value of null
As mentioned above, the second parameter of the callback is the file data. It arrives as binary data and is referred to as buffer data because it's an instance of JavaScript's global Buffer class. To work with this data as text, we first need to convert it into a string. There are two ways of doing this:
By using the toString() method. It'll look like this: data.toString('utf8'). This method takes a string as an argument, whose value is the encoding format into which we want to convert this data.
By passing the encoding format inside the encoding property of the options object of the readFile() method. If we do things this way, we don't need an extra method to convert the file as it will already be in the form of a string:
const fs = require('fs');
// the options object is passed as the second argument;
// its encoding property specifies the character encoding to use
fs.readFile('data.json', { encoding: 'utf8' }, (err, data) => {
  if (err) {
    console.log(err);
    return;
  }

  console.log('data: ', data); // the data already comes as a string, so we don't need to call toString() here
});
What Else Can the fs Module Do?
It Can Read All the Files in a Directory
Node.js provides the fs.readdir() method for doing this. The first argument of this method is the path to the directory. The second one is a callback, which describes what should be done with the data returned.
The callback also has two parameters — an error parameter (err) and an array of the file names:
const fs = require('fs');
fs.readdir('.', (err, files) => {
  if (err) {
    console.log(err);
    return;
  }

  console.log('data: ', files);
});
It Can Create Folders
The method for creating folders is fs.mkdir(). It takes two arguments: the name of the new folder, and a callback with a single parameter, i.e. the error object. When passing the first argument, we can specify the path to this new folder along with its name:
const fs = require('fs');
fs.mkdir('incomingData/data', (err) => {
  if (err) console.log(err);
});
It Can Write Data to a File
This is done with the fs.writeFile() method. It has three parameters:
The file to which we want to write data
Data in the form of a string
A callback for error processing
const fs = require('fs');
fs.writeFile('data.json', JSON.stringify([1, 2, 3]), (err) => {
  if (err) console.log(err);
});
It Can Delete Files
To delete files, we use the fs.unlink() method, which takes two arguments — the file name and a callback for processing errors:
const fs = require('fs');
fs.unlink('data.json', (err) => {
  if (err) {
    console.log(err);
    return;
  }

  console.log('The file was deleted!');
});
It Can Do a Lot of Other Useful Things
The remaining methods of the fs module work in much the same way. If you want to perform an operation on a file that we haven't covered here, check the Node.js documentation — you'll most likely find a method for it there.
Using Promises when Working with Files
Node.js v10.0.0 introduced a version of the fs module that supports promises. When we use promises, we don't need to pass callbacks. If the operation succeeds, the promise is resolved; if it fails, the promise is rejected. To handle the success case, all you need to do is add a then() handler and put the code you want executed inside it:
const fsPromises = require('fs').promises;
fsPromises.readFile('data.json', { encoding: 'utf8' })
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.log(err);
  });
Documentation on the fs Promises API.
Routing Problems
Working with the file system means dealing with file paths. But how should we write them? Relative to the entry point, or relative to the file where the code is located? To figure this out, let's consider the following example:
Let's say we have the app.js file as our entry point, which contains the following code:
// app.js
const fs = require('fs');
const readFile = () => {
  // reading file.txt with a relative path
  fs.readFile('file.txt', { encoding: 'utf8' }, (err, data) => {
    console.log(data); // logging the contents of the file to the console
  });
};

readFile();
After that, let's say we decide to move this file and the logic for working with it to a separate folder, which results in the following file structure:

project/
  app.js
  files/
    read-file.js
    file.txt
Since the logic for working with the file is now stored in a different folder, we need to connect it to the entry point, which is the app.js file. To do that, we need to import the readFile() function:
// app.js
const fs = require('fs');
const { readFile } = require('./files/read-file');
readFile();
Then, we export that same function from the read-file.js file:
// read-file.js
const fs = require('fs');
module.exports.readFile = () => {
  fs.readFile('file.txt', { encoding: 'utf8' }, (err, data) => {
    console.log(data);
  });
};
This code will lead to an error, because Node.js won't be able to find file.txt. The problem lies in the relative path. fs methods don't resolve relative paths against the module where the function is defined; they resolve them against the current working directory, i.e. the directory from which the application was launched (process.cwd()). We could change the path from file.txt to files/file.txt, but this is not ideal: as we add more files to our project, it will become difficult to keep track of all the paths, and the code will break again if the application is launched from a different directory.
Thankfully, there's a simple solution. We can make the routes dynamic. Instead of writing the path explicitly, we can read it from its module. To make this happen, there are two things to consider:
We need to know where the module we want to access is located.
We need the built-in path module for working with directories and file paths. This module allows us to take folder names, join them together, and create a path.
Let's talk about each of these in more detail.
What Does a Module Store?
Each Node.js module contains information about itself and its environment. For example, we can check a module's location or see whether or not it's our application's entry point.
Where Is a Module Located?
Every module contains the __filename and __dirname variables, which store the module's file path and directory path, respectively.
// app.js
console.log(__filename); // /usr/local/project/app.js
console.log(__dirname); // /usr/local/project
We could have used a template literal or concatenation to make the path dynamic:
fs.readFile(`${__dirname}/file.txt`, { encoding: 'utf8' }, (err, data) => {
  console.log(data);
});
However, it's better to avoid doing so, because different operating systems use different separators: macOS and Linux use a forward slash (/), while Windows uses a backslash (\). To avoid any confusion with slashes, it's better to build paths using the path module, which was designed specifically for this purpose.
How Do We Modify a Route?
The path module provides various methods for working with file and directory paths. One of these methods is the join() method, which joins the specified path segments together and returns what's referred to as a normalized path. This method accounts for the operating system being used, so we avoid any problems with slashes:
// read-file.js
const fs = require('fs');
const path = require('path');
module.exports.readFile = () => {
  // joining the path segments to create an absolute path
  const filepath = path.join(__dirname, 'file.txt');

  fs.readFile(filepath, { encoding: 'utf8' }, (err, data) => {
    console.log(data);
  });
};
Here are some more useful methods of the path module:
const path = require('path');

// the path.normalize() method normalizes the specified path,
// resolving '..' and '.' segments
path.normalize('/foo/bar//baz/asdf/quux/..'); // /foo/bar/baz/asdf

// the path.dirname() method returns the directory name of the given path
path.dirname(require.main.filename); // /usr/local/my-project

// the path.extname() method returns the extension of a file path
path.extname('app.js'); // .js
You can read more about other methods in the official Node.js documentation.
The fs and path modules are essential for working with file systems. The fs module contains methods for performing operations on the files themselves, while the path module provides the tools for creating normalized paths between them. Both these modules allow us to manage the file system without affecting the flexibility or functionality of our project.