How to Read a Config.json File in JavaScript
Introduction to JSON
JavaScript Object Notation, commonly referred to as JSON, is one of the most popular formats for data storage and data interchange over the internet. The simplicity of the JSON syntax makes it easy for both humans and machines to read and write.
Despite its name, the use of the JSON data format is not limited to JavaScript. Most programming languages implement data structures that you can easily convert to JSON and vice versa.
JavaScript, and therefore the Node.js runtime environment, is no exception. More often than not, this JSON data needs to be read from or written to a file for persistence. The Node runtime environment has the built-in fs module specifically for working with files.
This article is a comprehensive guide on how to use the built-in fs module to read and write data in JSON format. We will also look at some third-party npm packages that simplify working with data in the JSON format.
Serializing and deserializing JSON
Serialization is the process of converting an object or data structure to a format that is easy to store or transfer over the internet. You can recover the serialized data by applying the reverse process.
Deserialization refers to transforming the serialized data structure to its original format.
You will almost always need to serialize a JavaScript object to a JSON string in Node. You can do so with the JSON.stringify method before writing it to a storage device or transmitting it over the internet:
const config = { ip: '1234.22.11', port: 3000 };
console.log(JSON.stringify(config));

On the other hand, after reading the JSON file, you will need to deserialize the JSON string to a plain JavaScript object using the JSON.parse method before accessing or manipulating the data:
const config = JSON.stringify({ ip: '1234.22.11', port: 3000 });
console.log(JSON.parse(config));

JSON.stringify and JSON.parse are globally available methods in Node. You don't need to install or require them before use.
Introduction to the fs module
Because the fs module is built in, you don't need to install it. It provides functions that you can use to read and write data in JSON format, and much more.
Each function exposed by the fs module has a synchronous, callback, and promise-based form. The synchronous and callback variants of a method are accessible from the synchronous and callback API. The promise-based variant of a function is accessible from the promise-based API.
Synchronous API
The synchronous methods of the built-in fs module block the event loop and further execution of the remaining code until the operation has succeeded or failed. More often than not, blocking the event loop is not something you want to do.
The names of all synchronous functions end with the Sync suffix. For instance, writeFileSync and readFileSync are both synchronous functions.
You can access the synchronous API by requiring fs:
const fs = require('fs');

// Blocks the event loop
fs.readFileSync(path, options);

Callback API
Unlike the synchronous methods that block the event loop, the corresponding methods of the callback API are asynchronous. You'll pass a callback function to the method as the last argument.
The callback function is invoked with an Error object as the first argument if an error occurs. The remaining arguments to the callback function depend on the fs method.
You can also access the methods of the callback API by requiring fs, just like the synchronous API:
const fs = require('fs');

fs.readFile(path, options, callback);

Promise-based API
The promise-based API is asynchronous, like the callback API. It returns a promise, which you can manage via promise chaining or async/await.
You can access the promise-based API by requiring fs/promises:
const fs = require('fs/promises');

fs.readFile(path)
  .then((data) => {
    // Do something with the data
  })
  .catch((error) => {
    // Do something if an error occurs
  });

We used the CommonJS syntax for accessing the modules in the code snippets above. We will use the CommonJS syntax throughout this article because Node treats JavaScript code as a CommonJS module by default. You can also use ES6 modules if you want.
According to the Node documentation, the callback API of the built-in fs module is more performant than the promise-based API. Therefore, most examples in this article will use the callback API.
How to read JSON files in Node.js
The Node runtime environment has the built-in require function and the fs module that you can use for loading or reading JSON files. Because require is globally available, you don't need to require it.
However, you will need to require the fs module before using it. I will discuss how to read JSON files using the built-in fs module and the require function in the following subsections.
How to load a JSON file using the global require function
You can use the global require function to synchronously load JSON files in Node. After a file is loaded with require, it is cached. Therefore, loading the same file again using require will return the cached version. In a server environment, the file will be loaded again only on the next server restart.
It is therefore appropriate to use require for loading static JSON files such as configuration files that do not change often. Do not use require if the JSON file you load keeps changing, because it will cache the loaded file and use the cached version if you require the same file again. Your latest changes will not be reflected.
Assuming you have a config.json file with the following content:
{
  "port": "3000",
  "ip": "127.00.12.3"
}

You can load the config.json file in a JavaScript file using the code below. require will always load the JSON data as a JavaScript object:
const config = require('./config.json');
console.log(config);

How to read a JSON file using the fs.readFile method
You can use the readFile method to read JSON files. It asynchronously reads the contents of the entire file into memory, so it is not the most optimal method for reading large JSON files.
The readFile method takes three arguments. The code snippet below shows its function signature:
fs.readFile(path, options, callback);
The first argument, path, is the file name or the file descriptor. The second is an optional object argument, and the third is a callback function. You can also pass a string as the second argument instead of an object. If you pass a string, it specifies the encoding.
The callback function takes two arguments. The first argument is the error object if an error occurs, and the second is the serialized JSON data.
The code snippet below will read the JSON data in the config.json file and log it to the terminal:
const fs = require('fs');

fs.readFile('./config.json', 'utf8', (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  console.log(JSON.parse(data));
});

Make sure to deserialize the JSON string passed to the callback function before you start working with the resulting JavaScript object.
How to read a JSON file using the fs.readFileSync method
readFileSync is another built-in method for reading files in Node, similar to readFile. The difference between the two is that readFile reads the file asynchronously while readFileSync reads the file synchronously. Therefore, readFileSync blocks the event loop and execution of the remaining code until all the data has been read.
To grasp the difference between synchronous and asynchronous code, you can read the article "Understanding asynchronous JavaScript."
Below is the function signature of fs.readFileSync:
fs.readFileSync(path, options);
path is the path to the JSON file you want to read, and you can pass an object as the second argument. The second argument is optional.
In the code snippet below, we are reading JSON data from the config.json file using readFileSync:
const { readFileSync } = require('fs');

const data = readFileSync('./config.json');
console.log(JSON.parse(data));

How to write to JSON files in Node.js
Just like reading JSON files, the fs module provides built-in methods for writing to JSON files.
You can use the writeFile and writeFileSync methods of the fs module. The difference between the two is that writeFile is asynchronous while writeFileSync is synchronous. Before writing to a JSON file, make sure to serialize the JavaScript object to a JSON string using the JSON.stringify method.
How to write to JSON files using the fs.writeFile method
JSON.stringify will format your JSON data in a single line if you do not pass the optional formatting argument to the JSON.stringify method specifying how to format your JSON data.
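For example, passing an indentation of 2 as the third argument pretty-prints the output:

```javascript
const config = { ip: '127.0.0.1', port: 3000 };

// No formatting argument: everything on a single line
const compact = JSON.stringify(config);
console.log(compact); // {"ip":"127.0.0.1","port":3000}

// null replacer, two-space indentation: one property per line
const pretty = JSON.stringify(config, null, 2);
console.log(pretty);
```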
If the path you pass to the writeFile method is for an existing JSON file, the method will overwrite the data in the specified file. It will create a new file if the file does not exist:
const { writeFile } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

writeFile(path, JSON.stringify(config, null, 2), (error) => {
  if (error) {
    console.log('An error has occurred ', error);
    return;
  }
  console.log('Data written successfully to disk');
});

How to write to JSON files using the fs.writeFileSync method
Unlike writeFile, writeFileSync writes to a file synchronously. If you use writeFileSync, you will block the execution of the event loop and the rest of the code until the operation succeeds or an error occurs. It will create a new file if the path you pass doesn't exist, and overwrite it if it does.
In the code snippet below, we are writing to the config.json file. We are wrapping the code in a try-catch block so that we can catch any errors:
const { writeFileSync } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

try {
  writeFileSync(path, JSON.stringify(config, null, 2), 'utf8');
  console.log('Data successfully saved to disk');
} catch (error) {
  console.log('An error has occurred ', error);
}

How to append a JSON file
Node doesn't have a built-in function for appending to or updating fields of an existing JSON file out of the box. You can, however, read the JSON file using the readFile method of the fs module, update it, and overwrite the JSON file with the updated JSON.
Below is a code snippet illustrating how to go about it:
const { writeFile, readFile } = require('fs');

const path = './config.json';

readFile(path, (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  const parsedData = JSON.parse(data);
  parsedData.createdAt = new Date().toISOString();
  writeFile(path, JSON.stringify(parsedData, null, 2), (err) => {
    if (err) {
      console.log('Failed to write updated data to file');
      return;
    }
    console.log('Updated file successfully');
  });
});

How to read and write to JSON files using third-party npm packages
In this section, we will look at the most popular third-party Node packages for reading and writing data in JSON format.
How to use the jsonfile npm package for reading and writing JSON files
jsonfile is a popular npm package for reading and writing JSON files in Node. You can install it using the command below:
npm i jsonfile
It is similar to the readFile and writeFile methods of the fs module, though jsonfile has some advantages over the built-in methods.
Some of the features of this package are as follows:
- It serializes and deserializes JSON out of the box
- It has a built-in utility for appending data to a JSON file
- It supports promise chaining
You can see the jsonfile package in action in the code snippet below:
const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile.readFile(path, (err, data) => {
  if (err) {
    console.log(err);
    return;
  }
  console.log(data);
});

You can also use promise chaining instead of passing a callback function like in the above example:
const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile
  .readFile(path)
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.log(err);
  });

How to use the fs-extra npm package for reading and writing JSON files
fs-extra is another popular Node package you can use to work with files. Though you can use this package for managing JSON files, it has methods whose functions extend beyond merely reading and writing JSON files.
As its name suggests, fs-extra has all the functionality provided by the fs module and more. According to the documentation, you can use the fs-extra package instead of the fs module.
You need to first install fs-extra from npm before using it:
npm install fs-extra
The code below shows how you can read JSON files using the readJson method of the fs-extra package. You can use a callback function, promise chaining, or async/await:
const fsExtra = require('fs-extra');

const path = './config.json';

// Using a callback
fsExtra.readJson(path, (error, config) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log(config);
});

// Using promise chaining
fsExtra
  .readJson(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function readJsonData() {
  try {
    const config = await fsExtra.readJson(path);
    console.log(config);
  } catch (error) {
    console.log(error);
  }
}
readJsonData();

The code below illustrates how you can write JSON data using the writeJson method:
const { writeJson } = require('fs-extra');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

// Using a callback
writeJson(path, config, (error) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log('Data written to file successfully');
});

// Using promise chaining
writeJson(path, config)
  .then(() => {
    console.log('Data written to file successfully');
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function writeJsonData() {
  try {
    await writeJson(path, config);
    console.log('Data written to file successfully');
  } catch (error) {
    console.log(error);
  }
}
writeJsonData();

Just like the fs module, fs-extra has both asynchronous and synchronous methods. You don't need to stringify your JavaScript object before writing to a JSON file.
Similarly, you don't need to parse to a JavaScript object after reading a JSON file. The module does it for you out of the box.
How to use the bfj npm package for reading and writing JSON files
bfj is another npm package you can use for handling data in JSON format. According to the documentation, it was created for managing large JSON datasets.
bfj implements asynchronous functions and uses pre-allocated fixed-length arrays to try and alleviate issues associated with parsing and stringifying large JSON or JavaScript datasets – bfj documentation
You can read JSON data using the read method. The read method is asynchronous and returns a promise.
Assuming you have a config.json file, you can use the following code to read it:
const bfj = require('bfj');

const path = './config.json';

bfj
  .read(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });

Similarly, you can use the write method to write data to a JSON file:
const bfj = require('bfj');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

bfj
  .write(path, config)
  .then(() => {
    console.log('Data has been successfully written to disk');
  })
  .catch((error) => {
    console.log(error);
  });

bfj has lots of functions that you can read about in the documentation. It was created purposely for handling large JSON data. It is also slow, so you should use it only if you are handling relatively large JSON datasets.
Conclusion
As explained in the above sections, JSON is one of the most popular formats for data exchange over the internet.
The Node runtime environment has the built-in fs module you can use to work with files in general. The fs module has methods that you can use to read and write to JSON files using the callback API, promise-based API, or synchronous API.
Because methods of the callback API are more performant than those of the promise-based API, as highlighted in the documentation, you are better off using the callback API.
In addition to the built-in fs module, several popular third-party packages such as jsonfile, fs-extra, and bfj exist. They have additional utility functions that make working with JSON files a breeze. On the flip side, you should evaluate the limitations of adding third-party packages to your application.
Source: https://blog.logrocket.com/reading-writing-json-files-nodejs-complete-tutorial/