
How to Read JSON Files in JavaScript: Tutorial

Vytenis Kaubrė

2024-07-26 · 4 min read

When working with servers, you often need to fetch data from and save data to external sources. For this, you need a uniform data format that’s easy to access, read, and write.

JSON (JavaScript Object Notation) helps in such situations. It’s a lightweight data representation format that’s easy to read and write, and automated systems can parse and generate it just as easily. It’s mainly used in web applications for transmitting data between servers and clients.

This tutorial explores ways to load and read JSON files in JavaScript. Let’s dive in!

Data Representation in JSON Files

In JSON, data is represented as key-value pairs. This means a unique key identifies each piece of data, and its value can be of various types, including strings, numbers, arrays, objects, booleans, and null. Consider this sample JSON file:

{
  "products": [
    {
      "id": 1,
      "name": "Smartphone",
      "price": 500,
      "description": "A smartphone with wide-angle camera",
      "stock": 30
    },
    {
      "id": 2,
      "name": "Laptop",
      "price": 799,
      "description": "A laptop with excellent battery life",
      "stock": 20
    }
  ]
}
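Once parsed into a JavaScript object, each value in this structure is accessed through its key. Here’s a quick illustration using the sample above:

```javascript
// Parse the sample JSON (held here as a string) and access values by key
const json = `{
  "products": [
    { "id": 1, "name": "Smartphone", "price": 500 },
    { "id": 2, "name": "Laptop", "price": 799 }
  ]
}`;

const data = JSON.parse(json);

console.log(data.products[0].name);  // Smartphone
console.log(data.products[1].price); // 799
```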

In the next section, we’ll go through the steps to load JSON data in JavaScript. 

Read JSON Data in JavaScript

We'll cover multiple methods for loading a JSON file in JavaScript: using Require/Import modules, the Fetch API, and the FileReader API.

1. Using Require/Import Modules

One common way to read a JSON file in JavaScript is to import it directly. In Node.js, the require() function reads and parses JSON files, while in modern ES6 JavaScript, importing the JSON file directly serves the same purpose.

For example, assume you have a JSON file named products.json as shown previously. The two syntaxes to load this JSON file are as follows:

Using Node.js:

// Node.js
const data = require('./products.json');
console.log(data);

The require() function looks for products.json in the same directory as the script and parses the JSON file into a JavaScript object.

Using ES6: 

// Browser with a module bundler (e.g., Webpack) or modern Node.js
import data from './products.json' with {type: "json"};
console.log(data);

In the case of ES6, remember to update your package.json file with "type": "module". Here’s what the console should log:

JSON file displayed in the console

2. Using Fetch API to Read JSON File

The Fetch API is a web API provided by modern browsers to fetch resources over the network, including JSON files. It returns a Promise that resolves to a Response object, which you can then process as required.

Promises are objects that represent the eventual completion (or failure) of an asynchronous operation and its resulting value. When using the Fetch API, the fetch() call returns such a Promise.

Assume you need to fetch a products.json file from this demo e-commerce website using a network request. Create a new JavaScript file and use this sample code:

fetch('https://scraping-demo-api-json-server.vercel.app/products')
    .then(response => {
        if (!response.ok) {
            throw new Error('Network response was not ok ' + response.statusText);
        }
        return response.json();
    })
    .then(data => {
        displayProducts(data);
    })
    .catch(error => {
        console.error('There has been a problem with your fetch operation:', error);
    });

function displayProducts(products) {
    products.forEach(product => {
        console.log(`\x1b[32mProduct:\x1b[0m ${product.game_name}, \x1b[32mIn Stock:\x1b[0m ${product.inStock}`);
    });
}

Using the Fetch API, this code retrieves a JSON file from the given URL. If the network response is successful, it parses the response body into a JavaScript object and passes the result to the displayProducts() function, which logs the name and availability of each product to the console. If an error occurs during the fetch operation, it’s caught and logged to the console. Here’s a snippet of the displayed results:

Parsed JSON data displayed in the console

You can learn more about web requests with Fetch API in this Fetch API web scraping guide.
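The same logic can also be written with async/await syntax, which flattens the .then() chains. Here’s a minimal sketch, assuming the same demo URL as above:

```javascript
// Equivalent fetch logic using async/await instead of .then() chains
async function loadProducts(url) {
    const response = await fetch(url);
    if (!response.ok) {
        throw new Error('Network response was not ok ' + response.statusText);
    }
    return response.json(); // parses the body into a JavaScript object
}

loadProducts('https://scraping-demo-api-json-server.vercel.app/products')
    .then(products => products.forEach(p => console.log(p.game_name)))
    .catch(error => console.error('Fetch failed:', error));
```

Note that async functions always return a Promise, so the caller still handles the result with .then()/.catch() or an enclosing await.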

3. Using the FileReader API

The FileReader API reads a local JSON file stored on your PC. The web application asynchronously reads the file's contents and extracts the data from it. 

The following code snippet shows a demo of a FileReader API that reads a local file when you upload it from your computer:

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Read JSON File</title>
  </head>
  <body>
    <input type="file" id="fileInput" accept=".json" />
    <div id="productList"></div>


    <script>
      document
        .getElementById("fileInput").addEventListener("change", function (event) {
          const file = event.target.files[0];
          if (file) {
            const reader = new FileReader();
            reader.onload = function (e) {
              try {
                const data = JSON.parse(e.target.result);
                displayProducts(data.products);
              } catch (error) {
                console.error("Error parsing JSON:", error);
              }
            };
            reader.readAsText(file);
          }
        });


      function displayProducts(products) {
        const productList = document.getElementById("productList");
        productList.innerHTML = "";
        products.forEach((product) => {
          const productDiv = document.createElement("div");
          productDiv.innerHTML = `Product: ${product.name}, Price: $${product.price}`;
          productList.appendChild(productDiv);
        });
      }
    </script>
  </body>
</html>

This code sample creates an HTML file input element and adds an event listener to it. When you pick a JSON file, the FileReader API reads its contents as text. Once the file has been read, the content is parsed as JSON, and the displayProducts() function is invoked with the parsed product data. The function iterates through the products array and renders each product’s name and price on the web page.

If you’re using Visual Studio Code, run the HTML file with the Live Server extension by pressing “Go Live”. It’ll open a new browser window, allowing you to upload a JSON file. The output looks as follows:

Parsed JSON data after uploading a file via a browser

Common Errors

  • CORS errors: Cross-Origin Resource Sharing (CORS) errors can occur when retrieving JSON from a different origin. Make sure the server sends the headers necessary to permit cross-origin requests.

  • Parsing Errors: Invalid JSON format is a common cause of JSON.parse() errors. JSON data should always be validated.

  • Security: Watch out for attacks using JSON injection. Always handle data safely and sanitize inputs.
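A common defensive pattern for the parsing point above is wrapping JSON.parse() in a try/catch block, so invalid input produces a controlled fallback instead of an uncaught exception. A minimal sketch:

```javascript
// Safely parse untrusted JSON, returning a fallback value on invalid input
function safeParse(text, fallback = null) {
    try {
        return JSON.parse(text);
    } catch (error) {
        console.error('Invalid JSON:', error.message);
        return fallback;
    }
}

console.log(safeParse('{"stock": 30}')); // { stock: 30 }
console.log(safeParse('{stock: 30}'));   // null — JSON keys must be quoted
```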

Common Use Cases

  • Web scraping: JSON is commonly used when extracting structured data from web pages, a practice referred to as web scraping. Scrapers retrieve data from websites, parse it into JSON format, and use it for various tasks like monitoring, data analysis, and database creation. JSON's lightweight and organized nature makes it perfect for quickly managing scraped data. Check these JavaScript and Node.js web scraping and Python web scraping tutorials for more information.

  • Configuration Files: JSON is commonly used for configuration files in applications and services. Configuration files contain settings and parameters that influence software behavior, and JSON's simple structure and readability make it a popular format for handling these settings.

  • Data Storage and Retrieval: JSON is frequently used to store and retrieve data in file systems and databases. JSON-like formats (BSON) are used by several NoSQL databases, including MongoDB, to store documents, enabling flexible and hierarchical data representation.

  • APIs and Data Exchange: Web servers and clients can exchange data in standard JSON. JSON is frequently used by APIs to transmit and receive data, which facilitates easier web integration of many systems and services.
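For the API exchange case, an object is serialized with JSON.stringify() before sending and parsed back into an object on the receiving end. A minimal sketch of that round trip (the POST endpoint shown in the comment is hypothetical):

```javascript
// Serialize an object for an HTTP request body
const order = { productId: 2, quantity: 1 };
const body = JSON.stringify(order);
console.log(body); // {"productId":2,"quantity":1}

// On the receiving end, the string is parsed back into an object
const received = JSON.parse(body);
console.log(received.quantity); // 1

// A typical POST request would send the serialized body like this:
// fetch('https://example.com/api/orders', {
//     method: 'POST',
//     headers: { 'Content-Type': 'application/json' },
//     body
// });
```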

Advanced Tips

  • Performance optimization: Performance may be an issue when working with large JSON files. Instead of loading the entire file into memory, consider using a streaming parser such as the JSONStream library for Node.js to process data in parts. This method helps speed up processing while using less memory.

  • Comparison with XML: Because of its simpler, more compact structure, JSON is typically chosen over XML for data interchange. Key-value pairs and arrays in JSON are more concise than XML's tag-based markup, which makes JSON easier to generate and parse. JSON also tends to produce smaller payloads, which can mean lower network costs and faster data transfer. However, XML's hierarchical structure and attribute support may still be useful in applications requiring complex document schemas.
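To illustrate the streaming idea without a third-party library, here’s a sketch that assumes newline-delimited JSON (NDJSON), where each line is a standalone record. With a real large file, you’d feed this generator from a read stream instead of an in-memory string:

```javascript
// Parse newline-delimited JSON (NDJSON) one record at a time,
// so only a single record is held in memory per iteration
function* parseNDJSON(text) {
    for (const line of text.split('\n')) {
        if (line.trim()) yield JSON.parse(line);
    }
}

const ndjson = '{"id": 1, "name": "Smartphone"}\n{"id": 2, "name": "Laptop"}\n';

const names = [];
for (const record of parseNDJSON(ndjson)) {
    names.push(record.name); // process each record as it's parsed
}
console.log(names); // [ 'Smartphone', 'Laptop' ]
```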

Final words

Now that you know how to efficiently handle JSON files in JavaScript, feel free to put these methods to the test. The best way to master these techniques is by applying them in real-world scenarios. As mentioned previously, web scraping is one of the use cases where JSON is heavily used. If you’re new to web scraping, take a look at this article, which introduces the best programming languages for web scraping.

About the author

Vytenis Kaubrė

Technical Copywriter

Vytenis Kaubrė is a Technical Copywriter at Oxylabs. His love for creative writing and a growing interest in technology fuels his daily work, where he crafts technical content and web scrapers with Oxylabs’ solutions. Off duty, you might catch him working on personal projects, coding with Python, or jamming on his electric guitar.

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.
