
How to Use cURL in JavaScript? Use Cases & Tutorial


Yelyzaveta Nechytailo

2025-03-13 · 4 min read

cURL is a powerful command-line tool for transferring data with URLs, supporting various protocols including HTTP, HTTPS, FTP, and more. Used from JavaScript, cURL lets you make robust network requests, handle complex API interactions, and transfer data efficiently. It's particularly valuable for web scraping, API integration, file downloads, and automated testing.

In this tutorial, we'll cover multiple methods of implementing cURL functionality in JavaScript applications, as well as explore error handling, best practices, and using proxies with cURL in JavaScript.

How to use cURL in JavaScript: Tutorial

To use cURL in JavaScript, you can execute a shell command, use the node-libcurl package, use the request-promise package, or turn to an alternative such as Axios. Let's walk through each of these methods below.

Setting up the environment

Before implementing cURL in JavaScript, ensure you have:

  • Node.js installed (version 12.x or later recommended);

  • npm (Node Package Manager) for installing dependencies;

  • Basic understanding of asynchronous JavaScript;

  • cURL installed on your system for shell command execution approaches.

Let's start with a basic project setup:

mkdir curl-js-project
cd curl-js-project
npm init -y

Option 1: Using the child_process module to execute a shell command

One straightforward approach to using cURL in JavaScript is to use Node.js's built-in child_process module to execute cURL commands directly:

const { exec } = require('child_process');

function curlRequest(url, options = {}) {
  return new Promise((resolve, reject) => {
    // Build the cURL command
    let curlCommand = `curl -s "${url}"`;
    
    // Add headers if provided
    if (options.headers) {
      Object.entries(options.headers).forEach(([key, value]) => {
        curlCommand += ` -H "${key}: ${value}"`;
      });
    }
    
    // Add request method if provided
    if (options.method) {
      curlCommand += ` -X ${options.method}`;
    }
    
    // Add data if provided
    if (options.data) {
      curlCommand += ` -d '${JSON.stringify(options.data)}'`;
    }
    
    // Execute the command
    exec(curlCommand, (error, stdout, stderr) => {
      if (error) {
        reject(`Error: ${error.message}`);
        return;
      }
      if (stderr) {
        reject(`Error: ${stderr}`);
        return;
      }
      
      try {
        // Try to parse JSON response
        const response = JSON.parse(stdout);
        resolve(response);
      } catch (e) {
        // Return raw response if not JSON
        resolve(stdout);
      }
    });
  });
}
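Here's a quick usage sketch for the wrapper above. The endpoint is the IP-location service used in the proxy examples later in this post; any JSON API will do, and cURL must be available on your PATH:

// Example usage of curlRequest()
curlRequest('https://ip.oxylabs.io/location', {
  headers: { 'Accept': 'application/json' }
})
  .then(response => console.log('Response:', response))
  .catch(error => console.error('Request failed:', error));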

This approach is simple but has a few limitations, such as:

  • Requires cURL to be installed on the system;

  • String concatenation for command building can be error-prone;

  • Limited error handling capabilities.

Option 2: Using node-libcurl package

For a more robust cURL integration, you can try node-libcurl, which provides direct bindings to the libcurl library:

npm install node-libcurl

Here’s an example implementation of this package:

const { Curl } = require('node-libcurl');

function performRequest(url, options = {}) {
  return new Promise((resolve, reject) => {
    const curl = new Curl();
    
    // Set URL
    curl.setOpt(Curl.option.URL, url);
    
    // Set headers
    if (options.headers) {
      const headerList = Object.entries(options.headers).map(
        ([key, value]) => `${key}: ${value}`
      );
      curl.setOpt(Curl.option.HTTPHEADER, headerList);
    }
    
    // Set method and data
    if (options.method === 'POST' && options.data) {
      curl.setOpt(Curl.option.POST, true);
      curl.setOpt(Curl.option.POSTFIELDS, JSON.stringify(options.data));
    }
    
    // Set timeout (libcurl's TIMEOUT option is expressed in seconds)
    if (options.timeout) {
      curl.setOpt(Curl.option.TIMEOUT, options.timeout);
    }
    
    // Enable following redirects
    curl.setOpt(Curl.option.FOLLOWLOCATION, true);
    
    // Get response data
    curl.on('end', function(statusCode, data, headers) {
      try {
        // Try to parse JSON response
        const response = JSON.parse(data);
        resolve({
          statusCode,
          data: response,
          headers
        });
      } catch (e) {
        // Return raw response if not JSON
        resolve({
          statusCode,
          data,
          headers
        });
      }
      this.close();
    });
    
    // Handle errors
    curl.on('error', function(error) {
      reject(error);
      this.close();
    });
    
    // Perform the request
    curl.perform();
  });
}
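A minimal usage sketch for performRequest(), reusing the same IP-location endpoint:

// Example usage of performRequest()
performRequest('https://ip.oxylabs.io/location', {
  headers: { 'Accept': 'application/json' },
  timeout: 15 // seconds, passed straight to libcurl's TIMEOUT option
})
  .then(({ statusCode, data }) => console.log(statusCode, data))
  .catch(error => console.error('Request failed:', error));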

node-libcurl offers several benefits:

  • Better performance than shell execution;

  • More granular control over request options;

  • Proper error handling.

Option 3: Using the request-promise package

Another method for using cURL-like functionality in JavaScript is the request-promise package, which provides a simpler interface while retaining many cURL-like capabilities. Keep in mind that the underlying request library is deprecated and no longer actively maintained, so this option is best suited to existing codebases.

npm install request request-promise

Example implementation:

const rp = require('request-promise');

async function curlLikeRequest(url, options = {}) {
  const requestOptions = {
    uri: url,
    method: options.method || 'GET',
    headers: options.headers || {},
    json: true, // Automatically parse JSON responses
    resolveWithFullResponse: true // Get the full response object
  };
  
  // Add body for POST/PUT requests
  if (options.data && (options.method === 'POST' || options.method === 'PUT')) {
    requestOptions.body = options.data;
  }
  
  // Add query parameters
  if (options.params) {
    requestOptions.qs = options.params;
  }
  
  try {
    const response = await rp(requestOptions);
    return {
      statusCode: response.statusCode,
      data: response.body,
      headers: response.headers
    };
  } catch (error) {
    if (error.response) {
      // The request was made but the server responded with an error
      throw {
        statusCode: error.statusCode,
        data: error.response.body,
        headers: error.response.headers
      };
    } else {
      // Something else went wrong
      throw error;
    }
  }
}
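And a short usage sketch for curlLikeRequest(), again with a placeholder endpoint:

// Example usage of curlLikeRequest()
(async () => {
  try {
    const response = await curlLikeRequest('https://ip.oxylabs.io/location');
    console.log(response.statusCode, response.data);
  } catch (error) {
    console.error('Request failed:', error);
  }
})();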

Using Axios as an alternative

There are several alternatives for making HTTP requests on the server side. One of them is Axios, which provides a modern, promise-based HTTP client that works in both browser and Node.js environments:

npm install axios

And here’s how you can implement it:

const axios = require('axios');

async function curlWithAxios(url, options = {}) {
  const config = {
    url,
    method: options.method || 'GET',
    headers: options.headers || {},
    // Set timeout in milliseconds
    timeout: options.timeout || 10000
  };
  
  // Add request body for POST/PUT/PATCH
  if (options.data) {
    config.data = options.data;
  }
  
  // Add URL parameters
  if (options.params) {
    config.params = options.params;
  }
  
  try {
    const response = await axios(config);
    return {
      statusCode: response.status,
      data: response.data,
      headers: response.headers
    };
  } catch (error) {
    if (error.response) {
      // The server responded with an error status
      throw {
        statusCode: error.response.status,
        data: error.response.data,
        headers: error.response.headers
      };
    } else if (error.request) {
      // The request was made but no response received
      throw {
        message: 'No response received',
        request: error.request
      };
    } else {
      // Something else went wrong
      throw error;
    }
  }
}

// Example usage
async function fetchUserData() {
  try {
    const response = await curlWithAxios('https://api.example.com/users/1', {
      headers: {
        'Accept': 'application/json'
      }
    });
    console.log('User data:', response.data);
  } catch (error) {
    console.error('Failed to fetch user data:', error);
  }
}

fetchUserData();

Using Axios gives you the following advantages:

  • Works in both browser and Node.js environments;

  • Automatic JSON transformation;

  • Client-side XSRF protection;

  • Request cancellation support;

  • Interceptors for request/response.

Advanced tips

Now that we’ve covered several methods of using cURL in JavaScript, let’s look at some advanced tips, such as handling errors and responses, most common mistakes, and best practices.

Handling responses and errors

When working with network requests in JavaScript applications, it's crucial to implement robust error handling so that your systems don't break down. Network operations are prone to failures caused by connectivity issues, server problems, or invalid responses, and an effective error handling strategy ensures your application can handle unexpected situations without crashing or degrading the user experience.

async function robustRequest(url) {
  try {
    // Make the request
    const response = await curlWithAxios(url);
    return response.data;
  } catch (error) {
    // Handle different types of errors
    if (error.statusCode === 404) {
      console.error('Resource not found');
      return null;
    } else if (error.statusCode === 401 || error.statusCode === 403) {
      console.error('Authentication or authorization error');
      // Possibly refresh token or redirect to login
    } else if (error.statusCode >= 500) {
      console.error('Server error, retrying...');
      // Implement retry logic
      return await retryRequest(url, 3); // Retry 3 times
    } else {
      console.error('Unknown error:', error);
      throw error; // Re-throw for upstream handling
    }
  }
}

// Retry logic with exponential backoff
async function retryRequest(url, maxRetries, currentRetry = 0) {
  try {
    return await curlWithAxios(url);
  } catch (error) {
    if (currentRetry < maxRetries) {
      // Exponential backoff: wait longer between each retry
      const delay = Math.pow(2, currentRetry) * 1000;
      console.log(`Retrying in ${delay}ms...`);
      await new Promise(resolve => setTimeout(resolve, delay));
      return retryRequest(url, maxRetries, currentRetry + 1);
    } else {
      throw error; // Max retries reached
    }
  }
}

Common mistakes

  • Not handling network failures: Make sure to always be ready for network issues or service unavailability. Network connections can fail due to various reasons including server downtime, connectivity issues, or infrastructure problems. So, implementing proper error catching and fallback mechanisms is essential for robust applications.

  • Ignoring rate limiting: Many APIs implement rate limiting to ensure fair usage across all clients. Monitor response headers and back off when limits are reached to avoid service disruptions and potential IP bans (see the sketch after this list).

  • Poor timeout management: Different endpoints may require different timeout values based on expected response times, so customizing timeouts per request type can significantly improve application responsiveness and resource utilization.

  • Inadequate error logging: Log detailed error information including request parameters, response status, and error messages for debugging purposes. Comprehensive logging will help you identify patterns in failures and provide crucial context for troubleshooting issues in production environments.
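To illustrate the rate-limiting point above, here's a minimal sketch that wraps curlWithAxios() from earlier and inspects two commonly used headers, Retry-After and X-RateLimit-Remaining. Real APIs vary, so treat the header names and the one-second fallback as assumptions:

// Minimal rate-limit-aware wrapper around curlWithAxios() from earlier.
// The header names below (Retry-After, X-RateLimit-Remaining) are common
// conventions, not guarantees; check your API's documentation.
async function rateLimitedRequest(url, options = {}) {
  const response = await curlWithAxios(url, options);

  const remaining = Number(response.headers['x-ratelimit-remaining']);
  const retryAfter = Number(response.headers['retry-after']);

  if (!Number.isNaN(remaining) && remaining === 0) {
    // Quota exhausted: wait for the advertised window, or 1 second as a fallback
    const waitMs = (!Number.isNaN(retryAfter) ? retryAfter : 1) * 1000;
    console.log(`Rate limit reached, backing off for ${waitMs}ms...`);
    await new Promise(resolve => setTimeout(resolve, waitMs));
  }

  return response;
}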

Best practices

  • Reuse connections: For multiple requests to the same host, reuse connections when possible to reduce latency and resource consumption (see the keep-alive sketch after this list).

  • Implement graceful degradation: Ensure your application can function when requests fail or return unexpected results. Design your error handling to maintain core functionality even when dependent services are unavailable.

  • Monitor response times: Establishing performance baselines and tracking metrics over time allows you to detect subtle degradations before they become critical issues and helps prioritize optimization efforts.

  • Validate responses: Don't assume response data will always be in the expected format. Implement schema validation for API responses to catch data inconsistencies early.
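Here's a small keep-alive sketch for the connection-reuse point; the pool size and timeout values are assumptions to tune for your own workload:

const http = require('http');
const https = require('https');
const axios = require('axios');

// Keep-alive agents let Axios reuse TCP connections across requests
// to the same host instead of opening a new one each time.
const client = axios.create({
  httpAgent: new http.Agent({ keepAlive: true, maxSockets: 50 }),
  httpsAgent: new https.Agent({ keepAlive: true, maxSockets: 50 }),
  timeout: 10000
});

// Every request made through `client` now draws from the shared connection pool
client.get('https://ip.oxylabs.io/location')
  .then(response => console.log(response.data))
  .catch(error => console.error(error.message));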

Using proxies with cURL in JavaScript

Proxies coming from reputable proxy providers are essential for web scraping, accessing geo-restricted content, load balancing, and anonymizing requests. The examples below show how to implement Oxylabs’ Residential Proxies with different approaches.

Using proxies with child_process and cURL

The child_process approach offers a straightforward way to leverage cURL's built-in proxy capabilities through command-line arguments.

const { exec } = require('child_process');

function curlThroughProxy(url, proxyConfig) {
  return new Promise((resolve, reject) => {
    const { host, port, username, password } = proxyConfig;
    
    // Construct proxy authentication string if credentials provided
    const proxyAuth = username && password ? 
      `${username}:${password}` : '';
    
    // Build the cURL command with proxy
    let curlCommand = `curl -s -x ${host}:${port}`;
    
    // Add proxy auth if needed
    if (proxyAuth) {
      curlCommand += ` -U "${proxyAuth}"`;
    }
    
    // Add the URL
    curlCommand += ` "${url}"`;
    
    exec(curlCommand, (error, stdout, stderr) => {
      if (error) {
        reject(`Error: ${error.message}`);
        return;
      }
      if (stderr) {
        reject(`Error: ${stderr}`);
        return;
      }
      
      try {
        // Try to parse JSON response
        const response = JSON.parse(stdout);
        resolve(response);
      } catch (e) {
        // Return raw response if not JSON
        resolve(stdout);
      }
    });
  });
}

// Example usage with Oxylabs' Residential Proxies
async function checkIPLocation() {
  try {
    const proxyConfig = {
      host: 'pr.oxylabs.io',
      port: 7777,
      username: 'customer-USERNAME',
      password: 'PASSWORD'
    };
    
    const data = await curlThroughProxy('https://ip.oxylabs.io/location', proxyConfig);
    console.log('IP Location:', data);
  } catch (error) {
    console.error('Request failed:', error);
  }
}

checkIPLocation();

Using proxies with node-libcurl

The node-libcurl library provides granular control over proxy settings with direct access to libcurl's extensive proxy configuration options.

const { Curl, CurlProxy } = require('node-libcurl');

function requestViaProxy(url, proxyConfig) {
  return new Promise((resolve, reject) => {
    const curl = new Curl();
    
    // Set URL
    curl.setOpt(Curl.option.URL, url);
    
    // Configure proxy
    curl.setOpt(Curl.option.PROXY, `${proxyConfig.host}:${proxyConfig.port}`);
    
    // Set proxy type (HTTP/HTTPS)
    curl.setOpt(Curl.option.PROXYTYPE, CurlProxy.Http);
    
    // Set proxy authentication if needed
    if (proxyConfig.username && proxyConfig.password) {
      curl.setOpt(
        Curl.option.PROXYUSERPWD, 
        `${proxyConfig.username}:${proxyConfig.password}`
      );
    }
    
    // Get response
    curl.on('end', function(statusCode, data, headers) {
      try {
        const response = JSON.parse(data);
        resolve({
          statusCode,
          data: response,
          headers
        });
      } catch (e) {
        resolve({
          statusCode,
          data,
          headers
        });
      }
      this.close();
    });
    
    curl.on('error', function(error) {
      reject(error);
      this.close();
    });
    
    curl.perform();
  });
}

// Example with Oxylabs Residential Proxies
async function testProxyConnection() {
  const proxyConfig = {
    host: 'pr.oxylabs.io',
    port: 7777,
    username: 'customer-USERNAME',
    password: 'PASSWORD'
  };
  
  try {
    const response = await requestViaProxy('https://ip.oxylabs.io/location', proxyConfig);
    console.log('Response:', response.data);
  } catch (error) {
    console.error('Proxy request failed:', error);
  }
}

testProxyConnection();

Using proxies with Axios

And finally, Axios provides a modern, promise-based approach to proxy integration. Note that the https-proxy-agent package used below is Node.js-specific, so this setup applies to server-side code.

const axios = require('axios');
// For https-proxy-agent v7+, use: const { HttpsProxyAgent } = require('https-proxy-agent');
const HttpsProxyAgent = require('https-proxy-agent');

async function axiosWithProxy(url, proxyConfig) {
  // Create proxy URL
  const proxyUrl = proxyConfig.username && proxyConfig.password
    ? `http://${proxyConfig.username}:${proxyConfig.password}@${proxyConfig.host}:${proxyConfig.port}`
    : `http://${proxyConfig.host}:${proxyConfig.port}`;
  
  // Create proxy agent
  const httpsAgent = new HttpsProxyAgent(proxyUrl);
  
  try {
    const response = await axios({
      url,
      method: 'GET',
      httpsAgent,
      // Disable Axios' own proxy handling so all traffic goes through the agent above
      proxy: false,
      headers: {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
        // Some proxies might require additional headers
      }
    });
    
    return {
      statusCode: response.status,
      data: response.data,
      headers: response.headers
    };
  } catch (error) {
    if (error.response) {
      throw {
        statusCode: error.response.status,
        data: error.response.data,
        headers: error.response.headers
      };
    } else {
      throw error;
    }
  }
}

// Example usage
async function fetchWithOxylabsProxy() {
  // First install the required package
  // npm install https-proxy-agent
  
  const proxyConfig = {
    host: 'pr.oxylabs.io',
    port: 7777,
    username: 'customer-USERNAME',
    password: 'PASSWORD'
  };
  
  try {
    const response = await axiosWithProxy('https://ip.oxylabs.io/location', proxyConfig);
    console.log('Your IP location:', response.data);
  } catch (error) {
    console.error('Failed to fetch through proxy:', error);
  }
}

fetchWithOxylabsProxy();


Wrapping up

Implementing cURL functionality in JavaScript provides powerful options for making HTTP requests, whether you're building web scrapers, integrating with APIs, or transferring files. Each approach offers different advantages in terms of flexibility, performance, and ease of use.

Regardless of your chosen method, implementing proper error handling, following best practices, and using proxy servers when necessary will ensure robust and reliable network operations in your JavaScript applications.

Would you like to explore cURL even further? Discover a series of our blog posts on this topic: How to Send HTTP Header With cURL, How to Use cURL With REST API, How to Use cURL With Proxy, and How to Use cURL With Python.

Frequently asked questions

What is cURL in JavaScript?

cURL in JavaScript refers to implementing the functionality of the cURL command-line tool within JavaScript applications. This can be achieved either by executing shell commands through Node.js, using dedicated libraries like node-libcurl that provide bindings to the libcurl C library, or using HTTP client libraries like Axios that offer similar capabilities with a more JavaScript-friendly API.

What does cURL actually do?

cURL is a tool for transferring data with URLs using various protocols; a short command-to-code example follows the list below. It allows you to:

  • Make HTTP/HTTPS requests (GET, POST, PUT, DELETE, etc.)

  • Upload and download files

  • Use different authentication methods

  • Set headers and cookies

  • Use proxies 

  • Handle redirects

  • Support various protocols beyond HTTP, including FTP, SMTP, and LDAP.
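To make the mapping concrete, here's a typical cURL command and a roughly equivalent Axios call; the endpoint and payload are placeholders:

// Shell equivalent:
//   curl -X POST -H "Content-Type: application/json" \
//     -d '{"name":"test"}' https://api.example.com/items
const axios = require('axios');

axios.post('https://api.example.com/items', { name: 'test' }, {
  headers: { 'Content-Type': 'application/json' }
})
  .then(response => console.log(response.status, response.data))
  .catch(error => console.error(error.message));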

How to run a cURL script?

There are several ways to run cURL functionality from JavaScript. For instance, you can use the child_process module to execute a shell command, use the node-libcurl package, use the request-promise package, or use Axios as an alternative.

What is the difference between cURL and HTTP?

cURL and HTTP serve different purposes:

  • HTTP is a protocol—a set of rules defining how messages are formatted and transmitted over the web. It's the foundation of data communication on the World Wide Web.

  • cURL is a tool that implements various protocols, including HTTP. cURL is a client-side software that allows you to make requests using these protocols.

About the author


Yelyzaveta Nechytailo

Senior Content Manager

Yelyzaveta Nechytailo is a Senior Content Manager at Oxylabs. After working as a writer in fashion, e-commerce, and media, she decided to switch her career path and immerse in the fascinating world of tech. And believe it or not, she absolutely loves it! On weekends, you’ll probably find Yelyzaveta enjoying a cup of matcha at a cozy coffee shop, scrolling through social media, or binge-watching investigative TV series.

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.
