
How to Download Files With cURL

Augustas Pelakauskas

2024-05-30 · 4 min read

File transfer is a critical skill in web development and system administration. One of the most versatile tools for this task is cURL (Client for URLs). This command-line utility is designed to transfer data using network protocols such as HTTP, HTTPS, FTP, and SFTP.

With the cURL download file features, you can automate transfers through scripts, saving time and reducing manual errors. Many developers prefer cURL for data manipulation and API interaction.

Read on to learn cURL download file commands, including using essential parameters, handling redirects, and managing the process.

Managing file names

When downloading files with cURL, there are a couple of parameters you'll frequently use:

  • -O (uppercase O) tells cURL to download and save the file with the same name as in the URL.

  • -o (lowercase o) followed by a filename allows you to specify a different name for the saved file.

Download a file and retain its original name:

curl -O https://example.com/image.jpg

Download a file and rename it locally:

curl -o newfilename.jpg https://example.com/image.jpg

-o is invaluable when scripting downloads to avoid overwriting existing files or when organizing downloads into a specific naming convention.
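One way to apply such a naming convention is to date-stamp each saved file. This is a minimal sketch: the file:// URL and paths are placeholders standing in for a real https:// download so the example runs offline.

```shell
# Simulate a remote file locally, then save it under a date-stamped
# name so repeated runs never overwrite earlier downloads.
printf 'image-bytes\n' > /tmp/photo.jpg
today=$(date +%Y-%m-%d)
curl -s -o "/tmp/photo-${today}.jpg" "file:///tmp/photo.jpg"
```

In a real script, swap the file:// URL for the resource you are fetching.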

Handling redirects

To handle HTTP(S) redirects, use the -L flag (location). It tells cURL to follow any redirect and fetch content from the final destination:

curl -L -O https://example.com/image.jpg

Without the -L flag, cURL would output the redirection response headers and not follow the new location.

Downloading multiple files

You can download multiple files in a single command. Use curly braces {} or square brackets [] for URLs that share a common pattern, or repeat the -O flag with separate URLs.

URLs with a common pattern (specific items listed one by one):

curl -O "https://example.com/images/{file1.jpg,file2.jpg,file3.jpg}"

A range of files with a common pattern:

curl -O "https://example.com/images/file[1-3].jpg"

Different URLs:

curl -O https://example.com/file1.jpg -O https://example.com/file2.jpg

Rate limiting

To avoid overwhelming the server, use the --limit-rate option followed by the maximum download speed. It allows you to control bandwidth usage for a task, whether to avoid overloading the network, to comply with bandwidth restrictions, or to simulate slower network conditions for testing purposes. The value is in bytes per second by default and accepts the following suffixes:

  • K – kilobytes

  • M – megabytes

  • G – gigabytes

curl --limit-rate 2M -O https://example.com/archive.zip

Use --max-time to set a timeout for the request and ensure your script doesn’t hang indefinitely.
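A minimal sketch of --max-time, using a local file:// URL as a stand-in for a real https:// download so it runs offline:

```shell
# Abort the transfer if it takes longer than 30 seconds overall;
# cURL exits with code 28 when the time limit is hit.
printf 'payload\n' > /tmp/mt_source.txt
curl --max-time 30 -s -o /tmp/mt_result.txt "file:///tmp/mt_source.txt"
echo "exit code: $?"
```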

Rate limiting applies to both downloads and uploads.

Silent downloads

For silent or background operations, use the -s flag to minimize cURL's output:

curl -s -O https://example.com/image.jpg

The silent mode disables the progress meter and suppresses error messages, letting you focus only on the actual data being transferred. If you want to stay quiet but still see failures, add -S (--show-error) alongside -s.


Authentication

For resources requiring HTTP or FTP authentication, use the -u flag followed by the username and password:

curl -u username:password -O ftp://ftp.example.com/file.zip

Be cautious of the security risks associated with exposing sensitive information. You may want to store your credentials in .netrc files instead of exposing them directly in a cURL command.
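A sketch of the .netrc approach (the hostname and credentials below are placeholders; restrict the file with chmod 600 ~/.netrc):

```
# ~/.netrc entry read by cURL when invoked with -n / --netrc:
machine ftp.example.com
login username
password secret
```

With that entry in place, the credentials stay out of the command line and shell history:

curl -n -O ftp://ftp.example.com/file.zip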

Download through proxy

To download via a proxy connection, use the -x flag followed by the proxy protocol, host, and port:

curl -x http://host:port -O https://example.com/image.jpg

For example, downloading a file using Oxylabs Residential Proxies:

curl -x pr.oxylabs.io:7777 -U customer-USERNAME:PASSWORD -O https://example.com/image.jpg

NOTE: Notice the addition of authentication credentials for the proxy connection.

Possible variations for Residential Proxies:

  • Protocol: HTTP, HTTPS, or SOCKS5.

  • Host and port: pr.oxylabs.io and 7777 for a random location, or location-based entry nodes for geo-targeting.

Download progress

By default, cURL displays a progress meter during downloads/uploads. For a simpler progress bar, use the -# (--progress-bar) flag. It provides a cleaner, more concise way to monitor the transfer progress without extra details.

curl -# -O https://example.com/archive.zip

An example of a simplified progress bar output:

#############################################  75.0%

Request/response information

The -v flag stands for verbose. With it, cURL provides detailed information about the request and the response, including:

  1. Request headers.

  2. Response headers.

  3. Protocol: handshakes, connection attempts, and SSL/TLS information.

  4. Request body data.

Using -v is very useful for debugging, as it lets you see exactly what is being sent to and received from the server.

curl -v https://example.com/

cURL vs. Wget

Wget is another command-line utility designed to download files from the web. Unlike cURL, Wget can recursively download files, making it a better choice for some scenarios.

  • Use cURL when you need extensive control over HTTP requests, support for multiple protocols, integration with programming languages, and interaction with web services and APIs.

  • Use Wget when you need to download files or entire websites, especially with the ability to resume interrupted downloads and mirror sites recursively.

Common mistakes and best practices

A common mistake when downloading files with cURL is not handling redirects or errors properly. This can result in incomplete data retrieval or processing errors. Always use the -L flag to follow redirects and check cURL's exit code for successful downloads. Using rate limiting and proxy options can also help avoid server issues or IP bans.
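Checking the exit code can be sketched as follows; a local file:// URL stands in for a real https:// one so the example runs offline (see the EXIT CODES section of `man curl` for what each non-zero code means):

```shell
# Download a file and branch on cURL's exit status; 0 means success.
# -f makes HTTP errors (4xx/5xx) fail with exit code 22 instead of
# silently saving the error page.
printf 'remote-data\n' > /tmp/ec_remote.txt
if curl -fsSL -o /tmp/ec_local.txt "file:///tmp/ec_remote.txt"; then
    echo "download ok"
else
    echo "curl failed with exit code $?"
fi
```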

Some of the common mistakes:

  • Ignoring HTTPS and SSL/TLS errors. Using -k or --insecure to bypass SSL certificate checks can make the connection vulnerable to man-in-the-middle attacks. Always verify SSL certificates when dealing with sensitive data.

  • Improper URL encoding can lead to malformed requests and server errors. Use percent-encoding (%XX) for special characters or --data-urlencode for form data.

  • Ignoring HTTP response codes can result in undetected errors. Check the response code using -w "%{http_code}" to handle different outcomes.

  • Improper use of HTTP methods. Ensure you are using the correct method (POST or GET) as required by the API or server.

  • Handling JSON data incorrectly. Sending JSON data without specifying the correct content type may cause the server to reject the request or misinterpret the data. Use -H "Content-Type: application/json" along with --data or --data-raw.

  • Not escaping special characters in cURL command-line input can lead to shell interpretation errors. Use single quotes (') around strings with special characters or escape them properly.

NOTE: On Windows machines, try double quotes in case of failure.

Not quoting URL parameters can lead to unexpected behavior or errors. Always ensure your URLs are properly quoted.
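The response-code check mentioned above can be sketched as follows; example.com is a placeholder URL, and offline runs print 000:

```shell
# Capture just the HTTP status code so a script can branch on it;
# -o /dev/null discards the body, -w prints the code after the transfer.
code=$(curl -s -o /dev/null -w "%{http_code}" "https://example.com/")
echo "status: $code"
```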

Some good practices:

  • Use configuration files for repeated requests. Use -K or --config to store common cURL options. This simplifies complex or repeated cURL commands and makes scripts more readable.

  • Optimize data handling. Ensure data is transmitted in the correct format and reduce errors by using --data-binary for raw data uploads, --form for multipart/form-data, and --data-urlencode for URL-encoded form data.

  • Save and reuse cookies. Use -c and -b to save and send cookies for session management to maintain session continuity and mimic browser behavior.
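A minimal sketch of the -K config-file practice; the file:// URL and paths are placeholders so it runs offline:

```shell
# Store options shared by many requests in a config file: one long
# option per line, without the leading dashes.
cat > /tmp/curl.conf <<'EOF'
silent
show-error
location
EOF
printf 'conf-demo\n' > /tmp/conf_src.txt
# Load the shared options with -K instead of repeating them inline.
curl -K /tmp/curl.conf -o /tmp/conf_dst.txt "file:///tmp/conf_src.txt"
```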

Wrap up

Practice is key to becoming proficient with cURL, so consider setting aside some time to experiment with its capabilities. With cURL, you can easily automate and handle complex downloading scenarios.

The ability to resume interrupted downloads with the -C - option is a significant benefit of using cURL download file commands, ensuring that large or unstable file transfers can be completed without restarting from the beginning.
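Resuming can be sketched as follows; a file:// URL stands in for a real https:// download so the example runs offline:

```shell
# With -C -, cURL checks the size of the existing partial output file
# and asks the server for only the remaining bytes. If no partial file
# exists, the transfer simply starts from the beginning.
printf '0123456789\n' > /tmp/full.bin
rm -f /tmp/resumed.bin
curl -s -C - -o /tmp/resumed.bin "file:///tmp/full.bin"
```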

About the author

Augustas Pelakauskas

Senior Copywriter

Augustas Pelakauskas is a Senior Copywriter at Oxylabs. Coming from an artistic background, he is deeply invested in various creative ventures - the most recent one being writing. After testing his abilities in the field of freelance journalism, he transitioned to tech content creation. When at ease, he enjoys sunny outdoors and active recreation. As it turns out, his bicycle is his fourth best friend.

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.
