
How to Send cURL OPTIONS Request

Maryia Stsiopkina

2025-01-22 · 3 min read

Have you ever wondered what HTTP methods a web server supports before sending your requests? That's where the OPTIONS method comes in. While GET requests are the workhorses of web communication, OPTIONS requests act like scouts, gathering intelligence about a server's capabilities and restrictions. But sending these requests effectively requires more than just the OPTIONS method – it needs the right combination of cURL command-line options (or flags) to handle everything from authentication to proxy configuration. For web scraping projects, this dual approach of using OPTIONS requests with carefully configured cURL parameters is invaluable – it helps you understand exactly how to communicate with your target server effectively while maintaining precise control over how your requests are sent.

What is cURL?

Let's explore what cURL is and how you can use it effectively. cURL (Client URL) is a command-line tool and library for transferring data using various protocols. The cURL command supports numerous protocols, including HTTP, HTTPS, and FTP (File Transfer Protocol), making it an essential command-line utility for developers working with web APIs and web scraping tasks.

cURL options reference

When web scraping, you'll primarily need a specific set of cURL options to handle authentication, headers, proxies, and error handling. You can also learn more about using cURL with a proxy before diving into this tutorial. Here are the most relevant options organized by category:

Authentication options

--anyauth: Automatically selects the most appropriate authentication method

--basic: Forces basic HTTP authentication

--digest: Uses digest authentication

--user, -u <user:password>: Specifies username and password for authentication

Header and response options

--header, -H <header>: Adds custom header (crucial for setting User-Agent and other scraping-related headers)

--dump-header, -D <file>: Writes response headers to file (useful for debugging and analyzing server responses)

--ignore-content-length: Ignores Content-Length header

--compressed: Requests compressed response (helps handle gzip/deflate encoding)

Proxy and network options

--proxy, -x <[protocol://][user:password@]proxyhost[:port]>: Specifies proxy settings

--proxy-user, -U <user:password>: Sets proxy username and password

--connect-timeout <seconds>: Sets maximum connection time

--max-time, -m <seconds>: Sets maximum time allowed for transfer

--limit-rate <speed>: Limits transfer speed (useful for respecting rate limits)

Error handling options

--fail, -f: Fails silently on HTTP errors (returns exit code 22 instead of outputting the server's error page)

--show-error, -S: Shows errors even in silent mode (essential for debugging)

--retry <num>: Retries failed requests

--retry-delay <seconds>: Sets delay between retries

We invite you to revisit the topics of sending cURL POST requests and cURL GET requests to refresh your knowledge before proceeding.

Tutorial on how to send cURL OPTIONS request

The simplest form of an OPTIONS request can be made using the following command:

curl -X OPTIONS https://oxylabs.io/ -i

The -i flag includes the HTTP response headers in the output, which is crucial since OPTIONS requests are primarily about examining server headers. Here's an example response:

HTTP/2 405
date: Mon, 20 Jan 2025 09:57:51 GMT
content-type: text/html; charset=utf-8
x-frame-options: SAMEORIGIN
allow: GET
allow: HEAD
cache-control: private, no-cache, no-store, max-age=0, must-revalidate
x-powered-by: Next.js
vary: Accept-Encoding
via: 1.1 google
strict-transport-security: max-age=63072000; includeSubDomains; preload
alt-svc: h3=":443"; ma=86400
cf-cache-status: DYNAMIC
server: cloudflare
cf-ray: 904e33e4dde4dd3b-VNO

<!DOCTYPE html>Page content here...</html>

Understanding the response

When you send an OPTIONS request, the server typically responds with several important headers. Note that in the example above, the server returned 405 Method Not Allowed yet still exposed an Allow header listing GET and HEAD; even a rejection can reveal which methods the server supports. Let's break down what these headers mean for your web scraping project:

  • Allow: Lists the HTTP methods permitted for the requested resource

  • Access-Control-Allow-Methods: Specifies which HTTP methods can be used in cross-origin requests

  • Access-Control-Allow-Headers: Indicates which headers can be used in the actual request

Advanced OPTIONS request

To make your scraper appear more like a genuine browser request, you'll want to include these additional headers:

curl -X OPTIONS https://oxylabs.io/ -i \
-H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Safari/537.36" \
-H "Accept: */*" \
-H "Access-Control-Request-Method: GET" \
-H "Access-Control-Request-Headers: content-type" \
-H "Accept-Language: en-US,en;q=0.9" \
-H "Accept-Encoding: gzip, deflate, br" \
-H "Connection: keep-alive" \
-H "Referer: https://oxylabs.io/"

Error handling

While OPTIONS requests are straightforward, things can go wrong. Let's look at how to handle various scenarios effectively. Here are the key aspects to consider:

Status codes

A successful OPTIONS request typically returns:

- 200 OK: The server supports OPTIONS and is responding normally

- 204 No Content: The server accepts the request but has no content to return

Common error codes include:

- 403 Forbidden: Request blocked by security measures

- 405 Method Not Allowed: Server doesn't support OPTIONS requests

- 429 Too Many Requests: Rate limit exceeded

Saving response headers

For automation purposes, you might want to save the response headers:

curl -X OPTIONS https://oxylabs.io/ -D headers.txt

This command saves the response headers to a file named 'headers.txt', which you can later parse in your scraping script.

Bottom line

Now that you understand how to leverage OPTIONS requests, you're better equipped to build resilient web scraping solutions. Whether you're working with complex APIs or heavily secured websites, this knowledge will help you make more informed decisions about how to approach your scraping tasks.

For more insights, read our article about using cURL with REST APIs. Additionally, you can explore how to download files with cURL or use cURL with Python.

About the author

Maryia Stsiopkina

Senior Content Manager

Maryia Stsiopkina is a Senior Content Manager at Oxylabs. As her passion for writing was developing, she was writing either creepy detective stories or fairy tales at different points in time. Eventually, she found herself in the tech wonderland with numerous hidden corners to explore. At leisure, she does birdwatching with binoculars (some people mistake it for stalking), makes flower jewelry, and eats pickles.

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.

Frequently Asked Questions

How do you send cURL commands?

You can send cURL commands through the terminal/command line using the curl command followed by the URL. The basic syntax is curl example.com. You can add various options like -X to specify the HTTP method (GET, POST, PUT, etc.) and -H to add headers. Example: curl -X GET -H "Content-Type: application/json" example.com

How do I send cURL data?

To send cURL data, use the -d or --data option followed by your data in quotes. For example: curl -d "name=john&age=25" https://httpbin.org/post. For JSON data, include the appropriate content-type header: curl -d '{"name":"john","age":25}' -H "Content-Type: application/json" https://httpbin.org/post. The -d option automatically sets the request method to POST.

How do I send a form with cURL?

In order to send a form with cURL, use -F or --form option for multipart/form-data requests, which is ideal for file uploads and complex form submissions. Example: curl -F "file=@./my_photo.png" -F "description=My photo" https://httpbin.org/anything. This automatically sets the correct content-type header and handles file encoding. You can also send multiple files and form fields in a single request.

How do I share a cURL request?

To do that, copy the entire cURL command, including all options and parameters. Use curl -v (verbose mode) to generate a detailed output that includes headers and request details for sharing. You can also save the entire connection data into a trace file, for example curl https://httpbin.org/status/403 --trace-ascii curl_trace.txt. For cleaner sharing, you can use curl -sS to show errors but suppress the progress meter. Tools like Postman or cURL converter can also convert cURL commands into different formats for easier sharing with team members.
