cURL is a command-line tool and a library for transferring data to and from servers over various protocols. Besides extracting public data, whether HTML, text, JSON, or images, cURL can handle sessions and cookies, connect to APIs, and be easily combined with shell commands for automation. All of these capabilities make cURL a great tool for web scraping.
Using a cURL GET request can help you extract public data, so we explain how to send GET requests in this article.
A GET request is an HTTP request used to retrieve data from a specific web page. For example, when you type a URL into your browser and press Enter, your browser sends a GET request to the server hosting that URL, asking it to return the associated HTML code.
While other HTTP request methods can also result in a web server sending you HTML, those methods are meant for different purposes. For example, HTTP POST is used to send data to a server: when you click a Login button, the form submits your username and password to the server via POST. You may still see a web page afterwards, but the POST request itself was only for sending the form data. If you want to learn more cURL functionality, check out our blog posts on sending POST requests with cURL and using HTTP headers with cURL.
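For comparison, here's a minimal sketch of such a POST request in cURL; the login endpoint and field names are hypothetical:

curl -d "username=john&password=secret" https://example.com/login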
On the other hand, a GET request doesn't send any data. It's used only for requesting a page or other resources, such as images. Here's a simple example of a cURL GET request:
curl https://oxylabs.io
The result is the HTML returned by the web server.
Now that we've covered the basics, let's delve into how to send a GET request using cURL via a terminal. In this tutorial, we'll use httpbin.org, a simple HTTP request and response service. We'll explore different aspects of cURL, including:
Simple cURL GET requests
Sending a GET request with parameters
Retrieving GET HTTP headers
Receiving responses in JSON format
Following redirects
Sending cookies with a GET request
Since GET is the default request method in cURL, you can skip the --request option and send a GET request as follows:
curl http://httpbin.org/get
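This is equivalent to specifying the method explicitly with --request (or its short form, -X):

curl --request GET http://httpbin.org/get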
A GET request with parameters allows you to send additional data to the server within the URL of your request. cURL provides two powerful options, -d and -G, to facilitate this.
Note that if you use -d without -G, the request will be sent as a POST request. Forcing the method with -X GET doesn't solve this either: the -d data stays in the request body instead of being appended to the URL, so most servers will ignore it.
Therefore, you must use -G along with -d if you want to send parameters with a GET request.
curl -G -d "param1=value1" -d "param2=value2" http://httpbin.org/get
In the above command, 'param1' and 'param2' are the keys, and 'value1' and 'value2' are their respective values. The -d option can be used multiple times to send multiple parameters.
Alternatively, the GET parameters can be included as part of the URL:
curl 'http://httpbin.org/get?param1=value1&param2=value2'
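If a parameter value contains spaces or other special characters, you can let cURL handle the URL encoding for you by using --data-urlencode together with -G; the query value here is just an illustration:

curl -G --data-urlencode "query=web scraping" http://httpbin.org/get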
HTTP headers allow for exchanging additional information between the client and server during an HTTP request. To get the HTTP headers along with the response body, use the -i or --include option in the curl GET command:
curl -i http://httpbin.org/headers
This command retrieves the HTTP response headers, such as the server, date, content type, and content length. They provide useful information about the nature and specifics of the response data.
Note that if you only want the response headers without the body, use the -I option or its long form, --head:
curl --head http://httpbin.org/headers
JSON has become a standard for data exchange in the modern web development ecosystem, so when interacting with APIs via cURL, it's often essential to request data in JSON format. You can tell the server that you prefer a JSON response by using the -H option followed by "Accept: application/json":
curl -H "Accept: application/json" http://httpbin.org/get
Note that sending "Accept: application/json" doesn't guarantee that the response will be returned in JSON format. It highly depends on whether the website supports returning a JSON response.
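httpbin.org does return JSON, so you can, for example, pipe the response into a tool such as jq (assuming it's installed) to extract a specific field:

curl -s -H "Accept: application/json" http://httpbin.org/get | jq '.headers'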
In certain scenarios, the URL you're requesting might redirect to another URL. By default, cURL doesn't follow such redirects, but you can explicitly instruct it to do so. You can achieve this by using the -L or --location option:
curl -L 'http://httpbin.org/redirect-to?url=http://httpbin.org/get'
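Without -L, you can inspect the redirect itself by including the response headers; httpbin should reply with a 302 status and a Location header pointing to the target URL:

curl -i 'http://httpbin.org/redirect-to?url=http://httpbin.org/get'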
Sometimes, you may need to send cookies with your GET request, especially when interacting with websites that require user sessions or tracking user activity. You can use the -b or --cookie option followed by the name and value of the cookie:
curl -b "username=John" http://httpbin.org/cookies
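You can also save cookies that a server sets into a file with -c (the cookie jar) and send them back later with -b; the session value below is just an example:

curl -c cookies.txt 'http://httpbin.org/cookies/set?session=abc123'
curl -b cookies.txt http://httpbin.org/cookies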
To help you send GET requests with cURL, we've prepared a table that shows some key arguments, their short and long options, and a description of each.
Note that you must use either the short or long option, not both.
Argument | Short Option | Long Option | Description |
---|---|---|---|
Headers Only | -I | --head | Retrieves HTTP headers only. |
Include Headers | -i | --include | Includes the HTTP response headers in the output. |
User Agent | -A | --user-agent | Specifies the User-Agent string to send to the server. |
Request Type | -X | --request | Specifies the request type to use (GET, POST, PUT, DELETE, etc.). |
Follow Redirects | -L | --location | Follows redirects in the server's response. |
Send Cookies | -b | --cookie | Sends cookies from a string or file. The string format should be NAME=VALUE; another=anotherval. |
Verbose Mode | -v | --verbose | Provides more information (debug info). |
Silent Mode | -s | --silent | Silent mode; doesn't output anything. |
Output to File | -o | --output | Writes the output to a file instead of the terminal. |
Pass a Custom Header | -H | --header | Passes a custom header to the server. To get JSON, use -H "Accept: application/json". |
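For instance, here's how several of these options can be combined in a single GET request; the output file name is just an example:

curl -s -L -A "Mozilla/5.0" -H "Accept: application/json" -o response.json http://httpbin.org/get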
We hope this guide helped you understand the basics of cURL GET requests. Still, as with any skill, practice is key. We recommend spending some time practicing sending requests, and you'll soon become comfortable with cURL.
If you want to learn how to use proxies, see our blog post on cURL with proxy. We also have a blog post about cURL with Python that shows how to use the cURL command with Python code.
Simply put, a cURL GET request retrieves data from a specified resource. It sends an HTTP request to a server and gets the response body, which typically contains the webpage's content or the requested data. Optionally, you can also retrieve the response headers, which provide more details about the response, like content type, content length, server details, and more.
Executing a GET request using cURL is straightforward. You just need to open your terminal or command line and type curl followed by the URL you want to request. See the following example:
curl http://httpbin.org/get
This command sends a GET request to the specified URL and prints the response body in the terminal.
First of all, open your terminal or command line and type curl. You can request the JSON format by using the -H option with "Accept: application/json", followed by the URL you want to request. See the following example:
curl -H "Accept: application/json" http://httpbin.org/get
About the author
Iveta Vistorskyte
Lead Content Manager
Iveta Vistorskyte is a Lead Content Manager at Oxylabs. Growing up as a writer and a challenge seeker, she decided to move to the tech side and instantly became interested in this field. When she is not at work, you'll probably find her just chillin' while listening to her favorite music or playing board games with friends.
All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.