Web Unblocker Quick Start Guide

Vytenis Kaubre

5 min read

Web Unblocker is an AI-powered proxy solution that manages the unblocking process to extract quality public data even from the most difficult sites. It comes with a dynamic scraping strategy discovery system that consists of ML-driven proxy management, dynamic browser fingerprinting, ML-powered response recognition, and other advanced technologies which ensure block-free scraping.

Interested in learning more about Web Unblocker? You’re right where you should be. This quick start guide covers everything you need to get rolling. Let’s get started!

What you’ll get with Oxylabs Web Unblocker

  • Block-free scraping – easily bypass advanced anti-bot systems by utilizing a set of AI-powered features.

  • Human-like browsing – with proxies, dynamic fingerprinting, and custom cookies in place, your scraping requests will look like they’re coming from a real organic user.

  • Worldwide coverage – choose from 195 countries and perform country, city, or coordinate-level targeting.

  • Simple integration – connect through a single entry node that’s compatible with your existing code and start scraping within minutes.

  • 24/7 live support – get professional assistance from our Customer Success Team at any time.

Technical features

  • ML-driven proxy management – selects the best possible proxy pool for your target website. 

  • Dynamic fingerprinting – picks the right selection of headers, cookies, browser attributes, and proxies to bypass blocks by masking your bot’s identity.

  • ML-powered response recognition – creates an effective feedback loop between the scraping results and the experimentation engine to determine the outcome quality.

  • Auto-retry functionality – if a scraping task fails, our system chooses a new set of parameters and resends the request.

  • JavaScript rendering – we render JavaScript web pages on our end, so all you have to do is provide the necessary header in your request along with the preferred output format.

  • Session control – use different IPs with each request, or use the same IP address with all the following requests for up to 10 minutes.

Purchase and subscription information

To purchase the Web Unblocker, please contact our sales team. We provide five different plans based on data usage:

  1. 1-week Free trial (1GB)

  2. Starter (25GB)

  3. Business (60GB)

  4. Corporate (100GB)

  5. Enterprise (1TB+)

Our pricing is based on how much traffic you require monthly. Note that the free trial doesn’t include access to the dashboard. With the Business plan and up, you’ll also get a Dedicated Account Manager for support. More information about each plan is available here.

Using Web Unblocker

If you’ve previously used proxies for web scraping, you’ll find the Web Unblocker integration process familiar. The only distinction is that you must ignore SSL certificates by using cURL’s -k or --insecure option (or the equivalent in your preferred language), since Web Unblocker re-encrypts your traffic.

To use Web Unblocker for scraping requests, you need to route them through the proxy endpoint. Here’s an example of a simple request using cURL (the unblock.oxylabs.io:60000 entry point is the one published in our documentation, and user:pass1 stands in for your sub-user credentials):

curl -k -v -x unblock.oxylabs.io:60000 \
-U user:pass1 "https://example.com"

Note that this code example doesn’t include any additional parameters, such as the proxy location or session time settings. Therefore, our system will add all standard headers, select the fastest proxy, and deliver the response to you.

You can find more code samples for Python, PHP, C#, Go, Java, and Node.js, along with the complete list of code examples, in our documentation.
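If Python is your language of choice, the cURL call above translates into a short requests snippet. This is a minimal sketch: the unblock.oxylabs.io:60000 entry point and the user:pass1 credentials are placeholders carried over from the example above, so substitute your own values.

```python
import requests

# Placeholder sub-user credentials -- replace with your own.
USERNAME, PASSWORD = "user", "pass1"

# Web Unblocker entry point; confirm the host and port in your dashboard.
PROXY = f"http://{USERNAME}:{PASSWORD}@unblock.oxylabs.io:60000"
PROXIES = {"http": PROXY, "https": PROXY}


def fetch(url: str) -> requests.Response:
    # verify=False mirrors cURL's -k/--insecure flag, which Web Unblocker
    # requires because it re-encrypts traffic on your behalf.
    return requests.get(url, proxies=PROXIES, verify=False)
```

Calling fetch("https://example.com") then behaves like the cURL example, with all standard headers added on our end.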

Geo-location settings

If you want to use location-specific proxies, simply add the x-oxylabs-geo-location header with the country name as its value. For instance, set it to United States to connect through a United States proxy:

curl -k -v -x unblock.oxylabs.io:60000 \
-U user:pass1 "https://example.com" \
-H "x-oxylabs-geo-location: United States"

Don’t forget to replace the username and password with your Oxylabs sub-user’s login credentials.

You can find more code examples in other languages and download the full list of supported geo-location parameter values from our documentation.

Session control

Web Unblocker allows you to use the same proxy IP for multiple requests or a different IP with each request. To keep the same IP, simply add the x-oxylabs-session-id header with any string value of your choice as the session ID. You’ll then get the same proxy for all consecutive requests for up to 10 minutes, after which we’ll assign a new proxy to that particular session ID. Here’s a code example using cURL:

curl -k -v -x unblock.oxylabs.io:60000 \
-U user:pass1 "https://example.com" \
-H "X-Oxylabs-Session-Id: 123randomString"

More code samples in other languages can be found here.


Custom headers

To help you scrape target websites effectively, Web Unblocker supports custom headers. You can use standard headers, like User-Agent or Accept-Language, as well as custom and target-specific headers:

curl -k -v -x unblock.oxylabs.io:60000 \
-U user:pass1 "https://example.com" \
-H "Your-Custom-Header: interesting header content" \
-H "User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/73.0.3683.86 Chrome/73.0.3683.86 Safari/537.36" \
-H "Accept-Language: en-US"

Follow this link to find code examples in other languages.


Custom cookies

You can also send custom cookies to the target website. The system returns all response headers and cookies with your first request, so you can customize them and send them along with the next request. In cURL, the code would look like this:

curl -k -v -x unblock.oxylabs.io:60000 \
-U user:pass1 "https://example.com" \
-H "Cookie: NID=1234567890; 1P_JAR=0987654321"

More code examples in different languages can be found here.
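In Python, the capture-and-resend flow above can be sketched like this. The proxy address and cookie values are placeholders, and cookie_header is an illustrative helper that simply joins name/value pairs into a Cookie header string:

```python
import requests

PROXY = "http://user:pass1@unblock.oxylabs.io:60000"  # placeholder credentials
PROXIES = {"http": PROXY, "https": PROXY}


def cookie_header(cookies: dict) -> str:
    # Serialize name/value pairs into a single Cookie header value.
    return "; ".join(f"{name}={value}" for name, value in cookies.items())


def fetch_with_cookies(url: str) -> requests.Response:
    # First request: let the target website set its cookies.
    first = requests.get(url, proxies=PROXIES, verify=False)
    cookies = first.cookies.get_dict()
    # Customize any cookie before resending.
    cookies["NID"] = "1234567890"
    return requests.get(
        url,
        proxies=PROXIES,
        verify=False,
        headers={"Cookie": cookie_header(cookies)},
    )
```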

POST requests

With Web Unblocker, you can send not only GET requests but also POST requests. This is a valuable feature when you need to submit data to a target website:

curl -k -v -x unblock.oxylabs.io:60000 \
-U user:pass1 "https://example.com" \
-d "Some Content"

Click here to find code samples in other languages.

JavaScript rendering

When a page relies on JavaScript to dynamically load data into the Document Object Model (DOM), you can use the x-oxylabs-render: html header to fully render the page and capture the data as an HTML file. If you want to receive the data as a PNG screenshot, use the png value instead of html. You can find code examples here.
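As a sketch, the render header can be wrapped in a small Python helper; the proxy address and credentials are placeholders as in the earlier examples:

```python
import requests

PROXY = "http://user:pass1@unblock.oxylabs.io:60000"  # placeholder credentials
PROXIES = {"http": PROXY, "https": PROXY}


def render(url: str, output: str = "html") -> bytes:
    # output must be "html" for rendered markup or "png" for a screenshot.
    if output not in ("html", "png"):
        raise ValueError("output must be 'html' or 'png'")
    response = requests.get(
        url,
        proxies=PROXIES,
        verify=False,
        headers={"x-oxylabs-render": output},
    )
    return response.content
```

render(url, "png") would return PNG bytes you can write straight to a file.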

Custom and existing status codes

While the Web Unblocker marks requests as successful after receiving a 2xx or 4xx status code, there may be times when a website returns the required data but with a non-standard HTTP status code. In that case, you can specify which status codes are acceptable by adding the x-oxylabs-status-code header with all HTTP response codes that work for you. Please keep in mind that 2xx and 4xx will still be marked as successful. For code examples, follow this link.
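A small helper can build that header from a list of codes. Note that the comma-separated value format shown here is an assumption, so check the documentation for the exact syntax:

```python
def accept_status_codes(codes: list) -> dict:
    # Build the x-oxylabs-status-code header from a list of HTTP codes.
    # Assumes a comma-separated value format.
    return {"x-oxylabs-status-code": ", ".join(str(code) for code in codes)}
```

Merging accept_status_codes([500, 503]) into your request headers would tell Web Unblocker to treat 500 and 503 responses as acceptable as well.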

Below, we’ve compiled a list of some of the most common status codes indicating errors:

  • 400 Bad Request – The proxy server returns this code if the request didn’t contain a host to connect to, or a generic error occurred while parsing the HTTP request. Make sure your request is correctly formed and includes a URL, then try again.

  • 407 Proxy Authentication Required – The request lacks proxy authentication information, or the username or password is invalid. Include the Proxy-Authorization header in your request, check that your credentials are correct, and try again.

  • 500 Internal Server Error – The proxy server has encountered an internal error. Retry the request at a later time.

  • 502 Bad Gateway – The proxy server received an invalid response from the upstream server. Retry the request.

  • 504 Gateway Timeout – The proxy server didn’t receive a response from the upstream server in time. Retry the request.

  • 525 No Exit Found – A custom HTTP status code meaning the proxy was unable to find an exit node that satisfies the request. Change the request’s filter parameters or try again at a later time.

When using session control, a 502 response signifies that the IP assigned to your session ID is no longer available. There are two ways to work around it: wait for one minute, and the system will automatically assign a new IP address to your session ID, or simply switch to a new session ID value – this way, you’ll receive a new IP address immediately.
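The retry advice above can be folded into one helper. This is a sketch under the same placeholder proxy settings: it rotates the session ID on a 502 (to get a new IP) and backs off before retrying 500, 504, and 525 responses.

```python
import time
import uuid

import requests

PROXY = "http://user:pass1@unblock.oxylabs.io:60000"  # placeholder credentials
PROXIES = {"http": PROXY, "https": PROXY}
RETRYABLE = {500, 504, 525}


def fetch_with_retry(url: str, attempts: int = 3) -> requests.Response:
    session_id = uuid.uuid4().hex
    response = None
    for attempt in range(attempts):
        response = requests.get(
            url,
            proxies=PROXIES,
            verify=False,
            headers={"X-Oxylabs-Session-Id": session_id},
        )
        if response.status_code == 502:
            # The session's IP is gone; a fresh session ID gets a new one.
            session_id = uuid.uuid4().hex
        elif response.status_code in RETRYABLE:
            time.sleep(2 ** attempt)  # back off before retrying
        else:
            return response
    return response
```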


GitHub

The Oxylabs GitHub repository is the place to go for instructions on how to scrape websites, implement our products, and integrate them using the most widely used programming languages, such as C#, Go, Java, Node.js, PHP, Python, and Ruby.

Oxylabs dashboard for Web Unblocker

The Oxylabs dashboard is your personal hub to manage Oxylabs services. There, you’ll find Web Unblocker usage statistics, and you’ll be able to change the sub-users’ passwords. Below, you’ll find more information on these features:

Usage statistics

Within the “Statistics” section, you’ll find a detailed view of how much traffic you use daily. You can set the time period of usage as preferred and filter the usage by sub-user and domain. Additionally, you’ll be able to see your requests’ success rates, average response times, and the number of results for each domain.

User management

Your sub-users can be seen in the "Users" section. You can view the usernames and, if necessary, update the passwords.


Conclusion

Web Unblocker is a powerful tool that enables your web scraper to collect public data from websites without IP blocks, CAPTCHAs, or geo-restrictions. Packed with sophisticated anti-detection features, like dynamic fingerprinting and ML-driven proxy management, as well as extras like JavaScript rendering, Web Unblocker will streamline your scraping experience.

If you have any further questions about Web Unblocker, please feel free to contact our support team at any time via live chat or email.

About the author

Vytenis Kaubre

Junior Copywriter

Vytenis Kaubre is a Junior Copywriter at Oxylabs. As his passion lay in creative writing and curiosity in anything tech kept growing, he joined the army of copywriters. After work, you might find Vytenis watching TV shows, brainstorming ideas for tabletop games, or taking Raymond Chandler’s advice, “When in doubt, have a man come through a door with a gun in his hand” too seriously (when writing short stories).

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.
