Get Data From Any Website With Web Scraper API

  • Pay only for successfully delivered results

  • Get highly localized real-time data without IP blocks

  • Enhance efficiency and cut infrastructure costs

See it in action

No credit card required. The free trial lasts 1 week and includes 5K results.

Web Scraper API for various business use cases

Collect product data from eStores

  • Track product data in real-time

  • Implement dynamic pricing strategies

  • Monitor reviews to build brand reputation

  • Conduct market research

E-Commerce

Acquire search engine results

  • Get data from Google page types like search and images

  • Extract SERP elements like featured snippets and local packs

  • Keep track of your brand and competitor SERP rankings

  • Optimize SEO strategies

SERP monitoring

Scrape real estate data

  • Explore real-time property prices across various platforms

  • Compare prices for trend analysis

  • Analyze rental rates in high-demand zones

  • Make precise property value estimations

Real estate

Extract travel industry data

  • Gather real-time prices for flights and accommodations

  • Compare data across various platforms for strategy refinement

  • Track and analyze accommodation availability

  • Analyze customer reviews for insights

Travel

Collect B2B intelligence data

  • Leverage company profiles for B2B lead generation

  • Scrape essential business details and job postings

  • Identify potential partners for collaboration

  • Strengthen business development efforts

Companies

Scrape entertainment websites

  • Uncover audience preferences through content trend analysis

  • Explore user engagement across different websites

  • Monitor media closely to prevent copyright infringement

  • Preserve the integrity of intellectual property

Entertainment

Dive into code samples

Accessing data from challenging websites has never been easier. Explore the capabilities of Web Scraper API with practical code samples.

Input parameters

  • source (Scraper): Set the scraper to 'amazon_product' to get product data. Other sources: 'amazon_search', 'amazon_pricing', 'amazon_reviews', 'amazon_questions', 'amazon_bestsellers', 'amazon_sellers'.

  • query (ASIN): Enter the 10-character ASIN code of the product you want to scrape.

  • geo_location (Localization): Specify the 'Deliver to' location.

  • domain (Amazon domain): Specify the Amazon marketplace you want to scrape.

  • locale (Interface language): Set the interface language.

  • render (JavaScript rendering): Enable to load JavaScript-based content.

  • parse (Structured data): Enable to get structured product data.

Input

import requests
from pprint import pprint

# Structure payload.
payload = {
    'source': 'amazon_product',
    'query': 'B0BGYWPWNC',
    'geo_location': '90210',
    'domain': 'com',
    'parse': True
}

# Take a free trial or buy the product on our dashboard to create an API user.
# Replace 'USERNAME' and 'PASSWORD' with your API credentials to run this request.

# Get response by using real-time endpoint.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'),
    json=payload,
)

# Print prettified response to stdout.
pprint(response.json())

Output preview

...
"product_details": {
    "os": "iOS 16",
    "ram": "1024 GB",
    "asin": "B0BGYWPWNC",
    "color": "Silver",
    "batteries": "1 Lithium Ion batteries required. (included)",
    "form_factor": "Slate",
    "item_weight": "15.5 ounces",
    "manufacturer": "Apple Computer",
    "customer_reviews": "4.2 4.2 out of 5 stars 1,444 ratings 4.2 out of 5 stars",
    "whats_in_the_box": "iPhone, Charger, Mfi cable, SIM Pin ejector",
    "best_sellers_rank": "#139 in Amazon Renewed (See Top 100 in Amazon Renewed) #49 in Renewed Smartphones #1,003 in Climate Pledge Friendly: Electronics",
    "country_of_origin": "China",
    "item_model_number": "A2483",
    "product_dimensions": "0.28 x 2.8 x 5.75 inches",
    "battery_power_rating": "3095",
    "date_first_available": "September 30, 2022",
    "other_display_features": "Wireless",
    "memory_storage_capacity": "1024 GB",
    "connectivity_technologies": "Wi-Fi",
    "ram_memory_installed_size": "1 TB",
    "standing_screen_display_size": "6.1 Inches"
},
...

Try Web Scraper API out for yourself

Discover the Scraper API Playground on the Oxylabs dashboard for firsthand interaction with Web Scraper API, and explore the technical documentation for all the information you need.

Collect quality data from any URL


With Oxylabs Web Scraper API, you can bypass anti-scraping systems and extract large volumes of data from even the most complex websites. We guarantee the accuracy, completeness, and overall quality of the retrieved data.

Custom headers and cookies

Send custom headers and cookies at no extra cost for enhanced control over your scraping (see the sketch below).

Global coverage

Our premium proxy pool spans 195 countries, providing you with unrestricted access to localized data.
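A minimal sketch of the custom headers and cookies feature mentioned above, assuming they are passed through a 'context' list on the payload; the 'context' keys, header, and cookie values here are illustrative assumptions rather than a definitive schema, so check the API documentation for the exact format.

import requests
from pprint import pprint

# Hypothetical payload: the 'context' structure, header, and cookie values are
# assumptions for illustration; check the API documentation for the exact schema.
payload = {
    'source': 'universal',
    'url': 'https://sandbox.oxylabs.io/products/',
    'context': [
        {'key': 'headers', 'value': {'Accept-Language': 'en-US'}},
        {'key': 'cookies', 'value': [{'key': 'session_id', 'value': '12345'}]},
    ],
}

# Replace 'USERNAME' and 'PASSWORD' with your API credentials.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'),
    json=payload,
)

pprint(response.json())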

OxyCopilot

Automate API call code generation with OxyCopilot

OxyCopilot, an integral feature of Web Scraper API, is an AI-powered assistant designed to auto-generate code for scraping requests and parsing instructions, eliminating manual coding:

  • Use Scraper API Playground

  • Input your prompt

  • Receive ready-to-use code

Try Web Scraper API with free 5K results

Advanced features

Leverage Web Scraper API's smart features to collect data at scale.

Proxy management

ML-driven proxy selection and rotation using our premium proxy pool from 195 countries.

Custom parameters

Enhance your scraping control with custom headers and cookies at no extra cost.

AI-driven fingerprinting

Unique HTTP headers, JavaScript, and browser fingerprints ensure resilience to dynamic content.

CAPTCHA bypass

Automatic retries and CAPTCHA bypassing for uninterrupted data retrieval.

JavaScript rendering

Accurate, high-quality data extraction from dynamic and interactive websites.

Web Crawler

Comprehensive page discovery on websites, extracting only essential data.

Scheduler

Automate recurring scraping jobs at your desired frequency and deliver data to AWS S3 or GCS.

Custom Parser

Define your parsing logic using XPath or CSS selectors for structured data collection.
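To illustrate the Custom Parser feature, the sketch below assumes parsing logic is submitted as a 'parsing_instructions' object built from XPath functions; the field names and the selector are assumptions for illustration, so refer to the Custom Parser documentation for the exact instruction schema.

import requests
from pprint import pprint

# Hypothetical Custom Parser payload: 'parsing_instructions' and the '_fns'/'_fn'
# structure are assumptions based on the feature description; the XPath selector
# is purely illustrative.
payload = {
    'source': 'universal',
    'url': 'https://sandbox.oxylabs.io/products/',
    'parse': True,
    'parsing_instructions': {
        'title': {
            '_fns': [
                {'_fn': 'xpath_one', '_args': ['//h1/text()']},
            ],
        },
    },
}

# Replace 'USERNAME' and 'PASSWORD' with your API credentials.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'),
    json=payload,
)

pprint(response.json())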

Headless Browser

Render JavaScript-based pages with a single line of code, eliminating the need for complex browser development or automated third-party tools. Set up custom browser instructions and enable Headless Browser to execute mouse clicks, input text, scroll pages, wait for elements to appear, and more (see the sketch below).

  • Effortless JavaScript rendering

  • Browser instructions execution

  • Seamless data collection

Learn more
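A hedged sketch of the browser instructions described above, assuming they are sent as a 'browser_instructions' list together with JavaScript rendering; the action types, field names, and selectors are illustrative assumptions, so consult the Headless Browser documentation for the exact schema.

import requests
from pprint import pprint

# Hypothetical Headless Browser payload: 'render': 'html' is assumed to enable
# JavaScript rendering, and 'browser_instructions' is assumed to carry the click,
# input, scroll, and wait actions described above. All action names, fields, and
# selectors are illustrative only.
payload = {
    'source': 'universal',
    'url': 'https://sandbox.oxylabs.io/products/',
    'render': 'html',
    'browser_instructions': [
        {'type': 'input', 'selector': {'type': 'css', 'value': '#search'}, 'value': 'laptop'},
        {'type': 'click', 'selector': {'type': 'css', 'value': 'button[type=submit]'}},
        {'type': 'scroll', 'x': 0, 'y': 1000},
        {'type': 'wait', 'wait_time_s': 2},
    ],
}

# Replace 'USERNAME' and 'PASSWORD' with your API credentials.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'),
    json=payload,
)

pprint(response.json())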

Get a maintenance-free scraping infrastructure

Benefit from our AI-powered web data collection infrastructure that is ready to use straight away.

  • No need to develop or maintain scrapers and browsers

  • Bypass anti-scraping systems

  • Allocate your resources towards analyzing data

Simple integration

Step 1: Enter your endpoint URL, API user credentials, and data payload into a single request.

Step 2: Send this request to our API. We’ll take it from there – you don’t need to take any other actions. 

Step 3: Retrieve the result directly from the API or store it in your chosen cloud storage bucket.

API reference


import requests
from pprint import pprint

username = "USERNAME"
password = "PASSWORD"

payload = {
    "source": "universal",
    "url": "https://sandbox.oxylabs.io/products/",
    "geo_location": "United States",
}

response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=(username, password),
    json=payload,
)

pprint(response.json())

Web Scraper API pricing

Regular and Enterprise plans are available.

  • Pay only for successful results

  • Avoid CAPTCHAs and IP blocks

  • Save time and development costs

  • Free trial: $0, 1-week trial, limited to 1 user, 5,000 results, 5 requests / s

  • Micro: $49 + VAT billed monthly, $2.00 / 1K results, 24,500 results, 10 requests / s

  • Starter: $99 + VAT billed monthly, $1.80 / 1K results, 55,000 results, 15 requests / s

  • Advanced: $249 + VAT billed monthly, $1.65 / 1K results, 151,000 results, 30 requests / s

Features: AI-Powered Web Scraping, Worldwide Geo-Targeting, JavaScript Rendering, Dedicated Account Manager

10% off with yearly plans

Save 10% on all our plans by paying yearly. Contact sales to learn more.


What do others say?

"Building and maintaining your own scraping and parsing solution is expensive. That’s why we turned to Oxylabs. They offered one of the best price-to-value combinations in the market and helped us save our total web scraping costs."


Wei Zheng

Chief Product Officer at Conductor

Read full story
More customer stories

Frequently asked questions

What is a web scraping API?

A web scraping API is software that retrieves data from a URL with the help of an API call. It helps establish a connection between a user and a web server to access and extract data.

What type of data can I extract with Web Scraper API?

Web Scraper API can deliver data in raw HTML or structured JSON from any web page, including e-commerce marketplaces and SERPs. Additionally, it leverages the JavaScript rendering feature to retrieve data from websites using JavaScript for dynamic content loading.
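For example, here is a minimal sketch of a request that returns raw HTML from a JavaScript-heavy page; 'render': 'html' is assumed to be the documented value for enabling JavaScript rendering, and the response layout ('results'[0]['content']) is an assumption as well. Adding 'parse': True would instead return structured JSON where a dedicated parser is available.

import requests

# Minimal sketch: fetch raw HTML with JavaScript rendering enabled.
# 'render': 'html' and the response layout are assumptions; see the documentation.
payload = {
    'source': 'universal',
    'url': 'https://sandbox.oxylabs.io/products/',
    'render': 'html',
}

# Replace 'USERNAME' and 'PASSWORD' with your API credentials.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'),
    json=payload,
)

# Print the first few hundred characters of the returned HTML.
html = response.json()['results'][0]['content']
print(html[:500])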

Can I automate recurring scraping jobs with Web Scraper API?

Yes, the free Scheduler feature lets you automate recurring scraping jobs. Simply put, you don't need to send new requests with identical parameters to receive regular updates, and there's no need to create or maintain your own scheduling scripts.

Check our documentation to learn more about the Scheduler feature.
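As a hedged illustration of what scheduling might look like, the sketch below assumes jobs are created by POSTing a cron expression and a list of payloads to a dedicated scheduler endpoint; the endpoint URL and the 'cron', 'items', and 'end_time' fields are assumptions, so follow the Scheduler documentation for the exact interface.

import requests
from pprint import pprint

# Hypothetical Scheduler request: the endpoint URL and field names are assumptions
# for illustration; the documentation defines the exact schema.
schedule = {
    'cron': '0 8 * * *',  # run every day at 08:00 UTC
    'items': [
        {
            'source': 'universal',
            'url': 'https://sandbox.oxylabs.io/products/',
            'geo_location': 'United States',
        },
    ],
    'end_time': '2030-01-01 00:00:00',
}

# Replace 'USERNAME' and 'PASSWORD' with your API credentials.
response = requests.request(
    'POST',
    'https://data.oxylabs.io/v1/schedules',  # assumed scheduler endpoint
    auth=('USERNAME', 'PASSWORD'),
    json=schedule,
)

pprint(response.json())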

How long does Web Scraper API take to give the results back?

Web Scraper API can deliver real-time results from almost any website worldwide. Delivery time largely depends on the requested target. For more details regarding specific targets, please get in touch with your Account Manager or contact our support team.

Is it legal to scrape a website?

Web scraping may be legal when it is done without breaching any laws regarding the source targets or the data itself. We have explored this subject in one of our blog posts, and we highly recommend that you read it and consult your legal advisor before starting any scraping project to avoid potential risks.

How to use a web scraping API?

Using Web Scraper API consists of three main steps:

  • First, create a request and add the necessary information, such as the endpoint URL, user credentials, and the payload.

  • Second, send the request to the API. 

  • Finally, receive the results—you can retrieve them via the API or have them delivered to the storage solution of your choice. To see how Web Scraper API looks in action, check out our video here.

What is the difference between a scraper and a parser?  

While scrapers and parsers go hand-in-hand, they have different functionalities. Simply put, scrapers retrieve the information from the web, while parsers focus on analyzing text based on predefined rules and syntax.
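For illustration, the sketch below separates the two steps: the API call does the scraping (fetching the page), while BeautifulSoup does the parsing (extracting fields with a CSS selector). The response layout ('results'[0]['content']) and the selector are assumptions for illustration.

import requests
from bs4 import BeautifulSoup

# Scraping: fetch the raw page through the API.
payload = {
    'source': 'universal',
    'url': 'https://sandbox.oxylabs.io/products/',
}

# Replace 'USERNAME' and 'PASSWORD' with your API credentials.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'),
    json=payload,
)

# The raw HTML is assumed to sit in the first result's 'content' field.
html = response.json()['results'][0]['content']

# Parsing: apply a predefined rule (a CSS selector) to the fetched HTML.
soup = BeautifulSoup(html, 'html.parser')
titles = [node.get_text(strip=True) for node in soup.select('h4')]  # illustrative selector
print(titles)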

Does Web Scraper API have rate limits? 

Yes, Web Scraper API comes with a job submission rate limit that depends on your plan size. For example, the free plan comes with 5K results and lets you submit 5 jobs per second in total, or one job per second for rendered jobs. On the other hand, Web Scraper API can bypass the rate limiting that websites implement as an anti-bot measure.

To see the specifics for each plan, please refer to our documentation.

Is Web Scraper API ISO certified?

Yes, Web Scraper API is ISO/IEC 27001:2017 certified. This certification demonstrates our commitment to maintaining a robust Information Security Management System (ISMS) that adheres to internationally recognized standards for data security. To learn more about what ISO/IEC 27001:2017 certification means for our product and users, please read here.

How to set up Postman with Web Scraper API?

Try out Web Scraper API with Postman before using it at scale. You can import our API collection to Postman and start scraping right away.

More FAQs

Get the latest news from the data gathering world

I'm interested