Start scraping Google in seconds

Get real-time Google data without CAPTCHAs or blocks. Check out code samples to see how it works.

Input parameters

  • source (Scraper): set the scraper to 'google_search' to get web, image, or news search results. Other sources: 'google', 'google_ads', 'google_images', 'google_lens', 'google_maps', 'google_travel_hotels', 'google_suggest', 'google_trends_explore'.

  • query (Search query): the UTF-8-encoded search query.

  • geo_location (Localization): the UULE-based geographic location value (state name, country name, coordinates and radius, or criteria ID).

  • domain (Google domain): the Google domain you want to scrape, e.g. 'nl' for google.nl.

  • locale (Interface language): the interface language.

  • results_language (Results language): the results language.

  • render (JavaScript rendering): enable to load JavaScript-based content.

  • parse (Structured data): enable to get structured data.
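The parameters above slot straight into the request payload. Below is a minimal sketch of how they fit together; the 'html' value used here to enable JavaScript rendering is an assumption to verify against the documentation, and results_language is passed through the context list, as in the full example under Input.

# Minimal payload sketch for the parameters listed above.
# Note: 'render': 'html' is an assumption for enabling JavaScript rendering;
# results_language travels inside 'context', as in the full example below.
payload = {
    'source': 'google_search',       # scraper
    'query': 'adidas',               # UTF-8-encoded search query
    'geo_location': 'Netherlands',   # UULE-based location value
    'domain': 'nl',                  # Google domain (google.nl)
    'locale': 'nl-nl',               # interface language
    'render': 'html',                # assumption: load JavaScript-based content
    'parse': True,                   # return structured data
    'context': [
        {'key': 'results_language', 'value': 'nl'},   # results language
    ],
}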

Input

import requests
from pprint import pprint

# Structure payload.
payload = {
    'source': 'google_search',
    'query': 'adidas',
    'parse': True,
    'domain': 'nl',
    'geo_location': 'Netherlands',
    'locale': 'nl-nl',
    'start_page': '1',
    'pages': '2',
    'context': [
        {'key': 'filter', 'value': 1},
        {'key': 'results_language', 'value': 'nl'}
    ]
}

# Take a free trial or buy the product on our dashboard to create an API user.
# Replace 'USERNAME' and 'PASSWORD' with your API credentials to run this request.

# Get response by using real-time endpoint.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'),
    json=payload,
)

# Print prettified response to stdout.
pprint(response.json())

Output preview

...
  "content": {
    "url": "https://www.google.nl/search?q=adidas&lr=lang_nl&filter=1&safe=off&uule=w+CAIQICILbmV0aGVybGFuZHM&gl=nl&hl=nl",
    "page": 1,
    "results": {
      "pla": {
        "items": [
          {
            "pos": 1,
            "url": "https://www.footlocker.nl/nl/product/~/314310743704.html?channable=03b4996964003331343331303734333730343130305e",
            "price": "€ 119,99",
            "title": "Adidas - Campus Heren Schoenen - Grijs - Maat: 44 - Suède - Foot Locker",
            "seller": "Footlocker.nl",
            "source": "Van Shoparize",
            "url_image": "https://encrypted-tbn3.gstatic.com/shopping?q=tbn:ANd9GcR01C659O4XdaFVkjhIqJ5_MM22-7mgBox9yKQagKIZpjDJJabE5A7FV5h_7sbFraVOCFUs0qDJxMt4jkuO6RC6q_AnRiiNMhAAoH6rd-nxCCzKm17EK8JeglfyIqheCzY891TyuuN9FA&usqp=CAc",
            "image_data": "UklGRsoKAAB<...>SQAAAAAAAAA=="
          },
          { "...": "..." }
        ],
        "pos_overall": 1
      },
      "paid": [],
      "images": {
        "items": [
          {
            "alt": "Originals schoenen, kleding en accessoires | adidas NL",
            "pos": 1,
            "url": "https://www.adidas.nl/originals",
            "data": "/9j/4AAQSk<...>TKlQhB//9k=",
            "source": "https://www.adidas.nl/originals"
          },
          { "...": "..." }
        ],
        "pos_overall": 6
      },
      "organic": [
        {
          "pos": 1,
          "url": "https://www.adidas.nl/",
          "desc": "adidas is meer dan sport- en trainingskleding. We werken samen met de besten in de branche om samen te creëren. Op deze manier bieden we onze fans de ...",
          "title": "adidas Officiële Website Nederland | Sportwinkel",
          "sitelinks": {
            "expanded": [
              { "url": "https://www.adidas.nl/dames", "title": "Dames" },
              {"...": "..."}
            ]
          },
          "url_shown": "https://www.adidas.nl",
          "pos_overall": 2
        },
        { "...": "..." }
      ],
      "search_information": {
        "query": "adidas",
        "showing_results_for": "adidas",
        "total_results_count": 10200000
      }
    }
  }
...
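With parse enabled, the structured results can be consumed straight from the JSON. The sketch below continues from the request example above and assumes the standard response wrapper, where each job's payload sits under results[0]['content'] (the outer keys are elided by the '...' in the preview).

# Continues from the request example above.
# Assumption: the full response wraps the preview as {"results": [{"content": {...}}]};
# those outer keys are elided in the preview shown here.
data = response.json()
content = data['results'][0]['content']

# Walk the parsed organic results (pos, title, url as in the preview).
for item in content['results']['organic']:
    print(item['pos'], item['title'], item['url'])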

Try Google Scraper API out for yourself

Explore the Scraper APIs Playground on the Oxylabs dashboard to interact with our APIs firsthand, and check the technical documentation for all the information you need.

OxyCopilot

Automate API call code generation with OxyCopilot

OxyCopilot, an integral feature of Web Scraper API, is an AI-powered assistant designed to auto-generate code for scraping requests and parsing instructions, eliminating manual coding:

  • Use Scraper API Playground

  • Input your prompt

  • Receive ready-to-use code

Great service and support! We have been using Oxylabs services for several years, and we are very pleased with the quality of service, especially comparing to previous few similar we tried. Their crew also has been very forthcoming and helpful.

Milos

Oxylabs customer

Trusted by businesses worldwide

See why global teams rely on Oxylabs for their web data needs

Leverage advanced features

Developer-first documentation

Get started quickly with our documentation

Vast proxy pool

Use a pool of 177M+ proxies for geo-targeting

Bulk data extraction

Submit up to 5,000 URLs per batch and collect data from multiple pages at once (see the sketch after this list)

Multiple delivery options

Get results via the API or to your Amazon S3 or Google Cloud Storage bucket

Highly scalable

Easy to integrate and customize, with support for a high volume of requests

24/7 assistance

Contact our support staff and resolve your issues anytime
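The multi-page and bulk options build on the same request shown earlier: start_page and pages already cover several result pages per query, and looping over queries extends that. The sketch below only uses the real-time endpoint from the example above; the dedicated batch endpoint for thousands of URLs is covered in the documentation.

import requests

# Run a few queries through the real-time endpoint shown earlier.
# Each request covers two result pages via 'start_page' and 'pages';
# for thousands of URLs in one call, see the batch endpoint in the docs.
queries = ['adidas', 'nike', 'puma']

for query in queries:
    payload = {
        'source': 'google_search',
        'query': query,
        'parse': True,
        'domain': 'nl',
        'geo_location': 'Netherlands',
        'start_page': '1',
        'pages': '2',
    }
    response = requests.post(
        'https://realtime.oxylabs.io/v1/queries',
        auth=('USERNAME', 'PASSWORD'),
        json=payload,
    )
    print(query, response.status_code)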

Collect Google data smarter with API features

Custom Parser

Independently write parsing instructions and parse any target effortlessly while using our infrastructure.

  • No need to maintain your own parser

  • Define your own parsing logic with XPath and CSS selectors (see the sketch after this list)

  • Collect ready-to-use structured data from Google Maps
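Here is a sketch of what a request with custom parsing instructions can look like. The parsing_instructions schema used below (_fns lists of functions such as xpath_one) follows the Custom Parser documentation, but treat the exact function names as something to verify there.

# Sketch of a payload with custom parsing instructions.
# Assumption: the '_fns' / '_fn' / '_args' schema and the 'xpath_one'
# function match the Custom Parser documentation; verify before use.
payload = {
    'source': 'google_search',
    'query': 'adidas',
    'parse': True,
    'parsing_instructions': {
        'page_title': {
            '_fns': [
                {'_fn': 'xpath_one', '_args': ['//title/text()']},
            ],
        },
    },
}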

Scheduler

Automate recurring data extraction and parsing jobs at the frequency you need with the Scheduler feature (a sketch follows the list below).

  • Create multiple schedules for different jobs

  • Receive data automatically to your preferred cloud storage

  • Get notified once each job is done
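The sketch below illustrates the idea behind a schedule: a cron expression plus a list of job payloads and an end time. The endpoint and field names used here (cron, items, end_time) are assumptions, so consult the Scheduler documentation for the exact schema.

import requests

# Sketch only: the endpoint and the 'cron' / 'items' / 'end_time' fields
# are assumptions; check the Scheduler documentation for the exact schema.
schedule = {
    'cron': '0 8 * * 1',  # every Monday at 08:00
    'items': [
        {
            'source': 'google_search',
            'query': 'adidas',
            'parse': True,
            'domain': 'nl',
        },
    ],
    'end_time': '2026-12-31 00:00:00',
}

response = requests.post(
    'https://data.oxylabs.io/v1/schedules',
    auth=('USERNAME', 'PASSWORD'),
    json=schedule,
)
print(response.json())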

Google Search scraper pricing

Gather data from Google effortlessly

Regular and Enterprise plans are available.

  • Pay only for successful results

  • Gather coordinate-level SERP data

  • Receive data collection know-how

Free trial: 1-week trial, $0

  • 5,000 results
  • 10 requests/s rate limit
  • Premium Proxies, AI-Powered Web Scraping, JavaScript Rendering

Micro: $49/mo ($1.35 per 1K results)

  • 36,296 results
  • 50 requests/s rate limit
  • Premium Proxies, AI-Powered Web Scraping, JavaScript Rendering
  • $49 + VAT billed monthly

Starter: $99/mo ($1.30 per 1K results)

  • 76,154 results
  • 50 requests/s rate limit
  • Premium Proxies, AI-Powered Web Scraping, JavaScript Rendering
  • $99 + VAT billed monthly

Advanced: $249/mo ($1.25 per 1K results)

  • 199,200 results
  • 50 requests/s rate limit
  • Premium Proxies, AI-Powered Web Scraping, JavaScript Rendering
  • $249 + VAT billed monthly

Yearly plans discount

Get 10% off all our plans by paying yearly. Contact our sales team to learn more.


Google scraping tutorials

Frequently asked questions

How does Google SERP API work?

Simply put, you first select your target website and send a request to our API containing the link (or links to several pages). Then, our API returns the data in a structured format so you can easily analyze it.

Of course, in reality, the process is a bit more complex than that – check out this video where one of our professionals explains how the tool works step by step.

What is SERP data used for? 

One of the most common use cases for SERP data is SEO monitoring – tracking the visibility and rankings of your (or your competitor’s) website. SEO specialists then analyze this data and make strategic decisions where necessary. Other businesses also monitor reviews, product pricing, and hotel data that appears on Google. Feel free to test out our API with a free trial and get started easily by following our guide to retrieving Google search results.

If you're completely new to web scraping, check out our article about the best websites to scrape for practice before moving on to Google.
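To illustrate the SEO monitoring use case above: with parse enabled, checking where a given domain ranks in the organic results takes only a few lines. As in the earlier sketch, this assumes the parsed payload sits under results[0]['content'] in the full response and continues from the request example at the top of the page.

# Find the organic position of a target domain in a parsed response.
# Assumption: the parsed payload sits under results[0]['content'],
# as in the output preview earlier on this page.
target = 'adidas.nl'
content = response.json()['results'][0]['content']

for item in content['results']['organic']:
    if target in item['url']:
        print(f"{target} found at organic position {item['pos']}")
        break
else:
    print(f"{target} not found on this page")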

Does Google offer an official API for collecting its data?

No, as of this moment, Google doesn’t provide an official tool for retrieving its search data, so businesses typically use third-party tools or build their own.