
How to Scrape Google Shopping Results: A Step-by-Step Guide


Yelyzaveta Hayrapetyan

Last updated on

2025-11-06

6 min read

In today’s competitive business environment, it’s hard to imagine an e-commerce company or retailer staying in demand without turning to web scraping. The short answer as to why: gathering accurate public data from thousands of targets worldwide, often with the help of a proxy, is what lets them draw actionable insights and, ultimately, present customers with the best deals.

This tutorial will demonstrate how you can scrape publicly available data from Google Shopping hassle-free. In addition to the guide itself, we’ll briefly cover whether it’s legal to scrape Google Shopping and what difficulties you can encounter in the process.

What is Google Shopping?

Formerly known as Google Products Search, Google Products, and Froogle, Google Shopping is a service that allows users to browse, compare, and shop for products from different suppliers who have paid to be featured on the website. 

While giving consumers an opportunity to choose the best offers among thousands of brands, Google Shopping is also beneficial for retailers. When a user clicks on a product link, they are redirected to the vendor’s website for purchasing; thus, Google Shopping acts as a solution for businesses to advertise their products online.

More information on how Google Shopping works can be found in Google's official documentation.

Google Shopping results page structure overview

The data you get when browsing Google Shopping can be segmented into two categories: Search and Product. Let's briefly discuss each:

  • Search: A list of items for a specific search term on Google Shopping with information about each item, such as title, description, price, availability, and more.

  • Product: Comprehensive information about a single product's listing, details about other retailers selling it, and the costs at which it’s offered.
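These two data categories correspond to the two scraper sources used later in this tutorial. As a quick sketch (using the source names shown in the steps below), a minimal payload for each could look like this:

```python
# Minimal payloads for the two page types; 'parse': True requests structured JSON.
search_payload = {
    'source': 'google_shopping_search',
    'query': 'pixel 10',
    'parse': True,
}

product_payload = {
    'source': 'google_shopping_product',
    'query': 'PRODUCT_TOKEN',  # Placeholder: a token obtained from search results.
    'parse': True,
}
```

We’ll build out full versions of both payloads, with localization and filtering options, in the step-by-step guide below.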

Search page 

The Google Shopping search results page lists all the relevant items available for the searched product. The screenshot below highlights different attributes of a results page for the query “pixel 10.”

Results page
  • Search bar: Allows a user to search for any product on Google Shopping. 

  • Sponsored products: A list of products that merchants pay to promote, typically displayed at the top or throughout search results.

  • List of organic products: Lists all the organically sold products and the details of each product. 

  • Filters: Allows you to apply any filter to your search, for example, price range, color, style, etc. 

  • Sorting options: This drop-down list enables you to sort your search results by multiple attributes, for example, ascending price, descending price, popularity, etc.

  • Product listings: Each entry in the list shows an individual product with the following attributes: product name, price, name of the retailer or store, and ratings.
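The search bar’s behavior can also be reproduced programmatically. As a minimal sketch (assuming the common `tbm=shop` URL pattern, which Google may change at any time), a Google Shopping search URL can be built like this:

```python
from urllib.parse import urlencode

def build_shopping_url(query: str) -> str:
    # The tbm=shop parameter switches Google Search to Shopping results
    # (assumed URL pattern; subject to change on Google's side).
    return "https://www.google.com/search?" + urlencode({"tbm": "shop", "q": query})

print(build_shopping_url("pixel 10"))
```

Requesting such URLs directly is exactly where the anti-bot challenges discussed below come in, which is why the rest of this guide uses a scraping API instead.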

Products page

When you select a specific item from the search page, the product page expands from the side. This page contains detailed information about that particular product, such as its pictures, key features, product details, reviews, retailer and price information, and much more.

Product page
  • Product name: Title of the product. 

  • Product information: Thumbnails, variants, and general specifications.

  • Product details: Detailed description of the product.

  • Prices: List of different retailers and their prices. 

  • Min and max prices: The product’s price range, from the lowest to the highest price offered by different sellers.

  • Top insights: Featured information about the product.

  • Product reviews: Product rating and customer reviews. 

  • Related items: A list of similar products.

Can you scrape Google Shopping for free?

Before we get into the full tutorial, you should know that we offer free Datacenter Proxies. You can use them as an alternative for smaller-scale Google Shopping scraping, for which you’ll need to build your own script.


To claim your free proxies, simply log into the dashboard – there, you’ll find the free proxy IPs:

Get free proxies

Get 5 IPs for free and enjoy exceptional performance & speed.

No credit card required

20 concurrent sessions

5GB of traffic per month across 5 US IPs

In general, web scraping is legal as long as you strictly follow all the regulations surrounding the public data you wish to gather. However, we still recommend seeking professional legal advice to rule out any possible risks.

If you wish to dive deeper into the topic of web scraping legality, check out our extensive blog post.

The pain of scraping Google Shopping

Though doable, scraping Google Shopping might not be the easiest task to take on. Not only is Google Shopping good at detecting automated requests, but it also requires rendering JavaScript, which is an “expensive” operation that slows down the scraping process.

Therefore, to make sure you effortlessly scrape and parse a variety of Google Shopping page types, it’s best to rely on a high-quality scraping solution, such as Oxylabs’ Google Shopping API. This API is specifically designed to deal with the challenges of the Google scraping process and lets you gather accurate real-time data globally. If you want to extract data from the Google search engine itself, check out our other tutorial on how to scrape Google search results. For speed-sensitive projects, such as AI-driven tools, you can even scrape Google organic results in under a second.

Get a free trial

Claim your free trial to test Web Scraper API for your use case.

Up to 2K results

No credit card required

Step-by-step guide for scraping Google Shopping results using Google Shopping API

Step 1: Set up Python and install required libraries

To get started, you must have Python 3.6+ installed on your system. Then, you need to install the following packages to code the scraper. 

  • Requests - to send the request to the API.

  • Pandas - to populate the data in the DataFrame data structure. 

To install the packages, use the following command:

pip install requests pandas

Step 2: Set up a payload

Search page

The first step is creating a structured payload containing different query parameters. Find the complete list of available API parameters for scraping Google Shopping search results in our documentation.

Here's an example payload structure that scrapes results for the pixel 10 query, sorts them by review score, sets a minimum price of $30, localizes results for the United States, and automatically parses all data:

payload = {
    'source': 'google_shopping_search',
    'query': 'pixel 10',
    'geo_location': 'US',
    'parse': True,
    'context': [
        {'key': 'sort_by', 'value': 'rv'},
        {'key': 'min_price', 'value': 30},
    ],
}

Step 3: Send a POST request

After the payload structure is ready, you can create the request by passing your authentication key.

response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'), # Use your API credentials.
    json=payload,
)
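Network calls like this can fail transiently, so it’s worth wrapping the request in a small retry helper. The following is a hypothetical sketch (not part of the API client): `post_with_retries` takes any zero-argument callable that performs the request and retries with exponential backoff until it receives a 200 response.

```python
import time

def post_with_retries(send, attempts=3, backoff=1.0):
    """Call send() and retry on exceptions or non-200 responses.

    `send` is any zero-argument callable returning a response object,
    e.g. lambda: requests.post(url, auth=auth, json=payload).
    """
    last_error = None
    for attempt in range(attempts):
        try:
            response = send()
            if response.status_code == 200:
                return response
            last_error = RuntimeError(f'HTTP {response.status_code}')
        except Exception as exc:
            last_error = exc
        if attempt < attempts - 1:
            time.sleep(backoff * 2 ** attempt)  # Exponential backoff.
    raise last_error
```

You could then replace the direct call with `response = post_with_retries(lambda: requests.post(...))`; tune `attempts` and `backoff` to fit your rate limits.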

Step 4: Extract product data from a JSON response

We’ll extract the Product Title, Price, and Store name from the response. Since we set the payload parameter parse to True, the API returns a parsed JSON response containing all of this data.

The code below extracts the data from the JSON response and stores it in a DataFrame.

result = response.json()['results'][0]['content']

with open('search.json', 'w') as file:
    json.dump(result, file, indent=2)

products = result['results']['organic']

data = []

for p in products:
    data.append({
        'Title': p.get('title', 'N/A'),
        'Price': p.get('price_str', 'N/A'),
        'Merchant URL': p.get('merchant', {}).get('url', p.get('url')),
        'Token': p.get('token', 'N/A')
    })

df = pd.DataFrame(data)

The script extracts relevant product information from the response and stores it in the df DataFrame.

Step 5: Save extracted data to a CSV using Pandas

Using the following script, we can easily export the DataFrame to CSV or JSON files:

df.to_csv('search.csv', index=False)
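The JSON export works the same way via `to_json`. Here’s a small self-contained sketch (the sample rows and the `search_sample.*` filenames are illustrative, chosen so they don’t clash with the `search.json` saved earlier):

```python
import pandas as pd

# A small sample DataFrame standing in for the scraped results.
df = pd.DataFrame([
    {'Title': 'Pixel 10', 'Price': '$799.00'},
    {'Title': 'Pixel 10 Pro', 'Price': '$999.00'},
])

df.to_csv('search_sample.csv', index=False)
# orient='records' writes one JSON object per row.
df.to_json('search_sample.json', orient='records', indent=2)
```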

Let’s put all the code together and see the output. 

import json
import requests
import pandas as pd


# Set the scraper & parameters.
payload = {
    'source': 'google_shopping_search',
    'query': 'pixel 10',
    'geo_location': 'US',
    'parse': True,
    'context': [
        {'key': 'sort_by', 'value': 'rv'},
        {'key': 'min_price', 'value': 30},
    ],
}

# Get a response.
response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'), # Use your API credentials.
    json=payload,
)

# Get result content and save it to JSON.
result = response.json()['results'][0]['content']

# Save the content to a JSON file.
with open('search.json', 'w') as file:
    json.dump(result, file, indent=2)

# Get the organic results from the content.
products = result['results']['organic']

data = []

# Iterate through all the products.
for p in products:
    data.append({
        'Title': p.get('title', 'N/A'),
        'Price': p.get('price_str', 'N/A'),
        # Get Google Shopping URL if no Merchant URL available.
        'Merchant URL': p.get('merchant', {}).get('url', p.get('url')),
        'Token': p.get('token', 'N/A')
    })

# Create DataFrame from the list of all products.
df = pd.DataFrame(data)

# Save the DataFrame to a CSV file.
df.to_csv('search.csv', index=False)

The script doesn’t contain any print statements and writes everything to CSV and JSON files. Let’s look at a portion of the output CSV file.

As expected, the output CSV contains the Product Titles, Prices, Store URLs, and Tokens for all the products listed on the search page. You can use the product token to scrape the comprehensive details from product pages. Let's see how to do that.

Product page

The payload structure will be created using different parameters for the products page. Check out our documentation for a complete list of parameters for scraping Google Shopping product pages.

We'll use one of the acquired product tokens from the scraped search results.

payload = {
    'source': 'google_shopping_product',
    'query': 'eyJjYXRhbG9naWQiOiAiNzQ1NTQwOTg0ODA0NjMwOTI3NCIsICJncGNpZCI6ICIxMjYwMTAwNDkxNTAzNTAwODUxNiIsICJpbWFnZURvY2lkIjogIjE2NDExNjQxMjcxMTgxNTM2MzI0IiwgIm1pZCI6ICIiLCAicHZvIjogIjMiLCAicHZ0IjogImhnIiwgInJkcyI6ICJQQ18xMjYwMTAwNDkxNTAzNTAwODUxNnxQUk9EX1BDXzEyNjAxMDA0OTE1MDM1MDA4NTE2IiwgInByb2R1Y3RpZCI6ICIiLCAicXVlcnkiOiAicGl4ZWwgMTAifQ==',
    'geo_location': 'US',
    'render': 'html',
    'parse': True,
}

After the payload structure is ready, you can create the request by passing your authentication key. 

response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'), # Use your API credentials.
    json=payload,
)

We’ll extract the Product Title, Product Description, Rating, Reviews Count, and the Top Review from the received response. Like in the previous section, we’ll use the JSON response to extract our desired output.

product = response.json()['results'][0]['content']

with open('product.json', 'w') as file:
    json.dump(product, file, indent=2)

row_data = {
    'Title': product.get('title', 'N/A'),
    'Description': product.get('description', 'N/A'),
    'Rating': product.get('reviews', {}).get('rating', 'N/A'),
    'Reviews Count': product.get('reviews', {}).get('reviews_count', 'N/A'),
    'Top Review': product.get('reviews', {}).get('top_review', {}).get('text', 'N/A'),
}

df = pd.DataFrame([row_data])

In the above code, we’ve created a DataFrame object that will save all the extracted data in it. We can print this DataFrame or write it into a CSV file. 

df.to_csv('product.csv', index=False)

Let’s put all the code together and see the output. 

import json
import requests
import pandas as pd


# Set the scraper & parameters.
payload = {
    'source': 'google_shopping_product',
    # Product token acquired after scraping search results.
    'query': 'eyJjYXRhbG9naWQiOiAiNzQ1NTQwOTg0ODA0NjMwOTI3NCIsICJncGNpZCI6ICIxMjYwMTAwNDkxNTAzNTAwODUxNiIsICJpbWFnZURvY2lkIjogIjE2NDExNjQxMjcxMTgxNTM2MzI0IiwgIm1pZCI6ICIiLCAicHZvIjogIjMiLCAicHZ0IjogImhnIiwgInJkcyI6ICJQQ18xMjYwMTAwNDkxNTAzNTAwODUxNnxQUk9EX1BDXzEyNjAxMDA0OTE1MDM1MDA4NTE2IiwgInByb2R1Y3RpZCI6ICIiLCAicXVlcnkiOiAicGl4ZWwgMTAifQ==',
    'geo_location': 'US',
    'render': 'html',
    'parse': True,
}

# Get a response.
response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'), # Use your API credentials.
    json=payload,
)

# Get the content and save it to JSON.
product = response.json()['results'][0]['content']

# Save the content to a JSON file.
with open('product.json', 'w') as file:
    json.dump(product, file, indent=2)

# Create a single row with product information.
row_data = {
    'Title': product.get('title', 'N/A'),
    'Description': product.get('description', 'N/A'),
    'Rating': product.get('reviews', {}).get('rating', 'N/A'),
    'Reviews Count': product.get('reviews', {}).get('reviews_count', 'N/A'),
    'Top Review': product.get('reviews', {}).get('top_review', {}).get('text', 'N/A'),
}

# Create DataFrame.
df = pd.DataFrame([row_data])

# Save the data to CSV.
df.to_csv('product.csv', index=False)

We’ve just successfully scraped a product page on Google Shopping. Let’s move on to scraping the pricing options available through the product scraper.

Pricing options

You can extract the pricing options of each seller from the previously saved product.json file. Ideally, you would extract the product data and pricing offers in one go, since the google_shopping_product source provides all this information. For demonstration purposes, we'll split this process here.

We’ll extract the complete pricing options, which include the item price, currency, shipping price, condition, seller name, seller URL, and details.

import json
import pandas as pd


# Load the product data from the JSON file.
with open('product.json', 'r') as file:
    product = json.load(file)

# Get the pricing options.
pricing = product.get('pricing', {}).get('online', [])

pricing_list = []

# Iterate through all the pricing options.
for p in pricing:
    pricing_list.append({
        'Price': p.get('price', 'N/A'),
        'Currency': p.get('currency', 'N/A'),
        'Price Shipping': p.get('price_shipping', 'N/A'),
        'Condition': p.get('condition', 'N/A'),
        'Seller': p.get('seller', 'N/A'),
        'Seller URL': p.get('seller_link', 'N/A'),
        'Details': p.get('details', 'N/A'),
    })

# Create DataFrame.
df = pd.DataFrame(pricing_list)

# Save the data to CSV.
df.to_csv('product_pricing.csv', index=False)

The above script stores the extracted data in a DataFrame object, which makes it easy to save in CSV, JSON, or other formats. Just execute the code to save all the data to a CSV file.
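As mentioned earlier, the product summary and the pricing offers can also be extracted in a single pass over the parsed response. A minimal sketch, assuming the same field names as in the parsed responses shown above:

```python
def extract_all(product):
    """Pull the product summary and its pricing offers in one pass."""
    summary = {
        'Title': product.get('title', 'N/A'),
        'Rating': product.get('reviews', {}).get('rating', 'N/A'),
    }
    offers = [
        {
            'Seller': offer.get('seller', 'N/A'),
            'Price': offer.get('price', 'N/A'),
            'Currency': offer.get('currency', 'N/A'),
        }
        for offer in product.get('pricing', {}).get('online', [])
    ]
    return summary, offers
```

Feeding it the `product` dictionary from the product page step would give you both DataFrames' worth of data without reloading the saved JSON file.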

Conclusion

Scraping Google Shopping is essential if you’re looking to retrieve accurate data on your biggest competitors’ products and prices and make data-driven decisions to scale your business. If you want to enhance your scraping capabilities, you can buy proxies, such as residential proxies, for smoother and more efficient data extraction. We hope this tutorial will make your data-gathering activities more effortless. You can find all the necessary code files on our GitHub. If you still have any questions about our Google Shopping Scraper API, don’t hesitate to contact us – Oxylabs’ professional team is always ready to assist you.

Want to broaden your Google data scraping skills? Explore our step-by-step guides for scraping Google search results, Jobs, Images, Trends, News, Flights, Scholar, AI Mode, and Maps.

About the author


Yelyzaveta Hayrapetyan

Former Senior Technical Copywriter

Yelyzaveta Hayrapetyan was a Senior Technical Copywriter at Oxylabs. After working as a writer in fashion, e-commerce, and media, she decided to switch her career path and immerse in the fascinating world of tech. And believe it or not, she absolutely loves it! On weekends, you’ll probably find Yelyzaveta enjoying a cup of matcha at a cozy coffee shop, scrolling through social media, or binge-watching investigative TV series.

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.

Related articles

Wget Proxy: How to Use Wget With a Proxy

Augustas Pelakauskas

2025-10-29

How Is AI Trained? A Guide for AI Training

Agnė Matusevičiūtė

2025-10-28

How to Scrape E-Commerce Websites With Python

Maryia Stsiopkina

2025-10-23

Try Google Shopping Scraper API

Choose Oxylabs' Google Shopping Scraper API to gather real-time product data hassle-free.
