
How to Scrape Google Maps Using Python


Danielius Radavicius

2025-02-25 · 6 min read

Web scraping has become essential for many businesses, and Google Maps is a popular source of valuable data. In this article, we’ll discuss what this data may be and how to scrape Google Maps using an Oxylabs solution.


What data can I extract from Google Maps?

If you're wondering what data can be scraped from Google Maps, the table below provides a clear overview of the information available and where you can use it.

Data type | Common use cases
Business Listings | Local business directories, competitor analysis
Reviews & Ratings | Sentiment analysis, reputation management, market research
Geolocation Data | Mapping services, logistics
Photos & Images | Enhancing business profiles, image recognition analysis
Place Details | Travel apps, customer insights, business intelligence
Street View Images | Virtual tours, real estate visualization, urban planning
Traffic & Route Data | Navigation apps, ride-sharing services
Nearby Places | Location-based services, event planning

Knowing how to scrape the Google Maps website gives businesses and developers access to a wealth of data for various applications. This data is widely used across industries like e-commerce, travel, real estate, and logistics, helping businesses optimize operations and enhance customer experiences.

For example, businesses can analyze customer sentiment through average ratings and reviews, and pinpoint locations using geolocation data. By scraping data from Google Maps, it’s also possible to uncover trends for strategic planning. Lastly, the search functionality makes nearby places and real-time traffic updates easily accessible.

Should you use the official Google Maps API?

The core purposes of scraping Google Maps results are numerous. From a research perspective, a user may want to employ a Google Maps data scraper to analyze demographic information or transportation routes. For businesses, a Google Maps scraper may be the go-to tool for competitor analysis, as it allows you to collect data on competitors' locations, customer reviews, and ratings. Gathering real estate/property listings is a possible use case as well.

Quite a few popular websites, like Twitter or Amazon, provide their own APIs. Google is no exception, so the question naturally arises: why not use the official Google Maps API?

Let’s start with the pricing. Every user receives a $200 monthly credit for Google Maps API calls, which covers approximately:

  • Up to 40,000 Geocoding requests

  • Up to 100,000 Static Maps loads

  • Up to 28,500 Dynamic Maps loads

  • Up to 40,000 Directions requests

  • Up to 11,765 Places API requests (Basic Data)

At first, this might seem like a lot, but it often isn’t. Once you exceed these limits, Google starts charging per request, which can add up quickly. Imagine using the Embed API in Directions, Views, and Search modes. A single service request that includes address search through autocomplete could trigger multiple API calls. If you add geolocation services for directions or distances, a single request could consume three separate API calls.

As your business grows and you need additional data, your daily API usage grows with it. Over time, the Google Maps API can become an extremely costly way to retrieve data.
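To put the free tier in perspective, here's a minimal back-of-the-envelope sketch that derives an implied per-request cost from the credit figures above. This is only an approximation: actual Google pricing is tiered per SKU and changes over time.

# Rough estimate only: implied per-request cost once the $200 monthly
# credit runs out, derived from the quota figures listed above.
MONTHLY_CREDIT_USD = 200

free_quota = {
    'Geocoding': 40_000,
    'Static Maps': 100_000,
    'Dynamic Maps': 28_500,
    'Directions': 40_000,
    'Places API (Basic Data)': 11_765,
}

for api, covered_requests in free_quota.items():
    implied_cost = MONTHLY_CREDIT_USD / covered_requests
    print(f'{api}: ~${implied_cost:.4f} per request')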

Yet the high price isn’t the only limitation of the Google API. There are also strict request limits: Google’s currently enforced rate limit is up to 100 requests per second.

Google is also known to implement unpredictable changes that offer little benefit to their users, such as the limits imposed in 2010.

However, products like Oxylabs' Google Maps API (part of the Web Scraper API) are specifically made to avoid limitations such as the ones mentioned above, which is why they’re commonly chosen instead of official APIs to scrape data from Google Maps.

How to scrape data from Google Maps?

Before you begin

To extract data from Google Maps, you will need Oxylabs' Web Scraper API. Sign up for the service and take note of your username and password.

Replace USERNAME with your username and PASSWORD with your password throughout the code samples in this guide.
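Rather than hard-coding credentials, you can also read them from environment variables. A minimal sketch; the variable names below are our own choice, not an Oxylabs requirement:

import os

# Hypothetical variable names; pick whatever fits your setup.
USERNAME = os.environ.get('OXYLABS_USERNAME', 'USERNAME')
PASSWORD = os.environ.get('OXYLABS_PASSWORD', 'PASSWORD')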

Setting up your project environment

Before writing code for a Google Maps data scraping project, we must set up a project environment and install the necessary Python libraries.

Create a new virtual environment to separate your project dependencies from your system packages. Ensure that you have Python 3.8 or newer installed. Run the following command in a terminal:

$ python3 -m venv env

On Windows, use python instead of python3. Then activate the virtual environment:

  • Windows: env\Scripts\activate

  • macOS/Linux: source env/bin/activate

Install the required Python libraries for this project. You'll be using beautifulsoup4, lxml, requests, and pandas. You can install them by running the following:

$ pip install beautifulsoup4 requests pandas lxml

Alternatively, instead of Beautiful Soup, you can use the built-in Custom Parser feature of the API. With your project environment set up, you're ready to start writing code to scrape Google Maps data.

Fetching data with Google Scraper API

We'll be using Oxylabs' Google Search API to fetch data from Google Maps. This API allows us to send HTTP requests to Google and receive the HTML content of the search results page. For a detailed tutorial, see How to Scrape Google Search Results.

1. First, open google.com in your browser and search for "restaurants near me". You’ll see the search results with the restaurants' names, ratings, hours, and other data points. Keep this page open, as it’ll be useful when using Developer Tools to craft element selectors.

2. To use the Web Scraper API, you need to set the following parameters:

  • source: This will be google_maps;

  • query: This will be your search term, such as “restaurants near me”;

  • domain: This parameter localizes the results by using a specific domain, for example, .com;

  • geo_location: Sets a specific Google Maps location for the search;

  • start_page: Sets the search results page from which to start scraping;

  • pages: Sets the total number of pages to scrape.

3. Create a payload dictionary that contains these parameters:

import re

import lxml.html
import pandas as pd
import requests
from bs4 import BeautifulSoup

payload = {
    'source': 'google_maps',
    'query': 'restaurants near me',
    'user_agent_type': 'desktop',
    'domain': 'com',
    'geo_location': 'New York,United States',
    'start_page': '1',
    'pages': '3'
}

4. The next step is to send these parameters to the API endpoint. For this, you can use the requests library to send a POST request as follows:

response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('USERNAME', 'PASSWORD'),
    json=payload,
    timeout=180
)
print(response.status_code)

Replace USERNAME and PASSWORD with your actual username and password. If everything went well, you should get a response with status code 200.
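As a small optional safeguard, you can fail early on 4xx/5xx responses (for example, bad credentials or an exhausted quota) instead of trying to parse an error body:

# Optional: raise immediately on HTTP errors such as 401 or 429.
try:
    response.raise_for_status()
except requests.HTTPError as err:
    print(f'Request failed: {err}')
    raise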

5. You can get the HTML files for each page from the JSON results as follows:

results = response.json()['results']
html_files = [result['content'] for result in results]

6. The next step is to parse the returned HTML files.

Parsing Google Maps data

Once you have the HTML content of the search results page, you can use the BeautifulSoup library to parse the data. In this example, we'll extract the following data points from each place listed in the search results: Name, Place Type, Address, Rating, Price Level, Rating Count, Latitude, Longitude, Hours, and other details.

First, open your browser and open the Google Maps page. Right-click on any of the listings and select Inspect.

Try to create a selector that selects exactly one listing at a time. You can do this with this CSS selector: [class="VkpGBb"].
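Before moving on, it can help to verify the selector against the HTML you've already fetched. A minimal sanity check, assuming the html_files list from the previous step:

# Count how many listings the selector matches on the first page.
soup = BeautifulSoup(html_files[0], 'html.parser')
print(len(soup.select('[class="VkpGBb"]')))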

Afterward, you can loop over all the matches of this selector and then look for specific data points. So, let’s create CSS selectors for each data point you may want to scrape.

[Image: inspecting elements with Developer Tools to build selectors for a Google Maps scraper in Python]

As seen in the above image, you can select the name of the restaurant with the following CSS selector:

[role="heading"]

The following are all the selectors:

name_selector = '[role="heading"]'
rating_selector = 'span[aria-hidden="true"]'
rating_count_selector = '[class*="RDApEe"]'
hours_selector = '.rllt__details div:nth-of-type(4)'
details_selector = '.rllt__details div:nth-of-type(5)'
price_selector = '.rllt__details div:nth-of-type(2) > span:nth-of-type(2)'
lat_selector = '[data-lat]'
lng_selector = '[data-lng]'
type_selector = '//div[@class="rllt__details"]/div[2]/text()'
address_selector = '.rllt__details div:nth-of-type(3)'

Note that type_selector uses an XPath expression instead of a CSS selector.

You can use BeautifulSoup's select and select_one methods to select elements and then extract the text within those elements.
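For instance, here's a minimal sketch that extracts the first listing's name from the first fetched page, guarding against missing elements:

soup = BeautifulSoup(html_files[0], 'html.parser')
listing = soup.select_one('[class="VkpGBb"]')
if listing:
    # The data points sit on the listing's parent element, so step one
    # level up before applying the data point selectors.
    name_el = listing.parent.select_one('[role="heading"]')
    print(name_el.text.strip() if name_el else 'name not found')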

The rating count needs a different approach: it's enclosed in parentheses next to the rating, for example, 4.3(513). Hence, you can use a regex (regular expression) to extract the value between the brackets as follows:

count_match = re.search(r"\((.+)\)", rating_count_el.text)
rating_count = count_match.group(1) if count_match else ""
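To see what the pattern does, you can run it against the example string from above:

import re

count_match = re.search(r'\((.+)\)', '4.3(513)')
print(count_match.group(1) if count_match else '')  # prints: 513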

Putting everything together, the following code generates a list of dictionaries that contain all the data from all the listings on the page:

data = []
for html in html_files:
    soup = BeautifulSoup(html, 'html.parser')
    # Build an lxml tree as well, since the place type uses an XPath selector
    lxml_obj = lxml.html.fromstring(str(soup))

    # The XPath matches the details row of every listing on the page, so
    # collect all place types once and index into the list per listing
    place_types = []
    for item in lxml_obj.xpath(type_selector):
        parts = [part.strip() for part in item.strip().split('·') if part.strip()]
        if parts:
            place_types.append(parts[-1])

    # The coordinates live on a page-level element, not inside each listing
    lat_el = soup.select_one(lat_selector)
    lat = lat_el.get('data-lat') if lat_el else ''
    lng_el = soup.select_one(lng_selector)
    lng = lng_el.get('data-lng') if lng_el else ''

    for index, listing in enumerate(soup.select('[class="VkpGBb"]')):
        # The data points sit on the listing's parent element
        place = listing.parent

        name_el = place.select_one(name_selector)
        name = name_el.text.strip() if name_el else ''

        rating_el = place.select_one(rating_selector)
        rating = rating_el.text.strip() if rating_el else ''

        # The rating count appears in parentheses next to the rating,
        # e.g. 4.3(513), so pull out the value between the brackets
        rating_count_el = place.select_one(rating_count_selector)
        rating_count = ''
        if rating_count_el:
            count_match = re.search(r'\((.+)\)', rating_count_el.text)
            rating_count = count_match.group(1) if count_match else ''

        # Keep the hours text only if it actually describes opening times
        hours_el = place.select_one(hours_selector)
        hours = hours_el.text.strip() if hours_el else ''
        if 'opens' not in hours.lower():
            hours = ''

        details_el = place.select_one(details_selector)
        details = details_el.text.strip() if details_el else ''

        price_level_el = place.select_one(price_selector)
        price_level = price_level_el.text.strip() if price_level_el else ''

        address_el = place.select_one(address_selector)
        address = address_el.text.strip() if address_el else ''

        data.append({
            'name': name,
            # Guard against a mismatch between listings and extracted types
            'place_type': place_types[index] if index < len(place_types) else '',
            'address': address,
            'rating': rating,
            'price_level': price_level,
            'rating_count': rating_count,
            'latitude': lat,
            'longitude': lng,
            'hours': hours,
            'details': details,
        })

The next step is to save this data as CSV.

Exporting Google Maps data to CSV

With the data parsed, the final step is to export it to a CSV file. Let’s use the Pandas library to create a DataFrame and save it as a CSV file:

df = pd.DataFrame(data)
df.to_csv("data.csv", index=False)

When you run this code, it will save the data to a CSV file named data.csv.
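If you need a different output format, pandas can write those too. A quick sketch; note that to_excel requires an additional engine such as openpyxl (pip install openpyxl):

df.to_json('data.json', orient='records', indent=2)
df.to_excel('data.xlsx', index=False)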

[Image: the exported Google Maps data in a CSV file]

Ways of scraping Google Maps compared

There are three main methods for scraping Google Maps: manual scraping (without proxies), scraping with proxies, and scraper APIs. Each approach serves different needs – manual scraping works for small-scale data extraction, proxies help bypass restrictions for larger data collection, and scraper APIs provide a streamlined, automated solution for large-scale extraction from the Google Maps website. Below is a breakdown of each method, explaining when and how to use them effectively.

Manual scraping (without proxies)

  • Key features: single, static IP address; direct network requests; local execution environment

  • Pros: maximum flexibility; no additional service costs; complete data pipeline control; minimal latency

  • Cons: high likelihood of IP blocks; regular maintenance; limited scaling; no geo-targeting

  • Best for: small-scale scraping; unrestricted websites; development and testing; custom data extraction logic

Scraping using proxies

  • Key features: IP rotation; geo-targeting; request distribution; anti-detection measures

  • Pros: improved success rate; reduced IP blocking; coordinate-, city-, and state-level targeting; anonymity

  • Cons: additional proxy service costs; manual proxy management; additional setup; increased request latency

  • Best for: medium to large-scale scraping; restricted websites; global targets

Scraper APIs

  • Key features: maintenance-free infrastructure; CAPTCHA handling; JavaScript rendering; automatic proxy management

  • Pros: minimal maintenance overhead; built-in error handling; regular updates for site layout changes; technical support

  • Cons: higher costs; fixed customization; API-specific limitations; dependency on provider

  • Best for: enterprise-level scraping; complex websites with anti-bot measures; resource-constrained teams; quick implementation

Conclusion

Scraping Google Maps isn't an easy task, but this guide should help you understand both how the scraping process works and how it functions in tandem with our API solution. The aim of this tutorial was to provide a comprehensive, step-by-step guide, but if you have any questions, don't hesitate to contact us or chat with our live support team, available 24/7.

For web scraping, proxies are an essential anti-blocking measure. To avoid detection by the target website, you can buy proxies of various types to fit any scraping scenario, such as residential or datacenter IPs.

Want to extract data from other Google platforms, not only the Google Maps website? Check out our how-to guides for scraping Jobs, Search, Images, Trends, News, Flights, Shopping, and Scholar.

About the author


Danielius Radavicius

Former Copywriter

Danielius Radavičius was a Copywriter at Oxylabs. Having grown up surrounded by films, music, and books, and having a keen interest in the defense industry, he decided to move his career toward tech-related subjects and quickly became interested in all things technology. In his free time, you'll probably find Danielius watching films, listening to music, and planning world domination.

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.

People also ask

Is it possible to scrape Google Maps?

Yes, you can use various programming languages or automated solutions such as Google Maps data extractor APIs to scrape Google Maps.

Is it legal to scrape data from Google Maps?

Yes, Google Maps provides publicly available data. Whether you copy-paste it, write it down, or scrape it – no matter how you collect it, the essence of public data is that it's free to use and share.

It's essential to comply with the website's terms of service, respect any restrictions or limitations on data usage, and adhere to legal and ethical guidelines governing Google Maps business scraper activities.

For more specific data extraction scenarios involving copyrighted or sensitive material, please seek professional legal guidance and analyze applicable national and international legislation. To learn more about the legalities of web scraping, check here.

Why collect data from Google Maps?

The purpose of web data extraction from Google Maps is subsequent business data analysis. By extracting data with a Google Maps business scraper, users can gain insights, identify patterns, perform market research, or make informed decisions.

What is the best scraper for Google Maps?

The answer depends on your needs and available budget. You can always build a custom Google Maps scraper, but you'll need a headless browser to render JavaScript, proxies to overcome IP blocks and detection systems, and significant development skills to create a scalable scraper that overcomes anti-scraping measures. If this option doesn't suit you, a better way is to use a dedicated Google Maps scraping tool like Oxylabs' Google Maps Scraper API, which bypasses anti-scraping systems, runs an internal headless browser, scales according to your needs, and offers plenty of features to ease your scraping processes.
