
Scrape Google Maps Data using Python



When it comes to collecting location-based information, scraping Google Maps is one of the best options, offering everything from phone numbers and reviews to details like operating hours, current open status, and ratings.

However, collecting this data manually is time-consuming, error-prone, and inefficient.

In this article, we’ll explore how to automate the process using Scrapingdog’s Google Maps API and Python.

Whether you’re looking to extract reviews, phone numbers, or other key details, this guide will show you how to efficiently scrape Google Maps data to generate local or B2B leads, depending on the business you run.

Where Can You Use this Scraped Data from Google Maps?

  • You can use this data to generate leads. With every listing, you will get a phone number to reach out to.
  • Reviews can be used for sentiment analysis of any place.
  • Posts shared by the company can be used for real-estate mapping.
  • Competitor analysis by scraping data like reviews and ratings of your competitor.
  • Enhancing existing databases with real-time information from Google Maps.

Preparing the Food

  • I am assuming you have already installed Python on your machine; if not, you can install it from here. Now, create a folder where you will keep your Python script. I am naming the folder maps.
				
    mkdir maps

Inside this folder, we have to install two Python libraries: requests and pandas.

				
    pip install requests
    pip install pandas

requests will be used for making an HTTP connection with the data source and pandas will be used for storing the extracted data inside a CSV file.

Once this is done, sign up for Scrapingdog’s trial pack.

The trial pack comes with 1,000 free credits, which is more than enough for testing the API. We now have all the ingredients to prepare the Google Maps scraper.

Let’s dive in!!

Making the First Request

Let’s say I am a manufacturer of hardware items like taps, showers, etc., and I am looking for leads near my factory which is based out of New York.

Here I can take advantage of data from Google Maps to find hardware shops around me.

With the Google Maps API I can extract their phone numbers and then reach out.

If you look at the Google Maps API documentation, you’ll notice that it requires two mandatory arguments in addition to your secret API key: coordinates and a query. (see the image below)

The coordinates represent the latitude and longitude of New York, which you can easily extract from the Google Maps URL for the New York location. The query will be “hardware shops near me”.

Google Maps Documentation
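If you would rather not paste the coordinates by hand each time, the `ll` string can be assembled with a small helper. This is just a convenience sketch — `to_ll` is a name I made up, and the `@latitude,longitude,<zoom>z` shape simply mirrors the format shown in the documentation:

```python
def to_ll(lat: float, lng: float, zoom: int = 10) -> str:
    # Mirror the "@latitude,longitude,<zoom>z" shape the API expects
    return f"@{lat},{lng},{zoom}z"

# The New York coordinates used later in this article
print(to_ll(40.6974881, -73.979681))  # @40.6974881,-73.979681,10z
```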

Before writing a single line of code, you can try this directly from your dashboard. Just enter the coordinates and the query, and hit enter.

As you can see in the above GIF you got a beautiful JSON response that shows every little detail like operating hours, title, phone number, etc.

Also, you can watch the video below wherein I have quickly explained how you can use Scrapingdog Google Maps API.

We can code this in Python and save the data directly to a CSV file. Here we will use the pandas library.

By the way, a Python code snippet is ready for you on the right-hand side of the dashboard.

Just copy that and paste it into a Python file inside your maps folder. I am naming this file leads.py.

				
import requests
import pandas as pd

api_key = "your-api-key"
url = "https://api.scrapingdog.com/google_maps"

params = {
    "api_key": api_key,
    "query": "hardware shops near me",
    "ll": "@40.6974881,-73.979681,10z",
    "page": 0
}

response = requests.get(url, params=params)

if response.status_code == 200:
    data = response.json()
    leads = []
    for result in data["search_results"]:
        # .get() avoids a KeyError when a listing is missing a field
        leads.append({
            "Title": result.get("title"),
            "Address": result.get("address"),
            "PhoneNumber": result.get("phone"),
        })
    df = pd.DataFrame(leads)
    df.to_csv("leads.csv", index=False, encoding="utf-8")
else:
    print(f"Request failed with status code: {response.status_code}")

Once we get the results, we iterate over the JSON response. Inside the loop, for each result, we extract the title, address, and phone number and store them in a dictionary.

After the loop finishes, the list containing all the extracted records is converted into a pandas DataFrame.

Finally, the DataFrame is saved to a CSV file named leads.csv. It also ensures that the file doesn’t include row numbers (index) and encodes the file in UTF-8.

Once you run this code a leads.csv file will be created inside the folder maps.

This file now contains all the phone numbers and the addresses of the hardware shops. Now, I can call them and convert them into clients.
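The snippet above fetches only page 0. If you need more than one page of leads, you can loop over the `page` parameter. The sketch below assumes that pages keep incrementing until the API returns an empty `search_results` list — verify that behaviour against the documentation before relying on it:

```python
import requests

API_KEY = "your-api-key"
URL = "https://api.scrapingdog.com/google_maps"

def rows_from_payload(payload):
    # Pick out the fields we care about from one page of results;
    # .get() keeps a listing without a phone number from raising a KeyError
    return [
        {
            "Title": r.get("title"),
            "Address": r.get("address"),
            "PhoneNumber": r.get("phone"),
        }
        for r in payload.get("search_results", [])
    ]

def scrape_pages(query, ll, max_pages=3):
    rows = []
    for page in range(max_pages):
        response = requests.get(URL, params={
            "api_key": API_KEY,
            "query": query,
            "ll": ll,
            "page": page,
        })
        if response.status_code != 200:
            break
        page_rows = rows_from_payload(response.json())
        if not page_rows:
            # Assumption: an empty search_results list means no more pages
            break
        rows.extend(page_rows)
    return rows
```

You would then feed the combined rows into pd.DataFrame exactly as before. How many results each page holds, and whether paging simply stops at an empty list, are assumptions worth confirming in the API docs.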

If you are not a developer and want to learn how to pull this data into Google Sheets, I have built a guide on scraping websites using Google Sheets; with it, you will understand how to pull data from any API into a Google Sheet.

Scraping Reviews from Google Maps

Extracting reviews with this API is also straightforward. Before proceeding, you should refer to the documentation of the Google Reviews API. In this example, we will extract the reviews of Le Cafe Coffee, a coffee place that is also located in New York.

While reading the docs you will see that an argument by the name data_id is required. (see below⬇️)

This is a secret ID allocated to this place on Google Maps. Every place has its own ID. To get this ID, search for the cafe on Google Maps; you will then land on this URL — https://www.google.com/maps/place/le+cafe+coffee/@40.7320513,-74.0106458,13z/data=!4m10!1m2!2m1!1scoffee+shops!3m6!1s0x89c258e4c2066753:0x64f50a662bfb26d7!8m2!3d40.7597355!4d-73.9697968!15sCgxjb2ZmZWUgc2hvcHNaCCIGY29mZmVlkgELY29mZmVlX3Nob3DgAQA!16s%2Fg%2F11c1pcnzbf?entry=ttu&g_ep=EgoyMDI0MTAwMi4xIKXMDSoASAFQAw%3D%3D.

In this URL you will see a string starting with 0x89 and ending just before a ‘!’. In the URL above, it is 0x89c258e4c2066753:0x64f50a662bfb26d7 — and yes, you have to take everything before the ‘!’.

Marked in the image below ⬇️ is what I am referring to.
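If you would rather not eyeball the URL, a regular expression can pull the ID out for you. This is just a convenience sketch based on the pattern visible in the URL above — two hex numbers joined by a colon:

```python
import re

maps_url = ("https://www.google.com/maps/place/le+cafe+coffee/"
            "@40.7320513,-74.0106458,13z/data=!4m10!1m2!2m1!1scoffee+shops"
            "!3m6!1s0x89c258e4c2066753:0x64f50a662bfb26d7!8m2!3d40.7597355!4d-73.9697968")

# Match two colon-separated hexadecimal numbers, e.g. 0x...:0x...
match = re.search(r"0x[0-9a-fA-F]+:0x[0-9a-fA-F]+", maps_url)
data_id = match.group(0) if match else None
print(data_id)  # 0x89c258e4c2066753:0x64f50a662bfb26d7
```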

Let’s code this in Python and extract the reviews.

				
import requests

api_key = "your-api-key"
url = "https://api.scrapingdog.com/google_maps/reviews"

params = {
    "api_key": api_key,
    "data_id": "0x89c258e4c2066753:0x64f50a662bfb26d7"
}

response = requests.get(url, params=params)

if response.status_code == 200:
    data = response.json()
    print(data)
else:
    print(f"Request failed with status code: {response.status_code}")

Once you run this code you will get this beautiful JSON response.

As you can see, you got the rating, review, name of the person who posted the review, etc. Using this data, you can do sentiment analysis, which can help you judge the customer’s mood.
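As a very rough starting point — well short of real sentiment analysis — you can bucket reviews by their star rating. The field name `rating` here is an assumption based on the response described above; check it against the JSON you actually receive:

```python
def bucket_reviews(reviews):
    # Rough buckets by star rating; assumes each review dict
    # carries a numeric "rating" field (an assumption — verify it)
    buckets = {"positive": [], "neutral": [], "negative": []}
    for review in reviews:
        rating = review.get("rating", 0)
        if rating >= 4:
            buckets["positive"].append(review)
        elif rating == 3:
            buckets["neutral"].append(review)
        else:
            buckets["negative"].append(review)
    return buckets

sample = [{"rating": 5, "snippet": "Great coffee"},
          {"rating": 2, "snippet": "Slow service"}]
result = bucket_reviews(sample)
print(len(result["positive"]), len(result["negative"]))  # 1 1
```

A real pipeline would run the review text through a proper sentiment model, but a rating split like this is often enough for a first look at the customer’s mood.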

Conclusion

Using Python is a good way to start scraping Google Maps; however, if you want to scale the process, using an API is the best choice.

Scrapingdog provides a Google Maps API for building a hassle-free solution.

If you like this article, do share it on social media, and if you have any questions or suggestions do reach out to me on live chat here🙂 



Hi, I am Darshan Khandelwal, and I manage all the tech stuff (CTO) at Scrapingdog. I hope you enjoyed reading this article as much as I loved writing it 😀
