How To Create a Google SERP Checker in Python

Written by Sc | Published 2020/02/18
Tech Story Tags: api | python | google | seo | seo-optimization | seo-tools | web-scraping | web-crawlers

TLDR The goal of SEO is to get your website to the top of the search engine results. One excellent way of tracking SEO progress is by checking the search engine result pages (SERPs) for a website. In this article, we learn how to perform free SERP searches with a web-based tool and how to build our own SERP checker tool in Python 3. The script uses Python's urllib module to call the Google Search API service, which returns its results in JSON format.

The goal of SEO is to get your website to the top of the search engine results. One excellent way of tracking SEO progress is by checking the search engine result pages (SERPs) for a website.
In this tutorial, we are going to learn how to perform free SERP searches with a web-based tool, and also build a free SERP checker tool using Python.
First, you will need to register an account to perform Google searches programmatically. Head over to Google Search API to register for the free plan, which gives you 50 free searches a day.

Manually (With API Console)

To perform manual searches, head over to the Google Search API console.
To search for a particular keyword or set of keywords, change the q parameter. To get search results for a different country, change the country parameter; it defaults to the United States. To get search results in a different language, change the language parameter.
Click test-endpoint to perform the search. By default, this API returns the top 100 results for that set of keywords.
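Behind the scenes, the console simply builds a request URL from these parameters. For example, a search for "how to rank instagram" from the US in English hits an endpoint along these lines:
https://google-search3.p.rapidapi.com/api/v1/search?q=how+to+rank+instagram&country=US&language=lang_en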
This method works fine for a small list of keywords. However, if you have a large list of websites or keywords to search, the manual process quickly becomes tedious.

Programmatically (With Python)

We will be using Python 3 to build our SERP tool, so first make sure you have Python 3 installed.
Next, we need Python's built-in urllib module to call the Google Search API service, along with json to parse the response.
import urllib.parse    # to encode the query parameters
import urllib.request  # to make the HTTP request
import json            # to parse the API response
Update the script once you have subscribed to the Google Search API service and received an access key. The API service expects the x-rapidapi-key header on every request.
# change this to your API access key
ACCESS_KEY = "gf4ak6ynirrqyfk5lox7yldzhf9z98cp4ffk9fkk5h5di0lugl"
# change this to the keywords of interest
KEYWORD = "how to rank instagram"
# change this to the website of interest
WEBSITE = "http://website.to.find"
The API requires at least the q parameter; that is where we put the keywords we want to search for.
country = "US"
language = "lang_en"
data = {
    "q" : KEYWORD,
    "country" : country,
    "language" : language
}

# URL-encode the query parameters
urlencoded = urllib.parse.urlencode(data)
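If you print urlencoded at this point, you should see the three parameters joined into a single query string:
q=how+to+rank+instagram&country=US&language=lang_en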
We will make the request with urllib.request. The API service returns the results in JSON format.
url = "https://google-search3.p.rapidapi.com/api/v1/search?" + urlencoded
headers = {
    'x-rapidapi-host': "google-search3.p.rapidapi.com",
    'x-rapidapi-key': ACCESS_KEY
}

# create a GET request with the url and headers
req = urllib.request.Request(url, headers=headers)
resp = urllib.request.urlopen(req)
# read the results and parse the JSON
content = resp.read()
results = json.loads(content)
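If you are unsure what the parsed response looks like, a quick way to check is to pretty-print it before going further:
# optional: pretty-print the parsed response to inspect its structure
print(json.dumps(results, indent=2))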
Now, to find out where the website ranks within the results, loop through them and check whether the website appears in each result's link.
found_in_results = False # keep track if we found the website
for rank, result in enumerate(results, start=1): # ranks start at 1, not 0
    link = result['link']
    if WEBSITE in link:
        print("Found website at rank: " + str(rank))
        found_in_results = True
        break # stop at the first (highest) ranking match

if not found_in_results:
    print("Could not find the website in search results.")
That is it. Make sure to change the API key, keyword, and website before running the script. Then run the following command:
python3 serp_tool.py
This script is deliberately simple and does no error handling, but it should get you started with writing your own SERP tool. You can clone or download the entire script over at the git repo.
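As a first improvement, here is a minimal sketch of the same request with basic error handling, assuming the url and headers variables defined above:
import urllib.error

try:
    req = urllib.request.Request(url, headers=headers)
    resp = urllib.request.urlopen(req, timeout=10)
    results = json.loads(resp.read())
except urllib.error.HTTPError as err:
    # the server answered with an error status, e.g. 401 for a bad API key
    print("HTTP error: " + str(err.code))
except urllib.error.URLError as err:
    # the request never reached the server, e.g. no network connection
    print("Connection error: " + str(err.reason))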
With the basic plan, you can perform 50 free searches a day. If you have a large list of websites or keywords, I highly suggest upgrading to the pro plan, which offers unlimited searches.
Previously published at https://blog.goog.io/web/scraping/2019/12/30/creating-a-serp-checker.html
