It seems Google will never tell us the specific algorithms it uses to rank a given webpage. However, Google has acknowledged that its ranking systems draw on roughly 200 signals.
That leaves us with only two ways to understand ranking: context and clues.
In SEO, context helps us understand why a page outranks ours in the SERPs (Search Engine Result Pages). Whenever our rankings rise, or a competitor's do, there is usually a reason behind the change.
This can be the result of building and earning new backlinks, making changes to the metadata, or improving internal website links.
Or it could be something we have not yet discovered.
Clues, on the other hand, help us understand what we need to do to earn that much-needed favor from Google. Through clues, we can better understand the types of pages Google wants to rank, and how Google interprets the intent behind the keywords associated with those pages.
What better way to do this than by analyzing SERPs?
There are many ways to analyze SERPs, but according to zenserp.com, the best way to start is to take a look at custom extractions.
What are custom extractions?
Custom extractions are crawler configurations that identify and pull out a specific element from a webpage, and they range from simple to detailed.
If we want to understand SERPs, we point one at the results pages themselves.
So that begs the question: how do you easily scrape SERPs?
Well, for starters, we believe the most efficient way to begin is to scrape the page titles and meta descriptions straight from Google's results.
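As a minimal sketch of what "scrape the title and meta description" means in practice, the snippet below parses an HTML page with Python's standard-library `html.parser` and collects those two elements. The class name and the sample HTML are our own illustration; note also that Google often blocks automated requests, so in real use you would feed this parser HTML you have already fetched through a permitted route (such as an API):

```python
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Collects the <title> text and the meta description of a page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            attr = dict(attrs)
            if attr.get("name", "").lower() == "description":
                self.description = attr.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page HTML, standing in for a fetched result.
html = (
    "<html><head><title>Best Marketing Tools</title>"
    '<meta name="description" content="A roundup of marketing tools.">'
    "</head><body></body></html>"
)

parser = TitleMetaParser()
parser.feed(html)
print(parser.title)        # Best Marketing Tools
print(parser.description)  # A roundup of marketing tools.
```

The same parser can be fed each result page in your URL list, accumulating a title/description pair per page.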
You start by pulling together a list of SERP URLs that you want to crawl. These are the URLs that Google displays after a query has been entered.
There are a few ways to compile these URLs, and here is one of them.
Start with a simple Excel formula: ="https://www.google.com/search?q="&SUBSTITUTE(A3," ","+"). In our case, A3 is the cell that contains your keyword (example: tools).
The SUBSTITUTE function replaces every space with a +, so multi-word keywords work without any extra steps.
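The same transformation the Excel formula performs can be sketched in Python; the function name here is our own, chosen for illustration:

```python
def serp_url(keyword):
    # Mirror the Excel formula: replace spaces with "+" and
    # prepend the Google search endpoint.
    return "https://www.google.com/search?q=" + keyword.replace(" ", "+")

print(serp_url("best marketing tools 2019"))
# https://www.google.com/search?q=best+marketing+tools+2019
```

Mapping this function over a column of keywords gives you the same URL list the spreadsheet would.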
To scrape more than 10 results, append &num=20 to the SERP URL. The number after = sets how many top results are returned for the query: for 10 results, put 10 after =; for three results, put 3. For example, if you want the top 5 results for "best marketing tools 2019", append &num=5. It's that simple.
There is one more parameter worth adding: a country restriction, useful if you want to scrape SERPs from multiple countries.
To do that, append &cr=countryXX to your search URL, where "XX" is the two-letter code of the country you want results for: UK for the United Kingdom, US for the United States, and so on.
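The num and cr parameters described above can be appended programmatically. A minimal sketch in Python (the function name and defaults are our own illustration):

```python
def serp_url_with_params(keyword, num=None, country=None):
    # Base query, with spaces replaced by "+".
    url = "https://www.google.com/search?q=" + keyword.replace(" ", "+")
    if num is not None:
        # Cap the number of results returned for the query.
        url += "&num=" + str(num)
    if country is not None:
        # Restrict results to a country, e.g. "US" or "UK".
        url += "&cr=country" + country
    return url

print(serp_url_with_params("best marketing tools 2019", num=5, country="US"))
# https://www.google.com/search?q=best+marketing+tools+2019&num=5&cr=countryUS
```

Leaving both keyword arguments off reproduces the plain SERP URL from the earlier formula.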
This is one way to get more detailed SERP scraping. As we mentioned before, there is no definitive answer and Google will never confirm one, so what works best for you might not work for someone else.