
The Ultimate Guide to Google Search Console

Google Search Console, formerly known as Google Webmaster Tools, is an essential account for website and blog owners. It is vital for the SEO process because it helps webmasters understand how Google views their website or blog. Google provides several data points and recommendations for solving issues you may not have even known existed.

Setting Up Google Search Console Account

If you already have an existing Google account for Gmail or Analytics, you just need to log in to Google Search Console.

If your website runs on WordPress and uses the Yoast SEO plugin, setting up Google Search Console becomes much easier. The Yoast plugin helps you connect and verify your website or blog with Google so that you can start collecting data in Google Search Console.

If you are not using WordPress, there are a few other ways to add and verify your website with Google. The easiest method is to verify through Google Analytics, but if you haven't installed Analytics, you can copy the provided meta tag and place it in the header of your index file. You can also download the verification file from Google Search Console and upload it to the root directory of your website or blog.
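For example, the meta-tag method works by Google fetching your home page and looking for its token inside the head of the HTML. A quick sanity check that the tag actually made it into your markup can be scripted with Python's standard library; this is a minimal sketch, and the token value is a placeholder, not a real verification code:

```python
from html.parser import HTMLParser

class VerificationTagFinder(HTMLParser):
    """Collects the content of any google-site-verification meta tag."""
    def __init__(self):
        super().__init__()
        self.tokens = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name") == "google-site-verification":
                self.tokens.append(d.get("content"))

# Hypothetical page source; in practice you would fetch your live home page.
html = """<html><head>
<meta name="google-site-verification" content="EXAMPLE_TOKEN">
</head><body></body></html>"""

finder = VerificationTagFinder()
finder.feed(html)
print(finder.tokens)
```

If the list comes back empty, the tag was stripped (by a theme, a cache, or a minifier) and verification will fail.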


Getting Started with Google Search Console:

  • Add & Verify the Property
  • Link Google Analytics Account to Search Console
  • Submit XML Sitemap
  • Validate Robots.txt File

Things that need regular maintenance:

  • Crawl Errors
  • Structured Data Review
  • Review XML Sitemap
  • HTML Improvements Suggestions
  • Inbound Links

Inside Google Search Console:

Google Search Console provides a wealth of information and suggestions that make it easier for you to make your website search-engine friendly. Let's have a look at the individual sections of Google Search Console:

Search Appearance: This section of Google Search Console provides information on how your website is crawled by Google and how it appears in the SERPs.

Structured Data: Structured data uses the Schema.org vocabulary to tag the content on your website. These tags help Google better understand your content and organize how it is displayed in search results. Structured data can describe things like product or service ratings, prices, and locations.
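As an illustration, a structured-data payload for a product with a price and a rating might look like the following. This is a minimal sketch that builds the JSON-LD with Python's standard library; the product name and all values are hypothetical:

```python
import json

# A minimal schema.org Product snippet (illustrative values only) that could
# be embedded in a page inside a <script type="application/ld+json"> tag.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}

print(json.dumps(product, indent=2))
```

Once markup like this is live, the Structured Data section reports whether Google could parse it and flags any missing required fields.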

Rich Cards: Rich cards provide a more detailed presentation of your website's data. They are designed to enhance standard search results with a more visual preview of the page content. Rich cards can be used for products, recipes, events, and more.

Data Highlighter: Data Highlighter is designed to help you tag structured data without editing your site's code. You use it to highlight the structured data on your pages so that Google can better understand them.

HTML Improvements: HTML Improvements provides a list of recommended improvements to your HTML metadata, such as meta titles and descriptions. This section helps you optimize the meta title and description of the respective pages of your website.
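The same checks this report performs (missing, duplicate, or overly long titles and descriptions) are easy to run yourself before Google flags them. The length limits below are common industry guidelines rather than official Google numbers, and the `audit_meta` helper is a hypothetical sketch:

```python
# Common (unofficial) guidelines: titles around 50-60 characters and
# descriptions around 150-160 characters before truncation in results.
TITLE_MAX, DESC_MAX = 60, 160

def audit_meta(title: str, description: str) -> list:
    """Return human-readable warnings for empty or overly long meta fields."""
    warnings = []
    if not title:
        warnings.append("missing title")
    elif len(title) > TITLE_MAX:
        warnings.append(f"title is {len(title)} chars (> {TITLE_MAX})")
    if not description:
        warnings.append("missing description")
    elif len(description) > DESC_MAX:
        warnings.append(f"description is {len(description)} chars (> {DESC_MAX})")
    return warnings

print(audit_meta("An Ultimate Guide on Google Search Console", ""))
```

Running a check like this over every page of a site catches problems at publish time instead of weeks later in the report.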

Accelerated Mobile Pages: AMP is an open-source project dedicated to speeding up websites on mobile devices. AMP is a way to make your web pages more easily accessible on mobile devices. In order for AMP to work correctly, you need to create valid AMP pages with the appropriate schema markup and make sure they are properly linked to your website.

Search Traffic Section:

The Search Traffic section in Search Console displays all the activity taking place around your website in search. This includes rankings, backlinks, manual actions, and search traffic.

Search Analytics: Search Analytics is probably one of the most vital areas of Google Search Console, as it provides valuable information about your website. This area shows the top pages and keywords with their respective CTR and rankings. Search Analytics also lets you compare data over time so that you can spot trends and rectify issues as soon as possible.

Links to Your Site: Backlinks have always been an important ranking factor. This section in Google Search Console displays the backlinks that Google has found from third-party sources. To ensure the quality of your backlinks, it is recommended to monitor this section regularly.

Internal Links: Google provides a list of the most-linked content within your website or blog. This data is helpful for identifying the most important URLs and content on your site. The more internal links Google sees pointing to a page or blog post, the more important Google considers that content to be relative to the rest of your site.

Manual Actions: A manual action occurs when a human reviewer examines your website and its backlink profile and determines that pages or posts on your site do not comply with Google's Webmaster quality guidelines. This section in Google Search Console lists all known issues and provides recommendations for rectifying them. Common reasons for manual action notifications are spam, spammy backlinks, spammy markup, unnatural links, hacked websites, thin content, cloaking, and other black-hat SEO tactics.

International Targeting: This section of Google Search Console can be used to help Google better understand which country your website focuses on. If your website provides services in many different countries, you should leave this section unset. If your website serves a specific geographical location, it is recommended to use this section to help Google understand that.

Mobile Usability: Google's Mobile Usability section highlights the URLs on your website that have usability problems on mobile devices. This report gives detailed data on each issue as well as recommendations to rectify it. In recent years mobile traffic has come to dominate search, and Google has become increasingly focused on creating a better user experience on mobile devices. Because of this, website owners must pay special attention to this report and proactively work towards resolving these issues.

Google Index Section

This section of Google Search Console provides information on how your website is represented in the Google index. The index is the collection of all the pages and posts that are eligible to appear in search engine result pages.

Index Status: The Index Status report provides an overview of your indexed content and the number of pages blocked from indexing. The total number of indexed pages rarely matches the total number of pages submitted to Google, and this inconsistency on its own is no cause for alarm. Website owners should use this section to be alerted to sudden drops in the number of indexed pages; a major decrease could mean that Google is unable to reach your content.

Blocked Resources: This section highlights the website resources that are not accessible to Googlebot, which prevents Google from fully rendering and indexing the affected pages. Google does not show every blocked resource in this report, only those it believes are within your control to review.

Remove URLs: The Remove URLs tool in Google Search Console allows webmasters to temporarily remove specific pages from Google search results. A successful request only lasts about 90 days, so to remove a page permanently, webmasters should also make it return a 404 error.

Crawl:

Google uses crawlers to explore the content of websites. This crawler is known as "Googlebot". Googlebot visits a website, looks at the content, and follows the links to the respective pages.

Crawl Errors: The Crawl Errors report displays a list of errors Googlebot encountered on your website. It shows different types of errors, such as server errors, soft 404 errors, hard 404 errors, and blocked content.

Crawl Stats: The Crawl Stats report provides information on Googlebot's activity on your website over the last 90 days. This report lists the content types and elements crawled, such as CSS, Flash, PDF files, images, and JavaScript.

Fetch as Google: This tool allows webmasters to see how their pages are seen by Googlebot. You can run it to check whether Googlebot can crawl a page, see how the page is rendered, and validate that additional elements like scripts or images are accessible.

Robots.txt Tester: Robots.txt is a text file at the root directory of a website used to instruct search engine bots how to crawl the pages of the site. This tool is used to validate and test the robots.txt file on your website.
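You can run the same kind of test locally before uploading the file: Python's standard library ships a robots.txt parser. The rules below are a hypothetical example for a site that wants everything crawlable except an admin area:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for an example site.
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Check which URLs Googlebot would be allowed to fetch under these rules.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))
print(rp.can_fetch("Googlebot", "https://example.com/admin/login"))
```

A check like this catches an accidental site-wide `Disallow: /` before it ever reaches production.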

Sitemaps: This section allows you to upload and submit an XML sitemap for Google to crawl and index. XML sitemaps provide a roadmap for Googlebot to explore and index the pages of your website.
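A minimal XML sitemap can be generated with a few lines of Python's standard library; the URLs below are placeholders for your own pages:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Hypothetical list of URLs to include in the sitemap.
pages = ["https://example.com/", "https://example.com/about"]

urlset = ET.Element("urlset", xmlns=NS)
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

The resulting string, saved as `sitemap.xml` at the site root, is exactly what this section expects you to submit.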

URL Parameters: This tool in Google Search Console can be used to instruct Googlebot to crawl a preferred version of a URL, which prevents Google from crawling duplicate content within a website. In most scenarios webmasters don't need this tool, because 301 redirects or canonical tags already handle the problem.
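The duplicate-content problem these parameters cause can be seen in a small sketch: stripping parameters that only track campaigns or sessions collapses many URL variants into one canonical URL. The parameter list here is an assumption for illustration, not an official set:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of parameters that only track sessions/campaigns and
# do not change the page content; dropping them yields one canonical URL.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonicalize(url: str) -> str:
    """Strip tracking parameters so duplicate variants collapse to one URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonicalize("https://example.com/shoes?utm_source=news&color=red"))
```

Parameters that do change the content (like `color=red` above) are kept, which is the same distinction the URL Parameters tool asks you to make.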

Security Issues:

This section of Google Search Console is used to notify webmasters about security issues Google has discovered on their website. Webmasters can use this tool to get information about those issues, review them through code snippets, and request a review once any malware or hacking issue has been resolved.

Other Resources:

This section of Google Search Console contains links to many other tools, such as the Structured Data Testing Tool, Structured Data Markup Helper, Email Markup Tester, Google My Business, Google Merchant Center, PageSpeed Insights, Custom Search, Google Domains, and Webmaster Academy.
