Google holds almost 90% of the web search market, so a tool created by Google to help improve your rankings is always worth trying out. This is precisely what Google Webmaster Tools offers. Users can add up to 1,000 sites, including news and mobile sites, to a single Webmaster Tools account.
When you create an account with Google Webmaster Tools, you will be asked to verify your site to prove that you own it. This doesn’t affect the site’s PageRank in any way. Moreover, if you create blogs on Blogger, you can automatically add and verify the site after enabling Webmaster Tools from the Blogger dashboard.
Let us get into the technical details. After activating the account, you need to add your site and verify ownership by placing a verification code string in the HTML head section of your pages.
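As a sketch, the verification snippet Google supplies is a single meta tag placed in the head section; the content value below is a placeholder, and you should use the exact token shown in your own Webmaster Tools account:

```html
<!DOCTYPE html>
<html>
<head>
  <title>My Site</title>
  <!-- Verification tag supplied by Google Webmaster Tools.
       The content value here is a placeholder token, not a real one. -->
  <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
</head>
<body>
  ...
</body>
</html>
```

Once the tag is live on your home page, click “Verify” in Webmaster Tools and Google will confirm ownership.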
The first thing to do after this is to go to ‘Site Configuration’ and then ‘Settings’, where you can select your ‘Preferred Domain.’ The reason for this is to make sure that Google indexes only one version of your site, either the one with the www prefix or the one without it. This is purely a preference and has no impact on ranking. While doing this, make sure you choose a version that actually works on your site, not one that redirects to the other version.
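If your server runs Apache, one common way to make only the preferred version reachable is a 301 redirect in your .htaccess file. This is a minimal sketch assuming your preferred domain is www.example.com and mod_rewrite is enabled; swap in your own domain:

```apache
# .htaccess at the site root: send non-www requests to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A permanent (301) redirect tells Google that the two versions are the same site, so link value consolidates on the preferred one.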
At this point, no further information about your site can be gathered, so leave it there and wait. Log in again after a few days and explore the other features available.
Another important section that demands attention is ‘Crawl Errors’ under the ‘Diagnostics’ link. It identifies the problems Google faces while crawling your site. This is a good way to find issues, so try to fix them as soon as they appear, before they affect your search rankings.
After exploring ‘Crawl Errors’, take some time to explore ‘Crawl Stats’ and ‘HTML suggestions’ under the same tab. The stats show a breakdown of how frequently the Google bot visits your site. This graph should ideally be stable, if not growing. If you see a decline, something is going wrong. The more often the bot visits your site, the greater your chance of an increase in ranking.
The ‘HTML suggestions’ section reports problems with your meta tags. A large number of duplicate title tags usually creates problems. To resolve this, revise the structure of your blog so that each page has a unique, descriptive title.
Not all of these errors are serious. The “Not Found” error is a common one that appears when owners link to a wrong URL within the site. “Restricted by robots.txt” errors are caused by disallowing crawling in areas of your website. ‘Timed out’ and ‘Unreachable’ errors are the ones that need serious attention: if they come up, the Google bot is not reaching your pages the way it should, and if you get them frequently, your website might get de-indexed.
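For context, “Restricted by robots.txt” entries simply reflect the rules in your robots.txt file at the site root. The sketch below uses hypothetical paths; any URL under a Disallow rule that Google encounters will be reported this way, which is harmless if the block is intentional:

```txt
# robots.txt at the site root (example paths, adjust to your own site)
User-agent: *
# Blocked on purpose: crawl attempts here show as "Restricted by robots.txt"
Disallow: /admin/
Disallow: /tmp/
```

So before treating these entries as errors, check whether the blocked areas are ones you actually want kept out of the index.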
These are the key sections of Google Webmaster Tools related to your website’s performance on Google, so it’s important to check them at least once a month. There are many more interesting sections you can explore and experiment with once you get the hang of it.