Google Webmaster Tools: What services do they offer?
The Google Webmaster Tools provide valuable help in optimizing a website's user-friendliness and its Google ranking by supplying data on crawling, indexing and page views.
To use these services you need to register for the Webmaster Tools, and the website in question must be verified.
There are four categories in which the Google Webmaster Tools deliver important information and thereby support the optimization of a website:
1. Website configuration
XML sitemaps, crawler access, sitelinks, change of address, settings
2. Your website on the web
Queries, links, keywords, subscriber statistics
3. Diagnostics
Malware, crawling errors, crawling statistics, Fetch as Googlebot, HTML suggestions
4. Labs
Site performance, video sitemaps
Website configuration
By submitting an XML sitemap you inform Google about your website's page structure and can thereby speed up the discovery of new pages. This makes sense even if your website has already been indexed. If you want to move your website to a new domain at some point, you can inform Google about the new URL with the „Adressänderung“ (change of address) tool so that the index is updated quickly.
Pages that should not be found and indexed by the Google search robots can be excluded in a so-called „robots.txt“ file; the search robots will then not access these pages. This is useful, for example, if PDF files or pages containing personal data (such as the website's contact page) should not be indexed.
The so-called sitelinks help users navigate your website by surfacing relevant content faster. Google displays sitelinks automatically within the search results for your website if its structure appears suitable. You do have the option to block individual sitelinks if they seem inappropriate.
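As an illustration, a minimal robots.txt covering the kinds of exclusions mentioned above might look like the following sketch. The paths are hypothetical examples, not taken from any real site:

```
# Hypothetical robots.txt sketch -- the paths below are illustrative only
User-agent: *
# Keep PDF downloads out of the crawl
Disallow: /downloads/pdf/
# Keep the contact page (personal data) out of the crawl
Disallow: /contact.html
```

For the search robots to find it, the file must be placed in the website's root directory (e.g. example.com/robots.txt).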
Your website on the web
This area provides a great deal of information about how your website is found on the internet. This includes data on the number of queries, page impressions and clicks over a definable period of time, on the most frequent keywords Google found while crawling, on internal and external links pointing to your website, and subscriber statistics. The latter show how many Google users have subscribed to your feeds via a Google product (such as Google Reader, iGoogle or Orkut). This analysis area offers very interesting potential for optimizing your website: for example, you should ask yourself whether Google actually finds the keywords you consider relevant for your website, or whether your most-linked content is really the content that matters most to your target group.
Diagnostics
The diagnostics area helps you detect crawling problems, content problems and malware on your website, and thus helps to keep the site intact, user-friendly and search-engine optimized. Under „Malware“ (malicious software) you are notified if a page was hacked or infected by a virus. If this happens, you will also find information there on how to clean a hacked website and how to avoid future infections. If Google detected problems while crawling your website, these are reported under „crawling errors“. These can be URLs that were changed without a 301 redirect being set up, or that are no longer accessible. All crawling activities of the Googlebot, for example the number and loading time of crawled pages, are presented as statistics. If you want to know how your pages appear to Google, „Fetch as Googlebot“ lets you view a page just as the Googlebot does. This is useful for finding out why a page performs poorly in the search results, or for identifying problematic pages if the site has been hacked. „HTML suggestions“ notifies you of problems with page titles (for example missing or repeated titles), meta descriptions, or content Google cannot identify, such as videos, images or rich-media files.
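The „changed URL without a 301 redirect“ class of crawling errors can be avoided by setting up a permanent redirect on the web server. A minimal sketch for an Apache .htaccess file, with purely hypothetical paths:

```
# Hypothetical .htaccess sketch -- paths are illustrative only
# Permanently redirect a moved page so crawlers follow to the new URL
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The 301 status code tells the Googlebot that the move is permanent, so the index can be updated to the new URL instead of reporting the old one as an error.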
Site performance
Here, statistics on the average loading time of the website are displayed. Since fast websites increase user satisfaction, this point deserves attention. However, it is important to note that a large amount of data is needed for a reliable assessment, which is hard to achieve for smaller websites with few visitors and can therefore lead to distortions.
If all the tools listed here are taken into account and implemented, the user-friendliness, and with it the access rates of your website, will most likely increase.