Google’s Webmaster Tools: Do They Help or Hinder Search Engine Optimization?
Before 2005, Google and webmasters were adversaries, each trying to outmaneuver the other: webmasters to achieve better rankings, Google to deliver fairer, more reliable results for searchers. With the exception of the occasional comment by GoogleGuy on the forums or Matt Cutts at an industry show, there was virtually no dialogue between the search giant and those who created and marketed web sites.
That all changed in the summer of 2005, when Google started its sitemaps program, which has since evolved into Webmaster Central. Today, that organization comprises hundreds of individuals around the world working on webmaster relations, webmaster problems and webmaster tools. It’s the de facto model followed by Microsoft and Yahoo!, and in many ways it epitomizes the legitimacy that search engine optimization has achieved since its days as a thorn in the side of search engines.
However, there is one group that is very angry with what Google has built and desperate to stop the progress of the Webmaster Central programs - in particular the parts that require site owners to verify their domains with Google. Who are they? Google’s competition: other search engines and SEO experts working in 'stealth mode'.
From the beginning of the web until about 2006, any company that had the funding and wanted to build a web search engine had everything it needed at its disposal. Despite the massive technical and financial requirements, nothing stood in the way of a creative group crawling the web, building an index from that data, and devising an algorithm to rank results. It was all there on the Internet, just waiting to be fetched. Google changed that with the introduction of Sitemaps and the later growth of Webmaster Central.
Now there’s a wealth of data that no startup engine could access without hacking Google’s servers. Information like:
* Sitemaps - The lifeblood of many sites’ crawlability and accessibility. Data about URL canonicalization and even URL exclusion is now available exclusively to search engines that receive the sitemap. Even Yahoo! and Microsoft are severely disadvantaged, as webmasters are less likely to submit sitemaps to them.
* Geo-targeting information - Google allows you to set geography and language for subdomains, subfolders, or entire sites inside the Webmaster Console. That's fantastic for websites and SEOs, but it gives Google a clear competitive advantage over any new player wishing to enter the search space.
* Crawl Rate Control - For sites that want to allow faster crawling, or those that need to reduce the demand on their servers, crawl control is an option inside Webmaster Tools - and another piece of information that reinforces Google’s exclusive access to site data.
* Domain preference - Even though it’s a simple thing, the ability to set a domain preference (url.com vs. www.url.com) gives Google a decided advantage.
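To make the first point concrete, here is a minimal sketch of the kind of sitemap file webmasters generate and submit through the program. The sitemaps.org 0.9 protocol and its `urlset`/`url`/`loc`/`lastmod` elements are real; the domain and dates are placeholder values, and the Python helper is purely illustrative:

```python
# Illustrative sketch: building a minimal sitemap per the sitemaps.org
# 0.9 protocol. The example.com URLs and dates are placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML for a list of (loc, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc       # the page's URL
        ET.SubElement(url, "lastmod").text = lastmod  # last-modified date
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("http://www.example.com/", "2007-06-01"),
    ("http://www.example.com/about", "2007-05-15"),
])
print(sitemap)
```

The point of the article stands out in the code itself: the list of URLs, their freshness dates, and by extension which URLs are omitted are handed directly to whichever engine receives the file, rather than being discovered by crawling.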
Any piece of information that’s submitted behind a verification protocol, rather than openly accessible by crawling, is going to hinder competition and help reinforce the market leader’s domination. Suddenly, in the last two years, the barriers to entry for building an effective web-wide search engine have skyrocketed.