An Introduction to How Google 'Crawls' the Net and Indexes Your Website


by Don Rev - Date: 2010-09-27

There are various reasons why companies want their websites to appear at the top of SERPs (search engine results pages), and there are various ways they can get there. A website's ranking in Google search results is based partly on an analysis of the sites that link back to it; the quantity, quality, and relevance of those links all count towards the rating. But how does Google collate this data?
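Google's exact ranking formula is proprietary, but the original PageRank idea, where a link acts as a vote whose weight depends on the voter's own importance, gives a feel for link-based scoring. The Python sketch below is a deliberately simplified illustration, not Google's actual algorithm; the tiny link graph and the damping factor of 0.85 are made-up example values.

```python
# Illustrative sketch of PageRank-style link scoring, NOT Google's
# actual ranking algorithm (which is proprietary and far more complex).
# The tiny link graph below is invented for the example.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to.
    This simplified version ignores dangling-node handling."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        # every page gets a small baseline share of rank...
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        # ...plus a share of the rank of every page that links to it
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Notice that c.com, which two pages link to, ends up scoring higher than a.com or b.com: more incoming links from well-ranked pages mean a higher score, which is the intuition behind the quantity-and-quality point above.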

Google uses a program called Googlebot to 'crawl' and index the web. Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. Google uses thousands of machines to crawl billions of pages. Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site. The crawl process begins with a list of webpage URLs, generated from previous crawls and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these pages, it detects the links on them (SRC and HREF attributes) and adds them to its list of pages to crawl, as sketched in the example below. New sites, changes to existing sites, and dead links are all noted and used to update the Google index.
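To make the frontier-and-queue idea concrete, here is a toy Python crawler: it starts from seed URLs, fetches each page, extracts HREF and SRC links, and queues any it hasn't seen. It is only a sketch of the basic loop; the real Googlebot is a distributed system with scheduling, politeness, and deduplication far beyond this, and the example.com seed is a placeholder.

```python
# A toy illustration of the crawl loop described above: start from seed
# URLs, fetch each page, extract HREF/SRC links, and queue new ones.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the HREF and SRC attributes the article mentions."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def crawl(seed_urls, max_pages=10):
    frontier = deque(seed_urls)   # URLs waiting to be fetched
    seen = set(seed_urls)         # avoids re-queueing known pages
    crawled = 0
    while frontier and crawled < max_pages:
        url = frontier.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # dead link: a real indexer would note and drop it
        crawled += 1
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)
        print(f"crawled {url}, frontier size {len(frontier)}")

crawl(["https://example.com/"])  # placeholder seed URL
```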

For most sites, Googlebot shouldn't access your server more than once every few seconds on average, although network delays can make the rate appear slightly higher over short periods. In general, Googlebot downloads only one copy of each page at a time; if you see it downloading a page multiple times, that is probably because the crawler was stopped and restarted. Googlebot was designed to be distributed across many machines so that it can scale as the web grows, and to cut down on bandwidth usage, Google runs many crawlers on machines located near the sites they are indexing. Google's stated aim is to crawl as many pages from your site as it can on each visit without overwhelming your server's bandwidth.
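The 'once every few seconds' pacing can be illustrated with a simple per-host politeness delay. The sketch below assumes a fixed three-second budget per host; Google's actual crawl scheduling is adaptive, so treat the numbers and function names as placeholders for the idea.

```python
# A sketch of a per-host politeness delay. The three-second figure is an
# assumed stand-in for the article's "once every few seconds"; real crawl
# scheduling is adaptive and much more sophisticated.
import time
from urllib.parse import urlparse

MIN_DELAY_SECONDS = 3.0   # assumed per-host request budget
last_fetch_time = {}      # host -> timestamp of our last request to it

def wait_for_turn(url):
    """Sleep until at least MIN_DELAY_SECONDS since the last hit on this host."""
    host = urlparse(url).netloc
    now = time.monotonic()
    earliest = last_fetch_time.get(host, 0.0) + MIN_DELAY_SECONDS
    if now < earliest:
        time.sleep(earliest - now)
    last_fetch_time[host] = time.monotonic()

# Usage: call wait_for_turn(url) immediately before each fetch, so that
# consecutive requests to the same host are spaced out while requests to
# different hosts can proceed without waiting on each other.
```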

Google gives a lot of advice to help web developers earn better ranking positions. On its Webmaster Central pages, Google states: 'The best way to get other sites to create relevant links to yours is to create unique, relevant content that can quickly gain popularity in the Internet community. The more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it. Before making any single decision, you should ask yourself the question: Is this going to be beneficial for my page's visitors? It is not only the number of links you have pointing to your site that matters, but also the quality and relevance of those links. Creating good content pays off: links are usually editorial votes given by choice, and the buzzing blogger community can be an excellent place to generate interest.'

Related Tags: seo, search engine optimization, web development, social media marketing


© The article above is copyrighted by its author. You're allowed to distribute this work according to the Creative Commons Attribution-NoDerivs license.
 
