Accommodating Bots, Crawlers and Spiders On Your Website
Every day, countless bots, spiders and crawlers traverse the internet on behalf of the major search engines, collecting information about each site and page they come across. Your site – especially if it has many backlinks – may be crawled several times per day by bots looking for any changes you have made. If you want to take advantage of this constant patrolling, you will want to make sure that your site is optimized for their presence. These tips will help you improve your site’s indexing potential and get it properly indexed.
The First Three Sentences Are Crucial
On each page, you should have a page title, an article title and page content. One of the best ways to keep bots, spiders and crawlers comfortable on your pages is to use titles and headings that are relevant to the rest of your site. Spiders and bots have become increasingly intuitive and look for matching content sections on each page. You will also want to make sure that the first instance of each of your keywords appears in the first two or three sentences of the page. Your article or page title should also carry emphasis (<em>, <h5>, etc.) so that search engines index it properly along with the rest of the page.
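As a rough way to check this, a short script can test whether each keyword appears within the opening sentences of your copy. The page text, keyword list and the `keywords_in_opening` helper below are all illustrative assumptions, not part of any particular SEO tool:

```python
import re

def keywords_in_opening(text, keywords, sentence_count=3):
    """Report whether each keyword appears in the first few sentences.

    Splits on sentence-ending punctuation, joins the opening window,
    and does a case-insensitive substring check for each keyword.
    """
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    opening = " ".join(sentences[:sentence_count]).lower()
    return {kw: kw.lower() in opening for kw in keywords}

page = ("Our bakery ships fresh sourdough bread nationwide. "
        "Every loaf is baked to order. "
        "Read on for our full story and menu. "
        "We started in 2005 in a small kitchen.")

print(keywords_in_opening(page, ["sourdough bread", "menu", "gluten-free"]))
# → {'sourdough bread': True, 'menu': True, 'gluten-free': False}
```

If a keyword comes back False, its first instance sits too deep in the page and the opening copy should be reworked.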
Have A Site Map
Especially useful if your site is new, a site map gives search bots and crawlers a way to find every page on your website. At first, some of your pages may not be discoverable by spiders and crawlers because they lack backlinks and internal links, so a site map collects links to all of your pages in one easy-to-find place. Through the formatting of the site map, you can also indicate to the bots which pages are primary, which are secondary and which are tertiary. Site maps work best on websites with 50 pages or fewer.
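One standard way to express that primary/secondary/tertiary ranking is the <priority> field from the sitemaps.org XML sitemap protocol. A minimal sketch, with invented example URLs and priority values:

```python
# Generate a small sitemap.xml; URLs and priorities are illustrative.
from xml.etree.ElementTree import Element, SubElement, tostring

pages = [
    ("https://example.com/", 1.0),             # primary page
    ("https://example.com/products", 0.8),     # secondary page
    ("https://example.com/blog/post-1", 0.5),  # tertiary page
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, priority in pages:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = loc
    SubElement(url, "priority").text = f"{priority:.1f}"

print(tostring(urlset, encoding="unicode"))
```

Saving that output as sitemap.xml at your site root (and submitting it to the search engines) gives crawlers one place to find every page.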
Small Edits
Crawlers evaluate your site in its entirety, so it is vital to make sure your site has no characteristics that could lead to a downgraded PageRank or other rating. Two of the most common ways to avoid being perceived as a spammer are to reduce the number of dashes in your URLs (common in blog posts) and to keep your keyword density at approximately 2%. When creating anchor links on your site, for example, use keywords in the hyperlink text instead of “click here” or “visit this page”. Finally, be sure to rewrite the descriptions of any products or services you offer; if you are selling a product that a thousand other websites also sell, do not reuse the generic item description.
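That 2% figure is easy to sanity-check yourself. The product copy and the `keyword_density` helper below are illustrative assumptions; here the phrase appears twice in 29 words, well above the 2% target:

```python
import re

def keyword_density(text, keyword):
    """Rough keyword density: keyword occurrences / total word count."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return hits / len(words) if words else 0.0

copy = ("Handmade leather wallets built to last. Our leather wallets "
        "are cut, stitched and finished by hand in our own workshop, "
        "and every wallet ships with a lifetime repair guarantee.")

print(f"{keyword_density(copy, 'leather wallets'):.1%}")
# → 6.9%
```

At 6.9% this copy would read as stuffed; trimming repetitions until the figure approaches 2% keeps it natural for both readers and crawlers.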
See Your Site Through Their Eyes
Pingler has released its Spider Viewer Tool, which gives you first-hand access to what search engine spiders are seeing. It strips away markup such as HTML and shows you the page as a spider does: content and keywords only. This lets you gauge the extent of your keyword usage and see whether excess or unnecessary content is being indexed by the major search engines.
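The core idea behind a spider view can be sketched in a few lines with Python’s built-in HTMLParser: drop the markup, skip script and style blocks, and keep only the indexable text. The sample page is invented for illustration; this is a simplified sketch of the concept, not Pingler’s actual tool:

```python
# Strip markup to approximate what a search spider indexes.
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    SKIP = {"script", "style"}  # tags whose contents spiders ignore

    def __init__(self):
        super().__init__()
        self.text = []
        self._skipping = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skipping += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skipping:
            self._skipping -= 1

    def handle_data(self, data):
        if not self._skipping and data.strip():
            self.text.append(data.strip())

html = """<html><head><title>Fresh Sourdough Bread</title>
<style>h1 { color: red; }</style></head>
<body><h1>Fresh Sourdough Bread</h1>
<script>trackVisit();</script>
<p>Baked daily and shipped nationwide.</p></body></html>"""

viewer = SpiderView()
viewer.feed(html)
print(" ".join(viewer.text))
# → Fresh Sourdough Bread Fresh Sourdough Bread Baked daily and shipped nationwide.
```

Reading your pages this way quickly reveals whether the text that survives is the text you actually want indexed.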