This is a guest post by Ben Hook, a search marketer and owner of Navaro, a UK-based SEO company that helps clients increase their online visibility.

When optimising a site you'll most likely need to make some on-site changes if it hasn't been optimised before. But with sites that have already achieved and secured really good rankings, there is a lot more at stake than with a site that has previously been languishing in the supplementals.

When making changes to these kinds of sites, it's much more important to know the effect of each change you make, so that you keep control and avoid unpleasant surprises.

The issue is that the frequency with which search engines crawl your website can vary dramatically, so it could be weeks before you can see the effect the changes you just made have had on your positions.

By increasing the crawl rate you can see the effects of changes, and resolve any problems, much more quickly. There are a few ways to increase crawl frequency:

Increase the PR of the website

High-PR websites are often crawled more frequently. Links from regularly crawled, high-PR websites will help increase the frequency with which Google visits your website.

Integrate some regularly updated content

Blogs are a good example of this. Search engines will often index new posts almost instantly, and if the homepage contains snippets of the latest blog posts, it too is often crawled regularly.

For a lot of websites, regularly updating the homepage may seem tricky, so consider alternatives: embed a Twitter feed, or a widget that displays the first paragraph of your latest news article. A few small pieces of fresh content will show the search engines that your website changes frequently, so they should come back to visit more often.
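As a sketch of the widget idea above, here is a minimal Python example that pulls the first paragraph of the most recent article and renders a small HTML snippet for the homepage. The post structure (date, title, body tuples) and the markup are hypothetical, not a specific CMS's API:

```python
# A minimal "latest post" homepage widget sketch. Posts are assumed to be
# (date, title, body) tuples with paragraphs separated by blank lines.

def first_paragraph(body):
    """Return the first non-empty paragraph of a post body."""
    for para in body.split("\n\n"):
        if para.strip():
            return para.strip()
    return ""

def latest_post_snippet(posts):
    """Render an HTML snippet for the most recent post."""
    date, title, body = max(posts, key=lambda p: p[0])
    return '<div class="latest-post"><h3>%s</h3><p>%s</p></div>' % (
        title, first_paragraph(body))

posts = [
    ("2010-01-05", "Old news", "First paragraph.\n\nSecond paragraph."),
    ("2010-02-10", "Fresh news", "Breaking update.\n\nMore detail follows."),
]
print(latest_post_snippet(posts))
```

Regenerating this snippet whenever a post is published keeps the homepage changing without any manual editing.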

Reliable server

A reliable server helps avoid downtime when the search engines come to visit; if crawlers regularly hit errors or timeouts, they may come back less often.
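If you want to keep an eye on reliability yourself, a minimal uptime check in Python might look like the sketch below. The URL is a placeholder, and in practice you would run this on a schedule and alert on failures; this is an illustration, not a monitoring product:

```python
# A minimal server health-check sketch: returns True only if the URL
# answers with HTTP 200 within the timeout, False on any error.
import urllib.request

def is_healthy(url, timeout=5):
    """Return True if the URL responds with HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False
```

Logging the results over time gives you a simple picture of whether crawlers are likely to be hitting errors.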


Upload sitemaps

A lot of people report an increased crawl rate after uploading sitemaps. Both XML and HTML sitemaps can help with crawling and indexing.
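As an illustration of the XML format, here is a minimal Python sketch that builds a sitemap following the sitemaps.org protocol from a flat list of URLs (the URLs are placeholders; a real sitemap can also carry optional tags such as `<lastmod>`):

```python
# A minimal XML sitemap generator sketch (sitemaps.org protocol).
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return an XML sitemap document listing the given URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # Escape &, < and > so URLs with query strings stay valid XML.
        lines.append('  <url><loc>%s</loc></url>' % escape(url))
    lines.append('</urlset>')
    return "\n".join(lines)

print(build_sitemap(["http://www.example.com/", "http://www.example.com/blog/"]))
```

The generated file is what you would upload and then submit through the search engines' webmaster tools.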

Avoid canonical / duplicate issues

If the search engines find they keep retrieving the same content on different URLs, they may restrict the amount of time spent on the website. Crawling uses resources, and search engines don't want to waste them on duplicate content. In any case, sorting out duplicate content issues on a website has major SEO advantages of its own.
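One way to spot these issues is to normalise your URLs and see which variants collapse to the same page. Below is a minimal Python sketch; the normalisation rules (lowercasing, dropping the default port and trailing slash) are illustrative rather than exhaustive, and the actual fix is usually a 301 redirect to a single canonical URL:

```python
# A minimal URL-normalisation sketch for spotting duplicate-content URLs.
from urllib.parse import urlsplit, urlunsplit

def normalise(url):
    """Map common URL variants (case, default port, trailing slash) to one form."""
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()
    if netloc.endswith(":80"):
        netloc = netloc[:-3]          # drop the default HTTP port
    path = path.rstrip("/") or "/"    # treat /page and /page/ as one URL
    return urlunsplit((scheme.lower(), netloc, path, query, ""))

urls = ["http://Example.com/page/", "http://example.com:80/page",
        "http://example.com/page"]
groups = {}
for u in urls:
    groups.setdefault(normalise(u), []).append(u)
print(groups)  # all three variants collapse to a single canonical key
```

Running something like this over a crawl of your own site quickly shows which pages are reachable under several addresses.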

Reduce page size

Not only is page speed becoming more of a ranking factor; the quicker a page loads, the better for the crawl rate. If your pages are heavy to download, the search engines will only visit more regularly if they consider the site important (high PR). Reducing page size makes the website faster and uses less bandwidth, which can mean an increased crawl rate.
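To see why size matters, the quick Python sketch below shows how much gzip compression (which most web servers can apply to responses) shrinks a repetitive HTML page. The HTML here is a stand-in for a real page:

```python
# A quick demonstration of how gzip compression shrinks repetitive HTML.
import gzip

html = ("<html><body>" +
        "<p>Repeated boilerplate markup.</p>" * 200 +
        "</body></html>").encode("utf-8")
compressed = gzip.compress(html)
print("original: %d bytes, gzipped: %d bytes" % (len(html), len(compressed)))
```

Markup-heavy pages compress very well, so enabling compression on the server is one of the cheapest ways to cut the bytes a crawler has to download.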

Adjust the crawl rate in Webmaster Tools if crawling is causing issues

If crawling is using up a lot of your resources and causing issues on your side, you can set the speed at which Google crawls your website in Google Webmaster Tools. This can help avoid the problems that sometimes come with search engines visiting your website.

By monitoring your crawl rate in Google Webmaster Tools, you should be able to see what effect these changes have had.