When optimising a site you will most likely have to make some on-site changes if it hasn't been optimised before. With a site that has already achieved and secured some really good rankings, however, there is a lot more at stake than with one that has been languishing in the supplemental results.
When making changes to these kinds of sites it's much more important to know the effect of each change you make, so that you keep control and reduce the risk.
The problem is that the frequency with which search engines crawl your website can vary dramatically, so it could be weeks before you can see the effect the changes you just made have on your positions.
By increasing the crawl rate you can see the effects of changes, and react to them, a lot more quickly. There are a few ways to help increase the crawl frequency:
Increase the PR of the website
High PR websites are often crawled more frequently. Links from regularly crawled / high PR websites will help increase the frequency with which Google visits your website.
Integrate some regularly updated content
Blogs are a good example of this. Search engines will often index new posts almost instantly and if the homepage contains snippets of the latest blog posts this is often crawled regularly.
For a lot of websites regularly updating the homepage may seem tricky, but why not consider some alternatives? Place a Twitter feed or a widget that displays the first paragraph of your latest news article. A few small pieces of fresh content will show the search engines that your website changes frequently, so they should come back to visit more often.
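As a rough illustration of the "latest posts snippet" idea, here is a minimal sketch that pulls the newest post titles out of a standard RSS 2.0 feed so they could be rendered on the homepage. The sample feed and URLs are made up for the example; in practice you would fetch your blog's real feed.

```python
import xml.etree.ElementTree as ET

def latest_post_snippets(rss_xml, limit=3):
    """Extract the newest post titles and links from an RSS 2.0 feed,
    ready to render as a 'latest posts' widget on the homepage."""
    root = ET.fromstring(rss_xml)
    items = root.findall("./channel/item")[:limit]
    return [(item.findtext("title"), item.findtext("link")) for item in items]

# Sample feed for illustration only.
SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>Newest post</title><link>http://example.com/newest</link></item>
  <item><title>Older post</title><link>http://example.com/older</link></item>
</channel></rss>"""

for title, link in latest_post_snippets(SAMPLE_FEED):
    print(title, link)
```

The same output could just as easily come from a Twitter feed or any other regularly updated source; the point is simply that the homepage markup changes between crawls.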
Use a reliable server
A reliable server will help avoid any downtime when the search engines come to visit, making for a more dependable website.
Submit sitemaps
A lot of people report an increased crawl rate after uploading sitemaps. Both XML and HTML sitemaps can help with crawling and indexing.
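For reference, an XML sitemap following the sitemaps.org protocol can be as small as this (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-11-21</lastmod>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/blog/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Once uploaded, the sitemap can be submitted through Google Webmaster Tools or referenced from robots.txt with a `Sitemap:` line.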
Avoid canonical / duplicate issues
If the search engines keep finding the same content on different URLs they may restrict the amount of time they spend on the website. Crawling uses resources, and search engines don't want to waste them on duplicate content. Sorting out any duplicate content issues has major SEO advantages in any case.
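One common way to resolve this, when the duplicate URLs can't simply be redirected, is the canonical link element. A hedged example (URLs are illustrative):

```html
<!-- Served on duplicate URLs such as
     http://www.example.com/product?sort=price -->
<link rel="canonical" href="http://www.example.com/product" />
```

Where a duplicate URL serves no purpose at all, a 301 redirect to the preferred URL is generally the stronger fix.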
Reduce page size
Not only is page speed becoming more of a ranking factor; the quicker a page loads, the better for the crawl rate. If your pages are heavy to download, search engines will only visit regularly if they consider the site important (high PR). Reducing page size makes the website faster and uses less bandwidth, which can mean an increased crawl rate.
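A quick way to see how much bandwidth compression alone can save is to gzip a page's HTML, which is essentially what enabling gzip/deflate on the web server does for each response. A small sketch with made-up filler content:

```python
import gzip

# Stand-in for a real page: repetitive HTML compresses very well.
html = ("<html><head><title>Example page</title></head><body>"
        + "<p>Repeated filler content for the demo.</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)
print(f"uncompressed: {len(html)} bytes, gzipped: {len(compressed)} bytes")
```

The exact ratio depends on the page, but trimming markup, combining CSS/JS files, and serving compressed responses all work in the same direction: less to download per crawl.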
Adjust the crawl rate in Webmaster Tools if crawling is causing issues
If the crawling is using up a lot of your resources and causing problems on your side, you can set the speed at which Google crawls your website in Webmaster Tools. This can help avoid the issues that can come with search engines visiting your website.
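Some other search engines can be throttled from robots.txt instead. The `Crawl-delay` directive below is honoured by crawlers such as Bing's and Yahoo's, but note that Googlebot ignores it, which is why Google's rate is set in Webmaster Tools:

```
User-agent: *
Crawl-delay: 10
```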
By monitoring the crawl stats in Google Webmaster Tools you should be able to see what effect these changes have on the crawl rate.
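For a second view alongside Webmaster Tools, crawl frequency can also be estimated straight from your server access logs by counting Googlebot requests per day. A minimal sketch, assuming Apache-style combined logs (the log lines here are fabricated for the example):

```python
import re
from collections import Counter

# Fabricated Apache combined-log lines for illustration.
LOG_LINES = [
    '66.249.66.1 - - [21/Nov/2010:10:15:32 +0000] "GET / HTTP/1.1" 200 5120 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [21/Nov/2010:11:02:07 +0000] "GET /blog HTTP/1.1" 200 7430 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [21/Nov/2010:11:05:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

def googlebot_hits_per_day(lines):
    """Count requests per day whose user-agent claims to be Googlebot."""
    counts = Counter()
    for line in lines:
        if "Googlebot" in line:
            match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
            if match:
                counts[match.group(1)] += 1
    return counts

print(googlebot_hits_per_day(LOG_LINES))
```

Note that anyone can fake a Googlebot user-agent string, so for anything beyond a rough trend you would also verify the requesting IP with a reverse DNS lookup.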
The title of this blog post is really tricky... :) Two questions in one title... I have done what you recommended in your post but I'm still confused about a crawling issue. Can you cover the issues that create resistance to crawling? Thanks for your post.
Great post. I know that for WordPress you can download a plugin that will automatically notify the search engines of your new post and add it to your sitemap. Search for "XML sitemap" in the plugins area.
Sometimes I see that my new posts are indexed quickly and sometimes not. Now I understand the issues.
Good article. I've done many things to see if Google would crawl my blogs more often, but it usually takes around 10-15 days to get an update.
Hey friends, I get email alerts every time Google's bots crawl my site. You can activate these alerts too by following the simple steps at: http://latesthub.in/2010/11/21/get-email-alerts-when-google-crawls-your-blog/
These are very nice SEO tips. Sitemaps are a definite must-have for a website to get better crawling and indexation.
Thanks for the tips. Blogging is the easiest way to ensure that search engines continue to take notice of your site by crawling it. I had no idea that a homepage snippet of the latest blog post helps get the site crawled.
Also keep updating your XML sitemap every time you make changes to your site and resubmit it in Google Webmaster Tools. You can influence the crawl rate through Webmaster Tools as well. Making regular updates to the site also helps improve crawling.
Thanks for sharing your knowledge with us. In SEO it is very important that Google crawls your website periodically and indexes new content and pages from your site.
Thanks for the useful post. For SEO we need to focus on the basic tricks and web SEO tips... Would love to see more posts like this from you...