Being able to track bots and spiders on a website gives deep insight into how search engines interact with its pages. Unfortunately, in most cases this information is not available: common JavaScript-based stats software can’t monitor spiders (which don’t execute JavaScript), and webmasters often don’t have access to server logs. And even when they do, reading raw logs without a log analyzer is not comfortable at all.

Thinking about this issue, I wondered if it was possible to track bots using the king of stats software: Google Analytics. Surfing the web, I found some French developers who have written a PHP script that does exactly what I was looking for (you can find it at http://www.web-analytics.fr/google-analytics-seo-comment-mesurer-les-vistes-des-robots-et-crawlers-sur-votre-site/; I’m not linking it directly since that post contains some uncensored spam comments…)
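The general idea behind that kind of script can be sketched in a few lines. This is a minimal illustration in Python, not the French script’s actual code: it assumes bots are detected server-side by User-Agent substrings (the signature list here is just an example) and that a hit is reported to Google Analytics via the legacy __utm.gif pixel endpoint, faking the cookie values GA normally sets in the browser.

```python
import random
import time
from urllib.parse import urlencode

# Substrings that identify common crawlers in the User-Agent header.
# (Illustrative list only -- not the plugin's actual one.)
BOT_SIGNATURES = ["googlebot", "bingbot", "slurp", "baiduspider", "yandex"]

def detect_bot(user_agent):
    """Return the matching bot signature, or None for ordinary visitors."""
    ua = (user_agent or "").lower()
    for sig in BOT_SIGNATURES:
        if sig in ua:
            return sig
    return None

def build_utm_gif_url(account_id, hostname, page_path, domain_hash):
    """Build a legacy __utm.gif tracking URL for a server-side hit.

    account_id  -- the Web Property ID, e.g. "UA-123456-7"
    domain_hash -- the first digits of the _utma cookie (before the dot)
    """
    now = int(time.time())
    # Fake __utma cookie: domainhash.visitorid.first.last.current.sessions
    utma = "%s.%d.%d.%d.%d.1" % (domain_hash, random.randint(1, 2**31 - 1),
                                 now, now, now)
    params = {
        "utmwv": "4.4sh",                      # tracker version string
        "utmn": random.randint(1, 2**31 - 1),  # cache-busting number
        "utmhn": hostname,                     # hostname of the tracked site
        "utmp": page_path,                     # page being "viewed" by the bot
        "utmac": account_id,                   # Web Property ID
        "utmcc": "__utma=%s;" % utma,          # faked cookie data
    }
    return "http://www.google-analytics.com/__utm.gif?" + urlencode(params)
```

On each page load the server would call `detect_bot()` on the request’s User-Agent and, if a bot matched, fetch the URL returned by `build_utm_gif_url()` so the visit shows up in the dedicated GA profile.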

I’ve given the script a chance and verified that it works, but I also thought that integrating it into a website wasn’t so immediate. So I said to myself: most websites are WordPress-based, and WordPress uses PHP… why not write a plugin that makes installing the code easy, without hacking template files?

From the idea to the result is a short step, and so we now have WP Bots Analytics: a very simple WordPress plugin that can give you precious information about your website. Looking at its reports, in fact, we can easily see whether:

  • Some pages are visited very rarely (are they too far from the homepage? Do they have too few inbound links?)
  • Other pages are of great interest to search engines (can I optimize them better? Can I use them to link to other pages?)
  • Unwanted bots are unnecessarily consuming our bandwidth (should I block them via the robots.txt file?)
  • Pages we wanted to block are still being spidered (are the directives in the robots.txt file or meta robots tag correct?)
  • Pages we want indexed are not being visited at all (are we inadvertently blocking content?)
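If the reports do reveal an unwanted crawler eating bandwidth, a robots.txt directive along these lines blocks it (the bot name here is hypothetical; well-behaved crawlers identify themselves in their User-Agent string):

```
# Block one specific crawler site-wide (use the bot's real token)
User-agent: BadBot
Disallow: /

# Everyone else may crawl normally
User-agent: *
Disallow:
```

Keep in mind robots.txt is only advisory: compliant bots honor it, while abusive ones may have to be blocked at the server level.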

Knowing these kinds of details can strongly support our SEO activity, but the greatest thing is that we can monitor bots over time, look at trends, and use the whole deeper set of data that Google Analytics offers.

To give you all this data, WP Bots Analytics needs only two things:

  1. The Web Property ID associated with a new profile created in Google Analytics
  2. The first digits of the _utma cookie of the site on which you want to install it.

Let’s see how to get them.

Regarding the first one: in your Google Analytics account, create a new profile and name it as you prefer (something like bots.site.com could be smart).

Take note of the Web Property ID.

For the first digits of the _utma cookie, in Firefox you can get them from Tools >> Options >> Show Cookies. Find your site, expand it, select _utma, and take note of the first digits (up to the first dot):
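In other words, the plugin only needs the leading number (the domain hash) from the cookie value. A tiny sketch, assuming the usual dot-separated _utma format:

```python
def domain_hash_from_utma(utma_value):
    """Extract the leading domain hash from a _utma cookie value.

    A _utma cookie value looks like:
        "123456789.1122334455.1300000000.1300000000.1300000000.1"
    Only the digits before the first dot are needed.
    """
    return utma_value.split(".", 1)[0]
```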

Beyond these two pieces of information, you don’t need anything else.

Once you’ve downloaded the plugin, install and activate it, enter the data in the fields of the administration panel (Settings >> WP Bots Analytics), and you’re done.

In your Google Analytics account, you will see something similar to this:

Of course, you can now use all the analysis capabilities offered by GA to see which search engine bots have passed over your pages, or which pages a particular bot (Googlebot, for example) has visited.

Well, that’s all!

You can download WP Bots Analytics from WordPress.org, and if you need help with it, you can leave a comment on the official page on my blog.

I hope you’ll find WP Bots Analytics useful, and if you want to help me promote it, links, tweets and shares will be much appreciated.