Tuesday, December 1, 2009

blog.grader.com: Useful, or...?

After reading a recent blog post by Spydergrrl, I was moved to try out a number of the Twitter monitoring and grading resources she suggested. While she didn't specifically recommend "Blog Grader", it's in the same suite of Grader.com tools that encompasses Twitter Grader, Facebook Grader, Website Grader, and (the completely nonsensical, for-fun-only pseudo-tool) Personality Grader, among others.

My blog is reasonably new and I didn't expect a glowing report, so I wasn't surprised by what I got. But when I compared my blog to four other well-established and greatly superior blogs, the results were surprising. Here's a snapshot of the numbers:

Monday, November 30, 2009
Blog                  Rank    Traffic Rank  Inbound Links  Frequency  Rating
toddlyons.ca          5,741   n/a           7              Weekly     37
cpsrenewal.ca         2,594   7,793,980     1,134          Weekly     63
spaghettitesting.ca   6,202   5,358,849     776            Daily      26
blog.gc20.ca          3,874   2,844,675     4,724          Weekly     63
eaves.ca              672     385,538       20,482         n/a        89

I've initially ordered these by traffic ranking, where lower numbers indicate a higher ranking (more traffic), higher numbers indicate a lower ranking (less traffic), and "n/a" indicates that the author's mother and a number of esteemed colleagues make regular visits.  :)
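If you want to reproduce that ordering, it's nothing fancier than a sort that treats the missing rank as infinitely large. A quick Python sketch of my own, with the traffic ranks hard-coded from the table above:

    # Reproduce the table's ordering: least traffic first, most traffic last.
    # Ranks are copied from the table above; None stands in for "n/a".
    traffic_rank = {
        "toddlyons.ca":        None,       # n/a -- no rank reported
        "cpsrenewal.ca":       7793980,
        "spaghettitesting.ca": 5358849,
        "blog.gc20.ca":        2844675,
        "eaves.ca":            385538,
    }

    # A higher rank number means less traffic, so sort descending and treat
    # the unranked blog as though it had the least traffic of all.
    ordered = sorted(
        traffic_rank,
        key=lambda blog: traffic_rank[blog] or float("inf"),
        reverse=True,
    )
    print(ordered)
    # ['toddlyons.ca', 'cpsrenewal.ca', 'spaghettitesting.ca',
    #  'blog.gc20.ca', 'eaves.ca']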

The algorithm used to produce these results is unclear. Grader.com does seem to have launched a blog recently, but only the Twitter Grader calculations are explained there. Whatever gears crunch the numbers, though, there is no way on Earth that Peter Smith's blog (spaghettitesting.ca) churns out a paltry 26. And a 37 for my blog is overly generous, given that only 7 links point back at the site, and at least half of those are probably from comments I've left on other blogs, where my site address is linked to my name. In fact, the only numbers that look reasonably solid to me are for David Eaves' blog.

So... the Inbound Links -- if they are accurate -- are the most useful numbers we have for making a comparison. The validity of the other numbers is questionable. Are they replicable? What happens when the tests are rerun on subsequent days?

Tuesday, December 1, 2009
Blog                  Rank    Traffic Rank  Inbound Links  Frequency  Rating
toddlyons.ca          7,712   n/a           12             Weekly     41
cpsrenewal.ca         3,090   7,793,980     1,134          Weekly     68
spaghettitesting.ca   9,251   5,358,849     776            Daily      54
blog.gc20.ca          3,093   2,844,675     4,724          Weekly     68
eaves.ca              948     385,538       20,482         n/a        59

Traffic is unchanged. The Inbound Links are identical, except for mine, where another 5 links were recognized over the next 24 hours. In the ratings, my blog has climbed a few points, as have cpsrenewal and gc20; Peter's blog has skyrocketed by 28 points, and David Eaves' has plunged by 30. Most blogs' rankings have slipped even while their ratings have gone up, especially Peter's, of course. His overnight surge in quality earned him a 3,049-place slide in the rankings, thrown shoes and tomatoes, two thumbs down, and significantly curtailed rotation on MuchMusic. For shame.
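For anyone who wants to check my arithmetic, here's a small Python sketch of my own -- the figures are hard-coded straight from the two tables above, and the field names are mine, not Grader.com's:

    # Day-over-day changes in Blog Grader rank and rating between the
    # Nov 30 and Dec 1 tables. A lower rank number is better.
    nov30 = {                              # (rank, rating)
        "toddlyons.ca":        (5741, 37),
        "cpsrenewal.ca":       (2594, 63),
        "spaghettitesting.ca": (6202, 26),
        "blog.gc20.ca":        (3874, 63),
        "eaves.ca":            (672,  89),
    }
    dec1 = {
        "toddlyons.ca":        (7712, 41),
        "cpsrenewal.ca":       (3090, 68),
        "spaghettitesting.ca": (9251, 54),
        "blog.gc20.ca":        (3093, 68),
        "eaves.ca":            (948,  59),
    }

    for blog, (rank1, rating1) in nov30.items():
        rank2, rating2 = dec1[blog]
        # Positive rank delta = slid down the rankings; positive rating
        # delta = judged a better blog. Oddly, most blogs got worse ranks
        # and better ratings at the same time.
        print(f"{blog:22s} rank {rank2 - rank1:+6d}  rating {rating2 - rating1:+4d}")

    # toddlyons.ca           rank  +1971  rating   +4
    # cpsrenewal.ca          rank   +496  rating   +5
    # spaghettitesting.ca    rank  +3049  rating  +28
    # blog.gc20.ca           rank   -781  rating   +5
    # eaves.ca               rank   +276  rating  -30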

Wednesday, December 2, 2009
Blog                  Rank    Traffic Rank  Inbound Links  Frequency  Rating
toddlyons.ca          13,041  n/a           15             Weekly     46
cpsrenewal.ca         5,274   7,808,436     1,025          Weekly     70
spaghettitesting.ca   9,088   5,369,543     484            Daily      60
blog.gc20.ca          6,005   2,851,004     3,927          Weekly     75
eaves.ca              8,189   382,456       16,562         n/a        63

I gained more links; cpsrenewal and gc20 lost a bunch somehow. So did Peter. And David (with a 7,241-place drop in rank thrown in for good measure). Perhaps the Inbound Links query isn't so reliable after all?
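The same sanity check works for the link counts. Another sketch of mine, with all three days' figures hard-coded from the tables, and nothing fancier than subtraction:

    # Inbound-link counts reported by Blog Grader on three consecutive
    # days, copied from the tables above.
    links = {
        #                      Nov 30  Dec 1  Dec 2
        "toddlyons.ca":        (7,     12,    15),
        "cpsrenewal.ca":       (1134,  1134,  1025),
        "spaghettitesting.ca": (776,   776,   484),
        "blog.gc20.ca":        (4724,  4724,  3927),
        "eaves.ca":            (20482, 20482, 16562),
    }

    for blog, (day1, day2, day3) in links.items():
        print(f"{blog:22s} {day1:6d} -> {day2:6d} -> {day3:6d}  (net {day3 - day1:+6d})")

    # Every blog except mine *loses* links between Dec 1 and Dec 2:
    # cpsrenewal drops 109, spaghettitesting drops 292, gc20 drops 797,
    # and eaves drops 3,920. Real inbound links shouldn't evaporate overnight.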

In conclusion, I highly recommend Personality Grader. While it doesn't purport to provide useful information -- and in fact disavows any capacity to provide anything useful at all -- it does deliver reliably on that promise, and that's something.

And I couldn't make a meaningful recommendation about a tool I didn't actually use, so here are my results: