Google recommends we ignore standards to pass link juice.
We have used a roundabout way of making tracking URLs pass PageRank for some time now – dropping a tracking variable as a cookie, delivering a 301 without the variable, and then picking the variable up and putting it into our traffic-analysis system. But twice this year I have heard Google reps on podiums offer an easy alternative. No, not Matt Cutts this time. The first was Adam Lasnik in Stockholm and the other was Mathew Trewhella, aka “Chewy”, in London. They both suggested using #sourcename at the end of the URL to track a campaign source.
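For anyone curious what that roundabout workaround looks like in practice, here is a minimal sketch of the logic as a pure function – the parameter name `src` and the cookie name `campaign_source` are illustrative assumptions, not the actual names our system uses:

```javascript
// Sketch of the cookie-plus-301 workaround: given a requested URL that
// carries a tracking parameter (assumed here to be called "src"), return
// the cookie to set and the clean URL to 301-redirect to, so search
// engines only ever see the canonical, parameter-free address.
function splitTrackingUrl(rawUrl, param = "src") {
  const url = new URL(rawUrl);
  const source = url.searchParams.get(param);
  if (source === null) {
    // No tracking variable: serve the page normally, no redirect needed.
    return { redirect: null, cookie: null };
  }
  url.searchParams.delete(param);
  return {
    redirect: url.toString(), // target of the 301
    cookie: `campaign_source=${encodeURIComponent(source)}`, // picked up later by the analytics system
  };
}
```

A real handler would send these values as the `Set-Cookie` and 301 `Location:` headers; the analytics system then reads the cookie back on the clean page.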
Well – it sounded great at the time, and I’m sure it will work without all the complicated 301s. I mentioned it to a few people, but my colleague Andy has some issues with it. Let’s be honest, Google – you are ruining standards. Joost gave an example of Google ignoring standards they’ve agreed to by indexing “noindex” pages, but blatantly ignoring the PROPER use of the “#” in a URL should probably exclude Google staff from being employed at the W3C (that’s how the “#” symbol should be used).
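And this is part of the practical catch with the #sourcename suggestion: browsers never send the fragment to the server, so the campaign source has to be picked up client-side and handed to the analytics system explicitly. A minimal sketch, with illustrative names:

```javascript
// Read a campaign source from a URL fragment such as
// http://example.com/landing#newsletter — the fragment is purely a
// client-side construct, so it must be extracted in the page and passed
// to the analytics system (e.g. as a parameter on a tracking beacon).
function fragmentSource(href) {
  const hashIndex = href.indexOf("#");
  if (hashIndex === -1) return null; // no fragment at all
  const source = href.slice(hashIndex + 1);
  return source.length ? decodeURIComponent(source) : null;
}
```

In a browser this would typically be called as `fragmentSource(window.location.href)` on page load.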
For me, the only acceptable solution would be for Google to let us submit our own parameters in the Sitemaps protocol that we could use for tracking, without the need for fancy workarounds to stop our improved reporting systems from creating duplicate URLs or diluting our link juice. Come on Google – you’ve done it for “www” vs “non-www”, so let us do it for variables.
Free access to a rather useful link tool that should save SEOs a lot of time.
After the review panel yesterday, loads of people asked about the link tool I was using, so I have set up a free login for people to use for a few weeks. The tool compares the relative quality of backlinks of two competing websites.
You will need the following details:
Please feel free to digg or sphinn or stumble or whatever, as it may not be available for long. We’d appreciate any feedback you have as well.
It’s not beyond the wit of man to fix. It’s beyond the will of those with the wit and beyond the wit of those with the will.
As I sat in Gatwick airport fighting with the WiFi, my CTO sat opposite me, making sure my Outlook Exchange synced with the office systems properly so I could deal with my emails on the plane to Pubcon. There were some Viagra emails amongst the 159 downloaded over the weekend, but the majority were things I had signed up to, and the occasional one was even a real human discussing real business. Well – out of the 159, I have now answered the 12 or so that were human-related or required human replies. That leaves me to ponder the comment Andy made when he saw the Viagra emails… namely that although those got through, my spam filters passed the one-million-blocked mark some time ago.
1 million – that’s quite a lot, I thought! Apparently we don’t even have our filter settings as high as they can go, as we are happy to let a few of the cleverer ones through rather than risk a false positive on a client email. We do use more than one system to filter the emails, but overall I think this tells me three things:
1) That maybe the Viagra ones are actually quite well targeted now that I am in my forties
2) That Andy has email spam pretty well under control but mostly…
3) The amount of spam is far from decreasing! When is it going to get fixed???
In a world gone mad with behavioural targeting getting down to whether I prefer chess over cars, the same clever boffins can’t work out how to stop spam. Oh – wait – the guys that are developing behavioural targeting systems to the nth degree are marketers… as are the spammers. The boffins have been bought up by the free market economy. That means it’s up to the governments to fix this.
Well, nobody else is going to do it are they?
Me – I’m going to go into rocket science marketing in my next incarnation.