I just noticed that Search Engine Land’s Facebook group has recruited its 2,000th member. I thought I knew the first few members – but was surprised that I also know the 2,000th! Jason Duke has been on the search scene forever (warning – YouTube link) – you may have seen him walking around wearing a Spam sandwich board if you have been in this game long enough. I thought I’d capture a screenshot of the 2,000th Search Engine Land member on Facebook for posterity.
I read an unusually philosophical post from Joost about not getting ahead of yourself by being too visionary. He wrote it after Roy started talking about being visionary in the first place. Everyone on Joost’s blog seems to agree with him that you should do what works today in SEO, to do the best for your client.
Joost – Someone’s gotta disagree with you 🙂
I agree with Roy. But before I make my argument, let me concede that paid links are still working. I was alarmed to see a significant but smaller competitor to a client of ours hit number 1 for a BIG “PPC” volume keyword (Porn-Pills-Casino for the uninitiated). In the last three months, the competitor’s backlinks from sites CONTAINING the target keyword increased… from hundreds… to 70,000.
Now that smells of paid links to me, and the number one slot has followed. Certainly artificial links. But when we PROPERLY explain to a brand-savvy client how the competitor did it, the client doesn’t say “yum, I’ll have some of that”. They just don’t want to risk their brand reputation and, to be frank, nor do we. In fact – quite the opposite.
Roy is right – big companies are so darn useless at understanding and implementing SEO recommendations that you cannot afford to make recommendations that won’t outlast the latest fad. Link manipulation still works – but every link brings with it an association with the web page it sits on. That can be a bad thing. We have a client asking legal questions about backlinks they didn’t ask for (and we didn’t provide them with).
Back in 2000 I was looking at links (well before Google gave everyone a reason to) and found Shell Oil linking to Friends of the Earth on its corporate responsibility pages. Probably not in FOE’s interests, I would wager.
The same goes for “bigbrand*com” getting links and recommendations from “dodgysite*com”… unknown quantities that may one day come home to roost.
So it’s not so much about being visionary as it is about building on principles that will be as long-term as the brands we represent.
If you ever see me at conferences, I tend to give away short term access to our in-house link analysis tool (Password below). We like the feedback, but we can’t give everyone the tool forever as it’s part of our “secret sauce” (or “secret source”). Maybe one day we’ll give people the chance to sign up, but for now, we are still playing with it.
It just got better – so here’s a free login for a few weeks…
The tool takes two web domains and compares the relative backlink quality of each site. Not only does it look at the number of backlinks, but it tries to extract links that may have a quality signal and record these “quality links” separately from any old spam.
In this release we have:
Added the ability to add a list of your own “trusted domains” to fit your industry
Allowed you to restrict results to back-links from pages containing a specific keyword
Added the ability to download the results into a spreadsheet
We added a metric: “backlinks per page” (see below)
Improved the retrieval speed
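The “quality links” split the tool performs isn’t something we publish the workings of, but the basic idea – separating backlinks from a list of trusted domains away from any old spam – can be sketched in a few lines. This is a simplified illustration only; the trusted-domain list and URLs below are made up, and the real tool uses more signals than this.

```python
# Minimal sketch of splitting backlinks into "quality" vs the rest,
# using a user-supplied trusted-domains list (as in the new release).
# The domains and example URLs here are hypothetical.
from urllib.parse import urlparse

TRUSTED_DOMAINS = {"bbc.co.uk", "dmoz.org"}  # your own industry list

def split_backlinks(backlink_urls):
    """Separate links from trusted domains from 'any old spam'."""
    quality, other = [], []
    for url in backlink_urls:
        host = urlparse(url).hostname or ""
        if any(host == d or host.endswith("." + d) for d in TRUSTED_DOMAINS):
            quality.append(url)
        else:
            other.append(url)
    return quality, other

links = [
    "http://news.bbc.co.uk/story.html",
    "http://spamblog.example/comment-42",
]
quality, other = split_backlinks(links)
print(len(quality), len(other))  # → 1 1
```

With your own “trusted domains” list loaded, the same two-bucket count is what the comparison report is built from.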
I analysed two of my colleagues, Mike and David (hope you don’t mind, guys), who were both on the front page of Google just now for the phrase “search engine marketing”. Now – they are BOTH there, so I’m not going to argue with either result. I asked the tool to analyse the following:
“Analyse links to www.weboptimiser.com vs www.search-engine-book.co.uk from pages containing the phrase “search engine marketing”
Now – Mike only has 46 pages, but still makes the first page, with only 1,590 inbound links compared to Weboptimiser’s 18,200. Why so different? Well, for one thing, Mike actually has a huge number of links to the site overall (try the search again without the phrase filter).
I also note that over half of Mike’s links are not to the home page. Now I suspect I know exactly why that is. But I’m not letting on.
The statistic I think is worth watching more and more, though, is “backlinks per page”. Here we divide the number of backlinks we see pointing towards a site by the number of pages that we see indexed for the domain. We may do more research on this statistic at a later date.
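Using the figures quoted above for Mike’s site – 1,590 inbound links across 46 indexed pages – the calculation is simply:

```python
# "Backlinks per page": total backlinks divided by indexed pages.
# Figures are the ones quoted above for Mike's site.
def backlinks_per_page(backlinks, indexed_pages):
    return backlinks / indexed_pages

print(round(backlinks_per_page(1590, 46), 1))  # → 34.6
```

Roughly 35 backlinks for every indexed page – which goes some way to explaining how a 46-page site holds a first-page ranking.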
You are free to use the tool for the rest of the month. Please tell us if you can see a way to improve it – and sign up if you want us to give out passwords to you in the future.
Last week I was privileged enough to be invited to Microsoft’s “Live Search Symposium” in a posh private venue in Knightsbridge. There were only a hundred or so guests, but when the guests include Danny Sullivan, Dave Naylor and the tech boffins at the British Library you know you are in the right sort of place.
(By the way – how short can you make YOUR domain name? The British Library is www.bl.uk… Do you think their DNS could drop the www? that would be very cool. Anyway… I digress.)
The symposium was exceedingly slick. MUCH better than I am used to, frankly, from Microsoft. They really put text-based SERPs into perspective as being the very start of search. I know we have been bandying about universal or blended search for over a year now, but in the UK at least, we really haven’t made the leap. Microsoft seem pretty joined up in their thinking about how that leap will change their fortunes in search. Microsoft are not thinking “SERPs”. They are thinking vertical search and multiple media. They may still be weaker than Google in the organic results, but what they have been building is an infrastructure powerful enough to break Google’s market up entirely, by encouraging different people to build different ways to search, based on different audiences. Danny reported on one such example, the Indiana Jones search engine, straight out of the meeting, and if you haven’t spent 30 minutes engaging with http://www.msdewey.com/ you really should! But Microsoft have gone way further still with their new “Silverlight” product (a bit like Flash on steroids). I am no developer, but seeing how the dots connect almost makes me wish I was.
Microsoft are not just paying lip service to these joined-up dots, either. They have created dozens of viral videos, which must have cost a fortune! (sorry… the video below is probably still loading… bear with Microsoft…)
There are several of these designed primarily (it would appear) for the UK market.
Probably the most impressive thing I saw to show what integrated search could already be was http://specials.uk.msn.com/mayoral-election, which was pulling news, image and data feeds in real time as the London elections were going on. Once built, the system functioned and updated seamlessly straight through the election period and is still current now. You need to download Silverlight to see it, mind, but wow – that’s going to challenge the very core of the news providers. Apparently anyone with programming skills and some time on their hands could have built it, using Microsoft’s freely available APIs. It didn’t have to be MSN, but obviously they wanted to see how far they could take the technology.
When you have search that is so rich in any given vertical or topic, built by millions of enthusiasts in thousands of genres and styles… what does Google become by comparison? Just a directory of search engines – because the “most authoritative” source and “richest experience” on (say) bungee jumping will be a site that drags in every valuable news search, image search, map location, address, forum and blog on the subject in the most entertaining way… and that way will be based on Microsoft’s Silverlight technology and Microsoft’s APIs. Not Google’s.
The better Google gets at returning the “best site” at the head of search, the more Microsoft-technology-driven sites will be at the far end. That leaves Google with the long tail and no place to go.
I’m not saying Microsoft’s strategy will work. But it’s definitely a strategy that wasn’t thought up and developed to this level of sophistication on the back of a napkin. It might just work.
Most WordPress blogs these days use nofollow tags – a Google-promoted device that makes a vague attempt at stopping link spam. I think it may also have dampened the desire to debate on blogs, though. The debates are now being dragged into Facebook groups or back behind walled gardens, at Google’s expense as well as the public’s.
What do you think? nofollow free – good idea or bad?
Regardless of your opinion, here’s how to eliminate nofollows in your comments:
I am experimenting with two plugins to do this: “Nofollowfree” and “Nofollowcasebycase”. Not sure which is the better yet. To install a plugin to WordPress you really need FTP access to your WordPress site. You download the plugin, and then unzip it (don’t try to put a zipped file onto the server – it doesn’t work). The unzipped contents should themselves be in a directory, and that directory needs to go in the wp-content/plugins directory on the server. For some people, that will be /public_html/wp-content/plugins.
That’s not it, though. You then need to activate the plugin from within WordPress. Log into the /wp-admin area and click on “Plugins” on the far right of the screen to do this.
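The unzip-then-upload steps above can be simulated locally in a few lines. This is only an illustration – the plugin filename and server paths here are examples (the script even builds a throwaway zip to stand in for the download), and in real life the final copy happens over FTP to your own server layout.

```python
# Simulate the manual install steps described above.
# "nofollowfree" and the paths are examples only; yours may differ.
import os
import shutil
import zipfile

# Stand-in for the downloaded plugin zip: one directory with one file in it.
with zipfile.ZipFile("nofollowfree.zip", "w") as z:
    z.writestr("nofollowfree/nofollowfree.php", "<?php // plugin code ?>\n")

# Unzip locally first – a zipped file on the server doesn't work.
zipfile.ZipFile("nofollowfree.zip").extractall(".")

# The unzipped contents sit in their own directory; that whole directory
# goes into wp-content/plugins on the server (here, a local stand-in).
os.makedirs("public_html/wp-content/plugins", exist_ok=True)
shutil.move("nofollowfree", "public_html/wp-content/plugins/nofollowfree")

print(os.listdir("public_html/wp-content/plugins"))  # → ['nofollowfree']
```

After the upload, the remaining step is the one described above: log into /wp-admin and activate the plugin from the “Plugins” screen.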