Search Engine Land’s 2000th Facebook Member is…?

I just noticed that Search Engine Land’s Facebook group has just recruited its 2000th member. I thought I knew the first few members – but was surprised that I also know the 2000th! Jason Duke has been on the search scene forever (warning – YouTube link) – you may have seen him walking around wearing a Spam sandwich board if you have been in this game long enough. I thought I’d capture a screenshot of the 2000th Search Engine Land member on Facebook for posterity.

Search Engine Land’s Facebook group reaches 2000 members

All I can say, Jason, is “What kept you?” 🙂



Google erodes corporate privacy rights

Are the search engines becoming fiefdoms in their own right, unrestrained by any notion of corporate responsibility? Corporate privacy may need protecting by law at some point in the future.

Google have just changed the terms of use for Google Analytics. More about the change below. But first, my rant…

Isn’t it about time small companies started getting similar rights online to those of individuals? I don’t use Google Analytics much – because I worry about giving my sales data to the very company that changes the cost of my traffic based on a variety of undefined metrics. The same company whose directors have a moral obligation to maximize the revenue for their shareholders. Not mine.

I do use Google Analytics, though. Let’s face it, until a few weeks ago it was the best free analytics tool on the market. So I went to look at my mediocre stats just now (well… I haven’t actually LOOKED yet…) and saw the following “stick ’em up” message as I logged in:
Google Analytics Data Sharing Settings
In order to improve your experience with Google products, Google Analytics is updating its data sharing policy. You now have the ability to share your Analytics data with other Google services. This will improve integration, enable additional features in Google’s advertising services (including Google Analytics, AdWords and AdSense) and improve your experience with these products.
Press “Accept” to enable data sharing between Google Analytics and other Google services or for additional options, choose “More data sharing options”.
Remind me later | More data sharing options | Do not share data | Learn more
Now it looks to me like they have just made me accept giving Google the RIGHT to charge me more for traffic, based on what they know about my site, should they so desire.
Doesn’t it mean that to you?
So I could, of course, reject the notion, but Google has been built on offering you something you want in exchange for giving up something you don’t worry about too much. The trouble is that every time we give something up, we really can’t mentally recall the inter-relationships between all of the things we have agreed to. Like those AdWords terms of use? Those Google Docs terms of use? And more recently Google Health… where they are giving you access to YOUR OWN medical records. Now, I don’t mean to sound like a paranoid android here but… OK. I do mean to sound like a paranoid android. This is getting absurd. Viacom needs to spend millions defending its business model, and individuals have the protection of governments – at least in theory, in the UK – but smaller companies really don’t have anywhere to turn, nor do they have the will to even try.
Ironically, Google feels justified in demanding privacy laws from the US government for individuals at the very same time that it erodes the rights of businesses on the web.
In the UK we assume that the monopoly laws that protect us apply in other parts of the world. They don’t. In the US, the “right” way to do business is to be seen to demolish your competition entirely until you are the last man standing. Every other website on the planet, apart from Google, is eventually going to have to decide: are you with Google or against Google? It’s going to be hard to stand against something that large. It will be worse than not paying taxes.

Link building: Short term vs Long term strategies

Why I still believe short-term link building is bad – even though it works.

I read an unusually philosophical post from Joost about not getting ahead of yourself by being too visionary. He wrote it after Roy started talking about being visionary in the first place. Everyone on Joost’s blog seems to agree with him that you should do what works today in SEO, to do the best for your client.

Joost – Someone’s gotta disagree with you 🙂

I agree with Roy. But before I make my argument, let me concede that paid links are still working. I was alarmed to see a significant but smaller competitor to a client of ours hit number 1 for a BIG “PPC” volume keyword (Porn-Pills-Casino, for the uninitiated). In the last three months, the competitor’s backlinks from sites CONTAINING the target keyword increased… from hundreds… to 70,000.

Now that smells of paid links to me, and the number one slot has followed. Certainly artificial links. But when we PROPERLY explain to a brand-savvy client how the competitor did it, the client doesn’t say “yum, I’ll have some of that”. They just don’t want to risk their brand reputation and, to be frank, nor do we. In fact – quite the opposite.

Roy is right – big companies are so darn useless at understanding and implementing SEO recommendations that you cannot afford to make recommendations that won’t outlast the latest fad. Link manipulation still works – but every link brings with it an association with the web page it is on. That can be a bad thing. We have a client asking legal questions about backlinks they didn’t ask for (and that we didn’t provide them with).

Back in 2000 I was looking at links (well before Google made them the reason to) and found Shell Oil linking to Friends of the Earth on its corporate responsibility pages. Probably not in FOE’s interests, I would wager.

The same goes for “bigbrand*com” getting links and recommendations from “dodgysite*com”… unknown quantities that may one day come home to roost.

So it’s not so much about being visionary as it is about building on principles that will last as long as the brands we represent.

Gordon Brown was at a Google Zeitgeist forum yesterday. When asked about the industry, he paraphrased someone else, saying “the first 500 years of any institution are always the most difficult”.



Link Analysis tool update

If you haven’t had a chance to see this backlink analysis tool yet, here’s a password that will only last a couple of weeks. You should get over there quickly if you are into SEO.

If you ever see me at conferences, I tend to give away short term access to our in-house link analysis tool (Password below). We like the feedback, but we can’t give everyone the tool forever as it’s part of our “secret sauce” (or “secret source”). Maybe one day we’ll give people the chance to sign up, but for now, we are still playing with it.

It just got better – so here’s a free login for a few weeks…

What is it and what has got better?

The tool takes two web domains and compares the relative backlink quality of each site. Not only does it look at the number of backlinks, but it tries to extract links that may have a quality signal and record these “quality links” separately from any old spam.

In this release we have:

  • Added the ability to add a list of your own “trusted domains” to fit your industry
  • Allowed you to restrict results to back-links from pages containing a specific keyword
  • Added the ability to download the results into a spreadsheet
  • Added a “backlinks per page” metric (see below)
  • Improved the retrieval speed
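To make the first two features above concrete, here is a rough sketch of the idea – illustrative only, since the real tool’s logic isn’t published, and every name in it is made up: split a set of backlinks into “quality” links and the rest, using a user-supplied trusted-domain list and an optional keyword filter on the linking page.

```python
from dataclasses import dataclass

@dataclass
class Backlink:
    source_domain: str   # domain the link comes from
    page_text: str       # text of the linking page

def classify(links, trusted_domains, keyword=None):
    """Split backlinks into (quality, other).

    A link counts as "quality" if its source domain is on the
    trusted list; the optional keyword restricts the analysis to
    links from pages containing that phrase.
    """
    if keyword:
        links = [l for l in links if keyword.lower() in l.page_text.lower()]
    quality = [l for l in links if l.source_domain in trusted_domains]
    other = [l for l in links if l.source_domain not in trusted_domains]
    return quality, other
```

The point of keeping the two buckets separate is exactly what the tool does: a raw backlink count lumps good links in with any old spam, while the trusted-domain split gives you a second, more meaningful number.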

I analysed the sites of two of my colleagues, Mike and David (hope you don’t mind, guys), who were both on the front page of Google just now for the phrase “search engine marketing”. Now – they are BOTH there, so I’m not going to argue with either result. I asked the tool to analyse the following:

“Analyse links to/from pages containing the phrase ‘search engine marketing’”

Now – Mike only has 46 pages, but still gets the first page, with only 1,590 inbound links compared to Weboptimiser’s 18,200. Why so different? Well, for one reason, Mike actually has a huge number of links to the site overall (try the search again without the phrase filters).

I also note that over half of Mike’s links are not to the home page. Now, I suspect I know exactly why that is. But I’m not letting on.

The statistic to start looking at more and more, though, is “backlinks per page”. Here we divide the number of backlinks we see pointing towards a site by the number of pages that we see indexed for the domain. We may do more research on this statistic at a later date.
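As a worked example, the metric is just a division. The figures below are the only ones quoted in this post (Mike’s 1,590 inbound links across 46 indexed pages); the function name is my own:

```python
def backlinks_per_page(backlinks: int, indexed_pages: int) -> float:
    """Backlinks seen for a domain divided by its indexed page count."""
    return backlinks / indexed_pages

# Mike's figures from the comparison above:
print(round(backlinks_per_page(1590, 46), 1))  # 34.6
```

A small site with a high ratio, like this one, is punching well above its weight – which is presumably why the metric is worth watching.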

You are free to use the tool for the rest of the month. Please tell us if you can see a way to improve it – and sign up if you want us to give out passwords to you in the future.


Integrated Search. The future of search.

Microsoft ran a “Live Search Symposium” for 100 or so specially invited guests. Here is the vision of the future of search that they showed us.

Last week I was privileged enough to be invited to Microsoft’s “Live Search Symposium” in a posh private venue in Knightsbridge. There were only a hundred or so guests, but when the guests include Danny Sullivan, Dave Naylor and the tech boffins at the British Library you know you are in the right sort of place.

(By the way – how short can you make YOUR domain name? The British Library is… Do you think their DNS could drop the www? That would be very cool. Anyway… I digress.)

The symposium was exceedingly slick. MUCH better than I am used to, frankly, from Microsoft. They really put text-based SERPs into perspective as being… frankly, the very start of search. I know we have been bandying about universal or blended search for over a year now, but in the UK at least, we really haven’t made the leap. Microsoft seem pretty joined up in their thinking about how that leap will change their fortunes in search. Microsoft are not thinking “SERPs”. They are thinking vertical search and multiple media. They may still be weaker than Google in the organic results, but what they have been building is an infrastructure powerful enough to break Google’s market up entirely, by encouraging different people to build different ways to search, based on different audiences. Danny reported on one such example, the Indiana Jones search engine, straight out of the meeting – and if you haven’t spent 30 minutes engaging with it, you really should! But Microsoft have gone way further still with their new “Silverlight” product (a bit like Flash on steroids). I am no developer, but seeing how the dots connect almost makes me wish I was.

Microsoft are not just paying lip service to these joined-up dots either. They have created dozens of viral videos, which must have cost a fortune! (Sorry… the video below is probably still loading… bear with Microsoft…)

What Dixon Jones laughs at during working hours

There are several of these designed primarily (it would appear) for the UK market.

Probably the most impressive thing I saw, showing what integrated search could already be, was an MSN site pulling news, image and data feeds in real time as the London elections were going on. Once built, the system was functioning and updating seamlessly straight through the election period, and is still current now. You need to download Silverlight to see it, mind, but wow – that’s going to challenge the very core of the news providers. Apparently anyone with programming skills and some time on their hands could have built it, using Microsoft’s freely available APIs. It didn’t have to be MSN, but obviously they wanted to see how far they could take the technology.

When you have search that is so rich in any given vertical or topic, built by millions of enthusiasts in thousands of genres and styles… what does Google become by comparison? Just a directory of search engines – because the “most authoritative” source and “richest experience” on (say) bungee jumping will be a site that drags in every valuable news search, image search, map location, address, forum and blog on the subject in the most entertaining way… and that way will be based on Microsoft’s Silverlight technology and Microsoft’s APIs. Not Google’s.

The better Google gets at returning the “best site” at the head of its search results, the more those best sites will be driven by Microsoft technology. That leaves Google with the long tail and no place to go.

I’m not saying Microsoft’s strategy will work. But it’s definitely a strategy that wasn’t thought up and developed to this level of sophistication on the back of a napkin. It might just work.

I’ve gone nofollow free… Good or bad thing?

Nofollow is a Google-promoted device making a vague attempt at stopping link spam. I think it may have also dampened the desire to debate on blogs, though.

Most WordPress blogs these days use nofollow tags – a Google-promoted device making a vague attempt at stopping link spam. I think it may have also dampened the desire to debate on blogs, though. The debates are now being dragged into Facebook groups or back behind walled gardens, at Google’s expense as well as the public’s.

What do you think? Nofollow free – good idea or bad?

Regardless of your opinion, here’s how to eliminate nofollows in your comments:

I am experimenting with two plugins to do this: “Nofollowfree” and “Nofollowcasebycase”. Not sure which is better yet. To install a plugin to WordPress you really need FTP access to your WordPress site. You download the plugin, and then unzip it (don’t try to put a zipped file onto the server – it doesn’t work). The unzipped contents should themselves be in a directory, and that directory needs to go into the wp-content/plugins directory on the server. For some people, that will be /public_html/wp-content/plugins.

That’s not it, though. You then need to activate the plugin from within WordPress. Log into the /wp-admin area and click on “Plugins” on the far right of the screen to do this.
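For the curious, the unzip step above can be scripted. Here is a minimal Python sketch (the function name and paths are my own, and the zip filename is a placeholder) that extracts a downloaded plugin archive into a local copy of wp-content/plugins – you would still upload the result over FTP and activate it in wp-admin:

```python
import zipfile
from pathlib import Path

def extract_plugin(zip_path: str, plugins_dir: str) -> Path:
    """Extract a WordPress plugin zip into wp-content/plugins.

    WordPress ignores a zipped file left on the server, so the
    archive must be unzipped first; the plugin then sits in its own
    subdirectory of plugins/, ready to be activated from wp-admin.
    """
    plugins = Path(plugins_dir)
    plugins.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(plugins)
        # The archive's top-level folder becomes the plugin directory.
        top = zf.namelist()[0].split("/")[0]
    return plugins / top
```

For example, `extract_plugin("nofollow-free.zip", "public_html/wp-content/plugins")` would leave the plugin in its own folder under plugins/, matching the layout described above.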