Stop search engines assuming you can’t spell

This article shows how using the plus (“+”) operator lets you largely bypass the auto-correction features on the main search engines. If you look closely, you’ll also see a minor bug in Bing.

I have to admit that my typing is poor. Rarely do I post an article that doesn’t need at least one correction after it goes live. I am therefore mostly grateful that search engines seek to correct my poor typing by suggesting alternatives.

But there are many, many times that the search engines jump too quickly to the conclusion that I mistyped my query. I am sure that happens to us all.

There is an easy way to fix this, which works on both Google and Bing: use the “+” operator before your search term. Type “searche” into either engine and you get results for “search”. Bing, though, is more helpful if you actually WERE looking for results with an “e” on the end, because it tells you that it changed the results.

[Image: plusoperator]

It also lets you rerun the search without the assumption. When you press that button it adds a “+” symbol to the search phrase and different results appear.

[Image: plusoperator2]

In fact, Google has the same operator, but it is much harder to find intuitively, so you just have to add it yourself.

[Image: plusoperator3]
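If you build search URLs programmatically, note that a literal “+” has to survive URL encoding, because a bare “+” in a query string is decoded as a space. Here is a minimal sketch in Python; the helper name and the idea of constructing the URLs this way are my own illustration, not anything either engine documents.

```python
from urllib.parse import quote_plus

def exact_query_urls(term: str) -> dict:
    """Build Bing and Google search URLs that keep the literal "+" operator.

    A bare "+" in a query string decodes to a space, so it is percent-encoded
    as %2B to reach the engine intact. (Hypothetical helper, for illustration.)
    """
    q = quote_plus(f"+{term}")  # "+searche" -> "%2Bsearche"
    return {
        "bing": f"https://www.bing.com/search?q={q}",
        "google": f"https://www.google.com/search?q={q}",
    }

print(exact_query_urls("searche"))
```

Pasting either URL into a browser should give the unassisted results described above.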

So that’s the tip.

Now, did anyone notice the bug in Bing’s plus-operator search results?

Badabing day at Microsoft

Today I am at Microsoft’s London HQ with about 50 Microsoft clients. They are going to talk about what’s coming down the pipe. Since they said nothing was secret, I decided to blog it (from my iPhone!)

So this post may get longer as the day goes on…

The first speaker used to work for the BBC and brought BBC iPlayer to us. He is saying that until Microsoft are convinced that they have the BEST search engine in the UK, they won’t be doing their UK marketing launch. (We haven’t had the adverts yet, here in the UK.)

Microsoft believe that (surprise, surprise) their future is in the cloud. With a show of hands, most people in the room had more than two computers in the home. We don’t want our data stored locally any more. It sounds like Windows 7 will have this at its heart. (Looks like we may get a free beta at the end of the day… another post, for another day.)

This cloud idea goes beyond storing data. “Synchronized consumption” was a catchphrase that I think might stick: the idea of playing a game on your Xbox and placing a live bet on William Hill on the outcome of the multiplayer game you are in, for example.

He also sees a huge rise in video usage online, and wants Microsoft to be the leader in long-play video consumption online… but he also reminded us that 15% of the UK don’t use mobile phones and a third don’t have broadband.

10:20: Next up, Nicky Smith, Marketing Director of Microsoft Advertising. She started with a video summarizing some of the masses of tools and platforms open to Microsoft advertisers. More of an advert than an insight, but the idea she went on to was a one-stop shop: cloud-based data warehousing allowing advertisers to target adverts fully. This targeting ability is becoming more important than ever – getting the right message at the right time to the right person… and this translates into Engagement Mapping, which dispels the idea that “last click wins”. Microsoft feel that we can now measure all the aspects (touch points) of a campaign.

She went on to say that display adverts can increase the number of searches by 54% and can increase click-through by 47%. Interesting, huh?

Live Search users (now Bing) convert over 40% better than the… main competition.

So the only challenge now… More market share?

Will you vote for my Blog?

There’s a blog competition worth entering, with a great prize for affiliates. If you don’t have a blog… or just want to say thanks for the info, maybe put in a vote for me?

Murray is running a Blog competition, which has some pretty super prizes:

What do you win in the Blog Competition?
Prizes:
- 2 Gold Passes for Affiliate Summit East on August 9, 2009 in New York City ($1,800 value) for the best blog overall
- A Full Pass for Adtech London (£835)
- A portrait of you by Mari Kurisato
- A competition winner logo for your blog
- The chance to say you have an award-winning blog

If you could vote for me whilst you are there, that would be really great.

I found a blog competition

Dixon.

Linkscape vs Majestic

There are very, very few “link maps” commercially available to the public. A link map is the hardest element of Google’s search algorithm to replicate. There are really only two companies with commercial link maps available to the masses right now. This article helps you choose between the two.

Who has the best backlink data in the world today? Discounting Yahoo, there are only two world-class systems being developed that I can see. They are Majestic – which has been quietly link walking since 2004 and is only now revealing its hand – and Linkscape – probably the best known in the US – which has had considerable investment from the Rand Foundation (SEOMoz).

I’ve been impressed with both and thought it was time to really put both systems to the test. Which one is better, and which one is priced right?

To clarify – I am looking at the PAID versions of both systems. I covered the following areas:

  • Index Size
  • General Look and Feel
  • Manipulating data
  • Pricing
  • Global reach

Index Size

Both sides could shout about the size of their index. Indeed, Majestic certainly is, claiming that they now have 539 billion URLs indexed – which they say compares to only 170 billion indexed by Yahoo and only 38 billion indexed by Linkscape. In fact, Linkscape’s own meta description puts their number higher, at 54 billion+, but even at this level Majestic’s data (if true) is 10 TIMES the size of Linkscape’s at the moment and about half the size of Google’s. So let’s test this with a few examples – from popular to unknown.
Small site test: http://swanh.org/ (Software Association of New Hampshire)

I chose this one for several reasons. The first is that I have never heard of them; I just went through the DMOZ directory randomly, starting with a state I’ve never been to. The second reason is that they 301 the www version onto the non-www version, which avoids a potential flaw in the results. Third, the site does not have an architecture built upon multiple subdomains.
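As an aside, that www-to-non-www redirect is easy to confirm for yourself. A minimal sketch using Python’s requests library (my choice of tool, not something either service provides):

```python
import requests

# Check whether the www host 301-redirects to the non-www host.
# swanh.org is just the example site discussed above.
resp = requests.get("http://www.swanh.org/", allow_redirects=False, timeout=10)
print(resp.status_code)              # expect 301 if the redirect is in place
print(resp.headers.get("Location"))  # expect something like http://swanh.org/
```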

Majestic found: 5,127 external backlinks from 882 referring domains, with 229 unique anchor texts.
Linkscape found: 25 external links from 6 domains and subdomains. Linkscape only shows the top 50 anchor texts in this report.

Well, on this basis Majestic is absolutely crucifying Linkscape – but let’s be careful… Majestic may be giving so much data that we are not comparing like with like.

Big Site Test: http://BBC.co.uk (The UK’s most well known news brand)

Large sites will be especially interesting to compare because they tend to have many subdomains (like http://news.bbc.co.uk). I tried to find a big site without significant subdomains, but even Wikipedia uses them for languages, so I think we need to accept that any link analysis tool needs to cope with subdomains. So what did we find with the BBC?

SEOMoz found:  16,424,105 links from 315,686 domains/subdomains
Majestic found:  345,383,557 links from 598,475 domains.

Again, Majestic shows considerably more backlinks. Majestic’s data, though, includes 23 million image links, 22 million nofollow links, 1 million, 15 million DELETED and 2.9 million mentions (links in plain text, without a hyperlink). On the other hand, SEOMoz’s number appears to count subdomains as separate domains, instead of limiting their advertised number to the number of top-level domains (TLDs).
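To make the subdomain point concrete, here is a toy illustration (not how either tool actually works) of why counting distinct hostnames and counting distinct root domains give different totals. A real implementation would use the Public Suffix List; the tiny suffix set below is hard-coded just for the example.

```python
from urllib.parse import urlparse

TWO_PART_SUFFIXES = {"co.uk", "org.uk", "ac.uk"}  # tiny, illustrative subset

def root_domain(url: str) -> str:
    """Collapse a URL to its root domain (toy version, for illustration only)."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    if ".".join(parts[-2:]) in TWO_PART_SUFFIXES:
        return ".".join(parts[-3:])   # news.bbc.co.uk -> bbc.co.uk
    return ".".join(parts[-2:])       # en.wikipedia.org -> wikipedia.org

links = ["http://news.bbc.co.uk/", "http://www.bbc.co.uk/", "http://en.wikipedia.org/"]
print(len({urlparse(u).hostname for u in links}))  # 3 distinct domains/subdomains
print(len({root_domain(u) for u in links}))        # 2 distinct root domains
```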

If we take all of Majestic’s deleted links out – and even if SEOMoz’s data had already excluded these (which it doesn’t) – then I think we can safely say that Majestic’s index is considerably more developed than Linkscape’s at the moment. Stripping out the 15 million deleted links still leaves roughly 330 million, some twenty times SEOMoz’s 16.4 million.

How can Majestic’s index be so much larger? Majestic started indexing in 2004, and that’s a lot of crawling time that Linkscape needs to catch up on. In addition, Majestic’s method of collecting data was ingenious – distributed crawlers, similar to the BitTorrent idea of multiple partners using their spare computer downtime to crawl the web. This has given Majestic considerable processing power at a relatively low cost.

General Look and Feel

Majestic’s hands-down win on index size is entirely reversed when it comes to look and feel, where Linkscape is considerably better. Linkscape looks usable – whilst Majestic looks like it was built by a techie who never quite got around to thinking about it all from the user’s point of view.

[Image: clip_image002]

Linkscape lays the data out logically, with a dashboard containing the most important information readily displayed and intuitive tabs to drill down to the referring domains or the URL anchor text. When you delve into the “links to domain” tab, SEOMoz lets you filter the results on the fly. This is an especially nice feature; for example, you can easily hide or include particular types of links. To do the same with Majestic, you need to go right back to the options menu and force a new analysis of the data. You can get the same sorts of data, but it just takes more effort in Majestic and looks better in Linkscape.

By comparison, Majestic tries to display top anchors, top referring domains and top pages all on the same page, offering a drill-down on each table. It’s all too much data for a single screen. This has now been augmented with some new graphs – which are nice… but MORE DATA! I also think people will be confused by the two graphs on this dashboard, entitled “External backlinks discovery for domain.com” and “Referring domains discovery for domain.com”. I know the difference – but I guess you’ll have to look twice… and I would prefer it if these defaulted to cumulative graphs.

Manipulating Data

The thing that strikes me between the two systems is that Linkscape only gives you detailed data about the 50 most common anchor text phrases and the 50 most important links. Looking at www.swanh.org as my example, I also found that all the most important links were internal! Now that may be so – but if I want internal link data I can use Xenu Link Sleuth… it’s external data that I want – and by comparison, Majestic gave me so much that I immediately needed to start filtering out what I felt might not be appropriate.

Majestic shows 200 results per page on screen to SEOMoz’s 50. You can drill down to up to 3,000 of SEOMoz’s results, page by page – but this makes it hard to extract the data.

On both systems you can export the data to a CSV file, and then you get the whole lot! This is incredibly powerful, except that Linkscape limits its export to just under 3,000 URLs, whilst Majestic gives you the complete data dump if you want it all. There is, however, a considerable learning curve here for using Majestic. To get the data you REALLY want, you need to manipulate the “options” and then force a new analysis… THEN you need to download the data into a CSV. That gives you vastly superior information to SEOMoz’s, but it does take a while to be able to see the data from different perspectives.
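Once you have a CSV export, most of the filtering can also be done offline. A minimal sketch, assuming columns named “SourceURL”, “AnchorText” and “NoFollow” – those names are my own guesses for illustration and will not match either tool’s actual export headers exactly:

```python
import csv

def followed_links(path: str):
    """Yield (source URL, anchor text) for rows not flagged as nofollow.

    Column names are assumed for illustration; check the header row of the
    real export and adjust accordingly.
    """
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("NoFollow", "").strip().lower() not in ("1", "true", "yes"):
                yield row["SourceURL"], row["AnchorText"]

for url, anchor in followed_links("backlink_export.csv"):
    print(url, "->", anchor)
```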

Majestic also has some useful tools for power users. You can, for example, group your different accounts (SEOMoz calls them reports) into sub-folders. SEOMoz lets you compare two competitors side by side, but Majestic’s folders would allow you to compare a whole industry sector, if you had enough funds to collect all the data.

Pricing

I am not going to go into pricing for the real high end users, who may be spending several thousand every month to use the data. For mere mortals, the pricing models are very different.

Comparing the prices is like comparing apples and oranges.

Linkscape is part of my SEOMoz Gold membership. That starts at 25 reports a month for about $80. When I run a report, I get the data for that domain at that point in time, and I get to keep it for as long as I want, provided I remain a member of SEOMoz. By contrast, on Majestic I buy access to a domain’s data for a given amount of time – from seven days upwards.

Majestic similarly uses a “credits” system to get around the international issues, but the price of a domain can vary dramatically. In the examples I used, swanh.org cost just a couple of credits, whilst analysing the BBC would cost 600 credits for seven days’ access (or 3,000 for a year’s).

So which is cheaper actually depends on what sites you are analyzing and how you are using the system. If you only have $20 though… you probably only have Majestic as an option.

Functionality

Both systems are function-rich and I have probably missed a few. If either Linkscape or Majestic think I’ve missed a trick here, they both know how to contact me and I will correct the table below – but only for functions available at the date of posting.

 

Function | Linkscape | Majestic
Your own domain for free | No | Yes
Domain quality estimate | MozRank (trying), MozTrust | ACRank (needs work)
External links list | Yes | Yes
Internal links list | Yes | No
Links to URL | Yes | Yes
Ability to filter on the fly | Yes | No
Filter by images | Yes | Yes
Filter noscripts | Yes | No
Filter nofollow | Yes | Yes
Filter offscreen links | Yes | No
Filter same IP number | Yes | Yes
Filter same IP block | Yes | Yes
Filter same subdomain | Yes | Yes
Filter same root domain | Yes | Yes
Filter by frame | No | Yes
Filter by redirect | 301s shown | Yes
Filter deleted links | No | Yes
Filter in/out alt text | No | Yes
Filter mentions | Not tracked | Yes
Filter by specific anchor text | No | Yes
Filter by crawl date | No | Yes
Filter by URL text | No | Yes
By given IP range | No | Yes

Summary

Linkscape is considerably more intuitive at the present time, but there is much more depth of data at Majestic, and for professionals the learning curve will be worth the effort. By contrast, though, SEOMoz has a huge variety of other tools available within its membership fee, which you will still need for Internet marketing even if you do go for Majestic.

Bing Hack: Seeing Geo-targeted results

Bing is going to be all the rage for a few weeks – and hopefully longer. But looking at localized international results from our Bedfordshire office needed a bit of working out. Here’s what we found.

Bing launched yesterday – a few days before everyone thought it would and I have been pleasantly surprised by the generally positive feedback from people within the industry. Last time Microsoft launched a search engine, I can safely say the response wasn’t QUITE so good. In the intervening years they have really tried to engage with the industry and it seems that this has paid off a bit.

A huge amount of work seems to have gone into user intent and user behaviour. This means that results can vary dramatically depending on the nature of the query. The decision to add images to the results, or local listings, or currency exchange rates all depend on what Microsoft can glean about you.

This means they have used some novel GeoTargeting features which are a little unusual – meaning that most people around the world are going to see different results EVEN WHEN THEY TRY TO SEE RESULTS FROM OTHER COUNTRIES.

Let me explain…

The first two “layers” of geo-targeting are cookie based. One defines the country and one defines a much narrower area, like your town. To change the country, you would think that you just need to click on the country button in the top right:

[Image: bing-counyry-code]

But this doesn’t truly give you US results if you are in the UK….

The second cookie-related setting can be modified by typing in a very generic term like “plumbers”. You should get a local result. From there you have a button that lets you change your location:

[Image: bing-local]

This will allow you to draw local results from anywhere. (I switched mine from Ealing to New York.) But this still doesn’t guarantee that you will see proper US results in the main SERPs.

 

To properly see another country’s results you need a third level of geo-targeting: IP location. Looking at the Bing SERPs from a computer hosted in the country you want to review does make a difference. The easiest way to do this is to look through a US proxy from the UK, or a UK proxy from the US. Before you do that, though, you should probably disable all your cookies, or at least set them to the country you are trying to view.

But even THIS isn’t perfect, we found. There is a FOURTH layer which even a proxy server doesn’t fix, and this fourth layer has some unusual idiosyncrasies.

Andy and I worked on this today, and it looks like the Accept-Language header setting is treated differently in the US than in the UK. Changing the Accept-Language header is a bit of a pain, as it gets configured when your browser is installed, although in Firefox I imagine the “Tamper Data” plugin is about to get quite popular, as you can change the setting with it.
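If you would rather script the check than fiddle with a browser plugin, the same test can be sketched with Python’s requests library. The proxy address below is a placeholder, not a real endpoint, and you would still need to inspect the returned HTML to see which market Bing served:

```python
from typing import Optional

import requests

def fetch_bing(query: str, accept_language: str, proxy: Optional[str] = None) -> str:
    """Fetch a Bing SERP with a forced Accept-Language header, optionally via a proxy.

    The proxy URL is a placeholder for illustration; substitute a real proxy
    in the country whose results you want to see.
    """
    proxies = {"http": proxy, "https": proxy} if proxy else None
    resp = requests.get(
        "https://www.bing.com/search",
        params={"q": query},
        headers={"Accept-Language": accept_language},
        proxies=proxies,
        cookies={},   # start with no cookies, as in the test described above
        timeout=15,
    )
    return resp.text

html = fetch_bing("plumbers", "en-GB", proxy="http://us-proxy.example.com:8080")
```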

We used this plugin, with cookies disabled, through a UK proxy and through a US proxy, to see whether IP location or the Accept-Language header took precedence. The results were geeky but interesting:

 

IP location | Accept-Language setting | Result in Bing
US | GB | UK
US | None | US
US | US | US
UK | US | UK
UK | US | UK
UK | AU | AU
UK | CA | CA

The interesting element here is that on a US IP address the Accept-Language setting takes precedence, but from a UK IP address the Accept-Language setting defers to the IP address.

So it seems that Bing does not trust an IP address that appears to come from the US as much as it trusts one from outside the US.

Summary and Conclusion

If you are in the UK and you REALLY want to see what the Americans are seeing on Bing, then you need to do all of the following:

1. Set the country in Bing AND

2. Set the town in Bing AND

3. View through a US based proxy server AND

4. Change the accept language settings by installing a Firefox plugin

If you are in the US, then you only need to do one of the last two steps to be able to see a regional result outside the US.