In his blog post, Matt Cutts describes why the German BMW site was banned from Google. I have to say, the case is not very clear to me.
For one, I consider it quite fair (if not good practice) to show different content depending on client capabilities. If I had a site that depended heavily on AJAX to present information, I would probably take some time to create a simpler version for visitors using text-based browsers (I use Lynx myself every now and then), for bots (like search engine crawlers), or for devices with limited capabilities, like PDAs.
I wouldn’t consider this to be “cheating”. It is obvious that I make my site more accessible to some users, and I would think I am also helping search engines index it. Would Google have punished me in this case? I hope not!
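Just to make the idea concrete, here is a rough sketch of what I have in mind, using Python's standard wsgiref server. The user-agent tokens, routes, and page markup are all made up for illustration; this is not what bmw.de actually did.

```python
# Rough sketch: serve the same information two ways, chosen by User-Agent.
from wsgiref.simple_server import make_server

# Hypothetical tokens for clients that prefer a plain page.
LIMITED_CLIENTS = ("lynx", "w3m", "googlebot")

# Script-heavy version for capable browsers.
FULL_PAGE = (b"<html><body><div id='app'></div>"
             b"<script src='/app.js'></script></body></html>")

# Plain HTML version with the same content, no JavaScript required.
SIMPLE_PAGE = (b"<html><body><h1>Models</h1>"
               b"<ul><li><a href='/models'>Model list</a></li></ul>"
               b"</body></html>")

def app(environ, start_response):
    ua = environ.get("HTTP_USER_AGENT", "").lower()
    # Same information in both variants; only the presentation differs.
    body = SIMPLE_PAGE if any(t in ua for t in LIMITED_CLIENTS) else FULL_PAGE
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

The key point, to my mind, is that both variants carry the same information; only the presentation changes. That is what separates accommodating limited clients from feeding a crawler content that human visitors never see.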
Jeremy Zawodny writes:
Google is not some public utility or government service. They can do whatever they want with their search index: make it better, make it worse, censor it, randomize it, or sell printed copies on a street corner.
Why?
Because it’s their index.
Of course it’s their index. But if they want us to like their index, they have to be fair to us. Admittedly, I do not know the details of the bmw.de case. But if BMW did something similar to what I would have done, I think Google has not been fair to them.
I think you’re missing something. They created a reference farm only for Google. This was not ‘helping’ the search be more accurate; it was pushing their links to the top, and nothing else. Gaming your site’s position can create a ranking that isn’t accurate, and I’m glad to see Google treating BMW the same as anyone else.
Kurt, as I said, I do not know the details. If what you say is true (which I have no reason to doubt), I’m 100% with you.