Cloaking is a frowned-upon search engine optimization technique in which what the search engine spider indexes differs from what the reader sees in the browser. The technique is often employed to fool an engine into treating a site as relevant to a more popular subject than it actually covers; in some cases, though, webmasters have a legitimate concern that irrelevant content is hurting their position in search, as bots interpret navigation, references, audience comments, or other text as relevant to the subject matter.
Yahoo! brings us a new optimization resource with the ‘robots-nocontent’ tag, which indicates, at least to its crawler, that parts of a page are unrelated to the body and only serve to support the audience of the page. Keywords contained in these tagged sections are ignored. For example:
“The header and boilerplate on Yahoo! Answers might be useful to visitors, but it’s probably not helpful when searching for this particular page. The ‘robots-nocontent’ tag allows you to identify that for our crawler in order to improve the targeting and the abstract for the page.”
4 ways to apply robots-nocontent attributes
- <div class="robots-nocontent">This is some content on your site that you want ignored</div>
- <span class="robots-nocontent">You don’t want this indexed by search engines either</span>
- <p class="robots-nocontent">This might be a privacy or legal statement on the page that has nothing to do with the copy</p>
- <div class="robots-nocontent">This is just poorly written and I’m embarrassed to have search engines crawl it.</div>
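To make the behavior concrete, here is a minimal sketch of what a crawler honoring the class might do before indexing — drop all text inside any element carrying the robots-nocontent class. This is purely an illustration using Python’s standard html.parser, not Yahoo!’s actual implementation:

```python
from html.parser import HTMLParser

class NoContentFilter(HTMLParser):
    """Collects page text, skipping anything inside an element
    tagged class="robots-nocontent".

    Note: a simple depth counter like this can be thrown off by
    void elements (e.g. <br>) inside a skipped section; it is a
    sketch, not a production parser.
    """
    def __init__(self):
        super().__init__()
        self.skip_depth = 0   # > 0 while inside a nocontent element
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        # Once skipping, every nested tag deepens the skipped region.
        if self.skip_depth or "robots-nocontent" in classes:
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.chunks.append(data)

def indexable_text(html):
    """Return the text a class-aware crawler would keep for indexing."""
    f = NoContentFilter()
    f.feed(html)
    return " ".join(" ".join(f.chunks).split())
```

So a page such as `<p>Keep</p><div class="robots-nocontent">Drop <b>me</b></div><span>Also keep</span>` would contribute only “Keep Also keep” to the index.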
paul – great work on the blog and the SEO stuff. love it!!
My personal opinion is that Yahoo did not think this through. Using a class attribute to relay information to a robot is wrong, both semantically and technically.
While the microformat for “nofollow” is viable as a webmaster/blogger tool, there needs to be a standard that doesn’t rely on classes, attributes, and other confusing (often misleading) forms of communication.
My proposal is a natural progression into the body of the HTML: a <robots attr="value"> tag. And instead of marking up what is not “content”, simply mark up what is. ‘The’ content of the page normally starts with an <h1> tag and proceeds from there … does it not?