SEOMoz community member Roy Peleg writes about how to make Facebook comments (the ones rendered inside the Facebook Comments Box iframe) crawlable and indexable by search engines, including Google.
He links to a PHP script (http://www.rayhe.net/fb/comments.phps) that pulls the comments out of the Facebook API and inserts them into the page.
So you can use the Facebook Comments Box on your site while serving Googlebot (or any other crawler or browser agent) the comments as plain HTML so they get crawled and indexed. The claim is that this won't be considered cloaking, since you're serving Google exactly what users see (much like creating an HTML version of a Flash website).
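The server-side half of that approach can be sketched in a few lines. This is a minimal illustration, not Roy's actual script: the JSON shape is an assumption modeled on what a Graph API comments query returned at the time, and the function names are mine. The one non-negotiable detail is escaping user-supplied text before printing it into the page.

```python
# Hypothetical sketch: turn comments fetched server-side from the Facebook
# API into plain HTML, so crawlers see the same text users see in the
# comments box. The dict keys ('from', 'name', 'message') are illustrative
# assumptions about the API response shape, not a documented contract.
import html

def render_comments(comments: list) -> str:
    """Render a list of comment dicts as an HTML unordered list,
    escaping user-supplied text to prevent markup injection."""
    items = []
    for c in comments:
        author = html.escape(c.get("from", {}).get("name", "Anonymous"))
        message = html.escape(c.get("message", ""))
        items.append(f"<li><strong>{author}</strong>: {message}</li>")
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

# Example with a stand-in API response:
sample = [
    {"from": {"name": "Jane Doe"}, "message": "Great post <3"},
    {"message": "A comment with no author field"},
]
print(render_comments(sample))
```

The output is static HTML that any user agent, scripting or not, can read.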
However, trying to cloak, even with noble intentions, is wrong in any case. Especially when standard, well-supported techniques let us expose the same content to search engines without any user-agent detection at all. Instead of taking the easiest route, Roy Peleg recommends a technique Google explicitly bans.
What is the easiest way I am talking about?
Using the plain old <noscript> element, which allows authors to provide alternate content when a script is not executed. The content of a <noscript> element should only be rendered by a script-aware user agent in the following cases:
- The user agent is configured not to evaluate scripts.
- The user agent doesn’t support a scripting language invoked by a <script> element earlier in the document.
User agents that do not support client-side scripts must render this element’s contents.
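In practice, that means placing the server-side rendered comments in a <noscript> fallback right next to the Facebook plugin markup. A minimal sketch, with illustrative URLs and content:

```html
<!-- Facebook comments box, rendered by script for normal visitors -->
<div class="fb-comments" data-href="http://example.com/post"></div>

<!-- Fallback for user agents that do not execute scripts, crawlers
     included: the same comments, fetched server-side as plain HTML -->
<noscript>
  <ul>
    <li><strong>Jane Doe</strong>: Great post!</li>
  </ul>
</noscript>
```

Scripting browsers show the live comments box; everything else, Googlebot included, gets the same comments as ordinary HTML. No user-agent sniffing anywhere.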
Easy, peasy and accessible, dear Roy Peleg.