Basically don’t, unless you have to
Every site wants a slot on the first page of search results. Pages get there thanks to carefully planned keywords woven throughout the content. Some site creators, however, do not want to take the time to plan where keywords should go so that the content still flows well. They just want to reach the top as fast as possible, so they turn to this black hat SEO technique.
What is Cloaking in SEO?
Do you remember in Star Trek when ships would use cloaking devices to hide from their enemies? Compare that to this: internet marketers who want their webpages to rank higher than their current spot hide their true content and intent from search engines.
Cloaking is a search engine optimization technique in which the content shown to the user differs from the content presented to search engine crawlers. The technique tricks the search engine crawler into giving a page a higher ranking because the content appears more relevant than it is. It also tricks users into looking at a page that doesn’t have the answers they wanted, which frustrates them. In short, cloaking relies on deception to get ahead.
Search engines do not approve of this technique, and if a page demonstrates cloaking, the search engine will drop that page’s rank or ban it from its index entirely.
How is Cloaking Done?
Cloaking takes many different forms, but it primarily works by identifying the IP address or the User-Agent HTTP header of the client requesting the page. A creator who uses cloaking adds rewrite rules to the site, for example with Apache’s mod_rewrite module in the .htaccess file. The webmaster collects the IP addresses and User-Agent strings of search engine crawlers, and when a request matches a known crawler, the server displays different content than what a typical user would see.
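The mechanism above can be sketched in a few lines of Python, purely to illustrate how the trick works (this is a black hat technique, not a recommendation). The function names, crawler signatures, and page strings here are hypothetical stand-ins:

```python
# Illustrative sketch of User-Agent cloaking. A real cloaking setup
# often also matches crawler IP ranges; this only checks the header.

# Substrings commonly found in crawler User-Agent headers.
CRAWLER_SIGNATURES = ("Googlebot", "Bingbot", "Slurp")

def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known crawler."""
    return any(sig in user_agent for sig in CRAWLER_SIGNATURES)

def choose_content(user_agent: str) -> str:
    """A cloaking server returns different pages to crawlers and users."""
    if is_crawler(user_agent):
        return "keyword-stuffed page written only for the crawler"
    return "actual page shown to human visitors"
```

The point to notice is that the same URL yields two different pages depending on who asks, which is exactly what search engines penalize.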
Some Internet Marketers have tried to fool the search engine crawlers with the following techniques:
- Hidden Text: The internet marketer writes keyword-heavy content that the reader cannot see. They upload the content to the site and make the text the same color as the background. The crawler can read the content, but the user has no idea it’s there.
- Flash-Based Websites: Search engines struggle to crawl sites built with Flash. So, creators with Flash-based websites may choose to avoid rewriting the website entirely. Instead, they create content-rich pages to show the search engines while still serving the Flash-based site to the user.
- HTML-Heavy Websites: Pages should have more content than HTML tags. If the content length comes up short, the writer may cloak extra content to improve the content-to-HTML ratio.
- Image Galleries: Search engine crawlers cannot read images; they do not have eyes as we do. So, creators of image-heavy sites use cloaking to serve crawlers the text they need to rank for keywords.
Is there ever a time to use Cloaking?
Surprisingly, several well-known sites have utilized cloaking and it has not hurt their rankings:
- Google: Google may have an advantage here, but many of its features display one URL in search results and send users to a different URL when clicked.
- ATT.com: This site shows different visitors different landing pages, each with unique content.
What Type of Cloaking is Acceptable?
How do some sites get away with cloaking while others do not? All sites have to abide by the rules, and although search engines do not approve of cloaking, they have allowed some forms of white hat cloaking:
- Geolocation: Showing users a different version of a site based on where they search. Search engines approve of this tactic because it improves the user’s experience rather than hindering it. A person in New York City would not want to see results for restaurants in Los Angeles, so a website like Yelp.com can target users based on location.
- URL rewriting: Search engines accept deleting unnecessary portions of a URL because this practice does not actually change the content on the page shown to either the search engine crawler or the user.
- First Click Free: You know those pesky pop-ups that ask you to subscribe or log in? Crawlers get to bypass those messages and crawl the content without any hindrance.
- Personalized Content: Showing different content based on past user data is a version of cloaking. However, search engines accept it because the crawler receives the same treatment: the content it sees comes from the crawler’s own past data on the site.
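The geolocation case above can be sketched in Python. The lookup table is a hypothetical stand-in (real sites resolve the visitor’s region from a geo-IP database); the key point is that crawlers and users from the same location receive the same content, which is what keeps this white hat:

```python
# Sketch of white-hat geolocation: everyone from the same region sees
# the same page, crawlers included. The region codes and page strings
# are made up for illustration.
RESTAURANT_PAGES = {
    "US-NY": "Top restaurants in New York City",
    "US-CA": "Top restaurants in Los Angeles",
}

def page_for_region(region: str) -> str:
    """Pick content by the visitor's region, with a generic fallback."""
    return RESTAURANT_PAGES.get(region, "Top restaurants near you")
```

Because the choice depends only on location, not on whether the requester is a crawler, there is no deception involved.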
SEOmoz has a ranking system for different variants of Cloaking:
- Near White: Near white cloaking sits close to the white end of the scale and uses geotargeting. Near white pages usually do not experience trouble with search engines dropping their pages.
- Light Gray: Here, sites use redirects to manage duplicate content. Many articles have the option to email to a friend, print or view on-page. If the search engine crawls all three versions then it becomes overwhelmed with the duplicate content. So, site creators include redirects to the original page to help with the confusion.
- Dark Gray: Sites in the dark gray category use cloaking to misrepresent link quality. Creators set up links that pass link juice, then redirect search engine crawlers so that the benefit of those links flows to the pages they want to rank higher.
- Solid Black: Many spam sites engage in solid black cloaking. Any link that sends a user to an unexpected site with no relevant content qualifies as solid black cloaking.
In the end, use your best judgment. If you feel like your actions will mislead users, don’t do it. If you feel as though they will improve the user experience, go for it. Just remember, the search engine has the final say, so don’t be surprised if your site drops in rankings. And don’t think you will get away with cloaking; the search engine will find it, and you will not like the results.
For help building out a successful SEO strategy, contact WebFX today!