What Is Cloaking In SEO: Everything You Need To Know
SEO cloaking is a black hat technique that aims to show search engines what they want to see while actual users see something else.
By serving bots different content than humans, cloaking is used to trick search engines or mislead visitors. Sometimes, it is also used for quick (but problematic) website fixes.
What Is Cloaking In SEO?
Cloaking in SEO means serving different versions of content to (human) users and search engines.
Cloaking is a black hat SEO tactic that decides which content to deliver based on who is requesting the page.
To do that, the server analyzes each request via the user’s IP address, user-agent string, HTTP headers, etc. Once the requester is identified as a search engine bot, server-side scripts serve it a different version of the webpage.
SEO cloaking is mainly used for shady advertising. We often see it in gambling and adult content websites or free software downloads.
Sometimes, developers apply cloaking techniques for easy (but outdated) website fixes that simply cover up an SEO or design issue instead of fixing it.
In any case, Google penalizes black hat SEO cloaking by demoting a website in search results or removing it from the index for good.
What Is Cloaking Used For?
Today in SEO, cloaking aims to deceive search engines or people in various ways:
- Achieve higher rankings with content that does not follow acceptable SEO principles, e.g., hidden text with keyword stuffing.
- Get a page to rank with content that differs from the page Title/Description in search results, e.g., delivering adult content cloaked into non-pornographic search results.
- Hackers can cloak your website and redirect your visitors to their (unwanted) content. In this case, if you don't act fast, your website could be penalized by Google.
- Disguised cloaked websites can also infect users' computers with malware.
- In advertising, IP cloaking can be used to detect competitors visiting your website and present them with different content, e.g., different product prices.
- Cloaking serves search engines with an alternative version of a page for content that is not searchable, e.g., Flash or JavaScript-based websites.
- Developers can use cloaking techniques to quickly patch serious SEO issues until (or instead of) properly fixing them with up-to-date procedures.
So, Are There Any ‘White Hat’ Cloaking Techniques?
Not really. Cloaking, meaning to hide, cover or disguise, is a practice that tricks search engines by showing dissimilar content to bots and visitors.
However, there are some legitimate cases where you’d want to serve disparate versions of content to visitors and bots.
Or present different visitors with different content.
Or even automatically divert users from one page to another.
But here, you can use redirects, hreflang implementations, prerendering, etc. Unlike cloaking and tricking, such practices give bots a valid reason why you have two (or more) versions of the same webpage.
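As a minimal sketch of the difference (in Python, with hypothetical paths), a legitimate redirect gives every client, crawler or human, the same answer - so there is nothing to ‘cloak’:

```python
def handle_request(path: str) -> tuple[int, str]:
    """Legitimate redirect: every client, bot or human, gets the same
    301 pointing at the new URL. The paths are purely illustrative."""
    moved = {"/old-page": "/new-page"}
    if path in moved:
        return 301, moved[path]  # identical answer for Googlebot and users
    return 200, path
```

Because the response never depends on who is asking, search engines see exactly what visitors see.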
How Does Cloaking Work?
Search engine optimization cloaking relies primarily on two steps to deceive crawlers:
- Identify where the request is coming from (crawler or real visitor).
- Serve search engines altered or different content, disguised as compliant content.
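The two steps above can be sketched in a few lines of Python; the bot signatures and page strings are purely illustrative (and shown only to explain the mechanism, not to endorse it):

```python
# Hypothetical crawler tokens; real crawlers identify themselves
# in the user-agent string they send with each request.
BOT_SIGNATURES = ("googlebot", "bingbot", "duckduckbot")

def select_content(user_agent: str) -> str:
    """Black hat cloaking in a nutshell: branch on who is asking."""
    if any(sig in user_agent.lower() for sig in BOT_SIGNATURES):
        return "crawler version"  # keyword-optimized page for bots
    return "visitor version"      # different page for humans
```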
What Are The Different Types Of Cloaking In SEO?
There are various ways to identify where a website request is coming from.
Here are the most common types of cloaking in SEO:
1. IP-based Cloaking
IP cloaking means serving different web page versions depending on a user’s IP address.
It’s a relatively easy way to cloak a website: once your server recognizes known search engine IP addresses, it serves them whatever content you want.
What’s An IP Address?
IP (Internet Protocol) addresses are numerical identifiers that provide information about the host's location and establish a communication path to that host.
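A minimal sketch of such an IP check, using Python’s standard `ipaddress` module. The network range below is a commonly cited Googlebot range, but treat it as illustrative - real crawler ranges are published by the search engines themselves and change over time:

```python
import ipaddress

# Illustrative crawler range; do not rely on it in production.
CRAWLER_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]

def is_crawler_ip(addr: str) -> bool:
    """Return True if the requesting IP falls inside a known crawler range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in CRAWLER_NETWORKS)
```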
2. User Agent Cloaking
This is one of the most common types of cloaking in SEO. When bots visit a page, they are recognized via the user-agent information their web browser or other client sends to the server.
Web crawlers and browsers have unique identifiers called user-agent strings.
These identifiers help servers recognize, for example, when a page is accessed via a mobile phone and serve a mobile variant of the website, which is a perfectly valid practice.
But in black hat SEO cloaking, the server uses the crawler’s user-agent string to recognize bots and show them altered content.
What’s A User Agent?
A user agent string is a line of text that gives information on the software, browser, or crawler (i.e., the ‘agent’) making a request to a website. It contains information on the type of agent, the operating system, device type, etc.
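A hedged sketch of how a server might classify clients from the user-agent string - the substring matching below is a simplification of what real user-agent parsers do:

```python
def classify_client(user_agent: str) -> str:
    """Classify a request as bot, mobile, or desktop from its user agent.

    The mobile branch is the legitimate use case mentioned above:
    serving a mobile layout to phones is perfectly valid."""
    ua = user_agent.lower()
    if "googlebot" in ua:
        return "bot"      # cloaking would branch on this to serve fake content
    if "mobile" in ua:
        return "mobile"   # legitimate: serve the mobile variant of the site
    return "desktop"
```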
3. HTTP Header Cloaking
Another way to identify who makes a request to your website is via HTTP headers. Your server can examine the HTTP headers of incoming requests and determine whether the visitor is virtual or ‘flesh and blood’.
What Are HTTP Headers?
HTTP headers include a list of strings that pass information between the client (browser, mobile app, etc.) and the server via HTTP request and response. Such information may include data for the encoding type, client identification or session verification.
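As an illustrative (and deliberately naive) heuristic, a server could flag requests that lack headers a normal browser almost always sends, such as `Accept-Language`:

```python
def looks_automated(headers: dict[str, str]) -> bool:
    """Naive header inspection: browsers normally send Accept-Language,
    while many simple bots and scripts do not. Illustrative only -
    real bot detection combines many stronger signals."""
    normalized = {k.lower() for k in headers}
    return "accept-language" not in normalized
```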
4. GeoIP Cloaking
When it comes to geolocation, you can identify a user's location via their cookie information, certain login details, or a GeoIP lookup on their IP address.
In international SEO, serving targeted/different content to users based on their location is a great way to optimize your website for local searches.
However, here, too, you need to treat Googlebot (or other bots) like a typical user from the location it connects from - not serve it special, country-specific content of its own. According to Google, that would be cloaking.
What Is GeoIP?
GeoIP identifies user location based on their IP address. It does not reveal specific user locations with latitude and longitude coordinates as geolocation does.
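A toy GeoIP lookup in Python; real services ship full IP-to-location databases, and the networks below come from reserved documentation ranges, so the country mappings are purely illustrative:

```python
import ipaddress

# Hypothetical GeoIP table keyed by network prefix (documentation ranges).
GEO_TABLE = {
    ipaddress.ip_network("203.0.113.0/24"): "DE",
    ipaddress.ip_network("198.51.100.0/24"): "US",
}

def country_for(addr: str) -> str:
    """Map an IP address to a coarse country code, or 'unknown'."""
    ip = ipaddress.ip_address(addr)
    for net, country in GEO_TABLE.items():
        if ip in net:
            return country
    return "unknown"
```

The safe pattern is to use such a lookup identically for bots and humans: a crawler connecting from a German IP should simply get the German version, like any other German visitor.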
5. Browser Cloaking
This is a less common method, similar to user-agent cloaking. Browser cloaking means serving different content depending on the visitor's browser - and serving crawlers yet another version.
6. CNAME Cloaking (or DNS Cloaking)
CNAME cloaking uses DNS records to disguise third-party domains as part of the first-party domain so they can track user activity and receive browser data without the user's knowledge or consent.
In essence, DNS cloaking sends data of unsuspecting users to third parties, usually advertisers. It tracks IP addresses, operating systems, browser types, etc. - in violation of privacy regulations.
It also lets third-party trackers bypass ad blockers for the sake of data collection.
What Are DNS Records?
DNS records are instructions that live in dedicated DNS servers and provide information about a domain. This information may include what IP address is associated with that domain, how email should be handled for that domain, as well as how to handle requests for that domain.
What Are CNAME Records?
A CNAME record (Canonical Name) is a type of DNS record that maps one domain name (an alias) to another, the canonical name.
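Conceptually, a CNAME record is just an alias pointing at a canonical name. The toy resolver below (with hypothetical hostnames) shows how a first-party-looking subdomain can quietly resolve to a third-party tracker:

```python
# Hypothetical DNS zone: the alias looks first-party, the canonical
# name belongs to a third-party tracking vendor.
DNS_RECORDS = {
    "metrics.example.com": "collect.tracker-vendor.net",  # CNAME record
}

def resolve(hostname: str) -> str:
    """Follow CNAME aliases until we reach a canonical name."""
    while hostname in DNS_RECORDS:
        hostname = DNS_RECORDS[hostname]
    return hostname
```

To the browser (and many ad blockers), requests go to `metrics.example.com`; at the DNS level, the traffic actually reaches the vendor’s domain.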
7. Referrer Cloaking
Here, content is altered depending on where the user came from, i.e., a referring website or source.
This technique often applies in ‘unethical’ affiliate marketing, but you can also use it legally in link cloaking.
Affiliate link cloaking is a (valid) practice that disguises a complicated URL containing the “ref” parameter in order to shorten it and make it look trustworthy.
If your ‘disguised’ link represents an actual URL of a ‘real’ landing page, this is a white hat SEO technique that enhances user experience and CTR.
Nowadays, SEOs refer to this practice as affiliate link redirects.
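A sketch of the referrer check itself (illustrative only, not an endorsement): the response branches on where the visitor came from:

```python
def page_for(referrer: str) -> str:
    """Referrer cloaking in a nutshell: search traffic sees one page,
    everyone else sees another. Page names are hypothetical."""
    if "google." in referrer:
        return "clean review page"        # shown to search-referred visitors
    return "aggressive affiliate offer"   # shown to direct/other traffic
```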
8. JavaScript Cloaking
Some SEO cloaking techniques detect the user agent and then use JavaScript to alter the content after the page loads.
However, when you have a JavaScript website, you can present search engines with an SEO-friendly HTML version while following legitimate SEO practices.
Prerendering Vs. Cloaking
Cloaking violates search engine guidelines. It’s a black hat SEO technique that deceitfully shows one version of content to search engines and another to users in order to make a page look legitimate.
Prerendering, on the other hand, is a technique used to generate static HTML versions of JavaScript-based web pages. The website versions served to search engine spiders are identical or very similar to what users see on JavaScript websites.
Prerendering improves user experience and it is an acceptable SEO approach.
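The contrast can be sketched as a dynamic-rendering-style dispatcher (hypothetical content and paths): bots receive a static HTML snapshot, humans receive the JavaScript app shell, and both ultimately render the same content - which is what keeps it on the right side of the guidelines:

```python
# Hypothetical prerendered snapshot and app shell.
PRERENDERED_HTML = "<h1>Product review</h1><p>Full article text.</p>"
JS_APP_SHELL = '<div id="app"></div><script src="/app.js"></script>'

def respond(user_agent: str) -> str:
    """Serve a prerendered snapshot to crawlers, the JS app to humans.

    Unlike cloaking, both branches render the SAME content;
    only the delivery mechanism differs."""
    if any(bot in user_agent.lower() for bot in ("googlebot", "bingbot")):
        return PRERENDERED_HTML  # static snapshot for crawlers
    return JS_APP_SHELL          # the browser runs the JavaScript itself
```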
Quality SEO Ensures Long-term Success
When it comes to SEO, following white-hat practices and adapting to the latest trends pave the road to lasting success.
Website trust and authority are built step by step along the ranking race. Any setback costs time and resources.
At Atropos Digital, we create an effective SEO roadmap that elevates your website and brings your pages in front of prospects looking for your services.