Displaying the same content at more than one URL is known as duplicate content.
According to Google: "Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin."
Here is a simple explanation of "content within or across domains" and "content is appreciably similar":
Content within or across domains:
If the same content appears within a single domain at different URLs, it is treated as duplicate content.
If the same content appears across multiple domains, it is also considered duplicate content.
Content is appreciably similar:
Here, similar content means content produced with synonyms, copied and spun versions of existing content, or scraped content.
Sometimes duplicate content arises for technical reasons: for example,
your post is available at both the "www" and non-"www" versions of your domain, or at both the HTTP and HTTPS URLs.
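As a rough sketch of why this counts as duplication, the same post can be reachable at four technically distinct URLs. The helper below (a hypothetical `canonical_form` function, with `example.com` as a placeholder domain) collapses them to one:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_form(url):
    """Normalize scheme and host so technical variants map to one URL."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]  # drop the "www." prefix
    # treat http and https as the same page for comparison purposes
    return urlunsplit(("https", host, parts.path, parts.query, ""))

variants = [
    "http://example.com/post",
    "http://www.example.com/post",
    "https://example.com/post",
    "https://www.example.com/post",
]
# all four technical variants collapse to a single canonical form
print({canonical_form(u) for u in variants})
```

From a crawler's point of view these are four separate pages until the site tells it otherwise, which is exactly the duplication described above.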
Impact of duplicate content on your site:
Google does not have a duplicate content penalty, but duplicate content still harms a site's health.
Instead of taking manual action, Google downgrades the ranking of duplicate content.
Before going deeper, let's take a look at Google Q&A #4, Duplicate Content:
The above video states that:
According to Google's Andrey Lipattsev, "Google does not have a duplicate content penalty."
There is no duplicate content penalty:
Yes, it's true: Google doesn't penalize copied content. However, copied content doesn't outrank unique content. Google doesn't treat this as a penalty, because ranking depends on uniqueness and quality.
So Google keeps uniqueness in mind: it filters the results and places the most suitable, unique content at the top of the SERP.
Google rewards uniqueness:
Google's signal is clear: unique content is rewarded with better ranking.
Filtering out duplicate content:
Google filters out copied content and demotes its ranking, even when it would otherwise be relevant to the SERP.
Duplicate content slows down search results:
The search engine has to spend more effort sorting out the results; the more duplicate content it must process, the harder it is to surface the unique results.
The XML sitemap is a technical method to discover new content:
A sitemap not only helps with indexing of the content; it also helps search engines discover new content.
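For reference, a minimal sitemap entry looks like this (the URL and date below are placeholders, following the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/new-post/</loc>
    <lastmod>2019-09-06</lastmod>
  </url>
</urlset>
```

Listing only the canonical version of each URL here keeps the sitemap itself from pointing crawlers at duplicates.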
Let's take a question from a Webmaster Hangout to make this more concrete.
How Google categorizes duplicate content types:
HTTP and HTTPS URL content
WWW and non-WWW URL content
Dynamically generated URL content
Reused content with minor changes, such as synonyms, spinning, and rephrasing
In these cases, Google doesn't penalize the copy, but it will not rank above unique content.
Content without added value:
If you write content that doesn't add value to the topic, Google considers it thin content.
Reuse of content with the author's permission:
Syndicated content is not an SEO violation, but it still affects ranking.
For example, you take the author's permission and reuse the content on your blog.
You and the author publish the same content simultaneously, and the author's copy carries the rel canonical tag.
As a result, your copy scores lower on uniqueness.
Use of content without the author's consent:
In this case, you can take action against the scraper by reporting it to Google.
HTTP and HTTPS URL content:
This duplication typically starts when a site moves from HTTP to HTTPS.
It's a technical error and a common problem.
Using HTTPS instead of HTTP is good practice, and the duplication can be fixed with 301 redirects.
WWW and non-WWW URL content:
This technical duplication is similar to the HTTP/HTTPS case and can also be resolved with 301 redirects.
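Assuming an Apache server (one common setup, not the only way to do this), a single .htaccess rule set can send both the HTTP and the "www" variants to one canonical HTTPS host; `example.com` is a placeholder:

```apache
# Force HTTPS and the non-www host with a single 301 redirect
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```

Using `R=301` (permanent) rather than the default `R=302` is what tells search engines to consolidate ranking signals onto the target URL.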
Dynamically generated URL content:
Dynamically generated URLs (for example, URLs carrying sorting or session parameters) can differ slightly from the original URL to serve user preferences, and there can be many of them.
How to resolve the duplicate content issue:
You can identify duplicate content using tools like Siteliner.
Once you have identified the duplicate content, you can resolve it with the following methods:
Rel canonical tag
Linking back to the original content
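Tools like Siteliner work by measuring how much text two pages share. As a rough illustration only (not Siteliner's actual algorithm), Python's standard difflib can score the overlap between two passages:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Return a 0..1 ratio of how much two text blocks overlap."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

original = "Google does not have a duplicate content penalty."
spun     = "Google does not have any duplicate content penalty."
# a lightly spun copy still scores very close to 1.0
print(round(similarity(original, spun), 2))
```

A high ratio between two of your own pages is a hint that one of them should redirect to, or declare as canonical, the other.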
Rel canonical tag:
Insert the rel canonical tag to identify the original content.
It helps Googlebot identify the original page and avoids wasting crawl budget.
<link rel="canonical" href="URL of original page" />
The canonical tag should always be placed inside the <head> section of the HTML.
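To check that a page actually carries the tag in its head, you can parse the HTML. Here is a minimal sketch using Python's standard html.parser; the page string stands in for a fetched page:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of any <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

page = '<html><head><link rel="canonical" href="https://example.com/post/" /></head></html>'
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # the declared original URL, or None if missing
```

If the finder returns None for a page that duplicates another, that page is a candidate for adding the tag.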
A redirect is the process of forwarding one URL to a different URL.
A 301 (permanent) redirect is the best method to link duplicate content back to the original.
Information published by Google on duplicate content:
Search Quality Evaluator Guidelines, published on September 5, 2019
"DYK Google doesn't have a duplicate content penalty, but having many URLs serving the same content burns crawl budget and may dilute signals" (Gary "鯨理" Illyes, 12 Feb 2017)
Google Advice: Duplicate Content on Product & Category Pages
English Google Webmaster Central office-hours from September 6, 2019
If I report the same news story as someone else, is that duplicate content? 16 May 2012
English Google Webmaster Central office-hours hangout 9 JAN 2018
Duplicate content is a common problem; you can find it on many websites. Always try to remove duplicate content from your website to improve your ranking and keep your pages from being filtered out of the results.