One of the mantras recited by SEO experts is, “Avoid duplicate content.” Duplicate content can harm your search engine rankings, they will tell you, and they are right, but many people erroneously think this means Google will de-list your website.
Google does not punish you for publishing syndicated content or for syndicating your content to others. That said, duplicating your own content across your own site can have negative effects, and syndicating your content may mean you lose position to another website.
Scenario 1: You republish an article that has been syndicated by another website.
If you publish syndicated articles on your site, Google will find two (or more) similar pages on different domains. The article on each page will be nearly identical, but the layout around it may differ. This means that one copy could rank higher in search results than the other, depending on the search term used.
Google will not see both articles and decide that the newer one deserves to be removed from the index. It will decide, based on a wide variety of factors, which article is the most relevant one to show to the searcher. Generally, all else being equal, Google will try to show the “official” original, which it identifies using a number of signals across all the copies. It is entirely possible that Google will display your copy instead if it deems it more appropriate for the searcher.
Some people may tell you that there have been cases of sites being de-listed by Google for running duplicate content. If such sites were de-listed, I’d be willing to bet it was caused by over-optimisation, bad backlink profiles and a general lack of quality and originality, rather than by the duplicate content itself.
Scenario 2: You publish an article that is then re-published by lots of other websites.
In a similar fashion to Scenario 1, Google is going to make a judgement call on your content to work out which version of a page to show. Logically, Google cannot punish you for syndicating your content, because doing so would undermine the entire practice of issuing press releases.
Press releases are duplicated content by design: they are written precisely so that lots of people can republish them, in part or in full. Google would never ride roughshod over this practice by insisting there can only ever be one copy of something online.
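If your syndication partners are willing, one hedge worth asking for is a cross-domain canonical tag on their copies, pointing back at your original. This is only a sketch, and the URL is a placeholder (the same example domain used later in this article):

```html
<!-- Placed in the <head> of the partner site's republished copy. -->
<!-- It tells Google that the URL below is the original, preferred version. -->
<link rel="canonical" href="http://www.websitename.com/title-of-article" />
```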
Scenario 3: You have multiple copies of the same content on your own website.
A common problem is that websites have multiple URLs for the same content. If your content management system rewrites URLs, for example, you may be publishing a friendly URL like www.websitename.com/title-of-article when the underlying page URL is really www.websitename.com/index.php?page=22&story=334. If Google indexes both of these links, it will effectively see two different pages that are exactly the same.
Again, Google’s job here would be to show the most relevant page. In my experience, on sites where this is an issue, Google shows the friendly URL.
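If you would rather not leave that choice to Google, you can declare your preferred URL yourself with a canonical tag. A minimal sketch, using the example URLs above:

```html
<!-- In the <head> of every version of the page, including the raw -->
<!-- index.php?page=22&story=334 one, declaring the friendly URL as canonical: -->
<link rel="canonical" href="http://www.websitename.com/title-of-article" />
```

With this in place, Google should consolidate the duplicates and show the friendly URL in search results.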
Why duplicate content is an SEO problem
The real SEO downside of duplicate content is that Google may not crawl all of your site if it finds duplication there. If your site is so full of duplication that Google’s crawler feels its time is being wasted, your site may lose ranking authority generally. The other key issue is that other sites running your syndicated content may rank above you for those articles if they have stronger domain authority.
One tip to bear in mind if your pages have very little content on them: if there is no body text to outweigh the page furniture (the navigation, the footer, the sidebars and so on), then all of your pages may look the same to Google. Ensure every page has enough differentiated text to avoid accidental duplicate content.
What you can’t do is create ten websites on ten domain names and duplicate the same content across all of them in a bid to hedge your bets. Google will soon spot this low-quality spam tactic, and you will find it far better to put real effort into optimising one domain properly than to try to gain from ten.
Here is a link to an excellent video in which Matt Cutts, head of Google’s webspam team, explains how you can identify original content to Google.