Duplicate Content will Destroy SEO

Duplicate content will destroy SEO efforts for a website.  The importance of unique content has been emphasized throughout the SEO community.  Developing effective, keyword-rich content that is unique and will rank well within the search engines can be time consuming and expensive.

The size of the internet and the ease with which content can be copied make it easy for individuals to “steal” content and pass it off as their own.  The chances of being caught by the original author are slim at best, but the chances of ranking well with duplicate content may be even slimmer.

Google and the other major search engines have implemented duplicate content filters and will not only impose penalties but may remove the duplicate content entirely from their results pages.

Google defines duplicate content as follows:

Duplicate content generally refers to substantive blocks of content within or across domains that either completely matches other content or is appreciably similar. Most of the time when we see this, it’s unintentional or at least not malicious in origin: forums that generate both regular and stripped-down mobile-targeted pages, store items shown (and — worse yet — linked) via multiple distinct URLs, and so on. In some cases, content is duplicated across domains in an attempt to manipulate search engine rankings or garner more traffic via popular or long-tail queries.

Resist the urge to take the “copy and paste” shortcut in developing content.  Using duplicate content will destroy any SEO for a site.

Using fresh, unique content not only improves search engine ranking in the long run, but also ensures that visitors continue to frequent a website.  Users typically want to find unique content when visiting a website; if a site carries the same content found spread throughout many other websites, their likelihood of returning is low.

If the fact that major search engines perceive duplicate content in a negative light is not incentive enough to develop unique content, the possibility of having a website banned from the search engines should be!  Google states the following about how it deals with duplicate content discovered while indexing websites:

During our crawling and when serving search results, we try hard to index and show pages with distinct information. This filtering means, for instance, that if your site has articles in “regular” and “printer” versions and neither set is blocked in robots.txt or via a noindex meta tag, we’ll choose one version to list. In the rare cases in which we perceive that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. However, we prefer to focus on filtering rather than ranking adjustments … so in the vast majority of cases, the worst thing that’ll befall webmasters is to see the “less desired” version of a page shown in our index.

The key phrase to pay particular attention to is “focus on filtering rather than ranking adjustments,” which clearly indicates that duplicate content will not be ranked but will instead be filtered from the results pages.  Duplicate content is perhaps one of the quickest ways to destroy the SEO of a website.
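Google's own example above — “regular” and “printer” versions of an article — can be handled proactively rather than leaving the choice to the filter. A minimal sketch, assuming the printer-friendly page lives at a hypothetical URL, uses the robots noindex meta tag that Google mentions so only the regular version is indexed:

```html
<!-- Placed in the <head> of the printer-friendly version ONLY
     (the URL and page are hypothetical) -->
<meta name="robots" content="noindex, follow">
```

Alternatively, if all printer-friendly copies sit under one path, they can be blocked wholesale in robots.txt (for example, `Disallow: /print/`), so crawlers never fetch the duplicates in the first place.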

Developing content is time consuming, especially good content that follows sound SEO guidelines.  The temptation to use “free” content from services that publish content via syndication or RSS newsfeeds is great, especially in the early days after a website goes live.  While there is nothing technically wrong with using content from these services, website owners must realize that this type of content will do very little for SEO.

Google and the other search engines will show the content they think is most appropriate for users in each given search.  The site that published the original content is more likely to be the one shown on SERPs, but not always.  Syndicated content is subject to the same duplicate content penalty as any other content published on any website.  Remember, search engine optimization is not about taking shortcuts.  It requires time and dedication to develop effective, keyword-rich content that, when actively promoted through strategic internet marketing mechanisms, improves the ranking of a website.

Do not destroy SEO efforts looking for quick wins.  Think about the rewards when the fruits of all the SEO labour ripen!  Avoid duplicate content and use 301 redirects to ensure that link juice is directed towards the proper content instead of being spread incorrectly throughout the site.
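To illustrate that last point: a 301 (permanent) redirect tells search engines that a duplicate URL has moved for good, so ranking signals consolidate on the preferred version instead of being split between two copies. A minimal sketch for an Apache server's .htaccess file; the domain and paths are hypothetical:

```apache
# .htaccess -- hypothetical example of consolidating duplicate URLs

# Send the non-www hostname to the www version so only one copy is indexed
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Point an old duplicate page permanently at the preferred content
Redirect 301 /old-article.html http://www.example.com/article.html
```

Because the redirect is permanent (301) rather than temporary (302), search engines transfer the old URL's link juice to the new one rather than treating the two addresses as separate, duplicated pages.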


About Barry Wheeler

Barry Wheeler is a blogger, novice SEO, geek and passionate Newfoundlander. Operating several successful websites and online communities, Barry has started exploring the social internet and its impact on all facets of society, including personal life and business relationships. Find Barry on Twitter @barrywheeler and Facebook, or on his website Barry Wheeler - Blogging for Success.
Filed under: Blogging, SEO / SEM

One Response to Duplicate Content will Destroy SEO

  1. When you have good content you have to keep track of where it is on the web. If it’s that good others are probably using it too. It doesn’t make any sense to use content that will not amount to anything on your own site no matter how good the content is that you lifted. Great observation.
