Duplicate Content: The Effects on Search Engine Rankings

Duplicate content is one of the major problems that most websites face. Search engines are also affected, since they waste valuable resources crawling, analyzing and detecting those pages. Finally, since duplicate content reduces the quality of the search results, the problem also affects search engine users. So how can we solve this problem once and for all?

In this article we will focus on the root of the problem: we'll see why it can be a major issue for search engines and for websites, and we'll explain how duplicate content affects SEO. In the next article we'll examine in detail the most common web development and SEO mistakes that lead to duplicate content problems, and we'll suggest ways to solve the issue.

What is duplicate content?

The term duplicate content describes the situation where multiple URLs serve the same or almost the same content. Note that those pages can be part of the same website or of different websites.

How is duplicate content created?

One of the most common ways this happens is by copy-pasting the same text into different pages or by submitting the same content/article/review to multiple sites. Additionally, duplicate content can be the result of poor web development techniques or of a badly designed link structure.

Why is duplicate content a problem?

In order to understand why duplicate content is a problem, you need to see it from the search engine's point of view. Search engines need to crawl pages, analyze and index them, estimate the reputation of each page, and be able to search their index fast in order to return results to users. Having lots of duplicate content in a website is bad for search engines, since they waste their resources on pages that usually have no significant value for users.
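To make the "analyze and detect" step concrete, here is a minimal sketch of one classic near-duplicate detection technique: Jaccard similarity over word shingles. This is a textbook illustration with made-up page texts, not the algorithm any particular search engine actually uses:

```python
# Near-duplicate detection sketch: Jaccard similarity of word shingles.
# (A textbook technique for illustration; not Google's actual algorithm.)
def shingles(text, k=3):
    """Return the set of k-word sequences appearing in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Similarity between 0.0 (no shared shingles) and 1.0 (identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

# Two hypothetical product pages that differ by only a couple of words:
page_a = "buy the new xbox 360 console with free shipping and two controllers"
page_b = "buy the new xbox 360 console with free shipping and one controller"

print(round(jaccard(page_a, page_b), 2))  # a high score flags near-duplicates
```

Even this tiny example hints at the cost: every candidate pair of pages requires tokenizing, building shingle sets and comparing them, which is far more CPU- and memory-intensive than simply fetching the pages.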

Matt Cutts, a well-known Google employee, has mentioned in a Google Webmaster Help video that in order to crawl a large part of the web you need a relatively small number of machines (more than 25, fewer than 1000). This means that crawling a website requires a relatively small amount of resources. Nevertheless, analyzing the pages, evaluating the links and indexing them is a much more time-consuming process. Those of you who have coded web spiders in the past know that the analysis requires a lot of CPU and memory compared to the web requests, due to the complexity of the algorithms used in text analysis.

Clearly duplicate content is a problem for search engine users because it affects the quality of the search results. But why is this a problem for webmasters? Well, since the problem requires additional resources that cost search engine companies money, they try to push webmasters and SEOs to help them solve it. And the cheapest way to solve it is by motivating webmasters to eliminate their duplicate pages.

Does the duplicate content issue affect rankings?

Even if duplicate content does not directly lead to bans from search engines (read the article "Why my SEO campaign failed? Part 1: Common On-page Optimization mistakes"), it does affect the SEO status of a website. When search engines identify cases of duplicate content, they try to determine which version of the page should appear in the search results. Normally this choice is based on the age of the page, the authority of the domain, the number of incoming links, the PageRank etc. So if a few pages of your site contain lots of text copy-pasted from other pages or websites, there is a good chance that they will not appear in the search results.


Additionally, as we said before, duplicate pages can be the result of poor programming or link structure development. Dynamic websites usually pass variables in each dynamic URL in order to fetch a particular record from the database:
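As an illustration, hypothetical URLs like the following (the domain and script name are placeholders; the id 3012 and color parameter come from the paragraph below) all point at the same database record:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical URLs for the same product record (id 3012), one per color.
urls = [
    "https://example.com/product.php?id=3012&color=black",
    "https://example.com/product.php?id=3012&color=white",
    "https://example.com/product.php?id=3012&color=red",
]

# Every URL resolves to the same product id; only the color parameter
# varies, so the pages are near-duplicates from a search engine's view.
ids = {parse_qs(urlparse(u).query)["id"][0] for u in urls}
print(ids)  # {'3012'}
```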


In the above example the product with id 3012 could have lots of different pages (one for every color). If the content of each page does not change significantly, this can lead to duplicate content problems.

The PageRank distribution is also negatively affected by the presence of duplicate pages. Since PageRank flows through links, a lot of important link juice is directed to duplicate pages or evaporates. As a result, the rankings of the website suffer.
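The dilution effect can be demonstrated with a toy PageRank computation. The following is a simplified power-iteration sketch over a hypothetical three-page site, not a real search engine's implementation: when the home page splits its links between a product page and a duplicate of it, the score of each copy drops well below what the single page would have earned.

```python
# Toy power-iteration PageRank with the standard 0.85 damping factor.
def pagerank(links, iters=50, d=0.85):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        pr = {
            p: (1 - d) / n
            + d * sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages
        }
    return pr

# Without duplication: home links only to the product page.
clean = pagerank({"home": ["product"], "product": ["home"]})

# With duplication: half of home's link juice leaks to the duplicate URL.
dup = pagerank({
    "home": ["product", "product_copy"],
    "product": ["home"],
    "product_copy": ["home"],
})

print(round(clean["product"], 3), round(dup["product"], 3))
```

In this toy graph the product page scores 0.5 on its own, but roughly half that once a duplicate URL competes for the same inbound links.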

Finally, another reason why duplicate content can negatively affect your rankings is that search engines crawl and index only a certain number of pages from every website, depending on the domain's authority. If your site contains lots of duplicate pages, the re-crawl period will increase and the important new pages that you add to your website will be indexed much more slowly.

Should duplicate content be a problem for all websites?

Certainly, having the same content in many different pages of a website is not particularly useful for users, but this does not mean that it's a critical mistake. As mentioned above, it becomes a major problem for webmasters because it affects their search engine rankings.

Generally speaking, in most cases unique quality content is very important for users. But the question is: should duplicate content affect all webmasters, or are there cases where it should not affect their SEO status? Many users have asked Matt Cutts in the past whether it is a problem for online stores to use the same generic product descriptions that other websites also use. Matt Cutts replied that this is a problem for their SEO campaigns and that if they want to attract more users they should do so by providing unique quality content and by differentiating themselves from other e-commerce sites. I am sorry, Matt, but I have to strongly disagree with you on that.

My personal opinion is that when a user searches for a particular branded product, for example the new Xbox 360 console, he/she does not really care about the well-written description, the number of incoming links of the domain, the PageRank of the page, the authority etc. He/she cares about the price, the product and the services that come with it. Using the same algorithms or principles to evaluate blogs and e-commerce websites is not the best practice. Even if I do understand that there are lots of technical difficulties in evaluating products correctly, you still can't force or suggest that e-tailers spend lots of time and effort rewriting their product descriptions in order to avoid a duplicate content disaster. E-tailers don't differentiate themselves from the competition by providing unique descriptions but by providing unique quality products and services.

That was the first part of the article. The next part will focus on more technical subjects and on how to solve the duplicate content problem.

Images by searchenginejournal, seodenver

  • Bonnie

Simply put, posting duplicate content on your site is a futile effort. You put something on your site that Google will never index. It could have some benefit to your loyal readers, though. However, I've had a couple of experiences where some good-for-nothing SEO guy copied content from my site and submitted it to article directories claiming credit for the article. And because the directory has a high authority, that article is still indexed by Google. I just wish some of these directories would be more discerning.

  • Search Engine Optimization Company

It happens when two or more websites have the same content. There are a few causes of duplicate content, for example when a blog picks up an article and re-posts it on multiple sites with different domains but similar content.

  • Geoff @ Beer Club

Running an e-commerce website, I think there is value in creating original product descriptions. It's easy to just use the standard copy given by the product manufacturer, but consumers are likely to appreciate something a bit different from the norm. Plus it helps online businesses stand out from competitors. And it doesn't really take that much effort.

  • smo package

As per current SEO norms, we can see how important content is for promotion work as well as for websites. We can see that Google's last update was all focused on content: if content is unique, it will be good for website marketing.

  • SEO Expert

I think the duplicate content issue could be a great debatable issue. Everyone says duplicate content is not good for website promotion, and even Google says the same thing, but the burning question is still unanswered: how does Google penalize a site for duplicate content? What are the criteria?

  • Employment contract

I think our blog duplicates content! It shows the same info in the summary and in the blog contents. Would this be seen as duplicate content by the search engines?

  • Search Engine Optimisation Professional

    Duplicate content is an issue and can be caused by many activities, mainly intentional copying in my personal opinion.

Canonical links are a way of dealing with this issue, and one rule I certainly adhere to is to place canonical links within all web pages I create and work with.

  • Blogging & Duplicate Content

    What if I post blog posts in multiple web locations? Will it be harder for people to find the article?

    I write a blog and get the material posted on up to a dozen other blogs on the web. I’ve thought that it would increase my visibility. Do you think I’m actually hurting my SEO by placing the material in too many places? If I have a blog on my personal website – should I not be sending that material to another more trafficked site with a nice bio on me sending people back to my site?

  • Greg Holbert

I am also a little confused by this. Should we only post our articles on our blogs, and not on other article directories? Or is it more beneficial to send the article to a submission site and not post it anywhere else? I don't want to be a black hat SEO, and I've posted my articles on multiple sites before…

  • seomavin

    Posting the same article on multiple sites has no positive effect on your rankings. It is just creating duplicate content. It used to work about 2 years ago, but now Google will discount it.

    If you are building quality content for your site, post the article on your site. If you are building backlinks post it on a good site with backlinks to your site. Don’t copy and paste it all over the web as this is now a useless procedure.

  • Door Levers

This is always something that annoys me. Duplicate content can bring down the many hours of hard work that go into preparing the copy in the first place.
