WordPress and Duplicate Content

WordPress is my blogging engine of choice – I’ve been through two others, and WordPress is my favorite by far. That doesn’t, however, make it flawless or perfect by any stretch of the imagination.

One of the biggest faults of WordPress is duplicate content. Between the post itself, the index, archives, feeds, and tags / categories, you can have the same content appear in 5 or more places at any given time. Google, master of all search engines, does not like duplicate content at all. And we all want to make Google happy – it does, after all, control the traffic on the internet.

One of my sites recently took a nosedive in all Google rankings – the traffic drop has been somewhere around 75%. It’s severe – the income from the site took a similar nosedive. I don’t know what caused the drop, but I’ll make changes that can have only a positive effect, and hopefully regain the lost rankings in a short period of time.

DupPrevent is a WordPress plugin that helps prevent search engines (Google included) from indexing the same content more than once. In a nutshell, it sets up META NOINDEX tags on pages like tag and archive listings. In addition, DupPrevent has a robots.txt file included in the .zip – this will also assist with duplicate content issues.
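For anyone curious what these two mechanisms actually look like, here’s a rough sketch. I haven’t reproduced DupPrevent’s exact files here – the paths below are illustrative and depend entirely on your permalink settings, so adjust them to match your own URLs:

```
# Illustrative robots.txt for a typical WordPress blog
# (paths are examples only – check them against your own permalink structure)
User-agent: *
Disallow: /feed/
Disallow: /tag/
Disallow: /category/
Disallow: /wp-admin/
```

The META NOINDEX side works differently: those pages stay crawlable, but a tag along the lines of `<meta name="robots" content="noindex,follow">` in the page head tells search engines not to add the page to their index while still following its links. That’s generally the safer option for archive pages, since blocking them outright in robots.txt also stops crawlers from discovering the posts they link to.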

I set up this plugin today on my afflicted blog, as well as this one. The robots.txt files are also now firmly in place. Hopefully within a few days, everything will be back to normal or better – I don’t want Google as a long-term enemy.

3 Comments

  1. Rian Brooklyn said

    April 28 2007 @ 9:21 am

    Thanks for the plugin resource. I’ve been curious as to how to handle this. I’ve heard of people having their RSS feeds outrank their blog URL, or getting severely penalized by Google. Some of this can be fixed in robots.txt, but a plugin to prevent the indexing sounds great.

  2. Al Davies said

    April 30 2007 @ 9:35 pm

    Thanks for the tip about the plugin. Has it made a difference on your “afflicted blog”?

  3. Leroy Brown said

    May 1 2007 @ 9:09 am

    Yup – robots.txt is the best way, but hey, we’re all a little lazy, and I’m big on the easy way.

    And as a little follow-up, my afflicted blog did regain rankings & traffic – to levels never seen before. Whether it’s from the plugin / changes I made here, or something else, no one will ever know. But it certainly didn’t hurt.

Comment RSS


GreenLlama

    Green Llama is a place to learn about making money online. Any ideas or walk-throughs I have, I’ll post here for your enjoyment. I also do product and online service reviews.

    However, I tend to run off on tangents quite often. I call this part of the site ‘random crap’. Basically I’ll go off and post about whatever I want, just to keep things interesting. Have fun and stay tuned.