Google's Spam Filters: Duplicate Content, False Use Of robots.txt And Google Bowling
Dealing With Google's Filters
Originally Published: March 20, 2007
Google tries to keep its search results as clean as possible. For that reason, it uses a variety of spam filters in its ranking algorithm that try to remove low-quality web sites.
If your web site gets caught by one of these filters, it will be very difficult to get high Google rankings. In this
Google spam filters article series, we're taking a look at the 15 most common Google spam filters and how you can get around them.
Duplicate Content, False Use Of robots.txt And Google Bowling
The duplicate content filter is applied to web pages that contain content that has already been indexed on other web pages. This can happen if you have multiple versions of the same page on your web site or if you use content from other sites on your own web pages.
If the content is already available on another page then it will be difficult to get high rankings for your own web page. If the same content is available on multiple pages then Google will pick only one of them for the search results. Having the same page more than once on your web site might also look like a spamming attempt.
False use of the robots.txt file is not exactly a Google spam filter, but it basically has the same effect. While a robots.txt file can help you to direct search engine spiders to the right pages, it can also lock search engines out of your web site if you use it incorrectly.
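To illustrate how easy this mistake is to make (the /print/ path below is only a placeholder), a single misplaced slash is the difference between excluding one section and excluding the whole site:

```
# WRONG: this locks ALL search engine spiders out of the ENTIRE site
User-agent: *
Disallow: /

# RIGHT: this excludes only the printer-friendly duplicate pages
User-agent: *
Disallow: /print/
```

Note that the two blocks above are alternatives, not one file; a real robots.txt should contain only one rule group per user agent.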
Further information about the robots protocol can be found on the Official Google Webmaster Central Blog: All about robots.
Test your robots.txt here: robots.txt Checker, Test Your Robot File Syntax.
An easy to use form to create a robots.txt file for your site: robots.txt Creator, Generate Simple robots.txt Files.
"Google bowling" means that a competitor sets up spam pages that redirect to your web site. Although your competitor has set up these spam pages, Google might think that you are responsible for the spamming attempt and downgrade your web site. Google claims that external factors cannot influence your rankings on Google. However, some "black hat" SEO companies offer services that claim to harm the rankings of your competitors.
How To Get Around These Filters
If you have multiple versions of the same page on your web site (print version, online version, WAP-version, etc.) then make sure that search engines will index only one of them.
You can exclude special web pages from indexing by using a robots.txt file or the Meta Robots tag. IBP's web site optimization editor allows you to quickly add Meta Robots tags to your web pages.
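As a sketch, such a tag goes in the head section of the duplicate page (the "noindex" value is standard; whether you also keep "follow" depends on your linking structure):

```html
<head>
  <!-- Tell all search engine spiders not to index this page,
       but still follow the links on it: -->
  <meta name="robots" content="noindex, follow">
</head>
```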
Double check the contents of your robots.txt file to make sure that you don't exclude search engines by mistake.
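One way to double-check is to run your rules through Python's built-in robots.txt parser before deploying the file; the rules, user agent, and URLs below are only placeholders:

```python
from urllib import robotparser

# Parse the rules directly as lines instead of fetching them
# over the network, so you can test a draft before deploying it.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /print/",  # keep printer-friendly duplicates out of the index
])

# The duplicate print version is blocked, the normal page is not:
print(rp.can_fetch("Googlebot", "https://www.example.com/print/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/page.html"))        # True
```

If the second call ever returns False for a page you want indexed, your robots.txt is excluding search engines by mistake.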
If your web site has been hit by Google bowling, then the only thing you can do is to file a reinclusion request.
The best way to get high rankings on Google and other major search engines is to use white-hat SEO methods: optimize the content of your web pages with IBP's Top 10 Optimizer tools and get high quality inbound links with the ARELIS link popularity tools.
Copyright by Axandra GmbH, publishers of SEOProfiler, a
complete SEO software solution.
All product names, copyrights and trademarks mentioned in this newsletter are owned by their
respective trademark and copyright holders.