Protect Your Site From Log File Spammers
Log File Spammers Waste Time And Resources
Originally Published: August 9, 2004
Most log file programs show the Referer value for each visitor when it is available. A sample referrer entry looks like this:
This visitor first arrived from www.google.com page critic 1-10
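If you look at the raw access log instead of a stats report, the referrer is the quoted URL near the end of each entry. Here is a made-up Apache combined-format line, for illustration only (the addresses, page names, and search query are invented):

192.0.2.10 - - [09/Aug/2004:12:34:56 -0500] "GET /page-critic.html HTTP/1.1" 200 5120 "http://www.google.com/search?q=page+critic" "Mozilla/4.0 (compatible; MSIE 6.0)"

The second quoted string, the Google search URL, is the referrer your stats program reports.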
You may have noticed entries in your log files showing clicks from pages which aren't actually linked to your site. These fake referrer "clicks" are created by the programs spammers use to manipulate your logs.
Log File Spammers Are A Problem Because They:
- Create links on your site to the spammer's site, which the search engines may penalize
- Waste your bandwidth
- Skew your logging data and analysis
Here are the common types of log file spamming and how to protect yourself:
Spammers Hit Your Log Files For 2 Reasons
1.) Spammers Want You To Click On The Link To Their Site
A spammer creates a fake referrer entry in your logs in the hope that you will click on the link and visit his site. This is a particularly pathetic form of spamming and is easily defeated by never clicking on these links.
A good rule of thumb is to never click directly on any referrer links in your log files. If you want to check a link, copy it and enter it in a new browser window's address bar. This prevents your log file page from showing up as a referrer in the logs of the other site and keeps you from rewarding a spammer.
2.) Your Log Files Are Spammed To Increase Page Rank
This second form of log file spamming requires that the log files be public. Some sites intentionally make their logs public, but many novice webmasters don't realize that their logs are public and a target for spammers.
Spammers search the search engines for common phrases found in log programs' output to find public logs. Once they find these public logs, they spam the site so their own site shows up as a referrer. When the search engines index the log files, they treat these referrer entries as links from your site to the spammer's site.
Google considers outgoing links when ranking sites, so these spam links can hurt your ranking and cause you to be penalized for linking to a site in a "bad neighborhood".
The best protection against this spamming is to password protect your log file directories. Most hosting control panels make this easy to do.
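If your host runs Apache and you prefer to set this up by hand, a minimal sketch looks something like the following. The directory and file paths are only examples; your actual paths will differ:

# .htaccess placed inside the log file directory
AuthType Basic
AuthName "Site Statistics"
AuthUserFile /home/yoursite/.htpasswd
Require valid-user

The .htpasswd file itself is created with Apache's htpasswd utility and should be stored outside your public web directory.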
If you insist on having your logs publicly accessible, you should block the search engine spiders using your robots.txt file. Create an entry like this in the robots.txt file:
User-agent: *
Disallow: /log-file-directory/
The easiest way to create your robots.txt file is to use our Simple robots.txt Files Creator.
If you want to link to your logs from a public page, don't use the name of the statistics package in the anchor text. Use something like "click here for my great stats" and not something like "awstats". This lessens the chance that a spammer will find your link to the log files in a search engine. (The spammer won't check your robots.txt file and will spam you even if it won't do him any good.)
Log file spam doesn't have to be a problem if you follow these steps to protect yourself.
©2005 SearchEnginePromotionHelp.com, All Rights Reserved.