[CentOS] httpd and robots.txt
R-Elists
lists07 at abbacomm.net
Sat Jan 16 22:18:36 UTC 2010
- Previous message: [CentOS] CentOS 5 and webex wrf files?
- Next message: [CentOS] httpd and robots.txt
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
Would anyone out there care to share their robots.txt experience using CentOS as a webserver, along with their robots.txt files? I realize this is a somewhat simple exercise, yet I am sure there are both large and small hosters out there, and possibly those with high traffic modify their robots.txt files differently than others?

Please share if you can or care to?

For years we have just done a * (allow all) and a disallow on things like /cgi-bin.

As examples of places to visit for those out or in the know:

http://www.robotstxt.org/
http://en.wikipedia.org/wiki/Robots_exclusion_standard
http://www.google.com/robots.txt

and others...

Quite frankly, there are many orgs out there that don't follow this anyway, right?

Anyone?

tia

- rh
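[A minimal sketch of the kind of file described above, i.e. allow all user-agents but keep them out of /cgi-bin; the /cgi-bin/ path is just the example from the post, and any other disallowed paths would be site-specific:]

```
# robots.txt - served from the web root (e.g. /var/www/html/robots.txt on a
# default CentOS httpd install)
User-agent: *
Disallow: /cgi-bin/
```

[Note that an empty Disallow line would mean "allow everything"; listing a path blocks only that prefix for compliant crawlers. As the post says, misbehaving bots simply ignore this file, so it is advisory only.]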
More information about the CentOS mailing list