would anyone out there care to share their experience using centos as a webserver and their robots.txt files? i realize this is a somewhat simple exercise, yet i am sure there are both large and small hosters out there, and possibly those with high traffic handle their robots.txt files differently than others?

please share if you can or care to. for years we have just done a * (allow all) and a disallow on things like /cgi-bin (there is a trimmed-down sample at the end of this message).

as examples, here are some places to visit for those out or in the know:

http://www.robotstxt.org/
http://en.wikipedia.org/wiki/Robots_exclusion_standard
http://www.google.com/robots.txt

and others...

quite frankly, there are many orgs out there that don't follow this anyway, right?

anyone? tia - rh
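p.s. - for reference, the basic file we have been running looks roughly like this (the disallowed path is just an example, not our exact list):

# applies to all crawlers; anything not listed below is allowed
User-agent: *
# keep bots out of script directories
Disallow: /cgi-bin/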