better check your robots.txt...


12BucksFor2Dozon

...because Google might ban your site if your robots.txt is unvalidated...

Unvalidated Robots.Txt Risks Google Banishment
David A. Utter | Staff Writer

The web crawling Googlebot may find a forgotten line in robots.txt
that causes it to de-index a site from the search engine.

Webmasters welcome being dropped out of Google about as much as
they enjoy flossing with barbed wire. Making it easier for Google
to do that would be anathema to being a webmaster. Why willingly
exclude one's site from Google?

That could happen with an unvalidated robots.txt file. Robots.txt
allows webmasters to provide standing instructions to visiting
spiders, which contributes to having a site indexed faster and
more accurately.

Google has been considering new syntax to recognize within
robots.txt. The Sebastians-Pamphlets blog said Google confirmed
recognizing experimental syntax like Noindex in the robots.txt
file.

This poses a danger to webmasters who have not validated their
robots.txt. A line reading Noindex: / could lead to one's site
being completely de-indexed.

The surname-less Sebastian recommended Google's robots.txt
analyzer, part of Google's Webmaster Tools, and only using
the Disallow, Allow, and Sitemaps crawler directives in the
Googlebot section of robots.txt.
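
To give an idea of what that looks like: the kind of forgotten line the article is warning about would be something like this (Noindex in robots.txt is unofficial, experimental syntax, so treat this purely as an illustration):

User-agent: Googlebot
Noindex: /

That one line, if Google honors it, could pull an entire site out of the index.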


I would strongly advise validating your robots.txt file using Google's Webmaster Tools.
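
If you want to follow Sebastian's advice to the letter, a Googlebot section that sticks to just those directives would look something like this (the paths and sitemap URL are only placeholders, so swap in your own; also keep in mind that Allow isn't part of the original robots.txt standard, though Google honors it, and the Sitemap line applies to the whole file, not just Googlebot):

Example:

User-agent: Googlebot
Disallow: /cgi-bin/
Allow: /cgi-bin/public/
Sitemap: http://www.example.com/sitemap.xml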
 
I have always used "Disallow"

Example:

User-agent: *
Disallow: /stats/
Disallow: /Images/
Disallow: /cgi-bin/
Disallow: /db/

For information on "Search Indexing Robots and Robots.txt", read here:
http://www.searchtools.com/robots/robots-txt.html
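
One more thing worth keeping in mind (as far as I understand the robots.txt rules): Disallow matches by path prefix, so a line like

Disallow: /db/

keeps compliant robots out of everything that starts with /db/ (for example a hypothetical /db/backup.sql), not just that one directory page, while an empty "Disallow:" blocks nothing at all.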
