### BEGIN FILE ###
# XXX - searchers robots.txt file

# Allow Google to spider the XXX-searchers site
User-agent: GoogleBot
Disallow:

# Allow MSN to spider the XXX-searchers site
User-agent: MSNBot
Disallow:

# Allow Yahoo! to spider the XXX-searchers site
User-agent: Slurp
Disallow:

# Allow Baidu to spider the XXX-searchers site
User-agent: baiduspider
Disallow:

# Allow all other robots to spider the XXX-searchers site
User-agent: *
Disallow:
### END FILE ###
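Because every record above leaves its `Disallow` directive empty, the file grants all listed crawlers, and every other robot via `User-agent: *`, unrestricted access. Here is a minimal sketch verifying that behavior with Python's standard `urllib.robotparser`; the inline robots.txt snippet and the paths checked are illustrative, not part of the original file:

```python
from urllib.robotparser import RobotFileParser

# A trimmed-down version of the robots.txt above: an empty Disallow
# line means "block nothing" for the matching user agent.
ROBOTS_TXT = """\
User-agent: GoogleBot
Disallow:

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Both a named crawler and an unlisted one fall through to a record
# with an empty Disallow, so every path is fetchable.
print(parser.can_fetch("GoogleBot", "/any/path"))    # True
print(parser.can_fetch("SomeOtherBot", "/index"))    # True
```

Note that per the robots.txt convention an empty `Disallow:` is the standard way to express "allow everything", so the per-bot records here are redundant with the final `User-agent: *` record; they serve only as explicit documentation of intent.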