Hi friends,
Today I came across a very nice article about robots.txt on SEO Chat by James.
It covers topics ranging from how to create a robots.txt file for your web site, to using the various syntax for allowing and disallowing robots and search engine spiders from crawling your site, internal directories, and specific files, to adding sitemap auto-discovery, crawl-delay, and comments in the robots.txt file.
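As a rough sketch of what a file covering those topics can look like (the paths, bot name, sitemap URL, and crawl-delay value below are made-up examples, not taken from the article):

```text
# Block one specific spider entirely
User-agent: BadBot
Disallow: /

# Rules for all other robots
User-agent: *
Disallow: /internal/          # keep an internal directory out of the index
Disallow: /private-file.html  # block one specific file
Crawl-delay: 10               # non-standard; honored by some crawlers (e.g. Bing)

# Sitemap auto-discovery (absolute URL; valid anywhere in the file)
Sitemap: http://www.example.com/sitemap.xml
```

Note that the file must live at the root of the site (e.g. http://www.example.com/robots.txt), and lines starting with # are comments ignored by crawlers.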
Here is the link to the article.
Monday, March 31, 2008
Robots.txt - Standards
Posted by Navdeep Trivedi at 1:13 AM
Labels: Crawl-delay, robots.txt, robots.txt - Commenting, sitemaps auto-discovery
2 comments:
Hello. This post is likeable, and your blog is very interesting, congratulations :-). I will add it to my blogroll =). If possible, drop by my blog; it is about smartphones, and I hope you enjoy it. The address is http://smartphone-brasil.blogspot.com. A hug.
I recently needed the Allow: directive in robots.txt to solve a rather complex HTTPS duplicate-content problem, a little more complex than the one described at seomoz.org/ugc/solving-duplicate-content-issues-with-http-and-https.
You have a good blog.
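(For readers unfamiliar with the Allow: directive mentioned above, here is a hypothetical sketch of the basic pattern; the paths are made-up examples. Allow: is honored by major crawlers such as Googlebot and Bingbot, but it was not part of the original 1994 robots exclusion standard, so support varies by crawler.)

```text
User-agent: *
Disallow: /private/                  # block the whole directory...
Allow: /private/public-page.html     # ...but permit this one page inside it
```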