Robots exclusion standard:
The Robots Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention for advising cooperating web crawlers and other web robots about which parts of an otherwise publicly viewable website they should or should not access.
en.wikipedia.org
Robots.txt:
robots.txt (always lowercase) is an ASCII-encoded text file placed in a website's root directory. It typically tells search-engine crawlers (also known as web spiders) which content on the site should not be retrieved by crawlers and which content may be retrieved.
zh.wikipedia.org
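The crawler-side check described above can be sketched with Python's standard urllib.robotparser module; the robots.txt rules and the example.com URLs here are hypothetical, for illustration only.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: all crawlers may fetch anything except /private/
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A cooperating crawler consults the rules before fetching each URL.
print(parser.can_fetch("*", "https://example.com/index.html"))     # True
print(parser.can_fetch("*", "https://example.com/private/page"))   # False
```

In practice a crawler would load the live file with `parser.set_url(".../robots.txt")` followed by `parser.read()` rather than parsing an inline string.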
This article uses material from the Wikipedia articles %@, which are released under the Creative Commons Attribution-Share-Alike License 3.0.
Robots exclusion standard, Robots.txt