Humans.txt
The humans.txt file allows the creators of a website to be identified. Its name is a play on that of the file robots.txt, which implements the Robots Exclusion Standard used to keep cooperating web crawlers away from parts of a site. The file is placed in the root of a web server, e.g., http://www.example.com/humans.txt, and can be viewed publicly.
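
The humanstxt.org initiative additionally suggests announcing the file with a link element in a page's head. A minimal sketch, using a placeholder domain:

    <!-- placeholder domain; point href at your own site's humans.txt -->
    <link type="text/plain" rel="author" href="http://www.example.com/humans.txt" />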

There is no standard governing the file's contents; developers are free to include whatever they wish. The primary goal is to acknowledge the people who contributed to a site.

Examples
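
A minimal sketch of one possible humans.txt, loosely following the comment-style sections popularized by humanstxt.org (all names, dates, and addresses are placeholders):

    /* TEAM */
        Developer: Jane Doe
        Contact: jane [at] example.com
        Location: Berlin, Germany

    /* THANKS */
        Name: John Smith

    /* SITE */
        Last update: 2012/06/01
        Standards: HTML5, CSS3
        Components: jQuery
        Software: Git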


External links
humanstxt.org – the initiative promoting the use of humans.txt files