perl-www-robotrules 6.02 Perl database of robots.txt-derived permissions
The WWW::RobotRules module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use a /robots.txt file to forbid conforming robots from accessing parts of their website.
- Website: https://metacpan.org/release/WWW-RobotRules
- License: GPL 1+
- Package source: web.scm
- Patches: None
- Builds: x86_64-linux, i686-linux
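
A minimal sketch of how the module is typically used, following its documented API (`new`, `parse`, `allowed`); the user-agent string, host, and paths below are illustrative placeholders:

```perl
use strict;
use warnings;
use WWW::RobotRules;

# Create a rules object for a hypothetical robot named ExampleBot/1.0.
my $rules = WWW::RobotRules->new('ExampleBot/1.0');

# Feed it the contents of a site's /robots.txt (normally fetched with
# something like LWP::Simple's get()); here we inline a sample file.
my $robots_txt_url = 'http://example.com/robots.txt';
my $robots_txt     = <<'EOF';
User-agent: *
Disallow: /private/
EOF
$rules->parse($robots_txt_url, $robots_txt);

# Ask whether specific URLs on that host may be visited.
print "index allowed\n"   if $rules->allowed('http://example.com/index.html');
print "private blocked\n" unless $rules->allowed('http://example.com/private/data.html');
```

The same `$rules` object can hold rules for several hosts at once: call `parse` with each site's robots.txt URL and contents, then check any URL with `allowed`.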