deb_control_files:
- control
- md5sums
deb_fields:
  Architecture: all
  Breaks: libwww-perl (<< 6.00)
  Depends: perl, liburi-perl
  Description: |-
    database of robots.txt-derived permissions
    WWW::RobotRules parses /robots.txt files as specified in "A Standard for
    Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters
    can use the /robots.txt file to forbid conforming robots from accessing parts
    of their web site.
    .
    The parsed files are kept in a WWW::RobotRules object, and this object
    provides methods to check if access to a given URL is prohibited. The same
    WWW::RobotRules object can be used for one or more parsed /robots.txt files
    on any number of hosts.
  Homepage: https://metacpan.org/release/WWW-RobotRules
  Installed-Size: '36'
  Maintainer: Debian Perl Group <pkg-perl-maintainers@lists.alioth.debian.org>
  Package: libwww-robotrules-perl
  Priority: optional
  Replaces: libwww-perl (<< 6.00)
  Section: perl
  Version: 6.02-1
srcpkg_name: libwww-robotrules-perl
srcpkg_version: 6.02-1
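# The Description above outlines how WWW::RobotRules is used: parse a fetched
# /robots.txt file into a rules object, then ask that object whether a given
# URL may be accessed. A minimal Perl sketch of that flow, kept as comments so
# this record stays valid YAML; the "ExampleBot/1.0" agent name, the example
# host, and the use of LWP::Simple to fetch the file are assumptions for
# illustration, not part of this package's metadata.
#
#   use strict;
#   use warnings;
#   use WWW::RobotRules;
#   use LWP::Simple qw(get);
#
#   # The constructor takes the robot's User-Agent name, which is matched
#   # against the User-agent lines in robots.txt.
#   my $rules = WWW::RobotRules->new('ExampleBot/1.0');
#
#   # Fetch and parse a host's robots.txt before crawling that host.
#   my $robots_url = 'http://www.example.com/robots.txt';
#   my $content    = get($robots_url);
#   $rules->parse($robots_url, $content) if defined $content;
#
#   # The same object now answers permission checks for any URL on that host
#   # (and on any other host whose robots.txt has been parsed into it).
#   my $page = 'http://www.example.com/some/page.html';
#   print "allowed: $page\n" if $rules->allowed($page);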