Artifact parsero_0.0+git20140929.e5b585a-6_all

Metadata
deb_control_files:
- control
- md5sums
- postinst
- prerm
deb_fields:
  Architecture: all
  Depends: python3-bs4, python3-urllib3, python3:any, python3-pkg-resources
  Description: |-
    Audit tool for robots.txt of a site
     Parsero is a free script written in Python which reads the Robots.txt file
     of a web server through the network and looks at the Disallow entries. The
     Disallow entries tell the search engines what directories or files hosted
     on a web server must not be indexed. For example, "Disallow: /portal/login"
     means that the content at www.example.com/portal/login is not allowed to
     be indexed by crawlers such as Google, Bing or Yahoo. This is how an
     administrator avoids sharing sensitive or private information with the
     search engines.
     .
     Parsero is useful for pentesters, ethical hackers and forensics experts.
     It can also be used for security tests.
  Homepage: https://github.com/behindthefirewalls/Parsero
  Installed-Size: '40'
  Maintainer: Debian Security Tools <team+pkg-security@tracker.debian.org>
  Package: parsero
  Priority: optional
  Section: net
  Version: 0.0+git20140929.e5b585a-6
srcpkg_name: parsero
srcpkg_version: 0.0+git20140929.e5b585a-6
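
The Description field above explains the check this package automates: read a site's robots.txt and list its Disallow entries. As a minimal sketch of that first step (the sample robots.txt content and the helper name `disallowed_paths` are hypothetical; Parsero itself additionally fetches each disallowed path to see whether it is actually reachable):

```python
# Parse the Disallow directives out of a robots.txt body.
# Sample content is hypothetical, modeled on the example in the
# package description ("Disallow: /portal/login").

SAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /portal/login
Disallow: /admin/
Allow: /public/
"""

def disallowed_paths(robots_txt: str) -> list[str]:
    """Return the paths named in Disallow directives."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.strip()
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty "Disallow:" permits everything
                paths.append(path)
    return paths

print(disallowed_paths(SAMPLE_ROBOTS_TXT))
# ['/portal/login', '/admin/']
```

In a real audit these paths would then be requested over HTTP, which is where the package's python3-urllib3 and python3-bs4 dependencies come in.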

File

parsero_0.0+git20140929.e5b585a-6_all.deb

Relations

Relation      Direction   Type             Name
built-using               Source package   parsero_0.0+git20140929.e5b585a-6
