Thursday, December 29, 2016

RAWR - RAPID ASSESSMENT OF WEB RESOURCES

  Features 
  • A customizable CSV containing ordered information gathered for each host, with a field for notes, etc.
  • An elegant, searchable, jQuery-driven HTML report that shows screenshots, diagrams, and other information.

  • A report on relevant security headers, courtesy of SmeegeSec.

  • A CSV Threat Matrix for an easy view of open ports across all provided hosts. (Use -a to show all ports.)
  • A wordlist for each host, composed of all words found in responses (including crawl results, if used).
  • Default password suggestions through checking a service's CPE for matches in the DPE Database.
  • A shelve database of all host information (comparison functionality is planned).
  • Parses metadata in documents and photos using customizable modules.
  • Supports the use of a proxy (Burp Suite, OWASP ZAP, w3af).
  • Captures/stores SSL certificates, cookies, and crossdomain.xml.
  • [Optional] Customizable crawl of links within the host's domain.
  • [Optional] PNG diagram of all pages found during crawl.
  • [Optional] List of links crawled in tiered format.
  • [Optional] List of documents seen for each site.
  • [Optional] Automation-friendly output (JSON strings).

    Input
    • Using Prior Scan Data
      • -c <RAWR .cfg file>
        • .cfg files containing that scan's settings are created for every run.
      • -f <file, csv list of files, or directory>
        • It will parse the following formats:
        • NMap - XML (requires -sV)
        • Nessus - XML v2 (requires "Service Detection" plugin)
        • Metasploit - CSV
        • Qualys - Port Services Report CSV
        • Qualys - Asset Search XML (requires QIDs 86000,86001,86002)
        • Nexpose - Simple XML, XML, XML v2
        • OpenVAS - XML
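
      For example, a run against prior scan data, assuming RAWR is launched as ./rawr.py from its install directory (the filenames here are illustrative):

          ./rawr.py -f nmap_scan.xml      # parse a single NMap XML file
          ./rawr.py -f scans/             # parse every supported file in a directory
          ./rawr.py -c prior_scan.cfg     # re-run using a previous scan's settings
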
    • Using NMap
      • RAWR accepts valid NMap input strings (CIDR, etc.) as an argument.
        • -i can be used to feed it a line-delimited list.
      • use -t <timing> and/or -s <source port>
      • use -p <port|all|fuzzdb> to specify port #(s), all for 1-65535, or fuzzdb to use the FuzzDB Common Ports
      • --ssl will call enum-ciphers.nse for more in-depth SSL data.
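
      A minimal direct-scan sketch, again assuming the ./rawr.py entry point (the target range and timing value are illustrative):

          ./rawr.py 192.168.1.0/24 -p fuzzdb -t 4 --ssl
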
    Enumeration 
    • In [conf/settings.py], 'flist' defines the fields that will be in the CSV as well as the report.
      • The section at the bottom, "DISABLED COLUMNS", lists interesting data points that are not shown by default.
    • --dns will have it query Bing for other hostnames and add them to the queue.
      • This is for external resources; non-routable IPs are skipped.
      • (Planned) For non-routable IPs, RAWR will request an AXFR using 'dig'.
      • Results are cached for the duration of the scan to prevent unneeded calls.
    • -o, -r, and -x make additional calls to grab HTTP OPTIONS, robots.txt, and crossdomain.xml, respectively.
    • Try --downgrade to make requests with HTTP/1.0.
      • It's often possible to glean more information from the 'chattier' version.
      • Screenshots are still taken via HTTP/1.1, so expect that when viewing the traffic.
    • --noss will omit the collection of screenshots
      • The HTML report still functions, but will show the '!' image for all hosts.
    • Proxy your requests with --proxy=<ip:port>
      • This works well with Burp Suite, OWASP ZAP, or w3af.
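
    A combined enumeration sketch, assuming the ./rawr.py entry point and a local Burp Suite listener on its default port (the input file is illustrative):

        ./rawr.py -f nessus_export.nessus --dns -o -r -x --downgrade --proxy=127.0.0.1:8080
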
    • Crawl the site with --spider, notating files and docs in the log directory's 'maps' folder.
      • Defaults (set in [conf/settings.py]): follow subdomains, 3 links deep, 3-minute timeout, 300-URL limit.
      • If graphviz and python-graphviz are installed, it will create a PNG diagram of each site that is crawled.
      • Start small and adjust outward with respect to your scanning environment. Please use caution to avoid trouble. :)
    • Use -S <1-5> to apply one of the crawl intensity presets. The default is 3.
    • --mirror is the same as --spider, but will also make a copy of each site during the crawl.
    • Use --spider-opts <opts> to define crawl settings on the fly.
      • 's' = 'follow subdomains', 'd' = depth, 't' = timeout, 'l' = url limit
      • Not all are required, nor do they have to be in any particular order.
      • Example: --spider-opts s:false,d:2,l:500
    • Also for spidering, --alt-domains <domains> will whitelist domains you want to follow during the crawl.
      • By default, it won't leave the originating domain.
      • Example: --alt-domains domain1.com,domain2.com,domain3.com
      • --blacklist-urls <input list> will blacklist domains you don't want to crawl.
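
    Putting the crawl options together, assuming the ./rawr.py entry point (the domains are illustrative):

        ./rawr.py www.example.com --spider --spider-opts s:false,d:2,t:60,l:100 --alt-domains static.example.com,cdn.example.com
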
    Output 
    • -a is used to include all open ports in the CSV output and the Threat Matrix.
    • -m will create the Threat Matrix from provided input and exit (no scan).
    • -d <folder> changes the log folder's location from the default "./"
      • Example: -d ./Desktop/RAWR_scans_20140227 will create that folder and use it as your log dir.
    • -q or --quiet mutes display of the dinosaur on run.
      • Still in disbelief that anyone would want this... made 2 switches for it, to show that I'm a good sport. :)
    • Compress the log folder when the scan is complete with -z.
    • --json and --json-min are the automation-friendly outputs from RAWR.
      • --json writes JSON lines to STDOUT while still creating all of the normal output files.
      • --json-min creates no output files, only JSON strings on STDOUT.
    • Use --parsertest if you're testing a custom parser. It parses input, displays the first 3 lines, and quits.
    • -v makes output verbose.
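
    An output-focused sketch, assuming the ./rawr.py entry point (the input file and redirection target are illustrative). It shows all ports, logs to a custom folder, compresses on completion, and streams JSON lines to a file; -q is included on the assumption that the banner would otherwise mix into the redirected stream:

        ./rawr.py -f scan.xml -a -q -d ./rawr_logs -z --json > results.json
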
    Report Customization 
    • -e excludes the 'Default password suggestions' from your output.
      • This was suggested as an 'Executive' option.
    • Give your HTML report a custom logo and title with --logo=<file> and --title=<title>.
      • The image will be copied into the report folder.
      • Click 'printable' in the HTML report to view the custom header.
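
    A report-customization sketch, assuming the ./rawr.py entry point (the logo file and title are illustrative):

        ./rawr.py -f scan.xml -e --logo=acme_logo.png --title="ACME External Web Assessment"
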
    Updating 
    • -u runs update and prompts if a file is older than the current version.
      • Files downloaded are defpass.csv and Ip2Country.tar.gz.
      • It also checks for PhantomJS and will download it after prompting.
    • -U runs update and downloads the files mentioned above regardless of their version, without prompting.
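
    For example, assuming the ./rawr.py entry point:

        ./rawr.py -u    # update, prompting before each download
        ./rawr.py -U    # update everything, no prompts
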
