Skipfish: Web Application Security Scanner

Last Commit: 12/22/2012

Introduction

Skipfish is an active reconnaissance tool that carries out security checks on web applications. By recursively crawling the target and launching dictionary-based probes, it builds an interactive sitemap of the site being tested. That map is then annotated with the output of a number of security checks.

The final report generated by the tool can help in addressing security issues in the application and is meant to serve as a foundation for the security assessments carried out on it.

Skipfish: Security Scanner for Web Applications

As a security scanner, Skipfish is efficient at spotting vulnerabilities such as SQL injection, command injection, and exposed directory listings, among others. It also probes for classes of security issues that other tools tend to have difficulty handling, which gives it several advantages.

Skipfish is very fast: it can handle over 2,000 requests per second on LAN/MAN networks and 500+ requests per second against responsive Internet targets. It also supports HTTP authentication, which makes it particularly handy for sites that require basic-level authentication, and because it can carry authenticated session cookies, it can also be used against sites that authenticate at the web application level.
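For example, a site behind HTTP basic authentication, or one that relies on an application-level session cookie, can be scanned roughly like this (a sketch: the credentials, cookie name, and URL are placeholders, and the dictionary options covered under Usage below may also be required depending on the skipfish version):

$ ./skipfish -o basic_auth_scan -A user:password http://example.com/
$ ./skipfish -o cookie_scan -C "SESSIONID=0123456789abcdef" http://example.com/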

Even More With Skipfish

Authenticated sessions can be protected from accidental destruction by rejecting new cookies and excluding logout links from the crawl. Skipfish's adaptive crawling scope also makes it possible to scan sites that hold large volumes of data: you can easily tune the crawl depth and limit the scan to subdirectories you have chosen.
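A sketch of how these scope and session controls combine (the cookie value, paths, and URL are placeholders): -N rejects any new cookies the site tries to set, -X excludes URLs containing a given string (such as a logout link), -I restricts the crawl to URLs containing a given string, and -d caps the crawl depth.

$ ./skipfish -o scoped_scan -C "SESSIONID=0123456789abcdef" -N \
    -X /logout -I /app/customers/ -d 5 http://example.com/app/customers/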

By applying Snort-style signatures, the tool can highlight server errors, information leaks, and potentially dangerous web applications. Its handcrafted dictionaries deliver accurate results within a reasonable scan time, and Skipfish also analyzes the site's content in depth and uses that information to automatically construct a word list for the target.

Subtle problems, such as incorrect caching directives, can also be detected by applying Ratproxy-style logic, which lets the tool pick up even minor underlying issues in a given web application.
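To make these caching-related findings more visible, skipfish 2.x builds provide reporting flags for caching intent mismatches (-E) and mixed-content warnings (-M); a minimal sketch, assuming those flags exist in your build (confirm with ./skipfish -h):

$ ./skipfish -o caching_scan -E -M http://example.com/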

Features:

  • High performance: 500+ requests per second against responsive Internet targets, 2,000+ requests per second on LAN/MAN networks, and 7,000+ requests per second against local instances have been observed, all with a very modest CPU, network, and memory footprint (a throttling sketch follows this list)
  • Ease of use: Skipfish is highly adaptive and reliable, with heuristic recognition of site structure, automatic wordlist construction, well-designed security checks, and more
  • Snort-style signatures: highlight server errors, information leaks, or potentially dangerous web applications
  • Advanced security logic (can detect even subtle problems)
  • And so much more…
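Because the scanner is this fast, it is often worth throttling it against shared or production targets. A sketch using the skipfish 2.x performance flags, where -l caps requests per second, -m limits connections per target IP, and -g limits global connections (the values and URL are placeholders):

$ ./skipfish -o gentle_scan -l 100 -m 5 -g 20 http://example.com/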

Supported Platforms:

  • Linux

Dependencies:

  • libidn

Skipfish Install

Clone the repo:

$ git clone https://github.com/spinkham/skipfish.git

or download and unpack the source archive, then build it with:

$ make
Note: Make sure libidn is installed to avoid skipfish build or runtime errors.
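On Debian/Ubuntu-style systems the dependency can usually be installed through the package manager before building (the package name is an assumption and varies between distributions):

$ sudo apt-get install libidn11-dev
$ make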

Usage

Before running the scanner, read the instructions in doc/dictionaries.txt; they will help you pick a dictionary file and configure it properly.

For a list of all options, use -h:

$ ./skipfish -h 

To load a dictionary, use -S; then use -W to specify where the information learned about the site will be stored:

$ touch new_dict.wl
$ ./skipfish -o output_dir -S existing_dictionary.wl -W new_dict.wl \
    http://example.com/

To read the list of target URLs from a file, use:

$ ./skipfish [...other options...] @../path/to/url_list.txt
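When a scan finishes, skipfish writes an HTML report into the output directory; opening index.html from that directory in a browser is the usual way to review the results (the directory name matches whatever was passed to -o):

$ xdg-open output_dir/index.html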