.\" vi:set wm=5
.TH SKIPFISH 1 "March 23, 2010"
.SH NAME
skipfish \- active web application security reconnaissance tool
.SH SYNOPSIS
.B skipfish
.RI [ options ] " -W wordlist -o output-directory start-url [start-url2 ...]"
.br
.SH DESCRIPTION
.PP
\fBskipfish\fP is an active web application security reconnaissance tool.
It prepares an interactive sitemap for the targeted site by carrying out a recursive crawl and dictionary-based probes.
The resulting map is then annotated with the output from a number of active (but hopefully non-disruptive) security checks.
The final report generated by the tool is meant to serve as a foundation for professional web application security assessments.
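.PP
A typical invocation (the wordlist file name and target URL below are illustrative) might look like:
.nf
  skipfish -o output-dir -W wordlist.wl http://example.com/
.fi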
.SH OPTIONS
.SS Authentication and access options:
.TP
.B \-A user:pass
use specified HTTP authentication credentials
.TP
.B \-F host=IP
pretend that 'host' resolves to 'IP'
.TP
.B \-C name=val
append a custom cookie to all requests
.TP
.B \-H name=val
append a custom HTTP header to all requests
.TP
.B \-b (i|f|p)
use headers consistent with MSIE / Firefox / iPhone
.TP
.B \-N
do not accept any new cookies
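.PP
For example, to scan an authenticated area while sending Firefox-style headers (the credentials, cookie value, and URL are illustrative):
.nf
  skipfish -b f -A admin:secret -C "session=0123456789" -o out -W- http://example.com/app/
.fi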
.SS Crawl scope options:
.TP
.B \-d max_depth
maximum crawl tree depth (default: 16)
.TP
.B \-c max_child
maximum children to index per node (default: 512)
.TP
.B \-x max_desc
maximum descendants to index per crawl tree branch (default: 8192)
.TP
.B \-r r_limit
max total number of requests to send (default: 100000000)
.TP
.B \-p crawl%
node and link crawl probability (default: 100%)
.TP
.B \-q hex
repeat a scan with a particular random seed
.TP
.B \-I string
only follow URLs matching 'string'
.TP
.B \-X string
exclude URLs matching 'string'
.TP
.B \-K string
do not fuzz query parameters or form fields named 'string'
.TP
.B \-Z
do not descend into directories that return HTTP 500 code
.TP
.B \-D domain
also crawl cross-site links to a specified domain
.TP
.B \-B domain
trust, but do not crawl, content included from a third-party domain
.TP
.B \-O
do not submit any forms
.TP
.B \-P
do not parse HTML and other documents to find new links
.SS Reporting options:
.TP
.B \-o dir
write output to specified directory (required)
.TP
.B \-M
log warnings about mixed content or non-SSL password forms
.TP
.B \-E
log all HTTP/1.0 / HTTP/1.1 caching intent mismatches
.TP
.B \-U
log all external URLs and e-mails seen
.TP
.B \-Q
completely suppress duplicate nodes in reports
.TP
.B \-u
be quiet, do not display realtime scan statistics
.SS Dictionary management options:
.TP
.B \-S wordlist
load a specified read-only wordlist for brute-force tests
.TP
.B \-W wordlist
load a specified read-write wordlist for any site-specific learned words. This option is required, but the specified file may initially be empty; newly learned words will be stored in it. Alternatively, use -W- to discard new words.
.TP
.B \-L
do not auto-learn new keywords for the site
.TP
.B \-Y
do not fuzz extensions during most directory brute-force steps
.TP
.B \-R age
purge words that resulted in a hit more than 'age' scans ago
.TP
.B \-T name=val
add new form auto-fill rule
.TP
.B \-G max_guess
maximum number of keyword guesses to keep in the jar (default: 256)
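.PP
For example, to brute-force from a supplied read-only wordlist while storing site-specific learned keywords in a separate file, with extension fuzzing disabled (file names are illustrative):
.nf
  skipfish -S minimal.wl -W site-words.wl -Y -o out http://example.com/
.fi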
.SS Performance settings:
.TP
.B \-l max_req
max requests per second (0 = unlimited)
.TP
.B \-g max_conn
maximum simultaneous TCP connections, global (default: 50)
.TP
.B \-m host_conn
maximum simultaneous connections, per target IP (default: 10)
.TP
.B \-f max_fail
maximum number of consecutive HTTP errors to accept (default: 100)
.TP
.B \-t req_tmout
total request response timeout (default: 20 s)
.TP
.B \-w rw_tmout
individual network I/O timeout (default: 10 s)
.TP
.B \-i idle_tmout
timeout on idle HTTP connections (default: 10 s)
.TP
.B \-s s_limit
response size limit (default: 200000 B)
.TP
.B \-e
do not keep binary responses for reporting
.SS Other settings:
.TP
.B \-k duration
stop scanning after the given duration (format: h:m:s)
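.PP
For example, a rate-limited scan capped at two hours, with at most 5 connections per target IP (all values and the URL are illustrative):
.nf
  skipfish -l 50 -m 5 -k 2:00:00 -o out -W- http://example.com/
.fi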
.SH AUTHOR
skipfish was written by Michal Zalewski <lcamtuf@google.com>,
with contributions from Niels Heinen <heinenn@google.com>,
Sebastian Roschke <s.roschke@googlemail.com>, and other parties.
.PP
This manual page was written by Thorsten Schifferdecker <tsd@debian.systs.org>,
for the Debian project (and may be used by others).