1.27b - Tweaks to CFLAGS and man page added

- Tweak to CFLAGS ordering to always enforce FORTIFY_SOURCE.
- Man page added.
Steve Pinkham 2010-03-30 17:23:09 -04:00
parent dc378471b7
commit 5918f62bbc
4 changed files with 159 additions and 3 deletions

ChangeLog

@@ -1,3 +1,10 @@
+Version 1.27b:
+--------------
+- Tweak to CFLAGS ordering to always enforce FORTIFY_SOURCE.
+- Man page added.
 Version 1.26b:
 --------------

Makefile

@@ -25,8 +25,8 @@ OBJFILES = http_client.c database.c crawler.c analysis.c report.c
 INCFILES = alloc-inl.h string-inl.h debug.h types.h http_client.h \
   database.h crawler.h analysis.h config.h report.h
-CFLAGS_GEN = -Wall -funsigned-char -g -ggdb -D_FORTIFY_SOURCE=0 \
-  -I/usr/local/include/ -I/opt/local/include/ $(CFLAGS)
+CFLAGS_GEN = -Wall -funsigned-char -g -ggdb -I/usr/local/include/ \
+  -I/opt/local/include/ $(CFLAGS) -D_FORTIFY_SOURCE=0
 CFLAGS_DBG = -DLOG_STDERR=1 -DDEBUG_ALLOCATOR=1 $(CFLAGS_GEN)
 CFLAGS_OPT = -O3 -Wno-format $(CFLAGS_GEN)
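
The reordering above matters because GCC processes -D options left to right, and the last definition of a macro is the one that takes effect (the compiler emits a "redefined" warning for the earlier one). With -D_FORTIFY_SOURCE=0 placed after $(CFLAGS), skipfish's setting now overrides any -D_FORTIFY_SOURCE=2 that a distribution or packager injects through CFLAGS; under the old ordering, an injected flag landed later on the command line and silently re-enabled fortification. A minimal sketch to observe the behavior on a stock gcc (fortify_demo.c is a hypothetical file, not part of skipfish):

/* fortify_demo.c - the last -D_FORTIFY_SOURCE on the command line wins.
 *
 * Compile both ways and compare the output:
 *   gcc -D_FORTIFY_SOURCE=0 -O2 -D_FORTIFY_SOURCE=2 fortify_demo.c && ./a.out   (prints 2)
 *   gcc -O2 -D_FORTIFY_SOURCE=2 -D_FORTIFY_SOURCE=0 fortify_demo.c && ./a.out   (prints 0)
 */
#include <stdio.h>

int main(void)
{
#ifdef _FORTIFY_SOURCE
  printf("_FORTIFY_SOURCE = %d\n", _FORTIFY_SOURCE);
#else
  printf("_FORTIFY_SOURCE is not defined\n");
#endif
  return 0;
}

Because the override relies purely on command-line order, no Makefile conditional or -U juggling is needed.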

config.h

@@ -23,7 +23,7 @@
 #ifndef _HAVE_CONFIG_H
 #define _HAVE_CONFIG_H
-#define VERSION "1.26b"
+#define VERSION "1.27b"
 #define USE_COLOR 1 /* Use terminal colors */

skipfish.1 (new file, 149 lines)

@@ -0,0 +1,149 @@
.\" vi:set wm=5
.TH SKIPFISH 1 "March 23, 2010"
.SH NAME
skipfish \- active web application security reconnaissance tool
.SH SYNOPSIS
.B skipfish
.RI [ options ] " -o output-directory start-url [start-url2 ...]"
.br
.SH DESCRIPTION
.PP
\fBskipfish\fP is an active web application security reconnaissance tool.
It prepares an interactive sitemap for the targeted site by carrying out a recursive crawl and dictionary-based probes.
The resulting map is then annotated with the output from a number of active (but hopefully non-disruptive) security checks.
The final report generated by the tool is meant to serve as a foundation for professional web application security assessments.
.SH OPTIONS
.SS Authentication and access options:
.TP
.B \-A user:pass
use specified HTTP authentication credentials
.TP
.B \-F host:IP
pretend that 'host' resolves to 'IP'
.TP
.B \-C name=val
append a custom cookie to all requests
.TP
.B \-H name=val
append a custom HTTP header to all requests
.TP
.B \-b (i|f)
use headers consistent with MSIE / Firefox
.TP
.B \-N
do not accept any new cookies
.SS Crawl scope options:
.TP
.B \-d max_depth
maximum crawl tree depth (default: 16)
.TP
.B \-c max_child
maximum children to index per node (default: 1024)
.TP
.B \-r r_limit
max total number of requests to send (default: 100000000)
.TP
.B \-p crawl%
node and link crawl probability (default: 100%)
.TP
.B \-q hex
repeat a scan with a particular random seed
.TP
.B \-I string
only follow URLs matching 'string'
.TP
.B \-X string
exclude URLs matching 'string'
.TP
.B \-S string
exclude pages containing 'string'
.TP
.B \-D domain
also crawl cross-site links to a specified domain
.TP
.B \-B domain
trust, but do not crawl, content included from a third-party domain
.TP
.B \-O
do not submit any forms
.TP
.B \-P
do not parse HTML and other documents to find new links
.SS Reporting options:
.TP
.B \-o dir
write output to specified directory (required)
.TP
.B \-J
be less noisy about MIME / charset mismatches on probably
static content
.TP
.B \-M
log warnings about mixed content
.TP
.B \-E
log all HTTP/1.0 / HTTP/1.1 caching intent mismatches
.TP
.B \-U
log all external URLs and e-mails seen
.TP
.B \-Q
completely suppress duplicate nodes in reports
.SS Dictionary management options:
.TP
.B \-W wordlist
load an alternative wordlist (skipfish.wl)
.TP
.B \-L
do not auto-learn new keywords for the site
.TP
.B \-V
do not update wordlist based on scan results
.TP
.B \-Y
do not fuzz extensions during most directory brute-force steps
.TP
.B \-R age
purge words that resulted in a hit more than 'age' scans ago
.TP
.B \-T name=val
add new form auto-fill rule
.TP
.B \-G max_guess
maximum number of keyword guesses to keep in the jar (default: 256)
.SS Performance settings:
.TP
.B \-g max_conn
maximum simultaneous TCP connections, global (default: 50)
.TP
.B \-m host_conn
maximum simultaneous connections, per target IP (default: 10)
.TP
.B \-f max_fail
maximum number of consecutive HTTP errors to accept (default: 100)
.TP
.B \-t req_tmout
total request response timeout (default: 20 s)
.TP
.B \-w rw_tmout
individual network I/O timeout (default: 10 s)
.TP
.B \-i idle_tmout
timeout on idle HTTP connections (default: 10 s)
.TP
.B \-s s_limit
response size limit (default: 200000 B)
.TP
.B \-h, \-\-help
Show summary of options.
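.\" A minimal usage sketch (the target URL and output directory below are
.\" hypothetical; \-o is the only required option):
.SH EXAMPLES
Run a scan with default settings, writing the interactive report to a new
output directory:
.PP
.B skipfish \-o output_dir http://www.example.com/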
.SH AUTHOR
skipfish was written by Michal Zalewski <lcamtuf@google.com>.
.PP
This manual page was written by Thorsten Schifferdecker <tsd@debian.systs.org>,
for the Debian project (and may be used by others).