WebSurgery is a suite of tools for security testing of web
applications. It was designed to help security auditors plan and carry
out web application audits and exploitation. It currently includes an
efficient, fast and stable WEB Crawler; a File/Dir Brute Forcer; a
Fuzzer for advanced exploitation of known and unusual vulnerabilities
such as SQL injection, cross-site scripting (XSS), brute force against
login forms, identification of firewall-filtered rules and DoS attacks;
and a WEB Proxy to analyze, intercept and manipulate the traffic
between your browser and the target web application.
WEB Crawler
WEB Crawler was designed to be fast, accurate, stable, fully
configurable, and to use advanced techniques for extracting links from
JavaScript and HTML tags. It works with configurable timing settings
(Timeout, Threading, Max Data Size, Retries) and a number of rule
parameters to prevent infinite loops and pointless scanning (Case
Sensitive, Dir Depth, Process Above/Below, Submit Forms, Fetch
Indexes/Sitemaps, Max Requests per File/Script Parameters). It is also
possible to apply custom headers (user agent, cookies etc.) and
Include/Exclude filters. WEB Crawler comes with an embedded File/Dir
Brute Forcer, which helps to brute force directly for files/dirs inside
the directories found during crawling.
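To make the link-extraction and loop-prevention ideas concrete, here is a minimal Python sketch (an illustration, not WebSurgery's actual implementation): it pulls candidate links from HTML attributes and from quoted paths inside JavaScript, stays on the target host, enforces a depth limit and caps the data read per response. The regexes, timeout, retry, depth and size values are illustrative assumptions, and the target URL is a placeholder test site.

    import re
    import urllib.request
    from urllib.parse import urljoin, urlparse

    # Links in HTML tags (href/src/action) and naive quoted paths in JavaScript.
    HTML_LINK_RE = re.compile(r'(?:href|src|action)\s*=\s*["\']([^"\']+)["\']', re.I)
    JS_LINK_RE = re.compile(r'["\'](/[\w./?=&-]+)["\']')

    def fetch(url, timeout=10, retries=2, max_size=1024 * 1024):
        """GET a URL with retries; cap the amount of data read (Max Data Size)."""
        for _ in range(retries + 1):
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    return resp.read(max_size).decode("utf-8", "replace")
            except OSError:
                continue
        return ""

    def crawl(start, max_depth=3):
        host = urlparse(start).netloc
        seen, queue = set(), [(start, 0)]
        while queue:
            url, depth = queue.pop(0)
            if url in seen or depth > max_depth:   # depth rule prevents infinite loops
                continue
            seen.add(url)
            body = fetch(url)
            for link in HTML_LINK_RE.findall(body) + JS_LINK_RE.findall(body):
                absolute = urljoin(url, link)
                if urlparse(absolute).netloc == host:   # stay on the target host
                    queue.append((absolute, depth + 1))
        return seen

    for found in sorted(crawl("http://testphp.vulnweb.com/")):
        print(found)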
WEB Bruteforcer
WEB Bruteforcer is a brute forcer for files and directories within
the web application, which helps to identify its hidden structure. It
is also multi-threaded and fully configurable for timing settings
(Timeout, Threading, Max Data Size, Retries) and rules (Headers, Base
Dir, Brute Force Dirs/Files, Recursive, File Extensions, Send GET/HEAD,
Follow Redirects, Process Cookies and List Generator configuration).
By default, it brute forces recursively from the root/base dir for both files and directories. It sends both HEAD and GET requests when needed (HEAD to identify whether the file/dir exists, then GET to retrieve the full response).
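The HEAD-then-GET strategy can be illustrated with a minimal Python sketch (an illustration under stated assumptions, not WebSurgery's own code; the target URL, wordlist and extensions are placeholders): a cheap HEAD request checks whether each candidate path exists, and only hits are retrieved in full with GET.

    import urllib.error
    import urllib.request

    def probe(url, method):
        """Send one request; return (status, body). Status is None on network error."""
        req = urllib.request.Request(url, method=method)
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                return resp.status, resp.read() if method == "GET" else b""
        except urllib.error.HTTPError as exc:
            return exc.code, b""
        except OSError:
            return None, b""

    def brute(base, words, exts=("", "/", ".php", ".bak")):
        for word in words:
            for ext in exts:
                url = base.rstrip("/") + "/" + word + ext
                status, _ = probe(url, "HEAD")          # cheap existence check
                if status is not None and status < 400:
                    status, body = probe(url, "GET")    # full response only for hits
                    print(f"{status} {len(body):7d} {url}")

    # Placeholder wordlist; a real run would load a large list from disk.
    brute("http://testphp.vulnweb.com", ["admin", "images", "backup", "login"])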
WEB Fuzzer
WEB Fuzzer is a more advanced tool that creates a number of requests
based on one initial request. The Fuzzer has no limits and can be used
to exploit known vulnerabilities such as (blind) SQL injection, and in
more unusual ways such as identifying improper input handling,
firewall/filtering rules and DoS attacks.
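As an illustration of expanding one initial request into many, here is a hedged Python sketch; the FUZZ marker, the payload list and the target URL are assumptions made for the example, not WebSurgery parameters. Deviations in status code or response length between payloads are what point to improper input handling or filtering rules.

    import urllib.parse
    import urllib.request

    TEMPLATE = "http://testphp.vulnweb.com/artists.php?artist=FUZZ"
    PAYLOADS = ["1", "1'", "1 OR 1=1", "1 AND SLEEP(5)", "<script>alert(1)</script>"]

    def fuzz(template, payloads):
        for payload in payloads:
            url = template.replace("FUZZ", urllib.parse.quote(payload))
            try:
                with urllib.request.urlopen(url, timeout=15) as resp:
                    body = resp.read()
                # Compare status/length across payloads to spot anomalies.
                print(f"{resp.status} len={len(body):6d} payload={payload!r}")
            except Exception as exc:
                print(f"ERR {exc} payload={payload!r}")

    fuzz(TEMPLATE, PAYLOADS)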
WEB Editor
A simple WEB Editor to send individual requests. It also contains a HEX Editor for more advanced requests.
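Conceptually, such an editor boils down to sending one hand-crafted request as raw bytes, the same bytes a HEX Editor view lets you tweak. A minimal Python sketch with a placeholder host:

    import socket

    # A hand-crafted request; a hex editor operates on exactly these bytes.
    raw = (b"GET / HTTP/1.1\r\n"
           b"Host: testphp.vulnweb.com\r\n"
           b"Connection: close\r\n"
           b"\r\n")

    with socket.create_connection(("testphp.vulnweb.com", 80), timeout=10) as sock:
        sock.sendall(raw)
        response = b""
        while chunk := sock.recv(4096):
            response += chunk

    print(response.split(b"\r\n", 1)[0])  # status line, e.g. b'HTTP/1.1 200 OK'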
WEB Proxy
WEB Proxy is a proxy server running locally that allows you to
analyze, intercept and manipulate HTTP/HTTPS requests coming from your
browser or any other application that supports proxies.
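A minimal Python sketch of the idea for plain HTTP (HTTPS interception requires certificate handling and is omitted; the listen address and forwarding behavior are assumptions, not WebSurgery's defaults): the browser is pointed at 127.0.0.1:8080, and each request is printed for analysis, where it could also be modified, before being forwarded upstream.

    import http.server
    import urllib.error
    import urllib.request

    class InterceptingProxy(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            # For proxied requests, self.path is the full URL, e.g. http://host/page
            print("Intercepted:", self.command, self.path)   # analyze/manipulate here
            try:
                upstream = urllib.request.urlopen(self.path, timeout=10)
            except urllib.error.HTTPError as exc:
                upstream = exc                                # forward error responses too
            body = upstream.read()
            self.send_response(upstream.getcode())
            for name, value in upstream.headers.items():
                if name.lower() not in ("transfer-encoding", "connection"):
                    self.send_header(name, value)
            self.end_headers()
            self.wfile.write(body)

    http.server.ThreadingHTTPServer(("127.0.0.1", 8080), InterceptingProxy).serve_forever()

Pointing a browser's HTTP proxy setting at 127.0.0.1:8080 routes its traffic through do_GET, which is where a real tool would pause, display and let you edit each request.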