Thursday, August 26, 2010

Open Source WebEngine and Web Crawler v.0.2 released

Web Crawler is a utility designed to test and demonstrate the features of the WebEngine open source library. The program gathers information about the resources of a specified web server by analyzing references found in HTML markup, plain text, and JavaScript code. Additionally, it queries the Web of Trust knowledge base for information about the analyzed site; this check demonstrates how the library can support analysis of web application vulnerabilities.

The main features provided by the application are listed below:
- JavaScript analysis that extracts references by simulating a DOM structure
- Access to the contents of web servers via HTTP
- Support for the Basic, Digest, and NTLM authentication schemes
- Operation via proxy servers with various authentication schemes
- A wide variety of options to describe the scan target (lists of scanned domains, restriction of scanning to a host, a domain, or a web server directory, etc.)
- Modular structure, which allows one to implement plug-ins
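The core idea behind the crawler, gathering references from fetched markup and restricting them to the scan target, can be sketched in a few lines. The WebEngine API itself is not shown in this post, so the snippet below is a minimal, illustrative Python sketch of the same technique; the names `LinkExtractor` and `extract_same_host_links` are ours, not part of WebEngine.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href/src references from HTML markup."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                # Resolve relative references against the page URL
                self.links.add(urljoin(self.base_url, value))

def extract_same_host_links(html, base_url):
    """Return references that stay on the same host as base_url,
    mirroring the 'restrict scanning to a host' option."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    host = urlparse(base_url).netloc
    return sorted(u for u in parser.links if urlparse(u).netloc == host)
```

For example, feeding `'<a href="/about">About</a><img src="http://other.example/x.png">'` with base URL `http://example.com/` yields only `http://example.com/about`; the off-host image reference is filtered out, just as a host-restricted scan would skip it.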

This utility was designed by the Positive Technologies Research Lab team as part of the development of a web application analyzer for the MaxPatrol system.
