28 September 2011

WebInspect - Useful Notes from the manual....

I'm not one to RTFM, but I did read the WebInspect manual and pulled out these little nuggets of info. I plan to revisit all of these items during the next few uses to provide more information around their usage, so check back soon.

Optional Depth-First Crawler
  • Depth-first crawling accommodates sites that enforce order-dependent navigation (where you must visit page A before you can visit page B). This method traces the first link on a page to the first link on the referenced page before returning to the original page and tracing the second link. By contrast, breadth-first crawling (which is also available) follows all the links on a page before drilling down to the linked pages themselves.
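The difference between the two strategies comes down to the data structure driving the crawl - a stack versus a queue. A minimal sketch (the hard-coded site graph is a made-up stand-in for real link extraction):

```python
from collections import deque

# Toy site graph standing in for fetched pages; a real crawler would
# parse each page's HTML to discover its links.
SITE = {
    "/": ["/a", "/b"],
    "/a": ["/a1"],
    "/b": [],
    "/a1": [],
}

def crawl_depth_first(start):
    """Follow the first link on each page before backtracking (stack)."""
    seen, order, stack = set(), [], [start]
    while stack:
        page = stack.pop()
        if page in seen:
            continue
        seen.add(page)
        order.append(page)
        # Push links in reverse so the first link is explored first.
        stack.extend(reversed(SITE.get(page, [])))
    return order

def crawl_breadth_first(start):
    """Visit every link on a page before drilling deeper (queue)."""
    seen, order, queue = {start}, [], deque([start])
    while queue:
        page = queue.popleft()
        order.append(page)
        for link in SITE.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order

print(crawl_depth_first("/"))    # ['/', '/a', '/a1', '/b']
print(crawl_breadth_first("/"))  # ['/', '/a', '/b', '/a1']
```

Note how depth-first reaches /a1 before /b - exactly the ordering an order-dependent site needs.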


Java Model View Control (MVC) Support
  • Based on in-depth research by the HP DevInspect for Java team, WebInspect now supports applications built on the Java MVC platform by the use of the depth-first crawler, path-based attacks, and navigational parameters.


Accommodate E-mail Messages
  • If your system generates e-mail messages in response to user-submitted forms, you might consider disabling your mail server. Alternatively, you could redirect all e-mails to a queue and then, following the audit, manually review and delete those e-mails that were generated in response to forms submitted by WebInspect.

Pace the HTTP Traffic
  • WebInspect can be configured to send up to 75 concurrent HTTP requests before waiting for an HTTP response to the first request. The default thread count setting is 5 for a crawl and 10 for an audit (if using separate requestors). In some environments, you may need to specify a lower number to avoid crashing the Web application or your server.
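The pacing idea is easy to reproduce with a bounded thread pool; a quick sketch (the URLs and `send_request` stub are made up for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

# Cap on in-flight requests -- analogous to WebInspect's thread count
# setting (default 5 for a crawl, 10 for an audit).
MAX_THREADS = 5

def send_request(url):
    # Placeholder for an HTTP GET; a real scanner would open a socket
    # or use an HTTP library here.
    return f"200 OK for {url}"

urls = [f"/page{i}" for i in range(20)]

# At most MAX_THREADS requests run concurrently; lowering the cap
# paces the traffic so a fragile server isn't overwhelmed.
with ThreadPoolExecutor(max_workers=MAX_THREADS) as pool:
    responses = list(pool.map(send_request, urls))

print(len(responses))  # 20
```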

Delete Uploaded Files
  • Finally, WebInspect tests for certain vulnerabilities by attempting to upload files to your server. If your server allows this, WebInspect will record this susceptibility in its scan report and attempt to delete the file. Sometimes, however, the server will not allow a file to be deleted. For this reason, part of your post-scan maintenance should include searching for and deleting files whose names begin with “CreatedByHP.”



Web Service Assessment
  • When performing a Web service assessment, WebInspect crawls the WSDL site and submits an arbitrary enumeration value for each parameter in each operation it discovers. It then audits the site by attacking each parameter in an attempt to detect vulnerabilities such as SQL injection. You can tailor these attacks to your WSDL by creating a file containing specific values that should be submitted, and by selecting the WebInspect option to “Auto-fill SOAP messages during crawl” (select Edit → Default Scan Settings and then select the Method category in the Scan Settings group). For more information, see SOAP Editor on page 185.
Navigation Pane
In both Site view and Sequence view, blue text denotes a directory or file that was “guessed” by WebInspect, rather than a resource that was discovered through a link. For example, WebInspect always submits the request “GET /backup/ HTTP/1.1” in an attempt to discover if the target Web site contains a directory named “backup.”
  • A Blue folder: A private folder on your Web server found by WebInspect. These folders are not linked from the site itself.
  • A Yellow folder: A folder whose contents are available over your Web site.
  • A Grey folder: A folder indicating the discovery of an item via path truncation.

Once the parent is found, the folder will display in either blue or yellow, depending on its properties.

Export Site Tree (Site View only)
Saves the site tree in XML format to a location you specify.

Session
A session is a matched set comprising the HTTP request sent by WebInspect to test for vulnerabilities and the HTTP response received from the server. Each session is assigned a severity score from 0 to 100:
  • 0 - 9 Normal
  • 10 Information
  • 11 - 25 Low
  • 26 - 50 Medium
  • 51 - 75 High
  • 76 - 100 Critical
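The bands above are just a threshold lookup; a quick sketch:

```python
def severity(score):
    """Map a WebInspect numeric score (0-100) to its severity band."""
    if score <= 9:
        return "Normal"
    if score == 10:
        return "Information"
    if score <= 25:
        return "Low"
    if score <= 50:
        return "Medium"
    if score <= 75:
        return "High"
    return "Critical"

print(severity(8), severity(10), severity(30), severity(90))
# Normal Information Medium Critical
```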
(WebInspect manual, p. 79, 25/08/2010)

Full manual is available from: http://www.echo-zero.co.uk/echo-zero-downloads/webinspectuserguide_8_0_548_0.pdf

25 September 2011

Password Hashes

So for my first post I thought I would keep it simple. Hashes.

There are two broad types of attack when it comes to attacking a system or application that is protected by some sort of authentication mechanism:
  1. Password Guessing - where you *don't* have access to the stored password representation (hash), a.k.a. a brute-force attack.
  2. Password Cracking - where you *do* have access to the stored password representation
So it's key to point out here that all passwords are stored somewhere - whether in volatile storage (memory) or non-volatile storage (datastore, registry) on the target, or with a centralized authentication provider (RADIUS, Active Directory) - and whether in cleartext (#fail) or in an encrypted/hashed representation.
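To make the hashed-representation point concrete, here's what a stored hash actually looks like ("letmein" is just a sample password):

```python
import hashlib

password = "letmein"

# The same input always yields the same digest, which is why stolen
# hash lists can be attacked offline with a wordlist.
md5 = hashlib.md5(password.encode()).hexdigest()
sha1 = hashlib.sha1(password.encode()).hexdigest()

print(md5)   # 0d107d09f5bbe40cade3de5c71e9e9b7
print(sha1)
```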

I am going to leave password guessing attacks for now, but they may be the subject of another post in the future. There are many simple tools out there (Hydra, Metasploit, Web Brute), so find a decent wordlist (Rockyou75, Openwall) and get brute forcing!

Resources for password lists:

This is going to be brief and I am sure I will add to this in the future. However, without a toolbox of zero-day exploits in your back pocket, capturing, breaking or reusing hashes is essential when assessing the security of your network. Understanding what you have is key here.

A recent talk from IronGeek (Adrian Crenshaw) on pilfering windows targets gives a good overview of the types of common hashes you may encounter:
Source: Irongeek - Nashville 2011 Talk 

(VNC is listed here, however it is a simple Base64 encoding of the password. Not really a hash - more an obfuscation, as it's easily reversible.)
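It's easy to demonstrate why Base64 offers no protection - "decoding" needs no key and no brute force (sketch with a made-up password):

```python
import base64

# Encoding looks opaque at a glance...
stored = base64.b64encode(b"s3cret").decode()
print(stored)  # czNjcmV0

# ...but anyone holding the stored value recovers the original directly.
print(base64.b64decode(stored).decode())  # s3cret
```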


Once you know what you have, 'cracking' them is the next step.


Tools that are useful here are:

  • Hashcat
  • John
  • Ophcrack
  • Cain
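At their core, all four tools are doing the same thing: hashing candidate words and comparing against the stolen digest. A toy version of that loop (tiny made-up wordlist; the real tools add mangling rules, rainbow tables, GPU acceleration, etc.):

```python
import hashlib

# Tiny stand-in for a real wordlist such as rockyou.
WORDLIST = ["password", "123456", "letmein", "qwerty"]

def crack_md5(target_hash):
    """Hash each candidate and compare -- the basic dictionary attack
    behind tools like John and Hashcat."""
    for candidate in WORDLIST:
        if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None  # not in the wordlist

# Simulate a stolen MD5 hash and recover the password.
stolen = hashlib.md5(b"letmein").hexdigest()
print(crack_md5(stolen))  # letmein
```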