Version 2 of Capture-HPC client honeypot released
Friday, September 14th, 2007

The New Zealand Honeynet Project have been busy with version two of their Capture-HPC client honeypot application, which we use internally for crawling and analysis of suspect URLs. Some of the new features include:
* support for any client application that is HTTP protocol aware (for example, Microsoft Excel)
* ability to automatically collect malware
* ability to automatically collect network traffic on the client
* ability to push exclusion lists from the Capture Server to the Capture Client (see the sketch after this list)
* improved control of Internet Explorer: obtain HTTP error codes, specify a visitation delay after a page has been retrieved, and retry visiting URLs in case of timeouts or network errors
* support for a plug-in architecture that allows fine-grained control of clients (as provided for Internet Explorer, for example) and also allows integration of client applications that require complex interactions to retrieve content from the web (e.g. Safari, which does not allow retrieval of web content by passing the URL as a parameter)
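To give a flavour of how an exclusion list is used, here is a minimal Python sketch of the underlying idea: the server distributes regex patterns describing state changes that are expected during normal browsing, and the client only reports a visit as malicious when it observes an event that no pattern explains. The pattern format, example paths, and function names below are illustrative assumptions, not Capture-HPC's actual file format or API.

```python
import re

# Illustrative only: a regex-based exclusion list of state changes that are
# considered benign noise (the real Capture-HPC list format may differ).
EXCLUSION_PATTERNS = [
    r"^C:\\Windows\\Prefetch\\.*\.pf$",                      # prefetch files written during normal browsing
    r"^HKCU\\Software\\Microsoft\\Internet Explorer\\.*",    # IE's own registry housekeeping
]

def is_excluded(event_path: str) -> bool:
    """Return True if an observed file/registry change matches a benign pattern."""
    return any(re.match(p, event_path, re.IGNORECASE) for p in EXCLUSION_PATTERNS)

def classify_visit(observed_events):
    """Flag a visit as malicious if any state change is not on the exclusion list."""
    unexplained = [e for e in observed_events if not is_excluded(e)]
    return ("malicious", unexplained) if unexplained else ("benign", [])

if __name__ == "__main__":
    events = [
        r"C:\Windows\Prefetch\IEXPLORE.EXE-12345678.pf",
        r"C:\Users\victim\AppData\Local\Temp\dropper.exe",   # hypothetical unexplained write
    ]
    verdict, evidence = classify_visit(events)
    print(verdict, evidence)
```

Since the lists live on the Capture Server and are pushed out to each client, analysts can tune out benign noise centrally rather than editing every honeypot by hand.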
Highly recommended if you are interested in research in this area, as it is very actively maintained and has been effective in our experience.