Friday 30 July 2010

Bish Bash Bosh; Linux commandline Kung-fu for Hackers

To make the best use of the various commandline tools available, it is important to have a good understanding of bash commands and techniques. These techniques can save a massive amount of time and effort, and achieve results that would be impractical to attempt manually.

Commands such as cut, tr, wc, sort, grep, sed, xargs and awk are really useful, and it is important for any budding commandline ninja to learn how to use them effectively. The pipe symbol "|" and the redirection operators ">" and "<" can then be used to direct input and output and chain commands together, making the commandline an extremely powerful toolset.
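
As a quick illustration of chaining, the following (using /etc/passwd purely as a convenient example file) counts the unique usernames on a system:

cut -d":" -f1 /etc/passwd | sort -u | wc -l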

On top of that, there are various programming options available on the commandline, from "for" loops to one-liners in various scripting languages, including Perl and Python.
The more you know about the options you have, the more creative you can be.

Please remember to use these techniques only for good, not for malicious purposes.

Here are some expressions that I have found very useful:

Grep, pipe, cut and wc

Say you want to scan a subnet with nmap, build a list of IPs that have TCP port 80 open, and count them.

nmap -oG http.txt -p T:80 192.168.1.1-254
grep -i open http.txt | cut -d" " -f2 > httpips.txt
wc -l httpips.txt


The first line uses nmap to scan a subnet and create an output file in grepable format.
The second line then greps this file to find any lines containing "open", and for each line found, it extracts the IP address and writes it to a second file.
You then have a useful file containing the IP addresses of all the systems running a TCP service on port 80 (probably web servers).
The third line counts the number of lines in the file created, i.e. the number of hosts with port 80 open.
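
For reference, each matching line in the grepable output looks roughly like the example below (the exact fields can vary between nmap versions), which is why the IP address comes out as the second space-separated field:

Host: 192.168.1.42 ()   Ports: 80/open/tcp//http///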

Using echo and awk to create HTML tags

Following on from the above example, imagine you found many hosts, and wanted to wrap these IP addresses in HTML so that you could use a browser to click through and look at the web homepage on each server. You could use awk to wrap each line in HTML link tags, and echo to top and tail your HTML.

echo "" > httpips.html
cat httpips.txt | awk '{ print "" $1 "" }' >> httpips.html
echo "" >> httpips.html


Then you can open up httpips.html in a web browser and have a quick look through each site.


Using cat, echo, tr, xargs, and sort to create a dictionary out of some documents

I've been in a situation where I needed a dictionary (for a password attack) and had no internet access. There are various tools in Backtrack that have supporting documents, so I found some large ones by searching around, and then extracted a dictionary as follows.

cat sourcefile1.txt sourcefile2.txt sourcefile3.txt | tr '[:punct:]' ' ' | xargs -n 1 -P0 echo >> wordlist.txt
sort -u wordlist.txt > dict.txt


The first line lists the source files, then uses "tr" to turn all the punctuation into white space, before using xargs to echo each of the words found into a new file, one on each line. The second line sorts all the words found, removes duplicates, and puts the result in a dictionary file. This is pretty effective, and you could use a variety of source files to produce large dictionaries quite quickly.
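
If you prefer to avoid spawning a separate echo process for every word, a variation on the same idea (just a sketch, using the same placeholder filenames) squeezes all whitespace into newlines with tr instead:

cat sourcefile1.txt sourcefile2.txt sourcefile3.txt | tr '[:punct:]' ' ' | tr -s '[:space:]' '\n' | sort -u > dict.txt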

Using sed to prepend and append text

Say you have a text file and you want to add something to the beginning and end of each line. For example, you may be in a situation where you want to copy a script to a remote server, inline in a command shell. The following expression enables you to use the "echo" command to write each line to a file.

sed 's/^/echo /' script.txt | sed 's/$/ >> script.bat/'

This expression processes each line of the file script.txt, adds "echo " at the beginning and " >> script.bat" at the end, and prints the result to the screen.
Cut and paste the result of this into the shell, and you will end up with your original script as a file called script.bat on the remote system.
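
As a small worked example, if script.txt contained a line such as @echo off (just an illustrative line), the corresponding output line would be:

echo @echo off >> script.bat

Bear in mind that lines containing shell metacharacters may need extra escaping or quoting before they can be pasted safely.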

"For" loops on the commandline

So, you want to do something repetitive with a list of hosts, quickly and easily.

Let's run nmap against each host in a range (150 to 200 in this example), and write grepable results out to a set of files named by IP address.

for ip in $(seq 150 200); do nmap -oG nmapscan192.168.1.$ip.txt 192.168.1.$ip & done

seq is a good tool for doing operations with ranges in for loops. In other situations you may prefer to use something like "echo {150..200}". The rest of the for loop should be self-explanatory, but note the "&", which enables these jobs to run concurrently. This can speed up time-consuming processes in some cases.
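
For comparison, a version of the same loop written with brace expansion (no seq needed) might look like this:

for ip in 192.168.1.{150..200}; do nmap -oG nmapscan$ip.txt $ip & done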

Chaining grep, cut and sort, and using "for" loops with a file

Say you want to do some quick analysis of several days of web logs: you want to find all the hosts requesting a specific URL, sorted by unique IP address, and then ping those IP addresses to see how many of them respond.

First check the file format, using something like "head" on one of the files.

head /var/log/apache2/access.log

Then something like

cat /var/log/apache2/access*.log | grep "/myfile.html" | cut -d" " -f1 | sort -u > myhosts.txt
for ip in $( cat myhosts.txt ); do ping -c 1 $ip | grep "1 received" | sed "s/^/$ip is up - /" | sort -u ; done

192.168.1.42 is up - 1 packets transmitted, 1 received, 0% packet loss, time 3ms
192.168.1.43 is up - 1 packets transmitted, 1 received, 0% packet loss, time 1ms


As you can see, you can get pretty creative, and the long job of poring over several log files, with thousands of log entries, searching for IP addresses, is reduced to less than a minute's work.

Perl and Python oneliners on the commandline

Perl and Python oneliners can be issued on the commandline and included in the mix with other operations described previously. This is done easily with the -c option in Python.

python -c 'print "A" * 500' > buffer.txt

This example prints a buffer of 500 A's to a file using Python.

Similar results can be obtained with the -e option in Perl.
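
For example, the Python one-liner above could be written in Perl as follows (Perl uses the x operator rather than * for string repetition):

perl -e 'print "A" x 500' > buffer.txt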

If you are interested in expanding your options, with many more possibilities, then I would highly recommend the following site, which contains many examples, and lets users rate expressions for usefulness.

www.commandlinefu.com

Mitigating factors are not really relevant for this post, but bear in mind that, when you have a system you want to secure, one aspect of hardening is to limit the tools and languages that would be available to a potential attacker. Have fun, and be good.

Comments:

  1. Hi Ben,
    nice post, small world. After doing the CISSP I was craving something a bit more technical.

    Here are a couple of oneliners I have found useful this month:

    python -m SimpleHTTPServer

    # serves current dir on port 8000, can specify port number by adding it to the end.

    du -h --max-depth=1

    # show the size of all subfolders in the current dir

    also, if you are in a non-interactive shell you can redirect STDERR to STDOUT by appending 2>&1 to the end of your command.

    Let me know how you are getting on with the PWB course - Sam (intersecs.com)

  4. "This information really helped me a lot. It was very informative.
    Password Policy Management"

    ReplyDelete