Friday, July 29, 2011

Website Security Intro - Looking for RFI attacks

If you run server-side scripting (which most major sites/applications do), your site is much more vulnerable to many attacks. If you are dealing with databases, that opens you up to still more. Unfortunately, you may be hard pressed to find a website (or CMS solution) that does not use any server-side scripts, unless you have no need for dynamic content and can build the site in just HTML/JavaScript/Flash.

If you can manage to build your site in HTML only (possibly with JavaScript and/or Flash), that is clearly the most secure option.

The rest of us may want to look into putting the site on a third-party managed CMS service. Blogs (like this one) can be run here on Blogger (WordPress, TypePad, etc. also offer solutions). If a blog is not what you're looking for, Google freely offers Google Sites, and many other managed CMS hosting services are available; check with your chosen hosting provider. A major benefit of these services is that the server-side security (and incident handling) falls on them instead of you. You just need to keep up with your passwords, email, and PC security, same as with your Facebook or MySpace pages.

For those who just can't find a managed solution that fits their needs (or simply don't want one and enjoy more freedom than the provided solutions offer), monitoring the security of your sites can be an ongoing battle. In the following introduction, you will find some helpful weapons to add to your arsenal.

I find Linux servers the easiest to deal with (and they are also the most common hosting solution), so that is primarily what I will be discussing. If your site is on a hosting account without SSH access, upgrade your account or find a new host.

SSH and the Linux server's terminal applications (grep, awk, sed, cat, less, etc.) are your best friends. Get to know them. Googling for their example uses will help greatly. If you get stuck with any of them, chances are the 'man' pages will give you far more info about their options than you will ever need.
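As a quick taste (purely a sketch, and 'somefile.log' is just a placeholder name), once you are logged in you could try:

man grep          # the full manual for grep; press 'q' to quit
grep 'error' somefile.log          # print every line of the file that contains 'error'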

Server-side scripting exploits can be seen in the Apache access logs. If you don't know where yours are, find out. Apache log files can be viewed (searched, and their data formatted more readably) using your SSH login and the terminal apps.
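Where the logs live varies by distribution and hosting setup, so treat the paths below as common guesses rather than guarantees:

ls /var/log/apache2/          # typical location on Debian/Ubuntu systems
ls /var/log/httpd/            # typical location on Red Hat/CentOS systems
ls ~/logs/                    # many shared-hosting accounts keep them here

less access.log               # page through a log; '/' searches, 'q' quits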

For example: RFI attacks ("remote file include" - usually the result of include statements not being sanitized correctly, allowing remote scripts to be run on your server with your user's permissions) show up in the Apache access logs as lines containing the status code '200'. This is the code for a successful request, although it can still be misleading: if you have custom error pages, a '200' may only mean the attempt successfully returned the error page. Whether just an attempt or an actual compromise, RFI log entries usually include a 'POST' or 'GET' and a link to a remote file, such as: '=http://SomeDomain.com/MaliciousHackerScript.php'.
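To give a rough idea, a suspicious entry might look something like the line below. This one is entirely made up (the IP address, request path, byte count, and script name are all invented for illustration), and your log format may differ:

203.0.113.5 - - [29/Jul/2011:03:14:07 -0500] "GET /index.php?page=http://SomeDomain.com/MaliciousHackerScript.php HTTP/1.1" 200 5120 "-" "Mozilla/5.0"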

To run the following example one-liner, which looks for RFI attack attempts (including the mentioned false positives), you would SSH into your webspace and change directory to your 'logs/' folder (or whatever your logs folder is named).
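Something along these lines, with your own username, host name, and log folder substituted in:

ssh yourusername@yourdomain.com          # log in to the server that hosts your site
cd logs/                                 # move into the folder holding the Apache logs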

Example one-liner:

zcat ./access.log.* |grep -i '=http' |grep ' 200 ' |grep -iE 'txt|php|cgi'
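One quick caveat before the breakdown: zcat reads gzip-compressed files, which on many hosts is how the older, rotated logs are stored (the exact names, like 'access.log.1.gz', vary by host). If your current log is uncompressed, you can run the same pipeline over it with plain cat:

cat ./access.log |grep -i '=http' |grep ' 200 ' |grep -iE 'txt|php|cgi'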

Now let's break down the above call:

zcat ./access.log.*

This command outputs all of the lines in your Apache access logs, and its output is then "piped" to the next search command:

grep -i '=http'

Which searches for occurrences of '=http' or '=HTTP'. The output of this is then “piped” to the command:

grep ' 200 '

Which searches for occurrences of the successful Apache status code '200'. The spaces around '200' in the pattern help it match the status code field rather than a '200' that happens to appear elsewhere in the line (such as in a byte count). The output of this is then "piped" to:

grep -iE 'txt|php|cgi'

That then searches for and outputs only the lines that include common extensions for hacker scripts, such as 'txt', 'php' or 'cgi'. Other extensions, and files that don't follow the normal patterns, are used as well, so you could replace this last step with 'less' to simply page through everything the previous search commands found. That will also show many potential false positives, though, which are just ordinary links to or from your site.
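That variation would look something like this (same assumptions about your log files as before):

zcat ./access.log.* |grep -i '=http' |grep ' 200 ' |less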

To further explain the grep options, or "switches", used above:

For 'grep -i', the 'i' option tells grep to ignore the case of the characters, so it picks up '=http', '=Http', '=hTtP', '=HTTP', etc.

The 'E' option in 'grep -iE' tells it that we are looking for an extended regular expression (in this case, any occurrence of txt, php, or cgi), instead of one specific piece of text.
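A tiny toy comparison (again with 'somefile.log' just standing in for a real file) shows why the 'E' matters:

grep -E 'txt|php|cgi' somefile.log          # matches lines containing txt, php, OR cgi
grep 'txt|php|cgi' somefile.log             # without -E, grep hunts for the literal text 'txt|php|cgi'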

--
Anyhow, that's more than I meant to go into for one post, but it's a starting point for looking into your website's security.

keep on keeping on, and bad guys be gone...
