SQL Injected!

My workplace was recently hit by the string of SQL injection attacks that has been wreaking havoc on IIS- and Microsoft-hosted websites. This was my first experience with a malicious attack at the server level. I found some great information over at the IIS forums here. Here are some of the steps we took to fix it.

  1. Added a robots.txt file to the application root to prevent indexing of non-relevant pages.  This was a common-sense tactic.  An article over at the Internet Storm Center showed how the attacker's application finds websites to infect by iterating through search results in Google. Since we were hit in a database that is only used by a back-end administration tool and a front-end form submission tool, neither of which needs to be indexed or has relevant SEO content, we decided to hide these sites. Security by stealth.
  2. Prevented non-North American IP addresses from accessing our web sites.  This was a fairly weak tactic since an IP address is easy to spoof, but we decided that some firewall protection would be better than none.
  3. Changed database connections to use a limited user account.  Our applications all ran under a single dbo user account.  We created a new limited user account that did not have rights to the database system objects, which prevents the attack script from executing.  Luckily, the infected database served only a single application, so this solution was easy to test and implement.  For connection strings that are shared by several applications, or applications that may not function under a limited user account, this will be much tougher to implement.
  4. Added database triggers as well as verbose transaction logging so we would be prepared if the infection hit us again.
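For step 1, the simplest robots.txt just tells well-behaved crawlers to skip the site entirely; a minimal sketch (note this only deters legitimate search engines, so it is stealth rather than real protection):

```
# Placed at the application root, e.g. /robots.txt
User-agent: *
Disallow: /
```

If only certain areas need hiding, `Disallow` can instead list specific paths such as the admin tool's directory.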
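The limited-account change in step 3 might look roughly like the following T-SQL sketch. All names and the exact permission set here are hypothetical; the right grants depend on your schema and SQL Server version:

```sql
-- Hypothetical sketch: create a least-privilege login/user for the web app
CREATE LOGIN webapp_limited WITH PASSWORD = 'ChangeThisPassword!';
CREATE USER webapp_limited FOR LOGIN webapp_limited;

-- Grant only the table access the application actually needs
GRANT SELECT, INSERT, UPDATE ON dbo.example TO webapp_limited;

-- No rights to system objects: the injection script walks the system
-- catalog to find every text column, so denying it that visibility
-- stops it from enumerating tables to infect
DENY VIEW DEFINITION TO webapp_limited;
```

The connection string then uses `webapp_limited` instead of the dbo account.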

Great, problem solved, right?  Unfortunately, this was not the case, and we were re-infected almost immediately.  As it turns out, while everything in the database is administered by the single application, there was one form from this application's precursor that selected a single record from the new application's database.  This was news to me.  After reviewing the code, we found that this legacy application had been hacked by a developer to query the new database by taking the URL query string and selecting from the database with it directly.  Here's an example:

string sqlcommand = "select * from dbo.example where id = '" + Request.QueryString["id"] + "'";

Well, hackable code doesn't get much more obvious than that.  We immediately changed the code to first cast the parameter to an integer (since it was a numeric field) and then made the database query through a stored procedure so we could more easily protect ourselves in the future.  If the field had been a text field, we would have had to perform some substring detection for bad/reserved words or use regular expressions to define the acceptable range of characters.
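The two fixes (cast the id to an integer, and keep user input out of the SQL string) can be sketched with a parameterized query. Our actual fix was in C#/ASP.NET using a stored procedure, so this self-contained Python/sqlite3 version is purely illustrative, with hypothetical table and function names:

```python
import sqlite3

# Hypothetical stand-in for the dbo.example table from the vulnerable query
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE example (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO example (id, name) VALUES (1, 'widget')")

def get_example(raw_id):
    # Fix 1: cast to int -- a payload like "1; DROP TABLE example"
    # raises ValueError before any SQL runs
    safe_id = int(raw_id)
    # Fix 2: parameterized query -- the driver binds the value,
    # so it is never spliced into the SQL text
    return conn.execute(
        "SELECT id, name FROM example WHERE id = ?", (safe_id,)
    ).fetchone()

print(get_example("1"))  # (1, 'widget')
try:
    get_example("1; DROP TABLE example")
except ValueError:
    print("rejected malicious input")
```

In ADO.NET the equivalent is passing the value through a `SqlParameter` (or a stored-procedure parameter) rather than concatenating it into the command text.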

This was one of those experiences that you have to go through first-hand to appreciate the amount of time and resources it takes to fix.  Security is usually the last thing on your mind during the rapid development of web-based applications, but it only takes going through this kind of experience to remember the importance of proper security procedures.