Feb 06, 2015

SQL Injection became a favorite hacking technique in 2007. Despite being widely documented for so many years, it continues to evolve and be exploited. Because SQL Injection is such a well-known attack vector, I am always surprised when, as a sysadmin, I come across someone’s site that has been compromised by it. In most instances the site was compromised because user data entered on web forms was not properly validated. Classic ASP sites using inline SQL queries with hardcoded query string parameters are especially vulnerable. Fortunately, regardless of a site’s potential programming weaknesses, it can still be protected. In this walkthrough I will cover how to protect your site from SQL Injection using IIS Request Filtering.


Identifying SQL Injection Requests


Most people find out about SQL Injection the hard way, after their web site has been defaced or their database has been compromised. If you are not sure whether your site has been attacked, you need look no further than your web site’s traffic logs. Upon visual inspection of the IIS site logs you will see blocks of legitimate requests interspersed with malicious requests. Checking the HTTP status code at the end of each request will indicate whether there is any issue. If the status code was a 404 or a 500, you know the malicious request didn’t work. However, if the request returned an HTTP status code of 200, then you should be concerned.


Using Log Parser to Find SQL Injection Requests


Looking through web logs for malicious requests can be a tedious and time-consuming task. Microsoft’s wonderful log analysis tool, Log Parser, makes it easier to identify pages on your site that are being targeted by SQL Injection attacks. Running the query below will create a report of all the page requests in the log that have query string parameters.


logparser.exe -i:IISW3C "SELECT EXTRACT_FILENAME(cs-uri-stem) AS PageRequest, cs-uri-query, COUNT(*) AS TotalHits INTO results.txt FROM C:\wwwlogs\W3SVC35\u_ex141219.log GROUP BY cs-uri-stem, cs-uri-query ORDER BY TotalHits DESC"


Using Findstr to Find SQL Injection Requests


Using Log Parser to identify malicious requests is helpful; however, if you need to look at multiple sites’ logs the task becomes more challenging. For these situations I like to use Findstr. Findstr is a powerful Windows tool that uses regular expressions to search files for any string value. One powerful feature is that you can store your search strings in separate files and even exclude certain strings from being searched. In the example below, I use the /g parameter to have Findstr load a file named sqlinject.txt containing my predefined search strings and then search all the web logs in the W3SVC1 folder. The output is redirected to a file called results.txt.


findstr /s /i /p  /g:sqlinject.txt C:\wwwlogs\W3SVC1\*.log >c:\results.txt
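For illustration, a sqlinject.txt might contain entries like the ones below. These are example search strings based on the patterns discussed in this post, not my exact file, so tune them for your own sites:

cast(
declare
exec(
varchar(
xp_cmdshell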


Using this syntax it is easy to extend the capability of Findstr by creating a simple batch file that loops through all the web log folders on your server, as sketched below. Once set up, you will be able to identify any SQL Injection requests against your server within minutes.
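As a rough sketch, a batch file along these lines would sweep every W3SVC* log folder under C:\wwwlogs. The folder layout and file names here are assumptions; adjust them to match your server:

@echo off
rem Sweep all site log folders for the strings listed in sqlinject.txt
rem (paths below are examples; change them to match your environment)
for /d %%D in (C:\wwwlogs\W3SVC*) do (
    findstr /s /i /p /g:sqlinject.txt "%%D\*.log" >> C:\results.txt
)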


Configuring IIS Request Filtering


The Request Filtering module was introduced in IIS 7 as a replacement for the very capable UrlScan. Using Log Parser and Findstr you will be able to identify plenty of malicious requests attacking the web sites on your server. A Request Filtering rule can block requests based on file extensions, URLs, HTTP verbs, headers, or query strings. Additionally, you can block requests that exceed a maximum query string size or URL length.
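For example, the size limits can be set from the command line with appcmd. This is a sketch: "Default Web Site" is a placeholder site name and the limits shown are sample values, not recommendations:

%windir%\system32\inetsrv\appcmd.exe set config "Default Web Site" /section:requestFiltering /requestLimits.maxUrl:4096 /requestLimits.maxQueryString:2048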


Like any other IIS module, you can maintain the settings outside of IIS Manager by editing the web.config. The values are stored in the <requestFiltering> section, nested under <security> within <system.webServer>.
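A rough sketch of where the section sits in web.config (the requestLimits values here are examples only):

<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- request size limits -->
        <requestLimits maxUrl="4096" maxQueryString="2048" />
        <!-- Deny String rules created in IIS Manager are stored in <filteringRules> here -->
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>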


Filtering Rules


I typically create one rule for each Deny String that I want to block. You can add multiple strings to a single rule; however, I find it easier to maintain the configuration with only one string per rule. This way, if a rule is inadvertently blocking legitimate requests, you can quickly disable it while leaving the others operational. Either the request URL or the query string can be scanned for Deny Strings. However, enabling URL scanning requires a bit of forethought, because if the Deny String matches any part of the name of a page on your site, requests to that page will be blocked. For example, if you want to block requests containing the SQL command “update” but there happens to be a page called update.aspx on your site, any request to update.aspx will be blocked.


If I had to pick only one Deny String for filtering it would be cast(. Nearly every SQL Injection request I’ve seen uses cast(, and no legitimate page or query string parameter should contain it.
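Inside the <requestFiltering> section, such a single-string rule would look roughly like the sketch below, scanning only the query string so that page names are never affected (the rule name is arbitrary):

<filteringRules>
  <!-- One rule per Deny String so each one can be disabled independently -->
  <filteringRule name="BlockCast" scanUrl="false" scanQueryString="true">
    <denyStrings>
      <add string="cast(" />
    </denyStrings>
  </filteringRule>
</filteringRules>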


404 Status Codes


Any request blocked by Request Filtering will return a 404 status code with a specific substatus that identifies the reason the request was denied. A few common ones are listed below.

HTTP Substatus    Description
404.5             URL Sequence Denied
404.14            URL Too Long
404.15            Query String Too Long
404.18            Query String Sequence Denied
404.19            Denied by Filtering Rule


With Request Filtering enabled, it is easy to keep an eye on blocked requests. The Log Parser query below will create a report of all the requests with HTTP status 404 and a substatus greater than zero.


logparser.exe -i:IISW3C "SELECT EXTRACT_FILENAME(cs-uri-stem) AS FILENAME, cs-uri-query, COUNT(*) AS TotalHits INTO results.txt FROM C:\wwwlogs\W3SVC35\u_ex141219.log WHERE (sc-status = 404 AND sc-substatus > 0) GROUP BY cs-uri-stem, cs-uri-query ORDER BY TotalHits DESC"


Blocking User Agents


While investigating a recent SQL Injection attack, I noticed in the IIS logs that the site had been compromised by an automated tool. It was interesting to see how the first malicious request was very basic and each subsequent one became more elaborate, with increasingly complex SQL queries. What I found even more curious was that each request used the same User-Agent, which in fact identified the name of the tool and where to download it.

[Screenshot: IIS log entries showing the automated tool’s requests, later returning 404.19 once the rule was added]

The tool’s web site clearly states that its creators released it with the intention of helping system administrators discover vulnerabilities. Unfortunately, it’s far too easy for someone to use it with malicious intent. The good news is that blocking requests based on the User-Agent is quite easy. You just need to create a new rule, specify User-Agent as the header to scan, and then add the name of the agent to the Deny Strings. As you can see from the 404.19 status in the screenshot above, the automated tool was successfully blocked the next time it came around after the rule was added.
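In web.config terms, such a rule would look roughly like the sketch below. “BadScannerTool” is a made-up stand-in for the agent name, since the point is the header scan, not the specific tool:

<filteringRules>
  <filteringRule name="BlockScannerUserAgent" scanUrl="false" scanQueryString="false">
    <scanHeaders>
      <add requestHeader="User-Agent" />
    </scanHeaders>
    <denyStrings>
      <!-- hypothetical agent name; use the value seen in your own logs -->
      <add string="BadScannerTool" />
    </denyStrings>
  </filteringRule>
</filteringRules>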


In Summary


SQL Injection is a popular attack vector for web sites, but by leveraging IIS Request Filtering the malicious requests can be easily blocked. Using Log Parser and Findstr you can quickly check your site’s logs to identify suspicious activity. Thanks for reading.



Jul 27, 2014

The other day I was troubleshooting 100% CPU utilization on a SQL Server 2008 database server. The server had 100 or so databases of varying sizes; however, none were larger than a few hundred MB, and each database had a corresponding web site on a separate web server. Since the server hosted quite a few databases, the high CPU needed to be resolved quickly because it was causing issues for everyone. High CPU on a database server can often be symptomatic of issues occurring outside the server. In this case the real issue was in fact being caused by a SQL Injection attack on a web server.

Top Queries by Total CPU Time

The knee-jerk reaction when experiencing high CPU may be to stop it immediately, either by restarting services or recycling app pools; however, letting it run temporarily will help you isolate the cause. SQL Server 2008 has some great built-in reports to help track down CPU utilization. On this occasion I used the Top Queries by Total CPU Time report. You can get to this report by right clicking on the server name in SQL Server Management Studio and then selecting Reports.

The Top Queries by Total CPU Time report will take a few minutes to run, but once it completes it provides a wealth of information. You’ll get a Top 10 report clearly showing which queries and databases are consuming the most CPU on the server at that moment. Using this report I was able to see that one of the databases on the server had 4 different queries running that were contributing to the high CPU. Now I could focus my attention on this one problematic database and hopefully resolve the high CPU.
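If you prefer a query to the GUI, roughly the same information can be pulled from the plan cache DMVs. This is a sketch, not the exact query behind the built-in report:

-- Top 10 cached query statements by total CPU time
SELECT TOP 10
    qs.total_worker_time / 1000 AS total_cpu_ms,
    qs.execution_count,
    DB_NAME(st.dbid) AS database_name,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;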

SQL Profiler and Database Tuning Advisor

Now that I knew which database was causing the problems, I fired up SQL Profiler for just a few minutes. I wanted to get a better understanding of the activity occurring within the database. Looking at the high number of Reads coming from the app named “Internet Information Services”, I was starting to realize that web site activity was hitting the database pretty hard. I could also see plaintext data being inserted into the database, and it was clearly spam.

Before I turned my attention to the web site, however, I wanted to see if there could be any performance improvement from the Database Engine Tuning Advisor, since I now had valuable profiler trace data. The DTA will analyze the database activity and provide a SQL script with optimizations using indexes, partitioning, and indexed views. Usually with DTA I’ll see a 5-10% performance improvement. I was excited to see a 97% improvement!


Preventing SQL Injection with IIS Request Filtering

After I applied the optimization script from the Database Engine Tuning Advisor, the CPU utilization on the database server improved considerably. However, I knew the web site was experiencing suspicious activity, so I used Log Parser to get some reports from the site’s traffic log. Using the query below I could see the most frequently used query string values, and it was obvious the site was experiencing a SQL Injection attack.

logparser.exe -i:iisw3c "select top 20 count(*), cs-uri-query from ex140702.log group by cs-uri-query order by count(*) desc" -rtp:-1 >file.txt


With attacks like this a natural inclination is to start blocking IP addresses. Unfortunately, sophisticated attacks will use a variety of IP addresses, so as soon as you block a few addresses, malicious requests from new ones will take over. The best solution is to block the malicious requests with Request Filtering, so I quickly added a few rules to block keywords I had seen in my Log Parser reports.

Implementing the IIS Request Filtering rules stymied the SQL Injection attack. Using the Log Parser query below I could see the HTTP status codes of all the requests hitting the site with the new rules in place.

SELECT STRCAT(TO_STRING(sc-status), STRCAT('.', TO_STRING(sc-substatus))) AS Status, COUNT(*) AS Total INTO TopStatusCodes.txt FROM w3svc.log GROUP BY Status ORDER BY Total DESC


Request Filtering uses the HTTP substatus 404.18 when a query string sequence is denied. Looking at the Log Parser report below, you can see that 50,039 requests were blocked by the new Request Filtering rules.

[Log Parser report: top status codes by total requests]

An Ounce of Prevention…

The web site that had been attacked hosted free cooking recipes and allowed visitors to submit their own. Unfortunately, the owner’s goodwill was easily exploited because there was no form field validation on the site’s submission page, and new recipes were automatically displayed on the site without being approved. This is a dangerous site design and should never have been deployed without basic security measures in place.

I did a quick SELECT COUNT(*) against the recipe table in the database and was amused by all the “delicious” recipes I found.

In Summary

SQL Server 2008 has several built-in reports, like Top Queries by Total CPU Time, to help investigate high CPU utilization. Running SQL Profiler will provide a detailed analysis of database activity. Running the profiler output through the Database Engine Tuning Advisor can yield significant performance improvements for the database. IIS Request Filtering is a powerful tool to block SQL Injection attacks against a web site. However, SQL Injection can also be easily mitigated using basic data validation. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS
