Aug 04 2014
 

Another one of the great built-in features of IIS 8 is Dynamic IP Restrictions (DIPR). With a few simple configuration steps you can quickly set limits for blocking IP addresses based on the number of concurrent requests or the frequency of requests over a period of time. With these parameters in place, IIS blocks offending requests unattended, making your server more secure.

Before DIPR was available, on IIS 7 you could easily block one IP or a range of IPs manually in the IP Address and Domain Restrictions module. However, this could be a time-consuming task if your server was under attack. Using a tool like Log Parser to examine the site's logs, you could identify IPs with suspicious activity, but you still had to enter Deny rules manually. Determined hackers will use a variety of IPs from proxy servers, so by the time you've blocked a handful a new range could be starting up. DIPR was released out-of-band for IIS 7 and IIS 7.5, so you can leverage this great security tool on those web servers as well. In this walk-through I cover how to configure Dynamic IP Restrictions and show a test in action.

Installing Dynamic IP Restrictions

Open Server Manager and go to the Web Server role. Under Security, ensure that IP and Domain Restrictions is installed.

[Screenshot: Server Manager - IP and Domain Restrictions role service under Web Server > Security]
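
If you prefer to script the installation, the same role service can be added from an elevated PowerShell prompt. This is a sketch for Windows Server 2012; verify the feature name on your version with Get-WindowsFeature:

# Adds the IP and Domain Restrictions role service to an existing IIS installation
Install-WindowsFeature Web-IP-Security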

 

IP Address and Domain Restrictions in IIS Manager

Open IIS Manager and click on IP Address and Domain Restrictions.

[Screenshot: IP Address and Domain Restrictions in IIS Manager]

 

From this window you can either Add Allow Entry rules or Add Deny Entry rules. These rules are for manually allowing or blocking a single IP address or an IP address range. You have to be careful when blocking an IP range because you could inadvertently block legitimate traffic. Click Edit Dynamic Restriction Settings to set the dynamic thresholds for blocking IP addresses.

[Screenshot: IP Address and Domain Restrictions - Add Allow Entry, Add Deny Entry, and Edit Dynamic Restriction Settings actions]
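
For reference, these manual entries can also be managed directly in web.config under the ipSecurity section; a minimal sketch with placeholder addresses:

<system.webServer>
  <security>
    <!-- allowUnlisted="true" keeps the default behavior of allowing anything not listed -->
    <ipSecurity allowUnlisted="true">
      <!-- Deny a single address -->
      <add ipAddress="203.0.113.50" allowed="false" />
      <!-- Deny a range using an address plus subnet mask -->
      <add ipAddress="203.0.113.0" subnetMask="255.255.255.0" allowed="false" />
    </ipSecurity>
  </security>
</system.webServer>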

 

Click Edit Feature Settings to set the Deny Action Type. In this example I've set Forbidden, so blocked requests will receive an HTTP 403 status error. These errors will also be recorded in the site's log for us to review later.

[Screenshot: Edit IP and Domain Restrictions Settings - Deny Action Type set to Forbidden]

 

On the Dynamic IP Restriction Settings screen you can set the maximum number of concurrent requests allowed before an IP address is blocked, and you can also deny IP addresses based on the frequency of requests over a period of time.

[Screenshot: Dynamic IP Restriction Settings dialog]

 

As always, depending on the volume of your web site's traffic, you should test these settings to ensure that legitimate traffic does not get blocked.
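
For reference, here is roughly what those settings look like in web.config on IIS 8; the thresholds below are illustrative placeholders, not recommendations:

<system.webServer>
  <security>
    <dynamicIpSecurity denyAction="Forbidden">
      <!-- Deny an IP while it has more than 20 requests in flight -->
      <denyByConcurrentRequests enabled="true" maxConcurrentRequests="20" />
      <!-- Deny an IP that makes more than 100 requests within a 2-second window -->
      <denyByRequestRate enabled="true" maxRequests="100" requestIntervalInMilliseconds="2000" />
    </dynamicIpSecurity>
  </security>
</system.webServer>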

 

Testing Dynamic IP Address Blocking

I didn't have a real security incident available for testing the DIPR module, so I did the next best thing. Using Fiddler, the free debugging tool from Telerik, and StressStimulus, a free load-testing plugin from StimulusTechnology, I hammered my test virtual server for a few minutes and got the desired results. With Fiddler open you will see the StressStimulus module. From here you can record your test case or open an existing one, as well as edit the test case parameters.

[Screenshot: Fiddler with the StressStimulus add-on]

 

Test Results

StressStimulus gives you multiple detailed charts to review to gauge the performance of your site and identify potential areas of weakness. For my test I chose to hit the wp-login.php page on my test WordPress site with 3 concurrent requests and 100 iterations. The test completed within a few minutes.

[Screenshot: StressStimulus test results]

 

Visiting the test page from the server running StressStimulus I get the expected result: the request is blocked with a 403 error. The full description of this code is 403.502 – Forbidden: Too many requests from the same client IP; Dynamic IP Restrictions.

[Screenshot: Blocked request returning HTTP 403.502]

 

Using the Log Parser query below to analyze the site log I see that 331 requests were blocked with a 403.502 status code.

SELECT TOP 100
STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, sc-status, sc-substatus,
EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
COUNT(*) AS TotalHits, c-ip
FROM w3svc.log TO top-403-ip-requests
GROUP BY cs-uri-stem, sc-status, sc-substatus, c-ip
ORDER BY TotalHits DESC

[Screenshot: Log Parser output - requests grouped by status, substatus, and client IP]

 

Further examination of the log with Log Parser shows the full breakdown of the requests blocked with a 403 status.

SELECT TOP 100
STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, sc-status, sc-substatus,
EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
COUNT(*) AS TotalHits, c-ip
FROM w3svc.log TO top-403-ip-requests
WHERE sc-status=403
GROUP BY cs-uri-stem, sc-status, sc-substatus, c-ip
ORDER BY TotalHits DESC



[Screenshot: Log Parser output - breakdown of requests blocked with a 403 status]

 

Summary

The Dynamic IP Restrictions module is built into IIS 8 and available as a separate download for IIS 7 and IIS 7.5. It is a powerful tool for blocking automated attacks on your site and requires minimal configuration and maintenance. Thanks for reading.

Jul 27 2014
 

The other day I was troubleshooting 100% CPU utilization on a SQL Server 2008 database server. The server hosted 100 or so databases of varying sizes, none larger than a few hundred MB, and each database had a corresponding web site on a separate web server. Since the server hosted quite a few databases, the high CPU needed to be resolved quickly because it was causing issues for everyone. High CPU on a database server can often be symptomatic of issues occurring outside the server. In this case the real issue was in fact being caused by a SQL Injection attack on a web server.

Top Queries by Total CPU Time

The knee-jerk reaction when experiencing high CPU may be to stop it immediately, either by restarting services or recycling app pools, but letting it run temporarily will help you isolate the cause. SQL Server 2008 has some great built-in reports to help track down CPU utilization. On this occasion I used the Top Queries by Total CPU Time report. You can get to this report by right-clicking on the server name in SQL Server Management Studio and then selecting Reports.

[Screenshot: SQL Server Management Studio - Reports menu]

 

The Top Queries by Total CPU Time report takes a few minutes to run, but once it completes it provides a wealth of information. You'll get a Top 10 report clearly showing which queries and databases are consuming the most CPU on the server at that moment. Using this report I was able to see that one of the databases on the server had 4 different queries running that were contributing to the high CPU. Now I could focus my attention on this one problematic database and hopefully resolve the high CPU.

 

[Screenshot: Top Queries by Total CPU Time report]
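
If you prefer a query over the report, a similar top-10 list can be pulled straight from the plan cache with the query stats DMV. This is a generic sketch of that approach, not the exact query behind the built-in report:

-- Top 10 cached statements by total CPU; total_worker_time is reported in microseconds
SELECT TOP 10
    DB_NAME(st.dbid) AS DatabaseName,
    qs.execution_count,
    qs.total_worker_time / 1000 AS TotalCpuMs,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
            WHEN -1 THEN DATALENGTH(st.text)
            ELSE qs.statement_end_offset END - qs.statement_start_offset) / 2) + 1) AS StatementText
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;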

 

SQL Profiler and Database Tuning Advisor

Now that I knew which database was causing the problems, I fired up SQL Profiler for just a few minutes. I wanted to get a better understanding of the activity occurring within the database. Looking at the high number of Reads coming from the app named "Internet Information Services", I was starting to realize that web site activity was hitting the database pretty hard. I could also see plaintext data being inserted into the database, and it was clearly spam.

[Screenshot: SQL Profiler trace showing reads from Internet Information Services]

 

Before I turned my attention to the web site, however, I wanted to see if there could be any performance improvement using the Database Engine Tuning Advisor, since I had the valuable profiler trace data. The DTA will analyze the database activity and provide a SQL script with optimizations using indexes, partitioning, and indexed views. Usually with DTA I'll see a 5-10% performance improvement, so I was excited to see a 97% improvement!

[Screenshot: Database Engine Tuning Advisor recommendations]
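
The same analysis can also be scripted with dta.exe, the command-line version of the tuning advisor. A rough sketch, with the server, database, trace file, and output paths as placeholders:

dta -S MyServer -E -D MyDatabase -if C:\traces\capture.trc -s TuningSession1 -of C:\traces\recommendations.sql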

Preventing SQL Injection with IIS Request Filtering

After I applied the optimization script from the Database Engine Tuning Advisor, the CPU utilization on the database server improved considerably. However, I knew the web site was experiencing suspicious activity, so I used Log Parser to get some reports from the site's traffic log. Using the query below I could see the most frequently used querystring values, and it was obvious the site was experiencing a SQL Injection attack.

logparser.exe -i:iisw3c "select top 20 count(*), cs-uri-query from ex140702.log
group by cs-uri-query order by count(*) desc" -rtp:-1 >file.txt

 

With attacks like this, a natural inclination is to start blocking IP addresses. Unfortunately, sophisticated attacks will use a variety of IP addresses, so as soon as you block a few, malicious requests from new ones will take over. The best solution is to block the malicious requests with Request Filtering, so I quickly added a few rules to block keywords I had seen in my Log Parser reports.

[Screenshot: IIS Request Filtering rules]
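
The rules end up in web.config under Request Filtering's deny query string sequences. Here is a minimal sketch using a few commonly abused keywords rather than the exact list from my reports; test carefully so legitimate query strings are not blocked:

<system.webServer>
  <security>
    <requestFiltering>
      <denyQueryStringSequences>
        <!-- Requests whose query string contains these sequences are rejected -->
        <add sequence="xp_cmdshell" />
        <add sequence="DECLARE @" />
        <add sequence="CAST(" />
      </denyQueryStringSequences>
    </requestFiltering>
  </security>
</system.webServer>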

 

Implementing the IIS Request Filtering rules stymied the SQL Injection attack. Using the Log Parser query below I could see the HTTP status codes of all the requests hitting the site with the new rules in place.

SELECT STRCAT(TO_STRING(sc-status), STRCAT('.', TO_STRING(sc-substatus))) AS Status, COUNT(*) AS Total
FROM w3svc.log TO TopStatusCodes.txt GROUP BY Status ORDER BY Total DESC

 

Request Filtering uses the HTTP substatus 404.18 when a query string sequence is denied. Looking at the Log Parser report below, you can see that 50,039 requests were blocked by the new Request Filtering rules.

[Screenshot: Log Parser status code report - 50,039 requests denied with 404.18]

An Ounce of Prevention…

The web site that had been attacked hosted free cooking recipes and allowed visitors to submit their own. Unfortunately, the owner's goodwill was easily exploited because there was no form field validation on the site's submission page, and new recipes were automatically displayed on the site without being approved. This is a dangerous site design and should never have been deployed without basic security measures in place.
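
Beyond validating the form fields, the injection itself can be neutralized by parameterizing the query instead of concatenating user input into it. A minimal T-SQL sketch; the table and column names here are hypothetical:

-- The submitted value is passed as a parameter, so any SQL it contains is treated as data rather than executed
DECLARE @submittedTitle nvarchar(200) = N'Chocolate Cake''; DROP TABLE Recipes;--';
EXEC sp_executesql
    N'SELECT RecipeId, Title FROM dbo.Recipes WHERE Title = @Title',
    N'@Title nvarchar(200)',
    @Title = @submittedTitle;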

I did a quick select count(*) on the recipe table in the database and was amused by all the delicious recipes I found.

[Screenshot: select count(*) results from the recipe table]

 

In Summary

SQL Server 2008 has several built-in reports, like Top Queries by Total CPU Time, to help investigate high CPU utilization. Running SQL Profiler will provide detailed analysis of database activity, and running the profiler output through the Database Tuning Advisor can yield significant performance improvements for the database. IIS Request Filtering is a powerful tool to block SQL Injection attacks against a web site, but SQL Injection can also be easily mitigated with basic data validation. Thanks for reading.

Oct 07 2013
 

When you need quick analysis of your traffic logs, you won't find a better tool than Microsoft's free Log Parser. With Log Parser you can read a variety of log files, including the Registry and Windows event logs. Its ease of use comes from using SQL queries against your log file. You can get your data even faster by using multiple Log Parser queries in a batch file.

[Chart: Top 10 Requests - exploded 3-D pie generated by Log Parser]

The other day I was helping someone who needed some "top 10" data from their site's log. Since I had these queries in my trusty batch file, I could provide the text reports within seconds. However, I like to offer a little more pizzazz when possible, so this time I decided to use Log Parser's native charting capability to output the results with some nice charts. As the saying goes, a picture is worth a thousand words.

Here’s the query I used to create the chart above:

logparser.exe -i:iisw3c "select top 10 cs-uri-stem, count(*)  into top10requests.gif 
from <file> group by cs-uri-stem order by count(*) desc" 
-o:CHART -chartType:pieexploded3d -categories:off -chartTitle:"Top 10 Requests"

 

Installing Office Web Components

Charting is a native feature of Log Parser; however, there is a dependency on the Office 2003 Add-in: Office Web Components. Depending on where you are running Log Parser, the first time you try to output your query to a chart you may see this error message:

Error creating output format “CHART”: This output format requires a licensed Microsoft Office Chart Web Component to be installed on the local machine

If you didn't see the error above then you're all set, but if you did see it, you will need to install the Office Web Components before you can start outputting charts. Once you've downloaded the file, just accept the License Agreement and click Install.

[Screenshot: Office Web Components installer - license agreement]

The installation runs quickly. Click OK to close the window.

[Screenshot: Office Web Components installation complete]

 

Example Log Parser Reports with Charts

Now you're ready to start creating some colorful charts. The most useful parameters in my opinion are -chartType, -chartTitle, -categories, -values, and -legend. There are some 20+ chart types that you can choose from, including Pie, PieExploded, PieExploded3D, LineStacked, Line3D, BarClustered, ColumnClustered, and SmoothLine. The default chart type is Line. To see all the possible chart options run this simple command:

LogParser -h -o:CHART

To take your charts to the highest level of customization you can use an external configuration script written in JScript or VBScript. Take a look at the MSDN ChartSpace Object Model documentation for more information.

Here are a few different charts with various options.

[Chart: Top 10 Requests - exploded 3-D pie]

logparser.exe -i:iisw3c "select top 10 cs-uri-stem, count(*)  into top10requests.gif 
from x.log group by cs-uri-stem order by count(*) desc" 
-o:CHART -chartType:pieexploded3d -categories:off -chartTitle:"Top 10 Requests"

 

 

[Chart: Top Status Codes - 3-D column]

logparser.exe -i:iisw3c "select top 10 sc-status, count(*)  into top10errorcodes.gif 
from x.log group by sc-status having sc-status not in ('200') order by count(*) desc" 
-o:CHART -chartType:column3d -categories:on -values:on -chartTitle:"Top Status Codes"

 

 

[Chart: Top 10 404 Status - clustered 3-D bar]

logparser.exe -i:iisw3c "select top 10 cs-uri-stem, count(*)  into top10_404.gif 
from x.log group by cs-uri-stem, sc-status having sc-status in ('404') order by count(*) desc" 
-o:CHART -chartType:BarClustered3D -values:on -categories:on -chartTitle:"Top 10 404 Status"

 

[Chart: Hits per Minute - line]

logparser.exe -i:iisw3c "select quantize(time, 60) as TimeGenerated, count(*) as Hits into 
hitsperminute.gif from %1 group by TimeGenerated" -o:chart -chartType:Line -chartTitle:"Hits per Minute"

 

 

 

[Chart: Top 10 Slowest Requests - clustered 3-D bar]

 

logparser.exe -i:iisw3c "SELECT TOP 10 cs-uri-stem AS RequestedFile, COUNT(*) AS TotalHits, 
MAX(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime into slow.gif from x.log 
where EXTRACT_FILENAME(cs-uri-stem) not in('%begin%') GROUP BY cs-uri-stem ORDER BY MaxTime, TotalHits DESC" 
-o:CHART -chartType:barclustered3d -values:off -categories:on -chartTitle:"Top 10 Slowest Requests"

 

In Summary

Microsoft's Log Parser is a powerful tool for log file analysis. You can use it to analyze text files, CSV files, Windows event logs, and even the Windows Registry. You can make boring reports come alive with colorful charts. There is a dependency on Office Web Components for charting to work, but that is easily solved. Thanks for reading.

Nov 10 2012
 

The other day I was checking the traffic stats for my WordPress blog to see which of my posts were the most popular. I was a little concerned to see that wp-login.php was in the top 5 total requests almost every month. Since I'm the only author on my blog, my logins could not possibly account for the traffic hitting that page.

[Screenshot: Traffic stats showing wp-login.php among the top requests]

The only explanation was that the additional traffic was coming from automated hacking attempts. Any server administrator concerned about security knows that "footprinting" is one of the first things a hacker will do when checking for vulnerable sites. I checked a few of my non-WordPress sites, and sure enough, every couple of days page requests were coming in for wp-login.php and for URLs belonging to other CMS products such as Plesk's Cpanel.

Using Log Parser, I wanted to see how many IPs were hitting my login page during a particular month. I like to keep my complex queries in a separate query file and then output the results to a text file; in a previous post I showed how you can leverage this technique to automate Log Parser to run multiple queries at once. Here's the command I used to check every log file for my site in the month of October. It outputs a simple text file clearly showing thousands of requests to wp-login.php from across the internet during that month.

logparser.exe file:wplogin.sql?destination=wpadmin.txt -o:NAT

Here is the SQL query contained in wplogin.sql. I am using a couple of advanced techniques, such as searching multiple log files and filtering on dates to get results for one month:

SELECT c-ip, COUNT(*) AS TotalHits
FROM ./log/u_ex*.log TO %destination%
WHERE (EXTRACT_FILENAME(cs-uri-stem) = 'wp-login.php'
AND TO_STRING(TO_TIMESTAMP(date, time), 'MM') = '10')
GROUP BY c-ip
ORDER BY TotalHits DESC

Having always followed best practices for hosting sites, I was confident that my sites were not vulnerable to these automated probes, but it was still a bit irksome to know they were occurring. And there's always the possibility of some zero-day exploit showing up, so I decided the simplest thing to do was to block them. IIS now has a great new module available for Dynamic IP Restrictions; however, that would only block the requests once a threshold was reached. I needed something more "nuclear": a means to block everyone hitting that page except for me, and with the help of the IIS URL Rewrite module a simple rewrite rule was going to do the trick.

In another previous post I showed how easy it is to use URL Rewrite for SEO and security. Now you can see how to tighten security even more by aborting requests based on almost any criteria. In my particular case I wanted to abort requests to wp-login.php from any IP address except my own. So the next logical question is "what's my IP address?" There are several sites online which can give you this information; probably the easiest to remember is whatsmyip.org, which displays your public IP address very clearly at the top of the page. This is the IP that you want to put in the rule below. It is important to note the use of negate="true", which means all IPs except the one referenced. The next logical question is: what if I need to log in to WordPress from more than one IP address, such as from home and at the office? That is simply a matter of adding an additional condition to the rule, as shown in the sketch after it.

<rewrite>
  <rules>
    <rule name="Block wp login" stopProcessing="true">
      <match url="(^wp-login.php)" ignoreCase="true" />
      <conditions>
        <add input="{REMOTE_ADDR}" pattern="10\.10\.10\.10" negate="true" />
      </conditions>
      <action type="AbortRequest" />
    </rule>
  </rules>
</rewrite>
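
For example, to allow logins from both home and the office, give each trusted address its own negated condition. With the default MatchAll grouping the request is aborted only when it matches neither; the addresses below are placeholders:

<rule name="Block wp login" stopProcessing="true">
  <match url="(^wp-login.php)" ignoreCase="true" />
  <conditions logicalGrouping="MatchAll">
    <!-- Home IP -->
    <add input="{REMOTE_ADDR}" pattern="10\.10\.10\.10" negate="true" />
    <!-- Office IP -->
    <add input="{REMOTE_ADDR}" pattern="192\.0\.2\.25" negate="true" />
  </conditions>
  <action type="AbortRequest" />
</rule>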

So, in summary, Log Parser makes it very easy to get quick extracts of data from your web site's logs, and IIS 7's URL Rewrite module not only offers powerful rewriting and redirecting of URLs but also an easy way to tighten the security of your site. Thanks for reading.

Sep 25 2012
 

Microsoft's Log Parser is a really powerful tool for searching log files. With just a few simple SQL commands you can pull more data than you ever imagined out of your logs. It can be used to read web site log files, CSV files, and even Windows event logs. Log Parser isn't intended to compete with stats products such as WebTrends Log Analyzer or SmarterStats by SmarterTools, which I feel is the best on the market.

I primarily use Log Parser for troubleshooting problems with a site, such as identifying HTTP errors or long-running page requests. When I look into an issue I also like to get a snapshot of the day's traffic, such as the top IP addresses making requests and the top bots hitting the site. This is all available from Log Parser once you know the right queries to run.

As one can imagine, when a site is having a problem you want to know as much information as quickly as possible. Manually running Log Parser queries on a site's log files is not easy when you have an urgent problem to resolve. My biggest challenge with using Log Parser is remembering the different queries I want and getting the syntax correct. Solving this challenge is easier than one might think: just create the scripts you need ahead of time.

I just create a batch file called logparse.bat, use the %1 parameter for the name of the log file that I want to analyze, and redirect the output of each query to a text file.

Here are the queries I am using:

 
"c:\temp\logparser.exe" -i:iisw3c "select top 25 count(*), cs-uri-stem from %1 group by cs-uri-stem order by count(*) desc" -rtp:-1 >top25req.txt

"c:\temp\logparser.exe" "select Top 10 count(*), c-ip from %1 group by c-ip order by Count(*) DESC" -o:csv >topIP.txt

"c:\temp\logparser.exe" "select Top 10 count(*), c-ip, cs(User-Agent) from %1 group by c-ip, cs(User-Agent) order by Count(*) DESC" -o:csv >topBot.txt

Running this from the command line is simple. You just specify the path to the log file as a parameter.


c:\temp\logparse c:\wwwlogs\w3svc1\u_ex120921.log

Running this will create 3 files showing the top 25 requests, the top bots hitting the site, and the top IP addresses making requests. This is valuable information, but we really need more to know what's going on with our site. Here's a more advanced query using a SQL file. Just as before, we're going to use parameters and then call it from our logparse.bat file. This query is stored in a separate file called topstatuscodes.sql:

SELECT STRCAT(TO_STRING(sc-status), STRCAT('.', TO_STRING(sc-substatus))) AS Status,
COUNT(*) AS Total FROM %source% TO %destination% GROUP BY Status ORDER BY Total DESC

Here is how we call this external file from our logparse.bat file:

"c:\temp\logparser.exe" file:TopStatusCodes.sql?source=%1+destination=TopStatusCodes.txt -o:NAT 

Here is what the output of this particular query looks like:

Immediately we can see that our site had 1,564 HTTP 404 errors as well as 22 HTTP 500 errors. That many 404 errors indicates a problem with the site’s design and could impact performance. So let’s create another query and output the results to a file to see where the 404 errors are coming from:

SELECT
TOP 10
STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, sc-status,
EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
COUNT(*) AS TotalHits,
MAX(time-taken) AS MaxTime,
AVG(time-taken) AS AvgTime,
AVG(sc-bytes) AS AvgBytesSent
FROM %source% TO %destination%
where sc-status=404
GROUP BY cs-uri-stem, sc-status
ORDER BY MaxTime, TotalHits DESC

Here is how we run this query from our logparse.bat file:

"c:\temp\logparser.exe" file:Top10-404-WebRequests.sql?source=%1+destination=Top10-404-WebRequests.txt -o:NAT

The output of this query provides all the information we need to track down the issue including request path and the requested file.


Let’s look at one more of my favorite Log Parser queries: the top 10 longest running page requests. This will show us the 10 slowest pages on the site. This can be invaluable information when diagnosing a problem. Here is the query which I save in a file called top10webrequests.sql:

SELECT
TOP 10
STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath,
EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
COUNT(*) AS TotalHits,
MAX(time-taken) AS MaxTime,
AVG(time-taken) AS AvgTime,
AVG(sc-bytes) AS AvgBytesSent
FROM %source% TO %destination%
GROUP BY cs-uri-stem
ORDER BY MaxTime, TotalHits DESC

Here is how we call it from our logparse.bat file. It will redirect the output to a file called top10webrequests.txt:

"c:\temp\logparser.exe" file:Top10WebRequests.sql?source=%1+destination=Top10WebRequests.txt -o:NAT

The numbers shown in this report may seem incredibly high at first glance, but do not be alarmed. The IIS log records time-taken in milliseconds, so you have to divide each of the numbers by 1,000 to get seconds. Doing that, we can see that one request took well over 2 minutes to complete and another averages 19 seconds to complete. These are red flags that need immediate investigation.
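
If you would rather have the report already converted, the division can be done in the query itself. A sketch using Log Parser's DIV function; the output file name is just an example:

"c:\temp\logparser.exe" "SELECT TOP 10 cs-uri-stem, COUNT(*) AS TotalHits, DIV(MAX(time-taken), 1000) AS MaxSeconds, DIV(AVG(time-taken), 1000) AS AvgSeconds FROM %1 GROUP BY cs-uri-stem ORDER BY DIV(MAX(time-taken), 1000) DESC" -rtp:-1 >top10seconds.txt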

So, in summary, I have shown 6 great Log Parser queries you can run from a batch file to automate your web log analysis. Anytime someone reports a performance problem you can provide valuable data within seconds. But this is really just scratching the surface: you could add 5-10 more queries to the batch file to get even more data depending on your needs; one example is shown below.
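
For instance, this extra line (my own addition, not part of the original batch file) buckets the traffic by hour so you can spot spikes at a glance:

"c:\temp\logparser.exe" -i:iisw3c "SELECT QUANTIZE(time, 3600) AS Hour, COUNT(*) AS Hits FROM %1 GROUP BY Hour ORDER BY Hour" -rtp:-1 >hitsperhour.txt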

Please note: if you use the examples I’ve provided be sure to change the paths to where your logparser.exe is located.
