Aug 28, 2014
 

Depending on how many sites your Windows web server is hosting, maintaining IIS logs can be a challenge. IIS logs provide valuable insight into the traffic your sites are experiencing as well as detailed SEO metrics and performance data. A typical web server will have just enough free disk space for future growth needs but ultimately will be limited by the capacity of the drives in the server. If left unmonitored, IIS logs can quickly fill any remaining disk space on the server. There are a few 3rd party tools that are good at compressing log files when they sit under one parent folder, but when the log files are in different locations, such as on a WebsitePanel server I support, an alternative solution is needed. In this walkthrough I will demonstrate how I solved this challenge using .NET and GZipStream.

 

What about Windows Folder Compression?

Enabling Windows folder compression should always be the first step to try when freeing up space used by IIS logs. I always redirect the site logs to a folder at the root level of the server so it’s easy to find them and monitor their growth. In one case I saved 8.3 GB of disk space without even having to zip any files. However, Windows folder compression was not going to work for my WebsitePanel server.


Finding WebsitePanel IIS Logs

WebsitePanel is a wonderful Windows control panel for multi-tenant web hosting, and the best part is that it’s free. If you haven’t discovered this incredible application yet, check it out. Once installed, every aspect of site creation and maintenance is completely automated. Customer sites are usually deployed with a directory structure like c:\HostingSpaces\customername\sitename. The IIS logs are always stored further down the tree in …\sitename\logs\W3SVCXX. As one can imagine, on a web server with hundreds of sites it would be quite a challenge to track down and manually apply Windows compression on all the W3SVC folders. This would be further complicated as new sites are added to the server.


Using Directory.EnumerateFiles and GZipStream

I knew I was going to do some coding to solve my challenge, but I also wanted to keep the solution easy to use. In years past I would have considered VBScript folder recursion and executing the PKZip command line tool, but I wanted something more contemporary. So I used Visual Studio 2013 and created a Console Application.


The GZipStream class, which was introduced in .NET 2.0, would handle the necessary file compression. Since I would only be compressing individual files, I didn’t have to deal with using a separate library such as DotNetZip.

public static void Compress(FileInfo fileToCompress)
{
    using (FileStream originalFileStream = fileToCompress.OpenRead())
    {
        // Skip hidden files and files that have already been compressed.
        if ((File.GetAttributes(fileToCompress.FullName) &
            FileAttributes.Hidden) != FileAttributes.Hidden && fileToCompress.Extension != ".gz")
        {
            // Write the compressed copy alongside the original log file.
            using (FileStream compressedFileStream = File.Create(fileToCompress.FullName + ".gz"))
            {
                using (GZipStream compressionStream = new GZipStream(compressedFileStream, CompressionMode.Compress))
                {
                    originalFileStream.CopyTo(compressionStream);
                    Console.WriteLine("Compressed {0} from {1} to {2} bytes.",
                        fileToCompress.Name, fileToCompress.Length, compressedFileStream.Length);
                }
            }
        }
    }
}

The Directory.EnumerateFiles method, introduced in .NET 4, would find all the IIS log files in the WebsitePanel hosting spaces directory tree. Passing SearchOption.AllDirectories makes the search recursive, and a LINQ Where clause limits the results to paths containing “W3SVC”, so only .log files in the IIS log folders are matched. In the snippet below I loop through the search results, compress any log file older than 1 day, and delete the original file. The last step is to write the name of the file that was just zipped to an audit log.

public static void zipFiles()
{
    try
    {
        // Find every .log file under the hosting spaces tree whose path
        // contains "W3SVC" (the IIS log folders).
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log", 
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            // Only touch logs that haven't been written to for at least a day.
            if (fi.LastWriteTime < DateTime.Now.AddDays(-1))
            {
                Compress(fi);
                fi.Delete();
                activityLog("Zipping: " + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}

I only needed 30 days’ worth of log files to be left on the server. In the snippet below I delete any zipped log files matching my date criteria and then write the activity to the audit log.

public static void deleteFiles()
{
    try
    {
        // Find the .gz archives created earlier and purge the expired ones.
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.gz", 
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            if (fi.LastWriteTime < DateTime.Now.AddDays(-daysToSave))
            {
                fi.Delete();
                activityLog("Deleting: " + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}

Using an XML Config File

To help keep my application flexible I created a simple XML file to store the path to the WebsitePanel hosting spaces folder and the number of days of logs I wanted to keep on the server.

<?xml version="1.0" encoding="utf-8"?>
<MyConfig>
  <HomeDirectory>D:\HostingSpaces</HomeDirectory>
  <DaysToSave>30</DaysToSave>
</MyConfig>

 

Loading these values into my program is done by parsing the XML data using XmlDocument.

public static void readConfig()
{
    // The config file is expected to sit alongside the executable.
    string path = System.IO.Path.GetFullPath(@"logCleanConfig.xml");
    XmlDocument xmlDoc = new XmlDocument();
    xmlDoc.Load(path);

    XmlNodeList HomeDirectory = xmlDoc.GetElementsByTagName("HomeDirectory");
    homeDirectory = HomeDirectory[0].InnerXml;

    XmlNodeList DaysToSave = xmlDoc.GetElementsByTagName("DaysToSave");
    daysToSave = Convert.ToInt32(DaysToSave[0].InnerXml);
}
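For completeness, here is a minimal sketch of how these pieces could be wired together in the console application’s entry point. The field and method names match the snippets above; activityLog is the logging helper used throughout and isn’t shown here.

public static string homeDirectory;
public static int daysToSave;

public static void Main(string[] args)
{
    readConfig();   // load HomeDirectory and DaysToSave from logCleanConfig.xml
    zipFiles();     // compress .log files older than one day and remove the originals
    deleteFiles();  // purge .gz archives older than daysToSave
}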

 

Running the Program

Running the program from a command line I can see the progress as files are compressed. Afterwards I set up a scheduled task so it will run every day to compress new IIS log files and purge the ones older than 30 days.
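A daily task can be registered from the command line along these lines; the task name, executable path, and run time below are placeholders. One caveat: readConfig resolves logCleanConfig.xml against the current working directory, so set the scheduled task’s “Start in” folder to the folder containing the executable.

schtasks /create /tn "Compress IIS Logs" /tr "C:\Tools\IISLogCleanup.exe" /sc daily /st 02:00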


A few days later I came across several sites on the server using CMS products with their own application logs. These log files were not being cleaned up by the site owners and were taking up free space on the server, so I decided to archive them as well. Since they were being stored within each site in a subfolder called “Logs”, adding a second condition to the Where clause of the Directory.EnumerateFiles query would catch them. With a quick rebuild of the program, it would now compress the CMS log files as well as the IIS log files.

foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log",
  SearchOption.AllDirectories).Where(d => d.Contains("W3SVC") || d.Contains("Logs")).ToArray())

In Summary

IIS logs contain a wealth of data about web site traffic and performance. Left unchecked, over time the logs will accumulate and take up disk space on the server. Using Windows folder compression is a simple solution to reclaim valuable disk space. If your log files are in different locations, then leveraging the power of .NET with Directory.EnumerateFiles and GZipStream will do the trick. Thanks for reading.

Aug 07, 2014
 

The FTP protocol is some 43 years old now, yet it continues to be one of the most widely used file transfer technologies available. Over the years it has been shown to be vulnerable to brute force attacks, packet capture, and other attack vectors. Fortunately, with IIS 8 on Windows Server 2012 your FTP server doesn’t have to be vulnerable. It goes without saying that FTP Authentication and Authorization are the most fundamental methods to secure your server. Here are three additional things you can do to increase the security of your server’s FTP service and minimize its attack footprint.

 

IIS 8 FTP Logon Attempt Restrictions

One of the most common FTP attack vectors is the dictionary attack. Using automated tools, hackers will repeatedly hammer your FTP site with thousands of username and password combinations hoping to find that one account with an easy password. My site’s FTP log showed page after page of this automated activity. Fortunately none of these attempts were successful.


IIS 8 now features FTP Logon Attempt Restrictions. This powerful feature is not available in IIS 7 or IIS 7.5. Once configured, automated logon attacks will be stopped in their tracks. From IIS Manager, simply click on FTP Logon Attempt Restrictions.


Configuring the FTP Logon Attempt Restrictions module is easy. Simply choose how many logon attempts are to be allowed and the time period in which they may occur. When you consider that most FTP clients will save passwords in a profile, legitimate users on your FTP site should only need 1-2 logon attempts. However, depending on how many FTP users you’re hosting and their technical savvy, you may need to tweak these settings.


Testing my FTP site with the new logon attempt restrictions in place, it is easy to see how well it works. After the threshold is exceeded, my connection to the server is forcibly closed. Automated hack attempts will no longer be a threat to this FTP server.


Enable FTP Over SSL with IIS 8

The FTP protocol wasn’t originally designed for encryption. Fortunately, with IIS 8 (and IIS 7) your FTP sessions can now be encrypted with SSL. To configure FTPS, also known as FTP over SSL, open IIS Manager. You can either specify using SSL when adding FTP Publishing to a site, or just go to the FTP SSL Settings on an existing site. Connecting with SSL can either be optional or you can force all connections to use it. Using the drop-down menu, choose the certificate that you want to be used to encrypt the connections. Windows Server 2012 has a default certificate available, however you are also welcome to install your own 3rd party certificate.


After you’ve configured the SSL settings on the server, you just need to change your FTP client connection properties. In my testing I used a legacy version of Cute FTP 8.0; depending on which FTP client you’re using, your protocol menu will look different.


Having changed my FTP client settings, I attempt to connect to the server using SSL. The first time you connect to the server you will be prompted to accept the new SSL certificate. The client log confirmed that my FTP session was being properly established with SSL. My communication with the server is now secure and protected. Here is a more detailed walk-through of configuring FTP over SSL on IIS 8.
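You can also verify the SSL handshake from code instead of a GUI client. Here is a minimal .NET sketch, where the host name and credentials are placeholders; setting EnableSsl makes FtpWebRequest negotiate TLS before the credentials are sent over the wire.

using System;
using System.Net;

class FtpsCheck
{
    static void Main()
    {
        // Placeholder host and credentials; substitute your own.
        var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/");
        request.Method = WebRequestMethods.Ftp.ListDirectory;
        request.EnableSsl = true;  // negotiate TLS before logging in
        request.Credentials = new NetworkCredential("ftpuser1", "password");

        using (var response = (FtpWebResponse)request.GetResponse())
        {
            Console.WriteLine("Status: {0}", response.StatusDescription);
        }
    }
}

If the server is still presenting the default self-signed certificate, validation may fail; for testing you can hook ServicePointManager.ServerCertificateValidationCallback to accept it.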


Configuring IIS 8 FTP User Isolation

When IIS 7 was released, the FTP service had been completely redesigned from the ground up with security in mind. This was a welcome change indeed from IIS 6. In addition to supporting FTP over SSL, it introduced FTP User Isolation. Multiple users on the same FTP site could be kept separated, regardless of which file path each was logged into, without risk of someone traversing up parent paths into other user folders.

The FTP Authorization rules make it easy to grant multiple users or even local groups access to the FTP server. The user isolation is accomplished by creating a virtual directory called LocalUser and then choosing User name directory (disable global virtual directories). The LocalUser virtual directory should point to the FTP root directory, and then you create a separate virtual directory for each FTP user which points to their destination path.
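Expressed with appcmd, the layout might look like this sketch, where the site name, user names, and paths are all illustrative:

appcmd add vdir /app.name:"FTP Site/" /path:/LocalUser /physicalPath:"C:\inetpub\ftproot"
appcmd add vdir /app.name:"FTP Site/" /path:/LocalUser/ftpuser1 /physicalPath:"D:\HostingSpaces\customer1\site1"
appcmd add vdir /app.name:"FTP Site/" /path:/LocalUser/ftpuser2 /physicalPath:"D:\HostingSpaces\customer2\site2"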


With FTP User Isolation configured, your users will never be able to move up to a parent path beyond their individual root directory. Even if a user were able to correctly guess the username and virtual path of another FTP account on the server, they would not be able to reach it. Due to the confines of the isolation, the FTP session cannot see anything else on the server. In my test I logged in with the local account ftpuser2 and attempted to change the path to /ftpuser1; however, that path does not exist for my session and therefore is not accessible. Here is a more detailed walkthrough of configuring FTP User Isolation on IIS 8.


In Summary

IIS 8 on Windows Server 2012 offers the most secure FTP service of any IIS version to date. You have multiple layers of FTP security available by leveraging FTP Logon Attempt Restrictions, FTP over SSL, and FTP User Isolation. Your FTP server will be well protected using these built-in modules. With internet security there is no ‘patch’ for complacency. More security is always better, so implement it when it’s readily available to you. Thanks for reading.

Aug 04, 2014
 

Another one of the great built-in features of IIS 8 is Dynamic IP Restrictions (DIPR). With a few simple configuration steps you can quickly set limits for blocking IP addresses based on the number of concurrent requests or the frequency of requests over a period of time. With these parameters in place, IIS takes over blocking requests unattended, thereby making your server more secure.

Before DIPR was available, on IIS 7 you could manually block one IP or a range of IPs easily in the IP Address and Domain Restrictions module. However, this could be a time consuming task if your server was under attack. Using a tool like Log Parser to examine the site’s logs you could identify IPs with suspicious activity, but then you still had to manually enter Deny Rules. Determined hackers will use a variety of IPs from proxy servers, so by the time you’ve blocked a handful a new range could be starting up. DIPR was released out-of-band for IIS 7 and IIS 7.5, so you can leverage this great security tool on those web servers as well. In this walkthrough I cover how to configure Dynamic IP Restrictions and even show a test in action.

Installing Dynamic IP Restrictions

Open Server Manager and go to the Web Server role. Under Security, ensure that IP and Domain Restrictions is installed.


IP Address and Domain Restrictions in IIS Manager

Open IIS Manager and click on IP Address and Domain Restrictions.


From this window you can either Add Allow Entry rules or Add Deny Entry rules. These rules are for manually blocking (or allowing) one IP address or an IP address range. You have to be careful when blocking an IP range because you could inadvertently block legitimate traffic. Click on Edit Dynamic Restriction Settings to set the dynamic thresholds for blocking IP addresses.


Click Edit Feature Settings to set the Deny Action Type. In this example I’ve set Forbidden, so blocked requests will receive an http 403 status error. These errors will also be recorded in the site’s log for us to review later.


On the Dynamic IP Restriction Settings screen you can choose the maximum number of concurrent requests to allow before an IP address is blocked. You can also deny IP addresses based on the frequency of requests over a period of time.


As always, depending on the volume of your web site’s traffic, you should test these settings to ensure that legitimate traffic does not get blocked.
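Under the covers these settings are stored in the system.webServer/security/dynamicIpSecurity section of the configuration. A sketch of what that might look like, with illustrative threshold values:

<system.webServer>
  <security>
    <dynamicIpSecurity denyAction="Forbidden">
      <denyByConcurrentRequests enabled="true" maxConcurrentRequests="5" />
      <denyByRequestRate enabled="true" maxRequests="20" requestIntervalInMilliseconds="1000" />
    </dynamicIpSecurity>
  </security>
</system.webServer>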

 

Testing Dynamic IP Address Blocking

I didn’t have a real security incident available for testing the DIPR module, so I did the next best thing. Using Fiddler, the free debugging tool from Telerik, and StressStimulus, a free load testing plugin from StimulusTechnology, I hammered my test virtual server for a few minutes and got the desired results. With Fiddler open you will see the StressStimulus module. From here you can record your test case or open an existing test case, as well as edit the test case parameters.


Test Results

StressStimulus gives you multiple detailed charts to review to gauge the performance of your site and identify potential areas of weakness. For my test I chose to hit the wp-login.php page on my test WordPress site with 3 concurrent requests and 100 iterations. The test completed within a few minutes.


Visiting the test page from the server running StressStimulus I get the expected result: it’s blocked with a 403 error. The full description of this code is 403.502 – Forbidden: Too many requests from the same client IP; Dynamic IP.


Using the Log Parser query below to analyze the site log, I see that 331 requests were blocked with a 403.502 status code.

SELECT TOP 100
STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, sc-status, sc-substatus,
EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
COUNT(*) AS TotalHits, c-ip
INTO top-403-ip-requests
FROM w3svc.log
GROUP BY cs-uri-stem, sc-status, sc-substatus, c-ip
ORDER BY TotalHits DESC


Further examination of the log with Log Parser shows the full breakdown of the requests blocked with a 403 status.

SELECT TOP 100
STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, sc-status, sc-substatus,
EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
COUNT(*) AS TotalHits, c-ip
INTO top-403-ip-requests
FROM w3svc.log
WHERE sc-status = 403
GROUP BY cs-uri-stem, sc-status, sc-substatus, c-ip
ORDER BY TotalHits DESC




 

Summary

The Dynamic IP Restrictions module is available for IIS 8 as well as IIS 7 and IIS 7.5. It is a powerful tool to block automated attacks on your site, and it requires minimal configuration and maintenance. Thanks for reading.

Jul 27, 2014
 

The other day I was troubleshooting 100% CPU utilization on a SQL Server 2008 database server. The server hosted 100 or so databases of varying sizes, however none were larger than a few hundred MB, and each database had a corresponding web site on a separate web server. Since the server hosted quite a few databases, the high CPU needed to be resolved quickly because it was causing issues for everyone. High CPU on a database server can often be symptomatic of issues occurring outside the server. In this case the real issue was in fact being caused by a SQL Injection attack on a web server.

Top Queries by Total CPU Time

The knee-jerk reaction when experiencing high CPU may be to stop it immediately, either by restarting services or recycling app pools; however, letting it run temporarily will help you isolate the cause. SQL Server 2008 has some great built-in reports to help track down CPU utilization. On this occasion I used the Top Queries by Total CPU Time report. You can get to this report by right clicking on the server name in SQL Server Management Studio and then selecting Reports.
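The report draws on the plan cache, so if you prefer a query, something along these lines surfaces roughly the same top offenders (a sketch; total_worker_time is reported in microseconds, shown here as milliseconds):

SELECT TOP 10
    qs.total_worker_time / 1000 AS TotalCpuMs,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS QueryText
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;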


The Top Queries by Total CPU Time report will take a few minutes to run, but once it completes it provides a wealth of information. You’ll get a Top 10 report clearly showing which queries and databases are consuming the most CPU on the server at that moment. Using this report I was able to see that one of the databases on the server had 4 different queries running that were contributing to the high CPU. Now I could focus my attention on this one problematic database and hopefully resolve the high CPU.

 


 

SQL Profiler and Database Tuning Advisor

Now that I knew which database was causing the problems, I fired up SQL Profiler for just a few minutes. I wanted to get a better understanding of the activity that was occurring within the database. Looking at the high number of Reads coming from the app named “Internet Information Services”, I was starting to realize that web site activity was hitting the database pretty hard. I could also see plaintext data being inserted into the database, and it was clearly spam.


Before I turned my attention to the web site, however, I wanted to see if there could be any performance improvement using the Database Engine Tuning Advisor, since I had the valuable profiler trace data. The DTA will analyze the database activity and provide a SQL script with optimizations using indexes, partitioning, and indexed views. Usually with DTA I’ll see a 5-10% performance improvement. I was excited to see a 97% improvement!


Preventing SQL Injection with IIS Request Filtering

After I applied the optimization script from the Database Engine Tuning Advisor, the CPU utilization on the database server improved considerably. However, I knew the web site was experiencing suspicious activity, so I used Log Parser to get some reports from the site’s traffic log. Using the query below I could see the most frequently used querystring values, and it was obvious the site was experiencing a SQL Injection attack.

logparser.exe -i:iisw3c "select top 20 count(*), cs-uri-query from ex140702.log
group by cs-uri-query order by count(*) desc" -rtp:-1 >file.txt

 

With attacks like this, a natural inclination is to start blocking IP addresses. Unfortunately, sophisticated attacks will use a variety of IP addresses, so as soon as you block a few addresses, malicious requests from new ones will take over. The best solution is to block the malicious requests with Request Filtering, so I quickly added a few rules to block keywords I had seen in my Log Parser reports.
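Rules of this kind live under system.webServer/security/requestFiltering in the site’s web.config. A sketch of what they might look like; the sequences shown here are illustrative, not the actual keywords from this attack:

<system.webServer>
  <security>
    <requestFiltering>
      <denyQueryStringSequences>
        <add sequence="DECLARE" />
        <add sequence="CAST(" />
        <add sequence="EXEC(" />
      </denyQueryStringSequences>
    </requestFiltering>
  </security>
</system.webServer>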


Implementing the IIS Request Filtering rules stymied the SQL Injection attack. Using the Log Parser query below, I could see the http status codes of all the requests hitting the site with the new rules in place.

SELECT STRCAT(TO_STRING(sc-status), STRCAT('.', TO_STRING(sc-substatus))) AS Status, COUNT(*) AS Total
INTO TopStatusCodes.txt
FROM w3svc.log
GROUP BY Status
ORDER BY Total DESC

 

Request Filtering uses the http substatus 404.18 when a query string sequence is denied. Looking at the Log Parser report below, you can see that 50,039 requests were blocked by the new Request Filtering rules.


An Ounce of Prevention…

The web site that had been attacked hosted free cooking recipes and allowed visitors to submit their own recipes. Unfortunately, the owner’s goodwill was easily exploited because there was no form field validation on the site’s submission page, and new recipes were automatically displayed on the site without being approved. This is a dangerous site design and should never have been deployed without basic security measures in place.

I did a quick select count(*) on the recipe table in the database and was amused by all the delicious recipes I found.


In Summary

SQL Server 2008 has several built-in reports, like Top Queries by Total CPU Time, to help investigate high CPU utilization. Running SQL Profiler will provide detailed analysis of database activity. Running the profiler output through the Database Tuning Advisor can yield significant performance improvements for the database. IIS Request Filtering is a powerful tool to block SQL Injection attacks against a web site. However, SQL Injection can also be easily mitigated using basic data validation. Thanks for reading.

Dec 27, 2013
 

Thanks to Microsoft’s Web Platform Installer (Web PI), installing IIS has never been so easy. Before Web PI was available, you had to use the Server Manager to install the Web Server (IIS) role and then select the various Role Services that you needed enabled. Depending on your level of expertise this could be a challenging task, with lots of scrolling back and forth and click upon click to get things just right, but now you can have IIS deployed with just 3 clicks of your mouse.

Install Web PI

If you’re not familiar with Web PI, it is a powerful tool that can be used to install not only IIS but also SQL Server Express, Visual Web Developer Express, PHP, WordPress, Umbraco, and many other 3rd party applications from the Windows Web Application Gallery. If you haven’t already done so, first download Web PI and install it. It’s free and has a small footprint of only 2 MB.


Select IIS Recommended Configuration

Once Web PI has been installed, just launch the program. It will open to the Spotlight tab, so click on the Products tab and click Add next to IIS Recommended Configuration. If you don’t see it in the opening list, just search for it. All you need to do after this is click Install at the bottom of the window.
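Web PI also ships with a command-line tool, WebpiCmd.exe, if you’d rather script the installation. As a sketch; the product ID used for the recommended configuration below is an assumption on my part, so list the available products first to confirm the exact name:

"%ProgramFiles%\Microsoft\Web Platform Installer\WebpiCmd.exe" /List /ListOption:Available
"%ProgramFiles%\Microsoft\Web Platform Installer\WebpiCmd.exe" /Install /Products:IISRecommendedConfiguration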

 


 

You may be curious as to what options are installed with the IIS Recommended Configuration. Here is what will be installed:

  • ASP.NET
  • Static Content
  • Default Document
  • Directory Browsing
  • HTTP Errors
  • HTTP Logging
  • Logging Tools
  • Request Monitor
  • .NET Extensibility
  • Request Filtering
  • Static Content Compression
  • ISAPI Extensions
  • ISAPI Filters
  • WAS Process Model
  • Management Console
  • WAS Configuration API
  • WAS .NET Environment
  • .NET 4.5 Extended with ASP.NET for Windows 8
  • .NET 3.5 for Windows 8

Before the installation starts, you need to accept the license terms, so just click I Accept.


The installation will run for a few minutes installing the essential features for IIS to work properly.


Once Web PI has completed installing IIS just click Finish.


Using IIS Manager

Your server is now ready for hosting web sites. Open IIS Manager and you’ll see the Default web site has been configured.


When you browse http://localhost you’ll see the familiar IIS Start Page.


This page is named iisstart.htm and appears in the Default Documents list above default.aspx, so once you upload your web site files be sure to delete this page.

Next Steps?

Now that you have IIS installed, what’s next? You’ll want to go back to Web PI and at least install FTP Publishing. Once you have FTP Publishing installed, you should look into configuring FTP User Isolation as well as using FTP over SSL for greater security when transferring content to and from your server. You may also want to look at installing Url Rewrite 2.0 from Web PI. Url Rewrite offers many ways to rewrite urls for SEO and perform 301 redirects, as well as blocking page requests.

Summary

The Web Platform Installer (Web PI) is a powerful tool for deploying a wide variety of 3rd party applications such as WordPress and other popular CMS products, but it can also be used to install IIS or even SQL Server Express on your server. Web PI offers unparalleled ease and convenience when installing applications on Windows servers. Thanks for reading.

Oct 07, 2013
 

When you need quick analysis of your traffic logs, you won’t find a better tool than Microsoft’s free Log Parser. With Log Parser you can read a variety of log files including the Registry and Windows event logs. Its ease of use comes from using SQL queries against your log file. You can get your data even faster by using multiple Log Parser queries in a batch file.


The other day I was helping someone who needed some “top 10” data from their site’s log. Since I had these queries in my trusty batch file, I could provide the text reports within seconds. However, I like to offer a little more pizzazz when possible, so this time I decided to use Log Parser’s native charting capability to output the results with some nice charts. As the saying goes, a picture is worth a thousand words.

Here’s the query I used to create a 3D exploded pie chart of the top 10 requests:

logparser.exe -i:iisw3c "select top 10 cs-uri-stem, count(*)  into top10requests.gif 
from <file> group by cs-uri-stem order by count(*) desc" 
-o:CHART -chartType:pieexploded3d -categories:off -chartTitle:"Top 10 Requests"

 

Installing Office Web Components

Charting is a native feature of Log Parser; however, there is a dependency on the Office 2003 Add-in: Office Web Components. Depending on where you are running Log Parser, the first time you try to output your query to a chart you may see this error message:

Error creating output format "CHART": This output format requires a licensed Microsoft Office Chart Web Component to be installed on the local machine

If you didn’t see the error above then you’re all set, but if you did, it will be necessary to install the Office Web Components before you can start outputting charts. Once you’ve downloaded the file, just accept the License Agreement and click Install.


The installation runs quickly. Click OK to close the window.


Example Log Parser Reports with Charts

Now you’re ready to start creating some colorful charts. The most useful parameters in my opinion are -chartType, -chartTitle, -categories, -values, and -legend. There are some 20+ chart types that you can choose from, including: Pie, PieExploded, PieExploded3D, LineStacked, Line3D, BarClustered, ColumnClustered, SmoothLine. The default chart type is Line. To see all the possible chart options run this simple command:

LogParser -h -o:CHART

To take your charts to the highest level of customization you can use an external configuration script written in JScript or VBScript. Take a look at the MSDN ChartSpace Object Model documentation for more information.
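As a small, hedged example, a script passed via the CHART output format’s -config parameter can adjust the chart before it is saved. The chartSpace object is what Log Parser exposes to the script, and the property names below come from the OWC ChartSpace object model:

// chartconfig.js, passed to Log Parser via -config:chartconfig.js
chartSpace.HasChartSpaceTitle = true;
chartSpace.ChartSpaceTitle.Caption = "Site Traffic Overview";
chartSpace.Charts(0).HasLegend = true;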

Here are a few more example queries with various chart options.


logparser.exe -i:iisw3c "select top 10 cs-uri-stem, count(*)  into top10requests.gif 
from x.log group by cs-uri-stem order by count(*) desc" 
-o:CHART -chartType:pieexploded3d -categories:off -chartTitle:"Top 10 Requests"

 

 


logparser.exe -i:iisw3c "select top 10 sc-status, count(*)  into top10errorcodes.gif 
from x.log group by sc-status having sc-status not in ('200') order by count(*) desc" 
-o:CHART -chartType:column3d -categories:on -values:on -chartTitle:"Top Status Codes"

 

 


logparser.exe -i:iisw3c "select top 10 cs-uri-stem, count(*)  into top10_404.gif 
from x.log group by cs-uri-stem, sc-status having sc-status in ('404') order by count(*) desc" 
-o:CHART -chartType:BarClustered3D -values:on -categories:on -chartTitle:"Top 10 404 Status"

 


logparser.exe -i:iisw3c "select quantize(time, 60) as TimeGenerated, count(*) as Hits into 
hitsperminute.gif from %1 group by TimeGenerated" -o:chart -chartType:Line -chartTitle:"Hits per Minute"

 

 

 


 

logparser.exe -i:iisw3c "SELECT TOP 10 cs-uri-stem AS RequestedFile, COUNT(*) AS TotalHits, 
MAX(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime into slow.gif from x.log 
where EXTRACT_FILENAME(cs-uri-stem) not in('%begin%') GROUP BY cs-uri-stem ORDER BY MaxTime, TotalHits DESC" 
-o:CHART -chartType:barclustered3d -values:off -categories:on -chartTitle:"Top 10 Slowest Requests"

 

In Summary

Microsoft’s Log Parser is a powerful tool for log file analysis. You can use it to analyze text files, CSV files, Windows event logs, and even the Windows Registry. You can make boring reports come alive with colorful charts. There is a dependency on Office Web Components for charting to work, but that is easily solved. Thanks for reading.

Jul 22, 2013
 

Redirecting visitors on your site from one page to another is handled by using either a 301 redirect or a 302 redirect. The numbers 301 and 302 refer to the http status code that is returned by the web server to your browser. They may seem similar, but they are quite different: a 302 indicates a temporary change and a 301 indicates a permanent change. This difference is important to understand and will impact how search engines see content changes on your site. There are a number of ways to implement a 301 redirect on your web site. Some are easier than others to configure, and your options will depend on the version of IIS you are using. Here’s the story of how I recently had to use the global.asax and Application_BeginRequest to do a 301 redirect.

Unforeseen consequences of revoking a certificate

The other day I was helping someone who had revoked their site’s SSL certificate. They were no longer going to use SSL on their site, so they figured revoking the certificate was a logical decision. Unfortunately, what they had not realized was that the https:// url of their site had been indexed by most search engines and they were getting a lot of organic traffic to their site using that url. By revoking the certificate, many of their visitors were now seeing dire warnings in their browsers instead of going to their site. This was not good.


Not being a technical person, they figured that just removing the certificate from the site bindings would solve their problem. This was not a good idea. On the one hand it stopped the browser security warnings from being displayed, but in fact it just caused a different problem. People were still accessing the https:// url of their site, so instead of a security warning they were now just seeing an error. Using Fiddler you can see that a 502 error is generated when you try to access a site using https without having a binding for it.


The need for a redirect

We needed to take visitors accessing the https url of the site and send them to the http url. This is the perfect application of a 301 or 302 redirect. However, here’s where things got a little more complicated. Ordinarily I would just use Url Rewrite or even a Url Rewrite Map to handle the 301 redirects. Unfortunately, the site was hosted on IIS 6, so we couldn’t use Url Rewrite. Furthermore, we only needed to redirect incoming requests using SSL. The site content was fine, so page level redirects such as a meta tag refresh weren’t going to help in this case either.

Since the site was using .NET 2.0, I decided to use the Application_BeginRequest event in the global.asax. This is the first event in the HTTP pipeline chain of execution when ASP.NET responds to a request. Using this event I created a conditional statement to test the HTTPS server variable to see if the request was being made using SSL or not. If the request was made with SSL, then we would redirect it to the http url of the site. Bear in mind, however, that Response.Redirect’s default status is 302, a temporary redirect. In my situation I needed a 301 permanent redirect so that search engines would drop the https url from their index, so I had to add the extra line of Response.StatusCode = 301.
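Reconstructed from that description, the handler in global.asax looked something like the sketch below; the host name is a placeholder:

void Application_BeginRequest(object sender, EventArgs e)
{
    // The HTTPS server variable is "on" for SSL requests and "off" otherwise.
    if (Request.ServerVariables["HTTPS"] == "on")
    {
        // Redirect without ending the response so the status code can be overridden,
        // then replace the default 302 with a permanent 301.
        Response.Redirect("http://www.example.com" + Request.RawUrl, false);
        Response.StatusCode = 301;
    }
}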


At this point I was pretty satisfied I had solved my friend’s problem. I had set up a test site with an SSL certificate and the redirect worked great. Unfortunately, when I set it up on the live site (with the revoked certificate) nothing happened. It turned out that because the site’s certificate had been revoked, browsers weren’t actually loading the site, which in turn meant the redirect wasn’t happening. There was only one way to solve this last piece of the puzzle, and that meant putting in a valid SSL certificate again. So I created a Certificate Signing Request for my friend’s site and within minutes they had a new $9 RapidSSL certificate from Namecheap.com. Once the new certificate was bound to the site, the https page requests started working again, and then our custom 301 redirect in the global.asax was able to do its job.

 

Testing a 301 Redirect

Because I needed the redirect to be permanent, I wanted to be sure it was really returning a 301 status. Checking the web site’s www logs would have confirmed this, but that’s a bit cumbersome, especially when a tool like Fiddler makes it so easy to check. Fiddler is a free web debugging tool, and inspecting the response headers in Fiddler confirmed the redirect was in fact returning a 301 status code.


If you need to remove a url from a search engine’s index you can contact them directly:

https://support.google.com/webmasters/answer/164734?hl=en

http://www.bing.com/webmaster/help/how-can-i-remove-a-url-or-page-from-the-bing-index-37c07477

Please note that this is not a fast process, and using a 301 permanent redirect is the best solution.

Summary

Sending traffic to a different location on your site can be accomplished using either a 301 permanent redirect or a 302 temporary redirect; using the right one will ensure your search engine ranking isn’t impacted. There are many techniques to implement a redirect, such as using Url Rewrite, meta tags, or even Response.Redirect. If you’re going to revoke an SSL certificate or remove one from your site, first be absolutely sure that there isn’t traffic using the certificate. Thanks for reading.

Dec 09, 2012
 

The other day I was helping someone who was trying to configure a wildcard certificate on their Windows Cloud Server. Their server was running Windows 2008 R2 using IIS 7. They were technically savvy and knew how to configure sites on their own and install a regular SSL certificate, but they were stuck trying to get a wildcard certificate configured properly.

They had quite a few sites configured using subdomains such as support.domain.com, mail.domain.com, login.domain.com, etc. To tighten security they decided to use SSL to protect all these sites, so they bought a wildcard certificate for *.domain.com. They installed the new certificate on the 1st site correctly, but when they tried doing it on the 2nd site they couldn’t: IIS wouldn’t let them assign the certificate to the other sites using a shared IP address. Does this sound familiar? Here’s how you can solve it, and it’s easier than you think.

In my example, 4 sites are configured in IIS using host headers like xxx.domain.com, all sharing the same IP address.


After installing our wildcard SSL certificate we assign the binding on the first site.


Testing the site we see that the wildcard SSL certificate is working great.


Now we go to site #2 and try to assign our certificate. However, we’re not able to enter a host name for site #2.


If we click OK and try to proceed, we get a warning about the dire consequences of our actions. As soon as we try to access site #2 using SSL, IIS will actually stop site #1, which causes all kinds of issues for our users.


Now that we’ve seen the problem, let’s get to the solution. According to my friend, former coworker, and IIS MVP Scott Forsyth, it turns out that this is not a bug and the IIS team designed it to work this way. There are 2 ways to solve this particular issue when using a wildcard SSL certificate. One way is to simply execute the following appcmd command for each binding that you need.

appcmd set site /site.name:"<site name>" /+bindings.[protocol='https',bindingInformation='*:443:<host header>']

This certainly works; however, I tend to have a hard time remembering the syntax, which leads us to the 2nd method, which in my opinion is far easier and has to do with how the wildcard SSL certificate was originally installed.

Remember back when you had just received the completed wildcard certificate from your SSL vendor? Like most IT people you were probably in a hurry when you installed it. Do you remember what you entered when you were prompted for a Friendly name before saving it? Chances are you just entered “domain.com”, however what you should have specified is “*.domain.com”. Doh!

You can check this easily by looking at the certificate store in IIS Manager. If the Name column doesn’t show the *, then you need to change it before SSL bindings on multiple sites will work properly.


So how does one change the Friendly name after the certificate has already been installed? Open the MMC Snap-In for Certificates. Right-click on the certificate and change the Friendly name to *.domain.com. Save the changes and close out the MMC.
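If you’d rather script it than click through the MMC, the Friendly name can also be set from .NET. A minimal sketch, assuming the certificate is in the local machine’s Personal store and domain.com stands in for your domain:

using System;
using System.Security.Cryptography.X509Certificates;

class FixFriendlyName
{
    static void Main()
    {
        var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
        store.Open(OpenFlags.ReadWrite);

        // Find the wildcard certificate and rewrite its Friendly name in place.
        foreach (X509Certificate2 cert in store.Certificates.Find(
            X509FindType.FindBySubjectName, "domain.com", false))
        {
            cert.FriendlyName = "*.domain.com";  // the * prefix is what IIS looks for
        }

        store.Close();
    }
}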


Now that the Friendly name has been changed to *.domain.com, go back to IIS and try to add the SSL binding for site #2, and now it works. Woohoo!


Now you can add your wildcard certificate to as many subdomain host header sites as needed using only 1 IP address, and you don’t have to remember any programming syntax. I hope this helps, and thanks for reading.

Sep 13, 2012
 

Before IIS 7, if you wanted to do url rewriting with IIS 6 you had to use a 3rd party program such as ISAPI Rewrite by helicontech.com. This was a good program, but it wasn’t native to IIS and there were limitations, such as when a site hosted more than 1 domain with different applications running.

With IIS 7, url rewriting and redirecting has never been easier thanks to Microsoft’s Url Rewrite module. The rewriting is done by rules which are specified in the web.config under the <system.webServer> element. Using IIS Manager you can use the Url Rewrite GUI to create and maintain your rules.


You can also just put the rules directly into the web.config without using the GUI. For example:

<system.webServer>
<rewrite>
<rules>
<rule name="xyz">...blah...</rule>
</rules>
</rewrite>
</system.webServer>

IIS 7 Url Rewrite WWW

One of the most common needs for SEO is to force your site to use www for all page requests so that search engines will go to www.mydomain.com instead of mydomain.com. This is very easy to do with IIS 7’s Url Rewrite. Here is the rule:

<rewrite>
<rules>
<rule name="Redirect to www" patternSyntax="Wildcard" stopProcessing="true">
<match url="*" />
<conditions>
<add input="{HTTP_HOST}" pattern="peterviola.com" />
</conditions>
<action type="Redirect" url="http://www.peterviola.com/{R:0}" />
</rule>
</rules>
</rewrite>

This works really well and it is a completely seamless experience for your web site visitors.

IIS 7 Url Rewrite HTTP to HTTPS

Probably the second most common use of Url Rewrite is for sites that have SSL certificates installed and need to seamlessly redirect page requests to use the certificate, for either the entire site or a particular folder. Here is the Url Rewrite rule for redirecting requests for the entire site. You simply detect if the request is not secure and then redirect to the secure channel:

<rewrite>
<rules>
<rule name="HTTP Redirect to HTTPS" enabled="true" stopProcessing="true">
<match url="(.*)" ignoreCase="false" />
<conditions>
<add input="{HTTPS}" pattern="off" />
</conditions>
<action type="Redirect" url="https://{HTTP_HOST}/{R:1}" appendQueryString="true" redirectType="Permanent" />
</rule>
</rules>
</rewrite>

IIS 7 Url Rewrite HTTP to HTTPS on Subfolder

The example above is great, but running your entire site over HTTPS will have a performance impact, so you don’t need to do it unless there is a specific business requirement for it. So then we need a rule to redirect requests to HTTPS for just one folder. In this example we’ll use a folder called “/secure”. We use the same rule as above, however now we only want page requests for the “secure” folder. This is done by modifying the “match url” element.

<rewrite>
<rules>
<rule name="HTTPS on subfolder" enabled="true">
<match url="(^secure/.*)" ignoreCase="false" />
<conditions>
<add input="{HTTPS}" pattern="off" />
</conditions>
<action type="Redirect" url="https://{HTTP_HOST}/{R:1}" appendQueryString="true" redirectType="Permanent" />
</rule>
</rules>
</rewrite>

We’ve covered 3 of the most common uses of IIS 7 Url Rewrite, though if you look closely, the rules above are really for redirecting and not url rewriting. A small taste of a true rewrite rule is shown below; we’ll cover more examples of rewriting in an upcoming post.
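Here is a minimal sketch of an actual rewrite: the url in the browser stays the same, say /article/123, while IIS serves the request from a different page. The url pattern and target page are made up for illustration:

<rewrite>
<rules>
<rule name="Rewrite article urls" stopProcessing="true">
<match url="^article/([0-9]+)$" />
<action type="Rewrite" url="article.aspx?id={R:1}" />
</rule>
</rules>
</rewrite>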
