Aug 28, 2014
 

Depending on how many sites your Windows web server is hosting, maintaining IIS logs can be a challenge. IIS logs provide valuable insight into the traffic your sites are experiencing, as well as detailed SEO metrics and performance data. A typical web server will have just enough free disk space for future growth needs but will ultimately be limited by the capacity of the drives in the server. If left unmonitored, IIS logs can quickly fill any remaining disk space on the server. There are a few 3rd party tools that are good at compressing log files when they are under one parent folder, but when the log files are in different locations, such as on a WebsitePanel server I support, an alternative solution is needed. In this walkthrough I will demonstrate how I solved this challenge using .NET and GZipStream.

What about Windows Folder Compression?

Enabling Windows folder compression should always be the first step to try when freeing up space used by IIS logs. I always redirect the site logs to a folder at the root level of the server so it's easy to find them and monitor their growth. In the example below I saved 8.3 GB of disk space without even having to zip any files. However, Windows folder compression was not going to work for my WebsitePanel server.

Finding WebsitePanel IIS Logs

WebsitePanel is a wonderful Windows control panel for multi-tenant web hosting, and the best part is that it's free. If you haven't discovered this incredible application yet, check it out. Once installed, every aspect of site creation and maintenance is completely automated. As shown below, customer sites are usually deployed with a directory structure like c:\HostingSpaces\customername\sitename. The IIS logs are always stored further down the tree in …\sitename\logs\W3SVCXX. As one can imagine, on a web server with hundreds of sites it would be quite a challenge to track down and manually apply Windows compression to all the W3SVC folders. This would be further complicated as new sites are added to the server.

Using Directory.EnumerateFiles and GZipStream

I knew I was going to do some coding to solve my challenge, but I also wanted to keep the solution easy to use. In years past I would have considered VBScript folder recursion and executing the PKZip command line tool, but I wanted something more contemporary. So I used Visual Studio 2013 and created a Console Application.

The GZipStream class, which was introduced in .NET 2.0, would handle the necessary file compression. Since I would only be compressing individual files, I didn't have to deal with a separate library such as DotNetZip.

public static void Compress(FileInfo fileToCompress)
{
    using (FileStream originalFileStream = fileToCompress.OpenRead())
    {
        // Skip hidden files and files that have already been compressed.
        if ((File.GetAttributes(fileToCompress.FullName) &
            FileAttributes.Hidden) != FileAttributes.Hidden && fileToCompress.Extension != ".gz")
        {
            using (FileStream compressedFileStream = File.Create(fileToCompress.FullName + ".gz"))
            {
                using (GZipStream compressionStream = new GZipStream(compressedFileStream, CompressionMode.Compress))
                {
                    originalFileStream.CopyTo(compressionStream);
                    Console.WriteLine("Compressed {0} from {1} to {2} bytes.",
                        fileToCompress.Name, fileToCompress.Length, compressedFileStream.Length);
                }
            }
        }
    }
}


The Directory.EnumerateFiles method, introduced in .NET 4, would find all the IIS log files under the WebsitePanel hosting spaces directory tree. Using the overload that takes SearchOption.AllDirectories, combined with a LINQ filter, would enable me to not only search for .log files recursively but also limit the results to folders matching "W3SVC". In the snippet below I loop through the search results, compress any log file older than one day, and delete the original file. The last step is to write the name of the file that was just zipped to an audit log.

public static void zipFiles()
{
    try
    {
        // Find every .log file in a W3SVC* folder under the hosting spaces tree.
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log",
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            // Only touch logs that haven't been written to in the last day.
            if (fi.LastWriteTime < DateTime.Now.AddDays(-1))
            {
                Compress(fi);
                fi.Delete();
                activityLog("Zipping: " + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}


I only needed 30 days' worth of log files to be left on the server. In the snippet below I delete any zipped log files matching my date criteria and then write the activity to the audit log.


public static void deleteFiles()
{
    try
    {
        // Find the compressed archives in the same W3SVC* folders.
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.gz",
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            // Purge anything older than the configured retention window.
            if (fi.LastWriteTime < DateTime.Now.AddDays(-daysToSave))
            {
                fi.Delete();
                activityLog("Deleting: " + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}



Using an XML Config File

To help keep my application flexible, I created a simple XML file to store the path to the WebsitePanel hosting spaces folder and the number of days of logs I wanted to keep on the server.

<?xml version="1.0" encoding="utf-8"?>\r\n<MyConfig>\r\n  <HomeDirectory>D:\HostingSpaces</HomeDirectory>\r\n  <DaysToSave>30</DaysToSave>\r\n</MyConfig>


Loading these values into the program is done by parsing the XML data with XmlDocument.

public static void readConfig()
{
    string path = System.IO.Path.GetFullPath(@"logCleanConfig.xml");
    XmlDocument xmlDoc = new XmlDocument();
    xmlDoc.Load(path);

    XmlNodeList HomeDirectory = xmlDoc.GetElementsByTagName("HomeDirectory");
    homeDirectory = HomeDirectory[0].InnerXml;

    XmlNodeList DaysToSave = xmlDoc.GetElementsByTagName("DaysToSave");
    daysToSave = Convert.ToInt32(DaysToSave[0].InnerXml);
}
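The activityLog helper and the program's entry point aren't shown in the snippets above. As a minimal sketch of how the pieces might fit together, assuming activityLog simply appends timestamped lines to a text file (the file name here is a placeholder, not from the original program):

static void Main(string[] args)
{
    readConfig();   // load homeDirectory and daysToSave from logCleanConfig.xml
    zipFiles();     // compress IIS logs older than one day and remove the originals
    deleteFiles();  // purge compressed logs older than daysToSave
}

// Hypothetical audit logger: appends one timestamped entry per action.
public static void activityLog(string message)
{
    File.AppendAllText("logCleanAudit.txt",
        DateTime.Now.ToString("s") + " " + message + Environment.NewLine);
}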



Running the Program


Running the program from a command line, I can see the progress as files are compressed. Afterwards I set up a scheduled task so it will run every day to compress new IIS log files and purge the ones older than 30 days.

A few days later I came across several sites on the server using CMS products with their own application logs. These log files were not being cleaned up by the site owners and were taking up free space on the server, so I decided to archive them as well. Since they were being stored within each site in a subfolder called "Logs", tweaking the filter on the Directory.EnumerateFiles results would catch them. With a quick rebuild of the program, it would now compress the CMS log files as well as the IIS log files.

foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log",
    SearchOption.AllDirectories).Where(d => d.Contains("W3SVC") || d.Contains("Logs")).ToArray())
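One caveat worth noting with this filter: string.Contains performs a case-sensitive comparison, so a site storing its logs in a folder named "logs" rather than "Logs" would be missed. If the CMS products on your server aren't consistent about casing, comparing against a lower-cased path (for example, d.ToLower().Contains("logs")) may be the safer choice.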




In Summary


IIS logs contain a wealth of data about web site traffic and performance. Left unchecked, the logs will accumulate over time and take up disk space on the server. Using Windows folder compression is a simple solution to reclaim valuable disk space. If your log files are in different locations, then leveraging the power of .NET with Directory.EnumerateFiles and GZipStream will do the trick. Thanks for reading.


Aug 09, 2014
 

One of the many benefits of using virtual servers over physical servers is the ability to add server resources such as CPU, RAM, and disk space on the fly without downtime. An additional drawback of a physical server is that you are often limited by its physical capacity; once those limits are reached, the server can't be upgraded further. Adding resources also requires powering off the server, which in turn requires coordinating with business owners and impacted users. Not all editions of Windows support hot-add, so be sure to confirm your server is supported before starting. In this walkthrough I'll show how easy it is to add server resources using VMware's vSphere client.

Logging into vSphere Client

After authenticating on my network with the VMware vSphere 5.5 client, I go to Hosts and Clusters under Inventory. From here I have access to all the virtual servers configured in our environment. After selecting the server to be upgraded, you will see the Getting Started tab. From here you have access to the usual administrative tasks such as starting, stopping, and restarting the server, as well as performance reporting and events about the server. Click Edit virtual machine settings to add resources.

Enabling RAM and CPU Hotplug

Adding the additional resources is straightforward. However, when you begin you may find the CPU and Memory properties disabled. This indicates that the server has not been previously enabled for hot adding resources. In this instance the server will need to be shut down before you can upgrade these resources.

Fortunately, fixing this for future upgrades is a simple matter. While the server is powered down, click on the Options tab of the Virtual Machine Properties. Under the Advanced settings, go to the Memory/CPU Hotplug properties. Click Enable memory hot add and Enable CPU hot add, then click OK to save the changes. After the server is powered back on, you will be able to add CPU and Memory without having to first shut down the server.

To add additional virtual CPUs, simply increase the Number of virtual sockets and click OK to save the changes.

To add additional Memory to the server, adjust the Memory Configuration accordingly and click OK to save.

Adding Additional Disk Space

In addition to adding CPU and Memory to the server during this maintenance window, I am also going to add disk space. Adding disk space is just as straightforward as adding CPU and Memory. In the Virtual Machine Properties, on the Hardware tab, go to the Hard disk settings. Increase the Provisioned Size by the new amount and click OK to save the changes. Windows will not automatically recognize the new space, so the final step of the upgrade is to log into the server and extend the server's disk drive. This can be accomplished either using vSphere's server console window or by connecting to the server with Remote Desktop.

Extending Windows Disk Space

After logging into Windows, open the Computer Management snap-in. In the console tree, click on Disk Management under Storage. You may need to Rescan the disks before Windows will see that the new space is available.

Step through the Extend Volume Wizard to allocate the additional space on the existing volume.

In Summary

VMware vSphere offers System Administrators complete control over virtual server properties. Adding additional CPU, RAM, and disk space is straightforward and in many cases can be performed without having to shut down the server. To help minimize downtime during your next maintenance window, double check that the edition of your Windows server supports hot-add and confirm that the Memory/CPU Hotplug property has been enabled. Thanks for reading.


Aug 07, 2014
 

The FTP protocol is some 43 years old now, yet it continues to be one of the most widely used file transfer technologies available. Over the years it has been shown to be vulnerable to brute force attacks, packet capture, and other attack vectors. Fortunately, with IIS 8 on Windows Server 2012 your FTP server doesn't have to be vulnerable. It goes without saying that FTP Authentication and Authorization are the most fundamental methods to secure your server. Here are three additional things you can do to increase the security of your server's FTP service and minimize its attack footprint.

IIS 8 FTP Logon Attempt Restrictions

One of the most common FTP attack vectors is the dictionary attack. Using automated tools, hackers will repeatedly hammer your FTP site with thousands of username and password combinations, hoping to find that one account with an easy password. In the log snippet below you can see just a sample of this automated activity. Fortunately, none of these attempts were successful.

IIS 8 introduces FTP Logon Attempt Restrictions, a powerful feature that is not available in IIS 7 or IIS 7.5. Once configured, automated logon attacks will be stopped in their tracks. From IIS Manager, simply click on FTP Logon Attempt Restrictions.

Configuring the FTP Logon Attempt Restrictions module is easy: simply choose how many logon attempts are to be allowed and the time period in which they may occur. When you consider that most FTP clients will save passwords in a profile, legitimate users on your FTP site should only need 1-2 logon attempts. However, depending on how many FTP users you're hosting and their technical savvy, you may need to tweak these settings.

Testing my FTP site with the new logon attempt restrictions in place, it is easy to see how well they work. After the threshold is exceeded, my connection to the server is forcibly closed. Automated hack attempts will no longer be a threat to this FTP server.

Enable FTP Over SSL with IIS 8

The FTP protocol wasn't originally designed for encryption. Fortunately, with IIS 8 (and IIS 7) your FTP sessions can now be encrypted with SSL. To configure FTPS, also known as FTP over SSL, open IIS Manager. You can either specify using SSL when adding FTP Publishing to a site or simply go to the FTP SSL Settings of an existing site. SSL connections can either be optional, or you can force all connections to use them. Using the drop down menu, choose the certificate that you want to be used to encrypt the connections. Windows Server 2012 has a default certificate available, but you are also welcome to install your own 3rd party certificate.

After you've configured the SSL settings on the server, you just need to change your FTP client connection properties. In the picture below I'm using a legacy version of CuteFTP 8.0. Depending on which FTP client you're using, your protocol menu will look different.

Having changed my FTP client settings, I attempt to connect to the server using SSL. The first time you connect to the server you will be prompted to accept the new SSL certificate. The log snippet below shows that my FTP session is being properly established with SSL; my communication with the server is now secure and protected. Here is a more detailed walkthrough of configuring FTP over SSL on IIS 8.


Configuring IIS 8 FTP User Isolation

When IIS 7 was released, the FTP service had been completely redesigned from the ground up with security in mind, a welcome change indeed from IIS 6. In addition to supporting FTP over SSL, it introduced FTP User Isolation. Multiple users on the same FTP site can be separated regardless of which file path they log into, without risk of someone traversing up parent paths into other user folders.

The FTP Authorization rules make it easy to grant multiple users, or even local groups, access to the FTP server. The user isolation is accomplished by creating a virtual directory called LocalUser and then choosing User name directory (disable global virtual directories). The LocalUser virtual directory should point to the FTP root directory, and then you create a separate virtual directory for each FTP user which points to their destination path, as sketched below.

With FTP User Isolation configured, your users will never be able to move up to a parent path beyond their individual root directory. Even if a user were able to correctly guess the username and virtual path of another FTP account on the server, they would not be able to reach it. Due to the confines of the isolation, the FTP session cannot see anything else on the server. In the example below I log in with the local account ftpuser2 and attempt to change the path to /ftpuser1; however, that path does not exist as far as my session is concerned and therefore is not accessible. Here is a more detailed walkthrough of configuring FTP User Isolation on IIS 8.
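To make the isolation layout concrete, here is a hypothetical example of how the virtual directories might map to disk; the paths below are illustrative assumptions, not taken from the original setup:

FTP Site root                     -> c:\inetpub\ftproot
  LocalUser (virtual directory)   -> c:\inetpub\ftproot
    ftpuser1 (virtual directory)  -> c:\HostingSpaces\customer1\site1
    ftpuser2 (virtual directory)  -> c:\HostingSpaces\customer2\site2

When ftpuser1 logs in, IIS maps the session root to the LocalUser\ftpuser1 virtual directory, so the user lands directly in their own folder and has no way to browse above it.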

In Summary

IIS 8 on Windows Server 2012 offers the most secure FTP service of any IIS version to date. You have multiple layers of FTP security available by leveraging FTP Logon Attempt Restrictions, FTP over SSL, and FTP User Isolation. Your FTP server will be well protected using these built-in modules. With internet security there is no 'patch' for complacency. More security is always better, so implement it when it's readily available to you. Thanks for reading.


Aug 04, 2014
 

Another of the great built-in features of IIS 8 is Dynamic IP Restrictions (DIPR). With a few simple configuration steps you can quickly set limits for blocking IP addresses based on the number of concurrent requests or the frequency of requests over a period of time. With these parameters in place, IIS will block offending requests unattended, making your server more secure.

Before DIPR was available, on IIS 7 you could easily block one IP or a range of IPs manually in the IP Address and Domain Restrictions module. However, this could be a time consuming task if your server was under attack. Using a tool like Log Parser to examine the site's logs, you could identify IPs with suspicious activity, but then you still had to manually enter Deny Rules. Determined hackers will use a variety of IPs from proxy servers, so by the time you've blocked a handful, a new range could be starting up. DIPR was released out-of-band for IIS 7 and IIS 7.5, so you can leverage this great security tool on those web servers as well. In this walkthrough I cover how to configure Dynamic IP Restrictions and even show a test in action.


Installing Dynamic IP Restrictions

Open the Server Manager and go to the Web Server role. Under Security, ensure that IP and Domain Restrictions is installed.

IP Address and Domain Restrictions in IIS Manager

Open IIS Manager and click on IP Address and Domain Restrictions.

From this window you can either Add Allow Entry rules or Add Deny Entry rules. These rules are for manually blocking (or allowing) one IP address or an IP address range. You have to be careful when blocking an IP range, because you could inadvertently block legitimate traffic. Click Edit Dynamic Restriction Settings to set the dynamic thresholds for blocking IP addresses.

Click Edit Feature Settings to set the Deny Action Type. In this example I've set Forbidden, so blocked requests will receive an HTTP 403 status error. These errors will also be recorded in the site's log for us to review later.

On the Dynamic IP Restriction Settings screen you can choose the maximum number of concurrent requests to block. You can also deny IP addresses based on the frequency of requests over a period of time.

As always, depending on the volume of your web site's traffic, you should test these settings to ensure that legitimate traffic does not get blocked.
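If you would rather script these settings than click through IIS Manager, they can also be set programmatically. Below is a minimal sketch using the Microsoft.Web.Administration API against the dynamicIpSecurity configuration section; the site name and threshold values are placeholders for illustration, so adjust them for your own environment:

using System;
using Microsoft.Web.Administration; // add a reference to Microsoft.Web.Administration.dll

class ConfigureDipr
{
    static void Main()
    {
        using (ServerManager serverManager = new ServerManager())
        {
            Configuration config = serverManager.GetApplicationHostConfiguration();

            // "Default Web Site" is a placeholder site name.
            ConfigurationSection dipr = config.GetSection(
                "system.webServer/security/dynamicIpSecurity", "Default Web Site");

            // Respond to blocked requests with HTTP 403.
            dipr["denyAction"] = "Forbidden";

            // Block clients that exceed a number of concurrent requests.
            ConfigurationElement byConcurrent = dipr.GetChildElement("denyByConcurrentRequests");
            byConcurrent["enabled"] = true;
            byConcurrent["maxConcurrentRequests"] = 10;

            // Block clients that exceed a request rate over a time window.
            ConfigurationElement byRate = dipr.GetChildElement("denyByRequestRate");
            byRate["enabled"] = true;
            byRate["maxRequests"] = 20;
            byRate["requestIntervalInMilliseconds"] = 1000;

            serverManager.CommitChanges();
        }
    }
}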


Testing Dynamic IP Address Blocking

I didn't have a real security incident available for testing the DIPR module, so I did the next best thing. Using Fiddler, the free debugging tool from Telerik, and StressStimulus, a free load testing plugin from StimulusTechnology, I hammered my test virtual server for a few minutes and got the desired results. With Fiddler open you will see the StressStimulus module. From here you can record your test case or open an existing test case, as well as edit the test case parameters.



Test Results

StressStimulus gives you multiple detailed charts to gauge the performance of your site and identify potential areas of weakness. For my test I chose to hit the wp-login.php page on my test WordPress site with 3 concurrent requests and 100 iterations. The test completed within a few minutes.


Visiting the test page from the server running StressStimulus, I get the expected result: the request is blocked with a 403 error. The full description of this code is 403.502 - Forbidden: Too many requests from the same client IP; Dynamic IP.


Using the Log Parser query below to analyze the site's log, I see that 331 requests were blocked with a 403.502 status code.

SELECT TOP 100
    STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath,
    sc-status, sc-substatus,
    EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
    COUNT(*) AS TotalHits, c-ip
INTO top-403-ip-requests
FROM w3svc.log
GROUP BY cs-uri-stem, sc-status, sc-substatus, c-ip
ORDER BY TotalHits DESC


Further examination of the log with Log Parser shows the full breakdown of the requests blocked with a 403 status.

SELECT TOP 100
    STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath,
    sc-status, sc-substatus,
    EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
    COUNT(*) AS TotalHits, c-ip
INTO top-403-ip-requests
FROM w3svc.log
WHERE sc-status = 403
GROUP BY cs-uri-stem, sc-status, sc-substatus, c-ip
ORDER BY TotalHits DESC




Summary

The Dynamic IP Restrictions module is available with IIS 8, as well as IIS 7 and IIS 7.5. It is a powerful tool to block automated attacks on your site and requires minimal configuration and maintenance. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS
