Feb 06, 2016
 

When it comes to securing IIS web applications on Windows Server 2008 R2 or Windows Server 2012 R2, one typically thinks of firewalls, access control lists (ACLs), and using an application pool identity. These measures protect a site from external threats. However, .NET configuration files, which typically store username and password data, are plain text files, so anyone with admin access to the server can read their contents. The only way to prevent prying eyes from seeing app.config or web.config passwords is to encrypt them. Fortunately, encrypting the connectionStrings section of a config file is straightforward, and the same technique works for other configuration sections as well. Encrypting and decrypting config files can be performed programmatically using .NET Framework methods or with the ASP.NET IIS Registration tool (aspnet_regiis.exe). With the encryption commands you can target either the path to the config file or reference an IIS application name. In my examples I will be encrypting and decrypting the connectionStrings section with the .NET Framework 4.
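For reference, here is a minimal sketch of the programmatic approach using the System.Configuration API. The site name "domain1.com" matches the example used later in this post; adjust the path, site name, and provider to suit your environment.

using System;
using System.Configuration;
using System.Web.Configuration;

class ProtectConnectionStrings
{
    static void Main()
    {
        // Open the root web.config of the example site (site name is just an example)
        Configuration config = WebConfigurationManager.OpenWebConfiguration("/", "domain1.com");
        ConfigurationSection section = config.GetSection("connectionStrings");

        if (section != null && !section.SectionInformation.IsProtected)
        {
            // Use the built-in DPAPI provider; substitute an RSA provider name if preferred
            section.SectionInformation.ProtectSection("DataProtectionConfigurationProvider");
            config.Save();
            Console.WriteLine("connectionStrings section encrypted.");
        }
    }
}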

 

Encrypting Configuration Sections

You will find aspnet_regiis.exe in the C:\Windows\Microsoft.NET\Framework\version\ folder. With the .NET Framework you can use the built-in protected configuration providers RsaProtectedConfigurationProvider or DpapiProtectedConfigurationProvider to encrypt and decrypt sections of your config files, or you can create your own provider. The general syntax to encrypt a config section is as follows:

aspnet_regiis.exe -pef section physical_directory -prov provider
or
aspnet_regiis.exe -pe section -app virtual_directory -prov provider

It is important to note that when you use aspnet_regiis.exe to encrypt or decrypt config files and specify a physical path (rather than a web app name), the command only looks for a file named “web.config”. If you need to run the command against an app.config, you will first need to rename that file to web.config, run the command, and then rename it back before using it. For this reason I find it easier to create a .bat file hardcoded with the necessary command syntax to encrypt my configs and a second .bat file to decrypt them, as sketched below.
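As a rough sketch, an encrypt batch file for an app.config might look like the following; the folder and config file names are placeholders, and the decrypt version would be identical except that it uses -pdf instead of -pef and drops the provider switch:

@echo off
REM Sketch only: rename the app.config to web.config, encrypt it, then rename it back
set CFGDIR=C:\apps\MyService
cd /d %CFGDIR%
ren MyService.exe.config web.config
"%windir%\Microsoft.NET\Framework\v4.0.30319\aspnet_regiis.exe" -pef "connectionStrings" "%CFGDIR%" -prov "DataProtectionConfigurationProvider"
ren web.config MyService.exe.config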

On my Windows Server 2012 R2 server I have set up an IIS 8.5 site called domain1.com. For the example below I am using the built-in DPAPI provider to encrypt the web.config in c:\domains\domain1.com. The encrypted web.config is shown below.

aspnet_regiis.exe -pef "connectionStrings" "c:\domains\domain1.com" 
-prov "DataProtectionConfigurationProvider"

 

encrypted-config

 

Decrypting Configuration Sections

Following the steps above we have now encrypted the connectionStrings section of the web.config for domain1.com. Naturally, we also need to be able to decrypt it. When decrypting a config section you do not need to specify the protected configuration provider, and just like when encrypting we can target either a file path or an IIS web application name. Here is the syntax to decrypt a configuration file section:

aspnet_regiis.exe -pdf section physical_directory
or
aspnet_regiis.exe -pd section -app virtual_directory

In my example below I decrypt the connectionStrings section of my web.config in c:\domains\domain1.com. As a reminder, when using the -pdf option we do not need to specify “web.config” in the syntax.

aspnet_regiis.exe -pdf "connectionStrings" "c:\domains\domain1.com"

decrypt-config-connectstrings

After running the above command, the connectionStrings section of the web.config is decrypted as shown below. Once I am done editing my connection string I will follow best practices and encrypt the connectionStrings section again.

web-config-decrypted

Failed to decrypt using provider error

It is important to note that when you encrypt your config files, the encryption key is stored locally on the server. If you need to move an encrypted config file to another server, you must either decrypt it first or export the key and install it on the new server. If you move an encrypted config file to a server that does not have the key, you will receive an error like the one below: Failed to decrypt using provider … Key not valid for use in specified state.

crypto-error

 

Creating an RSA Key Container

Fortunately, moving encryption keys between servers is straightforward. We can create our own RSA key container, export it to a file, and then move it from server to server as needed. This is ideal for multi-node web farm solutions where applications are deployed across multiple servers. Use the following syntax to create an RSA key container, and be sure to include the -exp option so the container can be exported:

aspnet_regiis -pc "MyFarmKey" -exp

creating-mycrypto

 

Adding the configProtectedData section to your config

Next you will add the following configProtectedData section to your web.config.

<configProtectedData>
   <providers>
      <add name="MyFarmCrypto"
           type="System.Configuration.RsaProtectedConfigurationProvider, System.Configuration, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL"
           keyContainerName="MyFarmKey"
           useMachineContainer="true" />
   </providers>
</configProtectedData>

Below is how my web.config looks now that I have added the configProtectedData section.

myfarmkey-config

 

Assigning permissions to the RSA key container

Before the new RSA key container is ready to be used by my site domain1.com, I need to grant the site’s application pool identity permission to access it. On the server in my example the application pool for domain1.com runs as ApplicationPoolIdentity, so the account to grant is IIS AppPool\domain1.com. I use the following syntax to assign this user to the new RSA key container:

aspnet_regiis -pa "MyFarmKey" "iis apppool\domain1.com"

myfarmkey-identity

Encrypting a config with an RSA key container

After adding the configProtectedData section to the web.config and granting domain1.com’s application pool identity permission to the RSA key container, I’ll run the encryption command again using the new “MyFarmCrypto” provider, which points to that container:

aspnet_regiis.exe -pef "connectionStrings" "c:\domains\domain1.com"
-prov "MyFarmCrypto"

image

In the image above we see the encryption succeeded. Note in the command syntax above we are specifying the configProtectionProvider name MyFarmCrypto and not the RSA key container name. If you mix that up you’ll get an error. We can see below how domain1.com’s web.config now looks after being encrypted with the new RSA key container.

farmcrypto-encrypted

Exporting and Importing an RSA Key Container

Now that we’ve successfully created and tested our new RSA key container, we need to export it to a file. Once it’s saved to a file we can copy it to other servers and install it as needed. It is important to remember to use the -pri option to include the private key when the export file is created, otherwise you will not be able to decrypt information on the next server.

aspnet_regiis -px "MyFarmKey" "c:\MyFarmKey.xml" -pri

export-farmkey

Having logged into another server and copied the MyFarmKey.xml file to c:\temp, I will import the key file using the following command:

aspnet_regiis -pi "MyFarmKey" "c:\temp\MyFarmKey.xml"

import-farmkey

For security purposes, after importing the key on a new server, delete the key .xml file from the server so that an unauthorized person can’t use it to decrypt data. This of course assumes that you have backed up the file somewhere safe off the server.

To permanently delete the RSA key container from a server you should run this command:

aspnet_regiis -pz "MyFarmKey"

Summary

The .NET Framework offers powerful encryption tools to secure sensitive information like usernames and passwords in application connection strings. When encrypting a config file on a server the private key used to decrypt the information is local to the server. Creating an RSA key container will enable you to encrypt information with the same private key across multiple servers. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Dec 29, 2015
 

The Microsoft Web Platform Installer (WPI) has made installing WordPress and MySQL on your Windows servers incredibly easy. With just a few clicks one can deploy a new WordPress site in minutes. When it comes to managing a MySQL database you may first think of MySQL Workbench, which is a great tool, but depending on your technical savvy installing that application may be challenging. It also requires remote access to your server, and what if you need to manage your MySQL database but don’t have access to Remote Desktop? Fortunately phpMyAdmin is a wonderful alternative with a browser-based GUI, and it fits any budget because it’s free.

Installing phpMyAdmin

Installing phpMyAdmin on your Windows cloud server is straightforward. Here are the basic steps:

  • Download the latest version of phpMyAdmin to your server
  • Using IIS Manager create a site
  • Unzip the phpMyAdmin archive into the root of the new site
  • Using a browser go to the new site where you’ll see the login screen
  • Enter your MySQL root user and password

myphpadmin-login

The current version of phpMyAdmin requires at least PHP 5.5 and MySQL 5.5. If your server isn’t running the correct specs you’ll receive an error message.

myphpadmin-error

 

There’s a phpinfo.php page in the root of the site so you can check which versions your server is running in case you are not sure.

image

 

After logging in with the root username and password you will arrive at the administration dashboard as seen below. From here you can perform nearly every admin task necessary to manage your MySQL instance. Adding, deleting, and editing databases, users, and tables is only a few clicks away, as is exporting and restoring databases.

phpmyadmin-dasboard

 

Backing up a MySQL Database

Backing up a MySQL database with phpMyAdmin is pretty straightforward. From the admin dashboard click on the Databases menu and you will see all the databases available to your user as shown below. If you are logged in as the root user you’ll be able to select any database in the MySQL instance. Select the database you want to back up and click on the Export menu. Alternatively, if you are already on a database’s details page, the Export menu will be there as well. In my example I am just going to back up one user database called pvtest1.

phpmyadmin-export

 

On the following screen you need to choose the Quick or Custom Export method. Choosing the Custom method allows you to set specific options such as reformatting the output, using compression, exporting the data as well as the structure, and Object creation options to drop existing objects when the database is restored.

image

 

Selecting the Custom export method also enables you to specify if you want to rename the database and structure in the export file.

image

 

Upon clicking the Go button the database will be exported to a flat file using the options you’ve selected, and the file is automatically downloaded to your browser’s Downloads folder. The text file contains serialized data, so be aware that careless editing can corrupt the contents. I have had issues restoring files after editing them with Notepad, so I prefer to use Notepad++, which hasn’t caused any issues for me.

image

 

One can see this is an easy and straightforward process when the occasional backup is needed. However, if more frequent backups are needed, then a more robust process is required. Here is a recent blog post on how to automate MySQL backups on your server.

 

Restoring a MySQL Database

Restoring a MySQL database is just as simple as backing it up. From the home dashboard click the Import menu and then choose the MySQL backup file to be imported. By default the export file is hardcoded to create a database with the same name as the one it was exported from. If the export file contains the Object creation options it will drop the existing database before creating it again. As noted above you can have the database renamed before exporting it, or you can simply edit the export file and specify the new database name by changing the CREATE DATABASE and USE statements. In this example we’re restoring it into a new database called pvtest2.
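For example, the first few lines of the export file might be edited like this before importing. This is just a sketch; the exact statements in your dump depend on the export options you chose:

CREATE DATABASE IF NOT EXISTS `pvtest2` /*!40100 DEFAULT CHARACTER SET utf8 */;
USE `pvtest2`;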

phpmyadmin-import

Depending on how big your backup file is the import process will take a few minutes. Once the process completes you’ll see a detailed message with the results. If there were any errors they will be noted here as well. Your new database is now ready to be used.

 

image

In Summary

Backing up and restoring MySQL databases on Windows Server 2012 R2 is easy with phpMyAdmin. Previously one had to use Remote Desktop to do MySQL administration; now, using only a web browser, you have nearly complete control over your MySQL databases. Thanks for reading!

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Nov 08, 2015
 

Hosting MySQL and WordPress on your Windows Server 2008 R2 or Windows Server 2012 R2 server has never been easier thanks to the Microsoft Web Platform Installer (WPI). However, backing up the MySQL databases is another story. Running mysqldump is certainly easy enough, but manually taking a backup once in a while won’t be useful for disaster recovery, so something automated is needed. I was hoping to find a tool equivalent to the very capable SQL Scheduler, which automates backups for SQL Server Express. Fortunately, while searching for an easy solution for automated MySQL backups I stumbled across a blog post by Mathew Moeller, who created the script I’m going to cover here.

The backup solution runs from a .bat file which you then schedule using Windows Task Scheduler. Each MySQL database is backed up to an individual SQL file named with the database name and the date and time of the backup. A zip file is then created containing all of the individual SQL backup files. The script even includes a feature to delete historical backups after a specified period of time. Errors that occur during the backup process are logged in a dumperrors.txt file.

Setting up the Batch File

The first step to automating your MySQL backups is to download the script to your server. Edit the batch file using Notepad. In the file you’ll see a section called SETTINGS AND PATHS. This contains the username and password of the user backing up the databases as well as the specific paths the script needs to run:

  • Error log path
  • MySQL EXE Path
  • Path to data folder
  • Path to zip executable
  • Number of days to retain .zip files

The download package also includes a copy of the 7-Zip standalone console, which is easy to use.

Setting the Backup User

The script can just as easily run with the MySQL root user and password; however, following security best practices you should use a different user with the least permissions necessary to run the backups. The post linked above includes the example below of setting those permissions for a user called mysqlbackup.

image
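As a reference, here is a hedged sketch of creating such a user. The exact privilege list is an assumption on my part, though mysqldump generally needs at least SELECT and LOCK TABLES, plus SHOW VIEW, EVENT, and TRIGGER when those objects are included in the dump:

CREATE USER 'mysqlbackup'@'localhost' IDENTIFIED BY 'UseAStrongPasswordHere';
GRANT SELECT, LOCK TABLES, SHOW VIEW, EVENT, TRIGGER ON *.* TO 'mysqlbackup'@'localhost';
FLUSH PRIVILEGES;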

 

Setting the Paths

Once you have filled in the SETTINGS AND PATHS section your script should look something like this:

image

Be sure to double check that all the paths are correct. You don’t want to come back sometime in the future needing a backup during an emergency, only to discover your script had a typo and your data was never archived.

 

Setting the Scheduled Task

Once the script configuration is complete, the only remaining step to automating your MySQL backups is to create the scheduled task in Task Scheduler. Simply step through the Basic Task Wizard and browse to the path of the batch file you set up earlier.

image

 

On the job properties be sure to set the task to run as a user account with the necessary permissions and select Run whether the user is logged on or not.

image

Run the job manually from the console to ensure everything runs properly. Double check the output path you specified earlier and confirm the MySQL backup files were created.

 

In Summary

Having an automated solution to backup databases is critical for disaster recovery. Today I covered a free script that you can leverage to automate backing up your MySQL databases on Windows Server 2012 R2. With any backup solution be sure to always test your procedures as well as periodically testing restoring the backups. Thanks for reading!

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Feb 06, 2015
 

SQL Injection became a favorite hacking technique in 2007. Despite being widely documented for so many years, it continues to evolve and be used. Because SQL Injection is such a well known attack vector, I am always surprised when, as a sysadmin, I come across someone’s site that has been compromised by it. In most instances the site was compromised because user data entered on web forms was not properly validated. Classic ASP sites using inline SQL queries built from query string parameters are especially vulnerable. Fortunately, regardless of a site’s programming weaknesses it can still be protected. In this walkthrough I will cover how to protect your site from SQL Injection using IIS Request Filtering.

Identifying SQL Injection Requests

Most people find out about SQL Injection the hard way after their web site has been defaced or their database has been compromised. If you are not sure your site has been attacked you need look no further than your web site’s traffic logs. Upon visual inspection of the IIS site logs you will see blocks of legitimate requests interspersed with the malicious requests. Checking the HTTP status code at the end of each request will indicate whether the request succeeded. If the status code was a 404 or a 500, you know the malicious request didn’t work. However, if the request had an HTTP status code of 200 then you should be concerned.

image

 

Using Log Parser to Find SQL Injection Requests

Looking through web logs for malicious requests can be tedious and time consuming. Microsoft’s wonderful log analysis tool Log Parser makes it easier to identify pages on your site that are being targeted by SQL Injection attacks. Running the query below will create a report of all the page requests in the log that have query string parameters.

logparser.exe "SELECT EXTRACT_FILENAME(cs-uri-stem) AS PageRequest, cs-uri-query, COUNT(*) AS TotalHits
FROM C:\wwwlogs\W3SVC35\u_ex141219.log TO results.txt
GROUP BY cs-uri-stem, cs-uri-query
ORDER BY TotalHits DESC"

 

Using Findstr to Find SQL Injection Requests

Using Log Parser to identify malicious requests is helpful, however if you need to look at multiple sites’ logs the task becomes more challenging. For these situations I like to use Findstr. Findstr is a powerful Windows tool that uses regular expressions to search files for any string value. One powerful feature is that you can store your search strings in separate files and even exclude certain strings from the search. In the example below, I use the /g parameter to have Findstr load a file named sqlinject.txt with my predefined strings and then search all the web logs in the W3SVC1 folder. The output is redirected to a file called results.txt.

findstr /s /i /p  /g:sqlinject.txt C:\wwwlogs\W3SVC1\*.log >c:\results.txt

Using this syntax it is easy to extend the capability of Findstr by creating a simple batch file that covers all the web log folders on your server. Once set up, you will be able to identify any SQL Injection requests against your server within minutes.

Configuring IIS Request Filtering

The Request Filtering module was introduced in IIS 7 as a replacement for the very capable UrlScan. Using Log Parser and Findstr you will be able to identify plenty of malicious requests attacking the web sites on your server. A Request Filtering rule can block requests based on file extensions, URLs, HTTP verbs, headers, or query strings. Additionally, you can block requests based on a maximum query string size and URL length.

image

 

Like any other IIS module you can maintain the settings outside of IIS Manager by editing the web.config. The values are stored in the <requestFiltering> section within <system.webServer>.

image

Filtering Rules

I typically create one rule for each Deny String that I want to block. You can add multiple strings to each rule, however I find it easier to maintain when only using one string per rule. This way, if a rule is inadvertently blocking legitimate requests you can quickly disable it while leaving the others operational. The request URL or the query string can be scanned for Deny Strings. However, enabling URL scanning requires a bit of forethought, because if the Deny String matches any part of the name of a page on your site, requests to that page will be blocked. For example, if you want to block requests containing the SQL command “update” but there happens to be a page called update.aspx on your site, any request to update.aspx will be blocked.

image

If I had to pick only one Deny String for filtering it would be cast(. Nearly every SQL Injection request I’ve seen uses cast(, and no legitimate page or query string parameter should contain it.
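For reference, here is roughly what such a rule looks like in web.config once it has been created. This is a sketch of the Request Filtering schema; the rule name is arbitrary:

<system.webServer>
  <security>
    <requestFiltering>
      <filteringRules>
        <filteringRule name="Block cast(" scanUrl="false" scanQueryString="true">
          <denyStrings>
            <add string="cast(" />
          </denyStrings>
        </filteringRule>
      </filteringRules>
    </requestFiltering>
  </security>
</system.webServer>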

404 Status Codes

Any request blocked by Request Filtering will return a 404 error status with a specific substatus to identify the reason it was denied. A few common ones are listed below.

HTTP Substatus Description
404.5 URL Sequence Denied
404.14 Url Too Long
404.15 Query String Too Long
404.18 Query String Sequence Denied
404.19 Denied by Filtering Rule

With Request Filtering enabled, it is easy to keep an eye on blocked requests. The Log Parser query below will create a report of all the requests with an HTTP status of 404 and a substatus greater than zero.

logparser.exe "SELECT EXTRACT_FILENAME(cs-uri-stem) AS FILENAME, cs-uri-query, COUNT(*) AS TotalHits
FROM C:\wwwlogs\W3SVC35\u_ex141219.log TO results.txt
WHERE (sc-status = 404 AND sc-substatus > 0)
GROUP BY cs-uri-stem, cs-uri-query
ORDER BY TotalHits DESC

 

Blocking User Agents

While investigating a recent SQL Injection attack I noticed in the IIS logs that the site had been compromised by an automated tool. It was interesting to see how the first malicious request was very basic and then each subsequent one became more elaborate with complex SQL queries. What I found even more curious was that each request used the same User-Agent which in fact identified the name of the tool and where to download it.

image

 

The tool’s web site clearly states that the creators released it with the intention of helping system administrators discover vulnerabilities. Unfortunately it’s far too easy for someone to use it with malicious intent. The good news is that blocking requests based on the User-Agent is quite easy: just create a new rule, specify User-Agent as the header to scan, and add the name of the agent to the Deny Strings. As you can see by the 404.19 status in the picture above, the automated tool was successfully blocked the next time around after the rule was added.

image
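In web.config the resulting rule would look something like the sketch below; the deny string "sqlmap" is only an illustrative placeholder for whichever agent name you need to block:

<filteringRule name="Block scanner user agent" scanUrl="false" scanQueryString="false">
  <scanHeaders>
    <add requestHeader="User-Agent" />
  </scanHeaders>
  <denyStrings>
    <add string="sqlmap" />
  </denyStrings>
</filteringRule>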

 

In Summary

SQL Injection is a popular attack vector for web sites but by leveraging IIS Request Filtering the malicious requests can be easily blocked. Using Log Parser and Findstr you can quickly check your site’s logs to identify suspicious activity. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Dec 14, 2014
 

Regardless of whether you are running Windows Server 2012 on a virtual or physical server, the success of your business depends on having the server run at optimal capacity. To ensure the server delivers uninterrupted service, you have to be aware of potential performance issues before they arise.

One of the best methods to analyze the performance of Windows Server 2012 is with Performance Monitor and a User Defined Data Collector Set. With this tool the identification and analysis of potential performance issues has never been easier. Upon completion, a detailed summary report is generated providing immediate insight into key aspects of the server’s performance such as disk IO, CPU, and RAM, as well as network utilization. Reading the report summary is simplified further by green, yellow, and red icons that call your attention to any irregularities. Additional in-depth metrics are contained in collapsible sections of the report below the Summary.

Creating a New Data Collector

To create a new User Defined Data Collector Set, open Performance Monitor, right-click on User Defined under Data Collector Sets, and select New > Data Collector Set. A wizard will launch to guide you through creating the new Data Collector. Once created, the Data Collector will be available to run as frequently as needed, and each time it runs a new report will be created.

image

 

The first step will be to enter the name of your report. I usually specify “Performance” somewhere in the name since that is the type of Data Collector I am planning on running. Choosing the default option of Create from the template is recommended. Click on Next to continue.

image

 

The next step will be to choose the Data Collector Template that you want to use. I am going to choose System Performance. Click on Next to continue.

image

 

Next you will be prompted to choose a path to store the report data. Depending on how long your report runs and how frequently you run it the reports can consume a lot of space. In the event that your server has multiple disk drives, it would be better to select the larger drive for storing the reports.  Click Next to Continue.

image

 

Leave <Default> for the Run as: user context. You can change that later if needed. We need to configure some additional settings before running so select Open properties for this data collector set and then click Finish.

image

Additional Data Collector Properties

Before running your new data collector there are a few properties that you want to double check first.

 

Setting the Stop Condition

With the properties open, click on the Stop Condition tab so that you can enter a specific period of time for the Data Collector to run. It is important to set a Stop Condition, otherwise the collector will continue to run indefinitely until you manually stop it. As I noted earlier, not only can the logs take up disk space, but running a Data Collector for an extended period of time can also impact server performance, so specifying a Stop Condition is a good idea. For short tests I typically set 20-30 minutes; for longer tests I’ll set 2-3 hours.

image

 

Setting a Recurring Schedule

Chances are you may already be aware of a performance problem on your server and need to isolate the analysis window to a specific day or time period. Clicking on the Schedule tab will enable you to specify multiple dates and times to run the Data Collector. This can be especially helpful if your server gets busy with after-hours utilization and you’re not available to start the data collector manually.

 

image

 

You can even select a date range to run the data collector on specific days of the week during that period of time.

image

 

Once you’ve finished setting the properties of the data collector just right-click on the name to run it manually or wait for the schedule to start it automatically.
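If you prefer the command line, the same user defined collector can also be started and stopped with the built-in logman tool; "Performance Test" below stands in for whatever name you gave your collector:

logman start "Performance Test"
logman stop "Performance Test"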

 

Viewing the Summary Report

You will be able to view and analyze the report generated by the Data Collector once it has completed running. If you try to view the report before it has completed you will be notified that the Data Collector is still running. The report is located under the User Defined Reports section of Performance Monitor.

image

 

The overall performance of the server is displayed at the top of the report in the Summary. Anything requiring your immediate attention is noted in the Diagnostic Results section. In the picture below we can see that the server clearly needs additional RAM to alleviate the disk paging that is occurring.  The Resource Overview offers an easy to read chart of the server’s core resources of CPU, Network, Disk, and Memory. The status of each of these is indicated with Green, Yellow, or Red icons.

image

 

Below the Summary are collapsible sections that offer more detailed insight into the server’s CPU, Network, Disk, and Memory utilization. Here are two examples of the additional data that is available:

CPU Utilization

In the picture below we can see that one IIS worker process was consuming nearly 80% of the server’s CPU utilization. Performing additional analysis with Log Parser on the web site’s web logs would help identify the problems this particular web site is experiencing.

image

 

Disk IO

Some cloud server providers will charge overage fees for excessive disk IO, so it’s important to know what’s happening there. In the Disk summary there is a helpful report that shows exactly which files on your server are consuming the most IO, aptly named Files Causing Most Disk IOs. In the picture below we can see that pagefile.sys is causing a lot of disk IO. This is a good indication that the server could benefit from additional RAM, thereby reducing the amount of disk paging that is occurring.

image

Viewing the Data Counters

In addition to reading the data collector report you also have the ability to view the raw counter data. From this view you can select all the counters that were collecting data or only a few and play back the utilization as it occurred.

image

 

In Summary

Windows Server 2012 offers several tools for analyzing your server’s performance. The Performance Monitor Data Collector offers comprehensive insight into resource utilization and makes it easy to quickly identify performance bottlenecks. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Nov 29, 2014
 

When your Windows server is low on space, or runs out of space entirely, you need to quickly identify where the disk space is being used and free some up. Low disk space, or worse yet no disk space, can have a negative impact on your server’s performance. Knowing the paths to a few folders that typically eat up space, such as web logs, isn’t enough when you need to free up space now. In this situation you need a graphical tool that can quickly analyze an entire disk drive, or even multiple drives, and show you how the server’s space is being used. Fortunately for Windows server admins, JDiskReport and WinDirStat are two such tools, and better still they are both free.

 

Using JDiskReport

JDiskReport is a free graphical disk space tool from jgoodies.com. Unlike some free tools that require you to register before they work, or that make you pay to unlock features, JDisk is feature complete and ready to use as soon as it’s installed. Installation of JDisk is straightforward and quite simple.

Before you install JDisk you should know that it requires the Java Runtime. If the Java Runtime is missing when you install JDisk, it will prompt you to locate the path to the Java Runtime. Once you’ve downloaded JDisk to your server just launch the installation wizard. The only additional step of the wizard will be to specify the path where you want it installed.

image

 

Once installation has completed you will be presented with the default starting screen. Any previous paths that you’ve analyzed will be displayed for added convenience. You can select the entire disk drive or a specific folder on the server.

image

 

Unless I have a specific folder in mind I typically pick the entire disk drive. Within a few minutes of initiating a directory scan, you will see a detailed analysis of the server’s disk space utilization. This report is more than just a pretty picture. Not only can you click on any folder of the navigation tree to drill down further, you can also click on any part of the pie chart to see subdirectories.

image

 

In the picture above we can see that the Windows folder is using the most space, but that is to be expected on a C: drive. Looking more closely I can see that on this server C:\temp is using over 9 GB, which is unusual, so there are probably some files in there I can delete to free up valuable space. In addition to the colorful chart you can also get a detailed file list and sort it by size. In the picture below we can see a more detailed look at C:\temp.

image

Within minutes of running the scan, JDisk has helped me find several large files which can be deleted.

 

Using WinDirStat

WinDirStat can be downloaded from windirstat.info and is available in 12 different languages. It offers some interesting features such as an option to delete files, a color-coded treemap, and disk space utilization by file type. Installing WinDirStat is just as simple as installing JDisk. Upon launching the wizard you’ll be prompted to accept the GNU GPL. After that you just need to choose the features and then pick the installation path.

image

 

When the program first opens, it will display all of the disk drives available for analysis. If your server happens to have any network drives mapped, they will also be displayed. Here you have the option to scan all the drives on the server, just one drive, or a specific folder.

image

 

Scanning the disk drive completes quickly, though it’s hard to say whether WinDirStat is faster or slower than JDisk. The speed of both programs will ultimately depend on how much data is being analyzed and the server hardware configuration, such as processor speed and disk drive speed. Once the scan completes you are presented with a detailed analysis of the disk space utilization. Clicking on any folder in the tree view enables you to drill down the directory tree.

image

 

From the application menu you can toggle showing the utilization by file type and see the treemap. Although the treemap and file type analysis are helpful, I prefer to just use the directory list because when I’m working on a server that’s running out of disk space, I need to get it resolved quickly.

image

 

In Summary

Having enough free disk space is a necessity for Windows servers to perform optimally. Graphical tools like JDiskReport and WinDirStat make it easy to identify where your server’s disk space is being consumed. Both are capable programs and work quickly to analyze disk space utilization. If I had to choose only one, I would pick WinDirStat because it doesn’t require any additional software to operate. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Nov 07, 2014
 

When it comes to improving Windows server performance, most sysadmins focus on hardware such as adding CPUs or RAM. However, low disk space can also impact performance, sometimes even causing critical processes such as backups to fail. Fortunately there are quite a few places to check on a Windows server to free up additional disk space. Some places to check are obvious, such as cleaning up log files, while others are less obvious, such as finding system temp files.

How to See System Files

Before searching for additional space you need to ensure that Windows Explorer will display hidden system files and file extensions. To confirm you can see these, open Windows Explorer and go to Folder & Search Options.

image

 

Click on the View tab and select Show hidden files, folders, and drives. Uncheck Hide protected operating system files and Hide extensions for known file types.  Making these changes will allow you to see all the files on the server including system files and folders which could be taking up unnecessary space. Click OK to close the window.

Before deleting anything always double check that you really don’t need the files any more and it’s safe to delete. Here are the top places that I check when I need to free up disk space on a Windows server.

1. Empty Recycle Bin

Cleaning up the recycle bin is most likely the easiest way to purge files unnecessarily taking up space. When you need to quickly clean up space this is the first place to check, and it is surprising how much space can accumulate over time. Every disk volume on the server has a $recycle.bin folder. As mentioned above you won’t be able to see it until you enable viewing system folders. In the picture below you can see there are plenty of deleted files waiting to be purged. Just select all the folders and right-click to delete them.

 

image

 

2. Compress IIS Log Files

The next thing I do when I need to free up disk space is to compress the IIS site log files. The default path to these files is %SystemDrive%\inetpub\logs\LogFiles. However, I prefer to redirect that path to something easier to find at the root of the disk drive such as C:\wwwlogs, and if the server has multiple drives I will store them on the largest drive. Unless you disable your site logs they will grow until they are deleted or the disk drive fills up. Enabling Windows file compression on the IIS logs directory tree will save a considerable amount of disk space.

image

 

To enable Windows file compression, just right-click on the logs folder and select Properties. Click the Advanced button, as shown in the picture above, and select Compress contents to save disk space. Click OK to close the window. Depending on how much content you have in the directory tree it may take several minutes to complete.
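The same compression can also be applied from an elevated command prompt with the built-in compact.exe tool; the path below is just an example:

compact /c /s:C:\wwwlogs /i /q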

 

image

The picture above is from an IIS logs folder where I enabled compression and as you can see it saved 62% of the space being utilized by the log files. You can squeeze even more free space from your IIS log files by zipping them with an archiving program. In a recent walkthrough of mine I show how to manage IIS logs with GZipStream.

 

3. Compress SQL Server Backups

 

The SQL Server backup folder is another great place to check when you need to free up some disk space. You can use the steps above to apply Windows file compression, as well as zipping the files, to free up additional disk space. In the picture below the SQL Server backup folder is using 1.8 GB of space without any compression.

image

After applying compression to this folder I was able to save approximately 60% of the disk space used by the backups. By zipping the files as well you can save even more space. Depending on your particular business needs, you can also save additional disk space by limiting the number of backups SQL Server stores on the server. This can be configured with a SQL Server Maintenance Plan.

 

4. Cleanup Performance Monitor Reports

Windows Performance Monitor is an invaluable tool for analyzing performance on a Windows server. Within minutes, one can easily configure a Data Collector to get deep insights on CPU, RAM, network IO, and disk IO. However, this convenience can also lead to disk space being needlessly consumed when the reports are forgotten days or weeks after the analysis has completed. This will be even more apparent if someone forgets to set a Stop Condition on the Data Collector and leaves it running for days.

image

The default path for the logs is usually C:\PerfLogs. The report path is also clearly shown in the Data Collector properties. Once your analysis has completed and you’ve reviewed the reports you can delete them. Applying Windows file compression to the reports folder as shown above will also help save disk space.

 

5. Cleanup Windows Error Reports

Windows Error Reporting is an exceptional tool for identifying issues on your server. Unless you delete the logs or disable the feature, they will accumulate over time. The default path to WER reports is C:\ProgramData\Microsoft\Windows\WER and there are two sub-directories below it. You can delete the files within those folders, but you should leave the two folders in place. This is another great place to apply Windows file compression to save more space.

image

 

 

6. Cleanup Windows Temp Files

There are several paths on a Windows server that are used temporarily when installing updates or new programs. In many cases Windows will automatically delete these files after the installation has completed; however, sometimes you’ll need to delete them yourself. One such folder is C:\Windows\ServiceProfiles\NetworkService\AppData\Local\Temp. In the picture below I was able to free up nearly 1 GB by deleting Malware Protection updates that had not been properly removed after they were installed.

clip_image002

Here are some other possible locations to look for temporary files that can be removed:

  • C:\temp
  • C:\Users\Default\AppData\Local\Temp
  • %localappdata%\Microsoft\Windows\Temporary Internet Files
  • %localappdata%\Microsoft\Windows\Explorer
  • %windir%\ServiceProfiles\LocalService\AppData\Local\Temp

 

7. Windows Disk Cleanup Tool

Trying to remember all the paths to temporary files can be a daunting challenge for any sysadmin. Fortunately Microsoft recognized this as well. On Windows Server 2008 R2 and Windows Server 2012 or later, you can get a Disk Cleanup tool like the one on the desktop versions of Windows. However, to take advantage of this you need to install the Desktop Experience feature, which is available using the Server Manager’s Add Features Wizard. Just check the feature and then complete the wizard.

image

 

After the feature has been installed you can access the Disk Cleanup tool from the Control Panel. It gives you a convenient way to clean up different types of temporary files, including Windows Update files and Windows Error Reporting files.

 

image

This tool is very helpful for cleaning up disk space. However, you should be aware that some additional programs you may not want on your server, such as Media Player, will be installed along with the Disk Cleanup tool. Here is a complete list of the programs that are installed with the Desktop Experience.

 

8. Windows Server 2008

All of the options listed above will also work on Windows Server 2008 systems. On Windows Server 2008 SP2 servers specifically, you can make the service pack permanent and reclaim nearly 1 GB of disk space by running the following command:

  • compcln.exe /VERBOSE:C:\temp\compcln.txt

 

9. Windows Server 2003

Windows Server 2003 reaches end of life on July 14, 2015. If you haven’t started migration plans for legacy systems on that platform then you need to start planning ASAP. A great place to clean up space on Windows Server 2003 is the hotfix uninstall files. Imagine my surprise when I logged into the server below to work on a low disk space situation and found over 1 GB of these legacy files going back to 2011. There are also files in the C:\windows\$hf_mig$ folder that can be cleaned up. However, it’s always a good idea to wait at least a week or two before deleting these files in case you need to roll back one of the hotfixes.

 

image

 

One additional way to free up space is to create a symbolic link from one directory to another on a larger disk drive. Mark Russinovich’s free Junction tool makes this very easy, however you have to be careful or you can inadvertently cause problems for yourself. Be sure to make a backup before using it the first time.

In Summary

Having your Windows server run out of space can cause serious performance issues as well as prevent important backup processes from running. I covered several great places to check on a Windows server when you need to free up space. Always confirm that files are safe to delete before you delete them. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Aug 28, 2014
 

Depending on how many sites your Windows web server is hosting, maintaining IIS logs can be a challenge. IIS logs provide valuable insight into the traffic your sites are experiencing as well as detailed SEO metrics and performance data. A typical web server will have just enough free disk space for future growth needs but is ultimately limited by the capacity of the drives in the server. If left unmonitored, IIS logs can quickly fill any remaining disk space on the server. There are a few 3rd party tools that are good at compressing log files when they are under one parent folder, but when the log files are in different locations, such as on a WebsitePanel server I support, an alternative solution is needed. In this walkthrough I will demonstrate how I solved this challenge using .NET and GZipStream.

 

What about Windows Folder Compression?

Enabling Windows folder compression should always be the first step to try when freeing up space used by IIS logs. I always redirect the site logs to a folder at the root level of the server so it’s easy to find them and monitor their growth. In the example below I saved 8.3 GB of disk space without even having to zip any files. However, Windows folder compression was not going to work for my WebsitePanel server.

Capture12

 

Finding WebsitePanel IIS Logs

WebsitePanel is a wonderful Windows control panel for multi-tenant web hosting and the best part is that it’s free. If you haven’t discovered this incredible application yet, check it out. Once installed, every aspect of site creation and maintenance is completely automated. As shown in the picture below, customer sites are usually deployed with a directory structure like c:\HostingSpaces\customername\sitename. The IIS logs are always stored further down the tree in …\sitename\logs\W3SVCXX. As one can imagine, on a web server with hundreds of sites it would be quite a challenge to track down and manually apply Windows compression on all the W3SVC folders. This would be further complicated as new sites are added to the server.

Untitled-1 copy

 

Using Directory.EnumerateFiles and GZipStream

I knew I was going to do some coding to solve my challenge, but I also wanted to keep the solution easy to use. In years past I would have considered VBScript folder recursion and executing the PKZip command line tool, but I wanted something more contemporary. So I used Visual Studio 2013 and created a Console Application.

Capture9

 

image

 

The GZipStream class, which was introduced in .NET 2.0, handles the necessary file compression. Since I would only be compressing individual files I didn’t have to use a separate library such as DotNetZip.

public static void Compress(FileInfo fileToCompress)
{
    using (FileStream originalFileStream = fileToCompress.OpenRead())
    {
        if ((File.GetAttributes(fileToCompress.FullName) & 
            FileAttributes.Hidden) != FileAttributes.Hidden & fileToCompress.Extension != ".gz")
        {
            using (FileStream compressedFileStream = File.Create(fileToCompress.FullName + ".gz"))
            {
                using (GZipStream compressionStream = new GZipStream(compressedFileStream, CompressionMode.Compress))
                {
                    originalFileStream.CopyTo(compressionStream);
                    Console.WriteLine("Compressed {0} from {1} to {2} bytes.",
                        fileToCompress.Name, fileToCompress.Length.ToString(), compressedFileStream.Length.ToString());
                }
            }
        }
    }
}

The Directory.EnumerateFiles method, introduced in .NET 4, finds all the IIS log folders in the WebsitePanel hosting spaces directory tree. Using the searchOption overload enables me to not only search for .log files but also limit the search to folders matching “W3SVC”. In the snippet below I loop through the search results, compress any log file older than one day, and delete the original file. The last step is to write the name of the file that was just zipped to an audit log.

public static void zipFiles()
{
    try
    {
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log", 
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            if (fi.LastWriteTime < DateTime.Now.AddDays(-1))
            {
                Compress(fi);
                fi.Delete();
                activityLog("Zipping: " + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}

I only needed 30 days’ worth of log files to be left on the server. In the snippet below I delete any zipped log files matching my date criteria and then write the activity to the audit log.

public static void deleteFiles()
{
    try
    {
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.gz", 
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            if (fi.LastWriteTime < DateTime.Now.AddDays(-daysToSave))
            {
                fi.Delete();
                activityLog("Deleting:" + file);
            }

        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}

Using an XML Config File

To help keep my application flexible I created a simple XML file to store the path to the WebsitePanel hosting spaces folder and the number of days of logs I wanted to keep on the server.

<?xml version="1.0" encoding="utf-8"?>
<MyConfig>
  <HomeDirectory>D:\HostingSpaces</HomeDirectory>
  <DaysToSave>30</DaysToSave>
</MyConfig>

 

Loading these values into my program is done by parsing the XML data using XmlDocument.

public static void readConfig()
{

    string path = System.IO.Path.GetFullPath(@"logCleanConfig.xml");
    XmlDocument xmlDoc = new XmlDocument();
    xmlDoc.Load(path);
    XmlNodeList HomeDirectory = xmlDoc.GetElementsByTagName("HomeDirectory");
    homeDirectory = HomeDirectory[0].InnerXml;

    XmlNodeList DaysToSave = xmlDoc.GetElementsByTagName("DaysToSave");
    daysToSave = Convert.ToInt32(DaysToSave[0].InnerXml);

}

 

Running the Program

Running the program from a command line I can see the progress as files are compressed. Afterwards I set up a scheduled task so it will run every day to compress new IIS log files and purge the ones older than 30 days.

Capture7
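Creating that daily task can also be scripted. Here is a sketch using schtasks; the task name, executable path, and start time are only examples:

schtasks /create /tn "Compress IIS Logs" /tr "C:\Tools\IISLogCompress.exe" /sc daily /st 02:00 /ru SYSTEM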

 

A few days later I came across several sites on the server using CMS products with their own application logs. These log files were not being cleaned up by the site owners and were taking up free space on the server, so I decided to archive them as well. Since they were being stored within each site in a subfolder called “Logs”, tweaking the searchOption parameters of the Directory.EnumerateFiles method would catch them. With a quick rebuild of the program, it now compresses the CMS log files as well as the IIS log files.

foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log",
  SearchOption.AllDirectories).Where(d => d.Contains("W3SVC") || d.Contains("Logs")).ToArray())

In Summary

IIS logs contain a wealth of data about web site traffic and performance. Left unchecked, over time the logs will accumulate and take up disk space on the server. Using Windows folder compression is a simple solution to reclaim valuable disk space. If your log files are in different locations, then leveraging the power of .NET with Directory.EnumerateFiles and GZipStream will do the trick. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Aug 09, 2014
 

One of the many benefits of using virtual servers over physical servers is the ability to add server resources such as CPU, RAM, and disk space on the fly without downtime. An additional drawback of a physical server is that you are often limited by its physical capacity; once those limits are reached the server can’t be upgraded further. Adding resources also required powering off the server, which in turn meant coordinating with business owners and impacted users. Not all editions of Windows support hot-add, so be sure to confirm your server is supported before starting. In this walkthrough I’ll show how easy it is to add server resources using VMware’s vSphere client.

Logging into vSphere Client

After authenticating on my network with the VMware vSphere 5.5 client I go to Hosts and Clusters under Inventory. From here I have access to all the virtual servers configured in our environment. After selecting the server to be upgraded you will see the Getting Started tab. From here you have access to the usual administrative tasks such as starting, stopping, and restarting the server, as well as performance reporting and events about the server. Click Edit virtual machine settings to add resources.

Capture2

 

Enabling RAM and CPU Hotplug

Adding the additional resources is straightforward. However, when you begin you may find the CPU and Memory properties disabled. This indicates that the server has not been previously enabled for hot adding resources, and in this instance the server will need to be shut down before you can upgrade these resources.

Capture3

 

Fortunately, fixing this for future upgrades is a simple matter. When the server is powered down, click on the Options tab of the Virtual Machine Properties. Under the Advanced settings go to the Memory/CPU Hotplug properties. Click Enable memory hot add and Enable CPU hot add, then click OK to save the changes. After the server is powered back on you will be able to add CPU and Memory without having to shut down the server first.

 

Untitled-1

 

To add additional virtual CPUs simply increase the Number of virtual sockets and click OK to save the changes.

core

 

To add additional Memory to the server adjust the Memory Configuration accordingly and click OK to save.

Untitled-2

 

 

Adding Additional Disk Space

In addition to adding CPU and Memory to the server during this maintenance window, I am also going to add disk space. Adding disk space is just as straightforward as adding CPU and Memory. In the Virtual Machine Properties, on the Hardware tab, go to the Hard disk settings. Increase the Provisioned Size by the new amount and click OK to save the changes. Windows will not automatically recognize the new space, so the final step of the upgrade is to log into the server and extend the server’s disk drive. This can be accomplished either using vSphere’s server console window or by connecting to the server with Remote Desktop.

Capture5

 

Extending Windows Disk Space

After logging into Windows open the Computer Management snap-in. In the console tree click on Disk Management under Storage. You may need to Rescan the disk before Windows will see that the new space is available.

Capture6

 

Step through the Extend Volume Wizard to allocate the additional space on the existing volume.

Capture7
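The rescan and extend steps can also be done from a command prompt with diskpart, sketched below; the volume number will differ on your server:

diskpart
DISKPART> rescan
DISKPART> list volume
DISKPART> select volume 2
DISKPART> extend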

 

In Summary

VMware vSphere offers System Administrators complete control over virtual server properties. Adding additional CPU, RAM, and disk space is straightforward and in many cases can be performed without having to shut down the server. To help minimize downtime during your next maintenance window, double check that your edition of Windows supports hot-add and confirm the Memory/CPU Hotplug property has been enabled. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Dec 27, 2013
 

Thanks to Microsoft’s Web Platform Installer (Web PI), installing IIS has never been so easy. Before Web PI became available, you had to use Server Manager to install the Web Server (IIS) role and then select the various Role Services you needed enabled. Depending on your level of expertise this could be a challenging task, with lots of scrolling back and forth and click upon click to get things just right, but now you can have IIS deployed with just 3 clicks of your mouse.

Install Web PI

If you’re not familiar with Web PI, it is a powerful tool that can be used to install not only IIS but also SQL Server Express, Visual Web Developer Express, PHP, WordPress, Umbraco, and many other 3rd party applications from the Windows Web Application Gallery. If you haven’t already done so, first download Web PI and install it. It’s free and has a small footprint of only 2 MB.

image

Select IIS Recommended Configuration

Once Web PI has been installed just launch the program. It will open to the Spotlight tab, so click on the Products tab and click Add next to IIS Recommended Configuration. If you don’t see it in the opening list just search for it. All you need to do after this is click Install at the bottom of the window.

 

image
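Web PI also ships with a command line client, WebpiCmd.exe, which is handy if you need to script the install. The product ID below is an assumption on my part, so list the available IDs first to confirm the exact name:

"%ProgramFiles%\Microsoft\Web Platform Installer\WebpiCmd.exe" /List /ListOption:Available
"%ProgramFiles%\Microsoft\Web Platform Installer\WebpiCmd.exe" /Install /Products:IISRecommendedConfiguration /AcceptEula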

 

You may be curious as to what options are installed with the IIS Recommended Configuration. Here is what will be installed:

  • ASP.NET
  • Static Content
  • Default Document
  • Directory Browsing
  • HTTP Errors
  • HTTP Logging
  • Logging Tools
  • Request Monitor
  • .NET Extensibility
  • Request Filtering
  • Static Content Compression
  • ISAPI Extensions
  • ISAPI Filters
  • WAS Process Model
  • Management Console
  • WAS Configuration API
  • WAS .NET Environment
  • .NET 4.5 Extended with ASP.NET for Windows 8
  • .NET 3.5 for Windows 8

Before the installation starts you need to accept the license terms so just click I Accept.

image

 

The installation will run for a few minutes installing the essential features for IIS to work properly.

image

 

Once Web PI has completed installing IIS just click Finish.

image

 

Using IIS Manager

Your server is now ready for hosting web sites. Open IIS Manager and you’ll see the Default web site has been configured.

image

 

When you browse http://localhost you’ll see the familiar IIS Start Page.

image

This page is named iisstart.htm and appears in the Default Documents list above default.aspx, so once you upload your web site files be sure to delete this page.

Next Steps?

Now that you have IIS installed, what’s next? You’ll want to go back to Web PI and at least install FTP Publishing. Once you have FTP Publishing installed, you’ll want to look into configuring FTP User Isolation as well as using FTP over SSL for greater security when transferring content to and from your server. You may also want to look at installing URL Rewrite 2.0 from Web PI. URL Rewrite offers many ways to rewrite URLs for SEO, perform 301 redirects, and block page requests.

Summary

The Web Platform Installer (Web PI) is a powerful tool for deploying a wide variety of 3rd party applications such as WordPress and other popular CMS products, but it can also be used to install IIS or even SQL Server Express on your server. Web PI offers unparalleled ease and convenience when installing applications on Windows servers. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS
