Peter Viola

Creative, customer-focused, results-oriented Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS

Blocking SQL Injection with IIS Request Filtering

IIS, Windows Server 2008, Windows Server 2012 · Feb 6, 2015

SQL Injection became a favorite hacking technique in 2007. Despite being widely documented for so many years, it continues to evolve and be used. Because SQL Injection is such a well known attack vector, I am always surprised when, as a sysadmin, I come across someone’s site that has been compromised by it. In most instances the site was compromised because user data entered on web forms was not properly validated. Classic ASP sites using inline SQL queries with hardcoded query string parameters are especially vulnerable. Fortunately, regardless of a site’s potential programming weaknesses, it can still be protected. In this walkthrough I will cover how to protect your site from SQL Injection using IIS Request Filtering.


Identifying SQL Injection Requests


Most people find out about SQL Injection the hard way, after their web site has been defaced or their database has been compromised. If you are not sure whether your site has been attacked, you need look no further than your web site’s traffic logs. Upon visual inspection of the IIS site logs you will see blocks of legitimate requests interspersed with malicious ones. Checking the HTTP status code at the end of each request will indicate whether there is an issue. If the status code was a 404 or a 500, you know the malicious request didn’t work. However, if the request returned an HTTP status code of 200, you should be concerned.


Using Log Parser to Find SQL Injection Requests


Looking through web logs for malicious requests can be a tedious and time-consuming task. Microsoft’s wonderful log analysis tool Log Parser makes it easier to identify pages on your site that are being targeted by SQL Injection attacks. Running a query like the one below will create a report of all the page requests in the log with query string parameters.
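A sketch of such a query; the log path and output file below are examples you would adjust for your server:

logparser.exe -i:iisw3c "SELECT cs-uri-stem, cs-uri-query, COUNT(*) AS Hits FROM C:\wwwlogs\W3SVC1\*.log WHERE cs-uri-query IS NOT NULL GROUP BY cs-uri-stem, cs-uri-query ORDER BY COUNT(*) DESC" -rtp:-1 > query-report.txt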


Using Findstr to Find SQL Injection Requests


Using Log Parser to identify malicious requests is helpful, but if you need to look at multiple sites’ logs the task becomes more challenging. For these situations I like to use Findstr. Findstr is a powerful Windows tool that uses regular expressions to search files for any string value. One powerful feature is that you can store your search strings in separate files and even exclude certain strings from being searched. In the example below, I use the /g parameter to have Findstr load a file named sqlinject.txt with my predefined strings and then search all the web logs in the W3SVC1 folder. The output is redirected to a file called results.txt.


findstr /s /i /p  /g:sqlinject.txt C:\wwwlogs\W3SVC1\*.log >c:\results.txt
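For reference, sqlinject.txt is just a plain text file with one search string per line. The entries below are examples of the kinds of strings you might include, not the original file:

cast(
declare @
exec(
xp_cmdshell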


Using this syntax it is easy to extend the capability of Findstr by creating a simple batch file covering all the web log folders on your server, as shown in the sketch below. Once set up, you will be able to identify any SQL Injection requests against your server within minutes.
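A minimal sketch of such a batch file; the folder layout under C:\wwwlogs is an example:

@echo off
rem Scan every site's log folder for the strings in sqlinject.txt
if exist C:\results.txt del C:\results.txt
for /d %%D in (C:\wwwlogs\W3SVC*) do (
    findstr /s /i /p /g:sqlinject.txt "%%D\*.log" >> C:\results.txt
)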


Configuring IIS Request Filtering


The Request Filtering module was introduced in IIS 7 as a replacement for the very capable UrlScan. Using Log Parser and Findstr you will be able to identify plenty of malicious requests attacking the web sites on your server. A Request Filtering rule can block requests based on file extensions, URL, HTTP verbs, headers, or query strings. Additionally, you can block requests based on a maximum query string size and URL length.


Like any other IIS module, you can maintain the settings outside of IIS Manager by editing the web.config. The values are stored in the <requestFiltering> section within <system.webServer>.
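As a rough sketch, a deny rule stored in web.config looks something like this; the rule name and deny string are examples:

<system.webServer>
  <security>
    <requestFiltering>
      <filteringRules>
        <filteringRule name="BlockCast" scanUrl="false" scanQueryString="true">
          <denyStrings>
            <add string="cast(" />
          </denyStrings>
        </filteringRule>
      </filteringRules>
    </requestFiltering>
  </security>
</system.webServer>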


Filtering Rules


I typically create 1 rule for each Deny String that I want to block. You can add multiple strings to each rule, however I find it easier to maintain when only using 1 string per rule. This way, if a rule is inadvertently blocking legitimate requests, you can quickly disable it while leaving the other ones operational. The request URL or the query string can be scanned for Deny Strings. However, enabling URL scanning requires a bit of forethought, because if the Deny String matches any part of the name of a page on your site, requests to that page will be blocked. For example, if you want to block requests containing the SQL command “update” but there happens to be a page called update.aspx on your site, any request to update.aspx will be blocked.


If I had to pick only 1 Deny String for filtering it would be cast(. Nearly every SQL Injection request I’ve seen uses cast(, and no legitimate page or query string parameter should contain this string.


404 Status Codes


Any request blocked by Request Filtering will return a 404 error status with a specific substatus to identify the reason it was denied. A few common ones are listed below.

HTTP Substatus    Description
404.5             URL Sequence Denied
404.14            URL Too Long
404.15            Query String Too Long
404.18            Query String Sequence Denied
404.19            Denied by Filtering Rule


With Request Filtering enabled, it is easy to keep an eye on blocked requests. A Log Parser query like the one below will create a report of all the requests with HTTP status 404 and a substatus greater than zero.
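A sketch of such a query; the log path and output file are examples:

logparser.exe -i:iisw3c "SELECT sc-status, sc-substatus, cs-uri-stem, COUNT(*) AS Hits FROM C:\wwwlogs\W3SVC1\*.log WHERE sc-status = 404 AND sc-substatus > 0 GROUP BY sc-status, sc-substatus, cs-uri-stem ORDER BY COUNT(*) DESC" -rtp:-1 > blocked-requests.txt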


Blocking User Agents


While investigating a recent SQL Injection attack I noticed in the IIS logs that the site had been compromised by an automated tool. It was interesting to see how the first malicious request was very basic and then each subsequent one became more elaborate, with complex SQL queries. What I found even more curious was that each request used the same User-Agent, which in fact identified the name of the tool and where to download it.


Their web site clearly states the creators of the tool released it with the intention of helping system administrators discover vulnerabilities. Unfortunately, it’s far too easy for someone to use it with malicious intent. The good news is that blocking requests based on the User-Agent is quite easy. You just need to create a new rule, specify User-Agent in the header, and then add the name of the agent to the Deny Strings. As the 404.19 status in the picture above shows, the automated tool was successfully blocked the next time around after the rule was added.
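A sketch of what such a rule looks like in web.config; the scanner name below is a stand-in for whatever agent shows up in your logs:

<filteringRule name="BlockScannerUA" scanUrl="false" scanQueryString="false">
  <scanHeaders>
    <add requestHeader="User-Agent" />
  </scanHeaders>
  <denyStrings>
    <add string="EvilScanner" />
  </denyStrings>
</filteringRule>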


In Summary


SQL Injection is a popular attack vector for web sites but by leveraging IIS Request Filtering the malicious requests can be easily blocked. Using Log Parser and Findstr you can quickly check your site’s logs to identify suspicious activity. Thanks for reading.



Troubleshoot Windows Server 2012 Performance with a Data Collector

Windows Server 2008, Windows Server 2012 · Dec 14, 2014

Regardless of whether you are running Windows Server 2012 on a virtual or physical server, the success of your business depends on having the server run at optimal capacity. To ensure the server delivers uninterrupted service, you have to be aware of potential performance issues before they arise.

One of the best methods to analyze the performance of Windows Server 2012 is with Performance Monitor and a User Defined Data Collector. With this tool the identification and analysis of potential performance issues has never been easier. Upon completion, a detailed summary report will be generated providing immediate insight into key aspects of the server’s performance, such as Disk IO, CPU, and RAM, as well as network utilization. Reading the report summary is simplified further with the use of green, yellow, and red icons that call your attention to any irregularities. Additional in-depth metrics are contained in collapsible sections of the report below the Summary.

Creating a New Data Collector

To create a new User Defined Data Collector, open Performance Monitor, right-click on User Defined, and select New > Data Collector Set. A wizard will launch to guide you through creating a new Data Collector. Once created, the Data Collector will be available to run as frequently as needed. Each time it runs, a new report will be created.

The first step will be to enter the name of your report. I usually specify “Performance” somewhere in the name since that is the type of Data Collector I am planning on running. Choosing the default option of Create from a template is recommended. Click Next to continue.

The next step will be to choose the Data Collector Template that you want to use. I am going to choose System Performance. Click Next to continue.

Next you will be prompted to choose a path to store the report data. Depending on how long your report runs and how frequently you run it, the reports can consume a lot of space. If your server has multiple disk drives, it is better to select the larger drive for storing the reports. Click Next to continue.

Leave <Default> for the Run as: user context. You can change that later if needed. We need to configure some additional settings before running, so select Open properties for this data collector set and then click Finish.


Additional Data Collector Properties

Before running your new data collector there are a few properties that you want to double check first.

Setting the Stop Condition

With the properties open, click on the Stop Condition tab so that you can enter a specific period of time for the Data Collector to run. It is important to set a Stop Condition before running, otherwise it will continue to run indefinitely until you manually stop it. As I noted earlier, not only can the logs take up disk space, but running a Data Collector for an extended period of time can also impact server performance, so specifying a Stop Condition is a good idea. For short tests I typically set 20-30 minutes. For longer tests I’ll set 2-3 hours.


Setting a Recurring Schedule

Chances are you may already be aware of a performance problem on your server and need to isolate the analysis window to a specific day or time period. Clicking on the Schedule tab will enable you to specify multiple dates and times to run the Data Collector. This could be especially helpful if your server gets busy with after-hours utilization and you’re not available to start the data collector manually.

You can even select a date range to run the data collector on specific days of the week during that period of time.

Once you’ve finished setting the properties of the data collector, just right-click on the name to run it manually or wait for the schedule to start it automatically.


Viewing the Summary Report

You will be able to view and analyze the report generated by the Data Collector once it has completed running. If you try to view the report before it has completed, you will be notified that the Data Collector is still running. The report is located under the User Defined Reports section of Performance Monitor.

The overall performance of the server is displayed at the top of the report in the Summary. Anything requiring your immediate attention is noted in the Diagnostic Results section. In the picture below we can see that the server clearly needs additional RAM to alleviate the disk paging that is occurring. The Resource Overview offers an easy-to-read chart of the server’s core resources: CPU, Network, Disk, and Memory. The status of each of these is indicated with green, yellow, or red icons.

Below the Summary are collapsible sections that offer more detailed insight into the server’s CPU, Network, Disk, and Memory utilization. Here are two examples of the additional data that is available:

CPU Utilization

In the picture below we can see that one IIS worker process was consuming nearly 80% of the server’s CPU. Performing additional analysis with Log Parser on that web site’s logs would help identify the problems this particular site is experiencing.
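For example, a Log Parser query along these lines (the log path is an example) would surface the busiest and slowest pages on the suspect site:

logparser.exe -i:iisw3c "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits, AVG(time-taken) AS AvgMs FROM C:\wwwlogs\W3SVC1\*.log GROUP BY cs-uri-stem ORDER BY COUNT(*) DESC" -rtp:-1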

Disk IO

Some cloud server providers will charge overage fees for excessive disk IO, so it’s important to know what’s happening there. In the Disk summary there is a helpful report that shows exactly which files on your server are consuming the most IO. This report is aptly named Files Causing Most Disk IOs. In the picture below we can see that pagefile.sys is causing a lot of disk IO. This is a good indication that the server could benefit from additional RAM, thereby reducing the amount of disk paging that is occurring.


Viewing the Data Counters

In addition to reading the data collector report you also have the ability to view the raw counter data. From this view you can select all the counters that were collecting data or only a few and play back the utilization as it occurred.

In Summary

Windows Server 2012 offers several tools for analyzing your server’s performance. The Performance Monitor Data Collector offers comprehensive insight into resource utilization and makes it easy to quickly identify performance bottlenecks. Thanks for reading.


How To Quickly Analyze Windows Server Disk Space Utilization

Windows Server 2008, Windows Server 2012 · Nov 29, 2014

When your Windows server is low on space, or runs out of space entirely, you need to quickly identify where the disk space is being utilized and free some up. Low disk space, or worse yet no disk space, can have a negative impact on your server’s performance. Knowing the paths to a few folders that typically eat up space, such as web logs, isn’t enough when you need to free up space ‘now’. In this situation you need a graphical tool that can quickly analyze an entire disk drive, or even multiple drives, and show you how the server’s space is being utilized. Fortunately for Windows server admins, JDiskReport and WinDirStat are two such tools, and better still, they are both free.

Using JDiskReport

JDiskReport is a free graphical disk space tool from jgoodies.com. Unlike some free tools that require you to register before they work, or make you pay to unlock features, JDisk is feature complete and ready to use as soon as it’s installed.

Before you install JDisk you should know that it requires the Java Runtime. If the Java Runtime is missing when you install JDisk, it will prompt you to locate the path to the Java Runtime. Once you’ve downloaded JDisk to your server, just launch the installation wizard. The only additional step of the wizard will be to specify the path where you want it to be installed.

Once installation has completed you will be presented with the default starting screen. Any previous paths that you’ve analyzed will be displayed for added convenience. You can select the entire disk drive or a specific folder on the server.

Unless I have a specific folder in mind I typically pick the entire disk drive. Within a few minutes of initiating a directory scan, you will see a detailed analysis of the server’s disk space utilization. This report is more than just a pretty picture. Not only can you click on any folder of the navigation tree to drill down further, but you can also click on any part of the pie chart to see subdirectories.

In the picture above we can see that the Windows folder is using the most space, but that is to be expected on a C: drive. Looking more closely I can see that on this server C:\temp is using over 9 GB, and that’s unusual, so there are probably some files in there I can delete to free up valuable space. In addition to the colorful chart you can also get a detailed file list and sort it by size. In the picture below we can see a more detailed look at C:\temp.

Within minutes of running the scan, JDisk has helped me find several large files which can be deleted.

Using WinDirStat

WinDirStat can be downloaded from windirstat.info and is available in 12 different languages. It offers some interesting features such as an option to delete files, a color-coded treemap, and disk space utilization by file type. Installing WinDirStat is just as simple as installing JDisk. Upon launching the wizard you’ll be prompted to accept the GNU GPL. After that you just need to choose the features and then pick the installation path.

When the program first opens, it will display all of the disk drives available for analysis. If your server happens to have any network drives mapped, they will also be displayed. Here you have the option to scan all the drives on the server, just one drive, or a specific folder.

Scanning the disk drive completes quickly, however it’s hard to say whether WinDirStat is faster or slower than JDisk. The speed of both programs will ultimately depend on how much data is being analyzed and the server hardware configuration, such as processor speed and disk drive speed. Once it completes you are presented with a detailed analysis of the disk space utilization. Clicking on any folder in the tree view enables you to drill down in the directory tree.

From the application menu you can toggle showing the utilization by file type and see the treemap. Although the treemap and file type analysis are helpful, I prefer to just use the directory list, because when I’m working on a server that’s running out of disk space I need to get it resolved quickly.

In Summary

Having enough free disk space is a necessity for Windows servers to perform optimally. Graphical tools like JDiskReport and WinDirStat make it easy to identify where your server’s disk space is being consumed. Both are capable programs and work quickly to analyze disk space utilization. If I had to choose only one, I would pick WinDirStat because it doesn’t require any additional software to operate. Thanks for reading.


Nov 7, 2014

When it comes to improving Windows server performance, most sysadmins focus on hardware such as adding CPUs or RAM. However, low disk space can also impact performance, sometimes even causing critical processes such as backups to fail. Fortunately there are quite a few places to check on a Windows server to free up additional disk space. Some places to check are obvious, such as cleaning up log files, while other places are not as obvious, such as finding system temp files.


How to See System Files


Before searching for additional space you need to ensure that Windows Explorer will display hidden system files and file extensions. To confirm you can see these, open Windows Explorer and go to Folder & Search Options.


Click on the View tab and select Show hidden files, folders, and drives. Uncheck Hide protected operating system files and Hide extensions for known file types.  Making these changes will allow you to see all the files on the server including system files and folders which could be taking up unnecessary space. Click OK to close the window.


Before deleting anything, always double check that you really don’t need the files any more and that they are safe to delete. Here are the top places that I check when I need to free up disk space on a Windows server.


1. Empty Recycle Bin


Cleaning up the recycle bin is most likely the easiest way to purge files unnecessarily taking up space. When you need to quickly clean up space this is the first place to check. It is surprising how much space can accumulate over time. Every disk volume on the server has a $recycle.bin folder. As mentioned above, you won’t be able to see it until you enable viewing system folders. In the picture below you can see there are plenty of deleted files waiting to be purged. Just select all the folders and right-click to delete them.


2. Compress IIS Log Files


The next thing I do when I need to free up disk space is to compress the IIS site log files. The default path to these files is %SystemDrive%\inetpub\logs\LogFiles. However, I prefer to redirect that path to something easier to find at the root of the disk drive, such as C:\wwwlogs. If the server has multiple drives I will store them on the largest drive. Unless you disable your site logs, they will automatically grow until the disk drive fills up or they are deleted. Enabling Windows file compression on the IIS logs directory tree will save a considerable amount of disk space.


To enable Windows file compression, just right-click on the logs folder and select Properties. Click the Advanced button, as shown in the picture above, and select Compress contents to save disk space. Click OK to close the window. Depending on how much content you have in the directory tree, it may take several minutes to complete.
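If you prefer the command line, the built-in compact.exe tool does the same thing; the path below is an example:

compact /c /s:C:\wwwlogs /i /q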


The picture above is from an IIS logs folder where I enabled compression and as you can see it saved 62% of the space being utilized by the log files. You can squeeze even more free space from your IIS log files by zipping them with an archiving program. In a recent walkthrough of mine I show how to manage IIS logs with GZipStream.


3. Compress SQL Server Backups


The SQL Server backup folder is another great place to check when you need to free up some disk space. You can use the steps above to apply Windows file compression, as well as zip the files, to free up additional disk space. In the picture below the SQL Server backup folder is using 1.8 GB of space without any compression.


After applying compression to this folder I was able to save approximately 60% of the disk space used by the backups. By zipping the files as well, you can save even more space. Depending on your particular business needs, you can also save additional disk space by limiting the number of backups SQL Server stores on the server. This can be configured with a SQL Server Maintenance Plan.


4. Cleanup Performance Monitor Reports


Windows Performance Monitor is an invaluable tool to analyze performance on a Windows server. Within minutes, one can easily configure a Data Collector to get deep insights on CPU, RAM, Network IO, and Disk IO. However, this convenience can also lead to disk space being needlessly consumed when you have forgotten about the reports days or weeks after the analysis has completed. This will be even more apparent if someone forgets to set a Stop Condition on the Data Collector and leaves it running for days.


The default path for the logs is usually C:\PerfLogs. The report path is also clearly shown in the Data Collector properties. Once your analysis has completed and you’ve reviewed the reports you can delete them. Applying Windows file compression to the reports folder as shown above will also help save disk space.


5. Cleanup Windows Error Reports


Windows Error Reporting is an exceptional tool for identifying issues on your server. Unless you delete the logs or disable the feature they will accumulate over time. The default path to WER reports is C:\ProgramData\Microsoft\Windows\WER and there are two sub-directories below it. You can delete the files in the folders but you should leave the 2 folders in place. This is another great place to apply Windows file compression to save more space.


6. Cleanup Windows Temp Files


There are several paths on Windows servers that are used temporarily when installing updates or new programs. In many cases Windows will automatically delete these files after the installation has completed. However, sometimes you’ll need to delete them yourself. One such folder is C:\Windows\ServiceProfiles\NetworkService\AppData\Local\Temp. In the picture below I was able to free up nearly 1 GB by deleting Malware Protection updates that had not been properly removed after they were installed.


Here are some other possible locations to look for temporary files that can be removed:

  • C:\temp
  • C:\Users\Default\AppData\Local\Temp
  • %localappdata%\Microsoft\Windows\Temporary Internet Files
  • %localappdata%\Microsoft\Windows\Explorer
  • %windir%\ServiceProfiles\LocalService\AppData\Local\Temp

 


7. Windows Disk Cleanup Tool


Trying to remember all the paths to temporary files can be a daunting challenge for any sysadmin. Fortunately, Microsoft recognized this as well. On Windows Server 2008 R2 and Windows Server 2012 or later, you can get a Disk Cleanup tool like the one on the desktop versions of Windows. However, to take advantage of this you need to install the Desktop Experience feature, which is available using the Server Manager’s Add Features Wizard. Just check the feature and then complete the wizard.
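On Windows Server 2012 the same feature can also be added from an elevated PowerShell prompt; a one-liner sketch (a restart is required afterwards):

Install-WindowsFeature Desktop-Experience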


After the feature has been installed and the server restarted, you can access the Disk Cleanup tool from the Control Panel. You will have a convenient way to clean up different types of temporary files, including Windows Update files and Windows Error Reporting files.


This tool is very helpful for cleaning up disk space. However, you should be aware that some additional programs which you may not want on your server, such as Media Player, will be installed along with the Disk Cleanup tool. Here is a complete list of the programs that are installed with the Desktop Experience.


8. Windows Server 2008


All of the options listed above will also work on Windows Server 2008 systems. However, specifically on Windows Server 2008 SP2 servers, you can make the service pack permanent and free up nearly 1 GB of disk space by running the following command:

compcln.exe /VERBOSE:C:\temp\compcln.txt

 


9. Windows Server 2003


Windows Server 2003 reaches end of life on July 14, 2015. If you haven’t started migration plans for legacy systems on that platform, you need to start planning ASAP. A great place to clean up space on Windows Server 2003 is to delete the hotfix uninstall files. Imagine my surprise when I logged into the server below to work on a low disk space situation and found over 1 GB of these legacy files going back to 2011. There are also files in the C:\windows\$hf_mig$ folder that can be cleaned up. However, it’s always a good idea to wait at least a week or two before deleting these files in case you need to roll back one of the hotfixes.


One additional way to free up space is to create a symbolic link from one directory to another on a larger disk drive. Mark Russinovich’s free Junction tool makes this very easy, however you have to be careful or you can inadvertently cause problems for yourself. Be sure to make a backup before using it the first time.
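A sketch of the idea; the paths are examples, and the target folder must already contain the moved data:

junction.exe C:\wwwlogs D:\wwwlogs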


In Summary


Having your Windows server run out of space can cause serious performance issues as well as prevent important backup processes from running. I covered several great places to check on a Windows server when you need to free up space. Always confirm that files are safe to delete before you delete them. Thanks for reading.



Aug 28, 2014

Depending on how many sites your Windows web server is hosting, maintaining IIS logs can be a challenge. IIS logs provide valuable insight into the traffic your sites are experiencing as well as detailed SEO metrics and performance data. A typical web server will have just enough free disk space for future growth needs, but ultimately it will be limited by the capacity of the drives in the server. If left unmonitored, IIS logs can quickly fill any remaining disk space on the server. There are a few 3rd party tools that are good at compressing log files when they are under one parent folder, but when the log files are in different locations, such as on a WebsitePanel server I support, an alternative solution is needed. In this walkthrough I will demonstrate how I solved this challenge using .NET and GZipStream.

What about Windows Folder Compression?

Enabling Windows folder compression should always be the first step when freeing up space used by IIS logs. I always redirect the site logs to a folder at the root level of the server so it’s easy to find them and monitor their growth. In the example below I saved 8.3 GB of disk space without even having to zip any files. However, Windows folder compression was not going to work for my WebsitePanel server.

Finding WebsitePanel IIS Logs

WebsitePanel is a wonderful Windows control panel for multi-tenant web hosting, and the best part is that it’s free. If you haven’t discovered this incredible application yet, check it out. Once installed, every aspect of site creation and maintenance is completely automated. As shown in the picture below, customer sites are usually deployed with a directory structure like c:\HostingSpaces\customername\sitename. The IIS logs are always stored further down the tree in …\sitename\logs\W3SVCXX. As one can imagine, on a web server with hundreds of sites it would be quite a challenge to track down and manually apply Windows compression on all the W3SVC folders. This would be further complicated as new sites are added to the server.

Using Directory.EnumerateFiles and GZipStream

I knew I was going to do some coding to solve my challenge, but I also wanted to keep the solution easy to use. In years past I would have considered VBScript folder recursion and executing the PKZip command line tool, but I wanted something more contemporary. So I used Visual Studio 2013 and created a Console Application.

The GZipStream class, which was introduced in .NET 2.0, would handle the necessary file compression. Since I would only be compressing individual files I didn’t have to deal with using a separate library such as DotNetZip.


The Directory.EnumerateFiles method, introduced in .NET 4, would find all the IIS log folders in the WebsitePanel hosting spaces directory tree. Using the searchOption overload would enable me to not only search for .log files but also limit the searching to folders matching “W3SVC”. In the snippet below I loop through the search results, compress any log file older than 1 day, and delete the original file. The last step is to write the name of the file that was just zipped to an audit log.
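The original snippet follows this shape; a minimal sketch, where the root path and audit log location are examples:

// Requires: using System; using System.IO; using System.IO.Compression;
string root = @"C:\HostingSpaces";   // example path
foreach (string file in Directory.EnumerateFiles(root, "*.log", SearchOption.AllDirectories))
{
    // Limit to IIS log folders, and skip logs written within the last day
    if (!file.Contains(@"\W3SVC")) continue;
    if (File.GetLastWriteTime(file) > DateTime.Now.AddDays(-1)) continue;

    using (FileStream source = File.OpenRead(file))
    using (FileStream target = File.Create(file + ".gz"))
    using (GZipStream gzip = new GZipStream(target, CompressionMode.Compress))
    {
        source.CopyTo(gzip);
    }

    File.Delete(file);
    File.AppendAllText(@"C:\ziplog.txt", file + Environment.NewLine);   // audit log
}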


I only needed 30 days’ worth of log files to be left on the server. In the snippet below I delete any zipped log files matching my date criteria and then write the activity to the audit log.
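Again as a sketch, continuing with the same example paths:

foreach (string zipped in Directory.EnumerateFiles(root, "*.gz", SearchOption.AllDirectories))
{
    // Purge archives older than the 30-day retention window
    if (File.GetLastWriteTime(zipped) < DateTime.Now.AddDays(-30))
    {
        File.Delete(zipped);
        File.AppendAllText(@"C:\ziplog.txt", "deleted " + zipped + Environment.NewLine);
    }
}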


Using an XML Config File

To help keep my application flexible I created a simple XML file to store the path to the WebsitePanel hosting spaces folder and the number of days of logs I wanted to keep on the server.
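A sketch of such a file; the element names are illustrative, not the original ones:

<?xml version="1.0"?>
<settings>
  <hostingSpacesPath>C:\HostingSpaces</hostingSpacesPath>
  <daysToKeep>30</daysToKeep>
</settings>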


Loading these values into my program is done by parsing the XML data using XmlDocument.
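A sketch of the parsing, matching the illustrative element names above:

// Requires: using System.Xml;
XmlDocument doc = new XmlDocument();
doc.Load(@"C:\LogCompressor\settings.xml");   // example path
string hostingSpacesPath = doc.SelectSingleNode("//hostingSpacesPath").InnerText;
int daysToKeep = int.Parse(doc.SelectSingleNode("//daysToKeep").InnerText);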


Running the Program


Running the program from a command line I can see the progress as files are compressed. Afterwards I set up a scheduled task so it will run every day to compress new IIS log files and purge the ones older than 30 days.

A few days later I came across several sites on the server using CMS products with their own application logs. These log files were not being cleaned up by the site owners and were taking up free space on the server, so I decided to archive them as well. Since they were being stored within each site in a subfolder called “Logs”, tweaking the searchOption parameters of the Directory.EnumerateFiles method would catch them. With a quick rebuild of the program, it now compresses the CMS log files as well as the IIS log files.


In Summary


IIS logs contain a wealth of data about web site traffic and performance. Left unchecked, over time the logs will accumulate and take up disk space on the server. Using Windows folder compression is a simple solution to reclaim valuable disk space. If your log files are in different locations, then leveraging .NET with Directory.EnumerateFiles and GZipStream will do the trick. Thanks for reading.


How to Hot Add CPU and RAM with VMware vSphere

Windows Server 2008, Windows Server 2012 · Aug 9, 2014

One of the many benefits of using virtual servers over physical servers is the ability to add server resources such as CPU, RAM, and disk space on the fly without downtime. An additional drawback of a physical server is that you are often limited by its physical capacity. Once those limits are reached the server can’t be upgraded further. Adding resources also requires powering off the server, which in turn requires coordinating with business owners and impacted users. Not all editions of Windows support hot-add, so be sure to confirm your server is supported before starting. In this walkthrough I’ll show how easy it is to add server resources using VMware’s vSphere client.

Logging into vSphere Client

After authenticating on my network with the VMware vSphere 5.5 client, I go to Hosts and Clusters under Inventory. From here I have access to all the virtual servers configured in our environment. After selecting the server to be upgraded you will see the Getting Started tab. From here you have access to the usual administrative tasks such as starting, stopping, and restarting the server, as well as performance reporting and events about the server. Click Edit virtual machine settings to add resources.

Enabling RAM and CPU Hotplug

Adding the additional resources is straightforward. However, when you begin you may find the CPU and Memory properties disabled. This indicates that the server has not been previously enabled for hot adding resources. In this instance the server will need to be shut down before you can upgrade these resources.

Fortunately, fixing this for future upgrades is a simple matter. When the server is powered down, click on the Options tab of the Virtual Machine Properties. Under the Advanced settings go to the Memory/CPU Hotplug properties. Click Enable memory hot add and Enable CPU hot add. Click OK to save the changes. After the server is powered back on you will be able to add CPU and Memory without having to first shut down the server.

To add additional virtual CPUs simply increase the Number of virtual sockets and click OK to save the changes.

To add additional Memory to the server adjust the Memory Configuration accordingly and click OK to save.

Adding Additional Disk Space

In addition to adding CPU and Memory to the server during this maintenance window, I am also going to add disk space. Adding additional disk space is just as straightforward as adding CPU and Memory. In the Virtual Machine Properties on the Hardware tab go to the Hard disk settings. Increase the Provisioned Size by the new amount and click OK to save the changes. Windows will not automatically recognize the new space, so the final step of the upgrade will be to log into the server and extend the server’s disk drive. This can be accomplished either using vSphere’s server console window or by connecting to the server with Remote Desktop.

Extending Windows Disk Space

After logging into Windows, open the Computer Management snap-in. In the console tree click on Disk Management under Storage. You may need to rescan the disks before Windows will see that the new space is available.

Step through the Extend Volume Wizard to allocate the additional space on the existing volume.
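The same thing can be done from the command line with diskpart; a sketch, where the volume number is an example you would read from the list volume output:

diskpart
DISKPART> rescan
DISKPART> list volume
DISKPART> select volume 2
DISKPART> extend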

In Summary

VMware vSphere offers System Administrators complete control over virtual server properties. Adding additional CPU, RAM, and disk space is straightforward, and in many cases can be performed without having to shut down the server. To help minimize the downtime of your next maintenance window, double check that the edition of your Windows server supports hot-add and confirm the Memory/CPU Hotplug property has been enabled. Thanks for reading.


3 Steps to Securing FTP on IIS 8

IIS, Windows Server 2012 · Aug 7, 2014

The FTP protocol is some 43 years old now, yet it continues to be one of the most widely used file transfer technologies available. Over the years it has been shown to be vulnerable to brute force attacks, packet capture, and other attack vectors. Fortunately, with IIS 8 on Windows Server 2012 your FTP server doesn’t have to be vulnerable. It goes without saying that FTP Authentication and Authorization are the most fundamental methods to secure your server. Here are three additional things you can do to increase the security of your server’s FTP service and minimize its attack footprint.

IIS 8 FTP Logon Attempt Restrictions

One of the most common FTP attack vectors is the dictionary attack. Using automated tools, hackers will repeatedly hammer your FTP site with thousands of username and password combinations hoping to find that one account with an easy password. In the picture below you can see just a snippet of this automated activity. Fortunately none of these attempts were successful.

IIS 8 now features FTP Logon Attempt Restrictions. This powerful feature is not available in IIS 7 or IIS 7.5. Once configured, automated logon attacks will be stopped in their tracks. From IIS Manager simply click on FTP Logon Attempt Restrictions.

Configuring the FTP Logon Attempt Restrictions module is easy. Simply choose how many logon attempts are to be allowed and the time period in which they may occur. When you consider that most FTP clients will save passwords in a profile, legitimate users on your FTP site should only need 1-2 logon attempts. However, depending on how many FTP users you’re hosting and their technical savvy, you may need to tweak these settings.

Testing my FTP site with the new logon attempt restrictions, it is easy to see how well it works. After the threshold is exceeded my connection to the server is forcibly closed. Automated hack attempts will no longer be a threat to this FTP server.

Enable FTP Over SSL with IIS 8

The FTP protocol wasn’t originally designed for encryption. Fortunately, with IIS 8 (and IIS 7) your FTP sessions can now be encrypted with SSL. To configure FTPS, also known as FTP over SSL, open IIS Manager. You can either specify using SSL when adding FTP Publishing to a site, or just go to the FTP SSL Settings on an existing site. Connecting with SSL can either be optional, or you can force all connections to use it. Using the drop down menu, choose the certificate that you want to be used to encrypt the connections. Windows Server 2012 has a default certificate available, however you are also welcome to install your own 3rd party certificate.

After you’ve configured the SSL settings on the server, you just need to change your FTP client connection properties. In the picture below I’m using a legacy version of CuteFTP 8.0. Depending on which FTP client you’re using, your protocol menu will look different.

Having changed my FTP client settings, I attempt to connect to the server using SSL. The first time you connect to the server you will be prompted to accept the new SSL certificate. The log snippet below shows that my FTP session is being properly established with SSL. My communication with the server is now secure and protected. Here is a more detailed walkthrough of configuring FTP over SSL on IIS 8.


Configuring IIS 8 FTP User Isolation

When IIS 7 was released, the FTP service had been completely redesigned from the ground up with security in mind. This was a welcome change indeed from IIS 6. In addition to supporting FTP over SSL, it introduced FTP User Isolation. Multiple users on the same FTP site could be separated, regardless of which file path they were being logged into, without risk of someone traversing up parent paths to other user folders.

The FTP Authorization rules make it easy to grant multiple users or even local groups access to the FTP server. The user isolation is accomplished by creating a virtual directory called LocalUser and then choosing User name directory (disable global virtual directories). The LocalUser virtual directory should point to the FTP root directory, and then you create a separate virtual directory for each FTP user which points to their destination path.

With FTP User Isolation configured, your users will never be able to move up to a parent path beyond their individual root directory. Even if a user were able to correctly guess the username and virtual path of another FTP account on the server, they would not be able to reach it. Due to the confines of the isolation, the FTP session cannot see anything else on the server. In the example below I log in with the local account ftpuser2 and attempt to change the path to /ftpuser1, however that path does not exist for my user and therefore is not accessible. Here is a more detailed walkthrough of configuring FTP User Isolation on IIS 8.

In Summary

IIS 8 on Windows Server 2012 offers the most secure FTP service of any IIS version to date. You have multiple layers of FTP security available by leveraging FTP Logon Attempt Restrictions, FTP over SSL, and FTP User Isolation. Your FTP server will be well protected using these built-in modules. With internet security there is no ‘patch’ for complacency. More security is always better, so implement it when it’s readily available to you. Thanks for reading.


Preventing Automated Attacks with IIS Dynamic IP Restrictions

IIS, Windows Server 2012 · Aug 4, 2014

Another one of the great built-in features of IIS 8 is Dynamic IP Restrictions (DIPR). With a few simple configuration steps you can quickly set limits for blocking IP addresses based on the number of concurrent requests or the frequency of requests over a period of time. With these parameters in place, IIS will block requests unattended, thereby making your server more secure.

Before DIPR was available, on IIS 7 you could manually block 1 IP or a range of IPs easily in the IP Address and Domain Restrictions module. However, this could be a time consuming task if your server was under attack. Using a tool like Log Parser to examine the site’s logs you could identify IPs with suspicious activity, but then you still had to manually enter Deny Rules. Determined hackers will use a variety of IPs from proxy servers, so by the time you’ve blocked a handful, a new range could be starting up. DIPR was released out-of-band for IIS 7 and IIS 7.5, so you can leverage this great security tool on those web servers as well. In this walkthrough I cover how to configure Dynamic IP Restrictions and even show a test in action.


Installing Dynamic IP Restrictions

Open the Server Manager and go to the Web Server role. Under Security, ensure that IP and Domain Restrictions is installed.

IP Address and Domain Restrictions in IIS Manager

Open IIS Manager and click on IP Address and Domain Restrictions.

From this window you can either Add Allow Entry rules or Add Deny Entry rules. These rules are for manually blocking (or allowing) one IP address or an IP address range. You have to be careful when blocking an IP range because you could inadvertently block legitimate traffic. Click on Edit Dynamic Restriction Settings to set the dynamic thresholds for blocking IP addresses.

Click Edit Feature Settings to set the Deny Action Type. In this example I’ve set Forbidden, so blocked requests will receive an HTTP 403 status error. These errors will also be recorded in the site’s log for us to review later.

On the Dynamic IP Restriction Settings screen you can choose the maximum number of concurrent requests to block. You can also deny IP addresses based on the frequency of requests over a period of time.

As always, depending on the volume of your web site’s traffic, you should test these settings to ensure that legitimate traffic does not get blocked.


Testing Dynamic IP Address Blocking

I didn’t have a real security incident available for testing the DIPR module, so I did the next best thing. Using Fiddler, the free debugging tool from Telerik, and StressStimulus, a free load testing plugin from StimulusTechnology, I hammered my test virtual server for a few minutes and got the desired results. With Fiddler open you will see the StressStimulus module. From here you can record your test case or open an existing test case, as well as edit the test case parameters.


Test Results

StressStimulus gives you multiple detailed charts to review to gauge the performance of your site and identify potential areas of weakness. For my test I chose to hit the wp-login.php page on my test WordPress site with 3 concurrent requests and 100 iterations. The test completed within a few minutes.


Visiting the test page from the server running StressStimulus I get the expected result. It’s blocked by a 403 error. The full description of this code is 403.502 – Forbidden: Too many requests from the same client IP; Dynamic IP.


Using the Log Parser query below to analyze the site log, I can see that 331 requests were blocked with a 403.502 status code.
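A sketch of such a query; the log file name is an example:

logparser.exe -i:iisw3c "SELECT sc-status, sc-substatus, COUNT(*) AS Total FROM w3svc.log GROUP BY sc-status, sc-substatus ORDER BY COUNT(*) DESC" -rtp:-1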


Further examination of the log with Log Parser shows the full breakdown of the requests blocked with a 403 status.

SELECT TOP 100
STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, sc-status, sc-substatus,
EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
COUNT(*) AS TotalHits, c-ip
FROM w3svc.log TO top-403-ip-requests
WHERE sc-status=403
GROUP BY cs-uri-stem, sc-status, sc-substatus, c-ip
ORDER BY TotalHits DESC



Summary

The Dynamic IP Restrictions module is available with IIS 8 as well as IIS 7 and IIS 7.5. It is a powerful tool to block automated attacks on your site and requires minimal configuration and maintenance. Thanks for reading.


Jul 27, 2014

The other day I was troubleshooting 100% CPU utilization on a SQL Server 2008 database server. The server hosted 100 or so databases of varying sizes, none larger than a few hundred MB, and each database had a corresponding web site on a separate web server. Since the server hosted quite a few databases, the high CPU needed to be resolved quickly because it was causing issues for everyone. High CPU on a database server can often be symptomatic of issues occurring outside the server. In this case the real issue was in fact being caused by a SQL Injection attack on a web server.

Top Queries by Total CPU Time

The knee-jerk reaction when experiencing high CPU may be to stop it immediately, either by restarting services or recycling app pools, however letting it run temporarily will help you to isolate the cause. SQL Server 2008 has some great built-in reports to help track down CPU utilization. On this occasion I used the Top Queries by Total CPU Time report. You can get to this report by right clicking on the server name in SQL Server Management Studio and then selecting Reports.

The Top Queries by Total CPU Time report will take a few minutes to run, however once it completes it provides a wealth of information. You’ll get a Top 10 report clearly showing which queries and databases are consuming the most CPU on the server at that moment. Using this report I was able to see that one of the databases on the server had 4 different queries running that were contributing to the high CPU. Now I could focus my attention on this 1 problematic database and hopefully resolve the high CPU.

SQL Profiler and Database Tuning Advisor

Now that I knew which database was causing the problems, I fired up SQL Profiler for just a few minutes. I wanted to get a better understanding of the activity that was occurring within the database. Looking at the high number of Reads coming from the app named “Internet Information Services”, I was starting to realize that web site activity was hitting the database pretty hard. I could also see plaintext data being inserted into the database, and it was clearly spam.

Before I turned my attention to the web site, however, I wanted to see if there could be any performance improvement using the Database Engine Tuning Advisor, since I had the valuable profiler trace data. The DTA will analyze the database activity and provide a SQL script with optimizations using indexes, partitioning, and indexed views. Usually with DTA I’ll see a 5-10% performance improvement. I was excited to see a 97% improvement!


Preventing SQL Injection with IIS Request Filtering

After I applied the optimization script from the Database Engine Tuning Advisor, the CPU utilization on the database server improved considerably. However, I knew the web site was experiencing suspicious activity, so I used Log Parser to get some reports from the site’s traffic log. Using the query below I could see the most frequently used query string values, and it was obvious the site was experiencing a SQL Injection attack.

logparser.exe -i:iisw3c "select top 20 count(*), cs-uri-query from ex140702.log group by cs-uri-query order by count(*) desc" -rtp:-1 > file.txt


With attacks like this, a natural inclination is to start blocking IP addresses. Unfortunately, sophisticated attacks will use a variety of IP addresses, so as soon as you block a few addresses, malicious requests from new ones will take over. The best solution is to block the malicious requests with Request Filtering, so I quickly added a few rules to block keywords I had seen in my Log Parser reports.

Implementing the IIS Request Filtering rules stymied the SQL Injection attack. Using the Log Parser query below I could see the HTTP status codes of all the requests hitting the site with the new rules in place.

SELECT STRCAT(TO_STRING(sc-status), STRCAT('.', TO_STRING(sc-substatus))) AS Status, COUNT(*) AS Total
FROM w3svc.log TO TopStatusCodes.txt
GROUP BY Status ORDER BY Total DESC


Request Filtering uses the HTTP substatus 404.18 when a query string sequence is denied. Looking at the Log Parser report below, you can see that 50,039 requests were blocked by the new Request Filtering rules.

An Ounce of Prevention…

The web site that had been attacked hosted free cooking recipes and allowed visitors to submit their own recipes. Unfortunately, the owner’s goodwill was easily exploited because there was no form field validation on the site’s submission page, and new recipes were automatically displayed on the site without being approved. This is a dangerous site design and should never have been deployed without basic security measures in place.

I did a quick select count(*) from the recipe table in the database and was amused by all the delicious recipes I found.

In Summary

SQL Server 2008 has several built-in reports, like Top Queries by Total CPU Time, to help investigate high CPU utilization. Running SQL Profiler will provide detailed analysis of database activity. Running the profiler output through the Database Tuning Advisor can yield significant performance improvements for the database. IIS Request Filtering is a powerful tool to block SQL Injection attacks against a web site. However, SQL Injection can be easily mitigated using basic data validation. Thanks for reading.


Jun 30, 2014

One of the many great new features of IIS 8 on Windows Server 2012 is Server Name Indication (SNI). SNI is a TLS extension that includes the hostname or virtual domain name during SSL negotiation. The reasoning behind this was to improve SSL scalability and minimize the need for dedicated IP addresses due to IPv4 scarcity. This means that you can now host multiple SSL certificates on a web server using only 1 IP address. With previous versions of IIS you were forced to bind SSL certificates to unique IP addresses, and the only workaround for hosting multiple SSL certificates with 1 IP address was to use a wildcard certificate. In this walkthrough I will show how to host multiple certificates using SNI.

Web Hosting Certificate Store

A new certificate store called the Web Hosting store was created for Windows Server 2012. It is similar to the Personal store, however it has been designed to support a significantly higher number of certificates with only a minimal performance impact on the server. On Windows Server 2012, certificates are now loaded on-demand in memory. Previously, on older versions of Windows Server, all certificates on a server would be loaded by just one GET request. The end result was high memory usage and limited scalability.

Hosting Multiple Sites Using 1 IP Address

On my test server I have 3 sites configured using host headers and 1 IP address.

I have already imported 3 SSL certificates, and you can see they are in the Web Hosting certificate store. Installing the certificates is straightforward, but I am not going to cover that in this blog post. However, if you need help with installing certificates, then here are the steps to follow.

Enabling Server Name Indication

Server Name Indication (SNI) is enabled in the site binding properties by clicking the Require Server Name Indication checkbox. Click OK to save the settings and then close the Site Bindings window.

Now I have added an SSL certificate for each site and enabled Server Name Indication on each site’s SSL binding. The certificates have been correctly added to the Web Hosting store to ensure scalability. Looking at IIS Manager below, we can see that the https binding of each site is sharing the same IP address. With previous versions of IIS this would not have been possible because the other 2 sites would have automatically been stopped.

Using an elevated command window you can see the new SSL binding type by running the following command:
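The built-in way to list SSL certificate bindings, and almost certainly the command meant here, is:

netsh http show sslcert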

The picture below shows the SSL bindings for the 3 sites, and the hostname is now included with port 443. Running this command on Windows Server 2008 you would only see the IP address and 443.

In Summary

Windows Server 2012 and IIS 8 offer many new features and performance improvements for hosting sites. Server Name Indication (SNI) offers impressive SSL scalability with the addition of the Web Hosting certificate store. Now you can host multiple unique certificates on multiple sites using only 1 IP address. Implementing SNI offers greater site density on web servers with only a minimal memory impact. Thanks for reading.
