Feb 06, 2016
 

When it comes to securing IIS web applications on Windows Server 2008 R2 or Windows Server 2012 R2, one typically thinks of firewalls, access control lists (ACLs), and using an application pool identity. These measures will protect a site from external threats. However, the .NET configuration files that typically store username and password data are plain text files, so anyone with admin access to the server can read their contents. The only way to keep prying eyes from seeing app.config or web.config passwords is to encrypt them. Fortunately, encrypting the connectionStrings section of a config file is straightforward, and you can encrypt other configuration sections as well. Encrypting and decrypting config files can be performed programmatically using .NET Framework methods or with the ASP.NET IIS Registration tool (aspnet_regiis.exe). With the encryption commands you can target either the path to the config file or an IIS application name. In my examples I will be encrypting and decrypting the connectionStrings section with the .NET Framework 4.
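Although I'll use the command-line tool below, the programmatic route is worth a quick look. Here is a minimal sketch using the configuration API; the virtual path "/" (the site root) is an assumption:

using System.Configuration;
using System.Web.Configuration;

// Open the web.config for the site root ("/" is an assumed virtual path).
Configuration config = WebConfigurationManager.OpenWebConfiguration("/");
ConfigurationSection section = config.GetSection("connectionStrings");

if (section != null && !section.SectionInformation.IsProtected)
{
    // Encrypt with the built-in DPAPI provider; swap in
    // "RsaProtectedConfigurationProvider" to use RSA instead.
    section.SectionInformation.ProtectSection("DataProtectionConfigurationProvider");
    config.Save();
}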

 

Encrypting Configuration Sections

You will find aspnet_regiis.exe in the C:\Windows\Microsoft.NET\Framework\version\ folder. With the .NET Framework you can use the built-in protected configuration providers RSAProtectedConfigurationProvider or DPAPIProtectedConfigurationProvider to encrypt and decrypt sections of your config files, or you can create your own provider. The general syntax to encrypt a config section is as follows:

aspnet_regiis.exe -pef section physical_directory -prov provider
or
aspnet_regiis.exe -pe section -app virtual_directory -prov provider

It is important to note that when you use aspnet_regiis.exe to encrypt or decrypt config files and specify a physical path (rather than a web app name), the command is hardcoded to look for a file named "web.config". If you are trying to run the command against an app.config, you will first need to rename that file to web.config, run the command, and then rename it back. For this reason I find it easier to create one .bat file hardcoded with the necessary syntax to encrypt my configs and a second .bat file to decrypt them, as sketched below.
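A minimal sketch of the encrypt side of such a .bat file, assuming the .NET 4 Framework folder and my example site path below:

:: encrypt.bat - hypothetical helper; adjust paths for your server
cd /d C:\Windows\Microsoft.NET\Framework\v4.0.30319
aspnet_regiis.exe -pef "connectionStrings" "c:\domains\domain1.com" -prov "DataProtectionConfigurationProvider"

The decrypt .bat is identical except it uses the -pdf command covered later in this post.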

On my Windows 2012 R2 server I have set up an IIS 8.5 site called domain1.com. For the example below I am using the built-in DPAPI provider to encrypt a web.config in c:\domains\domain1.com. The encrypted web.config is shown below.

aspnet_regiis.exe -pef "connectionStrings" "c:\domains\domain1.com" 
-prov "DataProtectionConfigurationProvider"

 

[Image: encrypted web.config]

 

Decrypting Configuration Sections

Following the steps above, we have now encrypted the connectionStrings section of the web.config for domain1.com. Naturally, we also need to be able to decrypt it. When decrypting a config section you do not need to specify the protected configuration provider. Just as when encrypting, we can target either a file path or an IIS web application name. Here is the syntax to decrypt a configuration file section:

aspnet_regiis.exe -pdf section physical_directory
or
aspnet_regiis.exe -pd section -app virtual_directory

In my example below I decrypt the connectionStrings section of my web.config in c:\domains\domain1.com. As a reminder, when using the -pdf option we do not need to specify "web.config" in the syntax.

aspnet_regiis.exe –pdf "connectionStrings" "c:\domains\domain1.com" 

[Image: decrypting the connectionStrings section]

After running the above command, the connectionStrings section of the web.config is decrypted as shown below. Once I am done editing my connection string I will follow best practices and encrypt the connectionStrings section again.

[Image: decrypted web.config]
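For what it's worth, the programmatic equivalent of the decrypt command is a one-liner on the same SectionInformation object used in the earlier sketch (same assumed virtual path):

Configuration config = WebConfigurationManager.OpenWebConfiguration("/");
ConfigurationSection section = config.GetSection("connectionStrings");

if (section != null && section.SectionInformation.IsProtected)
{
    section.SectionInformation.UnprotectSection(); // decrypts the section in place
    config.Save();
}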

Failed to decrypt using provider error

It is important to note that the encryption key is stored locally on the server. If you need to move an encrypted config file to another server, you must either decrypt the config file before moving it or export the key and install it on the new server. If you move an encrypted config file to a server without its encryption key, you will receive an error like the one below: Failed to decrypt using provider … Key not valid for use in specified state.

[Image: Failed to decrypt using provider error]

 

Creating an RSA Key Container

Fortunately, moving encryption keys between servers is straightforward. We can create our own RSA key container, export it to a file, and then move it from server to server as needed. This is ideal for multi-node web farm solutions where applications are deployed across multiple servers. Use the following syntax to create an RSA key container, being sure to include the -exp option so the container can be exported:

aspnet_regiis -pc "MyFarmKey" -exp

[Image: creating the RSA key container]

 

Adding the configProtectedData section to your config

Next you will add the following configProtectedData section to your web.config.

<configProtectedData>\r\n   <providers>\r\n      <add name="MyFarmCrypto" \r\n           type="System.Configuration.RsaProtectedConfigurationProvider, 
System.Configuration, Version=4.0.0.0,\r\n Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a\r\n processorArchitecture=MSIL"\r\n keyContainerName="MyFarmKey" \r\n useMachineContainer="true" />\r\n </providers>\r\n</configProtectedData>

Below is how my web.config looks now that I have added the configProtectedData section.

[Image: web.config with the configProtectedData section]

 


Assigning permissions to the RSA key container


Before the new RSA key container is ready to be used by my site domain1.com, I need to give the application pool identity permission to access it. On the server in my example, domain1.com runs under ApplicationPoolIdentity, so I use the following syntax to grant that account access to the new RSA key container:


aspnet_regiis -pa "MyFarmKey" "iis apppool\domain1.com"


[Image: granting the application pool identity access to the key container]


Encrypting a config with an RSA key container

After adding the configProtectedData section to the web.config and granting domain1.com's application pool identity permission to the RSA key container, I'll run the encryption command again using the new "MyFarmCrypto" provider:

aspnet_regiis.exe -pef "connectionStrings" "c:\domains\domain1.com"
-prov "MyFarmCrypto"

[Image: encryption succeeded]

In the image above we see the encryption succeeded. Note that in the command syntax we specify the configProtectionProvider name, MyFarmCrypto, and not the RSA key container name; if you mix those up you'll get an error. Below is how domain1.com's web.config looks after being encrypted with the new RSA key container.

[Image: web.config encrypted with the RSA key container]
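One detail worth calling out: application code that reads the section never changes, because ASP.NET decrypts protected sections transparently at runtime. A minimal sketch, where "MyDb" is a hypothetical connection string name:

using System.Configuration;

// The runtime hands back plain text even though web.config stores cipher text.
string connStr = ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;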

Exporting and Importing an RSA Key Container


Now that we’ve successfully created and tested our new RSA key container, we need to export it to a file. Once it’s saved to a file we can copy it to other servers for installation as needed. It is important to use the -pri option so the private key is included in the export file; otherwise you will not be able to decrypt information on the next server.


aspnet_regiis -px "MyFarmKey" "c:\MyFarmKey.xml" -pri


[Image: exporting the RSA key container]


Having logged into another server and copied the MyFarmKey.xml file to c:\temp, I import the key file using the following command:


aspnet_regiis -pi "MyFarmKey" "c:\temp\MyFarmKey.xml"


[Image: importing the RSA key container]


For security purposes, after importing the key on a new server, delete the key .xml file from the server so that no one unauthorized can use it to decrypt data. This of course assumes you have backed the file up somewhere safe off the server.


To permanently delete the RSA key container from a server you should run this command:


aspnet_regiis -pz "MyFarmKey"


Summary


The .NET Framework offers powerful encryption tools to secure sensitive information like usernames and passwords in application connection strings. When encrypting a config file on a server, the private key used to decrypt the information is local to that server. Creating an RSA key container enables you to share the same key container across multiple servers. Thanks for reading.

Peter Viola

Creative, customer focused, results oriented, Senior Web Systems Engineer who enjoys providing the highest level of customer service supporting complex Windows hosting solutions. MCITP, MCSA, MCTS


Aug 28, 2014
 

Depending on how many sites your Windows web server is hosting, maintaining IIS logs can be a challenge. IIS logs provide valuable insight into the traffic your sites are experiencing, as well as detailed SEO metrics and performance data. A typical web server has just enough free disk space for future growth and is ultimately limited by the capacity of its drives, and if left unmonitored, IIS logs can quickly fill any remaining space. There are a few 3rd party tools that are good at compressing log files when they all live under one parent folder, but when the log files are in different locations, such as on a WebsitePanel server I support, an alternative solution is needed. In this walkthrough I will demonstrate how I solved this challenge using .NET and GZipStream.

What about Windows Folder Compression?

Enabling Windows folder compression should always be the first step when freeing up space used by IIS logs. I always redirect the site logs to a folder at the root level of the server so it’s easy to find them and monitor their growth. In the example below I saved 8.3 GB of disk space without even having to zip any files. However, Windows folder compression was not going to work for my WebsitePanel server.

[Image: disk space savings from Windows folder compression]

Finding WebsitePanel IIS Logs

WebsitePanel is a wonderful Windows control panel for multi-tenant web hosting, and the best part is that it’s free. If you haven’t discovered this incredible application yet, check it out. Once installed, every aspect of site creation and maintenance is completely automated. As shown in the picture below, customer sites are usually deployed with a directory structure like c:\HostingSpaces\customername\sitename, and the IIS logs are always stored further down the tree in …\sitename\logs\W3SVCXX. As one can imagine, on a web server with hundreds of sites it would be quite a challenge to track down and manually apply Windows compression to all the W3SVC folders, and the problem only grows as new sites are added to the server.

[Image: WebsitePanel hosting spaces directory structure]

Using Directory.EnumerateFiles and GZipStream

I knew I was going to do some coding to solve my challenge, but I also wanted to keep the solution easy to use. In years past I would have considered VBScript folder recursion and the PKZip command line tool, but I wanted something more contemporary, so I used Visual Studio 2013 and created a Console Application.

[Image: creating the Console Application project in Visual Studio 2013]

The GZipStream class, introduced in .NET 2.0, would handle the necessary file compression. Since I would only be compressing individual files, I didn’t have to deal with a separate library such as DotNetZip.

public static void Compress(FileInfo fileToCompress)
{
    using (FileStream originalFileStream = fileToCompress.OpenRead())
    {
        // skip hidden files and files that are already compressed
        if ((File.GetAttributes(fileToCompress.FullName) &
            FileAttributes.Hidden) != FileAttributes.Hidden && fileToCompress.Extension != ".gz")
        {
            using (FileStream compressedFileStream = File.Create(fileToCompress.FullName + ".gz"))
            {
                using (GZipStream compressionStream = new GZipStream(compressedFileStream, CompressionMode.Compress))
                {
                    originalFileStream.CopyTo(compressionStream);
                    Console.WriteLine("Compressed {0} from {1} to {2} bytes.",
                        fileToCompress.Name, fileToCompress.Length.ToString(), compressedFileStream.Length.ToString());
                }
            }
        }
    }
}


The Directory.EnumerateFiles method, introduced in .NET 4, would find all the IIS log folders under the WebsitePanel hosting spaces directory tree. The searchOption overload lets me not only search for .log files but also limit the search to folders matching "W3SVC". In the snippet below I loop through the search results, compress any log file older than 1 day, and delete the original file. The last step is to write the name of the file that was just zipped to an audit log.

public static void zipFiles()
{
    try
    {
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log",
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            if (fi.LastWriteTime < DateTime.Now.AddDays(-1))
            {
                Compress(fi);
                fi.Delete();
                activityLog("Zipping: " + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}


I only needed 30 days' worth of log files left on the server. In the snippet below I delete any zipped log files matching my date criteria and then write the activity to the audit log.


public static void deleteFiles()
{
    try
    {
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.gz",
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            if (fi.LastWriteTime < DateTime.Now.AddDays(-daysToSave))
            {
                fi.Delete();
                activityLog("Deleting:" + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}



Using an XML Config File

To help keep my application flexible I created a simple XML file to store the path to the WebsitePanel hosting spaces folder and the number of days of logs I wanted to keep on the server.

<?xml version="1.0" encoding="utf-8"?>\r\n<MyConfig>\r\n  <HomeDirectory>D:\HostingSpaces</HomeDirectory>\r\n  <DaysToSave>30</DaysToSave>\r\n</MyConfig>


Loading these values into my program is done by parsing the XML data using XmlDocument.

public static void readConfig()
{
    string path = System.IO.Path.GetFullPath(@"logCleanConfig.xml");
    XmlDocument xmlDoc = new XmlDocument();
    xmlDoc.Load(path);

    XmlNodeList HomeDirectory = xmlDoc.GetElementsByTagName("HomeDirectory");
    homeDirectory = HomeDirectory[0].InnerXml;

    XmlNodeList DaysToSave = xmlDoc.GetElementsByTagName("DaysToSave");
    daysToSave = Convert.ToInt32(DaysToSave[0].InnerXml);
}
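For completeness, here is a rough sketch of how Main ties the pieces together; the field names match the ones referenced in the snippets above:

public static string homeDirectory;
public static int daysToSave;

static void Main(string[] args)
{
    readConfig();   // load HomeDirectory and DaysToSave from logCleanConfig.xml
    zipFiles();     // compress .log files older than one day, delete originals
    deleteFiles();  // purge .gz archives older than daysToSave days
}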



Running the Program


Running the program from a command line I can see the progress as files are compressed. Afterwards I set up a scheduled task so it will run every day to compress new IIS log files and purge the ones older than 30 days.

[Image: console output showing files being compressed]

A few days later I came across several sites on the server using CMS products with their own application logs. These log files were not being cleaned up by the site owners and were taking up free space on the server, so I decided to archive them as well. Since they were stored within each site in a subfolder called "Logs", tweaking the searchOption parameters of the Directory.EnumerateFiles method would catch them. With a quick rebuild, the program would now compress the CMS log files as well as the IIS log files.

foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log",
  SearchOption.AllDirectories).Where(d => d.Contains("W3SVC") || d.Contains("Logs")).ToArray())
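As for the daily scheduled task mentioned above, it can be created from the command line with schtasks; a sketch, where the task name and exe path are placeholders:

schtasks /create /tn "IIS Log Cleanup" /tr "C:\Tools\LogClean.exe" /sc daily /st 02:00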




In Summary


IIS logs contain a wealth of data about web site traffic and performance. Left unchecked, over time the logs will accumulate and take up disk space on the server. Windows folder compression is a simple solution to reclaim valuable disk space, and if your log files are in different locations, leveraging .NET with Directory.EnumerateFiles and GZipStream will do the trick. Thanks for reading.


Jul 22, 2013
 

Redirecting visitors on your site from one page to another is handled using either a 301 redirect or a 302 redirect. The numbers 301 and 302 refer to the HTTP status code returned by the web server to your browser. The two may seem similar but they are quite different: a 301 indicates a permanent change and a 302 a temporary one. This difference is important to understand and will impact how search engines see content changes on your site. There are a number of ways to implement a 301 redirect on your web site; some are easier than others to configure, and your options depend on the version of IIS you are using. Here’s the story of how I recently had to use the global.asax and Application_BeginRequest to do a 301 redirect.

Unforeseen consequences of revoking a certificate

The other day I was helping someone who had revoked their site’s SSL certificate. They were no longer going to use SSL on their site, so they figured revoking the certificate was a logical decision. Unfortunately, what they had not realized was that the https:// url of their site had been indexed by most search engines and they were getting a lot of organic traffic through that url. By revoking the certificate, many of their visitors were now seeing dire warnings in their browsers, like the picture below, instead of reaching the site. This was not good.

[Image: browser warning for a revoked certificate]

 

Not being a technical person, they figured that simply removing the certificate from the site bindings would solve the problem. It did stop the browser security warnings, but it just traded one problem for another: people were still accessing the https:// url of the site, and instead of a security warning they now saw an error. Using Fiddler you can see that a 502 error is generated when you try to access a site over https without a binding for it.

[Image: Fiddler showing the 502 error]

 

The need for a redirect

We needed to take visitors accessing the https url of the site and send them to the http url. This is the perfect application for a 301 or 302 redirect. However, here’s where things got a little more complicated. Ordinarily I would just use Url Rewrite, or even a Url Rewrite Map, to handle the 301 redirects, but the site was hosted on IIS 6 so Url Rewrite wasn’t available. Furthermore, we only needed to redirect incoming requests using SSL; the site content was fine, so page-level redirects such as a meta tag refresh weren’t going to help in this case either.

Since the site was using .NET 2.0, I decided to use the Application_BeginRequest event in the global.asax. This is the first event in the HTTP pipeline chain of execution when ASP.NET responds to a request. In this event I created a conditional statement testing the HTTPS server variable to see whether the request was made over SSL; if it was, we redirect it to the http url of the site, as sketched below. Bear in mind, however, that Response.Redirect’s default status is 302, a temporary redirect. I needed a 301 permanent redirect so that search engines would drop the https url from their index, so I had to add the extra line Response.StatusCode = 301.
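Here is a minimal sketch of that handler; the destination hostname is a placeholder:

// global.asax
protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Only touch requests that arrived over SSL.
    if (Request.ServerVariables["HTTPS"] == "on")
    {
        // Redirect without ending the response, then override the default 302.
        Response.Redirect("http://www.mydomain.com" + Request.RawUrl, false);
        Response.StatusCode = 301;
        Response.End();
    }
}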


At this point I was pretty satisfied I had solved my friend’s problem. I had set up a test site with an SSL certificate and the redirect worked great. Unfortunately, when I set it up on the live site (with the revoked certificate) nothing happened. It turned out that because the site’s certificate had been revoked, browsers weren’t actually loading the site, which in turn meant the redirect never ran. There was only one way to solve this last piece of the puzzle: putting a valid SSL certificate back in place. So I created a Certificate Signing Request for my friend’s site, and within minutes they had a new $9 RapidSSL certificate from Namecheap.com. Once the new certificate was bound to the site, the https page requests started working again and our custom 301 redirect in the global.asax was able to do its job.

 

Testing a 301 Redirect

Because I needed the redirect to be permanent, I wanted to be sure it was really returning a 301 status. Checking the web site’s www logs would have confirmed this, but that’s a bit cumbersome when a tool like Fiddler, a free web debugging tool, makes it so easy to check. As one can see in the pictures below, the redirect was in fact returning a 301 status code.

[Image: Fiddler showing the 301 response]

Here you can see the raw header and body of the request.

[Image: raw header and body of the redirect response]

If you need to remove a url from a search engine’s index you can contact them directly:

https://support.google.com/webmasters/answer/164734?hl=en

http://www.bing.com/webmaster/help/how-can-i-remove-a-url-or-page-from-the-bing-index-37c07477

Please note this is not a fast process, and using a 301 permanent redirect is the best solution.

Summary

Sending traffic to a different location on your site can be accomplished with either a 301 permanent redirect or a 302 temporary redirect, and choosing the right one will ensure your search engine ranking isn’t impacted. There are many techniques to implement a redirect, such as Url Rewrite, meta tags, or Response.Redirect. If you’re going to revoke an SSL certificate or remove one from your site, first be absolutely sure there isn’t traffic still using it. Thanks for reading.


Dec 14, 2012
 

Using System.Net.Mail to send email messages from your web site makes life so easy. In the old days of Classic ASP you often had to rely on 3rd party components such as AspMail from serverobjects.com or AspEmail from persits.com. While they were very capable products, and are still widely used today, they added an additional layer of complexity to your programming. If you ever had to move a site from one server to another, there was always a risk the components were not in place, which would cause problems for your users.

With System.Net.Mail you know that as long as .NET is installed on the server hosting your site, your code will always work, no matter how many times you move your web site or change hosting providers. In its simplest form, the snippet below is the bare minimum of code you need to send a plain text message from your ASP.NET application.

//create the mail message
MailMessage mail = new MailMessage();

//set the addresses
mail.From = new MailAddress("me@mycompany.com");
mail.To.Add("you@yourcompany.com");

//set the content
mail.Subject = "This is an email";
mail.Body = "this is a sample body";

//send the message
SmtpClient smtp = new SmtpClient("localhost");
smtp.Send(mail);

\r\n\r\nThis works great when you are sending mail using the local SMTP server. However in certain situations you may need to send mail through a remote SMTP server. In most cases that remote server will have quite a bit of security enabled to prevent relaying and blocking spammers so the above code will not be enough for your application to send mail.\r\n\r\nIn this case you will need to send your message by authenticating on the remote server with a username and password. So how does one go about doing that with system.net.mail? Well here’s a bit a code that shows you how to do just that. \r\n\r\n

string strTo = "test@gdomain-y.com";\r\nstring strFrom="test@domain-x.com";\r\nstring strSubject="Mail Test Using SMTP Auth";\r\nstring strBody="This is the body of the message";\r\n\r\nstring userName = "xxx"; //remote mail server username\r\nstring password = "xxx"; //remote mail server pasword\r\n\r\nMailMessage mailObj = new MailMessage(strFrom, strTo, strSubject, strBody);\r\nSmtpClient SMTPServer = new SmtpClient("mail.mydomain.com"); //remote smtp server\r\nSMTPServer.Credentials = new System.Net.NetworkCredential(userName, password);\r\ntry \r\n{ \r\nSMTPServer.Send(mailObj); \r\nResponse.Write("Sent!"); \r\n}\r\n \r\ncatch (Exception ex) \r\n{\r\nResponse.Write(ex.ToString()); \r\n}\r\n

For additional examples check out the wonderful resource at http://www.systemnetmail.com. I hope this helps. Thanks for reading.


Sep 30, 2012
 

Email is everywhere. You may not know immediately if your web site is down, but you’ll almost always know if your email isn’t working. The ability to send and receive email 24×7 is critical to the success of any business, which means you need a product that is reliable, scalable, and affordable. Not many mail server products on the market meet that criteria; however, Smartermail by SmarterTools.com is one such product. Year after year SmarterTools has evolved it with more features and better security, and the latest version of Smartermail is their best product to date.
[Image: Smartermail 10 GUI]
However, they inadvertently introduced a small design flaw in the administration GUI: finding disabled domains is a real pain. If your Smartermail deployment has only 5-10 domains, you probably haven’t even noticed this, but if your organization relies on Smartermail to host hundreds or even thousands of mail domains, you are already well aware of this oversight. Of course, one can simply scroll through the domain list looking for the Disabled status, which is clearly marked in red, but that is a daunting task for large deployments. Curiously, in legacy versions of Smartermail you could easily sort your entire domain list by enabled or disabled status.

Because Smartermail stores its domains in a logical fashion, it is easy to programmatically access the config files and determine whether or not a domain is enabled. For example, C:\Smartermail\Domains would contain all the domains of your Smartermail instance. Each mail domain is stored as a subdirectory of this directory, and the individual domain settings are stored in a file called domainconfig.xml. Contained in this file is a node called "isEnabled". As one might expect, if the domain is enabled the value will be True, and if the domain is disabled the value will be False. Here is a snippet of what it looks like.
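A trimmed sketch is below; aside from the isEnabled node itself, the element names here are assumptions, since the full file contains many other settings:

<?xml version="1.0" encoding="utf-8"?>
<!-- hypothetical, trimmed domainconfig.xml; isEnabled is the node of interest -->
<DomainConfig>
  ...
  <isEnabled>False</isEnabled>
  ...
</DomainConfig>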


So here’s where a bit of programming saves the day. Using C# I created a simple console application which reads the Smartermail domains folder, checks the domainconfig.xml of each domain on the server, and then outputs the results to a log file. One complication is that Smartermail may not be installed in the default location, so I am using a simple XML config file for my program which specifies the path to the domains folder along with the path where I want the log file saved.


<?xml version="1.0" encoding="utf-8"?>\r\n<MyConfig>\r\n  <DomainsPath>H:\Smartermail\Domains</DomainsPath>\r\n  <LogPath>H:\Temp\disabled_log.txt</LogPath>\r\n</MyConfig>

Here’s how my program looks:

using System;
using System.IO;
using System.Xml;

namespace checkDisabled
{
    class Program
    {
        public static string strLogPath;
        public static string strDomainsPath;

        static void Main(string[] args)
        {
            string path = System.IO.Path.GetFullPath(@"checkDisabled.xml");
            XmlDocument xmlDoc = new XmlDocument();
            xmlDoc.Load(path);
            XmlNodeList DomainsPath = xmlDoc.GetElementsByTagName("DomainsPath");
            XmlNodeList LogPath = xmlDoc.GetElementsByTagName("LogPath");

            strDomainsPath = DomainsPath[0].InnerXml;
            strLogPath = LogPath[0].InnerXml;

            checkFolders();
        }

        static private void checkFolders() { /* shown below */ }

        static private void checkDisabled(string sDomain, string sPath) { /* shown below */ }

        static private void disabledLog(string strLine) { /* shown below */ }
    }
}

I created a subroutine called "checkFolders" which reads the Smartermail domains folder and then iterates through each subdirectory. To help keep things clean, I use another subroutine to actually read the domainconfig.xml file.

static private void checkFolders()
{
    string[] folders = System.IO.Directory.GetDirectories(strDomainsPath);
    string strDomainName = "";
    string strConfigFile = "";

    Console.WriteLine("Checking " + strDomainsPath + " for disabled domains.");

    foreach (string sDir in folders)
    {
        strConfigFile = sDir + @"\domainConfig.xml";
        strDomainName = sDir.Substring(strDomainsPath.Length + 1);
        if (File.Exists(strConfigFile))
            checkDisabled(strDomainName, strConfigFile);
    }

    Console.WriteLine("Done.");
}

Here is the code I use to read the domainconfig.xml file. If a domain is identified as being disabled, I write a line to the screen as well as the log file.

static private void checkDisabled(string sDomain, string sPath)
{
    XmlDocument xmlDoc = new XmlDocument();
    xmlDoc.Load(sPath);
    XmlNodeList isEnabled = xmlDoc.GetElementsByTagName("isEnabled");

    if (!Convert.ToBoolean(isEnabled[0].InnerXml))
    {
        Console.WriteLine("Disabled: " + sDomain);
        disabledLog(sDomain);
    }
}

Writing the output to the log file is straightforward. I check whether the log file exists, and if it doesn’t, I just create an empty file.

static private void disabledLog(string strLine)
{
    if (!File.Exists(strLogPath))
        using (StreamWriter sw = File.CreateText(strLogPath))
            sw.WriteLine("");

    using (StreamWriter sw = File.AppendText(strLogPath))
        sw.WriteLine(strLine);
}

Running the program from the command line we see the results.


H:\temp>checkdisabled\r\nChecking H:\Smartermail\domains for disabled domains.\r\nDisabled: domain2.net\r\nDisabled: domain4.com\r\nDone.\r\n\r\nH:\temp>

Taking it a step further, we could use Windows Task Scheduler to run the program on a weekly or daily basis, and with only a small tweak, sketched below, we could email the log file so that we never even have to log in to the server to check for disabled domains.
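That emailing tweak might look something like this, reusing System.Net.Mail; the addresses and SMTP server are placeholders:

//hypothetical helper: email the disabled-domains log as an attachment
static private void emailLog()
{
    MailMessage mail = new MailMessage("smartermail@mycompany.com", "admin@mycompany.com");
    mail.Subject = "Disabled Smartermail domains";
    mail.Body = "The latest disabled domains report is attached.";
    mail.Attachments.Add(new Attachment(strLogPath));

    SmtpClient smtp = new SmtpClient("localhost");
    smtp.Send(mail);
}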
