Aug 28, 2014
 

Depending on how many sites your Windows web server is hosting, maintaining IIS logs can be a challenge. IIS logs provide valuable insight into the traffic your sites are experiencing as well as detailed SEO metrics and performance data. A typical web server will have just enough free disk space for future growth needs but ultimately will be limited by the capacity of the drives in the server. If left unmonitored, IIS logs can quickly fill any remaining disk space on the server. There are a few 3rd party tools that are good at compressing log files when they are under one parent folder, but when the log files are in different locations, such as on a WebsitePanel server I support, an alternative solution is needed. In this walkthrough I will demonstrate how I solved this challenge using .NET and GZipStream.

 

What about Windows Folder Compression?

Enabling Windows folder compression should always be the first step to try when freeing up space used by IIS logs. I always redirect the site logs to a folder at the root level of the server so it's easy to find them and monitor their growth. In the example below I saved 8.3 GB of disk space without even having to zip any files. However, Windows folder compression was not going to work for my WebsitePanel server.

[Screenshot: folder properties showing 8.3 GB of disk space saved by Windows compression]
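For a standalone log folder like this one, compression doesn't have to be enabled through Explorer; the built-in compact.exe utility can turn it on for a folder and everything beneath it (the path below is an example):

compact /c /s:D:\IISLogs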

 

Finding WebsitePanel IIS Logs

WebsitePanel is a wonderful Windows control panel for multi-tenant web hosting and the best part is that it's free. If you haven't discovered this incredible application yet, check it out. Once installed, every aspect of site creation and maintenance is completely automated. As shown in the picture below, customer sites are usually deployed with a directory structure like c:\HostingSpaces\customername\sitename. The IIS logs are always stored further down the tree in …\sitename\logs\W3SVCXX. As one can imagine, on a web server with hundreds of sites it would be quite a challenge to track down and manually apply Windows compression to all the W3SVC folders. This would be further complicated as new sites are added to the server.

[Screenshot: WebsitePanel hosting spaces directory tree showing the logs\W3SVCXX folders]

 

Using Directory.EnumerateFiles and GZipStream

I knew I was going to do some coding to solve my challenge, but I also wanted to keep the solution easy to use. In years past I would have considered VBScript folder recursion and executing the PKZip command line tool, but I wanted something more contemporary. So I used Visual Studio 2013 and created a Console Application.


 

The GZipStream class, which was introduced in .NET 2.0, would handle the necessary file compression. Since I would only be compressing individual files, I didn't have to deal with a separate library such as DotNetZip.

public static void Compress(FileInfo fileToCompress)
{
    using (FileStream originalFileStream = fileToCompress.OpenRead())
    {
        // skip hidden files and files that are already gzipped
        if ((File.GetAttributes(fileToCompress.FullName) &
            FileAttributes.Hidden) != FileAttributes.Hidden && fileToCompress.Extension != ".gz")
        {
            using (FileStream compressedFileStream = File.Create(fileToCompress.FullName + ".gz"))
            {
                // leaveOpen: true so the file stream can still report its length
                // after the GZipStream flushes its final bytes on dispose
                using (GZipStream compressionStream = new GZipStream(compressedFileStream, CompressionMode.Compress, true))
                {
                    originalFileStream.CopyTo(compressionStream);
                }
                Console.WriteLine("Compressed {0} from {1} to {2} bytes.",
                    fileToCompress.Name, fileToCompress.Length, compressedFileStream.Length);
            }
        }
    }
}

The Directory.EnumerateFiles method, introduced in .NET 4, would find all the IIS log files in the WebsitePanel hosting spaces directory tree. The SearchOption.AllDirectories overload enables recursive searching for .log files, and a LINQ Where clause limits the results to paths containing "W3SVC". In the snippet below I loop through the search results, compress any log file older than 1 day, and delete the original file. The last step is to write the name of the file that was just zipped to an audit log.

public static void zipFiles()
{
    try
    {
        // find every .log file under the hosting spaces tree whose path contains "W3SVC"
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log", 
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            // only archive logs that haven't been written to for at least a day
            if (fi.LastWriteTime < DateTime.Now.AddDays(-1))
            {
                Compress(fi);
                fi.Delete();
                activityLog("Zipping: " + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}

I only needed 30 days' worth of log files to be left on the server. In the snippet below I delete any zipped log files matching my date criteria and then write the activity to the audit log.

public static void deleteFiles()
{
    try
    {
        // this time match the .gz archives created by the compression pass
        foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.gz", 
            SearchOption.AllDirectories).Where(d => d.Contains("W3SVC")).ToArray())
        {
            FileInfo fi = new FileInfo(file);
            // purge archives older than the configured retention window
            if (fi.LastWriteTime < DateTime.Now.AddDays(-daysToSave))
            {
                fi.Delete();
                activityLog("Deleting: " + file);
            }
        }
    }
    catch (UnauthorizedAccessException UAEx)
    {
        activityLog(UAEx.Message);
    }
    catch (PathTooLongException PathEx)
    {
        activityLog(PathEx.Message);
    }
}

Using an XML Config File

To help keep my application flexible I created a simple XML file to store the path to the WebsitePanel hosting spaces folder and the number of days of logs I wanted to keep on the server.

<?xml version="1.0" encoding="utf-8"?>
<MyConfig>
  <HomeDirectory>D:\HostingSpaces</HomeDirectory>
  <DaysToSave>30</DaysToSave>
</MyConfig>

 

Loading these values into my program is done by parsing the XML data using XmlDocument.

public static void readConfig()
{
    // the config file is expected in the program's working directory
    string path = System.IO.Path.GetFullPath(@"logCleanConfig.xml");
    XmlDocument xmlDoc = new XmlDocument();
    xmlDoc.Load(path);

    // path to the WebsitePanel hosting spaces folder
    XmlNodeList HomeDirectory = xmlDoc.GetElementsByTagName("HomeDirectory");
    homeDirectory = HomeDirectory[0].InnerXml;

    // number of days of zipped logs to keep on the server
    XmlNodeList DaysToSave = xmlDoc.GetElementsByTagName("DaysToSave");
    daysToSave = Convert.ToInt32(DaysToSave[0].InnerXml);
}

 

Running the Program

Running the program from a command line I can see the progress as files are compressed. Afterwards I set up a scheduled task so it will run every day to compress new IIS log files and purge the ones older than 30 days.

[Screenshot: console output as the log files are compressed]
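Creating that daily task takes one line from an elevated command prompt (the task name, path, and start time below are examples):

schtasks /create /tn "IIS Log Cleanup" /tr "C:\Tools\logClean.exe" /sc daily /st 02:00

Since the program looks for logCleanConfig.xml in its working directory, it's worth setting the task's "Start in" folder to the folder holding the executable.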

 

A few days later I came across several sites on the server using CMS products with their own application logs. These log files were not being cleaned up by the site owners and were taking up free space on the server, so I decided to archive them as well. Since they were being stored within each site in a subfolder called "Logs", tweaking the filter on the Directory.EnumerateFiles results would catch them. With a quick rebuild of the program, it would now compress the CMS log files as well as the IIS log files.

// "Logs" is matched case-sensitively, so this picks up the CMS folders in addition to the W3SVC folders
foreach (string file in Directory.EnumerateFiles(homeDirectory, "*.log",
  SearchOption.AllDirectories).Where(d => d.Contains("W3SVC") || d.Contains("Logs")).ToArray())

In Summary

IIS logs contain a wealth of data about web site traffic and performance. Left unchecked, over time the logs will accumulate and take up disk space on the server. Using Windows folder compression is a simple solution to reclaim valuable disk space. If your log files are in different locations, then leveraging the power of .NET with Directory.EnumerateFiles and GZipStream will do the trick. Thanks for reading.

Jul 22, 2013
 

Redirecting visitors on your site from one page to another is handled by using either a 301 redirect or a 302 redirect. The numbers 301 and 302 refer to the HTTP status code that is returned by the web server to your browser. They may seem similar but they are quite different: a 302 indicates a temporary change while a 301 indicates a permanent change. This difference is important to understand and will impact how search engines see content changes on your site. There are a number of ways to implement a 301 redirect on your web site. Some are easier than others to configure and will depend on the version of IIS you are using. Here's the story of how I recently had to use the global.asax and Application_BeginRequest to do a 301 redirect.
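On the wire the two responses look almost identical; only the status line tells the browser (and search engines) how to treat the move (the URLs below are placeholders):

HTTP/1.1 301 Moved Permanently
Location: http://www.example.com/new-page

HTTP/1.1 302 Found
Location: http://www.example.com/new-page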

Unforeseen consequences of revoking a certificate

The other day I was helping someone who had revoked their site's SSL certificate. They were no longer going to use SSL on their site so they figured revoking the certificate was a logical decision. Unfortunately what they had not realized was that the https:// url of their site had been indexed by most search engines and they were getting a lot of organic traffic to their site using that url. By revoking the certificate, many of their visitors were now seeing dire warnings in their browsers like the picture below instead of going to their site. This was not good.

[Screenshot: browser warning displayed for the revoked certificate]

 

Not being a technical person, they figured that just removing the certificate from the site bindings would solve their problem. This was not a good idea. It stopped the browser security warnings, but it simply traded one problem for another. People were still accessing the https:// url of their site, so instead of a security warning they were now just seeing an error. Using Fiddler you can see that a 502 error is generated when you try to access a site using https without having a binding for it.

[Screenshot: Fiddler showing the 502 error]

 

The need for a redirect

We needed to take visitors accessing the https url of the site and send them to the http url of the site. This is the perfect application of either a 301 or 302 redirect. However, here's where things got a little more complicated. Ordinarily I would just use Url Rewrite or even a Url Rewrite Map to handle the 301 redirects. Unfortunately their site was hosted on IIS 6 so we couldn't use Url Rewrite. Furthermore, we only needed to redirect incoming requests using SSL. The site content was fine, so page-level redirects such as a meta tag refresh weren't going to help in this case either.

Since the site was using .Net 2.0 I decided to use the Application_BeginRequest event in the global.asax. This is the first event in the HTTP pipeline chain of execution when asp.net responds to a request. Using this event I created a conditional statement to test the HTTPS server variable to see if the request was being made using SSL or not. If the request was made with SSL then we would redirect it to the http url of the site as shown below. Bear in mind however that Response.Redirect's default status is 302, a temporary redirect. In my situation I needed a 301 permanent redirect so that search engines would drop the https url from their index. So I had to add the extra line of Response.StatusCode = 301.

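A minimal sketch of that handler, assuming the http and https urls share the same host name:

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // the HTTPS server variable is "on" only when the request arrived over SSL
    if (Request.ServerVariables["HTTPS"] == "on")
    {
        // false = don't end the response here, so the status code below sticks
        Response.Redirect("http://" + Request.Url.Host + Request.Url.PathAndQuery, false);

        // Response.Redirect defaults to a 302; overwrite it with a permanent 301
        Response.StatusCode = 301;
    }
}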

At this point I was pretty satisfied I had solved my friend's problem. I had set up a test site with an SSL certificate and the redirect worked great. Unfortunately, when I set it up on the live site (with the revoked certificate) nothing happened. It turned out that because the site's certificate had been revoked, browsers weren't actually loading the site, which in turn meant the redirect wasn't happening. There was only one way to solve this last piece of the puzzle and that meant putting in a valid SSL certificate again. So I created a Certificate Signing Request for my friend's site and within minutes they had a new $9 RapidSSL certificate from Namecheap.com. Once the new certificate was bound to the site, the https page requests started working again and our custom 301 redirect in the global.asax was able to do its job.

 

Testing a 301 Redirect

Because I needed the redirect to be permanent, I wanted to be sure it was really returning a 301 status. Checking the web site's www logs would have confirmed this but that's a bit cumbersome, especially when a tool like Fiddler, a free web debugging proxy, makes it so easy to check. As one can see in the pictures below, the redirect was in fact returning a 301 status code.

[Screenshot: Fiddler session list showing the 301 result]

Here you can see the raw header and body of the request.

[Screenshot: raw response headers showing HTTP/1.1 301 Moved Permanently]
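If Fiddler isn't handy, a few lines of C# will confirm the status code as well (the URL below is a placeholder):

// requires: using System; using System.Net;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://www.example.com/");
request.AllowAutoRedirect = false;  // report the raw 301 instead of silently following it
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    Console.WriteLine("{0} {1}", (int)response.StatusCode, response.Headers["Location"]);
}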

If you need to remove a url from a search engine’s index you can contact them directly:

https://support.google.com/webmasters/answer/164734?hl=en

http://www.bing.com/webmaster/help/how-can-i-remove-a-url-or-page-from-the-bing-index-37c07477

Please note that this is not a fast process and using a 301 permanent redirect is the best solution.

Summary

Sending traffic to a different location on your site can be accomplished using either a 301 permanent redirect or a 302 temporary redirect, and choosing the right one will ensure your search engine ranking isn't impacted. There are many techniques to implement a redirect such as using Url Rewrite, meta tags, or even Response.Redirect. If you're going to revoke an SSL certificate or remove one from your site, first be absolutely sure that there isn't traffic using the certificate. Thanks for reading.

Dec 14, 2012
 

Using system.net.mail to send email messages from your web site makes life so easy. In the old days of Classic ASP you often had to rely on 3rd party components such as AspMail from serverobjects.com or AspEmail from persits.com. While they were very capable products and are still widely used today, they added an additional layer of complexity to your programming. If you ever had to move a site from one server to another there was always a risk the components were not in place, which would cause problems for your users.

With system.net.mail you know that as long as .Net is installed on the server hosting your site, your code will always work no matter how many times you move your web site or change hosting providers. In its simplest form, the snippet below is the bare minimum of code you need to send a plain text message from your asp.net application.

// requires: using System.Net.Mail;
//create the mail message
MailMessage mail = new MailMessage();

//set the addresses
mail.From = new MailAddress("me@mycompany.com");
mail.To.Add("you@yourcompany.com");

//set the content
mail.Subject = "This is an email";
mail.Body = "this is a sample body";

//send the message
SmtpClient smtp = new SmtpClient("localhost");
smtp.Send(mail);

This works great when you are sending mail using the local SMTP server. However, in certain situations you may need to send mail through a remote SMTP server. In most cases that remote server will have quite a bit of security enabled to prevent relaying and block spammers, so the above code will not be enough for your application to send mail.

In this case you will need to send your message by authenticating on the remote server with a username and password. So how does one go about doing that with system.net.mail? Well, here's a bit of code that shows you how to do just that.

string strTo = "test@domain-y.com";
string strFrom = "test@domain-x.com";
string strSubject = "Mail Test Using SMTP Auth";
string strBody = "This is the body of the message";

string userName = "xxx"; //remote mail server username
string password = "xxx"; //remote mail server password

MailMessage mailObj = new MailMessage(strFrom, strTo, strSubject, strBody);
SmtpClient SMTPServer = new SmtpClient("mail.mydomain.com"); //remote smtp server
SMTPServer.Credentials = new System.Net.NetworkCredential(userName, password);
try
{
    SMTPServer.Send(mailObj);
    Response.Write("Sent!");
}
catch (Exception ex)
{
    Response.Write(ex.ToString());
}

For additional examples check out the wonderful resource at http://www.systemnetmail.com. I hope this helps. Thanks for reading.

Sep 30, 2012
 

Email is everywhere. You may not know immediately if your web site is down but you'll almost always know if your email isn't working. The ability to send and receive email 24×7 is critical to the success of any business. This means you need a product which is reliable, scalable, and affordable. There are not many mail server products on the market which meet those criteria. However, Smartermail by SmarterTools.com is one such product. Year after year SmarterTools has evolved it with more features and better security. The latest version of Smartermail is their best product to date.
[Screenshot: Smartermail 10 GUI]
However, they inadvertently introduced a small design flaw in the administration GUI: finding disabled domains is a real pain. If your Smartermail deployment only has 5-10 domains then you probably haven't even noticed this. However, if your organization relies on Smartermail to host hundreds or even thousands of mail domains then you are already well aware of this design oversight. Of course one can simply scroll through the domain list looking for Disabled status, which is clearly marked in red, but this is a daunting task for large deployments. Curiously, in legacy versions of Smartermail you could easily sort your entire domain list by enabled or disabled status.

Because Smartermail stores its domains in a logical fashion, it is easy to programmatically access the config files and determine whether or not a domain is enabled. For example, C:\Smartermail\Domains would contain all the domains of your Smartermail instance. Each mail domain will be stored as a subdirectory of this directory and the individual domain settings will be stored in a file called domainConfig.xml. Contained in this file is a node called "isEnabled". As one might expect, if the domain is enabled the value will be True whereas if the domain is disabled the value will be False. Here is a trimmed view of the relevant node (the full file holds many other settings):
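...
<isEnabled>True</isEnabled>
...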


So here's where a bit of programming saves the day. Using C# I created a simple console application which will read the Smartermail domains folder, check the domainconfig.xml of each domain on the server, and then output the results to a log file. One complication is that Smartermail may not have been installed in the default location, so I am using a simple xml config file for my program which specifies the path to the domains folder along with the path where I want the log file to be saved.

<?xml version="1.0" encoding="utf-8"?>
<MyConfig>
  <DomainsPath>H:\Smartermail\Domains</DomainsPath>
  <LogPath>H:\Temp\disabled_log.txt</LogPath>
</MyConfig>

Here’s how my program looks:

using System;
using System.IO;
using System.Xml;

namespace checkDisabled
{
    class Program
    {
        public static string strLogPath;
        public static string strDomainsPath;

        static void Main(string[] args)
        {
            // read the domains folder and log file paths from the config file
            string path = System.IO.Path.GetFullPath(@"checkDisabled.xml");
            XmlDocument xmlDoc = new XmlDocument();
            xmlDoc.Load(path);
            XmlNodeList DomainsPath = xmlDoc.GetElementsByTagName("DomainsPath");
            XmlNodeList LogPath = xmlDoc.GetElementsByTagName("LogPath");

            strDomainsPath = DomainsPath[0].InnerXml;
            strLogPath = LogPath[0].InnerXml;

            checkFolders();
        }

        static private void checkFolders() { /* shown below */ }

        static private void checkDisabled(string sDomain, string sPath) { /* shown below */ }

        static private void disabledLog(string strLine) { /* shown below */ }
    }
}

I create a subroutine called "checkFolders" which reads the Smartermail domains folder and then iterates through each subdirectory. To help keep things clean I use another subroutine to actually read the domainconfig.xml file.

static private void checkFolders()
{
    // every subdirectory of the domains folder is one mail domain
    string[] folders = System.IO.Directory.GetDirectories(strDomainsPath);
    string strDomainName = "";
    string strConfigFile = "";

    Console.WriteLine("Checking " + strDomainsPath + " for disabled domains.");

    foreach (string sDir in folders)
    {
        strConfigFile = sDir + @"\domainConfig.xml";
        // the folder name is the domain name
        strDomainName = sDir.Substring(strDomainsPath.Length + 1);
        if (File.Exists(strConfigFile))
            checkDisabled(strDomainName, strConfigFile);
    }

    Console.WriteLine("Done.");
}

Here is the code I use to read the domainconfig.xml file. If a domain is identified as being disabled then I write a line to the screen as well as the log file.

static private void checkDisabled(string sDomain, string sPath)
{
    XmlDocument xmlDoc = new XmlDocument();
    xmlDoc.Load(sPath);
    XmlNodeList isEnabled = xmlDoc.GetElementsByTagName("isEnabled");

    // the node holds "True" or "False"; log the domain when it is disabled
    if (!Convert.ToBoolean(isEnabled[0].InnerXml))
    {
        Console.WriteLine("Disabled: " + sDomain);
        disabledLog(sDomain);
    }
}

Writing the output to the log file is straightforward. I check if the log file exists and if it doesn't then I just create an empty file.

static private void disabledLog(string strLine)
{
    if (!File.Exists(strLogPath))
        using (StreamWriter sw = File.CreateText(strLogPath))
            sw.WriteLine("");

    using (StreamWriter sw = File.AppendText(strLogPath))
        sw.WriteLine(strLine);
}

Running the program from the command line we see the results.

H:\temp>checkdisabled
Checking H:\Smartermail\domains for disabled domains.
Disabled: domain2.net
Disabled: domain4.com
Done.

H:\temp>

Taking it a step further, we could use Windows Task Scheduler to run it on a weekly or daily basis, and with only a small tweak we could email the log file so that we didn't even have to log in to the server to check the disabled domains.
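A minimal sketch of that tweak, reusing System.Net.Mail from the earlier post (the addresses and server name are placeholders):

static private void emailLog()
{
    // requires: using System.Net.Mail;
    MailMessage mail = new MailMessage("smartermail@example.com", "admin@example.com");
    mail.Subject = "Disabled Smartermail domains";
    mail.Body = "The list of disabled domains is attached.";
    mail.Attachments.Add(new Attachment(strLogPath)); // attach the audit log
    new SmtpClient("localhost").Send(mail);           // send via the local SMTP server
}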
