Sep 30, 2012
 

Email is everywhere. You may not know immediately if your web site is down, but you will almost always know if your email isn't working. The ability to send and receive email 24×7 is critical to the success of any business, which means you need a product that is reliable, scalable, and affordable. Not many mail server products on the market meet those criteria, but SmarterMail by SmarterTools.com is one that does. Year after year SmarterTools has evolved the product with more features and better security, and the latest version of SmarterMail is their best to date.

SmarterMail 10 GUI

However, they inadvertently introduced a small design flaw in the administration GUI: finding disabled domains is a real pain. If your SmarterMail deployment only has 5-10 domains then you probably haven't even noticed this, but if your organization relies on SmarterMail to host hundreds or even thousands of mail domains then you are already well aware of this oversight. Of course, one can simply scroll through the domain list looking for the Disabled status, which is clearly marked in red, but that is a daunting task for large deployments. Curiously, in legacy versions of SmarterMail you could easily sort the entire domain list by enabled or disabled status.

Because SmarterMail stores its domains in a logical fashion, it is easy to programmatically access the config files and determine whether or not a domain is enabled. For example, C:\Smartermail\Domains would contain all the domains of your SmarterMail instance. Each mail domain is stored as a subdirectory of this directory, and the individual domain settings are stored in a file called domainConfig.xml. Contained in this file is a node called "isEnabled". As one might expect, if the domain is enabled the value will be True, and if the domain is disabled the value will be False. Here is roughly what the relevant part of the file looks like.
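The exact layout varies between SmarterMail versions, so treat this as an illustrative fragment with the other domain settings omitted:

<!-- fragment of domainConfig.xml (other settings omitted) -->
<isEnabled>True</isEnabled>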


So here's where a bit of programming saves the day. Using C# and .NET I created a simple console application which reads the SmarterMail domains folder, checks the domainConfig.xml of each domain on the server, and then outputs the results to a log file. One complication is that SmarterMail may not have been installed in the default location, so I am using a simple XML config file for my program which specifies the path to the domains folder along with the path where I want the log file to be saved.

<?xml version="1.0" encoding="utf-8"?>
<MyConfig>
  <DomainsPath>H:\Smartermail\Domains</DomainsPath>
  <LogPath>H:\Temp\disabled_log.txt</LogPath>
</MyConfig>

Here’s how my program looks:

using System;
using System.IO;
using System.Xml;

namespace checkDisabled
{
    class Program
    {
        public static string strLogPath;
        public static string strDomainsPath;
                
        static void Main(string[] args)
        {
            string path = System.IO.Path.GetFullPath(@"checkDisabled.xml");
            XmlDocument xmlDoc = new XmlDocument();
            xmlDoc.Load(path);
            XmlNodeList DomainsPath = xmlDoc.GetElementsByTagName("DomainsPath");
            XmlNodeList LogPath = xmlDoc.GetElementsByTagName("LogPath");

            strDomainsPath = DomainsPath[0].InnerXml;
            strLogPath = LogPath[0].InnerXml;

            checkFolders();
        }

        // Implementations of these three methods are shown below.
        static private void checkFolders() { /* ... */ }

        static private void checkDisabled(string sDomain, string sPath) { /* ... */ }

        static private void disabledLog(string strLine) { /* ... */ }
    }
}

I created a method called "checkFolders" which reads the SmarterMail domains folder and iterates through each subdirectory. To help keep things clean I use another method to actually read the domainConfig.xml file.

static private void checkFolders()
{
    string[] folders = System.IO.Directory.GetDirectories(strDomainsPath);
    string strDomainName = "";
    string strConfigFile = "";

    Console.WriteLine("Checking " + strDomainsPath + " for disabled domains.");

    foreach (string sDir in folders)
    {
        strConfigFile = sDir + @"\domainConfig.xml";
        strDomainName = sDir.Substring(strDomainsPath.Length + 1);
        if (File.Exists(strConfigFile))
            checkDisabled(strDomainName, strConfigFile);
    }

    Console.WriteLine("Done.");
}

Here is the code I use to read the domainconfig.xml file. If a domain is identified as being disabled then I write a line to the screen as well as the log file.

static private void checkDisabled(string sDomain, string sPath)
{
    XmlDocument xmlDoc = new XmlDocument();
    xmlDoc.Load(sPath);
    XmlNodeList isEnabled = xmlDoc.GetElementsByTagName("isEnabled");

    if (!Convert.ToBoolean(isEnabled[0].InnerXml))
    {
        Console.WriteLine("Disabled: " + sDomain);
        disabledLog(sDomain);
    }
}

Writing the output to the log file is straightforward. I check whether the log file exists and, if it doesn't, create an empty file.

static private void disabledLog(string strLine)
{
    if (!File.Exists(strLogPath))
        using (StreamWriter sw = File.CreateText(strLogPath))
            sw.WriteLine("");

    using (StreamWriter sw = File.AppendText(strLogPath))
        sw.WriteLine(strLine);
}

Running the program from the command line, we see the results:

H:\temp>checkdisabled
Checking H:\Smartermail\domains for disabled domains.
Disabled: domain2.net
Disabled: domain4.com
Done.

H:\temp>

Taking it a step further, we could use Windows Task Scheduler to run it on a weekly or daily basis, and with only a small tweak we could email the log file so that we didn't even have to log in to the server to check for disabled domains.
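For instance, a minimal sketch using System.Net.Mail could mail the log when the scan finishes. The server name, credentials, and addresses below are placeholders, and the method would simply be called at the end of Main after checkFolders():

// additional using directives needed at the top of the file:
// using System.Net;
// using System.Net.Mail;

static private void emailLog()
{
    using (MailMessage message = new MailMessage("reports@mydomain.com", "admin@mydomain.com"))
    {
        message.Subject = "Disabled SmarterMail domains";
        message.Body = "The latest disabled domains report is attached.";
        message.Attachments.Add(new Attachment(strLogPath));

        SmtpClient client = new SmtpClient("mail.mydomain.com", 25);
        client.Credentials = new NetworkCredential("user", "password");
        client.Send(message);
    }
}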

Sep 26, 2012
 

What's that? Robocopy can send emails? Well, not exactly, but here's how you can do it.

Robocopy is a free tool from Microsoft that was originally part of the Windows Resource Kit for Windows NT 4. It was then included as a standard feature with Windows Server 2003 and each subsequent release of Windows Server. It is a powerful tool for transferring data from one place to another. You can transfer data from one folder to another on the same server, or between two servers provided you use an account with permissions on both. There are extensive command switches available to control how the data is transferred, such as preserving ACL permissions, directory structures, and even empty folders. The basic syntax is as follows:

robocopy Source Destination [File[ ...]] [Options]

Recently I had to transfer nearly 50 GB of data from a legacy server to a new server. This migration was occurring on two production mail servers and there would be an impact to users, so downtime needed to be minimal. I had done thorough testing, so I knew it was going to take approximately 3 hours to transfer the data.

Since I was transferring the data from a legacy server to a new Windows 2008 R2 server, I planned to initiate robocopy from the new server. By doing this I was able to use the /MT multi-threading switch. I had done several tests using different values for this switch and settled on 32; increasing the value beyond 32 did not produce any noticeable difference in transfer speed. Without the switch I was consistently seeing 150 MB/min transfer speeds. Here is the syntax that I used:

robocopy.exe  "\\oldserver\bigfolder" "D:\newfolder" /LOG:d:\temp\log.txt /MT:32 /E /W:5 /Z /V /R:6

I knew that once I initiated the robocopy job there was nothing else I could do other than wait for the data transfer to finish. Furthermore, I did not want to have to frequently check the destination server to know how the transfer was progressing.  So I needed a way to count the folders as they showed up on the destination server and then send me an email.

The solution was to leverage the power of Windows Script Host, which is native to Windows 2008 servers. Using VBScript I could easily create a script to count the folders in a directory and then send myself an email. The second half of the challenge was to have that script run every 10 minutes, which I explain later in this post.

Using VBScript I created a function that checks a folder and returns the count of folders it contains.

function countFolders(strPath)
    dim objShell
    dim objFolder
    dim objFolderItems
    dim folderCount

    folderCount = 0
    set objShell = CreateObject("shell.application")
    set objFolder = objShell.NameSpace(strPath)

    if (not objFolder is nothing) then
        set objFolderItems = objFolder.Items

        if (not objFolderItems is nothing) then
            folderCount = objFolderItems.Count
        end if

        set objFolderItems = nothing
    end if

    set objFolder = nothing
    set objShell = nothing

    countFolders = folderCount
end function

Next I needed to create a function that would send an email with the folder count. I format the message so that the pertinent details are in the subject line.

Function SendCount(strCount)

Dim objMessage
Set objMessage = CreateObject("CDO.Message")
objMessage.Subject = strCount
objMessage.From = "user@mydomain.com"  
objMessage.To = "user@mydomain.com"  
objMessage.TextBody = strCount
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2  
 
'Name or IP of Remote SMTP Server 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "mail.mydomain.com" 
 
'Server port (typically 25) 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25 
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/smtpauthenticate") = 1 
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/smtpusessl") = false 
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/sendusername") = "user@mydomain.com" 
 
objMessage.Configuration.Fields.Item _ 
("http://schemas.microsoft.com/cdo/configuration/sendpassword") = "xyz123" 
 
objMessage.Configuration.Fields.Update 

objMessage.Send  
End Function

So now that I have my two functions, I need to wire them together. I create local variables for the destination path to monitor, the total number of folders being copied, and a folder counter.

Option Explicit

Dim destFolder 
Dim maxFolders
Dim folderCount

destFolder="d:\destinationfolder"
maxFolders=500
folderCount=countFolders(destFolder)

sendCount("F: " & folderCount & "  " & FormatPercent(folderCount/maxFolders))

At this point the script is ready to run. I knew there were 500 folders being copied to the new server, so each time the script ran it would send me an email with the number of folders copied and the percentage complete. At the beginning of this post I indicated that I needed this to be automated, so the next step is to log in to the destination server and create a scheduled task to run the new script. Since I know from testing that this transfer will take approximately 3 hours, I schedule the task to end after 3 hours, but I want it to run every 10 minutes until then.
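Creating that task from the command line with schtasks.exe would look roughly like this; the task name, script path, and start/end times are placeholders, and the same settings can just as easily be configured through the Task Scheduler GUI:

schtasks /Create /TN "FolderCountEmail" /TR "cscript.exe //B C:\Scripts\countfolders.vbs" /SC MINUTE /MO 10 /ST 20:00 /ET 23:00 /K

The /MO 10 switch repeats the task every 10 minutes, /ET sets the end time roughly 3 hours later, and /K stops the task once the end time is reached.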

Sep 25, 2012
 

Microsoft's Log Parser is a really powerful tool for searching log files. With just a few simple SQL commands you can pull more data than you ever imagined out of your logs. It can be used to read web site log files, CSV files, and even Windows event logs. Log Parser isn't intended to compete with stats products such as WebTrends Log Analyzer or SmarterStats by SmarterTools, which I feel is the best on the market.

I primarily use Log Parser for troubleshooting problems with a site such as identifying http errors or long running page requests. When I look into an issue I also like to get a snapshot for the day’s traffic such as the top IP addresses making requests and the top bots hitting the site. This is all available from Log Parser once you know the right queries to run.

As one can imagine, when a site is having a problem you want as much information as possible, as quickly as possible. Manually running Log Parser queries on a site's log files is not easy when you have an urgent problem to resolve. My biggest challenge with Log Parser is remembering the different queries I want and getting the syntax correct. Solving this challenge is easier than one might think: just create the scripts you need ahead of time.

I just create a batch file called logparse.bat, use the %1 parameter for the name of the log file that I want to analyze, and redirect the output of each query to a text file.

Here are the queries I am using:

 
"c:\temp\logparser.exe" -i:iisw3c "select top 25 count(*), cs-uri-stem from %1 group by cs-uri-stem order by count(*) desc" -rtp:-1 >top25req.txt

"c:\temp\logparser.exe" "select Top 10 count(*), c-ip from %1 group by c-ip order by Count(*) DESC" -o:csv >topIP.txt

"c:\temp\logparser.exe" "select Top 10 count(*), c-ip, cs(User-Agent) from %1 group by c-ip, cs(User-Agent) order by Count(*) DESC" -o:csv >topBot.txt

Running this from the command line is simple. You just specify the path to the log file as a parameter.


c:\temp\logparse c:\wwwlogs\w3svc1\u_ex120921.log

Running this will create three files showing the top 25 requests, the top bots hitting the site, and the top IP addresses making requests. This is valuable information, but we really need more to know what's going on with our site. Here's a more advanced query using a SQL file. Just as before, we're going to use parameters and then call it from our logparse.bat file. This query is stored in a separate file called topstatuscodes.sql.

SELECT STRCAT(TO_STRING(sc-status), STRCAT('.', TO_STRING(sc-substatus))) AS Status,
       COUNT(*) AS Total
FROM %source% TO %destination%
GROUP BY Status
ORDER BY Total DESC

Here is how we call this external file from our logparse.bat file:

"c:\temp\logparser.exe" file:TopStatusCodes.sql?source=%1+destination=TopStatusCodes.txt -o:NAT 

Here is what the output of this particular query looks like:

Immediately we can see that our site had 1,564 HTTP 404 errors as well as 22 HTTP 500 errors. That volume of 404 errors indicates a problem with the site's design and could impact performance. So let's create another query, output the results to a file, and see where the 404 errors are coming from:

SELECT
TOP 10
STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, sc-status,
EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
COUNT(*) AS TotalHits,
MAX(time-taken) AS MaxTime,
AVG(time-taken) AS AvgTime,
AVG(sc-bytes) AS AvgBytesSent
FROM %source% TO %destination%
where sc-status=404
GROUP BY cs-uri-stem, sc-status
ORDER BY MaxTime, TotalHits DESC

Here is how we run this query from our logparse.bat file:

"c:\temp\logparser.exe" file:Top10-404-WebRequests.sql?source=%1+destination=Top10-404-WebRequests.txt -o:NAT

The output of this query provides all the information we need to track down the issue including request path and the requested file.


Let’s look at one more of my favorite Log Parser queries: the top 10 longest running page requests. This will show us the 10 slowest pages on the site. This can be invaluable information when diagnosing a problem. Here is the query which I save in a file called top10webrequests.sql:

SELECT
TOP 10
STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath,
EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile,
COUNT(*) AS TotalHits,
MAX(time-taken) AS MaxTime,
AVG(time-taken) AS AvgTime,
AVG(sc-bytes) AS AvgBytesSent
FROM %source% TO %destination%
GROUP BY cs-uri-stem
ORDER BY MaxTime, TotalHits DESC

Here is how we call it from our logparse.bat file. It will redirect the output to a file called top10webrequests.txt:

"c:\temp\logparser.exe" file:Top10WebRequests.sql?source=%1+destination=Top10WebRequests.txt -o:NAT

The numbers shown in this report seem incredibly high at first glance, but do not be alarmed: IIS logs the time-taken field in milliseconds, so you have to divide each of the numbers by 1,000 to get seconds (a time-taken of 120000, for example, is 120 seconds, or 2 minutes). Doing that, we can see that one request took well over 2 minutes to complete and another averages 19 seconds to complete. These are red flags that need immediate investigation.

So, in summary, I have shown six great Log Parser queries you can run from a batch file to automate your web log analysis. Anytime someone reports a performance problem you can provide valuable data within seconds. But this is really just scratching the surface: you could add five or ten more queries to the batch file to get even more data, depending on your needs.

Please note: if you use the examples I’ve provided be sure to change the paths to where your logparser.exe is located.
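Putting the pieces together, the complete logparse.bat used above would look something like this (assuming logparser.exe and the three .sql files sit in c:\temp alongside the batch file):

@echo off
rem logparse.bat - run a set of Log Parser queries against the IIS log passed as %1

"c:\temp\logparser.exe" -i:iisw3c "select top 25 count(*), cs-uri-stem from %1 group by cs-uri-stem order by count(*) desc" -rtp:-1 >top25req.txt
"c:\temp\logparser.exe" "select Top 10 count(*), c-ip from %1 group by c-ip order by Count(*) DESC" -o:csv >topIP.txt
"c:\temp\logparser.exe" "select Top 10 count(*), c-ip, cs(User-Agent) from %1 group by c-ip, cs(User-Agent) order by Count(*) DESC" -o:csv >topBot.txt
"c:\temp\logparser.exe" file:TopStatusCodes.sql?source=%1+destination=TopStatusCodes.txt -o:NAT
"c:\temp\logparser.exe" file:Top10-404-WebRequests.sql?source=%1+destination=Top10-404-WebRequests.txt -o:NAT
"c:\temp\logparser.exe" file:Top10WebRequests.sql?source=%1+destination=Top10WebRequests.txt -o:NAT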

Sep 15, 2012
 

If your web site is hosted on a dedicated server (cloud or physical), then chances are you have some internal processes which need to happen on a recurring basis. The Windows Task Scheduler is a wonderful built-in tool that fulfills this need. Its location changed between releases: in Windows Server 2003 it was located in the Control Panel, while in Windows Server 2008 it is located in Administrative Tools.

With the Windows Task Scheduler you can run any program on the server, including custom scripts, at any time and with any recurring frequency. This is great news for system admins, but what happens if you're a web developer and you designed an admin page on your site to perform some internal housekeeping whenever the page is loaded? As you can imagine, you don't want to sit at your desk all day hitting the refresh button.

So here's where the power of Windows Task Scheduler comes into view. We just need to create a new scheduled task to visit the web site. Well, unfortunately that is not possible: Task Scheduler is not able to browse sites (though that would be a cool feature for a future release). So are we done before we've started? What could be used to open a web site URL that we could then, in turn, schedule as a task? Look no further than Microsoft's XMLHTTP object. I always say "there's no school like old school," and in this case it is absolutely true.

The following VBScript is all we need to request the web site URL programmatically.

On Error Resume Next

Dim objRequest
Dim URL

Set objRequest = CreateObject("Microsoft.XMLHTTP")
URL = "http://www.peterviola.com/testme.htm"

objRequest.open "GET", URL, false
objRequest.Send
Set objRequest = Nothing

Just cut and paste the snippet above into a simple .vbs text file on your server and it will be ready to run. If you run it manually it won't open a browser, but the request still completes; to confirm it works you just need to check your web site logs. With this bit of code we have a way to programmatically request a web site URL from the server without having to be logged in. So, looking back at our original "task," we now have all the pieces in place to get the job done.

The next step is to configure Windows Task Scheduler, and here again Microsoft makes it easy for us. When you open Task Scheduler, on the right side of your screen click "Create Basic Task" and the Create Basic Task Wizard will launch. Just follow the steps and complete the wizard.

You will be prompted to choose the program you want to run. Use the menu to find the .vbs file you created earlier.

After you complete the wizard your task will be ready to run based on the schedule you picked during the wizard. However in some cases you may want your task to run more frequently than once per day. So using the advanced properties you can choose to repeat the task as frequently as every 5 minutes forever.

As I mentioned above, you can confirm it works by checking the web site logs. Using the powerful findstr command as shown below, I can pull out just the requests I want for my test page:


findstr /S /I /P /c:"GET /testme.htm" C:\wwwlogs\W3SVC1\u_ex120915.log >testme.txt

Here are the results, which clearly show the scheduled task is working as expected, hitting my test page every 5 minutes.


2012-09-15 18:50:22 W3SVC74 ABC123 x.x.x.x GET /testme.htm - 80
2012-09-15 18:55:22 W3SVC74 ABC123 x.x.x.x GET /testme.htm - 80
2012-09-15 19:00:22 W3SVC74 ABC123 x.x.x.x GET /testme.htm - 80
2012-09-15 19:05:22 W3SVC74 ABC123 x.x.x.x GET /testme.htm - 80

This simple technique can be leveraged in so many powerful ways. Thanks for reading!

Sep 13, 2012
 

Before IIS 7, if you wanted to do URL rewriting with IIS 6 you had to use a third-party program such as ISAPI Rewrite by helicontech.com. It was a good program, but it wasn't native to IIS and it had limitations, for example when a single site hosted more than one domain running different applications.

With IIS 7, URL rewriting and redirecting has never been easier thanks to Microsoft's Url Rewrite module. The rewriting is driven by rules which are specified in the web.config under the <system.webServer> element. Using IIS Manager you can use the Url Rewrite GUI to create and maintain your rules.


You can also put the rules directly into the web.config without using the GUI. For example:

<system.webServer>
  <rewrite>
    <rules>
      <rule name="xyz">...blah...</rule>
    </rules>
  </rewrite>
</system.webServer>

IIS 7 Url Rewrite WWW

One of the most common needs for SEO is to force your site to use www for all page requests, so that search engines index www.mydomain.com instead of mydomain.com. This is very easy to do with IIS 7's Url Rewrite. Here is the rule:

<rewrite>
  <rules>
    <rule name="Redirect to www" patternSyntax="Wildcard" stopProcessing="true">
      <match url="*" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="peterviola.com" />
      </conditions>
      <action type="Redirect" url="http://www.peterviola.com/{R:0}" />
    </rule>
  </rules>
</rewrite>

This works really well and it is a completely seamless experience for your web site visitors. Here is how the rule looks in the IIS Manager GUI.

IIS 7 Url Rewrite HTTP to HTTPS

Probably the second most common use of Url Rewrite is for sites that have SSL certificates installed and need to seamlessly redirect page requests onto the certificate, for either the entire site or a particular folder. Here is the Url Rewrite rule for redirecting requests on the entire site: you simply detect whether the request is not secure and then redirect to the secure channel.

<rewrite>
<rules>
<rule name="HTTP Redirect to HTTPS" enabled="true" stopProcessing="true">
<match url="(.*)" ignoreCase="false" />
<conditions>
<add input="{HTTPS}" pattern="off" />
</conditions>
<action type="Redirect" url="https://{HTTP_HOST}/{R:1}" appendQueryString="true" redirectType="Permanent" />
</rule>
</rules>
</rewrite>

IIS 7 Url Rewrite HTTP to HTTPS on Subfolder

The example above is great, but running your entire site in HTTPS will have a performance impact, so you don't need to do it unless there is a specific business requirement for it. In that case we need a rule to redirect requests to HTTPS for just one folder; in this example we'll use a folder called "/secure". We use the same rule as above, except now we only want page requests for the "secure" folder, which is done by modifying the "match url" element.

<rewrite>
<rules>
<rule name="HTTPS on subfolder" enabled="true">
<match url="(^secure/.*)" ignoreCase="false" />
<conditions>
<add input="{HTTPS}" pattern="off" />
</conditions>
<action type="Redirect" url="https://{HTTP_HOST}/{R:1}" appendQueryString="true" redirectType="Permanent" />
</rule>
</rules>
</rewrite>

We've covered three of the most common uses of IIS 7 Url Rewrite, but notice that the rules above are really for redirecting rather than rewriting: with a true rewrite the visitor keeps the original URL while IIS serves content from a different path. We'll cover more rewriting examples in an upcoming post.
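As a quick preview, a hypothetical rewrite rule that maps a friendly URL onto a query-string page (the paths here are purely illustrative) would look like this:

<rule name="Friendly product URLs" stopProcessing="true">
  <match url="^products/([0-9]+)$" />
  <action type="Rewrite" url="/product.aspx?id={R:1}" />
</rule>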

Sep 7, 2012
 

Have you ever struggled to figure out why your SQL Server database is using so much space? If your database is hosted on a dedicated server then perhaps it's not an issue for you. But what if you're on a server with limited storage? For example, if you have a web site on a shared hosting plan then chances are you have a limited amount of space for your SQL Server database. If your database grows over the plan allotment then you will probably be looking at an overage fee unless you can clean up some space within the database. Regardless of where your database is hosted, it's always a good idea to keep its growth under control.

With a SQL Server database, the transaction log is usually the first place to look when you need to free up space. The transaction log is used for restoring the database, in the event of a system crash, to the point in time just before the crash occurred. If your web site is not being used for ecommerce then you probably do not need point-in-time recovery, and by switching the SQL Server recovery model to Simple you'll be able to free up a lot of the space that was being used by the transaction log.
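For reference, switching an existing database to the Simple recovery model is a one-line command (the database name here is just a placeholder):

ALTER DATABASE MyWebSiteDB SET RECOVERY SIMPLE;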

So let's assume you've already switched the recovery model to minimize the size of your database, but your hosting provider is still telling you that you are over the allotment and need to clean up more space. With a SQL Server database you could also "shrink" it, but that's a topic for another day. Instead we'll focus on the tables in the database and see if you can clean out some legacy data.

The logical question is: how does one determine which tables in a SQL Server database are using the most space? SQL Server has a stored procedure called sp_spaceused which can show you the total size of your database or, alternatively, how much space a single table is using. For example, you can check a table called products like this:

sp_spaceused products

This is very handy, but not very practical if you have a lot of tables to check. Fortunately, thanks to Bill Graziano over at SQLTeam.com, this important but mundane task is now a breeze. He created a SQL script called bigtables.sql that you can run to list the 25 biggest tables in your database; it will quickly and clearly show you how much space each of your tables is using.

Let's take it a step further by creating a stored procedure to run the script, so the functionality is always ready when we need it. Here is a script that creates a stored procedure called sp_bigtables; when we run the stored procedure we see results which clearly show which tables are using the most space, for every table in the database.
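The original script isn't reproduced here, but a rough sketch of such a procedure, based on sys.dm_db_partition_stats rather than whatever bigtables.sql actually queries, might look like this:

CREATE PROCEDURE sp_bigtables
AS
BEGIN
    SET NOCOUNT ON;

    -- Approximate reserved space per table, largest first.
    SELECT
        s.name AS SchemaName,
        t.name AS TableName,
        SUM(p.reserved_page_count) * 8 AS ReservedKB,
        SUM(CASE WHEN p.index_id IN (0, 1) THEN p.row_count ELSE 0 END) AS [Rows]
    FROM sys.dm_db_partition_stats AS p
    INNER JOIN sys.tables AS t ON t.[object_id] = p.[object_id]
    INNER JOIN sys.schemas AS s ON s.[schema_id] = t.[schema_id]
    GROUP BY s.name, t.name
    ORDER BY ReservedKB DESC;
END
GO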

Now that you've identified the biggest tables in your SQL Server database, you can focus on cleaning out some legacy data. Be sure to take a backup before deleting any data from your database.

One final tip: run DBCC UPDATEUSAGE ('databasename') on your database before running the script above so that you get the most accurate results.

DBCC UPDATEUSAGE ('databasename')

 
